Super Level Sets and Exponential Decay: A Synergistic Approach to Stable Neural Network Training




Chaudhary, Jatin; Nidhi, Dipak; Heikkonen, Jukka; Merisaari, Harri; Kanth, Rajiv

Publisher: AI Access Foundation

2025

Journal of Artificial Intelligence Research


21

83

ISSN: 1076-9757

eISSN: 1943-5037

DOI: https://doi.org/10.1613/jair.1.17272


https://research.utu.fi/converis/portal/detail/Publication/499842072



This paper presents a theoretically grounded optimization framework for neural network training that integrates an Exponentially Decaying Learning Rate with Lyapunov-based stability analysis. We develop a dynamic learning rate algorithm and prove that it induces connected and stable descent paths through the loss landscape by maintaining the connectivity of the super-level sets S_λ = {θ ∈ ℝ^n : L(θ) ≥ λ}. Under the condition that the Lyapunov function V(θ) = L(θ) satisfies ∇V(θ) · ∇L(θ) ≥ 0, we establish that these super-level sets are not only connected but also equiconnected across epochs, providing uniform topological stability. We further derive convergence guarantees using a second-order Taylor expansion and demonstrate that our exponentially scheduled learning rate with gradient-based modulation leads to a monotonic decrease in loss. The proposed algorithm incorporates this schedule into a stability-aware update mechanism that adapts step sizes based on both curvature and energy-level geometry. This work formalizes the role of topological structure in convergence dynamics and introduces a provably stable optimization algorithm for high-dimensional, non-convex neural networks.
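
The record does not include the algorithm itself; the minimal Python sketch below illustrates one plausible reading of the update described in the abstract, combining an exponentially decayed base rate with gradient-based modulation of the step size. The base rate eta0, decay rate gamma, and modulation constant c are assumed parameters for illustration, not values taken from the paper.

# Minimal sketch (not the authors' published code): an exponentially scheduled,
# gradient-modulated update of the kind described in the abstract.
import numpy as np

def exp_decayed_lr(eta0: float, gamma: float, epoch: int) -> float:
    """Base schedule: eta_t = eta0 * exp(-gamma * t)."""
    return eta0 * np.exp(-gamma * epoch)

def stability_aware_step(theta: np.ndarray, grad: np.ndarray,
                         eta0: float, gamma: float, epoch: int,
                         c: float = 1.0) -> np.ndarray:
    """Gradient step whose size shrinks both with the epoch (exponential decay)
    and with the gradient norm (gradient-based modulation), so a single update
    cannot move arbitrarily far through the loss landscape."""
    eta_t = exp_decayed_lr(eta0, gamma, epoch)
    eta_eff = eta_t / (1.0 + c * np.linalg.norm(grad))
    return theta - eta_eff * grad

# Toy usage: minimize the quadratic loss L(theta) = 0.5 * ||theta||^2.
theta = np.array([3.0, -2.0])
for epoch in range(50):
    grad = theta                      # gradient of the quadratic loss
    theta = stability_aware_step(theta, grad, eta0=0.5, gamma=0.05, epoch=epoch)
print(theta)                          # approaches the minimizer at the origin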


Jatin Chaudhary would like to acknowledge the University of Turku Graduate School’s grant for conducting this work.

