QIT Press - International Journal of Artificial Intelligence and Machine Learning Research and Development (QITP-IJAIMLRD)
Advancing Generalization in Deep Neural Networks through Theoretically Grounded Regularization and Geometric Optimization
Abstract
Deep neural networks have demonstrated powerful representational capabilities, yet their ability to generalize remains a central challenge in modern machine learning. This paper investigates the theoretical underpinnings of generalization by integrating principles of regularization and geometric optimization. We explore how norm-based constraints, implicit regularization via optimization paths, and flat minima contribute to better generalization bounds. By synthesizing insights from learning theory and empirical deep learning practices, we provide a principled perspective on scalable and robust model design.
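Among the mechanisms the abstract names, norm-based constraints are the easiest to make concrete. The following is a minimal sketch of our own (not the paper's method, and all variable names and the penalty strength are illustrative): an L2 penalty λ‖w‖² on a linear model shrinks the weight norm relative to the unregularized fit, which is the basic effect that norm-based generalization bounds build on.

```python
import numpy as np

# Illustrative sketch of norm-based (L2 / ridge) regularization.
# All data and the penalty strength `lam` are synthetic assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
true_w = rng.normal(size=10)
y = X @ true_w + 0.1 * rng.normal(size=50)

lam = 1.0  # regularization strength (hypothetical value)

# Ridge solution: w = (X^T X + lam * I)^{-1} X^T y
w_reg = np.linalg.solve(X.T @ X + lam * np.eye(10), X.T @ y)
# Unregularized least-squares solution for comparison
w_unreg = np.linalg.solve(X.T @ X, X.T @ y)

# The penalty constrains the weight norm: the regularized solution
# always has a norm no larger than the unregularized one.
print(np.linalg.norm(w_reg) < np.linalg.norm(w_unreg))
```

Increasing `lam` shrinks the norm further, trading training fit for a tighter norm constraint, which is the trade-off that norm-based generalization bounds quantify.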
Keywords
generalization
deep learning
regularization
optimization geometry
flat minima
norm constraints
implicit bias
learning theory
neural networks
Details
Volume: 1
Issue: 1
Pages: 1-5
ISSN: 2384-719X
Lina Andersson, "Advancing Generalization in Deep Neural Networks through Theoretically Grounded Regularization and Geometric Optimization," QIT Press - International Journal of Artificial Intelligence and Machine Learning Research and Development (QITP-IJAIMLRD), vol. 1, no. 1, Aug. 2020, pp. 1-5, https://scholar9.com/publication-detail/advancing-generalization-in-deep-neural-networks-t--36511