Research Article, August 2020
QIT Press - International Journal of Artificial Intelligence and Machine Learning Research and Development (QITP-IJAIMLRD)

Advancing Generalization in Deep Neural Networks through Theoretically Grounded Regularization and Geometric Optimization

Abstract

Deep neural networks have demonstrated powerful representational capabilities, yet their ability to generalize remains a central challenge in modern machine learning. This paper investigates the theoretical underpinnings of generalization by integrating principles of regularization and geometric optimization. We explore how norm-based constraints, implicit regularization via optimization paths, and flat minima contribute to better generalization bounds. By synthesizing insights from learning theory and empirical deep learning practices, we provide a principled perspective on scalable and robust model design.
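To make the notion of a norm-based constraint concrete, here is a minimal sketch (not drawn from the paper itself; all names and values are illustrative) of L2 regularization in plain gradient descent. The penalty term biases the optimizer toward small-norm weights, the classical route to norm-based generalization bounds:

```python
import numpy as np

def ridge_gradient_descent(X, y, lam=0.1, lr=0.01, steps=1000):
    """Minimize (1/n)||Xw - y||^2 + lam * ||w||^2 by gradient descent.

    The L2 (ridge) penalty lam * ||w||^2 is a norm-based constraint:
    it shrinks the weight norm, which tightens norm-based capacity
    measures used in generalization bounds.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        # Gradient of the data-fit term plus the gradient of the penalty.
        grad = 2 * X.T @ (X @ w - y) / n + 2 * lam * w
        w -= lr * grad
    return w

# Illustrative synthetic data: a stronger penalty yields a smaller-norm solution.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=100)
w_weak = ridge_gradient_descent(X, y, lam=0.01)
w_strong = ridge_gradient_descent(X, y, lam=10.0)
```

The same shrinkage effect appears implicitly in deep learning as weight decay; the explicit penalty here just makes the mechanism visible.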

Keywords

generalization; deep learning; regularization; optimization geometry; flat minima; norm constraints; implicit bias; learning theory; neural networks
Details
Volume 1
Issue 1
Pages 1-5
ISSN 2384-719X