Benign Overfitting In Binary Classification Of Gaussian Mixtures
Ke Wang, Christos Thrampoulidis
SPS
Length: 00:10:04
Deep neural networks generalize well despite being exceedingly overparameterized, but the statistical principles behind this so-called benign-overfitting phenomenon are not yet well understood. Recently, there has been remarkable progress towards understanding benign overfitting in simpler models, such as linear regression and, even more recently, linear classification. This paper studies benign overfitting for data generated from a popular binary Gaussian mixture model (GMM) and classifiers trained by support-vector machines (SVM). Our approach has two steps. First, we leverage an idea introduced in [Muthukumar et al. 2020] to relate the SVM solution to the least-squares (LS) solution. Second, we derive novel non-asymptotic bounds on the test error of the LS solution. Combining the two gives sufficient conditions on the overparameterization ratio and the signal-to-noise ratio that lead to benign overfitting. We corroborate our theoretical findings with numerical simulations.
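The setup in the abstract can be illustrated with a minimal simulation. The sketch below is not code from the paper: all parameter choices (`n`, `p`, the mean vector `mu`) are hypothetical, chosen only so that the dimension far exceeds the sample size and the signal-to-noise ratio is large. It draws binary GMM data, computes the minimum-norm least-squares interpolating classifier, and shows that although the classifier fits the training labels exactly (overfits), it still classifies fresh samples accurately (benign overfitting):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for illustration only: p >> n (overparameterized).
n, p = 50, 2000
mu = np.zeros(p)
mu[0] = 6.0  # class-mean vector; its norm controls the signal-to-noise ratio

def sample(m):
    """Draw m points from the binary GMM: x = y * mu + standard Gaussian noise."""
    y = rng.choice([-1.0, 1.0], size=m)
    X = y[:, None] * mu[None, :] + rng.standard_normal((m, p))
    return X, y

Xtr, ytr = sample(n)

# Minimum-norm least-squares solution: w = X^T (X X^T)^{-1} y.
w = Xtr.T @ np.linalg.solve(Xtr @ Xtr.T, ytr)

# The classifier interpolates the training labels exactly ...
train_fit = np.max(np.abs(Xtr @ w - ytr))

# ... yet still generalizes when the signal-to-noise ratio is large enough.
Xte, yte = sample(5000)
test_err = np.mean(np.sign(Xte @ w) != yte)
print(train_fit, test_err)
```

In regimes where the paper's sufficient conditions fail (e.g. a much smaller `mu[0]`), the same interpolating classifier no longer achieves small test error, which is exactly the distinction the bounds characterize.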
Chairs:
Dionysios Kalogerias