This thesis studies machine learning models in asymptotic regimes, an approach that has given researchers a better understanding of many properties of these models while seemingly contradicting traditional statistical wisdom. To resolve this apparent contradiction, the thesis analyzes models such as the LASSO and random features regression in the proportional regime, where the number of data points and the number of model parameters grow to infinity at a fixed ratio. The papers comprising this research focus in particular on Gaussian comparison theorems as a methodological tool for this asymptotic analysis. Notably, the Convex Gaussian Min-Max Theorem makes complex machine learning optimization problems tractable by relating them to simpler auxiliary problems that share key properties with the original ones. The thesis also examines the significance of universality in these asymptotic regimes: many quantities of interest in ML models are determined solely by low-order statistical moments of the data, so the original model can be replaced by a surrogate Gaussian model, which can in turn be assessed through Gaussian comparison theorems.
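The universality claim above can be illustrated numerically: a minimal sketch, assuming a ridge regression task with a planted linear signal, comparing i.i.d. Gaussian features against Rademacher (±1) features, which share the same first two moments (mean 0, variance 1). All sizes, the regularization level, and the noise scale here are illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def ridge_test_error(X, y, X_test, y_test, lam=0.1):
    """Closed-form ridge estimate and its mean squared test error."""
    n, p = X.shape
    w = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    return float(np.mean((X_test @ w - y_test) ** 2))

# Proportional regime: n and p both large, at fixed ratio n/p = 2.
n, p, n_test = 2000, 1000, 500
w_star = rng.normal(size=p) / np.sqrt(p)  # planted signal, ||w_star|| ~ 1

errs = {}
for name, sampler in [
    ("gaussian", rng.standard_normal),
    ("rademacher", lambda size: rng.choice([-1.0, 1.0], size=size)),
]:
    # Train and test features drawn i.i.d. from the given distribution.
    X = sampler(size=(n, p))
    X_test = sampler(size=(n_test, p))
    y = X @ w_star + 0.1 * rng.standard_normal(n)
    y_test = X_test @ w_star + 0.1 * rng.standard_normal(n_test)
    errs[name] = ridge_test_error(X, y, X_test, y_test)

print(errs)  # the two errors should be close, illustrating universality
```

Because only the mean and variance of the feature distribution matter in this regime, the Gaussian surrogate reproduces the Rademacher model's test error, which is what licenses the reduction to Gaussian comparison theorems.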
The models studied in this work, the LASSO and random features regression, serve as tractable test beds for the asymptotic properties of machine learning models and help resolve the apparent contradiction with traditional statistical wisdom.
The central tool in this work is the Convex Gaussian Min-Max Theorem, which underpins the analysis of the asymptotic behavior of ML models: characterizing their learning curves and predicting training and generalization error as a function of the degree of overparameterization. Gaussian comparison theorems of this kind form the crucial methodology behind this analysis.
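The learning-curve behaviour described above can be sketched empirically: a minimal, illustrative experiment tracking the test error of minimum-norm ("ridgeless") least squares on random ReLU features as the number of features p crosses the sample size n. The spike near the interpolation threshold p = n and the second descent beyond it are the kind of phenomena the proportional asymptotics characterize. The feature map, sizes, and noise level are assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed dataset from a planted linear model in ambient dimension d.
n, d, n_test = 200, 20, 1000
X = rng.standard_normal((n, d))
X_test = rng.standard_normal((n_test, d))
beta = rng.standard_normal(d) / np.sqrt(d)
y = X @ beta + 0.2 * rng.standard_normal(n)
y_test = X_test @ beta + 0.2 * rng.standard_normal(n_test)

def rf_test_error(p, trials=5):
    """Average test error of min-norm least squares on p random ReLU features."""
    errs = []
    for _ in range(trials):
        W = rng.standard_normal((d, p)) / np.sqrt(d)   # random first layer
        Phi = np.maximum(X @ W, 0)                     # ReLU feature maps
        Phi_test = np.maximum(X_test @ W, 0)
        w = np.linalg.pinv(Phi) @ y                    # min-norm least squares fit
        errs.append(np.mean((Phi_test @ w - y_test) ** 2))
    return float(np.mean(errs))

# Sweep p through the interpolation threshold p = n = 200.
for p in [50, 150, 200, 400, 2000]:
    print(p, rf_test_error(p))
```

The error peaks near p = n, where the feature matrix is square and ill-conditioned, and decreases again in the overparameterized regime; asymptotic analyses such as those in this thesis predict the entire curve in the limit where n and p grow proportionally.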