
On the Role of Regularization in Machine Learning: Classical Theory, Computational Aspects and Modern Regimes

PAGLIANA, NICOLÒ
2022-05-31

Abstract

In this work we study the performance of different machine learning models, focusing on their regularization properties in order to explain phenomena observed in practice. We consider linear models on a possibly infinite-dimensional feature space, trained by minimizing the empirical mean squared error. We study the regularization properties of accelerated methods, such as Nesterov acceleration or the $\nu$-method, as well as the properties of interpolating estimators, where the main sources of regularization vanish, and explain different behaviours that can be seen in practical applications.
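The abstract refers to accelerated gradient methods trained on the empirical mean squared error, where the number of iterations acts as an implicit regularization parameter. As an illustrative sketch (not the thesis' own code), the following shows Nesterov-accelerated gradient descent on a least-squares problem; the function name and signature are assumptions made for this example.

```python
import numpy as np

def nesterov_least_squares(X, y, n_iters, step=None):
    """Nesterov-accelerated gradient descent on the empirical
    mean squared error (1/2n) * ||X w - y||^2.

    The iteration count n_iters plays the role of a regularization
    parameter: stopping early corresponds to stronger regularization,
    while running to convergence yields an interpolating / least-squares
    solution.  (Illustrative sketch, not the author's implementation.)
    """
    n, d = X.shape
    if step is None:
        # step size 1/L, with L the Lipschitz constant of the gradient
        step = n / np.linalg.norm(X, 2) ** 2
    w = np.zeros(d)
    v = w.copy()
    for t in range(1, n_iters + 1):
        grad = X.T @ (X @ v - y) / n          # gradient at the lookahead point
        w_next = v - step * grad              # gradient step
        v = w_next + (t - 1) / (t + 2) * (w_next - w)  # Nesterov momentum
        w = w_next
    return w
```

On a noiseless problem, many iterations recover the underlying coefficients; on noisy data, a smaller `n_iters` trades data fit for stability, which is the early-stopping regularization effect the abstract alludes to.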
Statistical Learning Theory; Regularization; Gradient Methods; Least Squares; Nesterov; Linear Regression; Kernel Methods; Interpolation
Files in this record:

File: phdunige_3943821.pdf (open access)
Type: Doctoral thesis
Size: 4 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/1081700