In-Sample Model Selection for Trimmed Hinge Loss Support Vector Machine
Anguita, Davide; Ghio, Alessandro; Oneto, Luca; Ridella, Sandro
2012-01-01
Abstract
In this letter, we target the problem of model selection for support vector classifiers through in-sample methods, which are particularly appealing in the small-sample regime. In particular, we describe the application of a trimmed hinge loss function to the Rademacher complexity- and maximal discrepancy-based in-sample approaches and show that the selected classifiers outperform those obtained with other in-sample model selection techniques, which exploit a soft loss function, in classifying microarray data.
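
The letter's own procedure is not reproduced here, but the following minimal Python sketch illustrates the general recipe the abstract refers to: in-sample model selection for an SVM, where the empirical Rademacher complexity of the loss class is estimated by refitting the classifier on random +/-1 labels. It assumes the "trimmed" hinge is the standard clipped hinge, ell(y, f(x)) = min(1, max(0, 1 - y f(x))), bounded in [0, 1]; the linear kernel, the hyperparameter grid, and the helper names clipped_hinge, rademacher_penalty, and select_C are illustrative assumptions, not taken from the letter, and only the Rademacher variant (not the maximal-discrepancy one) is shown.

import numpy as np
from sklearn.svm import SVC


def clipped_hinge(y, scores):
    # Hinge loss trimmed (clipped) so that it lies in [0, 1].
    return np.minimum(1.0, np.maximum(0.0, 1.0 - y * scores))


def rademacher_penalty(X, C, n_draws=20, seed=0):
    # Monte Carlo estimate of the empirical Rademacher complexity of the
    # clipped-hinge loss class induced by a linear SVM with parameter C:
    # draw random +/-1 labels, refit the SVM on them, and measure how well
    # pure noise can be fitted (1 - 2 * empirical clipped-hinge loss).
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    estimates = []
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)
        if len(np.unique(sigma)) < 2:  # degenerate draw, skip it
            continue
        clf = SVC(C=C, kernel="linear").fit(X, sigma)
        loss_sigma = np.mean(clipped_hinge(sigma, clf.decision_function(X)))
        estimates.append(1.0 - 2.0 * loss_sigma)
    return float(np.mean(estimates))


def select_C(X, y, grid=(0.01, 0.1, 1.0, 10.0)):
    # Pick C minimizing an in-sample bound: empirical clipped-hinge error
    # on the training data plus the Rademacher complexity penalty, so no
    # held-out data is needed (the small-sample setting of the letter).
    best_C, best_bound = None, np.inf
    for C in grid:
        clf = SVC(C=C, kernel="linear").fit(X, y)
        empirical = np.mean(clipped_hinge(y, clf.decision_function(X)))
        bound = empirical + rademacher_penalty(X, C)
        if bound < best_bound:
            best_C, best_bound = C, bound
    return best_C

For example, given a microarray-style matrix X of shape (n_samples, n_genes) and labels y in {-1, +1}, select_C(X, y) returns the grid value whose in-sample bound is smallest.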