Book chapter, Year: 2024

Majoration-Minimization for Sparse SVMs

Abstract

Several decades ago, Support Vector Machines (SVMs) were introduced for performing binary classification tasks in a supervised framework. Nowadays, they often outperform other supervised methods and remain among the most popular approaches in machine learning. In this work, we investigate the training of SVMs through the minimization of a smooth, sparsity-promoting regularized squared hinge loss. This choice paves the way to fast training methods built on majorization-minimization approaches, which benefit from the Lipschitz differentiability of the loss function. Moreover, the proposed approach handles sparsity-preserving regularizers that promote the selection of the most significant features, thus enhancing performance. Numerical tests and comparisons conducted on three datasets demonstrate the good performance of the proposed methodology in terms of qualitative metrics (accuracy, precision, recall, and F1 score) as well as computational cost.
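To make the objective concrete, below is a minimal sketch of a majorization-minimization (MM) iteration for this kind of problem. It assumes a squared hinge loss and, as a hypothetical stand-in for the chapter's regularizers, a smoothed l1 penalty; with these choices the gradient is Lipschitz, so a quadratic majorant reduces each MM step to a fixed-step gradient update. This is an illustrative sketch under those assumptions, not the authors' algorithm.

    import numpy as np

    def mm_sparse_svm(X, y, lam=0.1, delta=1e-3, n_iter=500):
        """Sketch: minimize  sum_i max(0, 1 - y_i <x_i, w>)^2
                           + lam * sum_j sqrt(w_j^2 + delta^2)
        via MM with a quadratic majorant (equivalently, gradient steps of size 1/L).
        Labels y are assumed to be in {-1, +1}."""
        n, d = X.shape
        w = np.zeros(d)
        # Lipschitz constant of the full gradient: 2 * ||X||_2^2 for the squared
        # hinge part, plus lam / delta for the smoothed l1 term. Any L at least
        # this large yields a valid quadratic majorant of the objective.
        L = 2.0 * np.linalg.norm(X, 2) ** 2 + lam / delta
        for _ in range(n_iter):
            margins = np.maximum(0.0, 1.0 - y * (X @ w))       # hinge residuals
            grad_loss = -2.0 * X.T @ (y * margins)             # gradient of squared hinge
            grad_reg = lam * w / np.sqrt(w ** 2 + delta ** 2)  # gradient of smoothed l1
            w -= (grad_loss + grad_reg) / L                    # MM step: minimize the majorant
        return w

Calling mm_sparse_svm(X, y) returns a weight vector whose near-zero entries flag the features the sparsity-promoting term has effectively discarded.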
Main file
ATOMI_Manuscript.pdf (708.79 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04719965, version 1 (03-10-2024)


Cite

Alessandro Benfenati, Emilie Chouzenoux, Giorgia Franchini, Salla Latva-Äijö, Dominik Narnhofer, et al. Majoration-Minimization for Sparse SVMs. In: Advanced Techniques in Optimization for Machine Learning and Imaging (INdAM Workshop), Springer INdAM Series, vol. 61, Springer Nature Singapore, pp. 31-54, 2024. ISBN 978-981-97-6771-7. ⟨10.1007/978-981-97-6769-4_3⟩. ⟨hal-04719965⟩