Closed-Form Gaussian Spread Estimation for Small and Large Support Vector Classification
Identifiers
ISSN: 2162-237X
E-ISSN: 2162-2388
Publisher
IEEE
Abstract
The support vector machine (SVM) with a Gaussian kernel often achieves state-of-the-art performance in classification problems, but it requires tuning of the kernel spread. Most optimization methods for spread tuning require training, which makes them slow and unsuited to large-scale datasets. We formulate an analytic expression that calculates, directly from the data and without iterative search, the spread that minimizes the difference between the Gaussian and ideal kernel matrices. The proposed direct gamma tuning (DGT) matches the performance of state-of-the-art approaches on 30 small datasets while being one to two orders of magnitude faster. Combined with random sampling of training patterns, it also runs on large classification problems. Our method is very efficient in experiments with 20 large datasets of up to 31 million patterns: it is faster than and performs significantly better than the linear SVM, and it is also faster than iterative minimization. Code is available upon paper acceptance from http://persoal.citius.usc.es/manuel.fernandez.delgado/papers/dgt/index.html and from CodeOcean: https://codeocean.com/capsule/4271163/tree/v1.
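To illustrate the general idea behind matching a Gaussian kernel matrix to an ideal (label-based) one, the following is a minimal sketch of a closed-form spread heuristic. It is NOT the paper's DGT formula: the function name `estimate_gamma`, the `target` parameter, and the within-class distance criterion are all assumptions chosen for illustration, picking gamma so that the mean squared within-class distance maps to a target kernel value.

```python
import numpy as np

def estimate_gamma(X, y, target=0.5):
    """Hypothetical closed-form spread heuristic (illustration only,
    not the paper's DGT expression).

    The ideal kernel has value 1 for same-class pairs; here we solve
    exp(-gamma * mean_within_class_sq_dist) == target for gamma,
    which has the closed-form solution gamma = ln(1/target) / d2.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    # Pairwise squared Euclidean distances, shape (n, n).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    # Mask selecting same-class pairs, excluding the diagonal.
    same = (y[:, None] == y[None, :]) & ~np.eye(len(y), dtype=bool)
    d2_within = d2[same].mean()
    return np.log(1.0 / target) / d2_within

# Usage on a tiny two-class dataset.
X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
y = np.array([0, 0, 1, 1])
gamma = estimate_gamma(X, y)
```

Unlike grid search or gradient-based kernel-alignment maximization, a rule of this shape needs a single pass over the pairwise distances, which is why closed-form estimates scale to subsampled large datasets.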
Bibliographic citation
D. Isla-Cernadas, M. Fernández-Delgado, E. Cernadas, M. S. Sirsat, H. Maarouf and S. Barro, "Closed-Form Gaussian Spread Estimation for Small and Large Support Vector Classification," in IEEE Transactions on Neural Networks and Learning Systems, doi: 10.1109/TNNLS.2024.3377370
Rights
Attribution 4.0 International (CC BY 4.0)
© 2024 The Authors