RT Journal Article
T1 Closed-Form Gaussian Spread Estimation for Small and Large Support Vector Classification
A1 Isla-Cernadas, Diego
A1 Fernández-Delgado, Manuel
A1 Sirsat, Manisha S.
A1 Cernadas García, Eva
A1 Maarouf, Haitham
A1 Barro Ameneiro, Senén
K1 Classification
K1 Efficient computing
K1 Large-scale datasets
K1 Model selection
K1 Radial basis kernel
K1 Support vector machine (SVM)
AB The support vector machine (SVM) with a Gaussian kernel often achieves state-of-the-art performance in classification problems, but it requires tuning of the kernel spread. Most optimization methods for spread tuning require training, which makes them slow and unsuited to large-scale datasets. We formulate an analytic expression that calculates, directly from the data and without iterative search, the spread that minimizes the difference between the Gaussian and ideal kernel matrices. The proposed direct gamma tuning (DGT) matches the performance of state-of-the-art approaches on 30 small datasets while running one to two orders of magnitude faster. Combined with random sampling of training patterns, it also scales to large classification problems. In experiments with 20 large datasets of up to 31 million patterns, our method is very efficient: it is faster than and performs significantly better than linear SVM, and it is also faster than iterative minimization. Code is available upon paper acceptance from http://persoal.citius.usc.es/manuel.fernandez.delgado/papers/dgt/index.html and from CodeOcean: https://codeocean.com/capsule/4271163/tree/v1
PB IEEE
SN 2162-237X
YR 2024
FD 2024
LK http://hdl.handle.net/10347/34260
UL http://hdl.handle.net/10347/34260
LA eng
NO D. Isla-Cernadas, M. Fernández-Delgado, E. Cernadas, M. S. Sirsat, H. Maarouf and S. Barro, "Closed-Form Gaussian Spread Estimation for Small and Large Support Vector Classification," in IEEE Transactions on Neural Networks and Learning Systems, doi: 10.1109/TNNLS.2024.3377370
DS Minerva
RD 30 Apr 2026