Vapnik's quadratic programming (QP) based support vector machine (SVM) is a state-of-the-art classifier that combines high accuracy with sparsity. Moving a step further in the direction of sparsity, Vapnik proposed a second SVM whose cost function is optimized by linear programming (LP). Compared with the more complex QP-based machine, this machine is sparser yet offers similar accuracy, which is essential for working on large datasets. However, still greater sparsity is desirable, both for computational savings and for handling very large and complicated datasets. Producing more sparsity without reducing the generalization capability of a detector is extremely challenging. In this direction, we apply a distinct sequence of mathematical programming followed by slack-variable analysis, which leads to an exceptionally fast and accurate SVM-based detector. Being immensely sparse and of optimal complexity, this Highly Efficient SVM (HESVM) can work expertly on very large, noise-affected, complicated data. Experiments on benchmark data show that HESVM requires as few as 6.8% of the kernel executions of the classical QP-based SVM while producing nearly the same classification accuracy on test data, and only 42.7%, 27.7%, and 46.6% of those of three other state-of-the-art highly sparse machines at similar classification accuracy. It also achieves the lowest Machine Accuracy Cost (MAC) value among all of these machines while producing very similar generalization performance, measured statistically via the Generalization Failure Rate (GFR). Being quite practical for contemporary technological development, it is well suited to the optimal handling of massive and difficult data.
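To see why kernel-execution count is the natural cost measure here, recall that a kernel SVM's test-time decision function evaluates one kernel per retained support vector, so a sparser machine does proportionally less work per prediction. The following minimal sketch (illustrative only; the support-vector sets and coefficients are hypothetical toy values, not the HESVM from the book) counts kernel calls for a dense machine of 500 support vectors versus a sparse one keeping 34, roughly mirroring the 6.8% ratio reported above:

```python
import math

def rbf(x, z, gamma=0.5):
    # Gaussian (RBF) kernel between two feature vectors
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def decision(x, support_vectors, coeffs, bias, counter):
    # f(x) = sum_i coeff_i * K(sv_i, x) + bias
    # Each retained support vector costs exactly one kernel evaluation.
    s = bias
    for sv, c in zip(support_vectors, coeffs):
        counter[0] += 1
        s += c * rbf(sv, x)
    return 1 if s >= 0 else -1

# Hypothetical models: a dense QP-style SVM keeping 500 support vectors
# versus a sparse machine keeping 34 (34/500 = 6.8%, toy numbers chosen
# to mirror the ratio quoted in the abstract).
dense_svs = [[i * 0.01, -i * 0.01] for i in range(500)]
sparse_svs = dense_svs[:34]

test_points = [[0.1 * j, 0.2 * j] for j in range(100)]

dense_calls, sparse_calls = [0], [0]
for x in test_points:
    decision(x, dense_svs, [1.0] * len(dense_svs), 0.0, dense_calls)
    decision(x, sparse_svs, [1.0] * len(sparse_svs), 0.0, sparse_calls)

ratio = sparse_calls[0] / dense_calls[0]
print(f"kernel evaluations: sparse/dense = {ratio:.3f}")  # → 0.068
```

The point of the sketch is simply that test-time cost scales linearly with the number of support vectors, independent of how the training problem (QP, LP, or otherwise) was solved.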
Author(s) Details
Rezaul Karim
Uttara University, Bangladesh.
Amit Kumar Kundu
Uttara University, Bangladesh.
Ali Ahmed Ave
Uttara University, Bangladesh.
Please see the book here:- https://doi.org/10.9734/bpi/mcscd/v7/2725