
A novel and quick SVM-based multi-class classifier. (English) Zbl 1102.68629

Summary: Distinct positive real numbers \(p_{i}\) are used to represent the pattern categories. After the input patterns are mapped into a feature space by a non-linear mapping, a linear relation with undetermined bias and coefficients is assumed between the mapped patterns and the numbers \(p_{i}\); the hyper-plane corresponding to zero output of this relation is taken as the base hyper-plane. To determine the unknown parameters, an objective function is formulated that minimizes the difference between the outputs of patterns belonging to the same class and the corresponding \(p_{i}\), while maximizing the distance between any two hyper-planes corresponding to different classes. Since this objective function has the same form as that of support vector regression, the coefficients and bias can be computed by known methods such as the SVM\(^{\mathrm{light}}\) algorithm. Three methods are given for determining the \(p_{i}\); the best of them, which attains relatively high accuracy, determines them during training. Experiments on the IRIS data set show that the accuracy of this method exceeds that of many SVM-based multi-class classifiers and is close to that of DAGSVM (decision-directed acyclic graph SVM), while its recognition speed is the highest.
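The core idea of the paper can be sketched as follows: assign each class a real code \(p_i\), fit a single regression function to those codes with an SVR-form objective, and classify a new pattern by the code nearest to its regression output. This is a minimal illustration, assuming scikit-learn's `SVR` as a stand-in for the SVM\(^{\mathrm{light}}\) solver and fixed hand-picked codes \(p_i\) (the paper's best variant determines the \(p_i\) during training); the kernel and hyperparameter choices here are assumptions, not the authors' settings.

```python
# Sketch: multi-class classification by regressing mapped patterns onto
# real class codes p_i, then assigning each test pattern to the nearest
# code. scikit-learn's epsilon-SVR replaces the SVM^light solver.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

# Fixed illustrative codes p_i for the three IRIS classes (an assumption;
# the paper's best method learns these values in the training process).
p = np.array([1.0, 2.0, 3.0])

# One regression machine maps each pattern to a real output near its
# class code; the SVR objective mirrors the paper's objective in form.
reg = SVR(kernel="rbf", C=10.0, epsilon=0.1)
reg.fit(X_tr, p[y_tr])

# Decision rule: pick the class whose code p_i is closest to the output.
out = reg.predict(X_te)
pred = np.argmin(np.abs(out[:, None] - p[None, :]), axis=1)
acc = (pred == y_te).mean()
```

Because a single regression output decides among all classes at once, recognition requires only one kernel evaluation pass per pattern, which is the source of the speed advantage the summary reports over one-vs-one and one-vs-rest schemes.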

MSC:

68T10 Pattern recognition, speech recognition

Software:

SVMlight

References:

[1] Angulo, C.; Parra, X.; Català, A., K-SVCR. A support vector machine for multi-class classification, Neurocomputing, 55, 1-2, 57-77 (2003)
[2] Hsu, C.-W.; Lin, C.-J., A comparison of methods for multi-class support vector machines, IEEE Trans. Neural Networks, 13, 2, 415-425 (2002)
[3] K. Duan, S.S. Keerthi, Which is the best multi-class SVM method? An empirical study, Proceedings of the Sixth International Workshop, MCS 2005, Seaside, CA, USA, June 13-15, 2005, pp. 278-285.
[4] Lee, Y.; Lin, Y.; Wahba, G., Multi-category support vector machines: theory and application to the classification of microarray data and satellite radiance data, J. Am. Statist. Assoc. Theory Methods, 99, 465, 67-81 (2004) · Zbl 1089.62511
[5] D. Anguita, S. Ridella, D. Sterpi, A new method for multi-class support vector machines, Proceedings of the IEEE International Joint Conference on Neural Networks, IJCNN 2004, Budapest, Hungary, July 2004, pp. 412-417.
[6] Sebald, D. J.; Bucklew, J. A., Support vector machines and the multiple hypothesis test problem, IEEE Trans. Signal Process., 49, 11, 2865-2872 (2001)
[7] Bredensteiner, E. J.; Bennett, K. P., Multi-category classification by support vector machines, Comput. Optim. Appl., 12, 53-79 (1999) · Zbl 1040.90574
[8] Van Gestel, T.; Suykens, J. A.K.; Lanckriet, G., Multi-class LS-SVMs: moderated outputs and coding-decoding schemes, Neural Process. Lett., 15, 1, 45-58 (2002) · Zbl 1008.68739
[9] David, V.; Sánchez, A., Advanced support vector machines and kernel methods, Neurocomputing, 55, 1-2, 5-20 (2003)
[10] Burges, C. J.C., A tutorial on support vector machines for pattern recognition, Data Mining and Knowledge Discovery, 2, 121-167 (1998)
[11] Joachims, T., Making large-scale SVM learning practical, (Schölkopf, B.; Burges, C. J.C.; Smola, A. J., Advances in Kernel Methods—Support Vector Learning (1998), MIT Press: MIT Press Cambridge, USA), 41-56
[12] Karaçalí, B.; Ramanath, R.; Snyder, W. E., A comparative analysis of structural risk minimization by support vector machines and nearest neighbor rule, Pattern Recognition Lett., 25, 1, 63-71 (2004)
[13] Wang, W.; Xu, Z., A heuristic training for support vector regression, Neurocomputing, 61, 259-275 (2004)
[14] Maruyama, K.-I.; Maruyama, M.; Miyao, H.; Nakano, Y., A method to make multiple hypotheses with high cumulative recognition rate using SVMs, Pattern Recognition, 37, 2, 241-251 (2004) · Zbl 1059.68092
[15] Gao, J. B.; Gunn, S. R.; Harris, C. J., Mean field method for the support vector machine regression, Neurocomputing, 50, 391-405 (2003) · Zbl 1006.68818
[16] González, L.; Angulo, C.; Velasco, F.; Català, A., Unified dual for bi-class SVM approaches, Pattern Recognition, 38, 10, 1772-1774 (2005) · Zbl 1077.68799
[17] Platt, J. C.; Cristianini, N.; Taylor, J. S., Large margin DAGs for multi-class classification, (Solla, S. A.; Leen, T. K.; Miiller, K.-R., Advances in Neural Information Processing Systems, vol. 12 (2000), MIT Press: MIT Press Cambridge, MA), 547-553
[18] Vapnik, V.; Golowich, S.; Smola, A., Support vector method for function approximation, regression estimation, and signal processing, (Mozer, M.; Jordan, M.; Petsche, T., Advances in Neural Information Processing Systems, vol. 9 (1997), MIT Press: MIT Press Cambridge, MA), 281-287