Utkin, L.V. and Coolen, F.P.A. (2014) 'Classification with support vector machines and Kolmogorov–Smirnov bounds', Journal of Statistical Theory and Practice, 8(2), pp. 297–318.
Abstract
This article presents a new statistical inference method for classification. Instead of minimizing a loss function that takes only the residuals into account, it uses the Kolmogorov–Smirnov bounds for the cumulative distribution function of the residuals, thereby taking conservative bounds for the underlying probability distribution of the population of residuals into account. The loss functions considered are based on the theory of support vector machines. Parameters for the discriminant functions are computed using a minimax criterion, and for a wide range of popular loss functions the computations are shown to be feasible, based on new optimization results presented in this article. The method is illustrated in examples, both with small simulated data sets and with real-world data.
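The Kolmogorov–Smirnov bounds mentioned in the abstract are distribution-free confidence bands around the empirical CDF of the residuals. As a rough illustration only (not the article's inference method), the sketch below computes such a band via the Dvoretzky–Kiefer–Wolfowitz inequality; the function name and interface are assumptions for this example.

```python
import numpy as np

def ks_cdf_bounds(residuals, alpha=0.05):
    """Distribution-free confidence band for the empirical CDF of residuals.

    Uses the Dvoretzky-Kiefer-Wolfowitz inequality, which guarantees that
    the true CDF lies within the band with probability at least 1 - alpha.
    Returns the sorted residuals with lower and upper CDF bounds at each point.
    """
    r = np.sort(np.asarray(residuals, dtype=float))
    n = r.size
    # DKW band half-width giving (1 - alpha) simultaneous coverage
    eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * n))
    ecdf = np.arange(1, n + 1) / n              # empirical CDF at sorted points
    lower = np.clip(ecdf - eps, 0.0, 1.0)       # conservative lower bound
    upper = np.clip(ecdf + eps, 0.0, 1.0)       # conservative upper bound
    return r, lower, upper
```

A minimax-style procedure, as described in the abstract, would then optimize the discriminant-function parameters against the worst-case distribution consistent with such a band, rather than against the empirical CDF alone.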
| Field | Value |
|---|---|
| Item Type | Article |
| Keywords | Classification, Imprecise probability, Kolmogorov–Smirnov bounds, Minimax, Support vector machines |
| Full text | Accepted Manuscript (AM), PDF, 576 KB |
| Status | Peer-reviewed |
| Publisher Web site | http://dx.doi.org/10.1080/15598608.2013.788985 |
| Publisher statement | This is an Accepted Manuscript of an article published by Taylor & Francis Group in Journal of Statistical Theory and Practice on 24/03/2014, available online at: http://www.tandfonline.com/10.1080/15598608.2013.788985. |
| Date accepted | No date available |
| Date deposited | 28 November 2014 |
| Date of first online publication | March 2014 |
| Date first made open access | No date available |