
Durham Research Online

Classification with support vector machines and Kolmogorov–Smirnov bounds.

Utkin, L.V. and Coolen, F.P.A. (2014) 'Classification with support vector machines and Kolmogorov–Smirnov bounds', Journal of Statistical Theory and Practice, 8 (2), pp. 297-318.


This article presents a new statistical inference method for classification. Instead of minimizing a loss function that takes only the residuals into account, it uses the Kolmogorov–Smirnov bounds for the cumulative distribution function of the residuals, thereby incorporating conservative bounds on the underlying probability distribution for the population of residuals. The loss functions considered are based on the theory of support vector machines. Parameters of the discriminant functions are computed using a minimax criterion, and for a wide range of popular loss functions the computations are shown to be feasible, based on new optimization results presented in this article. The method is illustrated with examples, both on small simulated data sets and on real-world data.
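To make the central ingredient concrete: the Kolmogorov–Smirnov bounds referred to in the abstract are distribution-free confidence bands around the empirical cumulative distribution function. The sketch below is not the authors' classifier; it only illustrates, under standard assumptions (using the Dvoretzky–Kiefer–Wolfowitz form of the band half-width), how such lower and upper bounding CDFs for a sample of residuals could be computed.

```python
import numpy as np

def ks_band(residuals, alpha=0.05):
    """Distribution-free confidence band for the empirical CDF.

    Illustrative sketch only: the half-width eps comes from the
    Dvoretzky-Kiefer-Wolfowitz inequality, so that the true CDF lies
    within [lower, upper] at every point with probability >= 1 - alpha.
    """
    x = np.sort(np.asarray(residuals, dtype=float))
    n = x.size
    ecdf = np.arange(1, n + 1) / n                   # empirical CDF at sorted points
    eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * n))   # DKW band half-width
    lower = np.clip(ecdf - eps, 0.0, 1.0)            # conservative lower bound
    upper = np.clip(ecdf + eps, 0.0, 1.0)            # conservative upper bound
    return x, lower, upper

# Toy usage: a band around residuals from some fitted discriminant function
rng = np.random.default_rng(0)
x, lo, hi = ks_band(rng.normal(size=50))
```

A minimax criterion, as mentioned in the abstract, would then choose discriminant-function parameters whose worst-case expected loss over all distributions inside such a band is smallest; the details of that optimization are what the article develops.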

Item Type: Article
Keywords: Classification, Imprecise probability, Kolmogorov–Smirnov bounds, Minimax, Support vector machines.
Full text: (AM) Accepted Manuscript
Publisher Web site:
Publisher statement: This is an Accepted Manuscript of an article published by Taylor & Francis Group in Journal of Statistical Theory and Practice on 24/03/2014, available online at:
Date accepted: No date available
Date deposited: 28 November 2014
Date of first online publication: March 2014
Date first made open access: No date available
