Moral-Garcia, S., Abellan, J., Coolen-Maturi, T. and Coolen, F.P.A. (2022) 'A Cost-Sensitive Imprecise Credal Decision Tree based on Nonparametric Predictive Inference', Applied Soft Computing, 123, p. 108916.
Abstract
Imprecise classifiers return a set of values of the class variable when there is not enough information to point to a single class value. Decision Trees for Imprecise Classification have previously been proposed and adapted to account for error costs when classifying new instances. In this work, we present a new cost-sensitive Decision Tree for Imprecise Classification that incorporates error costs by weighting instances and also accounts for these costs during tree building. Our proposed method uses the Nonparametric Predictive Inference Model, a nonparametric model that, unlike earlier imprecise-probability models, assumes no prior knowledge about the data. We show that our proposal can give more informative predictions than the existing cost-sensitive Decision Tree for Imprecise Classification. Experimental results reveal that, in Imprecise Classification, our proposed cost-sensitive Decision Tree significantly outperforms the one proposed so far: although the cost of erroneous classifications is higher with our proposal, it tends to provide more informative predictions.
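The instance-weighting idea described in the abstract, scaling each training example by the cost of misclassifying its true class so that costly classes carry more weight in the split criterion, can be sketched as follows. The cost matrix, the synthetic data, and the use of scikit-learn's `DecisionTreeClassifier` are illustrative assumptions only; they are not the paper's NPI-based credal tree.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical 2-class cost matrix: cost[i, j] = cost of predicting
# class j when the true class is i (diagonal is zero).
cost = np.array([[0.0, 1.0],
                 [5.0, 0.0]])  # misclassifying class 1 is 5x costlier

# Synthetic data for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)

# Weight each instance by the total cost of misclassifying its true
# class; a cost-sensitive tree then sees high-cost classes as "heavier".
weights = cost.sum(axis=1)[y]

# A standard decision tree accepts these weights via sample_weight,
# so the costs influence both impurity and leaf majority votes.
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X, y, sample_weight=weights)
```

Under this sketch, an instance of the costly class counts five times as much as one of the cheap class, biasing the tree toward avoiding the expensive error.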
| Item Type: | Article |
|---|---|
| Full text: | Publisher-imposed embargo until 30 April 2023. (AM) Accepted Manuscript. Available under License - Creative Commons Attribution Non-commercial No Derivatives 4.0. File format: PDF (471KB) |
| Status: | Peer-reviewed |
| Publisher Web site: | https://doi.org/10.1016/j.asoc.2022.108916 |
| Publisher statement: | © 2022. This manuscript version is made available under the CC-BY-NC-ND 4.0 license https://creativecommons.org/licenses/by-nc-nd/4.0/ |
| Date accepted: | 17 April 2022 |
| Date deposited: | 26 April 2022 |
| Date of first online publication: | 30 April 2022 |
| Date first made open access: | 30 April 2023 |