Budka, M. and Gabrys, B., 2013. Density Preserving Sampling (DPS) for error estimation and model selection. IEEE Transactions on Neural Networks and Learning Systems, 24 (1), pp. 22-34.
Estimation of the generalization ability of a classification or regression model is an important issue, as it indicates the expected performance on previously unseen data and is also used for model selection. Currently used generalization error estimation procedures, such as cross-validation (CV) or bootstrap, are stochastic and thus require multiple repetitions in order to produce reliable results, which can be computationally expensive, if not prohibitive. The correntropy-based Density Preserving Sampling (DPS) procedure proposed in this paper eliminates the need for repeating the error estimation procedure by dividing the available data into subsets that are guaranteed to be representative of the input dataset. This makes it possible to produce low-variance error estimates with accuracy comparable to 10 times repeated cross-validation, at a fraction of the computation required by CV. The method can also be successfully used for model ranking and selection. This paper derives the Density Preserving Sampling procedure and investigates its usability and performance using a set of publicly available benchmark datasets and standard classifiers.
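To illustrate the idea of splitting data into mutually representative subsets, the sketch below implements a simplified nearest-neighbour pairing scheme: each remaining point is paired with its closest remaining neighbour, and the two are sent to opposite folds, so both folds trace the same regions of the input density. This is a hypothetical illustration of the general concept, not the authors' exact correntropy-based criterion; the function name `dps_split` and the Euclidean pairing rule are assumptions made for this sketch.

```python
import numpy as np

def dps_split(X, seed=0):
    """Illustrative density-preserving split (simplified sketch,
    not the paper's correntropy criterion): pair each remaining
    point with its nearest remaining neighbour and assign one of
    each pair to either fold."""
    rng = np.random.default_rng(seed)
    remaining = list(range(len(X)))
    fold_a, fold_b = [], []
    while len(remaining) > 1:
        i = remaining[0]
        rest = remaining[1:]
        # nearest remaining neighbour of point i (Euclidean distance)
        d = np.linalg.norm(X[rest] - X[i], axis=1)
        j = rest[int(np.argmin(d))]
        fold_a.append(i)
        fold_b.append(j)
        remaining.remove(i)
        remaining.remove(j)
    if remaining:  # odd dataset size: place the leftover point at random
        (fold_a if rng.random() < 0.5 else fold_b).append(remaining[0])
    return np.array(fold_a), np.array(fold_b)

# Example: split a 2-D Gaussian sample into two representative halves.
X = np.random.default_rng(0).normal(size=(101, 2))
a, b = dps_split(X)
```

Applying such a split recursively to each half yields 2^k folds, each intended to mirror the distribution of the full dataset, so a single pass over the folds can replace repeated random resampling.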
|Uncontrolled Keywords:||error estimation, model selection, sampling, cross-validation, bootstrap, correntropy|
|Subjects:||Generalities > Computer Science and Informatics > Artificial Intelligence|
Generalities > Computer Science and Informatics
|Group:||Faculty of Science and Technology|
|Deposited By:||Professor Bogdan Gabrys|
|Deposited On:||20 Apr 2010 14:00|
|Last Modified:||05 Aug 2015 11:11|