Zliobaite, I., Budka, M. and Stahl, F., 2015. Towards cost-sensitive adaptation: When is it worth updating your predictive model? Neurocomputing, 150 (Part A), 240-249.
Full text available as:
stream_memory_neucomp_R2.pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.
Our digital universe is rapidly expanding: more and more daily activities are digitally recorded, and data arrives in streams that must be analyzed in real time and may evolve over time. In the last decade, many adaptive learning algorithms and prediction systems have been developed that can automatically update themselves with new incoming data. The majority of these algorithms focus on improving predictive performance and assume that a model update is always desired, as soon as possible and as frequently as possible. In this study we treat a potential model update as an investment decision which, as in the financial markets, should be taken only if a certain return on investment is expected. We introduce and motivate a new research problem for data streams: cost-sensitive adaptation. We propose a reference framework for analyzing adaptation strategies in terms of costs and benefits. The framework allows us to characterize and decompose the costs of model updates, and to assess and interpret the gains in performance due to model adaptation for a given learning algorithm on a given prediction task. A proof-of-concept experiment demonstrates how the framework can aid in analyzing and managing adaptation decisions in the chemical industry.
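The core idea of the abstract, adapting only when the expected return justifies the cost, can be illustrated with a minimal sketch. This is not the authors' framework; all function names, parameters, and numbers below are hypothetical assumptions chosen only to show the shape of such a decision rule.

```python
def should_update(current_error, updated_error_estimate,
                  gain_per_error_avoided, horizon, update_cost):
    """Hypothetical cost-sensitive adaptation rule: update the model
    only if the expected benefit over the prediction horizon exceeds
    the cost of performing the update.

    current_error          -- error rate of the deployed model
    updated_error_estimate -- estimated error rate after updating
    gain_per_error_avoided -- monetary value of one avoided error
    horizon                -- number of predictions the update affects
    update_cost            -- total cost of retraining and redeployment
    """
    expected_benefit = ((current_error - updated_error_estimate)
                        * gain_per_error_avoided * horizon)
    return expected_benefit > update_cost


# Example: cutting the error rate from 12% to 8% over the next 5000
# predictions, at 0.5 units gained per avoided error, yields an
# expected benefit of 100 units -- worth a 50-unit update.
print(should_update(0.12, 0.08, 0.5, 5000, 50))  # True

# A marginal improvement (12% -> 11%) yields only 25 units of
# expected benefit, so the same 50-unit update is not worthwhile.
print(should_update(0.12, 0.11, 0.5, 5000, 50))  # False
```

In practice the post-update error would itself be an estimate with uncertainty, which is part of what makes cost-sensitive adaptation a distinct research problem rather than a simple threshold check.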
Uncontrolled Keywords: Evolving data streams; Concept drift; Evaluation; Cost-sensitive adaptation; Utility of data mining
Group: Faculty of Science & Technology
Deposited On: 06 Nov 2015 14:45
Last Modified: 09 Nov 2015 12:03