Jamil, W. and Bouchachia, A., 2020. Competitive Regularised Regression. Neurocomputing, 390 (May), 374-383.
Full text available as:
PDF: CompetitiveRegRegression.pdf - Accepted Version, 403kB. Available under License Creative Commons Attribution Non-commercial.
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
DOI: 10.1016/j.neucom.2019.08.094
Abstract
Regularised regression uses sparsity and variance penalties to reduce the complexity and over-fitting of a regression model. The present paper introduces two novel regularised linear regression algorithms: Competitive Iterative Ridge Regression (CIRR) and Online Shrinkage via Limit of Gibbs Sampler (OSLOG), designed for fast and reliable prediction on "Big Data" without making distributional assumptions about the data. We design both algorithms using the technique of competitive analysis and establish strong theoretical guarantees for them. Furthermore, we compare their performance against recent regularised regression methods such as Online Ridge Regression (ORR) and the Aggregating Algorithm for Regression (AAR). The comparison is carried out both theoretically, focusing on the guarantees on cumulative loss, and empirically, to show the advantages of CIRR and OSLOG.
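The abstract contrasts the proposed algorithms with Online Ridge Regression (ORR), in which the learner predicts sequentially and the ridge solution is updated after each revealed label. The sketch below is a minimal, illustrative implementation of this standard online ridge update; it is not the paper's exact formulation of ORR, CIRR, or OSLOG, and the class and parameter names are chosen for this example.

```python
import numpy as np

class OnlineRidge:
    """Illustrative online ridge regression: predict, then update with the
    revealed label. Not the paper's exact algorithm."""

    def __init__(self, dim, a=1.0):
        # A accumulates a*I + sum_t x_t x_t^T (a > 0 is the ridge parameter);
        # b accumulates sum_t y_t x_t.
        self.A = a * np.eye(dim)
        self.b = np.zeros(dim)

    def predict(self, x):
        # Ridge solution w = A^{-1} b, applied to the new input.
        w = np.linalg.solve(self.A, self.b)
        return float(w @ x)

    def update(self, x, y):
        # Incorporate the observed (x, y) pair after predicting.
        self.A += np.outer(x, x)
        self.b += y * x
```

Competitive analysis, as used in the paper, bounds the cumulative squared loss of such an online learner relative to the best fixed linear predictor chosen in hindsight, with no distributional assumptions on the data sequence.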
| Item Type | Article |
|---|---|
| ISSN | 0925-2312 |
| Uncontrolled Keywords | Regression; Regularisation; Online learning; Competitive analysis |
| Group | Faculty of Science & Technology |
| ID Code | 32713 |
| Deposited By | Symplectic RT2 |
| Deposited On | 05 Sep 2019 08:48 |
| Last Modified | 14 Mar 2022 14:17 |