
Competitive Regularised Regression.

Jamil, W. and Bouchachia, A., 2020. Competitive Regularised Regression. Neurocomputing. (In Press)

Full text available as:

PDF
CompetitiveRegRegression.pdf - Accepted Version
Restricted to Repository staff only until 20 October 2020.
Available under License Creative Commons Attribution Non-commercial.

403kB

DOI: 10.1016/j.neucom.2019.08.094

Abstract

Regularised regression uses sparsity and variance penalties to reduce the complexity and over-fitting of a regression model. The present paper introduces two novel regularised linear regression algorithms: Competitive Iterative Ridge Regression (CIRR) and Online Shrinkage via Limit of Gibbs Sampler (OSLOG), intended for fast and reliable prediction on "Big Data" without making distributional assumptions about the data. We design both algorithms using the technique of competitive analysis and show that they enjoy strong theoretical guarantees. Furthermore, we compare their performance against recent regularised regression methods such as On-line Ridge Regression (ORR) and the Aggregating Algorithm for Regression (AAR). The comparison is carried out both theoretically, focusing on guarantees on cumulative loss, and empirically, to show the advantages of CIRR and OSLOG.
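To make the setting concrete, the following is a minimal sketch of plain online ridge regression, the kind of baseline (ORR) the abstract compares against: at each round the learner predicts with the ridge solution over all examples seen so far, then receives the true label and updates. The class name and interface are illustrative, not taken from the paper, and this is not CIRR or OSLOG themselves.

```python
import numpy as np

class OnlineRidge:
    """Illustrative online ridge regression: after each (x, y) pair
    arrives, the weight vector minimises the squared error over all
    examples seen so far plus a * ||w||^2 (ridge regularisation)."""

    def __init__(self, dim, a=1.0):
        self.A = a * np.eye(dim)   # regularised Gram matrix: a*I + sum of x x^T
        self.b = np.zeros(dim)     # accumulated sum of y * x

    def predict(self, x):
        # Solve A w = b for the current ridge weights, then predict w . x
        w = np.linalg.solve(self.A, self.b)
        return float(w @ x)

    def update(self, x, y):
        # Fold the new example into the sufficient statistics
        self.A += np.outer(x, x)
        self.b += y * x
```

In a competitive-analysis setting, the quantity of interest is the learner's cumulative squared loss over such a sequence of rounds, bounded relative to the best fixed linear predictor chosen in hindsight.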

Item Type:Article
ISSN:0925-2312
Uncontrolled Keywords:Regression; Regularisation; Online learning; Competitive analysis
Group:Faculty of Science & Technology
ID Code:32713
Deposited On:05 Sep 2019 08:48
Last Modified:06 Nov 2019 12:03
