Online Bayesian shrinkage regression.

Jamil, W. and Bouchachia, A., 2020. Online Bayesian shrinkage regression. Neural Computing and Applications, 32, 17759-17767.

Full text available as:

OBSR.pdf - Accepted Version (PDF, 340kB)
Available under License Creative Commons Attribution Non-commercial.
DOI: 10.1007/s00521-020-04947-y

Abstract

This work introduces a new online regression method that extends the shrinkage via limit of Gibbs sampler (SLOG) approach to the online learning setting. In particular, we show theoretically how the proposed online SLOG (OSLOG) is derived within the Bayesian framework without resorting to the Gibbs sampler or to a hierarchical representation. Moreover, to establish a performance guarantee for OSLOG, we derive an upper bound on its cumulative squared loss; it is the only sparse online regression algorithm with a logarithmic regret bound. Furthermore, we compare OSLOG empirically with two state-of-the-art algorithms along three aspects: normality, sparsity and multicollinearity, showing that it achieves an excellent trade-off between these properties.
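To give a concrete feel for the setting the abstract describes, the sketch below shows a generic online Bayesian linear regression with a Gaussian (ridge-type) shrinkage prior in the predict-then-update protocol under which cumulative squared loss and regret are measured. This is not the paper's OSLOG algorithm, only a minimal illustration of the recursive Bayesian update that online shrinkage methods of this kind build on; the class name, the prior scale `a` and the noise level are assumptions made for the example.

```python
# Minimal sketch of online Bayesian linear regression with a Gaussian
# shrinkage prior. Illustrative only -- NOT the OSLOG algorithm from the
# paper; prior scale `a` and noise level are assumptions for this example.
import numpy as np

class OnlineShrinkageRegressor:
    def __init__(self, dim, a=1.0):
        # Posterior precision starts at the prior precision a*I, which
        # shrinks the weight estimate towards zero; b accumulates x_t * y_t.
        self.A = a * np.eye(dim)   # precision matrix
        self.b = np.zeros(dim)     # running sum of x_t * y_t

    def predict(self, x):
        # Posterior-mean prediction for input x.
        w = np.linalg.solve(self.A, self.b)
        return float(x @ w)

    def update(self, x, y):
        # Rank-one Bayesian update after observing the pair (x_t, y_t).
        self.A += np.outer(x, x)
        self.b += y * x

# Usage: sequential predict-then-update on a toy sparse problem, mirroring
# the online protocol under which cumulative squared loss is accumulated.
rng = np.random.default_rng(0)
w_true = np.array([2.0, 0.0, -1.0, 0.0])          # sparse ground truth
model = OnlineShrinkageRegressor(dim=4, a=1.0)
for t in range(200):
    x = rng.normal(size=4)
    y = x @ w_true + 0.1 * rng.normal()
    y_hat = model.predict(x)                      # predict before seeing y
    model.update(x, y)                            # then update the posterior
print(np.round(np.linalg.solve(model.A, model.b), 2))
```

At each round the learner predicts before the label is revealed, and the regret bounds the abstract mentions compare the sum of these per-round squared losses against the best fixed weight vector in hindsight.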

Item Type: Article
ISSN: 0941-0643
Uncontrolled Keywords: Regression; Regularisation; Competitive analysis; Online learning
Group: Faculty of Science & Technology
ID Code: 40849
Deposited By: Symplectic RT2
Deposited On: 13 Mar 2025 16:36
Last Modified: 13 Mar 2025 16:36
