Jamil, W. and Bouchachia, A., 2020. Online Bayesian shrinkage regression. Neural Computing and Applications, 32, 17759-17767.
Full text available as: OBSR.pdf - Accepted Version (340kB), under a Creative Commons Attribution Non-Commercial licence.
DOI: 10.1007/s00521-020-04947-y
Abstract
The present work introduces a new online regression method that extends shrinkage via limit of Gibbs sampler (SLOG) to the online learning setting. In particular, we show theoretically how the proposed online SLOG (OSLOG) is derived within the Bayesian framework, without resorting to the Gibbs sampler or to a hierarchical representation. Moreover, to establish a performance guarantee for OSLOG, we derive an upper bound on its cumulative squared loss; OSLOG is the only sparse online regression algorithm that enjoys logarithmic regret. Furthermore, we compare OSLOG empirically against two state-of-the-art algorithms along three aspects: normality, sparsity and multicollinearity, and show that it achieves an excellent trade-off between these properties.
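The abstract states OSLOG's regret guarantee but does not reproduce its update equations. As a rough illustration of the recursive, regularised least-squares machinery that online shrinkage regressors of this kind build on, the Python sketch below implements a generic online ridge-style update via the Sherman-Morrison identity. The class name, the parameter `lam`, and the update rule are illustrative assumptions, not the paper's OSLOG algorithm.

```python
import numpy as np

class OnlineShrinkageRegressor:
    """Generic online ridge-style shrinkage regression (illustrative only;
    not the paper's OSLOG update, which is not given in this abstract)."""

    def __init__(self, dim, lam=1.0):
        # Maintain A^{-1} for A = lam * I + sum_t x_t x_t^T; a larger
        # (hypothetical) `lam` shrinks the weights harder towards zero.
        self.A_inv = np.eye(dim) / lam
        self.b = np.zeros(dim)  # b = sum_t y_t * x_t

    def predict(self, x):
        # Predict with the current shrunken weights w = A^{-1} b.
        return float((self.A_inv @ self.b) @ x)

    def update(self, x, y):
        # Rank-one Sherman-Morrison update of A^{-1} after observing (x, y).
        Ax = self.A_inv @ x
        self.A_inv -= np.outer(Ax, Ax) / (1.0 + x @ Ax)
        self.b += y * x

# Toy online protocol: predict first, then observe the label and update.
rng = np.random.default_rng(0)
model = OnlineShrinkageRegressor(dim=5, lam=1.0)
w_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])  # sparse ground truth
for _ in range(200):
    x = rng.normal(size=5)
    y = float(w_true @ x) + 0.1 * rng.normal()
    y_hat = model.predict(x)  # incur squared loss (y - y_hat) ** 2
    model.update(x, y)
```

This predict-then-update loop is the standard online protocol under which cumulative squared loss and regret are measured; OSLOG's Bayesian derivation and sparsity-inducing shrinkage differ in the details, for which see the paper.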
| Item Type: | Article |
|---|---|
| ISSN: | 0941-0643 |
| Uncontrolled Keywords: | Regression; Regularisation; Competitive analysis; Online learning |
| Group: | Faculty of Science & Technology |
| ID Code: | 40849 |
| Deposited By: | Symplectic RT2 |
| Deposited On: | 13 Mar 2025 16:36 |
| Last Modified: | 13 Mar 2025 16:36 |