Jamil, W., 2020. Competitive regression. Doctoral Thesis (Doctoral). Bournemouth University.
Full text available as:
PDF: JAMIL, Waqas_Ph.D._2020.pdf (513kB)
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
Abstract
This thesis investigates the predictive complexity of online regression. In essence, it considers supervised learning from a data sequence consisting of n-dimensional inputs and the corresponding outputs. The online learning scenario considered in this work consists of the sequential arrival of data, without making any stochastic assumptions about the nature of the arriving data. At each trial, the learner receives the input and produces a prediction; then, as a second step, the true output is observed and the learner suffers a loss, which is subsequently used to learn. The goal of online regression in this thesis is to minimise the regret suffered with respect to the loss of the minimum of the sum of squares. In the present work, three novel algorithms (Online Shrinkage via Limit of Gibbs sampler (OSLOG), Competitive Iterated Ridge Regression (CIRR) and Competitive Normalised Least Squares (CNLS)) are derived and analysed. The development of these algorithms is driven by Kolmogorov complexity (an approach also known as "competitive analysis"). OSLOG appraises the Bayesian approach, CIRR relies on game theory, whereas CNLS makes use of gradient descent methods. The analysis of the algorithms investigates two aspects: (1) formulating upper bounds on the cumulative square loss and (2) identifying the precise conditions under which they perform better than other algorithms. The theoretical results indicate that they have better guarantees than the state-of-the-art algorithms. The empirical study conducted on real-world datasets also shows that the performance of the proposed algorithms is better than that of the state-of-the-art algorithms and close to the minimum of the sum of squares.
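For illustration, the sketch below implements this prediction-then-feedback protocol with a plain online ridge regression learner and measures regret against the minimum of the sum of squares in hindsight. It is a minimal sketch only, not the OSLOG, CIRR or CNLS algorithms from the thesis; the function name `online_ridge_protocol` and the regularisation parameter `a` are illustrative assumptions.

```python
import numpy as np

def online_ridge_protocol(X, y, a=1.0):
    """Generic online regression protocol (illustrative, not the thesis algorithms).

    At each trial the learner receives the input, produces a prediction,
    then observes the true output, suffers the square loss and updates.
    Returns the cumulative square loss and the regret against the best
    fixed least-squares predictor in hindsight.
    """
    n = X.shape[1]
    A = a * np.eye(n)          # regularised Gram matrix (assumed regulariser a)
    b = np.zeros(n)            # accumulated y_t * x_t
    cumulative_loss = 0.0
    for x_t, y_t in zip(X, y):
        w = np.linalg.solve(A, b)                    # current weights
        prediction = w @ x_t                         # step 1: predict
        cumulative_loss += (prediction - y_t) ** 2   # step 2: observe output, suffer loss
        A += np.outer(x_t, x_t)                      # step 3: learn from the trial
        b += y_t * x_t
    # Regret relative to the minimum of the sum of squares in hindsight
    w_best, *_ = np.linalg.lstsq(X, y, rcond=None)
    best_loss = np.sum((X @ w_best - y) ** 2)
    return cumulative_loss, cumulative_loss - best_loss

# Example usage on synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
loss, regret = online_ridge_protocol(X, y)
```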
| Item Type: | Thesis (Doctoral) |
| --- | --- |
| Additional Information: | If you feel that this work infringes your copyright please contact the BURO Manager. |
| Uncontrolled Keywords: | regression; predictive complexity; online learning |
| Group: | Faculty of Science & Technology |
| ID Code: | 34279 |
| Deposited By: | Symplectic RT2 |
| Deposited On: | 10 Jul 2020 13:28 |
| Last Modified: | 14 Mar 2022 14:23 |