Stubbs, R., Rostami, S. and Wilson, K., 2019. Hyper-parameter Optimisation by Restrained Stochastic Hill Climbing. In: UKCI: 19th Annual UK Workshop on Computational Intelligence, 4-6 September 2019, Portsmouth, UK.
Full text available as:
PDF: UKCI2019_Paper__Rhys_.pdf — Accepted Version, 1MB. Available under License Creative Commons Attribution Non-commercial.
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
Official URL: https://www.ukci2019.port.ac.uk/
Abstract
Machine learning practitioners often describe hyper-parameter optimisation (HPO) as an art that requires intuition and experience; Neuroevolution (NE) typically employs a combination of manual and evolutionary approaches for HPO. This paper explores the integration of a stochastic hill climbing approach for HPO within a NE algorithm. We empirically show that HPO by restrained stochastic hill climbing (HORSHC) is more effective than manual and purely evolutionary HPO. The empirical evidence is derived from a comparison of (1) a NE algorithm that optimises hyper-parameters solely through evolution and (2) a number of derived algorithms that integrate random search for optimising the hyper-parameters of a Neural Network. Statistical analysis of the experimental results reveals that random initialisation of hyper-parameters does not significantly affect the final performance of the evolved Neural Networks. However, HORSHC, the novel optimisation approach proposed in this paper, significantly outperforms the NE control algorithm while remaining computationally comparable in both time and complexity.
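The abstract does not spell out how the "restraint" is applied in HORSHC; purely as an illustration, the minimal Python sketch below shows generic stochastic hill climbing over a hyper-parameter vector, with the restraint interpreted (an assumption, not the paper's method) as perturbing only a random subset of hyper-parameters by small, bounded steps. All function and parameter names here are hypothetical.

```python
import random

def stochastic_hill_climb_hpo(evaluate, init_params, bounds,
                              iterations=50, perturb_prob=0.3, step_frac=0.1):
    """Hill-climb over a dict of continuous hyper-parameters.

    evaluate(params) -> fitness (higher is better); bounds maps each
    hyper-parameter name to its (low, high) search range.
    """
    current = dict(init_params)
    best_fitness = evaluate(current)
    for _ in range(iterations):
        candidate = dict(current)
        for name, (low, high) in bounds.items():
            # "Restrained" move (assumed interpretation): only a random subset
            # of hyper-parameters is perturbed, each by a small Gaussian step
            # scaled to a fraction of its range and clamped to the bounds.
            if random.random() < perturb_prob:
                step = random.gauss(0.0, step_frac * (high - low))
                candidate[name] = min(high, max(low, candidate[name] + step))
        fitness = evaluate(candidate)
        if fitness > best_fitness:  # accept only strictly improving moves
            current, best_fitness = candidate, fitness
    return current, best_fitness

# Toy usage with a stand-in objective (not a real network evaluation):
best, score = stochastic_hill_climb_hpo(
    evaluate=lambda p: -abs(p["lr"] - 0.01) - abs(p["momentum"] - 0.9),
    init_params={"lr": 0.1, "momentum": 0.5},
    bounds={"lr": (1e-4, 1.0), "momentum": (0.0, 0.99)},
)
```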
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Uncontrolled Keywords: | hyper-parameter optimisation; global optimisation; neuroevolution; artificial neural networks; random search; stochastic hill climbing |
| Group: | Faculty of Science & Technology |
| ID Code: | 32508 |
| Deposited By: | Symplectic RT2 |
| Deposited On: | 10 Jul 2019 14:01 |
| Last Modified: | 14 Mar 2022 14:16 |