Randomising the Simple Recurrent Network: a lightweight, energy-efficient RNN model with application to forecasting problems.

Khennour, M. E., Bouchachia, A., Kherfi, M. L. and Bouanane, K., 2023. Randomising the Simple Recurrent Network: a lightweight, energy-efficient RNN model with application to forecasting problems. Neural Computing and Applications. (In Press)

Full text available as:

PDF: RSRN.pdf - Accepted Version
Restricted to Repository staff only until 1 July 2024.
Available under License Creative Commons Attribution Non-commercial.

587kB

DOI: 10.1007/s00521-023-08775-8

Abstract

Multi-variate time-series (MTS) forecasting is the prediction of future values of a sequence of data. Analysing such data can benefit the community both financially and in terms of security, for instance by observing stock-exchange trends or by predicting when malicious attacks will occur. MTS forecasting models face many problems, including data and model complexity, energy constraints and computational cost; these problems can affect budget allocation, latency and carbon emissions. Recurrent neural networks are one such class of models, known for their computational complexity: their slow learning process requires more energy to train. Contributing to green AI, in this paper we propose a competitive, energy-efficient, lightweight recurrent neural network based on a hybrid neural architecture that combines the Random Neural Network (RaNN) and the Simple Recurrent Network (SRN), namely the Random Simple Recurrent Network (RSRN). We adopt the RaNN for its distinctive probabilistic properties and the SRN for adding a lightweight recurrent ability to the RaNN so that it can process sequential data. The paper shows how the RSRN is trained using adapted and optimised versions of back-propagation (BP), back-propagation through time (BPTT) and truncated BPTT (TBPTT). The latter two algorithms use penalised gradient descent to prevent the gradient-explosion problem by employing the average of the total gradient over time. Evaluated on several datasets, the RSRN achieves its best performance when using TBPTT. Moreover, we performed a comparative study against well-known recurrent models, showing the RSRN's superiority over state-of-the-art models while requiring much less computational time and far fewer training parameters. In addition, we investigated the multi-layer architecture and its properties.
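The full text is restricted, so the exact formulation of the penalised gradient is not reproduced here; the abstract describes it as "the average of the total gradient over time". A minimal sketch of that idea for a generic Elman-style simple recurrent network, assuming a squared-error loss, tanh activation and toy dimensions (all hypothetical, not the paper's actual RSRN equations), might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Elman-style simple recurrent network (hypothetical dimensions).
T, n_in, n_h = 12, 3, 5
Wx = rng.normal(scale=0.5, size=(n_h, n_in))   # input-to-hidden weights
Wh = rng.normal(scale=0.5, size=(n_h, n_h))    # hidden-to-hidden weights
xs = rng.normal(size=(T, n_in))                # input sequence
targets = rng.normal(size=(T, n_h))            # per-step targets

# Forward pass, caching hidden states for BPTT.
hs = [np.zeros(n_h)]
for t in range(T):
    hs.append(np.tanh(Wx @ xs[t] + Wh @ hs[-1]))

# Backward pass through time: accumulate dL/dWh across all time steps.
dWh_total = np.zeros_like(Wh)
dh = np.zeros(n_h)
for t in reversed(range(T)):
    dh = dh + (hs[t + 1] - targets[t])      # squared-error gradient at step t
    dpre = dh * (1.0 - hs[t + 1] ** 2)      # tanh derivative
    dWh_total += np.outer(dpre, hs[t])      # per-step contribution to dL/dWh
    dh = Wh.T @ dpre                        # propagate error to previous step

# "Penalised" gradient as the abstract describes it: average the total
# gradient over the T time steps, shrinking its magnitude and thereby
# damping gradient explosion.
dWh_penalised = dWh_total / T

print(np.linalg.norm(dWh_total), np.linalg.norm(dWh_penalised))
```

Dividing by the sequence length scales the update by a factor of 1/T, so the penalised gradient's norm is always T times smaller than the summed BPTT gradient; this is one plausible reading of the averaging described in the abstract, not the paper's verified algorithm.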

Item Type: Article
ISSN: 0941-0643
Uncontrolled Keywords: Random Neural Network (RaNN); Simple Recurrent Network; Random Simple Recurrent Network (RSRN); Time-series data
Group: Faculty of Science & Technology
ID Code: 38885
Deposited By: Symplectic RT2
Deposited On: 14 Aug 2023 09:47
Last Modified: 14 Aug 2023 09:47
