Presentation + Paper
31 May 2022
Novel neural network architecture for energy prediction
Hae Jin Kim, Arthur C. Depoian II, Colleen P. Bailey, Parthasarathy Guturu
Abstract
Several classical statistical methods are commonly used for forecasting time-series data. However, because of a number of nonlinear characteristics, forecasting time-series data remains a challenge. Machine learning methods are better suited to problems with high nonlinearity. Recurrent neural networks (RNNs) are frequently used for time-series forecasting because their internal state, or memory, allows them to process a sequence of inputs. Long short-term memory (LSTM), a type of RNN, is particularly useful, as it has both long-term and short-term components. Due to their feedback connections, their ability to process sequences of varying lengths, and their ability to reset their internal state, LSTMs are less sensitive to outliers and more forgiving of varying time lags. Consequently, LSTMs can extract vital information and learn trends to forecast time-series data with high accuracy. We propose a novel neural network architecture that combines long short-term memory and convolutional layers to predict time-series energy data with higher accuracy than comparable networks.
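The abstract does not specify the exact layer configuration, so the following is only a minimal sketch of the general idea of a convolutional + LSTM forecasting model; the layer sizes, kernel widths, window length, and Keras framework choice below are assumptions made for illustration, not the authors' published architecture.

```python
# Illustrative sketch only: layer counts, filter sizes, window length, and
# framework (TensorFlow/Keras) are assumptions, not the paper's configuration.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW = 48      # assumed look-back window of past energy readings
N_FEATURES = 1   # univariate energy series

model = models.Sequential([
    # Convolutional front end extracts local temporal patterns.
    layers.Conv1D(filters=64, kernel_size=3, padding="causal",
                  activation="relu", input_shape=(WINDOW, N_FEATURES)),
    layers.Conv1D(filters=32, kernel_size=3, padding="causal",
                  activation="relu"),
    # LSTM layers model longer-range dependencies in the sequence.
    layers.LSTM(64, return_sequences=True),
    layers.LSTM(32),
    # Single-step forecast of the next energy value.
    layers.Dense(1),
])

model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Example usage with synthetic data shaped (samples, window, features).
x = np.random.rand(256, WINDOW, N_FEATURES).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(x, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(x[:1]).shape)  # -> (1, 1)
```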
Conference Presentation
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Hae Jin Kim, Arthur C. Depoian II, Colleen P. Bailey, and Parthasarathy Guturu "Novel neural network architecture for energy prediction", Proc. SPIE 12097, Big Data IV: Learning, Analytics, and Applications, 1209705 (31 May 2022); https://doi.org/10.1117/12.2619143
KEYWORDS
Neural networks, Machine learning, Binary data, Detection and tracking algorithms, Optimization (mathematics), Error analysis, Feature selection