Raubitzek, S. (2023). Chaos, complexity and neural network time series predictions [Dissertation, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2023.117442
E194 - Institut für Information Systems Engineering
Date (published):
2023
Number of Pages:
295
Keywords:
Chaos; Complexity; Neural Networks; Time Series Prediction; Time Series Analysis; Time Series Interpolation; Stochastic Processes; Phase Space Reconstruction; LSTM
Abstract:
Many of today’s most successful approaches for predicting time series data use machine and/or deep learning methods such as different neural network architectures. These approaches strongly depend on the data available to train the employed algorithm. In agricultural or environmental applications, for example, long-term data sets are rare and often sparsely sampled. Apart from that, these data sets are often difficult to predict because of the numerous influences that affect them; thus, they have an inherent randomness to them.

The problem of sparsely sampled data can be overcome by employing different interpolation techniques, such as linear, polynomial, or fractal interpolation. The inherent randomness of difficult time series data, on the other hand, can be treated by employing ensemble predictions.

This research attempts to combine interpolation techniques and neural network ensemble predictions, and to further improve these combined approaches by taking into account the complexity and chaotic properties of the underlying data. The presented research introduces two interpolation techniques. One is a Hurst-exponent-based fractal interpolation that considers the fluctuating nature of stochastic time series data. The other is a stochastic interpolation method that considers the reconstructed phase space properties of chaotic time series to produce an interpolation with a rather smooth phase space trajectory. Further, this research presents an ensemble technique that takes into account the complexity and/or reconstructed phase space properties of the data under study. This is achieved by randomly parameterizing a multitude of long short-term memory (LSTM) neural networks, having them produce an autoregressive prediction, and afterward filtering this multitude of different predictions based on their signal complexity and/or reconstructed phase space properties.

First, the results show that neural network time series predictions can be improved by employing the discussed interpolation techniques. Second, predictions can effectively be filtered based on their inherent complexity and phase space properties to improve ensemble predictions.
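To make the complexity-based filtering step more concrete, the following is a minimal, hypothetical Python sketch (not the dissertation's implementation): a crude rescaled-range (R/S) estimate of the Hurst exponent is used to keep only those ensemble members whose complexity is close to that of the observed series. The estimator, the tolerance of 0.05, and the noise-generated stand-in predictions are illustrative assumptions only.

"""Illustrative sketch: filter an ensemble of time-series predictions by how
closely their Hurst exponent matches that of the observed data. The ensemble
members below are random stand-ins for autoregressive LSTM predictions."""
import numpy as np


def hurst_rs(x, min_chunk=8):
    """Crude rescaled-range (R/S) estimate of the Hurst exponent of series x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes = [s for s in (2 ** k for k in range(3, 12)) if min_chunk <= s <= n // 2]
    rs_values = []
    for s in sizes:
        rs_chunk = []
        for i in range(0, n - s + 1, s):
            c = x[i:i + s]
            dev = np.cumsum(c - c.mean())
            rng_c = dev.max() - dev.min()
            sd = c.std()
            if sd > 0:
                rs_chunk.append(rng_c / sd)
        if rs_chunk:
            rs_values.append((np.log(s), np.log(np.mean(rs_chunk))))
    logs, logrs = zip(*rs_values)
    # Slope of log(R/S) versus log(window size) approximates H.
    return np.polyfit(logs, logrs, 1)[0]


rng = np.random.default_rng(0)
observed = np.cumsum(rng.standard_normal(1024))   # stand-in "training" series
h_ref = hurst_rs(observed)

# Stand-ins for predictions produced by randomly parameterized LSTMs.
ensemble = [np.cumsum(rng.standard_normal(1024)) for _ in range(50)]

# Keep only members whose complexity (Hurst exponent) is close to the data's,
# then average the surviving members into a single ensemble prediction.
kept = [p for p in ensemble if abs(hurst_rs(p) - h_ref) < 0.05]
prediction = np.mean(kept, axis=0) if kept else np.mean(ensemble, axis=0)
print(f"reference H = {h_ref:.3f}, kept {len(kept)} of {len(ensemble)} members")

In the dissertation, the same idea is additionally applied with reconstructed phase-space properties as the filtering criterion; the sketch above only shows the Hurst-exponent variant.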