Raubitzek, S., & Neubauer, T. (2021). A fractal interpolation approach to improve neural network predictions for difficult time series data. Expert Systems with Applications, 169, Article 114474. https://doi.org/10.1016/j.eswa.2020.114474
Chaos; Deep learning; Fractal dimension; Hurst exponent; Hyperchaos; Linear interpolation; LSTM; Lyapunov exponent; Machine learning; R/S analysis; Time series analysis; Time series data; Time series prediction
Deep learning methods such as Long Short-Term Memory (LSTM) neural networks have proven capable of predicting real-life time series data. Crucial for this technique is a sufficient amount of data, either a very long or a very fine-grained time series. If the data is insufficient in length or complexity, LSTM approaches perform poorly. We propose a fractal interpolation approach to generate a more fine-grained time series from insufficient data sets. The interpolation is dynamically adapted to the time series using a time-dependent complexity feature, so that the complexity properties of the interpolated time series are related to those of the original one. We also perform a linear interpolation with the same number of interpolation points to compare results. This paper shows that predictions on fractal-interpolated and linear-interpolated time series clearly outperform those on the original data for a test fit on unknown data. Although predictions on linear- and fractal-interpolated time series perform very similarly, the fractal-interpolated ones outperform the linear-interpolated ones on difficult time series data. Furthermore, although the complexities of sub-intervals are tailored to match those of the original data, the interpolated time series shows a much higher degree of persistence and, in terms of the Hurst exponent, a much higher degree of long-term memory.
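The fractal interpolation the abstract refers to can be illustrated with the classical Barnsley construction: each affine map sends the whole graph onto one segment between consecutive data points, and a vertical scaling factor `d[i]` (with |d[i]| < 1) controls the roughness of that segment. The sketch below is a minimal, generic implementation, not the authors' exact method; in the paper the scaling factors are tuned via a time-dependent complexity feature, whereas here they are simply passed in as a parameter.

```python
import numpy as np

def fractal_interpolate(x, y, d, n_iter=5):
    """Barnsley-style fractal interpolation through data points (x_i, y_i).

    Each affine map w_i(x, y) = (a_i x + e_i, c_i x + d_i y + f_i) maps the
    endpoints of the full graph onto consecutive data points; d[i] is the
    vertical scaling factor (|d[i]| < 1) controlling the segment's roughness.
    Returns points on the attractor after n_iter deterministic iterations.
    """
    x, y, d = (np.asarray(v, dtype=float) for v in (x, y, d))
    x0, xN, y0, yN = x[0], x[-1], y[0], y[-1]
    # Affine coefficients, one entry per interval [x_{i-1}, x_i]
    a = (x[1:] - x[:-1]) / (xN - x0)
    e = (xN * x[:-1] - x0 * x[1:]) / (xN - x0)
    c = (y[1:] - y[:-1] - d * (yN - y0)) / (xN - x0)
    f = (xN * y[:-1] - x0 * y[1:] - d * (xN * y0 - x0 * yN)) / (xN - x0)

    px, py = x.copy(), y.copy()
    for _ in range(n_iter):
        # Apply every map to every current point; the union of images
        # converges to the graph of the fractal interpolation function.
        new_x = np.concatenate([a_i * px + e_i for a_i, e_i in zip(a, e)])
        new_y = np.concatenate([c_i * px + d_i * py + f_i
                                for c_i, d_i, f_i in zip(c, d, f)])
        order = np.argsort(new_x)
        px, py = new_x[order], new_y[order]
    return px, py
```

With all `d[i] = 0` the attractor reduces to ordinary piecewise linear interpolation, which is exactly the comparison baseline described in the abstract; larger |d[i]| values add roughness, and the paper's contribution lies in choosing these factors so that the complexity of each sub-interval matches that of the original series.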
Research foci: Beyond TUW-research foci 20%; Environmental Monitoring and Climate Adaptation 20%; Information Systems Engineering 60%