Date of Thesis

Spring 2020


Abstract

Artificial neural networks have been used for time series modeling and forecasting in many domains. However, they are often limited in their handling of nonlinear and chaotic data. More recently, reservoir-based recurrent neural network systems, most notably Echo State Networks (ESNs), have yielded substantial improvements in time series modeling. Their shallow architecture lends itself to an efficient training method, but limits their performance on non-stationary, nonlinear, chaotic time series, particularly large, multidimensional ones.

In this paper, we propose a novel approach for forecasting time series data based on an additive decomposition (AD) applied to the time series as a preprocessing step for a deep echo state network. We compare the performance of our method, AD-DeepESN, against popular neural network architectures used for time series prediction, evaluating all methods on both stationary and non-stationary data sets. Our results are compelling: AD-DeepESN outperforms existing methods, particularly on the most challenging time series, those exhibiting non-stationarity and chaotic behavior.
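The preprocessing idea can be illustrated with a minimal sketch of classical additive decomposition, which splits a series into trend, seasonal, and residual components that could each be fed to a downstream forecaster. This is an illustrative NumPy sketch, not the thesis implementation: the function name `additive_decompose`, the moving-average trend estimate, and the choice of period are all assumptions.

```python
import numpy as np

def additive_decompose(y, period):
    """Classical additive decomposition: y = trend + seasonal + residual.
    A simplified stand-in for the AD preprocessing step; trend is a
    centered moving average, seasonal is the mean detrended value at
    each phase of the period, residual is what remains."""
    y = np.asarray(y, dtype=float)
    # Trend: moving average over one full period.
    kernel = np.ones(period) / period
    trend = np.convolve(y, kernel, mode="same")
    # Seasonal: average detrended value at each position within the period.
    detrended = y - trend
    seasonal = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(seasonal, len(y) // period + 1)[: len(y)]
    seasonal -= seasonal.mean()  # center the seasonal component
    # Residual: everything the trend and seasonal terms do not explain.
    residual = y - trend - seasonal
    return trend, seasonal, residual

# Example: noisy sine with a linear trend, yearly period of 12 steps.
t = np.arange(240)
rng = np.random.default_rng(0)
y = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.normal(size=t.size)
trend, seasonal, residual = additive_decompose(y, period=12)
# By construction the three components sum back to the original series.
assert np.allclose(trend + seasonal + residual, y)
```

In an AD-plus-ESN pipeline, each component (or the component-augmented series) would then be passed to the reservoir network, on the premise that separating slow trend from periodic and irregular structure simplifies what the network must learn.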


Keywords

Time series forecasting, Reservoir computing, Echo state network, Additive decomposition, Recurrent neural network

Access Type

Honors Thesis (Bucknell Access Only)

Degree Type

Bachelor of Science in Computer Science and Engineering


Department

Computer Science

First Advisor

Brian King

Second Advisor

Joshua Stough