Forecasting Non-Stationary Time Series without Recurrent Connections

AP Engelbrecht
Department of Industrial Engineering, and Computer Science Division
Stellenbosch University, South Africa
engel@sun.ac.za

3 May 2019
Presentation Outline
1. Introduction
2. The Time Series Used
3. Recurrent Neural Networks
4. Dynamic Optimization Problems
5. Particle Swarm Optimization
6. PSO Training of NNs
7. Empirical Analysis
8. Conclusions
Introduction
- The main goal of this study was to investigate whether recurrent connections or time delays are necessary when training neural networks (NNs) for non-stationary time series prediction using a dynamic particle swarm optimization (PSO) algorithm
- Training of the NN is treated as a dynamic optimization problem, because the statistical properties of the time series change over time
- The quantum-inspired PSO (QSO) is a dynamic PSO with the ability to track optima in changing landscapes
The Time Series Plots
Plots: International Airline Passengers (AIP) and Australian Wine Sales (AWS)
The Time Series Plots (cont)
Plots: US Accidental Death (USD) and Sunspot Annual Measure (SAM)
The Time Series Plots (cont)
Plots: Daily Minimum Temperature (DMT) and Hourly Internet Traffic (HIT)
The Time Series Plots (cont)
Plots: Mackey Glass (MG) and Logistic Map (LM)
Feedforward Neural Networks
Recurrent Neural Networks
Dynamic Optimization Problems
- Training of a NN is an optimization problem: the objective is to find the best values for the weights and biases such that a given error function is minimized
- Forecasting a non-stationary time series is a dynamic optimization problem, because the statistical properties of the time series change over time
- Dynamic optimization problems (a toy example follows below):
  - search landscape properties change over time
  - optima change over time, both in value and in position
  - new optima may appear and existing optima may disappear
  - changes are further characterized by change severity and change frequency
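To make the idea concrete, here is a minimal Python sketch of a dynamic objective: a simple quadratic error surface whose optimum drifts over time, so that both the position and the value of the best solution change. The function name, drift schedule, and severity parameter are illustrative assumptions, not part of the study.

```python
import numpy as np

def dynamic_sphere(x, t, severity=0.1):
    """Toy dynamic objective: a sphere whose optimum drifts with time t.

    The optimum moves by `severity` per time step, so the location and the
    value of the best solution change over time.
    """
    optimum = severity * t * np.ones_like(x)   # optimum drifts along the diagonal
    return float(np.sum((x - optimum) ** 2))

# The same candidate solution scores differently before and after a change:
x = np.zeros(5)
print(dynamic_sphere(x, t=0))    # 0.0  (x is optimal)
print(dynamic_sphere(x, t=100))  # > 0  (the optimum has moved away)
```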
Dynamic Optimization Problems (cont)
Implications for optimization algorithms:
- Need to adjust the values assigned to decision variables in order to track changing optima, without re-optimizing from scratch
- For NN training, need to adapt weight and bias values to cope with concept drift, without re-training
- Should have the ability to escape local minima
- Need to continually inject diversity into the search
Particle Swarm Optimization: Introduction
What is particle swarm optimization (PSO)?
- a simple, computationally efficient optimization method
- population-based, stochastic search
- individuals follow very simple behaviors: emulate the success of neighboring individuals, but also bias towards their own experience of success
- emergent behavior: discovery of optimal regions within a high-dimensional search space
Particle Swarm Optimization: Main Components
What are the main components?
- a swarm of particles
- each particle represents a candidate solution
- elements of a particle represent the parameters to be optimized
The search process:
- Position updates: $x_i(t+1) = x_i(t) + v_i(t+1)$, with $x_{ij}(0) \sim U(x_{\min,j}, x_{\max,j})$
- Velocity (step size) drives the optimization process
- Velocity reflects the experiential knowledge of the particles and socially exchanged information about promising areas in the search space
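A minimal sketch of the particle representation and the uniform initialization $x_{ij}(0) \sim U(x_{\min,j}, x_{\max,j})$ described above; the swarm size, dimension, and bounds are illustrative assumptions.

```python
import numpy as np

n_particles, n_dims = 20, 10                  # swarm size and problem dimension (assumed)
x_min, x_max = -1.0, 1.0                      # per-dimension domain bounds (assumed)

# Each row is one particle, i.e. one candidate solution:
positions = np.random.uniform(x_min, x_max, size=(n_particles, n_dims))
velocities = np.zeros((n_particles, n_dims))  # v_ij(0) = 0

# The position update x_i(t+1) = x_i(t) + v_i(t+1) is then simply:
positions += velocities
```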
Particle Swarm Optimization: Inertia Weight
- PSO uses either the star (gbest PSO) or the social (lbest PSO) topology
- Velocity update per dimension:
  $v_{ij}(t+1) = w v_{ij}(t) + c_1 r_{1j}(t)[y_{ij}(t) - x_{ij}(t)] + c_2 r_{2j}(t)[\hat{y}_{ij}(t) - x_{ij}(t)]$, with $v_{ij}(0) = 0$
- $w$ is the inertia weight
- $c_1$, $c_2$ are positive acceleration coefficients
- $r_{1j}(t), r_{2j}(t) \sim U(0, 1)$; note that a random number is sampled for each dimension
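A sketch of the per-dimension inertia-weight velocity update, assuming commonly used values for $w$, $c_1$, and $c_2$ (not necessarily those of the study); a separate random number is drawn for every particle and dimension, as the slide notes.

```python
import numpy as np

w, c1, c2 = 0.729, 1.494, 1.494     # common inertia/acceleration settings (assumed)

def velocity_update(v, x, y, y_hat):
    """Inertia-weight velocity update for one swarm.

    v     : (n_particles, n_dims) current velocities
    x     : (n_particles, n_dims) current positions
    y     : (n_particles, n_dims) personal best positions
    y_hat : (n_dims,) neighborhood (gbest) best position
    """
    r1 = np.random.rand(*x.shape)   # fresh random number per particle and dimension
    r2 = np.random.rand(*x.shape)
    return w * v + c1 * r1 * (y - x) + c2 * r2 * (y_hat - x)
```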
Particle Swarm Optimization: PSO Algorithm
Create and initialize an n_x-dimensional swarm, S;
repeat
    for each particle i = 1, ..., S.n_s do
        if f(S.x_i) < f(S.y_i) then
            S.y_i = S.x_i;
        end
        for each particle î with particle i in its neighborhood do
            if f(S.y_i) < f(S.ŷ_î) then
                S.ŷ_î = S.y_i;
            end
        end
    end
    for each particle i = 1, ..., S.n_s do
        update the velocity and position;
    end
until stopping condition is true;
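A compact, runnable rendering of the synchronous gbest PSO loop above, shown here minimizing a simple sphere function; this is a sketch under assumed default parameters, omitting boundary handling and other refinements.

```python
import numpy as np

def pso(f, n_dims, n_particles=30, iters=200, w=0.729, c1=1.494, c2=1.494,
        x_min=-5.0, x_max=5.0):
    x = np.random.uniform(x_min, x_max, (n_particles, n_dims))
    v = np.zeros_like(x)
    y = x.copy()                                    # personal best positions
    fy = np.array([f(p) for p in y])
    y_hat = y[np.argmin(fy)].copy()                 # global (gbest) best position

    for _ in range(iters):
        fx = np.array([f(p) for p in x])
        improved = fx < fy                          # update personal bests
        y[improved], fy[improved] = x[improved], fx[improved]
        y_hat = y[np.argmin(fy)].copy()             # update neighborhood best

        r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
        v = w * v + c1 * r1 * (y - x) + c2 * r2 * (y_hat - x)
        x = x + v
    return y_hat, fy.min()

best, best_f = pso(lambda p: float(np.sum(p ** 2)), n_dims=10)
```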
Particle Swarm Optimization: Quantum-Inspired PSO (QSO)
- Developed to find and track an optimum in changing search landscapes
- Based on the quantum model of an atom, where orbiting electrons are replaced by a quantum cloud, which is a probability distribution governing the position of each electron
- The swarm contains:
  - neutral particles, following standard PSO updates
  - charged, or quantum, particles, randomly placed within a multi-dimensional sphere:
    $x_i(t+1) = \begin{cases} x_i(t) + v_i(t+1) & \text{if } Q_i = 0 \\ B_{\hat{y}}(r_{cloud}) & \text{if } Q_i \neq 0 \end{cases}$
    where $B_{\hat{y}}(r_{cloud})$ denotes a random position inside a hypersphere of radius $r_{cloud}$ centered at the neighborhood best $\hat{y}$
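A sketch of the QSO position update: neutral particles follow the standard update, while quantum particles are resampled inside a hypersphere of radius r_cloud around the neighborhood best. The particular sampling scheme below (uniform random direction with a uniform radius) is an assumption; the slides do not specify it.

```python
import numpy as np

def qso_positions(x, v, y_hat, is_quantum, r_cloud=0.5):
    """Position update for a QSO swarm.

    Neutral particles (is_quantum == False) move as x + v; quantum particles
    are placed at random inside a ball of radius r_cloud centered on the
    neighborhood best y_hat.
    """
    n_particles, n_dims = x.shape
    new_x = x + v                                       # neutral update

    n_quantum = int(np.sum(is_quantum))
    if n_quantum > 0:
        directions = np.random.randn(n_quantum, n_dims)
        directions /= np.linalg.norm(directions, axis=1, keepdims=True)
        radii = r_cloud * np.random.rand(n_quantum, 1)
        new_x[is_quantum] = y_hat + radii * directions  # quantum cloud samples
    return new_x
```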
Particle Swarm Optimization: Cooperative PSO
For large-scale optimization problems, a divide-and-conquer approach addresses the curse of dimensionality:
- Each particle is split into K separate parts of smaller dimension
- Each part is then optimized using a separate sub-swarm
- If K = n_x, each dimension is optimized by a separate sub-swarm
- The cooperative quantum PSO (CQSO) uses QSO in the sub-swarms
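A sketch of the cooperative decomposition: the n_x decision variables are partitioned into K parts, each assigned to its own sub-swarm, and a context vector (built from the other sub-swarms' current bests) is used to evaluate one part at a time. The helper names and the context-vector mechanism as written here are illustrative, not taken verbatim from the study.

```python
import numpy as np

def split_dimensions(n_x, K):
    """Partition the dimension indices 0..n_x-1 into K roughly equal parts."""
    return np.array_split(np.arange(n_x), K)

def evaluate_part(f, context, part_indices, part_values):
    """Evaluate a sub-swarm candidate by plugging it into the context vector."""
    candidate = context.copy()
    candidate[part_indices] = part_values
    return f(candidate)

# Example: 12 decision variables split across K = 3 sub-swarms of 4 dimensions each.
parts = split_dimensions(12, 3)
context = np.zeros(12)                      # assembled from each sub-swarm's best part
err = evaluate_part(lambda w: float(np.sum(w ** 2)), context, parts[0],
                    np.random.rand(len(parts[0])))
```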
PSO Training of NNs
When using PSO to train a NN:
- each particle represents the weights and biases of one NN
- the objective function is a cost function, e.g. the sum-squared error (SSE)
- to prevent hidden unit saturation, use ReLU activation functions in the hidden units
- any activation function can be used in the output units
For non-stationary time series prediction:
- cooperative PSO was used, with QSO in the sub-swarms
- the RNNs used the modified hyperbolic tangent, $f(net) = 1.7159 \tanh(\frac{1}{3} net)$
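A sketch of how one particle decodes into the weights and biases of a single-hidden-layer feedforward NN with ReLU hidden units and the SSE cost mentioned above; the layer sizes, the linear output units, and the helper names are illustrative assumptions.

```python
import numpy as np

def particle_to_nn(particle, n_in, n_hidden, n_out):
    """Decode a flat particle vector into the weights/biases of a 1-hidden-layer NN."""
    i = 0
    W1 = particle[i:i + n_in * n_hidden].reshape(n_in, n_hidden);   i += n_in * n_hidden
    b1 = particle[i:i + n_hidden];                                  i += n_hidden
    W2 = particle[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = particle[i:i + n_out]
    return W1, b1, W2, b2

def sse(particle, X, T, n_hidden):
    """Sum-squared error of the decoded NN: ReLU hidden units, linear outputs (assumed)."""
    n_in, n_out = X.shape[1], T.shape[1]
    W1, b1, W2, b2 = particle_to_nn(particle, n_in, n_hidden, n_out)
    H = np.maximum(0.0, X @ W1 + b1)         # ReLU hidden layer
    Y = H @ W2 + b2                          # linear output units
    return float(np.sum((Y - T) ** 2))

# Usage: a 5-4-1 network needs 5*4 + 4 + 4*1 + 1 = 29 particle dimensions.
rng = np.random.default_rng(0)
X, T = rng.standard_normal((50, 5)), rng.standard_normal((50, 1))
particle = rng.standard_normal(29)
print(sse(particle, X, T, n_hidden=4))
```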
Control Parameters
Dynamic Scenarios
Performance Measure
- Used the collective mean error,
  $F_{mean} = \frac{\sum_{t=1}^{T} F(t)}{T}$
  where $F(t)$ is the MSE at time $t$
- Number of independent runs: 30
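A sketch of the collective mean error, assuming F(t) is the MSE recorded at each of the T measurement points of one run; averaging over the 30 independent runs would be done outside this helper. The example values are illustrative.

```python
import numpy as np

def collective_mean_error(mse_per_step):
    """Collective mean error: the average of the MSE values F(t), t = 1..T."""
    return float(np.mean(mse_per_step))

# Example: MSE recorded at each of T = 4 time steps in one run (illustrative values).
print(collective_mean_error([0.12, 0.10, 0.09, 0.11]))
```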
Results: MG (Mackey Glass)
Results: HIT (Hourly Internet Traffic)
Results: DMT (Daily Minimum Temperature)
Results: SAM (Sunspot Annual Measure)
Results: LM (Logistic Map)
Results: AWS (Australian Wine Sales)
Results: AIP (International Airline Passengers)
Results: USD (US Accidental Death)