Model Structure Selection Tartu 2008
Neuron • Takes a number of inputs • Processes them (weighted sum) • Passes the result through an activation function (image: http://www.bordalierinstitute.com/images/Neuron.JPG)
Neural network • Simple neural network • Fully connected two-layer feedforward network • Input layer • Hidden layer • Output layer
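As an illustration, a minimal NumPy sketch of a single neuron and of a fully connected two-layer feedforward network; the sizes, weights and the tanh activation are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def neuron(x, w, b):
    """A single neuron: weighted sum of the inputs followed by an activation."""
    return np.tanh(w @ x + b)

def two_layer_network(x, W1, b1, W2, b2):
    """Fully connected two-layer feedforward network:
    input layer -> hidden layer (tanh) -> linear output layer."""
    h = np.tanh(W1 @ x + b1)   # hidden layer activations
    return W2 @ h + b2         # output layer (linear)

# Example with arbitrary weights: 3 inputs, 5 hidden units, 1 output
rng = np.random.default_rng(0)
x = rng.normal(size=3)
W1, b1 = rng.normal(size=(5, 3)), np.zeros(5)
W2, b2 = rng.normal(size=(1, 5)), np.zeros(1)
print(two_layer_network(x, W1, b1, W2, b2))
```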
Recurrent networks • Outputs of the hidden units are fed back as inputs to the network • Through a time delay • The networks considered later do not have internal feedback
System identification • Goals – High degree of automation – Numerical reliability – Computational efficiency • Experiment • Select model structure • Estimate model • Validate model
Experiment • Purpose – Data collection – Varying the input(s) to observe the impact on the outputs • Main issues – Choice of sampling frequency – Design of a suitable input signal – Preprocessing of data (noise removal, nonlinearity tests, disturbances)
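As a concrete illustration of the experiment step, the sketch below excites a hypothetical first-order discrete-time system with a random binary input and records the input/output data; both the system and the signal design are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical true system used only for illustration:
# y(t) = 0.8*y(t-1) + 0.5*u(t-1) + white measurement noise
def simulate_system(u, noise_std=0.05):
    y = np.zeros(len(u))
    for t in range(1, len(u)):
        y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1] + noise_std * rng.normal()
    return y

# Random binary excitation signal (a simple, common choice of input design)
N = 500
u = rng.choice([-1.0, 1.0], size=N)
y = simulate_system(u)

# The (u, y) pairs form the data set used for estimation and validation
print(u[:5], y[:5])
```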
Estimate & validate model • In the neural network community, estimation is called training or learning • Picking the best model within the chosen structure • Checking whether the model meets the requirements
Used terms • Poles and zeros – Consequences • Linear model structures • Linearity: y(t) = G(q⁻¹)u(t) + H(q⁻¹)e(t) • One-step-ahead prediction: ŷ(t|t-1) = H⁻¹(q⁻¹)G(q⁻¹)u(t) + [1 - H⁻¹(q⁻¹)]y(t) • True system: y(t) = G₀(q⁻¹)u(t) + H₀(q⁻¹)e₀(t)
Used terms vol 2 • Model structure: y(t) = G(q⁻¹,θ)u(t) + H(q⁻¹,θ)e(t) • Predictor form: ŷ(t|θ) = H⁻¹(q⁻¹,θ)G(q⁻¹,θ)u(t) + [1 - H⁻¹(q⁻¹,θ)]y(t)
Used terms vol 3 • Basic requirement: the model structure must be able to describe the true system • A model is simply a particular choice of parameter vector, say θ*
Another form of a model • Rewriting the general model structure in (pseudo-)linear regression form: ŷ(t|θ) = φᵀ(t,θ)θ, where φ(t,θ) is the regression vector and θ the parameter vector
The Finite Impulse Response (FIR) model structure • Simplest type, choose: G(q⁻¹,θ) = B(q⁻¹), H(q⁻¹,θ) = 1 • The predictor: ŷ(t|θ) = B(q⁻¹)u(t) = φᵀ(t)θ with φ(t) = [u(t-1), …, u(t-m)]ᵀ • The parameter vector: θ = [b₁, …, b_m]ᵀ
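A minimal numerical sketch of the FIR predictor; the order m = 3 and the coefficient values are arbitrary illustrative choices.

```python
import numpy as np

def fir_predict(u, b):
    """FIR predictor: yhat(t) = b1*u(t-1) + ... + bm*u(t-m)."""
    m = len(b)
    yhat = np.zeros(len(u))
    for t in range(m, len(u)):
        phi = u[t - m:t][::-1]      # regression vector [u(t-1), ..., u(t-m)]
        yhat[t] = b @ phi           # inner product phi^T * theta
    return yhat

u = np.random.default_rng(2).normal(size=20)
b = np.array([0.5, 0.3, 0.1])       # illustrative parameter vector
print(fir_predict(u, b)[:6])
```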
ARX model structure (AutoRegressive, eXternal input) • The model corresponds to the choice: G(q⁻¹,θ) = B(q⁻¹)/A(q⁻¹), H(q⁻¹,θ) = 1/A(q⁻¹) • The predictor takes the form: ŷ(t|θ) = B(q⁻¹)u(t) + [1 - A(q⁻¹)]y(t) = φᵀ(t)θ with φ(t) = [-y(t-1), …, -y(t-n), u(t-1), …, u(t-m)]ᵀ • The parameter vector is: θ = [a₁, …, a_n, b₁, …, b_m]ᵀ
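A sketch of ARX one-step-ahead prediction and least-squares estimation; the first-order data-generating system and the model orders n = m = 1 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative data from an assumed first-order system
u = rng.choice([-1.0, 1.0], size=300)
y = np.zeros_like(u)
for t in range(1, len(u)):
    y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1] + 0.05 * rng.normal()

# ARX(n=1, m=1): regression vector phi(t) = [-y(t-1), u(t-1)]^T
Phi = np.column_stack([-y[:-1], u[:-1]])   # one row per time step
Y = y[1:]

# Least-squares estimate of theta = [a1, b1]^T
theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
a1, b1 = theta
print("a1 =", a1, "b1 =", b1)              # expect roughly a1 ~ -0.8, b1 ~ 0.5

# One-step-ahead predictions
yhat = Phi @ theta
```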
ARMAX model structure (AutoRegressive, Moving Average, eXternal input) • Corresponds to the choice: G(q⁻¹,θ) = B(q⁻¹)/A(q⁻¹), H(q⁻¹,θ) = C(q⁻¹)/A(q⁻¹) • The predictor takes the form: ŷ(t|θ) = φᵀ(t,θ)θ with φ(t,θ) = [-y(t-1), …, -y(t-n), u(t-1), …, u(t-m), ε(t-1,θ), …, ε(t-k,θ)]ᵀ, where ε(t,θ) = y(t) - ŷ(t|θ) • The parameter vector is: θ = [a₁, …, a_n, b₁, …, b_m, c₁, …, c_k]ᵀ
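A sketch of the ARMAX one-step-ahead predictor in pseudo-linear regression form for given parameters; the first-order polynomials and the parameter values are illustrative assumptions. The point is that the past prediction errors ε must be computed recursively.

```python
import numpy as np

def armax_predict(u, y, a1, b1, c1):
    """ARMAX(1,1,1) predictor:
    yhat(t) = -a1*y(t-1) + b1*u(t-1) + c1*eps(t-1),
    where eps(t) = y(t) - yhat(t) is the prediction error."""
    yhat = np.zeros(len(y))
    eps = np.zeros(len(y))
    for t in range(1, len(y)):
        yhat[t] = -a1 * y[t - 1] + b1 * u[t - 1] + c1 * eps[t - 1]
        eps[t] = y[t] - yhat[t]
    return yhat, eps

rng = np.random.default_rng(4)
u = rng.normal(size=10)
y = rng.normal(size=10)                  # placeholder data for illustration
yhat, eps = armax_predict(u, y, a1=-0.8, b1=0.5, c1=0.3)
print(yhat)
```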
Output error model structure (OE) • Used when the only noise is white measurement noise • Corresponds to the choice: G(q⁻¹,θ) = B(q⁻¹)/F(q⁻¹), H(q⁻¹,θ) = 1
Output error model structure continued • Regression vector: φ(t,θ) = [u(t-1), …, u(t-m), -ŷ(t-1|θ), …, -ŷ(t-n|θ)]ᵀ • Parameter vector: θ = [b₁, …, b_m, f₁, …, f_n]ᵀ
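A sketch of the OE predictor for a first-order case with illustrative parameter values; the key point is that the predictor is a pure simulation driven only by past inputs and past predictions.

```python
import numpy as np

def oe_predict(u, b1, f1):
    """First-order OE predictor: yhat(t) = b1*u(t-1) - f1*yhat(t-1).
    Past *predictions* (not measurements) appear in the regression vector."""
    yhat = np.zeros(len(u))
    for t in range(1, len(u)):
        yhat[t] = b1 * u[t - 1] - f1 * yhat[t - 1]
    return yhat

u = np.random.default_rng(5).choice([-1.0, 1.0], size=15)
print(oe_predict(u, b1=0.5, f1=-0.8))    # illustrative parameters
```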
The State Space Innovations Form (SSIF) • Widely used alternative • Assume that the system is described by: x(t+1) = A(θ)x(t) + B(θ)u(t) + K(θ)e(t), y(t) = C(θ)x(t) + e(t) • Optimal one-step-ahead predictor: x̂(t+1|θ) = A(θ)x̂(t|θ) + B(θ)u(t) + K(θ)[y(t) - C(θ)x̂(t|θ)], ŷ(t|θ) = C(θ)x̂(t|θ)
SSIF • The predictor is also known as the Kalman filter • The matrix K(θ) is known as the Kalman gain • Poles of the predictor: the eigenvalues of A(θ) - K(θ)C(θ)
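A minimal sketch of the SSIF/Kalman predictor recursion and of its poles for an assumed second-order example; all matrix values are illustrative assumptions.

```python
import numpy as np

def ssif_predict(u, y, A, B, C, K):
    """State space innovations form predictor:
    xhat(t+1) = A xhat(t) + B u(t) + K (y(t) - C xhat(t)),  yhat(t) = C xhat(t)."""
    n = A.shape[0]
    xhat = np.zeros(n)
    yhat = np.zeros(len(y))
    for t in range(len(y)):
        yhat[t] = C @ xhat
        xhat = A @ xhat + B * u[t] + K * (y[t] - yhat[t])
    return yhat

# Illustrative second-order example
A = np.array([[0.9, 0.1], [0.0, 0.7]])
B = np.array([1.0, 0.5])
C = np.array([1.0, 0.0])
K = np.array([0.3, 0.1])

# Poles of the predictor = eigenvalues of A - K C
print(np.linalg.eigvals(A - np.outer(K, C)))

rng = np.random.default_rng(6)
u, y = rng.normal(size=20), rng.normal(size=20)   # placeholder signals
print(ssif_predict(u, y, A, B, C, K)[:5])
```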
Nonlinear Model Structures Based on Neural Networks • Selecting the model structure is more complicated • Family of model structures -> MLP networks – With this choice there are two issues • Inputs to the network • Internal network architecture – An often used approach is to reuse the input (regressor) structures from the linear models
Nonlinear Model Structures Based on Neural Networks • Structural decisions remain reasonable to handle • The resulting models are suitable for designing control systems
NNFIR and NNARX • The predictors are always stable, since there is no feedback from the prediction to the regressors • This is important, as the stability issue is much more complex for nonlinear systems than for linear ones • Best suited when the system is deterministic and the noise level is insignificant
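A sketch of an NNARX predictor: the ARX regression vector [y(t-1), u(t-1)] is fed through a small MLP instead of a linear map. The network size, weights and model orders are illustrative assumptions, and the network is untrained.

```python
import numpy as np

rng = np.random.default_rng(7)

# Untrained illustrative MLP: 2 regressors -> 4 hidden tanh units -> 1 output
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def mlp(phi):
    return (W2 @ np.tanh(W1 @ phi + b1) + b2)[0]

def nnarx_predict(u, y):
    """NNARX(1,1): yhat(t) = g([y(t-1), u(t-1)]), g = MLP.
    Only measured past outputs enter the regressors, so the predictor is stable."""
    yhat = np.zeros(len(y))
    for t in range(1, len(y)):
        phi = np.array([y[t - 1], u[t - 1]])
        yhat[t] = mlp(phi)
    return yhat

u, y = rng.normal(size=10), rng.normal(size=10)   # placeholder data
print(nnarx_predict(u, y))
```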
NNARMAX • Despite using a feedforward network, the predictor has feedback • Regressors: past outputs, past inputs and past prediction errors, e.g. φ(t,θ) = [y(t-1), …, y(t-n), u(t-1), …, u(t-m), ε(t-1,θ), …, ε(t-k,θ)]ᵀ • The predictor is therefore a recurrent network • Stability is only a local property
NNOE • Some regressors are past predictions of the output • Has the same stability problems as NNARMAX
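A sketch of an NNOE predictor using the same illustrative untrained MLP idea; past predictions (rather than measurements) are fed back, which is what makes the predictor recurrent and only locally stable.

```python
import numpy as np

rng = np.random.default_rng(8)
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)     # illustrative untrained weights
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def mlp(phi):
    return (W2 @ np.tanh(W1 @ phi + b1) + b2)[0]

def nnoe_predict(u):
    """NNOE(1,1): yhat(t) = g([yhat(t-1), u(t-1)]).
    The prediction is fed back into the regressor, so the predictor has feedback."""
    yhat = np.zeros(len(u))
    for t in range(1, len(u)):
        phi = np.array([yhat[t - 1], u[t - 1]])
        yhat[t] = mlp(phi)
    return yhat

u = rng.normal(size=10)
print(nnoe_predict(u))
```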
NNSSIF • Like NNOE and NNARMAX, the predictor contains feedback • The same problems occur for NNSSIF as for SSIF – A possible remedy: use two separate networks
Stability • Very important in control theory • It is necessary that the control system is stable • Stability should also be monitored during training • Definitions of stability follow
Asymptotic stability • An equilibrium point is asymptotically stable if it is stable and, in addition, trajectories that start sufficiently close to it converge to it as t -> ∞
Exponential stability • An equilibrium point is exponentially stable if trajectories that start sufficiently close to it converge to it at least at an exponential rate, e.g. ||x(t)|| ≤ α||x(0)||e^(-βt) for some α, β > 0