

  1. In Situ Adaptive Tabulation for Real-Time Control J. D. Hedengren T. F. Edgar The University of Texas at Austin 2004 American Control Conference Boston, MA

  2. Outline • Model reduction and computational reduction • Introduction to ISAT • ISAT theory • Application #1: Combined Approach • Application #2: ISAT vs. Neural Nets • Conclusions

  3. Model Reduction • Optimally reduce the number of model variables • Linear combinations of states that retain the most important dynamics • Methods – Proper Orthogonal Decomposition (or PCA) – Balanced Covariance Matrices

  4. Model Reduction
  Original ODE model:
  $\dot{x} = f(x, u), \qquad y = h(x)$  (1)
  Determine a similarity transform $T$ to optimally reduce the model states:
  $\dot{x} = T^{-1}\, T f(T^{-1}(Tx), u), \qquad y = h(T^{-1}(Tx))$  (2a)
  With transformed states $\bar{x} = Tx$:
  $\dot{\bar{x}} = \bar{f}(\bar{x}, u), \qquad y = \bar{h}(\bar{x})$  (2b)
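One way to construct a transform like $T$ in (2b) is Proper Orthogonal Decomposition, one of the methods listed on the previous slide. A minimal sketch, assuming random placeholder snapshot data (a real application would use simulated state trajectories of the 32-state model):

```python
import numpy as np

# Hypothetical snapshot matrix: each column is the 32-dimensional state
# vector at one sample time (random data here, for illustration only).
rng = np.random.default_rng(0)
X = rng.standard_normal((32, 200))

# POD/PCA: the left singular vectors, ranked by singular value, give the
# directions that capture the most variance in the state trajectories.
U, s, _ = np.linalg.svd(X, full_matrices=False)

k = 3                      # number of retained transformed states
T = U[:, :k].T             # projection matrix, so x_bar = T @ x

x = rng.standard_normal(32)    # a full state vector
x_bar = T @ x                  # 3 transformed (reduced) states
x_approx = T.T @ x_bar         # lift back to the full 32-state space
```

Because the columns of `U` are orthonormal, `T.T` serves as the (pseudo-)inverse of `T` on the retained subspace.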

  5. Model Reduction
  Binary distillation model reduction shows the relative weighting of the 32 original states in the top 3 transformed states.
  (Column diagram: inputs RR and Feed; states $x_1, \ldots, x_{32}$; outputs Distillate and Bottoms.)
  $\begin{bmatrix} \bar{x}_1 \\ \bar{x}_2 \\ \bar{x}_3 \end{bmatrix} = \begin{bmatrix} 9.1 & \cdots & 0.015 \\ 49.5 & \cdots & -0.060 \\ -4.9 & \cdots & -0.202 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_{32} \end{bmatrix}$  (3)

  6. Model Reduction
  Truncation (the discarded transformed states are held constant):
  $\begin{bmatrix} \dot{\bar{x}}_1 \\ \dot{\bar{x}}_2 \\ \dot{\bar{x}}_3 \\ \dot{\bar{x}}_4 \\ \vdots \\ \dot{\bar{x}}_{32} \end{bmatrix} = \begin{bmatrix} \bar{f}_1(\bar{x}, u) \\ \bar{f}_2(\bar{x}, u) \\ \bar{f}_3(\bar{x}, u) \\ 0 \\ \vdots \\ 0 \end{bmatrix}$
  Residualization (the discarded derivatives are set to zero, leaving algebraic equations):
  $\begin{bmatrix} \dot{\bar{x}}_1 \\ \dot{\bar{x}}_2 \\ \dot{\bar{x}}_3 \\ 0 \\ \vdots \\ 0 \end{bmatrix} = \begin{bmatrix} \bar{f}_1(\bar{x}, u) \\ \bar{f}_2(\bar{x}, u) \\ \bar{f}_3(\bar{x}, u) \\ \bar{f}_4(\bar{x}, u) \\ \vdots \\ \bar{f}_{32}(\bar{x}, u) \end{bmatrix}$
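For a linear reduced-order model the two options above can be compared in a few lines. A sketch with illustrative matrix values (not from the slides); the key property is that residualization preserves the steady-state gain:

```python
import numpy as np

# Partition a transformed linear model x_dot = A x + B u into retained
# states (block 1) and discarded states (block 2).
A = np.array([[-1.0,  0.5,   0.1],
              [ 0.2, -2.0,   0.3],
              [ 0.1,  0.1, -10.0]])   # illustrative values only
B = np.array([[1.0], [0.5], [2.0]])
n1 = 2                                # keep 2 states, discard 1 fast state

A11, A12 = A[:n1, :n1], A[:n1, n1:]
A21, A22 = A[n1:, :n1], A[n1:, n1:]
B1, B2 = B[:n1], B[n1:]

# Truncation: simply drop the discarded states.
A_trunc, B_trunc = A11, B1

# Residualization: set the discarded derivatives to zero, solve the
# resulting algebraic equations, and substitute back.
A22_inv = np.linalg.inv(A22)
A_res = A11 - A12 @ A22_inv @ A21
B_res = B1 - A12 @ A22_inv @ B2

# Steady-state values of the retained states for u = 1:
gain_full = -np.linalg.inv(A) @ B          # full model
gain_res = -np.linalg.inv(A_res) @ B_res   # matches the first 2 rows above
```

Truncation is exact at high frequencies, residualization at steady state, which is why residualization is often preferred for control-oriented models.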

  7. Computational Reduction • Retain all of the dynamics • Storage and retrieval to reduce the computational cost • Methods – Artificial neural networks – In situ adaptive tabulation (ISAT)

  8. Combined Approach
  • Combined approach for NMPC – Model reduction first – Computational reduction second
  First Principles Model → Balanced Covariance Matrices → Reduced model → ISAT storage and retrieval of reduced model integrations

  9. ISAT Introduction
  The query point is $\varphi_0 = \begin{bmatrix} x \\ u \end{bmatrix}$. The desired integration result $\varphi^f$ is approximated from a nearby stored record $\varphi_0^{ISAT}$ with stored result $\varphi^f_{ISAT}$:
  $\varphi^f \approx \varphi^f_{ISAT} + A\,\delta\varphi_0, \qquad \delta\varphi_0 = \varphi_0 - \varphi_0^{ISAT}$
  where the term $A\,\delta\varphi_0$ corrects for the offset between the query and the record, leaving a small approximation error.
  Fig. 1. Approximation of the desired integration final state with a nearby ISAT record.
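The retrieval step sketched in Fig. 1 is a first-order correction around a stored record. A minimal sketch with our own names (the record layout is an assumption, not the paper's data structure); a linear toy map is used so the stored sensitivity is exact and the result is checkable:

```python
import numpy as np

def isat_approx(phi0, record):
    """ISAT retrieval: phif ~ phif_rec + A (phi0 - phi0_rec)."""
    phi0_rec, phif_rec, A = record
    dphi0 = phi0 - phi0_rec          # offset between query and record
    return phif_rec + A @ dphi0      # first-order approximation

# Toy "integration": a known linear map, so the sensitivity matrix of the
# record is exactly M and the approximation has zero error.
M = np.array([[0.9, 0.1],
              [0.0, 0.8]])
integrate = lambda phi0: M @ phi0

phi0_rec = np.array([1.0, 2.0])
record = (phi0_rec, integrate(phi0_rec), M)   # (query, result, sensitivity)

phi0 = np.array([1.1, 1.9])                   # a nearby query
phif = isat_approx(phi0, record)
```

For a nonlinear model the approximation error grows with the offset, which is exactly what the region-of-accuracy tests in the later slides control.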

  10. ISAT Search
  • Binary Tree Architecture – Search times are O(log₂(N)) compared with O(N) for a sequential search
  The cutting plane between two records $\varphi_1$ and $\varphi_2$ is defined by
  $v = \varphi_2 - \varphi_1, \qquad \alpha = v^T \left( \frac{\varphi_2 + \varphi_1}{2} \right)$
  A query $\varphi$ descends to the $\varphi_2$ side when $v^T \varphi > \alpha$ and to the $\varphi_1$ side when $v^T \varphi < \alpha$.

  11. Binary Trees
  Branches store cutting planes $v^T \varphi = \alpha$; leaves store the records $\varphi_1, \varphi_2, \ldots$
  Fig. 2. An illustration of the binary tree structure in the cutting plane format (on the left) and the tree format (on the right).
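The cutting-plane tree of the last two slides can be sketched directly. This is our own minimal reconstruction (class and field names are not from the paper): a leaf holds a record, a branch holds $(v, \alpha)$, and inserting a new record splits a leaf with the perpendicular bisector of the two points:

```python
import numpy as np

class Node:
    def __init__(self, record=None):
        self.record = record          # leaf: stored query point phi
        self.v = self.alpha = None    # branch: cutting plane v^T phi = alpha
        self.left = self.right = None

def lookup(node, phi):
    """Descend to a leaf: right when v^T phi > alpha, else left."""
    while node.record is None:
        node = node.right if node.v @ phi > node.alpha else node.left
    return node.record

def insert(leaf, phi_new):
    """Split a leaf into a branch whose cutting plane is the perpendicular
    bisector: v = phi_new - phi_old, alpha = v^T (phi_new + phi_old) / 2."""
    phi_old = leaf.record
    leaf.record = None
    leaf.v = phi_new - phi_old
    leaf.alpha = leaf.v @ (phi_new + phi_old) / 2.0
    leaf.left = Node(record=phi_old)
    leaf.right = Node(record=phi_new)

root = Node(record=np.array([0.0, 0.0]))
insert(root, np.array([2.0, 0.0]))             # grow: one branch, two leaves
nearest = lookup(root, np.array([1.9, 0.5]))   # lands on the [2, 0] leaf
```

With N records the descent visits O(log₂(N)) branches, versus O(N) comparisons for a sequential scan, matching the slide's complexity claim.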

  12. Binary Tree Growth
  Fig. 3. Binary tree growth. A tree with one branch and two leaves ($\varphi_1$, $\varphi_2$) is grown to include another leaf ($\varphi_3$).

  13. Binary Trees • To increase the accuracy of the binary tree search, multiple binary trees are searched. • This increases the probability of finding a better record. • The number of binary trees is a tuning parameter that balances search speed against search accuracy.

  14. ISAT Integration
  • Scenario #1: Inside the region of accuracy
  $(\varphi - \varphi_1)^T M (\varphi - \varphi_1) \le \varepsilon_{tol}$

  15. ISAT Integration
  • Scenario #2: Outside the region of accuracy but within the error tolerance
  $(\varphi - \varphi_1)^T M (\varphi - \varphi_1) > \varepsilon_{tol}$
  Compute $M_{new}$ so that the new region of accuracy is a symmetric, minimum-volume ellipsoid that includes $\varphi$.

  16. ISAT Integration
  • Scenario #3: Outside the region of accuracy and outside the error tolerance
  Define the cutting plane
  $v = \varphi - \varphi_1, \qquad \alpha = v^T \left( \frac{\varphi + \varphi_1}{2} \right)$
  Find a conservative estimate for the region of accuracy around $\varphi$.
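The three scenarios above can be sketched as one query function. Record fields and names here are our own, and the growth step is a simple uniform rescale rather than the paper's symmetric minimum-volume ellipsoid update:

```python
import numpy as np

def isat_query(phi, rec, integrate, eps_tol):
    d = phi - rec["phi"]
    # Scenario 1: inside the region of accuracy -> retrieve the linear
    # approximation without integrating.
    if d @ rec["M"] @ d <= eps_tol:
        return "retrieve", rec["phif"] + rec["A"] @ d
    # Otherwise integrate exactly and check the true approximation error.
    phif_exact = integrate(phi)
    err = np.linalg.norm(rec["phif"] + rec["A"] @ d - phif_exact)
    if err <= eps_tol:
        # Scenario 2: grow the region of accuracy to include phi. (A
        # placeholder rescale; the paper computes a symmetric
        # minimum-volume ellipsoid M_new.)
        rec["M"] = rec["M"] * eps_tol / (d @ rec["M"] @ d)
        return "grow", phif_exact
    # Scenario 3: add a new record behind a cutting plane (the tree
    # insertion itself is omitted here).
    return "add", phif_exact

# Toy linear "integration" so the stored sensitivity A is exact.
M_lin = np.array([[0.9, 0.1], [0.0, 0.8]])
integrate = lambda p: M_lin @ p
rec = {"phi": np.zeros(2), "phif": np.zeros(2), "A": M_lin, "M": np.eye(2)}

s1, _ = isat_query(np.array([0.05, 0.0]), rec, integrate, 0.01)  # retrieve
s2, _ = isat_query(np.array([0.50, 0.0]), rec, integrate, 0.01)  # grow
```

Because the toy map is linear, the true error is always zero, so only retrievals and growths occur; with a nonlinear model the third branch adds new records, which is the behavior shown later in Fig. 9.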

  17. Application #1: Binary Distillation
  32-state ODE model of binary distillation → Balanced Covariance Matrices → 5-state reduced model → ISAT storage and retrieval of integrations
  Fig. 4. Model and computational reduction flowchart.

  18. Closed-loop Response
  Fig. 5. Closed-loop response of the distillate composition ($x_A$, roughly 0.91–0.94) to a set-point change over 25 min, comparing nonlinear MPC with ISAT and 5 states, nonlinear MPC with 5 states, nonlinear MPC with 32 states, and linear MPC.

  19. CPU Times
  Fig. 6. Speed-up factor for each of the optimizations shown in Fig. 5 (curves: 5 states/ISAT, 5 states, 32 states, 32 states/Linear). The number above each curve indicates the average optimization CPU time on a 2 GHz processor: 0.26 s, 0.77 s, 9.3 s, and 22.2 s.

  20. Application #2: ISAT vs. Neural Net
  • Dual CSTR model: two reactors in series (volumes $V_1$, $V_2$; temperatures $T_1$, $T_2$; concentrations $C_{A1}$, $C_{A2}$) with a first-order reaction A → B and cooling rate Q.
  Fig. 7. Diagram of two CSTRs in series with a first-order reaction. The manipulated variable is the cooling rate to the first CSTR.

  21. Artificial Neural Network
  The network relates 7 inputs to 6 outputs through two layers: Layer 1 is a 20-neuron hyperbolic tangent sigmoid transfer function and Layer 2 is a 6-neuron linear transfer function.
  Fig. 8. Neural net with one hidden layer and one output layer. The hidden layer is a hyperbolic tangent function and the output layer is a linear function. This neural net relates 7 inputs to 6 outputs.
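The forward pass of this 7 → 20 → 6 architecture is short enough to sketch. The weights below are random placeholders (in the paper they would come from training on model integrations):

```python
import numpy as np

# Layer sizes from the slide: 7 inputs, 20 tanh hidden neurons, 6 outputs.
rng = np.random.default_rng(1)
W1, b1 = rng.standard_normal((20, 7)), rng.standard_normal(20)
W2, b2 = rng.standard_normal((6, 20)), rng.standard_normal(6)

def net(u):
    h = np.tanh(W1 @ u + b1)   # hidden layer: hyperbolic tangent sigmoid
    return W2 @ h + b2         # output layer: linear transfer function

y = net(np.zeros(7))           # maps a 7-vector to a 6-vector
```

Unlike ISAT, a trained net of this form has no built-in error control: queries outside the training domain extrapolate silently, which is the failure mode the next slides compare.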

  22. Open-loop Response
  Fig. 9. Open-loop temperature response (roughly 340–460 K over 10 min) comparing the actual model, the neural net, and ISAT, with markers for ISAT retrievals, growths, and additions. The error control of ISAT indicates that additional records must be added, thereby avoiding extrapolation error.

  23. Closed-loop Response #1
  Fig. 10. Small closed-loop set-point change within the training domain: reactor #2 temperature (roughly 444–454 K over 2 min) for the set point, 6 states/ISAT, 6 states, and 6 states/neural net.

  24. Closed-loop Response #2
  Fig. 11. Large closed-loop set-point change outside of the training domain: reactor #2 temperature (roughly 435–455 K over 2 min) for the set point, 6 states/ISAT, 6 states, and 6 states/neural net.

  25. Summary and Conclusions • The combined approach applies model reduction first, followed by computational reduction • ISAT is a storage and retrieval method • For the 32-state binary distillation model, the CPU time for NMPC is reduced by a factor of 85

  26. Summary and Conclusions • ISAT indicates when the retrieval is outside of the storage domain • ISAT incorporates automatic error control to avoid extrapolation errors
