

  1. Fundamental Limitation of Noise Cancellation Imposed by Causality, Stability, and Channel Capacity. Munther A. Dahleh, Laboratory for Information and Decision Systems, Massachusetts Institute of Technology. MTNS, Kyoto, 2006

  2. Acknowledgment: This work is based on the PhD work of, as well as subsequent collaborations with, Nuno Martins.
  References:
  • N. C. Martins and M. A. Dahleh. Feedback Control of Noisy Channels: Bode-Like Fundamental Limitations of Performance. Revised and resubmitted to IEEE Trans. Automatic Control.
  • N. C. Martins, M. A. Dahleh, and J. Doyle. Fundamental Limitations of Disturbance Attenuation with Side Information. Accepted for publication in IEEE Trans. Automatic Control.

  3. Outline
  • Information theory and its relations to control theory
  • Limitations imposed by causality
  • Limitations imposed by stability
  • Noise cancellation with noisy measurements
    • Measurements are communicated through a noisy channel
    • Noise reduction over all possible encoders and decoders
    • Feedforward and feedback setups: Gaussian case, general case
  • Fundamental "flux" inequality
  • New lower bound

  4. A Simple Network Problem
  [Block diagram: plant driven by a disturbance d, with output y, error e, and control u; the measurement is passed through an encoder, over a channel, and through a decoder back to the plant input.]
  Objective: stability and disturbance rejection.
  Requires:
  • Stability: the channel must carry a reliable estimate of the state
  • Performance: the channel must carry information about the disturbance

  5. Information Theory

  6. IT: Entropy (1)
  Information theory basic quantities: entropy.
  • Consider a random variable z with alphabet Z = {1, ..., M}.
  • What is the expected size of its binary expansion? E[ ⌈log_2 z⌉ ]
  • There is another representation where we assign ⌈-log p_z(z)⌉ bits to each z; the expected size becomes E[ ⌈-log p_z(z)⌉ ].
  • Entropy is the expected size of the most bit-economic representation: H(z) = E[ -log p_z(z) ]  (valid asymptotically for sequences)
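A quick numerical illustration of the bit-economy claim (my own example, not from the talk): the sketch below compares H(z) with the average length of a code that assigns ⌈-log_2 p_z(z)⌉ bits to each symbol, for an arbitrary dyadic pmf, and with the uniform upper bound log_2 M.

```python
import numpy as np

# Example pmf over an alphabet of size M = 4 (arbitrary, dyadic so the code lengths are exact)
p = np.array([0.5, 0.25, 0.125, 0.125])

# Entropy: H(z) = E[-log2 p_z(z)]
H = -np.sum(p * np.log2(p))

# Average length when each symbol z gets ceil(-log2 p_z(z)) bits
avg_len = np.sum(p * np.ceil(-np.log2(p)))

print(f"H(z)               = {H:.3f} bits")
print(f"E[ceil(-log2 p_z)] = {avg_len:.3f} bits")
print(f"log2 M             = {np.log2(p.size):.3f} bits (uniform upper bound)")
```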

  7. IT: Entropy (2)
  Properties of entropy:
  • H(z) ≥ 0
  • z can be represented as 2^{H(z)} uniformly distributed symbols
  • Entropy is a measure of randomness: H(z) = 0 when p_z is concentrated on a single symbol, and H(z) = log_2 M when p_z is uniform over the M-symbol alphabet. [Sketches of three pmfs p_z, from a point mass to the uniform distribution, illustrate the two extremes.]

  8. IT: Differential Entropy
  "Continuous" random variables: assume that w is a random variable with alphabet W = R^n.
  Differential entropy, or "entropy density":
  2^{h(w)} = lim_{Δ→0} 2^{H[ f_Δ(w) ]} Δ^n
  where f_Δ is a quantizer and Δ is the sensitivity (cell size) of the quantizer; 2^{H[f_Δ(w)]} is the number of symbols of an equivalent uniform discrete distribution, and 2^{h(w)} is the volume of an equivalent uniform distribution.
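A minimal sketch of the quantizer limit on this slide, assuming a scalar Gaussian w (an illustrative choice, not from the talk): as the cell size Δ shrinks, H[f_Δ(w)] + log_2 Δ approaches the closed-form differential entropy (1/2) log_2(2πe σ²).

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0
w = rng.normal(0.0, sigma, size=1_000_000)   # samples of a scalar Gaussian w

# Closed form for comparison: h(w) = 0.5 * log2(2*pi*e*sigma^2)
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)

for delta in [0.5, 0.1, 0.02]:
    # f_delta: uniform quantizer with cell size delta
    _, counts = np.unique(np.floor(w / delta), return_counts=True)
    p = counts / counts.sum()
    H_quantized = -np.sum(p * np.log2(p))    # H[f_delta(w)]
    # Slide's relation (scalar case): h(w) = lim_{delta -> 0} ( H[f_delta(w)] + log2 delta )
    print(f"delta = {delta:4.2f}:  H + log2(delta) = {H_quantized + np.log2(delta):.4f}"
          f"   (closed form {h_closed:.4f})")
```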

  9. IT: Mutual Information
  "Continuous" random variables:
  I(w_1, w_2) := lim_{Δ→0} I( f_Δ(w_1), f_Δ(w_2) ) = h(w_1) - h(w_1 | w_2)
  The function of the quantizer is to extract as "much" information as possible from the continuous random variables.
  "Continuous" and discrete random variables:
  I(z, w) := lim_{Δ→0} I( z, f_Δ(w) ) = H(z) - H(z | w)
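The limit over quantizers can be checked numerically. The sketch below, assuming a jointly Gaussian pair with correlation ρ (an illustrative choice, not from the talk), estimates I(f_Δ(w_1), f_Δ(w_2)) from a joint histogram and compares it with the closed form -(1/2) log_2(1 - ρ²) as Δ shrinks.

```python
import numpy as np

rng = np.random.default_rng(1)
rho, n = 0.8, 1_000_000

# Jointly Gaussian pair with correlation rho; closed form I = -0.5*log2(1 - rho^2)
w1 = rng.normal(size=n)
w2 = rho * w1 + np.sqrt(1.0 - rho**2) * rng.normal(size=n)
I_closed = -0.5 * np.log2(1.0 - rho**2)

for delta in [1.0, 0.5, 0.25]:
    # Quantize both coordinates with cell size delta and build the joint histogram
    edges1 = np.arange(w1.min(), w1.max() + delta, delta)
    edges2 = np.arange(w2.min(), w2.max() + delta, delta)
    counts, _, _ = np.histogram2d(w1, w2, bins=[edges1, edges2])
    p = counts / counts.sum()
    p1, p2 = p.sum(axis=1), p.sum(axis=0)
    nz = p > 0
    I_q = np.sum(p[nz] * np.log2(p[nz] / np.outer(p1, p2)[nz]))
    print(f"delta = {delta:4.2f}:  I(f_delta(w1), f_delta(w2)) = {I_q:.3f}"
          f"   (closed form {I_closed:.3f})")
```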

  10. IT: Information Rate
  Definitions in terms of rates. Consider two "continuous" or discrete stochastic processes, and write w^k = ( w(0), ..., w(k) ), z^k = ( z(0), ..., z(k) ).
  • Information rate: I_∞(w, z) = lim_{k→∞} (1/k) I( z^k, w^k ). This is the maximum reliable bit-rate, used to define capacity.
  • Entropy rate: h_∞(z) = lim_{k→∞} (1/k) h( z^k )

  11. IT: Properties of Mutual Information
  I (Positivity): I(w, z) = I(z, w) ≥ 0
  II (Kolmogorov's formula): "On average, given v, how much more information about z can I get from w?"
  I(w, z | v) = I( (w, v), z ) - I(v, z) = h(z | v) - h(z | w, v)
  III (Spectral bound):
  h_∞(z) ≤ (1/4π) ∫_{-π}^{π} log( 2πe S_z(ω) ) dω
  h(z^k) ≤ (1/2) log( (2πe)^k det Cov(z^k) )
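For a Gaussian process the spectral bound in Property III holds with equality, which makes it easy to check numerically. The sketch below uses a scalar Gaussian AR(1) process with arbitrary illustrative parameters (my own example) and compares the spectral formula with the known closed-form entropy rate.

```python
import numpy as np

# Gaussian AR(1) process: x(k+1) = a*x(k) + w(k), w ~ N(0, sw2), |a| < 1 (illustrative values)
a, sw2 = 0.8, 1.0

# Power spectral density S_x(omega) = sw2 / |1 - a*e^{-j*omega}|^2
omega = np.linspace(-np.pi, np.pi, 200_001)
S_x = sw2 / np.abs(1.0 - a * np.exp(-1j * omega))**2

# Property III (equality in the Gaussian case): h_inf(x) = (1/4pi) * int log2(2*pi*e*S_x) domega
h_spectral = np.trapz(np.log2(2 * np.pi * np.e * S_x), omega) / (4 * np.pi)

# Known closed form for AR(1): the entropy rate is 0.5*log2(2*pi*e*sw2)
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sw2)

print(f"spectral formula: {h_spectral:.5f} bits per sample")
print(f"closed form     : {h_closed:.5f} bits per sample")
```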

  12. Mutual Information: Uncertainty Reduction
  Conditional entropy:
  H(z_1 | z_2) = H(z_1, z_2) - H(z_2) = E[ -log p_{Z_1|Z_2}(z_1 | z_2) ]
  Mutual information: how much information does z_2 carry about z_1?
  I(z_1, z_2) = H(z_1) - H(z_1 | z_2)
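A small worked example of uncertainty reduction (the joint pmf is an arbitrary illustration, not from the talk): compute I(z_1, z_2) both as H(z_1) - H(z_1 | z_2) and directly from the joint and marginal pmfs.

```python
import numpy as np

# Small illustrative joint pmf p(z1, z2); rows index z1, columns index z2
p_joint = np.array([[0.30, 0.10],
                    [0.05, 0.55]])

p_z1 = p_joint.sum(axis=1)   # marginal of z1
p_z2 = p_joint.sum(axis=0)   # marginal of z2

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Conditional entropy H(z1|z2) = H(z1, z2) - H(z2)
H_cond = H(p_joint.ravel()) - H(p_z2)

# Mutual information two ways: uncertainty reduction, and directly from the pmfs
I_reduction = H(p_z1) - H_cond
I_direct = np.sum(p_joint * np.log2(p_joint / np.outer(p_z1, p_z2)))

print(f"I(z1, z2) = {I_reduction:.4f} bits  (H(z1) - H(z1|z2))")
print(f"I(z1, z2) = {I_direct:.4f} bits  (sum over the joint pmf)")
```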

  13. Information Transmission: Shannon
  [Block diagram: Source X → Compression → Channel Encoder → Channel P(Y|Z) with input Z and output Y → Channel Decoder → Decompression → X̂]
  Reliable transmission is possible when the source entropy is below the channel capacity:
  H(X) < sup_{P(Z)} I(Z, Y)

  14. Bode Integral Formula: Causality Constraints

  15. Bode's Integral Limitation (LTI)
  Linear feedback scheme. Assume that d is asymptotically stationary, and define the sensitivity function
  S(ω) = sqrt( S_e(ω) / S_d(ω) )
  Basic result:
  (1/2π) ∫_{-π}^{π} log S(ω) dω ≥ Σ_{|λ_i(A)| > 1} log |λ_i(A)|
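For a discrete-time LTI loop with relative degree at least one, the basic result holds with equality, which can be checked numerically. The plant, controller, and sensitivity below are my own illustrative choices, not an example from the talk.

```python
import numpy as np

# Discrete-time example: plant P(z) = 1/(z - 2) (one unstable pole at z = 2),
# static controller K = 2.5. The closed loop is stable and
#   S(z) = 1/(1 + P(z)*K) = (z - 2)/(z + 0.5)
omega = np.linspace(-np.pi, np.pi, 400_001)
z = np.exp(1j * omega)
S = (z - 2.0) / (z + 0.5)

# Bode integral (1/2pi) * int log|S(e^{j*omega})| domega
bode_integral = np.trapz(np.log(np.abs(S)), omega) / (2 * np.pi)

# Sum of log|unstable open-loop poles| (here only the pole at 2)
pole_sum = np.log(2.0)

print(f"(1/2pi) int log|S| domega = {bode_integral:.5f}")
print(f"sum log|unstable poles|   = {pole_sum:.5f}")
```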

  16. Bode's Integral Limitation (Extensions)
  Extensions: multivariable, time-varying, ... (Freudenberg, Middleton, Seron, Braslavsky, Jie Chen, ...)
  Information-theoretic interpretation, with extensions to classes of nonlinear systems (Pablo Iglesias et al.); deterministic treatment under the assumption of convergence of the Fourier transform (Goncalves and Doyle).
  Question: for the loop with plant P, disturbance d, output y, and error e, can we beat
  (1/2π) ∫_{-π}^{π} log S(ω) dω ≥ Σ_{|λ_i(A)| > 1} log |λ_i(A)|
  using an arbitrary causal map from e to u?

  17. Bode's Integral Limitation (Extension 1)
  Setup: e = d + u, where u is generated from e by an arbitrary deterministic, causal map with delay (no unstable dynamics).
  • Causality and determinism give h(e^k) = h(d^k).
  • Property III: h_∞(e) ≤ (1/4π) ∫_{-π}^{π} log( 2πe S_e(ω) ) dω, while for Gaussian d the entropy rate h_∞(d) equals its spectral expression. Hence
  (1/4π) ∫_{-π}^{π} log S_e(ω) dω ≥ (1/4π) ∫_{-π}^{π} log S_d(ω) dω
  (1/2π) ∫_{-π}^{π} log S(ω) dω ≥ 0,   S(ω) = sqrt( S_e(ω) / S_d(ω) )

  18. Proof of Extension (1)
  With e = d + u and u generated from e by an arbitrary deterministic, causal map with delay, u^k is a function of e^{k-1} (hence of d^{k-1}), so
  I( d(k), u^k | d^{k-1} ) = 0
  Expanding the mutual information:
  I( d(k), u^k | d^{k-1} ) = h( d(k) | d^{k-1} ) - h( d(k) | d^{k-1}, u^k )
  h( d(k) | d^{k-1}, u^k ) = h( e(k) | d^{k-1}, u^k ) = h( e(k) | e^{k-1}, u^k ) = h( e(k) | e^{k-1} )
  Therefore h( d(k) | d^{k-1} ) = h( e(k) | e^{k-1} ) for every k, which gives h(e^k) = h(d^k).
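For a linear instance of this argument, the map d^k → e^k is lower triangular with unit diagonal, so its Jacobian determinant is 1 and h(e^k) = h(d^k) follows directly. The sketch below (my own illustration, not from the talk) checks this for a random strictly causal linear feedback and a Gaussian d.

```python
import numpy as np

rng = np.random.default_rng(2)
k = 8   # horizon: vectors d^k = (d(0), ..., d(k-1)) and e^k = (e(0), ..., e(k-1))

# A linear causal feedback with a one-step delay: u(i) depends only on e(0), ..., e(i-1),
# so e^k = d^k + L e^k with L strictly lower triangular, and e^k = (I - L)^{-1} d^k.
L = 0.5 * np.tril(rng.normal(size=(k, k)), k=-1)
T = np.linalg.inv(np.eye(k) - L)

print("Jacobian determinant of d^k -> e^k:", np.linalg.det(T))   # equals 1

# Consequence for Gaussian d: h(e^k) = h(d^k), since the entropies differ by log|det|
Sigma_d = np.eye(k)                    # white Gaussian d with unit variance
Sigma_e = T @ Sigma_d @ T.T

def h_gauss(Sigma):                    # h = 0.5 * log2((2*pi*e)^k * det(Sigma))
    dim = Sigma.shape[0]
    return 0.5 * np.log2((2 * np.pi * np.e) ** dim * np.linalg.det(Sigma))

print("h(d^k) =", h_gauss(Sigma_d), "  h(e^k) =", h_gauss(Sigma_e))
```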

  19. Limitations Imposed by Stability
  [Block diagram: plant P with initial condition x(0) and output e, in feedback with an arbitrary causal block.]
  A necessary condition for asymptotic stability:
  Rate of transmission ≥ I_∞( x(0), e ) ≥ Σ_{|λ_i(A)| > 1} log |λ_i(A)|
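A toy illustration of the rate condition (my own construction, not the talk's proof): a scalar plant x(k+1) = a·x(k) + u(k) controlled over a one-bit-per-step link can be stabilized by a simple interval-splitting scheme when log_2 a < 1, and the same scheme diverges when log_2 a > 1.

```python
def simulate(a, steps=60, x0=0.7, L0=2.0):
    """Scalar plant x(k+1) = a*x(k) + u(k), controlled over a 1-bit-per-step link.

    The decoder tracks an interval of length L known to contain x; each step the
    encoder sends one bit saying which half of the interval x lies in, the
    controller steers the center of that half to zero, and the interval length
    evolves as L <- a*L/2.  The scheme shrinks the interval exactly when a/2 < 1,
    i.e. when log2(a) < 1 bit per step.
    """
    x, c, L = x0, 0.0, L0              # state, interval center, interval length
    for _ in range(steps):
        bit = 1 if x >= c else 0       # the single bit sent across the link
        c = c + (L / 4 if bit else -L / 4)   # center of the selected half-interval
        u = -a * c                     # steer the (estimated) state toward 0
        x = a * x + u
        c, L = a * c + u, a * L / 2    # propagate the decoder's interval
    return abs(x)

# Rate condition: one bit per step suffices iff log2(a) < 1, i.e. a < 2
print("a = 1.8 (log2 a ~ 0.85 bit):", simulate(1.8))   # state converges toward 0
print("a = 2.3 (log2 a ~ 1.20 bit):", simulate(2.3))   # state diverges
```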

  20. Bode's Integral Limitation (Extension 2)
  How can Bode be derived from information theory?
  [Block diagram: unstable plant P with initial condition x(0), disturbance d, error e, and an arbitrary causal controller; the derivation steps are labeled "I.T." and "Bode".]
  Information-theoretic step:
  h_∞(e) ≥ h_∞(d) + I_∞( x(0), e )
  Combining this with Property III and the stability bound,
  (1/4π) ∫_{-π}^{π} log( 2πe S_e(ω) ) dω ≥ h_∞(d) + Σ_{|λ_i(A)| > 1} log |λ_i(A)|
  which yields Bode's inequality:
  (1/2π) ∫_{-π}^{π} log S(ω) dω ≥ Σ_{|λ_i(A)| > 1} log |λ_i(A)|,   S(ω) := sqrt( S_e(ω) / S_d(ω) )

  21. Bode Integral Limitation: Push-Pop
  Comparison with Bode's integral formula, for the loop of plant P and controller K with disturbance d(t) and error e(t):
  (1/4π) ∫_{-π}^{π} log( S_e(ω) / S_d(ω) ) dω ≥ Σ_{unstable O.L. poles} log |pole|
  [Plot: log( S_e(ω)/S_d(ω) ) versus ω; attenuation of the disturbance over some frequencies must be compensated by amplification at others.]

  22. Feedforward Noise Cancellation

  23. Feedforward Noise Cancellation
  [Block diagram: the disturbance d(k) passes through G(z) and an m-step delay into a summing junction; in parallel, d is encoded, v(k) = E(d), transmitted over a Gaussian channel with noise c(k), r(k) = v(k) + c(k), decoded, q(k) = D(r), and q is added at the junction; the junction output is u(k).]
  Problem:
  inf_{E, D} h_∞(u)
  over causal encoders E and decoders D subject to the power constraint E[ v(k)² ] ≤ 1.

  24. Feedforward Noise Cancellation
  • Observe: h_∞(u) = (1/4π) ∫_{-π}^{π} log( 2πe S_u(ω) ) dω
  • Searching over linear, causal, time-invariant E and D yields an optimal pair E*, D* built from G(z) z^{-m} and the MMSE gain 1/(1 + σ_c²).
  • Optimal noise cancellation:
  min_{E, D LTI} (1/4π) ∫_{-π}^{π} log( S_u(ω) / S_d(ω) ) dω = -(1/2) log( 1 + 1/σ_c² )
  • Equivalently: h_∞(u) = h_∞(d) - C
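A sanity check of the capacity-sized entropy drop, in a deliberately simplified special case of the setup above (my own choices, not the talk's general problem: white unit-variance Gaussian d, G(z) = 1, m = 0, static encoder, and MMSE decoder): the residual u then satisfies h(u) = h(d) - C.

```python
import numpy as np

rng = np.random.default_rng(3)
sigma_c2 = 0.5                        # channel noise variance (illustrative)
n = 2_000_000

d = rng.normal(0.0, 1.0, n)           # white disturbance, E[d^2] = 1 (so E = 1 meets E[v^2] <= 1)
c = rng.normal(0.0, np.sqrt(sigma_c2), n)

# Encoder v = d, channel r = v + c, decoder q = -r/(1 + sigma_c2) (MMSE estimate of -d),
# and the residual u = d + q after cancellation
r = d + c
u = d - r / (1.0 + sigma_c2)

def h_gauss(var):                     # differential entropy of a scalar Gaussian, in bits
    return 0.5 * np.log2(2 * np.pi * np.e * var)

C = 0.5 * np.log2(1.0 + 1.0 / sigma_c2)   # Gaussian channel capacity, bits per use

print(f"h(u) from the empirical variance: {h_gauss(u.var()):.4f} bits")
print(f"h(d) - C                        : {h_gauss(d.var()) - C:.4f} bits")
```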

  25. Observations (Zang & Iglesias)
  • How does entropy change under stable filtering?
  • Consider y = H d with H LTI, causal, and stable, with leading impulse-response coefficient h_0 ≠ 0.
  • If z_i are the unstable zeros of H, then
  h_∞(y) = h_∞(d) + log |h_0| + Σ_i log |z_i|
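The entropy-rate shift quoted above rests on the identity (1/2π) ∫ log|H(e^{jω})| dω = log|h_0| + Σ_i log|z_i|, which the sketch below checks numerically for an illustrative non-minimum-phase FIR filter (my own example, not from the talk).

```python
import numpy as np

# H(z) = 1 - 2 z^{-1}: stable, causal, leading coefficient h_0 = 1,
# and one non-minimum-phase (unstable) zero at z = 2.
omega = np.linspace(-np.pi, np.pi, 400_001)
H = 1.0 - 2.0 * np.exp(-1j * omega)

lhs = np.trapz(np.log(np.abs(H)), omega) / (2 * np.pi)   # (1/2pi) int log|H| domega
rhs = np.log(1.0) + np.log(2.0)                          # log|h_0| + sum_i log|z_i|

print(f"(1/2pi) int log|H| domega = {lhs:.5f}")
print(f"log|h_0| + log 2          = {rhs:.5f}")
```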
