Observer convergence: From necessary to sufficient conditions
Laurent PRALY and Vincent ANDRIEU
IHP, June 2016
§ 1 / 12 Observation Problem
§ 1.1 Observation Problem

Model:
   dx/dt (t) = ẋ = f(x, t, u(t))     (*)
   y(t) = h(x, t)
with u an exogenous action and y the measurement delivered by reality.

Observation problem: find (= estimate) x(t), a solution of (*), making y(t) = h(x(t), t) hold.
§ 1.2 Observation Problem

Observer = any device solving this problem. We focus on observers which are dynamical systems of the following form:
   ξ̇ = ϕ(ξ, y),   x̂ = η(ξ, y)
with y the measurement, x̂ the estimate and ξ finite dimensional.

This is a very simplified/degraded form of observer: it cannot cope with data loss and is unable to give any information on confidence. Compare with stochastic filters or set-valued observers.

We are interested only in convergence:
   x̂(t) → x(t) as t → ∞.
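To fix ideas, here is a minimal Python sketch of an observer of this form (our illustration, not from the slides): a Luenberger observer ϕ(ξ, y) = Aξ + L(y − Cξ), η(ξ, y) = ξ, for a hypothetical linear model ẋ = Ax, y = Cx; the matrices A, C, L and the integration step are example choices.

```python
import numpy as np

# Hypothetical linear model: dx/dt = A x, y = C x (harmonic oscillator).
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
C = np.array([[1.0, 0.0]])
# Observer gain L chosen so that A - L C is Hurwitz.
L = np.array([[1.5],
              [0.5]])

def f(x):                      # model vector field
    return A @ x

def h(x):                      # measurement map
    return C @ x

def phi(xi, y):                # observer dynamics: dxi/dt = phi(xi, y)
    return A @ xi + L @ (y - C @ xi)

def eta(xi, y):                # observer output: xhat = eta(xi, y)
    return xi

# Euler integration of the interconnection (model, observer).
dt, T = 1e-3, 20.0
x = np.array([1.0, -0.5])      # unknown true initial condition
xi = np.zeros(2)               # arbitrary observer initial condition
for _ in range(int(T / dt)):
    y = h(x)
    x = x + dt * f(x)
    xi = xi + dt * phi(xi, y)

print("estimation error:", np.linalg.norm(eta(xi, h(x)) - x))  # ~ 0
```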
§ 2 / 12 Context and main property
§ 2.1 The technical context

• The model is defined by two functions f and h:
     ẋ = f(x),   y = h(x),                                   (1)
  with x in an open set S_x of R^n and y in R^p. Its solutions are denoted X(x, t).

• The observer is defined by two functions ϕ and η:
     ξ̇ = ϕ(ξ, y),   x̂ = η(ξ, y),                            (2)
  with ξ in an open set S_ξ of R^m and x̂ in S_x. Its solutions are denoted Ξ_y(ξ, t) or Ξ((x, ξ), t).

• d_x, d_ξ and d_y are distances on R^n, R^m and R^p respectively.

We omit time dependence to simplify notations. This entails no loss of generality as long as we do not want to consider families of models indexed by inputs (exogenous actions).
§ 2.2 The technical context

The functions f, h, ϕ and η are assumed to be such that:

• When (x, ξ) is in S_x × S_ξ, the corresponding solution of (1),(2), (X(x, t), Ξ((x, ξ), t)), is unique and defined, with values in S_x × S_ξ, maximally on ]σ⁻_{S_x×S_ξ}(x, ξ), σ⁺_{S_x×S_ξ}(x, ξ)[.

• If η does not depend on y, it is uniformly continuous.

• If η depends on y, both h and η are uniformly continuous.
§ 2.3 Main property: Conv

Property “Conv”: Let Z = {(x, ξ) ∈ S_x × S_ξ : x = η(ξ, h(x))} be the zero-error set. The zero-error set contains an asymptotically stable subset Z_ω with S_x × S_ξ as domain of attraction. More precisely, there exist
 i) a function β of class KL¹,
 ii) a continuous function γ : S_x × S_ξ → R_+
such that, for all (x, ξ) in S_x × S_ξ, σ⁺_{S_x×S_ξ}(x, ξ) = σ⁺_{S_x}(x) and, for all t in [0, σ⁺_{S_x}(x)),
   d_{x,ξ}((X(x, t), Ξ((x, ξ), t)), Z_ω) ≤ γ(x, ξ) β(d_{x,ξ}((x, ξ), Z_ω), t).
Because of the factor γ(x, ξ), this asymptotic stability is not “equi” (uniform in (x, ξ)).

Notation: S_x^ω = Cartesian projection of Z_ω on S_x = set of estimable points x.

¹ A function β : R_+ × R_+ → R_+ is said of class KL if, for each s in R_+, the function r ↦ β(r, s) is continuous, strictly increasing and zero at zero, and, for each r in R_+, the function s ↦ β(r, s) is strictly decreasing and satisfies lim_{s→∞} β(r, s) = 0.
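For intuition (our illustration, not from the slides): for a linear model with an identity observer whose error dynamics ė = (A − LC)e are Hurwitz, the bound of Conv holds in its simplest form:

```latex
% Illustration (not from the slides): Hurwitz linear error dynamics give
% |e(t)| \le k\,e^{-\lambda t}|e(0)| for some k \ge 1, \lambda > 0, so Conv holds with
\beta(r,s) = k\,r\,e^{-\lambda s}
  \quad\text{(a class $\mathcal{KL}$ function)},\qquad
\gamma(x,\xi) \equiv 1 .
```

Here γ is constant, so the stability is even “equi”; Conv only requires a continuous, possibly unbounded, γ.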
§ 2.4 Variation of main property: Conv_tune

Property “Conv_tune”: Given an integer m and an open subset S_ξ of R^m, for any compact subset C of S_x and for all pairs (ε_s, ε_t) of strictly positive real numbers, we can find
 i) a compact subset Γ of S_ξ,
 ii) a locally Lipschitz function ϕ : R^m × R^p → R^m,
 iii) a uniformly continuous function η : R^m × R^p → S_x
such that, for the observer given by these functions, we have:
 1. for all (x, ξ) in S_x × R^m, σ⁺_{S_x×R^m}(x, ξ) = σ⁺_{S_x}(x);
 2. for all (x, ξ) in C × Γ such that σ⁺_{S_x}(x) > ε_t,
      d_x(X(x, t), η(Ξ((x, ξ), t), h(X(x, t)))) ≤ ε_s   ∀ t ∈ [ε_t, σ⁺_{S_x}(x)).

As fast as we want (ε_t) and as small as we want (ε_s), but maybe not convergence.
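One classical construction with exactly this tunability is the high-gain observer. The following Python sketch (our illustration, not a construction from the slides; the Duffing-type model, the gains and the time step are hypothetical choices) shows the ε_t/ε_s trade-off: shrinking a single parameter eps shortens the transient and shrinks the residual error, but the deliberate model mismatch prevents exact convergence.

```python
import numpy as np

# Hypothetical model (Duffing-type): dx1/dt = x2, dx2/dt = -x1 - x1**3, y = x1.
def f(x):
    return np.array([x[1], -x[0] - x[0]**3])

# High-gain observer with the nonlinearity deliberately ignored (phi_hat = 0):
#   dxh1/dt = xh2 + (2/eps)(y - xh1),  dxh2/dt = (1/eps**2)(y - xh1).
# Decreasing eps makes the transient shorter than any eps_t and the residual
# error smaller than any eps_s on a compact set of initial conditions, but the
# mismatch phi_hat != phi prevents exact convergence.
def observer_rhs(xh, y, eps):
    e1 = y - xh[0]
    return np.array([xh[1] + 2.0 / eps * e1, e1 / eps**2])

dt, T = 1e-4, 5.0
for eps in (0.1, 0.01):
    x = np.array([1.0, 0.0])
    xh = np.zeros(2)
    errs = []
    for k in range(int(T / dt)):
        y = x[0]
        x = x + dt * f(x)
        xh = xh + dt * observer_rhs(xh, y, eps)
        if k * dt > 1.0:                 # record after the transient
            errs.append(np.linalg.norm(xh - x))
    print(f"eps = {eps}: residual error ~ {max(errs):.3e}")
```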
Part I
Necessary conditions
§ 3 / 12 Necessary condition 1 for Conv: Detectability
§ 3.1 Necessary condition 1: Detectability

Proposition 1 [V. Andrieu, G. Besançon, U. Serres]: Under Assumption Conv, the model is forward detectable, i.e. for all x_a and x_b in S_x satisfying
   h(X(x_a, t)) = h(X(x_b, t))   ∀ t ≥ 0,
we have
   lim_{t→+∞} d_x(X(x_a, t), X(x_b, t)) = 0.
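A two-dimensional illustration (ours, not from the slides) of forward detectability without observability:

```latex
% Model: \dot x_1 = 0, \dot x_2 = -x_2, y = x_1 (x_2 is unobserved but stable).
% Equal outputs force x_{a,1} = x_{b,1}, while x_2(t) = e^{-t} x_2(0), hence
d_x\bigl(X(x_a,t),\,X(x_b,t)\bigr) = e^{-t}\,\lvert x_{a,2} - x_{b,2}\rvert
  \;\longrightarrow\; 0 .
```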
§ 3.2 Necessary condition 1: Detectability

[Figure: two trajectories X(x_a, t) and X(x_b, t), starting from x_a and x_b with h(x_a) = h(x_b) and producing the same output h(X(x_a, t)) = h(X(x_b, t)); the distance between them converges to 0.]
§ 4 / 12 Necessary condition 2 for Conv: “The graph of an injective set-valued map is asymptotically stable”
§ 4.1 Necessary condition 2: Asymp. stability of graph of map

Proposition 2: Under Assumption Conv, there exists an injective set-valued map
   x ∈ S_x^ω ↦ η_inv(x)
 1. which is a right inverse of η given h, i.e.
      ∀ x ∈ S_x^ω, ∀ ξ ∈ η_inv(x),   η(ξ, h(x)) = x;
 2. whose graph is Z_ω, i.e.
      Z_ω = {(x, ξ) ∈ S_x^ω × S_ξ : ξ ∈ η_inv(x)},
    and which is therefore asymptotically stable with S_x × S_ξ as domain of attraction.
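A concrete instance (our illustration, not from the slides): for a linear model ẋ = Ax, y = Cx with the identity observer ξ̇ = Aξ + L(y − Cξ), x̂ = η(ξ, y) = ξ and A − LC Hurwitz:

```latex
Z_\omega = \{(x,\xi) : \xi = x\},\qquad \eta_{\mathrm{inv}}(x) = \{x\},
% an injective (here single-valued) map whose graph, the diagonal, is
% asymptotically stable since the error e = \xi - x obeys \dot e = (A - LC)e.
```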
§ 4.2 Necessary condition 2: Asymp. stability of graph of map

Away from singularities and with sufficient differentiability . . . it is necessary to have:
   dimension m of ξ (observer state) ≥ dimension n of x (model state) − dimension p of y (measurement).
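This bound can be attained, as the following hedged example (ours, not from the slides) shows for n = 2, p = 1, m = 1: a reduced-order observer for ẋ₁ = x₂, ẋ₂ = 0, y = x₁:

```latex
\dot\xi = -k\,\xi - k^2 y,\qquad
\hat x = \eta(\xi,y) = \bigl(y,\;\xi + k\,y\bigr),\qquad k > 0 .
% With z = x_2 - k x_1 we get \dot z = -k x_2 = -k z - k^2 y, so the error
% e = \xi - z satisfies \dot e = -k e and \hat x_2 = \xi + k y \to x_2.
```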
§ 4.3 Supplementary assumption

Assumption “Obs”: The system
   ξ̇ = ϕ(ξ, y, t),   x̂ = η(ξ, y, t)   (= observer)
is instantaneously observable at S_ξ, with state ξ, input y and output x̂, uniformly in y in h(S_x), i.e. for each continuous time function s ↦ y(s) ∈ h(S_x), there is no pair of distinct points ξ_a and ξ_b in S_ξ and no σ in [0, min{σ⁺_{S_ξ}(ξ_a), σ⁺_{S_ξ}(ξ_b)}) such that
   η(Ξ_y(ξ_a, t), y(t), t) = η(Ξ_y(ξ_b, t), y(t), t)   ∀ t ∈ [0, σ].
§ 4.4 Necessary condition 2 with Obs: Asymp. stability of graph of map

Proposition 2 (continued): If, besides Assumption Conv, Assumption Obs also holds, then η_inv is single-valued. Hence, it is necessary that there exist a function
   x ∈ S_x^ω ↦ η_inv(x) ∈ S_ξ
 1. which is left invertible given h, i.e.
      ∀ x ∈ S_x^ω,   η(η_inv(x), h(x)) = x;
 2. whose graph is asymptotically stable with S_x × S_ξ as domain of attraction.
§ 4.5 Necessary condition 2 with Obs: Asymp. stability of graph of map

Luenberger wrote in 1964:
   “Instead of requiring that the observer reconstructs the state vector x itself, we require only that it reconstructs some [ ] transformation [η_inv] of the state vector . . . It is clear that it would then be possible to reconstruct the state vector itself, provided that the transformation were invertible.”
§ 5 / 12 Necessary condition 3 for Conv_tune: Instantaneous observability
§ 5.1 Necessary condition 3: Instantaneous observability

Proposition 3 [V. Andrieu, G. Besançon, U. Serres]: Under Assumption Conv_tune, the model is instantaneously observable at S_x, i.e. the model, with output y = h(x), satisfies the property of § 4.3: any two distinct solutions give distinct outputs on every (arbitrarily short) time interval.
§ 5.2 Supplementary assumption

Assumption “Inj”: The observer output function η is injective given h, i.e. there exists a function α_η of class K¹ such that:
   d_ξ(ξ_a, ξ_b) ≤ α_η(d_x(η(ξ_a, h(x_a)), η(ξ_b, h(x_a))))   ∀ ((x_a, ξ_a), ξ_b) ∈ Z_ω × S_ξ,
where, since (x_a, ξ_a) is in Z_ω, η(ξ_a, h(x_a)) = x_a.

¹ A function α : R_+ → R_+ is said of class K if it is continuous, strictly increasing and zero at zero. It is of class K_∞ if it is onto R_+.
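Two standard examples for the footnote (ours, for illustration):

```latex
\alpha(r) = r^2 \ \text{is of class } \mathcal{K}_\infty ;\qquad
\alpha(r) = \tfrac{r}{1+r} \ \text{is of class } \mathcal{K}
  \text{ but not } \mathcal{K}_\infty
% (continuous, strictly increasing, zero at zero, but its range is [0,1),
%  not all of R_+).
```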
§ 5.3 Supplementary assumption

Away from singularities and with sufficient differentiability . . . Inj requires:
   dimension m of ξ (observer state) ≤ dimension n of x (model state),
i.e., roughly, (ξ, y) can be used as (maybe not independent) coordinates for x.
§ 5.4 Supplementary assumption

Example (frequency estimation): For the model
   ẋ_1 = x_2,   ẋ_2 = −x_1 x_3,   ẋ_3 = 0,   y = x_1,
an observer satisfying Conv is:
   ξ̇_j = ϕ_j(ξ, y) = λ_j [ξ_j − y],   j = 1, . . . , m,
   x̂ = η(ξ) = (M(ξ)ᵀ M(ξ))⁻¹ M(ξ)ᵀ N(ξ),
where the λ_j are complex numbers with strictly negative real parts and M(ξ) and N(ξ) are defined row by row as
   M(ξ): rows (λ_j², λ_j, −ξ_j),   N(ξ): entries λ_j² ξ_j,   j = 1, . . . , m.
Given x̂, (M(ξ)ᵀ M(ξ)) x̂ = M(ξ)ᵀ N(ξ) are 3 polynomial equations of degree 4 in the m unknowns ξ_1, . . . , ξ_m ⇒ infinitely many solutions when m ≥ 4. Hence the observer output function η is not injective ⇒ Assumption Inj does not hold.
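A Python sketch of this frequency estimator (our illustration): we take m = 3 real negative λ_j, a special case of “complex with strictly negative real parts”. The sign of the ξ-column of M is the one consistent with the invariant-manifold identity λ_j² x₁ + λ_j x₂ − ξ_j x₃ = λ_j² ξ_j implied by the dynamics above; other sources may use a different convention. The numerical values are hypothetical.

```python
import numpy as np

# Model: dx1/dt = x2, dx2/dt = -x1*x3, dx3/dt = 0, y = x1
# (harmonic oscillator; x3 = omega**2 is the unknown squared frequency).
omega2 = 4.0                         # hypothetical true value of x3
lam = np.array([-1.0, -2.0, -3.0])   # m = 3 real lambda_j, Re(lambda_j) < 0

dt, T = 1e-4, 20.0
x = np.array([1.0, 0.0, omega2])
xi = np.zeros(3)

def f(x):
    return np.array([x[1], -x[0] * x[2], 0.0])

for _ in range(int(T / dt)):
    y = x[0]
    x = x + dt * f(x)
    xi = xi + dt * lam * (xi - y)    # d(xi_j)/dt = lambda_j (xi_j - y)

# x_hat = (M^T M)^{-1} M^T N, from the invariant-manifold identity
#   lambda_j^2 x1 + lambda_j x2 - xi_j x3 = lambda_j^2 xi_j
# (sign convention on the xi-column derived here; it may differ elsewhere).
M = np.column_stack([lam**2, lam, -xi])
N = lam**2 * xi
x_hat, *_ = np.linalg.lstsq(M, N, rcond=None)
print("true x:", x)
print("estimate:", x_hat)            # x_hat[2] ~ omega2
```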
§ 6 / 12 Necessary condition 4 for Conv and Inj: Weak differential detectability
§ 6.1 Necessary condition 4: Weak differential detectability

Proposition 4 [R. Sanfelice, L. P.]: If Assumptions Conv and Inj hold and the matrix
   ( ∂η/∂ξ (ξ, y)   ∂η/∂y (ξ, y) )
is invertible for each (ξ, y) in S_ξ × h(S_x), then the model is weakly differentially detectable, i.e. there exists a covariant 2-tensor P : S_x^ω → P_n^+ satisfying¹:
   ∀ (x, v) ∈ S_x^ω × S_n :   ∂h/∂x (x) v = 0   ⇒   vᵀ L_f P(x) v ≤ 0.

¹ vᵀ L_f P(x) v = ∂(vᵀ P(x) v)/∂x · f(x) + vᵀ ( P(x) ∂f/∂x(x) + ∂f/∂x(x)ᵀ P(x) ) v.
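A sanity check (ours, not in the slides): for a linear model ẋ = Ax, y = Cx and a constant tensor P(x) ≡ P in P_n^+, the footnote gives ∂(vᵀPv)/∂x ≡ 0 and ∂f/∂x = A, so weak differential detectability reduces to the familiar Lyapunov-type condition:

```latex
C v = 0 \;\Longrightarrow\; v^\top\!\bigl(P A + A^\top P\bigr)\, v \;\le\; 0 .
```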