Optimum Sequential Procedures for Detecting Changes in Processes
George V. Moustakides
INRIA-IRISA, Rennes, France
Outline
• The change detection (disorder) problem
• Overview of existing results
• Lorden’s criterion and the CUSUM test
• A modified Lorden criterion
• Optimality of CUSUM for Itô processes
The Change Detection (Disorder) Problem

We are observing sequentially a process ξ_t with the following statistics:

    ξ_t ∼ P_∞   for 0 ≤ t ≤ τ
    ξ_t ∼ P_0   for τ < t

– Change time τ: deterministic (but unknown) or random.
– Probability measures P_∞, P_0: known.

Goal: detect the change “as soon as possible”.

Applications include: systems monitoring; quality control; financial decision making; remote sensing (radar, sonar, seismology); occurrence of industrial accidents; speech/image/video segmentation; etc.
The observation process ξ_t is available sequentially; this can be expressed through the filtration

    F_t = σ{ξ_s : 0 ≤ s ≤ t}.

For detecting the change we are interested in sequential schemes. Any sequential detection scheme can be represented by a stopping time T (the time at which we stop and declare that the change took place). The stopping time T is adapted to F_t. In other words, at every time instant t we perform a test (whether to stop and declare a change, or continue sampling) using only the information available up to time t.
Overview of Existing Results

P_τ: the probability measure induced when the change takes place at time τ.
E_τ[·]: the corresponding expectation.
P_∞: all data under the nominal régime.  P_0: all data under the alternative régime.

Optimality Criteria
They basically consist of two parts:
– the first measures the detection delay;
– the second measures the frequency of false alarms.
Possible approaches are Bayesian and min-max.
Bayesian Approach (Shiryayev): τ is random and exponentially distributed

    inf_T { c E[(T − τ)^+] + P[T < τ] }

The Shiryayev test consists in computing the statistic π_t = P[τ ≤ t | F_t] and stopping at

    T_S = inf{t : π_t ≥ ν}.

T_S is optimum (Shiryayev 1978):
– In discrete time: when ξ_n is i.i.d. before and after the change.
– In continuous time: when ξ_t is a Brownian motion with constant drift before and after the change.
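As an illustration, here is a minimal discrete-time sketch of this test, assuming i.i.d. observations with known densities f_∞ and f_0 and a Geometric(p) prior on τ (the discrete analogue of the exponential prior); the starting value π_0 = 0, the function name, and the interface are my assumptions, not from the talk.

```python
def shiryayev_test(observations, f0, f_inf, p, nu):
    """Shiryayev statistic pi_n = P(tau <= n | F_n) under a Geometric(p)
    prior on the change time; stop the first time pi_n >= nu."""
    pi = 0.0                                  # pi_0 = 0: no change before the first sample
    for n, x in enumerate(observations, start=1):
        L = f0(x) / f_inf(x)                  # likelihood ratio of the new sample
        num = (pi + (1.0 - pi) * p) * L       # posterior mass on {tau <= n}
        den = num + (1.0 - pi) * (1.0 - p)    # ... normalized against {tau > n}
        pi = num / den                        # Bayes update
        if pi >= nu:
            return n                          # declare the change at sample n
    return None                               # threshold never crossed
```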
Min-Max Approach (Shiryayev-Roberts-Pollak): τ is deterministic and unknown

    inf_T sup_τ E_τ[(T − τ)^+ | T > τ];   subject to E_∞[T] ≥ γ.

Optimality results exist only for discrete time when ξ_n is i.i.d. before and after the change. Specifically, if we define the statistic

    S_n = (S_{n−1} + 1) f_0(ξ_n)/f_∞(ξ_n),

where f_∞(·), f_0(·) are the common pdfs of the data before and after the change, then (Yakir 1997) the stopping time

    T_SRP = inf{n : S_n ≥ ν}

is optimum.
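A minimal sketch of this recursion in discrete time, again assuming i.i.d. data with known densities; the plain start S_0 = 0 is used here (Pollak’s optimality argument involves a randomized starting point), and the interface is illustrative.

```python
def shiryayev_roberts_test(observations, f0, f_inf, nu):
    """Run S_n = (S_{n-1} + 1) * f0(x_n)/f_inf(x_n) and stop the
    first time S_n >= nu."""
    S = 0.0                                   # S_0 = 0 (Pollak's variant randomizes this start)
    for n, x in enumerate(observations, start=1):
        S = (S + 1.0) * f0(x) / f_inf(x)      # Shiryayev-Roberts recursion
        if S >= nu:
            return n                          # declare the change at sample n
    return None                               # threshold never crossed
```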
Lorden’s Criterion and the CUSUM Test

An alternative min-max approach consists in defining the following performance measure (Lorden 1971)

    J(T) = sup_τ esssup E_τ[(T − τ)^+ | F_τ]

and solving the min-max problem

    inf_T J(T);   subject to E_∞[T] ≥ γ.

The test most closely related to Lorden’s criterion, and the most popular one used in practice, is the Cumulative Sum (CUSUM) test.
Define the CUSUM statistic y_t as follows:

    u_t = log( dP_0/dP_∞ (F_t) );   m_t = inf_{0≤s≤t} u_s;   y_t = u_t − m_t.

The CUSUM stopping time (Page 1954):

    T_C = inf{t : y_t ≥ ν}.

Optimality results:
– Discrete time: when ξ_n is i.i.d. before and after the change (Moustakides 1986, Ritov 1990).
– Continuous time: when ξ_t is a Brownian motion with constant drift before and after the change (Shiryayev 1996, Beibel 1996).
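In discrete time the statistic above reduces to Page’s recursion y_n = max(0, y_{n−1} + u_n − u_{n−1}), since subtracting the running minimum is equivalent to reflecting the log-likelihood random walk at zero. A minimal sketch, assuming the caller supplies the per-sample log-likelihood-ratio increments log(f_0(ξ_n)/f_∞(ξ_n)):

```python
def cusum_test(llr_increments, nu):
    """Page's CUSUM: y_n = u_n - min_{k<=n} u_k, computed via the
    equivalent recursion y_n = max(0, y_{n-1} + du_n); stop when y_n >= nu."""
    y = 0.0
    for n, du in enumerate(llr_increments, start=1):
        y = max(0.0, y + du)                  # log-likelihood walk reflected at 0
        if y >= nu:
            return n                          # declare the change at sample n
    return None                               # threshold never crossed
```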
A modified Lorden criterion

Our goal is to extend the optimality of CUSUM to Itô processes. For this it will be necessary to modify Lorden’s criterion using the Kullback–Leibler divergence (KLD). A similar extension was proposed for the SPRT by Liptser and Shiryayev (1978).

Consider the process ξ_t with

    dξ_t = dw_t,             0 ≤ t ≤ τ
    dξ_t = α_t dt + dw_t,    τ < t

where w_t is a standard Brownian motion with respect to F_t = σ(ξ_s : 0 ≤ s ≤ t), α_t is adapted to F_t, and τ denotes the time of change.
To ξ_t we associate the process u_t defined by

    du_t = α_t dξ_t − ½ α_t² dt,

which we would like to play the role of the log-likelihood ratio u_t = log( dP_0/dP_∞ (F_t) ). We therefore need to impose the following conditions:

1. P_0[ ∫_0^t α_s² ds < ∞ ] = P_∞[ ∫_0^t α_s² ds < ∞ ] = 1
2. A “Novikov” condition, i.e. E_∞[ exp( ∫_{t_{n−1}}^{t_n} α_s² ds ) ] < ∞, where t_n is strictly increasing with t_n → ∞.
3. P_0[ ∫_0^∞ α_s² ds = ∞ ] = P_∞[ ∫_0^∞ α_s² ds = ∞ ] = 1
From conditions 1 & 2 we have the validity of Girsanov’s theorem, therefore

    dP_0/dP_∞ (F_t) = e^{u_t};    dP_τ/dP_∞ (F_t) = e^{u_t − u_τ}.

Furthermore, for the KLD we can write

    E_τ[ log( dP_τ/dP_∞ (F_t) ) | F_τ ]
      = E_τ[ ∫_τ^t α_s dw_s + ∫_τ^t ½ α_s² ds | F_τ ]
      = E_τ[ ∫_τ^t ½ α_s² ds | F_τ ],    for 0 ≤ τ ≤ t < ∞.
This suggests the following modification of Lorden’s criterion

    J(T) = sup_{τ ∈ [0,∞)} esssup E_τ[ 1{T > τ} ∫_τ^T ½ α_t² dt | F_τ ],

and the corresponding min-max optimization

    inf_T J(T);   subject to E_∞[ ∫_0^T ½ α_t² dt ] ≥ γ.

The original and the modified criterion coincide when ξ_t is a Brownian motion with constant drift.
Let us form the CUSUM statistic y_t for the Itô process:

    du_t = α_t dξ_t − ½ α_t² dt
    m_t  = inf_{0≤s≤t} u_s
    y_t  = u_t − m_t.

The optimum CUSUM test is

    T_C = inf{t : y_t ≥ ν},   where ν is chosen so that E_∞[ ∫_0^{T_C} ½ α_t² dt ] = γ.

Since y_t has continuous paths, we conclude that when the CUSUM test stops we will have y_{T_C} = ν.
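A rough Euler–Maruyama sketch of this test on a simulated path; the drift callable alpha(t, xi), the step size, and the horizon are illustrative assumptions, and the discretization only approximates the continuous-path statistic.

```python
import numpy as np

def simulate_cusum_ito(alpha, nu, tau=np.inf, dt=1e-3, t_max=200.0, seed=0):
    """Euler-Maruyama simulation of d(xi)_t = alpha_t 1{t > tau} dt + dw_t,
    together with u_t, m_t = inf_{s<=t} u_s and y_t = u_t - m_t.
    Returns the (approximate) CUSUM stopping time, or None if y_t never
    reaches nu before t_max."""
    rng = np.random.default_rng(seed)
    xi = u = m = 0.0
    t = 0.0
    while t < t_max:
        a = alpha(t, xi)                          # adapted drift, evaluated on the current path
        dw = rng.normal(0.0, np.sqrt(dt))
        dxi = (a * dt if t > tau else 0.0) + dw   # regime change at tau
        u += a * dxi - 0.5 * a * a * dt           # du_t = alpha_t dxi_t - 0.5 alpha_t^2 dt
        m = min(m, u)                             # running minimum of u
        xi += dxi
        t += dt
        if u - m >= nu:                           # y_t = u_t - m_t crossed the threshold
            return t
    return None

# Example: constant drift alpha = 1, change at tau = 50
# print(simulate_cusum_ito(lambda t, xi: 1.0, nu=4.66, tau=50.0))
```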
Optimality of CUSUM for Itô processes

[Figure: sample path of u_t with its running minimum m_t, the threshold ν, and the stopping time T_C]

u_t ≥ m_t, therefore y_t = u_t − m_t ≥ 0. The process m_t is nonincreasing and dm_t ≠ 0 only when u_t = m_t, i.e. y_t = 0.

If f(y) is continuous with f(0) = 0, then ∫_0^∞ f(y_t) dm_t = 0.
If f(y) is a twice continuously differentiable function with f′(0) = 0, using standard Itô calculus we have

    df(y_t) = f′(y_t)(du_t − dm_t) + ½ α_t² f″(y_t) dt
            = f′(y_t) du_t + ½ α_t² f″(y_t) dt.

Theorem 1: T_C is a.s. finite and

    E_τ[ 1{T_C > τ} ∫_τ^{T_C} ½ α_t² dt | F_τ ] = [g(ν) − g(y_τ)] 1{T_C > τ}
    E_∞[ 1{T_C > τ} ∫_τ^{T_C} ½ α_t² dt | F_τ ] = [h(ν) − h(y_τ)] 1{T_C > τ},

where g(y) = y + e^{−y} − 1 and h(y) = e^{y} − y − 1.
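As an illustration of the theorem (not a statement from the talk): for a constant drift α_t ≡ α the integrals reduce to ½α² times the elapsed time, so taking τ = 0 (where y_0 = 0) the two identities give E_0[T_C] = 2g(ν)/α² and E_∞[T_C] = 2h(ν)/α². A small sketch with arbitrarily chosen α and ν:

```python
import numpy as np

def g(y):
    """Post-change delay kernel from Theorem 1: g(y) = y + e^{-y} - 1."""
    return y + np.exp(-y) - 1.0

def h(y):
    """Pre-change (false alarm) kernel from Theorem 1: h(y) = e^{y} - y - 1."""
    return np.exp(y) - y - 1.0

# Constant-drift reading of Theorem 1: 0.5*alpha^2 * E_0[T_C]   = g(nu),
#                                      0.5*alpha^2 * E_inf[T_C] = h(nu).
alpha, nu = 1.0, 4.0
print("worst-case mean detection delay:", 2.0 * g(nu) / alpha**2)
print("mean time to false alarm       :", 2.0 * h(nu) / alpha**2)
```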
Since g(y), h(y) are increasing and strictly convex with g(0) = h(0) = 0, we now conclude

    J(T_C) = sup_τ esssup E_τ[ 1{T_C > τ} ∫_τ^{T_C} ½ α_s² ds | F_τ ]
           = sup_τ esssup [g(ν) − g(y_τ)] 1{T_C > τ}
           = g(ν) − g(0) = g(ν).

Similarly,

    E_∞[ ∫_0^{T_C} ½ α_s² ds ] = h(ν) − h(0) = h(ν) = γ.

The threshold can thus be computed from e^ν − ν − 1 = γ.
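Since h(ν) = e^ν − ν − 1 is strictly increasing on [0, ∞), the threshold can be found with any bracketing root finder; a minimal sketch (the bracket choice and the scipy dependency are implementation details, not part of the talk):

```python
import numpy as np
from scipy.optimize import brentq

def cusum_threshold(gamma):
    """Solve h(nu) = exp(nu) - nu - 1 = gamma for the CUSUM threshold nu."""
    f = lambda nu: np.exp(nu) - nu - 1.0 - gamma
    upper = np.log(gamma + 2.0) + 1.0   # crude but valid upper bracket: h(upper) > gamma
    return brentq(f, 0.0, upper)

# Example: a false-alarm budget gamma = 100 gives nu ~ 4.66
# print(cusum_threshold(100.0))
```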
Using again standard Itô calculus we have the following generalization of Theorem 1.

Corollary: For any stopping time T,

    E_τ[ ∫_τ^T ½ α_t² dt | F_τ ] = E_τ[ g(y_T) − g(y_τ) | F_τ ] 1{T > τ}
    E_∞[ ∫_τ^T ½ α_t² dt | F_τ ] = E_∞[ h(y_T) − h(y_τ) | F_τ ] 1{T > τ}.

Remark 1: The false alarm constraint can be written as

    E_∞[ ∫_0^T ½ α_t² dt ] = E_∞[ h(y_T) ] ≥ γ.
Remark 2: We can limit ourselves to stopping times that satisfy the false alarm constraint with equality, that is,

    E_∞[ ∫_0^T ½ α_t² dt ] = E_∞[ h(y_T) ] = γ = h(ν).

Remark 3: The modified performance measure J(T) can be suitably lower bounded as follows:

    J(T) = sup_τ esssup E_τ[ 1{T > τ} ∫_τ^T ½ α_t² dt | F_τ ] ≥ E_∞[ e^{y_T} g(y_T) ] / E_∞[ e^{y_T} ].