Under Interval and Fuzzy Uncertainty, Symmetric Markov Chains Are More Difficult to Predict

Roberto Araiza, Gang Xiang, Olga Kosheleva
University of Texas at El Paso
El Paso, Texas 79968, USA
raraiza@utep.edu, gxiang@utep.edu

Damjan Škulj
Faculty of Social Sciences
University of Ljubljana
1000 Ljubljana, Slovenia
Damjan.Skulj@Fdv.Uni-Lj.Si
1. Outline

• Markov chains: an important tool for solving practical problems.
• Traditional approach: assumes that we know the exact transition probabilities $p_{ij}$.
• In reality: we often only know these transition probabilities with interval (or fuzzy) uncertainty.
• Symmetry: in some situations, the Markov chain is symmetric: $p_{ij} = p_{ji}$.
• In general: symmetry simplifies computations.
• New result: for Markov chains under interval and fuzzy uncertainty, symmetry has the opposite effect – it makes the computational problems more difficult.
2. Markov Chains Are Important

• What is a Markov chain: a process in which the probability $p_{ij}$ of going from state $i$ to state $j$
  – depends only on these two states, and
  – does not depend on the previous history.
• Example: Markov chains describe gene-related processes in bioinformatics.
• Reminder: for each state $i$, the probabilities $p_{ij}$ of going to the different states $j = 1, \ldots, n$ should add up to one: $\sum_{j=1}^n p_{ij} = 1$.
• Computational advantage: we can determine the probabilities $p^{(2)}_{ij}$ of 2-step transitions as $p^{(2)}_{ij} = \sum_{k=1}^n p_{ik} \cdot p_{kj}$ (a small sketch follows below).
• Similarly, we can then define the probabilities of 3-step transitions, etc.
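As referenced above, a minimal sketch of the 2-step (and 3-step) transition probabilities computed as powers of the transition matrix; the 3-state matrix `p` is hypothetical illustration data, not taken from the paper:

```python
# 2-step transition probabilities: p2[i][j] = sum_k p[i][k] * p[k][j],
# i.e., the square of the transition matrix.
import numpy as np

# Hypothetical 3-state transition matrix; each row sums to 1.
p = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
])

p2 = p @ p          # 2-step transition probabilities
p3 = p @ p @ p      # 3-step transition probabilities, and so on

assert np.allclose(p2.sum(axis=1), 1.0)  # rows of p2 still sum to 1
print(p2)
```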
3. Markov Chains under Interval and Fuzzy Uncertainty

• In practice, we often do not know the exact values of the transition probabilities $p_{ij}$.
• Instead, we sometimes know the intervals $[\underline{p}_{ij}, \overline{p}_{ij}]$ of possible values of $p_{ij}$.
• Even more generally, we know fuzzy numbers $\mu_{ij}$ which describe these probabilities.
• A natural question: what can we conclude about the 2-step transition probabilities?
• Interval case: we would like to know the intervals
  $$\mathbf{p}^{(2)}_{ij} = \left\{ \sum_{k=1}^n p_{ik} \cdot p_{kj} : p_{ab} \in [\underline{p}_{ab}, \overline{p}_{ab}], \; \sum_{b=1}^n p_{ab} = 1 \right\}$$
  (a rough sketch of estimating such a range follows below).
• Fuzzy case: we would like to know the fuzzy sets corresponding to $p^{(2)}_{ij}$.
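As referenced above, one rough way to visualize such a range (not the paper's algorithm, which is efficient and exact) is a Monte Carlo inner approximation: sample row-stochastic matrices whose entries respect the given intervals and record the resulting values of $p^{(2)}_{ij}$. All interval bounds below are hypothetical illustration data:

```python
# Monte Carlo *inner* approximation of the range of p2[i][j] when each p[a][b]
# is only known to lie in an interval and each row must sum to 1.
import numpy as np

rng = np.random.default_rng(0)

low = np.array([[0.6, 0.1, 0.0],
                [0.2, 0.3, 0.2],
                [0.1, 0.2, 0.4]])
high = np.array([[0.8, 0.3, 0.2],
                 [0.4, 0.5, 0.4],
                 [0.3, 0.4, 0.6]])

def sample_row(lo, hi, rng, tries=1000):
    """Draw a row with entries in [lo, hi] that sums to 1 (rejection sampling)."""
    for _ in range(tries):
        row = rng.uniform(lo[:-1], hi[:-1])
        last = 1.0 - row.sum()
        if lo[-1] <= last <= hi[-1]:
            return np.append(row, last)
    raise RuntimeError("no feasible row found")

i, j = 0, 1
values = []
for _ in range(5000):
    p = np.vstack([sample_row(low[a], high[a], rng) for a in range(3)])
    values.append((p @ p)[i, j])

print("estimated range of p2[0][1]: [%.3f, %.3f]" % (min(values), max(values)))
```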
4. From the Computational Viewpoint, It Is Sufficient to Consider Interval Uncertainty

• Known: a fuzzy number $\mu_{ij}(p)$ can be described by its $\alpha$-cuts $\mathbf{p}_{ij}(\alpha) \stackrel{\text{def}}{=} \{ p \mid \mu_{ij}(p) \ge \alpha \}$.
• Conclusion: a fuzzy number $\mu_{ij}(p)$ can be viewed as a family of nested intervals $\mathbf{p}_{ij}(\alpha)$.
• Our objective: compute the fuzzy number $\mu^{(2)}_{ij}$ corresponding to the desired value $p^{(2)}_{ij}$.
• Known: each $\alpha$-cut of $\mu^{(2)}_{ij}$ can be computed based on the $\alpha$-cuts $\mathbf{p}_{ij}(\alpha)$ of the inputs.
• Example: to describe 10 different levels of uncertainty, we solve 10 interval computation problems (see the sketch below).
• Conclusion: from the computational viewpoint, it is sufficient to produce an efficient algorithm for the interval case.
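A minimal sketch of this fuzzy-to-interval reduction, assuming (purely as an illustration) that each transition probability is described by a triangular fuzzy number; the specific number (0.2, 0.3, 0.4) and the ten levels are hypothetical:

```python
# For a triangular fuzzy number with support [a, b] and peak m, the alpha-cut
# is the interval [a + alpha*(m - a), b - alpha*(b - m)]. Processing
# alpha = 0.1, 0.2, ..., 1.0 turns one fuzzy problem into ten interval problems,
# whose solutions are the alpha-cuts of the fuzzy answer.

def alpha_cut(a, m, b, alpha):
    """Alpha-cut of a triangular fuzzy number with support [a, b] and peak m."""
    return (a + alpha * (m - a), b - alpha * (b - m))

# Hypothetical fuzzy transition probability, roughly "about 0.3":
levels = [k / 10 for k in range(1, 11)]
for alpha in levels:
    lo, hi = alpha_cut(0.2, 0.3, 0.4, alpha)
    print(f"alpha = {alpha:.1f}: p in [{lo:.3f}, {hi:.3f}]")
    # each of these intervals would be fed to the interval-uncertainty algorithm
```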
5. Symmetric Markov Chains

• Known: there are efficient algorithms for computing 2-step transition probabilities under interval uncertainty.
• Symmetric Markov chains: in some practical situations, we have symmetric (T-invariant) Markov chains.
• Definition: the probability $p_{ij}$ of going from state $i$ to state $j$ is always equal to the probability of going from state $j$ to state $i$: $p_{ij} = p_{ji}$.
• Example: mutations and other transitions in bioinformatics.
• Symmetric Markov chains under interval uncertainty: in the formula describing this range, we impose the additional symmetry requirement (a sketch of this constrained set follows below):
  $$\mathbf{p}^{(2)}_{ij,\mathrm{sym}} = \left\{ \sum_{k=1}^n p_{ik} \cdot p_{kj} : p_{ab} \in [\underline{p}_{ab}, \overline{p}_{ab}], \; p_{ab} = p_{ba}, \; \sum_{b=1}^n p_{ab} = 1 \right\}.$$
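As referenced above, a small sketch of what the symmetric constraint set looks like computationally: with $p_{ab} = p_{ba}$, only the off-diagonal upper-triangle entries are free, and the row-sum constraints then fix the diagonal, coupling all rows together. The bounds below are hypothetical illustration data; this is not the paper's method:

```python
# Sampling symmetric transition matrices whose entries lie in given intervals.
import numpy as np

rng = np.random.default_rng(1)
n = 3
low = np.full((n, n), 0.05)    # hypothetical interval bounds
high = np.full((n, n), 0.9)

def sample_symmetric(low, high, rng, tries=10000):
    n = low.shape[0]
    for _ in range(tries):
        p = np.zeros((n, n))
        # free parameters: the off-diagonal entries p[i][j] with i < j
        for i in range(n):
            for j in range(i + 1, n):
                p[i, j] = p[j, i] = rng.uniform(low[i, j], high[i, j])
        # each diagonal entry is forced by the row-sum constraint ...
        diag = 1.0 - p.sum(axis=1)
        # ... and must still fall inside its own interval (this couples the rows)
        if np.all(diag >= np.diag(low)) and np.all(diag <= np.diag(high)):
            p[np.diag_indices(n)] = diag
            return p
    return None

print(sample_symmetric(low, high, rng))
```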
6. In General, Symmetry Helps, but Not for Markov Chains under Interval Uncertainty

• Known: the symmetry assumption usually enables us to speed up computations.
• First reason: because of symmetry, we need to store fewer data values $p_{ij}$.
• Second reason: for symmetric matrices (like $p_{ij}$), there are often faster algorithms.
• It is reasonable to expect: that under interval uncertainty, symmetry will also be helpful.
• Our result: contrary to these expectations, under interval uncertainty, symmetry makes Markov chain computations more complex.
• Precise result: computing the (endpoints of the) exact range $\mathbf{p}^{(2)}_{ij,\mathrm{sym}}$ of 2-step probabilities is NP-hard.
7. Proof of NP-Hardness

• Main idea: we reduce, to our problem, a known NP-hard subset problem:
  – given $n$ positive integers $s_1, \ldots, s_n$,
  – find the values $\varepsilon_i \in \{-1, 1\}$ for which $\sum_{i=1}^n \varepsilon_i \cdot s_i = 0$.
• Reduction: to each instance of the subset problem, we assign a Markov chain with
  $$[\underline{p}_{1i}, \overline{p}_{1i}] = \left[ \frac{1}{n} - \alpha \cdot s_i, \; \frac{1}{n} + \alpha \cdot s_i \right]$$
  for some small $\alpha > 0$ (see the sketch below).
• How to select $\alpha$: we want to guarantee that the probabilities are non-negative: $\underline{p}_{1i} = \dfrac{1}{n} - \alpha \cdot s_i \ge 0$.
• Example: $\alpha = \dfrac{1}{n \cdot \max_i s_i}$.
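A minimal sketch of the reduction step described on this slide; the instance [3, 1, 1, 2, 2, 1] is a hypothetical example and `reduction_intervals` is an illustrative helper, not code from the paper:

```python
# Build the intervals [1/n - alpha*s_i, 1/n + alpha*s_i] for the first row of
# the symmetric chain, with alpha chosen so that all lower endpoints stay
# non-negative. Exact rational arithmetic avoids rounding issues.
from fractions import Fraction

def reduction_intervals(s):
    n = len(s)
    alpha = Fraction(1, n * max(s))      # the choice of alpha suggested on this slide
    return alpha, [(Fraction(1, n) - alpha * si, Fraction(1, n) + alpha * si)
                   for si in s]

alpha, intervals = reduction_intervals([3, 1, 1, 2, 2, 1])   # hypothetical instance
assert all(lo >= 0 for lo, hi in intervals)                  # probabilities stay non-negative
print(alpha, intervals)
```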
8. Proof (cont-d)

• Due to symmetry: $p^{(2)}_{11,\mathrm{sym}} = \sum_{i=1}^n p_{1i} \cdot p_{i1} = \sum_{i=1}^n (p_{1i})^2$.
• Auxiliary notation: $\Delta_i \stackrel{\text{def}}{=} \dfrac{1}{\alpha} \cdot \left( p_{1i} - \dfrac{1}{n} \right)$; then
  $$p_{1i} = p_{i1} = \frac{1}{n} + \alpha \cdot \Delta_i \quad \text{and} \quad \Delta_i \in [-s_i, s_i].$$
• Since $\sum_{i=1}^n p_{1i} = 1$, we have $\sum_{i=1}^n \Delta_i = 0$, and
  $$p^{(2)}_{11,\mathrm{sym}} = \sum_{i=1}^n \left( \frac{1}{n} + \alpha \cdot \Delta_i \right)^2 = \frac{1}{n} + \alpha^2 \cdot \sum_{i=1}^n \Delta_i^2.$$
• Easy to prove: the value $p^{(2)}_{11,\mathrm{sym}} = \dfrac{1}{n} + \alpha^2 \cdot \sum_{i=1}^n s_i^2$ is possible $\Leftrightarrow$ the original instance of the subset problem has a solution (a numerical check is sketched below).
• The reduction is thus proven, so our problem is indeed NP-hard.
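A small numerical sanity check of the "⇐" direction of the equivalence above: whenever a sign assignment with $\sum_i \varepsilon_i \cdot s_i = 0$ exists, choosing $\Delta_i = \varepsilon_i \cdot s_i$ attains the value $\frac{1}{n} + \alpha^2 \cdot \sum_i s_i^2$. The brute-force search and the instance are illustrative only, not part of the proof:

```python
from fractions import Fraction
from itertools import product

s = [3, 1, 1, 2, 2, 1]                       # hypothetical instance with a solution
n = len(s)
alpha = Fraction(1, n * max(s))
bound = Fraction(1, n) + alpha**2 * sum(si**2 for si in s)

for eps in product((-1, 1), repeat=n):
    if sum(e * si for e, si in zip(eps, s)) == 0:            # a subset-problem solution
        p_row = [Fraction(1, n) + alpha * e * si for e, si in zip(eps, s)]
        value = sum(p * p for p in p_row)                    # p^(2)_{11,sym} for this choice
        assert value == bound                                # the bound is attained exactly
        print("solution", eps, "attains the bound", bound)
        break
else:
    print("no sign assignment sums to zero; the bound is not attained")
```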
9. Acknowledgments

This work was supported in part:
• by the Texas Department of Transportation grant No. 0-5453, and
• by the Texas Advanced Research Program Grant No. 003661-0008-2006.