
Fast and Near-Optimal Algorithms for Approximating Distributions by Histograms

Jayadev Acharya (MIT), Ilias Diakonikolas (University of Edinburgh), Chinmay Hegde (MIT), Jerry Li (MIT), Ludwig Schmidt (MIT)

June 1, 2015


An $O(1/\epsilon)$ Sample Upper Bound (cont.)

Lemma. If $m = O(1/\epsilon)$, then $\|f - \hat{f}_m\|_2^2 \le \epsilon$ with probability 99/100.

Proof. We will show $\mathbb{E}[\|f - \hat{f}_m\|_2^2] \le \epsilon$; the probability bound then follows from Markov's inequality (after adjusting the constant in $O(1/\epsilon)$). Since $\hat{f}_m(i)$ is an unbiased estimate of $f(i)$,

$$\mathbb{E}[\|f - \hat{f}_m\|_2^2] = \mathbb{E}\Big[\sum_{i=1}^n (f(i) - \hat{f}_m(i))^2\Big] = \sum_{i=1}^n \mathrm{Var}\big[\hat{f}_m(i)\big].$$

But $\hat{f}_m(i) \sim \frac{1}{m}\mathrm{Bin}(m, f(i))$, and $\mathrm{Var}(\mathrm{Bin}(n, p)) = np(1-p)$. Hence

$$\mathbb{E}[\|f - \hat{f}_m\|_2^2] = \frac{1}{m^2}\sum_{i=1}^n m f(i)(1 - f(i)) \le \frac{1}{m}\sum_{i=1}^n f(i) = \frac{1}{m} \le \epsilon.$$
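The lemma is easy to sanity-check numerically. The following sketch (my own, not from the talk; all names are mine) draws $m = 100/\epsilon$ samples from a distribution $f$, forms the empirical distribution $\hat{f}_m$, and reports how often the squared $\ell_2$ distance exceeds $\epsilon$:

    import numpy as np

    def check_l2_lemma(f, eps, trials=200, seed=0):
        # Draw m = 100/eps samples from f, form the empirical distribution,
        # and report the fraction of trials with ||f - f_hat_m||_2^2 > eps.
        rng = np.random.default_rng(seed)
        m = int(np.ceil(100 / eps))
        failures = 0
        for _ in range(trials):
            counts = rng.multinomial(m, f)   # m i.i.d. draws from f, tallied per bin
            f_hat = counts / m               # the empirical distribution f_hat_m
            failures += np.sum((f - f_hat) ** 2) > eps
        return failures / trials

    f = np.full(50, 1 / 50)                  # uniform distribution on [50]
    print(check_l2_lemma(f, eps=0.01))       # expected failure rate well below 1/100

Since $\mathbb{E}[\|f - \hat{f}_m\|_2^2] \le 1/m = \epsilon/100$ here, Markov's inequality predicts a failure rate of at most $1/100$.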

An $O(1/\epsilon)$ Sample Upper Bound (cont.)

Lemma. If $m = O(1/\epsilon)$, then $\|f - \hat{f}_m\|_2^2 \le \epsilon$ with probability 99/100.

Corollary. Let $m = O(1/\epsilon)$, and let $h$ be such that $\|h - \hat{f}_m\|_2^2 \le \beta \cdot \mathrm{OPT}_k(\hat{f}_m)$. Then w.h.p. $\|h - f\|_2^2 \le \beta \cdot \mathrm{OPT}_k(f) + \epsilon$.

This reduces to a completely deterministic problem!

The Greedy Merging Algorithm: Outline of Rest of Talk

1. An $O(1/\epsilon)$ Sample Upper Bound
2. The Greedy Merging Algorithm
3. Analysis
4. Experimental Evaluation
5. Conclusions

The Greedy Merging Algorithm: Main Result

Main Algorithmic Result. An algorithm which, given $k \in \mathbb{N}$ and $q: [n] \to \mathbb{R}$ supported on $m$ elements, runs in time $O(m)$ and outputs a $5k$-histogram $h$ so that $\|h - q\|_2^2 \le 2 \cdot \mathrm{OPT}_k(q)$.

Corollary. An algorithm for learning histogram approximations that takes $O(1/\epsilon)$ samples, runs in time $O(1/\epsilon)$, and achieves $\alpha = 5$ and $\beta = 2$.

Proof. Draw $m = O(1/\epsilon)$ samples and form the empirical $\hat{f}_m$. Run the above algorithm on $\hat{f}_m$.

The Greedy Merging Algorithm: Flattening

Definition. Let $q: [n] \to \mathbb{R}$, and let $I \subseteq [n]$ be an interval.
- Let $q_I$ be the constant function on $I$ which is identically $\frac{1}{|I|}\sum_{i \in I} q(i)$. We call this the flattening of $q$ over $I$.
- Let $\mathrm{flat\text{-}err}_q(I) = \sum_{i \in I} (q(i) - q_I(i))^2$.

[Figure: a function and its flattening over an interval.]

Lemma. For any flat function $\varphi$ on $I$, $\mathrm{flat\text{-}err}_q(I) \le \sum_{i \in I} (q(i) - \varphi(i))^2$.
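The definition translates directly into code. A minimal sketch (function names are mine), representing an interval $I$ as a half-open index range:

    import numpy as np

    def flattening(q, start, end):
        # The flattening q_I of q over I = [start, end): the constant
        # function equal to the mean of q on I.
        return np.mean(q[start:end])

    def flat_err(q, start, end):
        # flat-err_q(I): the squared L2 error of replacing q by its
        # flattening on I.
        return np.sum((q[start:end] - flattening(q, start, end)) ** 2)

The lemma is the familiar fact that the mean minimizes squared error among constants, so the flattening is the best flat approximation of $q$ on $I$.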

The Greedy Merging Algorithm: Partitions

Definition. A partition of $[n]$ is a set of disjoint intervals $I_1, \ldots, I_r$ so that $\bigcup I_i = [n]$.

Any $k$-histogram induces a partition of $[n]$: the intervals on which it is constant. Conversely, any partition induces a unique $k$-histogram (for our purposes: the flattening of $q$ over each interval), as in the sketch below.

[Figures: a $k$-histogram with its induced partition, and a partition with its induced histogram.]
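A minimal sketch of the induced histogram (naming is mine):

    import numpy as np

    def histogram_from_partition(q, intervals):
        # The histogram a partition induces (for our purposes): flatten q
        # over each interval. Intervals are (start, end) with end exclusive.
        h = np.empty(len(q))
        for start, end in intervals:
            h[start:end] = np.mean(q[start:end])
        return h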

The Greedy Merging Algorithm: Algorithm Description

$q$ is $m$-sparse $\Rightarrow$ $q$ is an $O(m)$-histogram. Let $\mathcal{I}$ be the partition of $[n]$ that the jumps of $q$ induce.

While $|\mathcal{I}| \ge 5k$:
- Let $\mathcal{I} = \{I_1, \ldots, I_r\}$, where the $I_j$ are in order.
- Form the candidate merges $J_1 = I_1 \cup I_2$, $J_2 = I_3 \cup I_4$, $\ldots$, $J_{r/2} = I_{r-1} \cup I_r$.
- For $\ell = 1, \ldots, r/2$, compute $e_\ell = \mathrm{flat\text{-}err}_q(J_\ell)$.
- Let $L \subseteq \{1, \ldots, r/2\}$ be the set of $2k$ indices with largest $e_\ell$.
- Form $\mathcal{I}'$ by:
  - For $\ell \in L$, include $I_{2\ell - 1}$ and $I_{2\ell}$ (keep the pair split).
  - For $\ell \notin L$, include $J_\ell$ (merge the pair).
- Set $\mathcal{I} \leftarrow \mathcal{I}'$.

Output the flattening of $q$ over the intervals in $\mathcal{I}$. (A runnable sketch follows this description.)
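Here is a compact implementation of the loop above (a sketch with my own naming and one guard the slides leave implicit; it favors clarity over the strict $O(m)$ bound):

    import numpy as np

    def greedy_merge(q, k):
        # Greedy merging. Returns the final partition as (start, end)
        # intervals with end exclusive; flattening q over these intervals
        # gives the output histogram.
        n = len(q)
        # Initial partition: the intervals between consecutive jumps of q.
        cuts = [0] + [i for i in range(1, n) if q[i] != q[i - 1]] + [n]
        I = list(zip(cuts[:-1], cuts[1:]))

        def flat_err(start, end):
            block = q[start:end]
            return float(np.sum((block - block.mean()) ** 2))

        while len(I) >= 5 * k:
            pairs = [(I[i], I[i + 1]) for i in range(0, len(I) - 1, 2)]
            leftover = [I[-1]] if len(I) % 2 else []  # odd r: carry the last interval
            errs = [flat_err(a[0], b[1]) for a, b in pairs]
            # Keep the 2k pairs with the largest merge error split; merge the rest.
            # The min() guarantees progress on small partitions (an assumption of
            # mine; the slides leave this edge case implicit).
            n_split = min(2 * k, len(pairs) - 1)
            split = set(np.argsort(errs)[len(errs) - n_split:].tolist())
            new_I = []
            for j, (a, b) in enumerate(pairs):
                if j in split:
                    new_I += [a, b]              # keep I_{2l-1}, I_{2l} separate
                else:
                    new_I.append((a[0], b[1]))   # include the merged J_l
            I = new_I + leftover
        return I

As written, each iteration recomputes flattening errors from scratch and sorts them; the $O(m)$ bound in the main result needs prefix sums for the errors and linear-time selection of the top $2k$, but the merging logic is the same.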

The Greedy Merging Algorithm: Example ($k = 2$)

[Figures: the input distribution; the initial partition at iteration 0; the candidate merges, their errors, and the pairs kept split at iteration $i$; and the coarser partition at iteration $i + 1$.]
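To reproduce the flavor of these figures, one can run the earlier sketches (greedy_merge and histogram_from_partition above) on a toy input; the step distribution below is my own invention:

    import numpy as np

    # A 4-piece distribution on [50], sampled m = 1000 times.
    q = np.repeat([1.0, 4.0, 2.0, 3.0], [10, 15, 10, 15])
    q = q / q.sum()
    f_hat = np.random.default_rng(0).multinomial(1000, q) / 1000

    intervals = greedy_merge(f_hat, k=2)     # from the earlier sketch
    print(len(intervals), intervals)         # fewer than 5k = 10 intervals
    h = histogram_from_partition(f_hat, intervals)

The returned intervals should roughly track the four true pieces, with the extra interval budget absorbing sampling noise.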

Analysis: Outline of Rest of Talk

1. An $O(1/\epsilon)$ Sample Upper Bound
2. The Greedy Merging Algorithm
3. Analysis
4. Experimental Evaluation
5. Conclusions

Analysis: Runtime Analysis

Theorem. The greedy merging algorithm runs in time $O(m)$.

Proof Sketch. Each iteration can be performed in time proportional to the number of intervals left in the partition in that iteration. Let $s_j$ be the number of intervals after the $j$-th iteration of the algorithm. The $4k$ intervals kept split survive, and the remaining $s_j - 4k$ are merged in pairs, so

$$s_{j+1} = \frac{s_j - 4k}{2} + 4k = \frac{s_j}{2} + 2k \le \frac{9}{10}\, s_j$$

as long as $s_j \ge 5k$. Thus the interval counts decay geometrically, and the runtime is dominated by the runtime of the first iteration, which runs in time $O(m)$.
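Spelling out the last step (my addition; it follows directly from the recursion): the per-iteration costs form a geometric series, so

    \text{total time} \;\propto\; \sum_{j \ge 0} s_j
      \;\le\; \sum_{j \ge 0} \left(\tfrac{9}{10}\right)^{j} s_0
      \;=\; 10\, s_0 \;=\; O(m),

since $s_0 = O(m)$ for an $m$-sparse input.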

Analysis: Error Analysis

Theorem. Let $h$ be the output of our algorithm. Then $\|h - q\|_2^2 \le 2 \cdot \mathrm{OPT}_k(q)$.

Proof. Let $h^*$ be an optimal $k$-histogram, i.e. $\|h^* - q\|_2^2 = \mathrm{OPT}_k(q)$. Let $\mathcal{I} = \{I_1, \ldots, I_{5k}\}$ be the set of intervals we produce. Partition $\mathcal{I}$:
- Let $\mathcal{F}$ be the set of intervals in $\mathcal{I}$ on which $h^*$ has no jumps.
- Let $\mathcal{J}$ be the set of intervals in $\mathcal{I}$ on which $h^*$ has jumps.

Then

$$\|h - q\|_2^2 = \sum_{I \in \mathcal{F}} \mathrm{flat\text{-}err}_q(I) + \sum_{I \in \mathcal{J}} \mathrm{flat\text{-}err}_q(I).$$

We will bound each term separately.

Analysis: Error Analysis (cont.)

Theorem. Let $h$ be the output of our algorithm. Then $\|h - q\|_2^2 \le 2 \cdot \mathrm{OPT}_k(q)$.

Proof. Error on $\mathcal{F}$: Fix $I \in \mathcal{F}$. Since $h^*$ is flat on $I$, $\mathrm{flat\text{-}err}_q(I) \le \sum_{i \in I} (q(i) - h^*(i))^2$. Thus the squared error we have on $\mathcal{F}$ is at most the squared error of $h^*$ on $\mathcal{F}$, i.e.

$$\sum_{I \in \mathcal{F}} \mathrm{flat\text{-}err}_q(I) \;\le\; \sum_{I \in \mathcal{F}} \sum_{i \in I} (q(i) - h^*(i))^2 \;\le\; \|h^* - q\|_2^2 \;=\; \mathrm{OPT}_k(q).$$
