
Mix Your Contexts Well: Opportunities Unleashed by Recent Advances in Scaling Context-Sensitivity (PowerPoint PPT Presentation)



  1. Mix Your Contexts Well: Opportunities Unleashed by Recent Advances in Scaling Context-Sensitivity. Manas Thakur (IIT Mandi, India) and V. Krishna Nandivada (IIT Madras, India). CC 2020, Feb 23, 2020.

  2. Context sensitivity. "Your response should be sensitive to the context" – Anonymous. Context sensitivity is a popular way to improve the precision of program analysis, especially for OO programs. Compared to context-insensitive analyses, context-sensitive ones are usually more precise, but usually much harder to scale.

  3. Context sensitivity. "Your response should be sensitive to the context" – Anonymous. Context sensitivity is a popular way to improve the precision of program analysis, especially for OO programs. Compared to context-insensitive analyses, context-sensitive ones are usually more precise, but usually much harder to scale. The reason: a method may be analyzed multiple times, once for each unique context from which it may be called.
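
     To make the point concrete, here is a minimal, hypothetical Java example (the class and method names are ours, not from the slides). A context-insensitive analysis merges the facts flowing into id from both call sites, whereas a context-sensitive one analyzes id once per calling context and keeps the results apart.

     // Hypothetical illustration: why analyzing a method per calling context helps.
     class CtxDemo {
         static Object id(Object o) { return o; }   // one method, two calling contexts

         public static void main(String[] args) {
             Object s = id("hello");       // call site C1: argument is a String
             Object a = id(new int[8]);    // call site C2: argument is an int[]
             // Context-insensitive points-to: s and a may each point to {String, int[]}.
             // 1-call-site-sensitive: id is analyzed separately for C1 and C2,
             // so the analysis knows s -> {String} and a -> {int[]}.
             System.out.println(s.getClass() + " / " + a.getClass());
         }
     }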

  4. Many context abstractions. "What is right" depends on the context – Upanishads. Several popular context abstractions exist in the literature:
     • Call-site sensitivity [Sharir & Pnueli 1978]
     • Object sensitivity [Milanova et al. 2005]
     • Value contexts [Khedker & Karkare 2008]
     • LSRV contexts [Thakur & Nandivada 2019]
     • All of the above with heap cloning [Nystrom et al. 2004]
     The choice of context abstraction plays an important role in determining the precision and scalability of the analysis, yet the relative advantages in terms of precision and scalability are not well established.

  5. Call-site sensitivity

     1.  class A {
     2.    A f1, f2;
     3.    void foo() {
     4.      A a, b, c, d;
             ...
     5.      c.bar(a);
     6.      d.bar(b);
     7.    }
     8.    void bar(A p) {
     9.      A x = new A();
     10.     p.f1.f2 = x;
     11.     p.fb();
     12.     p.fb();
     13.   }
         }
         // Assume fb does not access the caller's heap.

     2 contexts for bar: foo 5 and foo 6.
     4 contexts for fb: foo 5+bar 11, foo 5+bar 12, foo 6+bar 11, foo 6+bar 12.
     What happens in case of recursion?
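
     The following is a minimal sketch, entirely ours (not the paper's or any framework's implementation), of how a k-limited call-string context could be represented: the context is the sequence of the k most recent call sites, and a callee is re-analyzed whenever this key has not been seen before.

     import java.util.ArrayList;
     import java.util.List;

     // Sketch only: a k-limited call-string (call-site-sensitive) context.
     // Call sites are identified here by their line numbers, as in the slide.
     final class CallStringContext {
         private final List<Integer> sites;   // most recent call site last

         private CallStringContext(List<Integer> sites) { this.sites = sites; }

         static CallStringContext empty() { return new CallStringContext(List.of()); }

         // Extend the context with a call site, keeping only the k most recent sites.
         CallStringContext push(int callSiteLine, int k) {
             List<Integer> next = new ArrayList<>(sites);
             next.add(callSiteLine);
             while (next.size() > k) next.remove(0);   // k-limiting
             return new CallStringContext(List.copyOf(next));
         }

         @Override public boolean equals(Object o) {
             return o instanceof CallStringContext c && sites.equals(c.sites);
         }
         @Override public int hashCode() { return sites.hashCode(); }
         @Override public String toString() { return sites.toString(); }
     }

     For the example above and k >= 2, bar's contexts would be [5] and [6], and fb's would be [5, 11], [5, 12], [6, 11], [6, 12]. One common answer to the recursion question on the slide is precisely this k-limiting: truncating call strings to their k most recent entries keeps the number of contexts finite even for recursive programs.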

  6. Value contexts [CC'08]

     (Code of class A as on the previous slide.)

     [Figures: points-to graphs and the corresponding value contexts for the calls at line 5 and line 6.]

     bar is analyzed twice, but fb only once; value contexts generally scale better.
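
     Below is a rough sketch of the value-contexts idea as we read [CC'08] (the class and type names are ours): the context of a call is the abstract data-flow value reaching the callee (for a points-to analysis, the points-to information of its arguments), and a method body is re-analyzed only when it is entered with a value not seen before; otherwise a memoized summary is reused. This is why bar above is analyzed twice (p carries different points-to values at lines 5 and 6) but fb only once.

     import java.util.HashMap;
     import java.util.Map;

     // Sketch: memoize method summaries keyed by (method, abstract entry value).
     // "V" stands for whatever data-flow value the analysis tracks at the entry
     // of the callee (e.g., the points-to graph reaching its parameters).
     final class ValueContextCache<V, Summary> {

         private record Key<T>(String method, T entryValue) {}

         private final Map<Key<V>, Summary> cache = new HashMap<>();

         // Returns the memoized summary, or null if this value context is new
         // and the callee must be analyzed under it.
         Summary lookup(String method, V entryValue) {
             return cache.get(new Key<>(method, entryValue));
         }

         void record(String method, V entryValue, Summary summary) {
             cache.put(new Key<>(method, entryValue), summary);
         }
     }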

  7. Recent advance: LSRV contexts [CC'19] (Level Summarized Relevant Value contexts)

     (Code of class A as on slide 5.)

     [Figures: for the calls at line 5 and line 6, the relevant value context of p and its level-summarized LSRV context, with deeper objects summarized into O_D.]

     Result: bar and fb are both analyzed only once!
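
     The sketch below captures only the intuition we take from the figure, under our own simplifying assumptions (the data structures and the summarize function are ours; the precise notions of "relevant" and of level summarization are defined in [CC'19]): objects beyond the level that matters to the callee are collapsed into a single summary node O_D, so two call sites whose arguments differ only in those deeper objects produce the same context and the callee is analyzed once.

     import java.util.Map;
     import java.util.Set;
     import java.util.TreeMap;

     // Very rough sketch of level summarization (ours, not the paper's algorithm):
     // keep the shape of the argument's points-to information, but replace the
     // identities of deeper objects with the summary node O_D.
     final class LsrvSketch {
         static final String SUMMARY = "O_D";

         // pointsTo maps access paths (e.g., "p", "p.f1") to abstract objects.
         // Identical summaries => the callee can reuse its earlier analysis.
         static String summarize(Map<String, Set<String>> pointsTo) {
             StringBuilder ctx = new StringBuilder();
             for (String path : new TreeMap<>(pointsTo).keySet()) {
                 ctx.append(path).append("->").append(SUMMARY).append(';');
             }
             return ctx.toString();
         }

         public static void main(String[] args) {
             // Line 5: p -> O_i, p.f1 -> {O_a, O_j};  Line 6: p -> O_k, p.f1 -> {O_b, O_l}.
             String line5 = summarize(Map.of("p", Set.of("O_i"), "p.f1", Set.of("O_a", "O_j")));
             String line6 = summarize(Map.of("p", Set.of("O_k"), "p.f1", Set.of("O_b", "O_l")));
             System.out.println(line5.equals(line6));   // true: bar analyzed only once
         }
     }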

  8. Another popular choice: Object sensitivity

     (Code of class A as on slide 5.)

     [Figures: points-to graphs before the calls at line 5 and line 6.]

     Object sensitivity gives 2 contexts for bar: receiver O_c at line 5 and receiver O_l at line 6.
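
     A minimal sketch (again our own illustration, in its simplest 1-object-sensitive form): the context under which a callee is analyzed is the abstract receiver object, i.e. its allocation site, so the calls at lines 5 and 6, with receivers O_c and O_l, give bar two contexts regardless of how many call sites there are.

     import java.util.HashMap;
     import java.util.HashSet;
     import java.util.Map;
     import java.util.Set;

     // Sketch: 1-object-sensitivity keys the analysis of a virtual call on the
     // abstract receiver object (its allocation site), not on the call site.
     final class ObjectSensitivitySketch {
         // For each method, the receiver allocation sites it has been analyzed under.
         private final Map<String, Set<String>> analyzed = new HashMap<>();

         // Returns true if (method, receiver) is a new context that must be analyzed.
         boolean needsAnalysis(String method, String receiverAllocSite) {
             return analyzed.computeIfAbsent(method, m -> new HashSet<>())
                            .add(receiverAllocSite);
         }

         public static void main(String[] args) {
             ObjectSensitivitySketch s = new ObjectSensitivitySketch();
             System.out.println(s.needsAnalysis("A.bar", "O_c"));  // line 5: true
             System.out.println(s.needsAnalysis("A.bar", "O_l"));  // line 6: true (2nd context)
             System.out.println(s.needsAnalysis("A.bar", "O_c"));  // repeated receiver: false
         }
     }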

  9. What we know. "What I know is limited, what I don't is unlimited!" – Common folklore.

     [Figure: precision relationships among kcs, valcs, lsrv, kobj, and their heap-cloning variants kcsH and kobjH (call-string based vs. object-sensitivity based).]

     k-call-site sensitivity, value contexts, and LSRV contexts have the same precision [CC'08, CC'19].
     Adding heap cloning improves the precision of call-site sensitivity as well as object sensitivity.
     The relative precisions of object sensitivity and call-site/value/LSRV contexts are incomparable.

  10. What we know. "What I know is limited, what I don't is unlimited!" – Common folklore.

     [Figure: precision relationships among kcs, valcs, lsrv, kobj, and their heap-cloning variants kcsH and kobjH (call-string based vs. object-sensitivity based).]

     k-call-site sensitivity, value contexts, and LSRV contexts have the same precision [CC'08, CC'19].
     Adding heap cloning improves the precision of call-site sensitivity as well as object sensitivity.
     The relative precisions of object sensitivity and call-site/value/LSRV contexts are incomparable.
     Our goal: get the best of both worlds.

  11. Adding heap cloning

     Heap cloning specializes objects (allocation sites) with the context in which they are created: an object allocated at line l in context c is represented as O_l^c.
     It improves the partitioning efficacy of context sensitivity, and usually exposes more optimization opportunities, but at an increased analysis cost.
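
     A minimal sketch (ours) of the O_l^c notation as a data structure: with heap cloning, an abstract heap object is identified by the pair (allocation site, context of the allocating method), so the new A() at line 9 of the class A example yields two distinct abstract objects, one per calling context of bar.

     // Sketch: a heap-cloned abstract object O_l^c is the pair
     // (allocation-site line l, context c of the allocating method).
     // "Ctx" stands for whichever context abstraction the analysis uses.
     record HeapClonedObject<Ctx>(int allocSiteLine, Ctx allocContext) {

         public static void main(String[] args) {
             // Without heap cloning, both are the single abstract object O_9;
             // with cloning, the two contexts of bar give two distinct clones.
             var o1 = new HeapClonedObject<>(9, "foo:5");
             var o2 = new HeapClonedObject<>(9, "foo:6");
             System.out.println(o1.equals(o2));   // false: two clones of the same site
         }
     }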

  12. Surprise #1 with heap cloning: kcsH vs valcsH. "Expectations reduce the joy, surprise enhances joy!" – Anonymous.

     Recall: valcs ≡_precision kcs.

     1  class D {
     2    void m1() {
     3      P p = new P();
     4      Q q1 = p.m2();
     5      Q q2 = p.m2(); }
     6    Q m2() {
     7      return new Q();
     8    } /*m2*/
     9  } /*class D*/

  13. Surprise #1 with heap cloning: kcsH vs valcsH

     Recall: valcs ≡_precision kcs.

     (Code of class D as on the previous slide.)

     kcs: m2 is analyzed twice. valcs: m2 is analyzed once.
     Both report that q1 and q2 are aliases after line 5.

  14. Surprise #1 with heap cloning: kcsH vs valcsH

     Recall: valcs ≡_precision kcs.

     (Code of class D as on slide 12.)

     kcs: m2 is analyzed twice. valcs: m2 is analyzed once.
     Both report that q1 and q2 are aliases after line 5.
     With heap cloning: kcs reports that q1 and q2 are not aliases, while valcs reports that they are aliases.

  15. Surprise #1 with heap cloning: kcsH vs valcsH

     Recall: valcs ≡_precision kcs.

     (Code of class D as on slide 12.)

     kcs: m2 is analyzed twice. valcs: m2 is analyzed once.
     Both report that q1 and q2 are aliases after line 5.
     With heap cloning: kcs reports that q1 and q2 are not aliases, while valcs reports that they are aliases.
     Hence: valcsH ≤_precision kcsH (and, as this example shows, the two are no longer equivalent).
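
     The following small model (our own, using the line numbers as reconstructed above) shows where the two heap-cloned analyses diverge on class D: under kcsH the new Q() inside m2 is cloned once per call string, so q1 and q2 point to different clones and cannot alias; under valcsH both calls enter m2 with the same value context, so there is a single clone and q1, q2 are reported as aliases.

     import java.util.Collections;
     import java.util.Set;

     // Sketch (our modelling): the abstract objects that q1 and q2 point to
     // in class D, with heap cloning, under each context abstraction.
     final class SurpriseOneSketch {
         record Obj(String allocSite, String cloneContext) {}

         public static void main(String[] args) {
             // kcsH: the allocation in m2 is cloned per call string ([m1:4] vs [m1:5]).
             Set<Obj> q1kcs = Set.of(new Obj("m2: new Q()", "[m1:4]"));
             Set<Obj> q2kcs = Set.of(new Obj("m2: new Q()", "[m1:5]"));
             // valcsH: both calls carry the same value context, so a single clone.
             Set<Obj> q1val = Set.of(new Obj("m2: new Q()", "[vc0]"));
             Set<Obj> q2val = Set.of(new Obj("m2: new Q()", "[vc0]"));

             // Alias check: do the points-to sets intersect?
             System.out.println("kcsH:   " + (!Collections.disjoint(q1kcs, q2kcs)));  // false
             System.out.println("valcsH: " + (!Collections.disjoint(q1val, q2val)));  // true
         }
     }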

  16. Surprise #2 with heap cloning: lsrvH vs valcsH. "I love surprises as long as I like the outcome!" – Anonymous.

     Recall: lsrv ≡_precision valcs.
     With heap cloning: lsrvH ≤_precision valcsH*.
     (*Reasoning in the paper.)

  17. Surprise #2 with heap cloning: lsrvH vs valcsH

     Recall: lsrv ≡_precision valcs.
     With heap cloning: lsrvH ≤_precision valcsH* ≤_precision kcsH.
     (*Reasoning in the paper.)

  18. Surprise #2 with heap cloning: lsrvH vs valcsH

     Recall: lsrv ≡_precision valcs.
     With heap cloning: lsrvH ≤_precision valcsH* ≤_precision kcsH.
     Moreover, recall that lsrv ≢_precision kobj (incomparable); similarly, lsrvH ≢_precision kobjH* (incomparable).
     (*Reasoning in the paper.)

  19. Our idea: mix the contexts. "An insight has little value till it leads to a workable idea" – Anonymous.

     Insight: Heap cloning alters the precision relations quite a bit. Existing context abstractions miss cases covered by each other, and some approaches (e.g., the LSRV variants) scale very well.
     Q: Why not use the abstractions together?
     Idea: Merge abstractions c1 and c2 to obtain c1•2, such that c1•2 covers the optimization opportunities covered by both c1 and c2. In the resulting analysis, a method is re-analyzed if either its c1 component or its c2 component differs from every previously seen context.
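
     A minimal sketch of the merged context (our reading of the idea, not the paper's implementation): c1•2 is simply a pair of the two component contexts, and since its equality requires both components to match, a callee is re-analyzed whenever either component is new.

     import java.util.HashMap;
     import java.util.Map;

     // Sketch: the merged context c1•2 as a pair of component contexts.
     // Record equality compares BOTH components, so a method is re-analyzed
     // if either its c1 or its c2 component differs from every seen pair.
     record MixedContext<C1, C2>(C1 c1, C2 c2) {

         public static void main(String[] args) {
             Map<MixedContext<String, String>, String> summaries = new HashMap<>();

             // Hypothetical components: a call-string context and an object-sensitive one.
             var a = new MixedContext<>("[foo:5]", "recv=O_c");
             var b = new MixedContext<>("[foo:5]", "recv=O_l");   // same c1, different c2

             summaries.put(a, "summary of bar under a");
             System.out.println(summaries.containsKey(b));   // false: new context, re-analyze
         }
     }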
