
Parity Models: Erasure-Coded Resilience for Prediction Serving Systems - PowerPoint PPT Presentation

Parity Models: Erasure-Coded Resilience for Prediction Serving Systems
Jack Kosaian, Rashmi Vinayak, Shivaram Venkataraman
Inference: using a trained ML model


  1. Designing parity models. Goal: transform parities into a form that enables the decoder to reconstruct unavailable predictions. Queries X1 and X2 are encoded as P = X1 + X2 and handed to a parity model F_P. The decoder recovers an unavailable prediction as F(X2) = F_P(P) − F(X1), which requires F_P(P) = F(X1) + F(X2). Rather than hand-design such a transformation, learn a parity model F_P.
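A minimal sketch of this encode/decode arithmetic, using hypothetical linear stand-ins for F and F_P so that the additive identity holds exactly (in practice F is the deployed model and the learned F_P only approximates the sum):

    import numpy as np

    # Hypothetical linear stand-ins for the deployed model F and the
    # parity model F_P; with a shared linear map the identity is exact.
    W = np.random.randn(10, 784)

    def F(x):
        return W @ x

    def F_P(p):
        return W @ p

    X1, X2 = np.random.randn(784), np.random.randn(784)
    P = X1 + X2                      # encode: add the two queries

    # If F(X2) is unavailable, reconstruct it from the parity prediction:
    F_X2 = F_P(P) - F(X1)
    assert np.allclose(F_X2, F(X2))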

  2. Training a parity model. Encode sampled queries as P = X1 + X2; the desired output is F(X1) + F(X2). Training proceeds as follows: (1) sample inputs and encode them; (2) perform inference with the parity model to obtain F_P(P); (3) compute the loss between F_P(P) and the desired output F(X1) + F(X2); (4) backpropagate the loss; (5) repeat over newly sampled queries and their predictions.
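A condensed sketch of this training loop, assuming a PyTorch setup with a frozen deployed model `model_f`, a trainable `parity_model`, and a data loader `loader` that yields k queries at a time (all names are hypothetical, not the authors' code):

    import torch

    k = 2  # code parameter: number of queries combined into one parity

    opt = torch.optim.Adam(parity_model.parameters(), lr=1e-3)
    loss_fn = torch.nn.MSELoss()

    for X in loader:                        # X: tensor of shape [k, ...]
        # 1. Sample inputs and encode them into a parity query
        P = X.sum(dim=0, keepdim=True)      # P = X1 + X2

        # 2. Perform inference with the parity model
        out = parity_model(P)

        # 3. Compute loss against the desired output F(X1) + F(X2)
        with torch.no_grad():
            target = model_f(X).sum(dim=0, keepdim=True)
        loss = loss_fn(out, target)

        # 4. Backpropagate the loss and update the parity model
        opt.zero_grad()
        loss.backward()
        opt.step()
        # 5. Repeat over the training stream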

  3. Training a parity model: higher code parameter k. The same scheme works for larger k; for example, with k = 4, encode P = X1 + X2 + X3 + X4 and train F_P toward the desired output F(X1) + F(X2) + F(X3) + F(X4).
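The corresponding arithmetic for k = 4, again with hypothetical linear stand-ins so the identity is exact:

    import numpy as np

    # Same hypothetical linear stand-ins as before, now with k = 4.
    W = np.random.randn(10, 784)
    def F(x): return W @ x
    def F_P(p): return W @ p

    X1, X2, X3, X4 = (np.random.randn(784) for _ in range(4))
    P = X1 + X2 + X3 + X4            # encode k = 4 queries into one parity

    # If exactly one prediction, say F(X4), is slow or failed:
    F_X4 = F_P(P) - (F(X1) + F(X2) + F(X3))
    assert np.allclose(F_X4, F(X4))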

  4. Training a parity model: different encoders. The encoder that forms P need not be simple addition; encoders and decoders can be specialized to the inference task at hand.
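As one illustration of a task-specific encoder, a parity input for k = 4 image queries might be formed by downsampling each image and tiling the results into a 2×2 grid. This is a hedged sketch; the function name and details are assumptions, not necessarily the encoder used in the paper:

    import torch
    import torch.nn.functional as nnf

    def concat_encoder(images):
        # Encode k = 4 images of shape [4, C, H, W] into one parity image
        # of shape [1, C, H, W] by downsampling each image by 2x and
        # tiling the results in a 2x2 grid.
        assert images.shape[0] == 4
        small = nnf.interpolate(images, scale_factor=0.5, mode="bilinear",
                                align_corners=False)       # [4, C, H/2, W/2]
        top = torch.cat([small[0], small[1]], dim=2)        # [C, H/2, W]
        bottom = torch.cat([small[2], small[3]], dim=2)     # [C, H/2, W]
        return torch.cat([top, bottom], dim=1).unsqueeze(0) # [1, C, H, W]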

  5. Learning results in approximate reconstructions. This is appropriate for machine learning inference because (1) predictions resulting from inference are themselves approximations, and (2) the inaccuracy only comes into play when a prediction would otherwise be slow or failed.
  6. Parity models in action in Clipper. The frontend's encoder combines incoming queries into a parity query that is served by the parity model; the decoder uses the parity prediction, together with the available predictions, to reconstruct any unavailable prediction.
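A high-level sketch of how such a frontend might coordinate these pieces. The callables `replicas`, `parity_model`, `encode`, and `decode` are hypothetical stand-ins, and this is not Clipper's actual API:

    import concurrent.futures as futures

    def serve_group(queries, replicas, parity_model, encode, decode, timeout_s):
        # Dispatch k queries to k model replicas plus one parity query to
        # the parity model. Assumes at most one prediction per group is
        # slow or failed (a single-parity code tolerates one unavailability).
        parity_query = encode(queries)
        with futures.ThreadPoolExecutor(max_workers=len(replicas) + 1) as pool:
            jobs = [pool.submit(m, q) for m, q in zip(replicas, queries)]
            parity_job = pool.submit(parity_model, parity_query)
            preds = []
            for job in jobs:
                try:
                    preds.append(job.result(timeout=timeout_s))
                except futures.TimeoutError:
                    # Reconstruct the missing prediction from the parity
                    # prediction and the other available predictions.
                    others = [j.result() for j in jobs if j is not job]
                    preds.append(decode(parity_job.result(), others))
            return preds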

  7. Evaluation. (1) How accurate are reconstructions produced using parity models? (2) By how much can parity models help reduce tail latency?
