Designing parity models

Goal: transform parities into a form that enables the decoder to reconstruct unavailable predictions.

Encoder: P = X1 + X2
Decoder: F(X2) = F_P(P) - F(X1)

Learn a parity model F_P such that F_P(P) = F(X1) + F(X2)
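The decoder relation above can be sketched numerically. This is a minimal illustration using a linear stand-in model F, for which the additive parity relation holds exactly; for a learned, nonlinear F, the parity model F_P is trained to approximate F_P(P) = F(X1) + F(X2), and the reconstruction is approximate.

```python
import numpy as np

# Linear stand-in model so the parity relation is exact.
# For a nonlinear F, a learned parity model F_P approximates
# F_P(P) = F(X1) + F(X2) instead.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))

def F(X):
    """Stand-in linear model: F(X) = W @ X."""
    return W @ X

X1 = rng.standard_normal(4)
X2 = rng.standard_normal(4)
P = X1 + X2                      # encoder: simple addition

# Suppose F(X2) is unavailable (its server is slow or failed).
# Decode it from the parity prediction and the available F(X1):
F_X2_hat = F(P) - F(X1)          # F itself plays the role of F_P here

assert np.allclose(F_X2_hat, F(X2))
```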
Training a parity model

1. Sample inputs and encode: P = X1 + X2
2. Perform inference with the parity model: F_P(P)
3. Compute loss against the desired output F(X1) + F(X2)
4. Backpropagate the loss
5. Repeat
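The five training steps above can be sketched as a loop. This is an illustrative sketch, not the setup from the talk: the base model F is a small fixed nonlinear function, and the parity model is a linear map trained by stochastic gradient descent on a squared-error loss.

```python
import numpy as np

# Illustrative stand-ins (not the talk's models): a fixed nonlinear
# base model F and a linear, learnable parity model W_p.
rng = np.random.default_rng(1)
d_in, d_out = 8, 3
W_base = rng.standard_normal((d_out, d_in))
F = lambda X: np.maximum(W_base @ X, 0.0)   # fixed base model

W_p = np.zeros((d_out, d_in))               # parity model parameters
lr = 1e-2

for step in range(2000):
    # 1. Sample inputs and encode
    X1, X2 = rng.standard_normal(d_in), rng.standard_normal(d_in)
    P = X1 + X2
    # 2. Perform inference with the parity model
    out = W_p @ P
    # 3. Compute loss against the desired output F(X1) + F(X2)
    err = out - (F(X1) + F(X2))
    # 4. Backpropagate (gradient of squared error w.r.t. W_p)
    W_p -= lr * np.outer(err, P)
    # 5. Repeat
```

After training, F_P(P) = W_p @ P approximates F(X1) + F(X2) on average, so an unavailable F(X2) can be approximately reconstructed as F_P(P) - F(X1).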
Training a parity model: higher code parameter k

Can use a higher code parameter k, e.g., k = 4:
  Encoder: P = X1 + X2 + X3 + X4
  Desired output: F(X1) + F(X2) + F(X3) + F(X4)
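The additive encoder and decoder generalize directly to code parameter k: one parity query covers k queries, so any single unavailable prediction among the k can be reconstructed from the parity prediction and the other k-1. A sketch with a linear stand-in model, where decoding is exact:

```python
import numpy as np

def encode(Xs):
    """Additive encoder: P = X1 + ... + Xk."""
    return np.sum(Xs, axis=0)

def decode(parity_pred, available_preds):
    """Reconstruct the one missing prediction as
    F_P(P) minus the sum of the k-1 available predictions."""
    return parity_pred - np.sum(available_preds, axis=0)

# Demo with k = 4 and a linear stand-in model (decoding is exact here).
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 5))
F = lambda X: W @ X
Xs = [rng.standard_normal(5) for _ in range(4)]
P = encode(Xs)
missing = 2                                  # F(X3) is unavailable
others = [F(X) for i, X in enumerate(Xs) if i != missing]
F_X3_hat = decode(F(P), others)
assert np.allclose(F_X3_hat, F(Xs[missing]))
```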
Training a parity model: different encoders

Can specialize encoders and decoders to the inference task at hand.
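As one illustration of a task-specialized encoder (hypothetical here, the talk's encoders may differ): for image inputs, a parity query can be formed by downsampling each of k = 4 images and tiling them into a single image of the original size, rather than summing raw pixels.

```python
import numpy as np

# Hypothetical image encoder for k = 4: 2x2 average-pool each image,
# then tile the four downsampled images in a 2x2 grid so the parity
# query has the same shape as an ordinary query.
def image_encoder(images):
    assert len(images) == 4
    h, w = images[0].shape
    down = [img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
            for img in images]
    top = np.hstack([down[0], down[1]])
    bottom = np.hstack([down[2], down[3]])
    return np.vstack([top, bottom])

imgs = [np.random.default_rng(i).random((8, 8)) for i in range(4)]
P = image_encoder(imgs)
assert P.shape == (8, 8)   # parity image matches the query shape
```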
Learning results in approximate reconstructions

This is appropriate for machine learning inference:
1. Predictions resulting from inference are themselves approximations
2. Inaccuracy comes into play only when predictions would otherwise be slow or failed
Parity models in action in Clipper

[Architecture diagram: queries arrive at the frontend; the encoder combines them into a parity query served by the parity model; the decoder reconstructs unavailable predictions.]
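A toy end-to-end sketch of the flow in the diagram: the frontend encodes a batch of queries into a parity query, dispatches everything, and the decoder fills in at most one unavailable prediction. The function names and structure here are illustrative, not Clipper's API, and a linear stand-in model makes the decode exact.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((3, 5))
F = lambda X: W @ X                  # deployed model (linear stand-in)
F_P = F                              # parity model; exact for linear F

def serve(queries, unavailable=None):
    """Serve a batch of k queries plus one parity query; reconstruct
    at most one unavailable prediction via the decoder."""
    P = np.sum(queries, axis=0)                 # encoder
    parity_pred = F_P(P)                        # parity model inference
    preds = [None if i == unavailable else F(X)
             for i, X in enumerate(queries)]
    if unavailable is not None:                 # decoder path
        others = [p for p in preds if p is not None]
        preds[unavailable] = parity_pred - np.sum(others, axis=0)
    return preds

queries = [rng.standard_normal(5) for _ in range(2)]
fast = serve(queries)                           # normal path
degraded = serve(queries, unavailable=0)        # one prediction missing
assert np.allclose(degraded[0], fast[0])
```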
Evaluation

1. How accurate are reconstructions using parity models?
2. By how much can parity models help reduce tail latency?