Fast Homomorphic Evaluation of Deep Discretized Neural Networks
Florian Bourse, Michele Minelli, Matthias Minihold, Pascal Paillier
ENS, CNRS, PSL Research University, INRIA (work done while visiting CryptoExperts)
CRYPTO 2018 – UCSB, Santa Barbara
Machine Learning as a Service (MLaaS)
Alice sends her input x to the server, which evaluates its model and returns the prediction M(x): a threat to Alice's privacy!
Possible solution: FHE. Alice sends Enc(x) instead, and the server returns Enc(M(x)).
✓ Privacy: data is encrypted (both input and output)
✗ Efficiency: the main issue with FHE-based solutions
Goal of this work: homomorphic evaluation of trained networks.
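To make the MLaaS-with-FHE flow concrete, here is a minimal Python sketch of the protocol as described above; keygen, enc, dec and eval_homomorphic are placeholders for a generic FHE interface, not functions from the talk or from any real library:

# Hypothetical FHE interface: keygen, enc, dec, eval_homomorphic are assumptions of this sketch.
def mlaas_with_fhe(x, model, keygen, enc, dec, eval_homomorphic):
    pk, sk = keygen()                                # Alice's keys; the server never sees sk
    ct_in = enc(pk, x)                               # Alice sends Enc(x)
    ct_out = eval_homomorphic(pk, model, ct_in)      # server computes Enc(M(x)) without seeing x
    return dec(sk, ct_out)                           # only Alice can recover M(x)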
(Very quick) refresher on neural networks
[Figure: a feed-forward network with an input layer of dimension d, several hidden layers, and an output layer.]
Computation for every neuron: the inputs x_i are combined with the weights w_i to produce an output y, with x_i, w_i, y ∈ ℝ:
    y = f(∑_i w_i x_i),
where f is an activation function.
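As a concrete toy illustration of this per-neuron computation, here is a minimal Python sketch; it assumes NumPy, and the sign activation and the numbers are chosen purely for the example:

import numpy as np

def neuron(x, w, f):
    # Apply the activation f to the multisum  sum_i w_i * x_i.
    return f(np.dot(w, x))

# Illustrative values only.
x = np.array([1.0, -2.0, 0.5])
w = np.array([0.3, 0.1, -0.7])
y = neuron(x, w, np.sign)  # sign(0.3 - 0.2 - 0.35) = sign(-0.25) = -1.0
print(y)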
A specific use case
We consider the problem of digit recognition: given the image of a handwritten digit (e.g., a 7), output which digit it is.
Dataset: MNIST (60 000 training images + 10 000 test images).
State of the art: Cryptonets [DGBL+16]
✓ Achieves blind, non-interactive classification
✓ Near state-of-the-art accuracy (98.95%)
✗ Replaces sigmoidal activation functions with the low-degree polynomial f(x) = x²
✗ Uses SHE ⇒ parameters have to be chosen at setup time
Main limitation: the computation at neuron level depends on the total multiplicative depth of the network ⇒ bad for deep networks!
Goal: make the computation scale-invariant ⇒ bootstrapping.
A restriction on the model
We want to homomorphically compute the multisum ∑_i w_i x_i.
Given w_1, …, w_p and Enc(x_1), …, Enc(x_p), compute ∑_i w_i · Enc(x_i).
Proceed with caution: in order to maintain correctness, we need w_i ∈ ℤ ⇒ trade-off between efficiency and accuracy!
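To illustrate why integer weights suffice, here is a minimal Python sketch of the multisum computed with only the operations a linearly homomorphic scheme offers; ct_add and ct_scalar_mul are placeholders for scheme-specific routines, not functions of an actual library:

def homomorphic_multisum(weights, ciphertexts, ct_add, ct_scalar_mul):
    # Compute Enc(sum_i w_i * x_i) from integer weights w_i (w_i in Z) and ciphertexts Enc(x_i),
    # using only homomorphic addition and multiplication by an integer scalar.
    acc = None
    for w, c in zip(weights, ciphertexts):
        term = ct_scalar_mul(w, c)
        acc = term if acc is None else ct_add(acc, term)
    return acc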
Discretized neural networks (DiNNs)
Goal: an FHE-friendly model of neural network.
Definition. A DiNN is a neural network whose inputs are integer values in {−I, …, I} and whose weights are integer values in {−W, …, W}, for some I, W ∈ ℕ. For every activated neuron of the network, the activation function maps the multisum to integer values in {−I, …, I}.
– Not as restrictive as it seems: e.g., binarized NNs;
– Trade-off between size and performance;
– (A basic) conversion is extremely easy (a sketch follows below).
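The slide does not spell out the basic conversion; the following Python sketch shows one natural scale-and-round approach, assuming NumPy, with the layer shape and the per-layer scaling heuristic chosen purely for illustration:

import numpy as np

def discretize(values, bound):
    # Map real values to integers in {-bound, ..., bound} by scaling and rounding.
    scale = bound / np.max(np.abs(values))
    return np.clip(np.rint(values * scale), -bound, bound).astype(int)

W = 10                                    # weight bound of the target DiNN
weights = np.random.randn(30, 784)        # e.g., the first layer of an MNIST network
dinn_weights = discretize(weights, W)     # integer weights in {-10, ..., 10}

Rounding of course loses precision, which is exactly the size/performance trade-off mentioned above.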
Homomorphic evaluation of a DiNN
1. Evaluate the multisum: easy, we just need a linearly homomorphic scheme:
       ∑_i w_i · Enc(x_i) = Enc(∑_i w_i x_i)
2. Apply the activation function: depends on the function: f(∑_i w_i x_i)
3. Bootstrap: can be costly; the neuron's output is then Enc*(f(∑_i w_i x_i))
4. Repeat for all the layers.
Issues:
– Choosing the message space: guess, statistics, or worst-case;
– The noise grows: we need to start from a very small noise;
– How do we apply the activation function homomorphically?
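As a summary of how these four steps compose, here is a high-level Python sketch written for this write-up (not taken from the talk); homomorphic_multisum and activate_and_bootstrap stand in for scheme-specific routines and are assumptions of the sketch:

def evaluate_dinn(layers, enc_inputs, homomorphic_multisum, activate_and_bootstrap):
    # layers:      list of integer weight matrices, one per fully connected layer
    # enc_inputs:  encryptions Enc(x_1), ..., Enc(x_d) of the integer inputs
    enc_values = enc_inputs
    for weight_matrix in layers:                      # step 4: repeat for all the layers
        # step 1: one multisum per neuron, using only linear homomorphic operations
        multisums = [homomorphic_multisum(row, enc_values) for row in weight_matrix]
        # steps 2-3: apply the activation function and refresh the noise (bootstrap)
        enc_values = [activate_and_bootstrap(c) for c in multisums]
    return enc_values                                 # encryptions of the output scores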