Advances in machine learning have revealed its great potential for emerging mobile applications such as face recognition and voice assistants. Models trained via a Neural Network (NN) can offer accurate and efficient inference services to mobile users. Unfortunately, the current deployment of such services raises privacy concerns: directly offloading the model to the mobile device violates the model privacy of the model owner, while feeding user input to the service compromises user privacy. To address this issue, we propose, tailor, and evaluate Leia, a lightweight cryptographic NN inference system at the edge. Unlike prior cryptographic NN inference systems, Leia is designed from two mobile-friendly perspectives. First, Leia leverages the paradigm of edge computing, wherein the inference procedure keeps the model close to the mobile user to foster low-latency service. Specifically, Leia’s architecture consists of two non-colluding edge services that obliviously perform NN inference on the encoded user data and model. Second, Leia’s realization makes judicious use of the potentially constrained computational and communication resources of edge devices. In particular, Leia adapts the Binarized Neural Network (BNN), a trending flavor of NN model with low memory footprint and computational cost, and relies purely on lightweight secret sharing techniques to build the secure blocks of the BNN. Empirical validation on Raspberry Pi confirms the practicality of Leia, showing that it can produce a prediction result with 97% accuracy within 4 seconds in the edge environment.
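The two-server design rests on additive secret sharing: the user's input (and the model) is split into random shares, one per non-colluding edge server, so neither server alone learns anything, while linear operations can be computed share-wise. The following is a minimal illustrative sketch of this general technique (not Leia's actual protocol, whose concrete share encoding and BNN blocks are described in the paper); the modulus and helper names are our own assumptions.

```python
import secrets

MOD = 1 << 32  # illustrative choice: additive shares over Z_{2^32}

def share(x):
    """Split x into two additive shares that sum to x mod 2^32."""
    r = secrets.randbelow(MOD)
    return r, (x - r) % MOD

def reconstruct(s0, s1):
    """Combine the two shares to recover the secret."""
    return (s0 + s1) % MOD

# Each edge server holds one share; neither learns x on its own.
x = 1234
s0, s1 = share(x)
assert reconstruct(s0, s1) == x

# Linear operations compose share-wise: the servers can add two
# shared values locally, with no interaction, and the result is
# a valid sharing of the sum.
y = 42
t0, t1 = share(y)
assert reconstruct((s0 + t0) % MOD, (s1 + t1) % MOD) == (x + y) % MOD
```

Multiplications (and the nonlinear BNN blocks such as binary activation) require interaction or correlated randomness between the servers; keeping those interactive steps cheap is what makes this approach attractive on constrained edge hardware.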
