Poster in Workshop: Privacy in Machine Learning (PriML) 2021

ABY2.0: New Efficient Primitives for STPC with Applications to Privacy in Machine Learning (Extended Abstract)

Arpita Patra · Hossein Yalame · Thomas Schneider · Ajith Suresh


Abstract: In this work, we improve semi-honest secure two-party computation (STPC) over rings, especially for privacy-preserving machine learning (PPML), with a focus on the efficiency of the online phase. We construct efficient protocols for several PPML primitives such as scalar product, matrix multiplication, ReLU, and maxpool. The online communication of our scalar product is two ring elements {\em irrespective} of the vector dimension, a feature achieved for the first time in the PPML literature. We implement and benchmark training and inference of Logistic Regression and Neural Networks in LAN and WAN settings. For training, we improve online runtime (both for LAN and WAN) over SecureML (Mohassel et al., IEEE S\&P'17) in the range $1.5\times$--$6.1\times$, while for inference, the improvements are in the range of $2.5\times$--$754.3\times$.
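To make the dimension-independent online cost concrete, below is a minimal, insecure single-process simulation sketch of a scalar product under an ABY2.0-style masked sharing, where both parties hold the masked value $\Delta_x = x + \delta_x$ and the mask $\delta_x$ is additively shared. All identifiers (`preprocess`, `scalar_product`, etc.) are hypothetical illustration names, not the authors' implementation, and the "dealer" preprocessing stands in for the paper's input-independent setup phase.

```python
# Insecure simulation of an ABY2.0-style scalar product over Z_{2^64}.
# It illustrates why the online phase costs two ring elements total,
# independent of the vector dimension: after preprocessing, each party
# sends exactly one masked share of the output.

import random

MOD = 1 << 64  # the ring Z_{2^64}

def rand_ring():
    return random.randrange(MOD)

def share(x):
    """Additively share x so that x = x0 + x1 (mod 2^64)."""
    x0 = rand_ring()
    return (x0, (x - x0) % MOD)

def preprocess(n):
    """Input-independent setup (played by a trusted dealer here; done
    via OT/HE in a real protocol): masks for all inputs and the output,
    plus an additive sharing of the cross term sum_j da_j * db_j."""
    da = [rand_ring() for _ in range(n)]
    db = [rand_ring() for _ in range(n)]
    dz = rand_ring()
    da_sh = [share(d) for d in da]
    db_sh = [share(d) for d in db]
    dz_sh = share(dz)
    cross_sh = share(sum(da[j] * db[j] for j in range(n)) % MOD)
    return da, db, dz, da_sh, db_sh, dz_sh, cross_sh

def scalar_product(a, b):
    n = len(a)
    da, db, dz, da_sh, db_sh, dz_sh, cross_sh = preprocess(n)

    # Input sharing: both parties learn the masked values Delta = x + delta.
    Da = [(a[j] + da[j]) % MOD for j in range(n)]
    Db = [(b[j] + db[j]) % MOD for j in range(n)]

    # Each party i locally computes an additive share of
    #   Delta_z = sum_j a_j*b_j + dz
    #           = sum_j (Da_j*Db_j - Da_j*db_j - Db_j*da_j + da_j*db_j) + dz,
    # using its mask shares and the preprocessed cross-term share.
    def local_share(i):
        s = 0
        for j in range(n):
            s += (i * Da[j] * Db[j]        # public term, added by one party
                  - Da[j] * db_sh[j][i]
                  - Db[j] * da_sh[j][i])
        return (s + cross_sh[i] + dz_sh[i]) % MOD

    # Online communication: one ring element per party, regardless of n.
    Dz = (local_share(0) + local_share(1)) % MOD
    return (Dz - dz) % MOD  # output reconstruction (demo only)

a, b = [3, 5, 7], [2, 4, 6]
assert scalar_product(a, b) == (3*2 + 5*4 + 7*6) % MOD
print("scalar product OK")
```

The key point the sketch shows is that the per-element cross terms are absorbed into local computation and a single preprocessed sharing, so the exchanged online messages do not grow with the vector length.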
