CPA: Compressed Private Aggregation for Scalable Federated Learning over Massive Networks
Natalie Lang (Ben-Gurion University of the Negev); Elad Sofer (Ben-Gurion University of the Negev); Nir Shlezinger (Ben-Gurion University of the Negev); Rafael D'Oliveira (Clemson University); Salim El Rouayheb (Rutgers University)
Federated learning (FL) allows a central server to train a model using remote users' data. FL faces challenges in preserving the privacy of the local datasets and in its communication overhead, which becomes dominant in large-scale networks. These limitations are often mitigated individually by local differential privacy (LDP) mechanisms, compression, and user-selection techniques, each of which typically comes at the cost of accuracy. In this work, we present compressed private aggregation (CPA), which allows massive deployments to simultaneously communicate at extremely low bit rates while achieving privacy, anonymity, and resilience to malicious users. CPA randomizes a codebook for compressing the data into a few bits, ensuring anonymity and robustness, and then perturbs the compressed representation to satisfy LDP. We provide both a theoretical analysis and a numerical study, demonstrating the performance gains of CPA compared with separate mechanisms for compression and privacy.
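To make the two-stage idea concrete, below is a minimal Python sketch under stated assumptions: each entry of a user's update is quantized to the nearest codeword of a randomly shifted codebook, and the resulting codeword index is then perturbed with k-ary randomized response to satisfy epsilon-LDP. The function name cpa_encode, the dithered uniform codebook, and all parameter values are illustrative assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def cpa_encode(x, codebook, epsilon, rng):
    # Stage 1: nearest-codeword quantization against the shared codebook,
    # so each entry is described by only log2(k) bits.
    k = len(codebook)
    true_idx = int(np.argmin(np.abs(codebook - x)))
    # Stage 2: k-ary randomized response over the codeword indices, a
    # standard epsilon-LDP mechanism: keep the true index with probability
    # p, otherwise report one of the other k - 1 indices uniformly.
    p = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    if rng.random() < p:
        return true_idx
    others = [i for i in range(k) if i != true_idx]
    return int(rng.choice(others))

rng = np.random.default_rng(seed=0)
# A uniform codebook with a shared random shift stands in for the
# randomized codebook described in the abstract (illustrative only).
codebook = np.linspace(-1.0, 1.0, 8) + rng.uniform(-0.125, 0.125)
gradients = rng.normal(0.0, 0.3, size=5)  # toy per-user update entries
reports = [cpa_encode(g, codebook, epsilon=2.0, rng=rng) for g in gradients]
print(reports)  # each report costs only log2(8) = 3 bits
```

In this toy setting, the server would decode each report back to a codeword and average across users; the random codebook shift plays the role of the randomization that provides anonymity and robustness, while the randomized-response step provides the LDP guarantee.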