Privacy-preserving federated learning on lattice quantization

    https://doi.org/10.1142/S0219691323500200
    Cited by: 4 (Source: Crossref)

    Federated learning (FL) is an important approach for collaborative learning across multiple devices without exchanging data between the devices and the central server. However, because of bandwidth and other constraints, communication efficiency must be considered when the volume of transmitted information is limited. In this paper, we use lattice quantization from quantization theory, together with a variable inter-communication interval, to improve communication efficiency. Meanwhile, to provide a strong privacy guarantee, we incorporate the notion of differential privacy (DP) into the FL framework with a local SGD algorithm. By adding calibrated noise, we propose a universal lattice quantization for differentially private federated averaging algorithm (ULQ-DP-FedAvg). We provide a tight privacy bound using privacy analysis techniques. We also analyze the convergence bound of ULQ-DP-FedAvg under bit-rate constraints, a growing inter-communication interval, and non-independent and identically distributed (Non-IID) data. It turns out that the algorithm converges and that the privacy mechanism has scarcely any influence on the convergence rate. The effectiveness of our algorithm is demonstrated on synthetic and real datasets.
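    To convey the flavor of the approach, the following minimal Python sketch combines dithered lattice quantization of client updates, update clipping with calibrated Gaussian noise, and FedAvg-style averaging over local SGD steps. All function names, hyper-parameters, and the toy Non-IID data are illustrative assumptions for this sketch, not the paper's exact ULQ-DP-FedAvg specification.

```python
import numpy as np

def lattice_quantize(v, step=0.05, rng=None):
    """Dithered scalar-lattice quantization: round each coordinate of v to the
    nearest lattice point of spacing `step` after adding a random dither,
    which keeps the quantization error unbiased (illustrative stand-in for
    the universal lattice quantizer)."""
    rng = np.random.default_rng() if rng is None else rng
    dither = rng.uniform(-step / 2, step / 2, size=v.shape)
    return step * np.round((v + dither) / step) - dither

def local_sgd(w, data, lr=0.1, local_steps=5):
    """A few steps of local SGD on a least-squares loss, standing in for each
    client's private local training between communication rounds."""
    X, y = data
    for _ in range(local_steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def dp_fedavg_round(w_global, clients, clip=1.0, noise_std=0.1, seed=0):
    """One communication round: each client trains locally, clips its model
    update, adds calibrated Gaussian noise for DP, quantizes the noisy update
    to respect the bit-rate constraint, and the server averages."""
    rng = np.random.default_rng(seed)
    updates = []
    for data in clients:
        delta = local_sgd(w_global.copy(), data) - w_global
        delta = delta * min(1.0, clip / (np.linalg.norm(delta) + 1e-12))
        delta = delta + rng.normal(0.0, noise_std * clip, size=delta.shape)
        updates.append(lattice_quantize(delta, rng=rng))
    return w_global + np.mean(updates, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    w_true = np.array([1.0, -2.0, 0.5])
    # Toy Non-IID clients: each sees a differently scaled feature distribution.
    clients = []
    for scale in (0.5, 1.0, 2.0):
        X = rng.normal(0, scale, size=(100, 3))
        clients.append((X, X @ w_true + rng.normal(0, 0.1, size=100)))
    w = np.zeros(3)
    for r in range(20):
        w = dp_fedavg_round(w, clients, seed=r)
    print("estimated weights:", np.round(w, 2))
```

    In this sketch the dither keeps the quantization error zero-mean, and the clipping bound fixes the sensitivity of each client update so that the Gaussian noise scale can be calibrated to a target privacy level.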

    AMSC: 62J12, 68W15, 90C90