Date of Award

2023

Publication Type

Thesis

Degree Name

M.Sc.

Department

Computer Science

Keywords

Additive secret sharing, Federated learning, Privacy-preserving machine learning, Secure aggregation

Supervisor

D. Alhadidi

Supervisor

S. Samet

Rights

info:eu-repo/semantics/openAccess

Creative Commons License

Creative Commons Attribution 4.0 International License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Abstract

Federated learning is a machine learning technique in which multiple clients with local data collaborate to train a shared model. In FedAvg, the main federated learning algorithm, each client trains a model locally and sends the trained model to a central server. Although the sensitive data themselves are never sent to the server, a malicious server can reconstruct the original training data from the clients' models in each training round. Secure aggregation techniques based on cryptography, trusted execution environments, or differential privacy address this problem, but they incur computation and communication overhead or degrade the model's accuracy. In this thesis, we consider a secure multi-party computation setup in which clients use additive secret sharing to distribute their models across multiple servers. Our solution provides secure aggregation as long as at least two servers do not collude. Moreover, we provide a mathematical proof that the securely aggregated model at the end of each training round is exactly equal to the one produced by FedAvg, so accuracy is unaffected while communication and computation remain efficient. Compared with SCOTCH, the state-of-the-art secure aggregation solution, experimental results show that our approach is 557% faster and reduces the clients' communication cost by 25%. Additionally, the accuracy of the trained model matches that of FedAvg exactly under balanced, unbalanced, IID, and non-IID data distributions, while it is only 8% slower than FedAvg.
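The sketch below illustrates the general idea of additive secret sharing for secure aggregation as described in the abstract; it is not the thesis implementation, and all names, the ring size, and the fixed-point scaling factor are illustrative assumptions. Each client splits its model update into random shares that individually reveal nothing, each server sums the shares it receives, and combining the servers' sums reconstructs the plain sum of updates, which is then averaged as in FedAvg.

```python
# Minimal sketch (assumed parameters, not the thesis code): additive secret sharing
# of client model updates across two non-colluding servers.
import numpy as np

MOD = 2**32      # ring size for the shares (assumption)
SCALE = 2**16    # fixed-point scaling factor (assumption)

def share(update, n_servers=2, rng=np.random.default_rng()):
    """Split a float model update into additive shares modulo MOD."""
    fixed = (np.round(update * SCALE).astype(np.int64) % MOD).astype(np.uint64)
    shares = [rng.integers(0, MOD, size=update.shape, dtype=np.uint64)
              for _ in range(n_servers - 1)]
    last = (fixed - sum(shares)) % MOD  # shares sum to the fixed-point update mod MOD
    return shares + [last]

def aggregate(per_server_shares):
    """Each server sums its shares; combining the sums recovers the sum of updates."""
    server_sums = [sum(s) % MOD for s in per_server_shares]
    total = sum(server_sums) % MOD
    # Map back from the ring to signed fixed point, then to floats.
    signed = np.where(total >= MOD // 2,
                      total.astype(np.int64) - MOD,
                      total.astype(np.int64))
    return signed / SCALE

# Toy check: secure aggregation yields the same sum as aggregating in the clear.
updates = [np.array([0.5, -1.25]), np.array([2.0, 0.75]), np.array([-0.5, 0.5])]
shares_per_client = [share(u) for u in updates]
# Server i receives the i-th share from every client.
per_server = [[shares_per_client[c][s] for c in range(len(updates))] for s in range(2)]
recovered_sum = aggregate(per_server)
assert np.allclose(recovered_sum, sum(updates))
print(recovered_sum / len(updates))  # FedAvg-style (equally weighted) averaged update
```

In this simplified setting each server sees only uniformly random ring elements, so neither server alone learns anything about an individual client's update, yet the reconstructed average equals the one plain averaging would produce.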