Squeezing More Utility via Adaptive Clipping on Differentially Private Gradients in Federated Meta-Learning
Published in Annual Computer Security Applications Conference (ACSAC), 2022
This paper proposes a new differentially private federated meta-learning architecture to address data privacy challenges in federated learning. Our proposal features an adaptive gradient clipping method and a one-pass meta-training process to improve the model's utility-privacy trade-off. It provides two notions of privacy protection, covering both a trusted central server and an honest-but-curious central server.
Recommended citation: N. Wang, Y. Xiao, Y. Chen, N. Zhang, W. Lou and Y.T. Hou, “Squeezing More Utility via Adaptive Clipping on Differentially Private Gradients in Federated Meta-Learning,” ACSAC, Dec 5-9, 2022, Austin, USA. http://ning-wang1.github.io/files/dp.pdf
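To illustrate the general idea of adaptive clipping on differentially private gradients (a minimal sketch, not the paper's exact algorithm): one common approach sets the clipping bound from a quantile of the observed per-sample gradient norms, then adds Gaussian noise scaled to that bound before averaging. All names and parameters below are illustrative assumptions.

```python
import numpy as np

def dp_adaptive_clip(per_sample_grads, quantile=0.5, noise_multiplier=1.0, rng=None):
    """Average per-sample gradients under differential privacy with an
    adaptively chosen clipping bound (illustrative sketch only).

    per_sample_grads: array of shape (num_samples, dim)
    quantile: which quantile of the gradient norms to use as the bound
    noise_multiplier: Gaussian noise scale relative to the clipping bound
    """
    rng = rng or np.random.default_rng()
    # Adaptive bound: a quantile of the per-sample gradient norms.
    norms = np.linalg.norm(per_sample_grads, axis=1)
    clip_bound = np.quantile(norms, quantile)
    # Scale each gradient down so its norm is at most clip_bound.
    scale = np.minimum(1.0, clip_bound / (norms + 1e-12))
    clipped = per_sample_grads * scale[:, None]
    # Gaussian noise calibrated to the (adaptive) clipping bound.
    noise = rng.normal(0.0, noise_multiplier * clip_bound,
                       size=per_sample_grads.shape[1])
    return (clipped.sum(axis=0) + noise) / len(per_sample_grads)

# Example: 32 per-sample gradients of dimension 4.
rng = np.random.default_rng(0)
grads = rng.normal(size=(32, 4))
noisy_mean = dp_adaptive_clip(grads, rng=rng)
```

A tighter quantile (lower `quantile`) clips more aggressively, reducing noise magnitude at the cost of more clipping bias; the paper's contribution is in navigating this trade-off adaptively within federated meta-learning.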