Operations Research Transactions ›› 2024, Vol. 28 ›› Issue (2): 47-57. DOI: 10.15960/j.cnki.issn.1007-6093.2024.02.003


A class of differentially private stochastic gradient descent algorithms with adaptive gradient clipping

Jiaqi ZHANG1, Jueyou LI2,*()   

  1. School of Mathematics and Statistics, Chongqing University, Chongqing 400044, China
    2. School of Mathematical Sciences, Chongqing Normal University, Chongqing 401331, China
  • Received: 2022-06-22 Online: 2024-06-15 Published: 2024-06-07
  • Contact: Jueyou LI E-mail: lijueyou@cqnu.edu.cn

Abstract:

Gradient clipping is an effective method for preventing gradient explosion, but the choice of the clipping parameter usually has a strong influence on the performance of the trained model. To address this issue, this paper proposes an improved differentially private stochastic gradient descent algorithm that adaptively adjusts the gradient clipping parameter. First, an adaptive gradient clipping method is proposed that uses a quantile-based, exponentially averaged update to dynamically adjust the clipping parameter. Second, the convergence and privacy guarantees of the proposed algorithm are analyzed for non-convex objective functions. Finally, numerical experiments are performed on the MNIST, Fashion-MNIST and IMDB datasets. The results show that the proposed algorithm significantly improves model accuracy compared with traditional stochastic gradient descent methods.

Key words: stochastic gradient descent algorithm, differential privacy, gradient clipping, adaptivity
