Deng, W., Feng, Q., Karagiannis, G., Lin, G. and Liang, F. (2021) 'Accelerating Convergence of Replica Exchange Stochastic Gradient MCMC via Variance Reduction', International Conference on Learning Representations (ICLR'21).
Replica exchange stochastic gradient Langevin dynamics (reSGLD) has shown promise in accelerating convergence in non-convex learning; however, the excessively large correction required to avoid bias from noisy energy estimators has limited the potential acceleration. To address this issue, we study variance reduction for the noisy energy estimators, which promotes much more effective swaps. Theoretically, we provide a non-asymptotic analysis of the exponential acceleration for the underlying continuous-time Markov jump process; moreover, we consider a generalized Girsanov theorem that includes the change of Poisson measure to overcome the crude discretization based on Grönwall's inequality, yielding a much tighter error bound in the 2-Wasserstein (W2) distance. Numerically, we conduct extensive experiments and obtain state-of-the-art results in optimization and uncertainty estimates on synthetic experiments and image data.
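The abstract describes the swap mechanism only at a high level. The following minimal sketch illustrates the idea on a toy Gaussian-mean problem, assuming an SVRG-style control variate for the energy estimator and a fixed estimator-variance proxy in the swap correction. All names (grad_U, energy_terms, sigma2_hat) and constants are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of variance-reduced replica exchange SGLD on a toy
# Gaussian-mean problem. All names and constants here are illustrative
# assumptions, not the authors' released implementation.
import numpy as np

rng = np.random.default_rng(0)
N, n, d = 1000, 100, 2            # data size, minibatch size, dimension
X = rng.normal(1.0, 1.0, size=(N, d))

def grad_U(theta, idx):
    # Stochastic gradient of U(theta) = 0.5 * sum_i ||theta - x_i||^2,
    # rescaled from the minibatch to the full data set.
    return N * (theta - X[idx].mean(axis=0))

def energy_terms(theta, idx):
    # Minibatch estimate of the full energy U(theta).
    return 0.5 * N * np.mean(np.sum((theta - X[idx]) ** 2, axis=1))

tau = np.array([1.0, 10.0])       # low / high temperatures
eta = 1e-4                        # SGLD step size
m = 50                            # snapshot refresh period
theta = rng.normal(size=(2, d))   # one parameter vector per replica

# SVRG-style control variates: full-data energy at a reference point,
# refreshed every m iterations.
snap_theta = theta.copy()
snap_energy = np.array([energy_terms(t, np.arange(N)) for t in snap_theta])

sigma2_hat = 1.0                  # estimator-variance proxy (fixed here for
                                  # brevity; the paper estimates it adaptively)
dtau = 1.0 / tau[0] - 1.0 / tau[1]
for it in range(2000):
    if it % m == 0:               # refresh snapshots with a full-data pass
        snap_theta = theta.copy()
        snap_energy = np.array([energy_terms(t, np.arange(N))
                                for t in snap_theta])
    idx = rng.choice(N, n, replace=False)
    E = np.empty(2)
    for k in range(2):
        # SGLD update for replica k at temperature tau[k].
        theta[k] -= eta * grad_U(theta[k], idx)
        theta[k] += np.sqrt(2 * eta * tau[k]) * rng.normal(size=d)
        # Variance-reduced energy estimate:
        # U_vr = U_hat(theta) - U_hat(snapshot) + U_full(snapshot).
        E[k] = (energy_terms(theta[k], idx)
                - energy_terms(snap_theta[k], idx)
                + snap_energy[k])
    # Swap test with the bias correction dtau * sigma2_hat; a smaller
    # estimator variance means a smaller correction and more effective swaps.
    log_s = dtau * (E[0] - E[1] - dtau * sigma2_hat)
    if np.log(rng.random()) < log_s:
        theta = theta[::-1].copy()
        snap_theta = snap_theta[::-1].copy()
        snap_energy = snap_energy[::-1].copy()

print("low-temperature replica:", theta[0])
```

The key point of the sketch is the corrected swap probability: with a noisy energy difference, an unbiased swap test needs the dtau * sigma2_hat correction, and reducing the estimator variance via the control variate shrinks that correction, which is precisely how the paper recovers more frequent, effective swaps.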
Item Type: Conference item (Paper)
Full text: Publisher-imposed embargo; (AM) Accepted Manuscript, PDF (2357 KB)
Publisher web site: https://iclr.cc/
Date accepted: 12 January 2021
Date deposited: 12 February 2021
Date of first online publication: 2021
Date first made open access: No date available