Gradient Origin Networks

Bond-Taylor, Sam; Willcocks, Chris G.

Authors

Sam Bond-Taylor (samuel.e.bond-taylor@durham.ac.uk)
PGR Student, Doctor of Philosophy



Abstract

This paper proposes a new type of generative model that is able to quickly learn a latent representation without an encoder. This is achieved using empirical Bayes to calculate the expectation of the posterior, which is implemented by initialising a latent vector with zeros, then using the gradient of the log-likelihood of the data with respect to this zero vector as new latent points. The approach has similar characteristics to autoencoders, but with a simpler architecture, and is demonstrated in a variational autoencoder equivalent that permits sampling. This also allows implicit representation networks to learn a space of implicit functions without requiring a hypernetwork, retaining their representation advantages across datasets. The experiments show that the proposed method converges faster, with significantly lower reconstruction error than autoencoders, while requiring half the parameters.
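To make the mechanism described in the abstract concrete, below is a minimal PyTorch sketch of a single training step: the latent is initialised at the origin, the negative gradient of a reconstruction loss with respect to that zero vector is taken as the latent point, and the network is then trained to reconstruct from it. This is a sketch under assumptions, not the authors' reference implementation; the MSE loss (as a log-likelihood surrogate), the MLP decoder, and names such as `gon_step` and `latent_dim` are illustrative choices, not taken from the paper.

```python
import torch
import torch.nn as nn

latent_dim, data_dim = 32, 784

# Decoder mapping latents to data space (architecture is an assumption).
decoder = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ELU(),
    nn.Linear(256, data_dim),
)

def gon_step(x):
    # 1. Initialise the latent vector at the origin.
    z0 = torch.zeros(x.size(0), latent_dim, requires_grad=True)
    # 2. Inner loss: reconstruction error from the zero latent
    #    (MSE used here as a stand-in for the negative log-likelihood).
    inner_loss = ((decoder(z0) - x) ** 2).mean()
    # 3. The negative gradient at the origin becomes the latent point;
    #    create_graph=True lets the outer loss backprop through this step.
    z = -torch.autograd.grad(inner_loss, z0, create_graph=True)[0]
    # 4. Reconstruct from the derived latent; this outer loss trains
    #    the decoder parameters.
    return ((decoder(z) - x) ** 2).mean()

x = torch.rand(8, data_dim)  # toy batch
opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)
opt.zero_grad()
loss = gon_step(x)
loss.backward()
opt.step()
```

Note that a single decoder plays both roles: the gradient step through it stands in for the encoder, which is why the abstract reports roughly half the parameters of a comparable autoencoder.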

Citation

Bond-Taylor, S., & Willcocks, C. G. (2021). Gradient Origin Networks. In International Conference on Learning Representations (ICLR).

Conference Name: International Conference on Learning Representations
Conference Location: Vienna / Virtual
Start Date: May 3, 2021
End Date: May 7, 2021
Acceptance Date: Jan 12, 2021
Publication Date: 2021
Deposit Date: Nov 27, 2020
Publicly Available Date: Oct 28, 2021
Publisher URL: https://iclr.cc/
