Bond-Taylor, Sam and Willcocks, Chris G. (2021) 'Gradient Origin Networks', International Conference on Learning Representations, Vienna / Virtual, 3–7 May 2021.
Abstract
This paper proposes a new type of generative model that is able to quickly learn a latent representation without an encoder. This is achieved using empirical Bayes to calculate the expectation of the posterior, which is implemented by initialising a latent vector with zeros, then using the gradient of the log-likelihood of the data with respect to this zero vector as new latent points. The approach has similar characteristics to autoencoders, but with a simpler architecture, and is demonstrated in a variational autoencoder equivalent that permits sampling. This also allows implicit representation networks to learn a space of implicit functions without requiring a hypernetwork, retaining their representation advantages across datasets. The experiments show that the proposed method converges faster, with significantly lower reconstruction error than autoencoders, while requiring half the parameters.
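As a minimal sketch of the mechanism the abstract describes, the snippet below implements one Gradient Origin Network step in PyTorch: a latent vector is initialised at the origin, decoded once, and the negative gradient of the reconstruction loss with respect to that zero vector is reused as the latent point for a second decoding. The decoder architecture, latent size, sign convention, and Bernoulli likelihood are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical decoder mapping a 32-d latent to a flattened 28x28 image.
# Sizes and layers are assumptions for illustration only.
decoder = nn.Sequential(
    nn.Linear(32, 256), nn.ELU(),
    nn.Linear(256, 784), nn.Sigmoid(),
)

def gon_step(x, decoder, latent_dim=32):
    """One Gradient Origin Network forward pass (sketch).

    x: batch of flattened data in [0, 1], shape (B, 784).
    Returns the reconstruction decoded from the gradient-derived latent.
    """
    batch = x.shape[0]
    # Initialise the latent at the origin and track gradients through it.
    z0 = torch.zeros(batch, latent_dim, requires_grad=True)
    recon0 = decoder(z0)
    # Negative log-likelihood of the data at the origin
    # (Bernoulli likelihood assumed here).
    nll = F.binary_cross_entropy(recon0, x, reduction="sum")
    # The (negative) gradient at the zero vector becomes the latent point;
    # create_graph=True lets the outer training loss backpropagate
    # through this inner gradient into the decoder parameters.
    z = -torch.autograd.grad(nll, z0, create_graph=True)[0]
    return decoder(z)

# Usage: reconstruct a batch, then train the decoder on the outer loss.
x = torch.rand(8, 784)
recon = gon_step(x, decoder)
loss = F.binary_cross_entropy(recon, x, reduction="sum")
loss.backward()
```

Because the latent is produced by a single gradient evaluation of the decoder itself, no separate encoder network is needed, which is consistent with the abstract's claim of roughly half the parameters of a comparable autoencoder.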
| Item Type: | Conference item (Poster) |
| --- | --- |
| Full text: | (VoR) Version of Record, PDF (4983 KB) |
| Status: | Peer-reviewed |
| Publisher Web site: | https://iclr.cc/ |
| Date accepted: | 12 January 2021 |
| Date deposited: | 28 October 2021 |
| Date of first online publication: | 2021 |
| Date first made open access: | 28 October 2021 |