Wednesday, October 26, 2016

Practical Learning of Deep Gaussian Processes via Random Fourier Features

 

Practical Learning of Deep Gaussian Processes via Random Fourier Features by Kurt Cutajar, Edwin V. Bonilla, Pietro Michiardi, Maurizio Filippone

The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic approach to flexibly quantify uncertainty and carry out model selection in various learning scenarios. In this work, we introduce a novel formulation of DGPs based on random Fourier features that we train using stochastic variational inference. Our proposal yields an efficient way of training DGP architectures without compromising on predictive performance. Through a series of experiments, we illustrate how our model compares favorably to other state-of-the-art inference methods for DGPs on both regression and classification tasks. We also demonstrate how an asynchronous implementation of stochastic gradient optimization can exploit the computational power of distributed systems for large-scale DGP learning.
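The key building block named in the abstract is the random Fourier feature approximation of a shift-invariant kernel, which replaces the GP covariance with an explicit finite-dimensional feature map and so turns each GP layer into a Bayesian linear model. The NumPy sketch below is purely illustrative and is not the authors' implementation: the function name rff_map, the lengthscale, the feature count D, and the fixed random weights W1 are all placeholder choices for exposition.

import numpy as np

def rff_map(X, n_features=500, lengthscale=1.0, seed=0):
    """Random Fourier feature map whose inner products approximate an RBF kernel."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the spectral density of the RBF kernel: omega ~ N(0, I / lengthscale^2)
    omega = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    phase = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ omega + phase)

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 3))

# Sanity check: Phi(X) Phi(X)^T approximates K(X, X) = exp(-||x - x'||^2 / (2 lengthscale^2))
D = 5000
Phi = rff_map(X, n_features=D)
K_approx = Phi @ Phi.T
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1))
print(np.abs(K_approx - K_exact).max())  # error shrinks as n_features grows

# Weight-space view: with W ~ N(0, I), F = Phi W is approximately a GP sample.
# Stacking gives the "deep" part: layer-1 outputs feed layer-2's feature map.
# Here W1, W2 are fixed random draws standing in for the variational weights the paper learns.
W1 = rng.normal(size=(D, 2))
H1 = Phi @ W1                              # layer-1 function values, shape (n, 2)
Phi2 = rff_map(H1, n_features=D, seed=2)   # layer-2 random features on layer-1 outputs
W2 = rng.normal(size=(D, 1))
H2 = Phi2 @ W2                             # layer-2 (output) function values

Once the kernels are replaced by explicit features in this way, each layer reduces to a (Bayesian) linear model, which is what makes stochastic variational inference, and the asynchronous distributed training mentioned in the abstract, tractable at scale.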
 
 
