Scalable Multi-Class Gaussian Process Classification via Data Augmentation


This paper proposes a new scalable multi-class Gaussian process classification approach built on a novel modified softmax likelihood function. This form of likelihood admits a latent-variable augmentation that renders the model conditionally conjugate and enables efficient variational inference via block coordinate ascent updates. Our experiments show that our method outperforms state-of-the-art Gaussian-process-based methods in terms of speed while achieving competitive predictive performance.
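As a rough illustration of what a "modified softmax" likelihood can look like, the sketch below replaces the exponentials of the standard softmax with logistic sigmoids and normalizes across classes. This particular logistic-softmax form is an assumption for illustration; the paper's exact likelihood and its augmentation scheme may differ.

```python
import numpy as np

def sigmoid(x):
    """Elementwise logistic function."""
    return 1.0 / (1.0 + np.exp(-x))

def logistic_softmax(f):
    """A modified softmax over latent GP values f (one per class):
    each class score passes through a sigmoid, then the scores are
    normalized so the outputs form a valid probability vector.
    (Illustrative assumption, not necessarily the paper's exact form.)
    """
    s = sigmoid(np.asarray(f, dtype=float))
    return s / s.sum()

# Example: three latent function values, one per class.
probs = logistic_softmax([2.0, -1.0, 0.5])
```

Because each factor is a sigmoid rather than an exponential, likelihoods of this shape are amenable to latent-variable (e.g. Pólya-Gamma-style) augmentations that make the conditionals conjugate, which is what enables the closed-form block coordinate ascent updates described above.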

NeurIPS 2018 Symposium on Advances in Approximate Bayesian Inference
