Dynamic topic models (DTMs) capture the evolution of prevalent themes in literature, online media, and other bodies of text over time. DTMs assume that topics change continuously over time and therefore place continuous stochastic process priors on their model parameters. In this paper, we first extend the class of tractable priors from Wiener processes to the generic class of Gaussian processes (GPs). Second, we show how to perform scalable approximate inference in these models by combining ideas from stochastic variational inference and inducing-point approximations for GPs. Our experiments show that the generalized model uncovers interesting patterns that were inaccessible to previous approaches.
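To make the generalization concrete, the sketch below samples a topic-parameter trajectory from a GP prior with a squared-exponential kernel rather than a Wiener process. This is a minimal illustration, not the paper's model: the kernel choice, time grid, and single-dimensional parameter are assumptions made for brevity, and only NumPy is used.

```python
import numpy as np

def rbf_kernel(t1, t2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of time points.
    d = t1[:, None] - t2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
times = np.linspace(0.0, 10.0, 50)  # hypothetical observation times of the corpus
# Add small jitter on the diagonal for numerical stability.
K = rbf_kernel(times, times) + 1e-8 * np.eye(len(times))
# One GP-prior sample path for a single topic parameter over time;
# swapping in a Brownian-motion kernel min(s, t) would recover a Wiener prior.
sample = rng.multivariate_normal(np.zeros(len(times)), K)
```

Unlike a Wiener process, whose increments are independent, the RBF kernel yields smooth sample paths whose correlation length is controlled by `lengthscale`, which is the kind of additional modeling freedom the GP extension provides.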