Abstract
This paper applies a generative deep learning model, namely a Variational Autoencoder, to probabilistic optimal power flow. The model uses Gaussian approximations to represent the distributions of the results of a system under uncertainty. These approximations are realized through several techniques from Bayesian deep learning, most notably Stochastic Variational Inference. Using the reparameterization trick and batch sampling, the proposed model allows a probabilistic optimal power flow to be trained in a manner similar to a possibilistic process. The results are obtained by applying a reformulation of the Kullback-Leibler divergence, a measure of the distance between probability distributions. The resulting model is not only simple in structure but also proves to perform well and accurately. Furthermore, the paper explores potential pathways for future research and offers insights for practitioners using such or similar generative models.
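
The sketch below illustrates the two mechanisms named in the abstract, the reparameterization trick and a closed-form Kullback-Leibler term for diagonal Gaussian approximations; it assumes a PyTorch-style implementation with illustrative function names and is not taken from the paper's own code.

    import torch

    def reparameterize(mu, log_var):
        # Reparameterization trick: draw z ~ N(mu, sigma^2) as z = mu + sigma * eps
        # with eps ~ N(0, I), so gradients can flow through mu and log_var.
        std = torch.exp(0.5 * log_var)
        eps = torch.randn_like(std)
        return mu + std * eps

    def kl_to_standard_normal(mu, log_var):
        # Closed-form Kullback-Leibler divergence between N(mu, diag(sigma^2))
        # and the standard normal N(0, I), summed over the latent dimension.
        return -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp(), dim=-1)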