Variational Mixture of HyperGenerators for learning distributions over functions

Abstract: Recent approaches build on implicit neural representations (INRs) to propose generative models over function spaces. However, they are computationally costly on inference tasks, such as missing data imputation, or cannot tackle them at all. In this presentation, we will talk about a novel deep generative model, the Variational Mixture of HyperGenerators (VAMoH). VAMoH combines the capability of INRs to model continuous functions with the inference capabilities of Variational Autoencoders (VAEs). Through experiments on a diverse range of data types, such as images, voxels, and climate data, we show that VAMoH can effectively learn rich distributions over continuous functions. Furthermore, it performs inference-related tasks, such as conditional super-resolution generation and in-painting, as well as or better than previous approaches, while being less computationally demanding.
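A core building block behind INR-based generative models of this kind is a hypernetwork that maps a latent code to the weights of a coordinate network, so each latent sample defines a function that can be evaluated at any resolution. The following is a minimal NumPy sketch of that idea; all names, dimensions, and the linear hypernetwork are illustrative assumptions, not VAMoH's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8   # dimension of the latent code z (illustrative)
HIDDEN = 16      # hidden width of the coordinate MLP (the INR)
COORD_DIM = 2    # e.g. (x, y) pixel coordinates
OUT_DIM = 3      # e.g. RGB values

# Number of INR parameters the hypernetwork must emit:
# layer 1: COORD_DIM*HIDDEN weights + HIDDEN biases
# layer 2: HIDDEN*OUT_DIM weights + OUT_DIM biases
n_params = COORD_DIM * HIDDEN + HIDDEN + HIDDEN * OUT_DIM + OUT_DIM

# A linear hypernetwork for brevity; a real model would use an MLP here.
H = rng.normal(scale=0.1, size=(LATENT_DIM, n_params))

def inr_from_latent(z):
    """Map a latent code z to a tiny coordinate MLP (the INR)."""
    theta = z @ H
    i = 0
    W1 = theta[i:i + COORD_DIM * HIDDEN].reshape(COORD_DIM, HIDDEN)
    i += COORD_DIM * HIDDEN
    b1 = theta[i:i + HIDDEN]
    i += HIDDEN
    W2 = theta[i:i + HIDDEN * OUT_DIM].reshape(HIDDEN, OUT_DIM)
    i += HIDDEN * OUT_DIM
    b2 = theta[i:i + OUT_DIM]

    def inr(coords):
        # coords: (N, COORD_DIM) -> (N, OUT_DIM)
        h = np.sin(coords @ W1 + b1)  # sine activation, SIREN-style
        return h @ W2 + b2
    return inr

# Sample one "function" and evaluate it on a coarse and a fine grid:
z = rng.normal(size=LATENT_DIM)
f = inr_from_latent(z)
coarse = np.stack(np.meshgrid(np.linspace(0, 1, 4),
                              np.linspace(0, 1, 4)), -1).reshape(-1, 2)
fine = np.stack(np.meshgrid(np.linspace(0, 1, 64),
                            np.linspace(0, 1, 64)), -1).reshape(-1, 2)
print(f(coarse).shape)  # (16, 3)
print(f(fine).shape)    # (4096, 3)
```

Because the same INR weights are queried at arbitrary coordinates, the fine-grid evaluation above is a toy analogue of the conditional super-resolution setting mentioned in the abstract.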

Short bio: Batuhan Koyuncu is an ELLIS PhD student at Saarland University, advised by Isabel Valera and co-advised by Ole Winther. His research interests include building expressive, efficient, and interpretable deep generative models, and applying them in psychiatry and healthcare.

Presenter: Batuhan Koyuncu

Date: 2023-04-05 10:00 (CEST)

Location: Salón de Actos, Politécnica IV, Carretera San Vicente del Raspeig s/n, San Vicente del Raspeig 03690, Alicante, Spain

Online: https://vertice.cpd.ua.es/281053