Gaussian Mixture Neural Networks

University · 2024

PyTorch GMM Neural Networks

Summary 📘

A research-oriented architecture that fuses Gaussian Mixture Models with a lightweight neural backbone to estimate probability density functions from unlabeled data.

The project started as a practical take-home assignment from the AI 2023 exam, but the idea grew into a full exploration thanks to Prof. Trentin.

Behind the model 🧠

  • GMM + NN hybrid: a Gaussian Mixture component supplies a statistically grounded prior for a neural density estimator.
  • Dataset: synthetic distributions handcrafted to stress-test the density estimates.
  • Performance: consistently beats traditional statistical baselines and rivals fully neural PDF approximators such as Parzen Neural Networks.
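
The hybrid idea can be sketched in PyTorch. The snippet below is a minimal illustration under my own assumptions, not the paper's actual architecture: it fits a small Gaussian mixture to synthetic data by gradient-based maximum likelihood, then uses the mixture's log-density as a soft target (the "prior") for a small MLP estimator. All names and hyperparameters here are hypothetical.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# --- synthetic 1-D data: a handcrafted two-component Gaussian mixture ---
n = 2000
comp = torch.randint(0, 2, (n, 1))
x = torch.where(comp == 0,
                torch.randn(n, 1) * 0.5 - 2.0,   # component A: N(-2.0, 0.5)
                torch.randn(n, 1) * 0.8 + 1.5)   # component B: N(+1.5, 0.8)

# --- step 1: fit a K-component GMM "prior" by maximum likelihood (SGD) ---
K = 2
log_w = nn.Parameter(torch.zeros(K))                  # unnormalised mixture weights
mu = nn.Parameter(torch.tensor([-1.0, 1.0]))          # spread-out mean init
log_std = nn.Parameter(torch.zeros(K))                # log std-devs

def gmm_log_prob(pts):
    """Mixture log-density: logsumexp over weighted component log-densities."""
    w = torch.log_softmax(log_w, dim=0)                      # (K,)
    normal = torch.distributions.Normal(mu, log_std.exp())   # batch of K Gaussians
    lp = normal.log_prob(pts)                                # (n, 1) -> (n, K)
    return torch.logsumexp(w + lp, dim=-1)                   # (n,)

opt = torch.optim.Adam([log_w, mu, log_std], lr=0.05)
for _ in range(300):
    opt.zero_grad()
    nll = -gmm_log_prob(x).mean()    # negative log-likelihood of the data
    nll.backward()
    opt.step()

# --- step 2: distil the fitted mixture into a small neural estimator ---
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
with torch.no_grad():
    target = gmm_log_prob(x).unsqueeze(1)   # GMM log-density as soft target

opt2 = torch.optim.Adam(net.parameters(), lr=1e-2)
for _ in range(300):
    opt2.zero_grad()
    loss = ((net(x) - target) ** 2).mean()  # regress the prior's log-density
    loss.backward()
    opt2.step()
```

After training, `net` gives a smooth neural approximation of the density whose shape is anchored by the mixture prior; the real project's coupling between the two blocks is of course more principled than this plain distillation.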

Outcomes & next steps ✨

  • Published the concept in the conference proceedings linked above.
  • The experiments demonstrated the advantage of structured priors for sample-efficient density estimation.
  • Next milestone: integrate the hybrid block into generative modeling pipelines and explore its behavior on real-world, noisy datasets.

👋 See you, space cowboy.