Generative modeling is the task of accurately learning a data distribution $p(x)$, with the end goal of evaluating the probability of some specific sample or generating new samples from the distribution. To do this, we must either explicitly define a model that parameterizes the distribution or implicitly train a model that can sample from it.

Explicit Density

Modeling a completely flexible $p(x)$ is intractable due to the exponential number of parameters required. We thus use 🪩 Probabilistic Graphical Models that have some tractable parameterization, or approximation methods that simplify the distribution.
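
To see the blow-up concretely (a standard counting argument, notation mine): a fully general joint over $n$ binary variables must store a probability for every configuration, so

\[
p(x_1, \dots, x_n) \ \text{needs} \ 2^n - 1 \ \text{free parameters (one per configuration, minus normalization).}
\]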

  1. ๐Ÿ•ฐ๏ธ Autoregressive Models flexibly captureย ย by conditioning each dimension on the ones before it. One landmark example is ๐Ÿ PixelCNN.
  2. 💦 Normalizing Flow models transform a simple distribution into $p(x)$ via invertible operations.
  3. ๐Ÿ–‹๏ธ Variational Autoencoders map a latent spaceย  to our desired distribution.
  4. โšก๏ธ Energy-Based Models assume the Boltzmann distribution and model the energy of the distribution rather than the distribution itself.

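To make the autoregressive factorization concrete, here is a minimal sketch of the masked convolution at the heart of PixelCNN, assuming PyTorch; the class name and layer sizes are illustrative, not taken from the paper's code.

```python
import torch
import torch.nn as nn

class MaskedConv2d(nn.Conv2d):
    """Convolution whose kernel is masked so each output pixel depends only
    on pixels above it and to its left (raster-scan order), which is what
    makes the factorization p(x) = prod_i p(x_i | x_<i) hold."""

    def __init__(self, mask_type, *args, **kwargs):
        super().__init__(*args, **kwargs)
        assert mask_type in ("A", "B")  # 'A' (first layer) also hides the center pixel
        kH, kW = self.kernel_size
        mask = torch.ones(kH, kW)
        mask[kH // 2, kW // 2 + (mask_type == "B"):] = 0  # center row: hide center/right
        mask[kH // 2 + 1:, :] = 0                         # hide all rows below center
        self.register_buffer("mask", mask[None, None])    # broadcast over (out, in) dims

    def forward(self, x):
        self.weight.data *= self.mask  # zero out "future" weights before every conv
        return super().forward(x)

# Illustrative first layer of a PixelCNN on 1-channel images
layer = MaskedConv2d("A", in_channels=1, out_channels=64, kernel_size=7, padding=3)
out = layer(torch.randn(8, 1, 28, 28))  # -> (8, 64, 28, 28)
```

Stacking such layers lets every output pixel's predicted distribution depend only on already-generated pixels, so sampling proceeds pixel by pixel in raster order.
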
Implicit Density

Some models focus on carefully defining objectives rather than deriving them from an analytical distribution. The most famous example is the 🖼️ Generative Adversarial Network, which optimizes a generator and discriminator in a two-player minimax game.
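
For reference, the standard minimax objective (Goodfellow et al., 2014), with generator $G$, discriminator $D$, and noise prior $p_z$:

\[
\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}\left[\log D(x)\right] + \mathbb{E}_{z \sim p_z}\left[\log\left(1 - D(G(z))\right)\right]
\]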