diff --git a/paper.md b/paper.md
index 67bcac5..58fa169 100644
--- a/paper.md
+++ b/paper.md
@@ -31,8 +31,8 @@ In addition to its core functionalities, `ISL` offers a suite of utility functio
 # Methods
 
-Implicit generative models employ an $m$-dimensional latent random variable (r.v.) $\mathbf{z}$ to simulate random samples from a prescribed $n$-dimensional target probability distribution. To be precise, the latent variable undergoes a transformation through a deterministic function $g_{\theta}$, which maps $\mathbb{R}^m \mapsto \mathbb{R}^n$ using the parameter set $\theta$. Given the model capability to generate samples with ease, various techniques can be employed for contrasting two sample collections: one originating from the genuine data distribution and the other from the \jm{model} distribution. This approach essentially constitutes a methodology for \jm{the approximation of probability distributions} via comparison.
-Generative adversarial networks (GANs) [goodfellow2014generative], $f$-GANs [nowozin2016f], Wasserstein-GANs (WGANs) [arjovsky2017wasserstein], adversarial variational Bayes (AVB) [mescheder2017adversarial], and \jm{maximum mean-miscrepancy} (MMD) GANs [li2017mmd] are some popular methods that fall within this framework.
+Implicit generative models employ an $m$-dimensional latent random variable (r.v.) $\mathbf{z}$ to simulate random samples from a prescribed $n$-dimensional target probability distribution. To be precise, the latent variable undergoes a transformation through a deterministic function $g_{\theta}$, which maps $\mathbb{R}^m \to \mathbb{R}^n$ using the parameter set $\theta$. Since the model can generate samples with ease, various techniques can be employed to contrast two sample collections: one originating from the genuine data distribution and the other from the model distribution. This approach essentially constitutes a methodology for the approximation of probability distributions via comparison.
+Generative adversarial networks (GANs) [goodfellow2014generative], $f$-GANs [nowozin2016f], Wasserstein GANs (WGANs) [arjovsky2017wasserstein], adversarial variational Bayes (AVB) [mescheder2017adversarial], and maximum mean discrepancy (MMD) GANs [li2017mmd] are some popular methods that fall within this framework.
-Approximation of 1-dimensional (1D) parametric distributions is a seemingly naive problem for which the above-mentioned models can perform below expectations. In [zaheer2017gan], the authors report that various types of GANs struggle to approximate relatively simple distributions from samples, emerging with MMD-GAN as the most promising technique. However, the latter implements a kernelized extension of a moment-matching criterion defined over a reproducing kernel Hilbert space, and consequently, the objective function is expensive to compute.
+Approximation of 1-dimensional (1D) parametric distributions is a seemingly naive problem for which the above-mentioned models can perform below expectations. In [zaheer2017gan], the authors report that various types of GANs struggle to approximate relatively simple distributions from samples, with MMD-GAN emerging as the most promising technique. However, the latter implements a kernelized extension of a moment-matching criterion defined over a reproducing kernel Hilbert space, and consequently, the objective function is expensive to compute.
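
To make the sample-comparison idea in the hunk above concrete, here is a minimal, self-contained sketch of an implicit generative model together with a squared-MMD estimate between real and generated samples. This is illustrative only and is not part of `ISL` or its API; the generator `g_theta` (a fixed affine map standing in for a trained network), the Gaussian-kernel bandwidth, and the sample sizes are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical generator g_theta: R^m -> R^n. A fixed affine map stands in
# for a trained network; theta = (W, b) here, purely for illustration.
m, n = 4, 1
W = rng.normal(size=(n, m))
b = np.full(n, 0.5)

def g_theta(z):
    return z @ W.T + b

def mmd2(x, y, bandwidth=1.0):
    """Biased (V-statistic) estimate of the squared MMD with a Gaussian kernel."""
    def k(a, c):
        d2 = np.sum((a[:, None, :] - c[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2.0 * bandwidth ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

# One sample collection from the "genuine" data distribution, one from the model.
x_real = rng.normal(loc=1.0, scale=2.0, size=(500, n))
z = rng.normal(size=(500, m))   # m-dimensional latent r.v.
x_fake = g_theta(z)             # pushed through the deterministic map

print(f"squared-MMD estimate: {mmd2(x_real, x_fake):.4f}")
```

The quadratic number of pairwise kernel evaluations inside `mmd2` illustrates why MMD-style objectives become expensive as the number of samples grows, which is the computational drawback the last paragraph of the hunk points to.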