Abstract
We establish estimation bounds for several divergences, including total variation, Kullback-Leibler (KL) divergence, Hellinger divergence, and Pearson χ2 divergence, for the GAN estimator. We first derive an inequality relating the empirical and population objective functions of the GAN model, which yields almost sure convergence rates. This inequality is then used to obtain estimation bounds for the four divergences above, expressed as almost sure convergence rates together with the difference between the expected outputs of the discriminator on real and generated data. Our results improve on several existing ones, which correspond to special cases of the general objective function considered here.
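For reference, the divergences named above admit standard definitions as f-divergences; the sketch below recalls one common convention (conventions may differ by constant factors), with notation that is ours rather than the paper's: p and q denote densities of the target and generator distributions.

% Standard definitions of the four divergences (our notation, assumed rather than
% quoted from the paper); p is the density of the target distribution and q that
% of the generator distribution.
\begin{align*}
  \mathrm{TV}(p, q)        &= \tfrac{1}{2}\int \lvert p(x) - q(x) \rvert \, dx, \\
  \mathrm{KL}(p \,\|\, q)  &= \int p(x)\,\log\frac{p(x)}{q(x)} \, dx, \\
  \mathrm{H}^{2}(p, q)     &= \int \bigl(\sqrt{p(x)} - \sqrt{q(x)}\bigr)^{2} \, dx, \\
  \chi^{2}(p \,\|\, q)     &= \int \frac{\bigl(p(x) - q(x)\bigr)^{2}}{q(x)} \, dx.
\end{align*}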