f-divergence Analysis of Generative Adversarial Network
By: Mahmud Hasan and Hailin Sang
Open Access | Dec 2025

Abstract

We establish estimation bounds for several f-divergences, including total variation, Kullback-Leibler (KL) divergence, Hellinger divergence, and Pearson χ² divergence, within the GAN estimator. We first derive an inequality relating the empirical and population objective functions of the GAN model, which yields almost sure convergence rates. We then employ this inequality to obtain estimation bounds for the total variation, KL, Hellinger, and Pearson χ² divergences, leading to almost sure convergence rates and to bounds on the difference between the expected outputs of the discriminator on real and generated data. Our results improve on some existing ones, which correspond to a specific case of the general objective function.
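As a concrete reference for the four divergences the abstract names, the following is a minimal sketch of their standard definitions on finite discrete distributions. This is purely illustrative of the divergence definitions themselves, not of the paper's GAN estimator or its bounds; note also that normalization conventions vary in the literature (e.g., for the Hellinger distance).

```python
import math

def total_variation(p, q):
    """Total variation distance: 0.5 * sum_i |p_i - q_i|."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q).

    Assumes q_i > 0 wherever p_i > 0 (absolute continuity);
    terms with p_i = 0 contribute 0 by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def hellinger_squared(p, q):
    """Squared Hellinger distance: 0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2.

    (Some references omit the 1/2 factor.)
    """
    return 0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))

def pearson_chi2(p, q):
    """Pearson chi-squared divergence: sum_i (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))
```

Each of these is an f-divergence, i.e., of the form sum_i q_i * f(p_i / q_i) for a convex f with f(1) = 0, which is what allows them to be analyzed within a single general objective-function framework.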

DOI: https://doi.org/10.2478/fcds-2025-0018 | Journal eISSN: 2300-3405 | Journal ISSN: 0867-6356
Language: English
Page range: 451 - 472
Submitted on: May 19, 2025
Accepted on: Oct 16, 2025
Published on: Dec 8, 2025
Published by: Poznan University of Technology
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2025 Mahmud Hasan, Hailin Sang, published by Poznan University of Technology
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.