
References

  1. Bassetti, F., Bodini, A. and Regazzini, E. (2006). On minimum Kantorovich distance estimators, Statistics and Probability Letters 76(12): 1298–1302, DOI: 10.1016/j.spl.2006.02.001.
  2. Bishop, C.M. (2006). Pattern Recognition and Machine Learning, Springer, New York.
  3. Dempster, A.P., Laird, N.M. and Rubin, D.B. (1977). Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society: Series B (Methodological) 39(1): 1–22, DOI: 10.1111/j.2517-6161.1977.tb01600.x.
  4. Elkan, C. (1997). Boosting and naive Bayesian learning, Proceedings of the International Conference on Knowledge Discovery and Data Mining, Newport Beach, USA.
  5. Flach, P.A. and Lachiche, N. (2004). Naive Bayesian classification of structured data, Machine Learning 57(3): 233–269, DOI: 10.1023/B:MACH.0000039778.69032.ab.
  6. Giorgi, G.M. and Gigliarano, C. (2017). The Gini concentration index: A review of the inference literature, Journal of Economic Surveys 31(4): 1130–1148, DOI: 10.1111/joes.12185.
  7. Greenspan, H., Ruf, A. and Goldberger, J. (2006). Constrained Gaussian mixture model framework for automatic segmentation of MR brain images, IEEE Transactions on Medical Imaging 25(9): 1233–1245, DOI: 10.1109/TMI.2006.880668.
  8. Kłopotek, R., Kłopotek, M. and Wierzchoń, S. (2020). A feasible k-means kernel trick under non-Euclidean feature space, International Journal of Applied Mathematics and Computer Science 30(4): 703–715, DOI: 10.34768/amcs-2020-0052.
  9. Kulczycki, P. (2018). Kernel estimators for data analysis, in M. Ram and J.P. Davim (Eds), Advanced Mathematical Techniques in Engineering Sciences, CRC/Taylor & Francis, Boca Raton, pp. 177–202, DOI: 10.1201/b22440-10.
  10. López-Lobato, A.L. and Avendaño-Garrido, M.L. (2020). Using the Gini index for a Gaussian mixture model, in L. Martínez-Villaseñor et al. (Eds), Advances in Computational Intelligence. MICAI 2020, Lecture Notes in Computer Science, Vol. 12469, Springer, Cham, pp. 403–418, DOI: 10.1007/978-3-030-60887-3_35.
  11. Mao, C., Lu, L. and Hu, B. (2020). Local probabilistic model for Bayesian classification: A generalized local classification model, Applied Soft Computing 93: 106379, DOI: 10.1016/j.asoc.2020.106379.
  12. Meng, X.-L. and Rubin, D.B. (1994). On the global and componentwise rates of convergence of the EM algorithm, Linear Algebra and its Applications 199(Supp. 1): 413–425.
  13. Povey, D., Burget, L., Agarwal, M., Akyazi, P., Kai, F., Ghoshal, A., Glembek, O., Goel, N., Karafiát, M., Rastrow, A., Rose, R., Schwarz, P. and Thomas, S. (2011). The subspace Gaussian mixture model: A structured model for speech recognition, Computer Speech & Language 25(2): 404–439, DOI: 10.1016/j.csl.2010.06.003.
  14. Rachev, S., Klebanov, L., Stoyanov, S. and Fabozzi, F. (2013). The Methods of Distances in the Theory of Probability and Statistics, Springer, New York.
  15. Reynolds, D.A. (2009). Gaussian mixture models, in S.Z. Li (Ed.), Encyclopedia of Biometrics, Springer, New York, pp. 659–663, DOI: 10.1007/978-0-387-73003-5_196.
  16. Rubner, Y., Tomasi, C. and Guibas, L.J. (2000). The Earth mover’s distance as a metric for image retrieval, International Journal of Computer Vision 40(2): 99–121.
  17. Singh, R., Pal, B.C. and Jabr, R.A. (2009). Statistical representation of distribution system loads using Gaussian mixture model, IEEE Transactions on Power Systems 25(1): 29–37, DOI: 10.1109/TPWRS.2009.2030271.
  18. Torres-Carrasquillo, P.A., Reynolds, D.A. and Deller, J.R. (2002). Language identification using Gaussian mixture model tokenization, 2002 IEEE International Conference on Acoustics, Speech, and Signal Processing, Orlando, USA, pp. I–757.
  19. Ultsch, A. and Lötsch, J. (2017). A data science based standardized Gini index as a Lorenz dominance preserving measure of the inequality of distributions, PloS One 12(8): e0181572, DOI: 10.1371/journal.pone.0181572.
  20. Vaida, F. (2005). Parameter convergence for EM and MM algorithms, Statistica Sinica 15: 831–840.
  21. Villani, C. (2003). Topics in Optimal Transportation, American Mathematical Society, Providence, DOI: 10.1090/gsm/058.
  22. Xu, L. and Jordan, M.I. (1996). On convergence properties of the EM algorithm for Gaussian mixtures, Neural Computation 8(1): 129–151, DOI: 10.1162/neco.1996.8.1.129.
DOI: https://doi.org/10.34768/amcs-2021-0033 | Journal eISSN: 2083-8492 | Journal ISSN: 1641-876X
Language: English
Page range: 487 - 500
Submitted on: Mar 29, 2021
Accepted on: Jul 1, 2021
Published on: Sep 27, 2021
Published by: Sciendo

© 2021 Adriana Laura López-Lobato, Martha Lorena Avendaño-Garrido, published by Sciendo
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.