
Linear discriminant analysis with a generalization of the Moore–Penrose pseudoinverse

Open Access | Jun 2013

DOI: https://doi.org/10.2478/amcs-2013-0035 | Journal eISSN: 2083-8492 | Journal ISSN: 1641-876X
Language: English
Page range: 463 - 471
Published on: Jun 28, 2013
Published by: University of Zielona Góra
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2013 Tomasz Górecki, Maciej Łuczak, published by University of Zielona Góra
This work is licensed under the Creative Commons License.