A feasible k-means kernel trick under non-Euclidean feature space
Open Access | Dec 2020

References

  1. Ackerman, M., Ben-David, S. and Loker, D. (2010). Towards property-based classification of clustering paradigms, in J. Lafferty et al. (Eds), Advances in Neural Information Processing Systems 23, Curran Associates, Red Hook, NY, pp. 10–18.
  2. Balaji, R. and Bapat, R. (2007). On Euclidean distance matrices, Linear Algebra and Its Applications 424(1): 108–117, DOI: 10.1016/j.laa.2006.05.013.
  3. Bao, T. and Kadobayashi, Y. (2008). On tighter inequalities for efficient similarity search in metric spaces, IAENG International Journal of Computer Science 35(3): IJCS_35_3_17.
  4. Bradley, P.S., Mangasarian, O.L. and Street, W.N. (1996). Clustering via concave minimization, Proceedings of the 9th International Conference on Neural Information Processing Systems, NIPS’96, Denver, CO, USA, pp. 368–374.
  5. Cailliez, F. (1983). The analytical solution of the additive constant problem, Psychometrika 48(2): 305–308, DOI: 10.1007/BF02294026.
  6. Chitta, R., Jin, R., Havens, T. and Jain, A. (2011). Approximate kernel k-means: Solution to large scale kernel clustering, Proceedings of the 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD’11, San Diego, CA, USA, pp. 895–903.
  7. Choi, H. and Choi, S. (2005). Kernel Isomap on noisy manifold, 4th International Conference on Development and Learning, Osaka, Japan, pp. 208–213.
  8. Cox, T.F. and Cox, M.A.A. (2001). Multidimensional Scaling, 2nd Edn, Chapman & Hall, London, DOI: 10.1201/9780367801700.
  9. Demontis, A., Melis, M., Biggio, B., Fumera, G. and Roli, F. (2017). Super-sparse learning in similarity spaces, CoRR: abs/1712.06131.
  10. Dokmanic, I., Parhizkar, R., Ranieri, J. and Vetterli, M. (2015). Euclidean distance matrices: A short walk through theory, algorithms and applications, CoRR: abs/1502.07541.
  11. Du, L., Zhou, P., Shi, L., Wang, H., Fan, M., Wang, W. and Shen, Y. (2015). Robust multiple kernel k-means using l21-norm, Proceedings of the 24th International Conference on Artificial Intelligence, IJCAI’15, Buenos Aires, Argentina, pp. 3476–3482.
  12. Gisbrecht, A. and Schleif, F.-M. (2015). Metric and non-metric proximity transformations at linear costs, Neurocomputing 167(1): 643–657, DOI: 10.1016/j.neucom.2015.04.017.
  13. Gonzalez, J. and Munoz, A. (2010). Representing functional data in reproducing kernel Hilbert spaces with applications to clustering and classification, Statistics and Econometrics Series 013, Working Paper 10-27.
  14. Gower, J.C. (1982). Euclidean distance geometry, The Mathematical Scientist 7: 1–14.
  15. Gower, J.C. (1985). Properties of Euclidean and non-Euclidean distance matrices, Linear Algebra and Its Applications 67: 81–97, DOI: 10.1016/0024-3795(85)90187-9.
  16. Gower, J. and Legendre, P. (1986). Metric and Euclidean properties of dissimilarity coefficients, Journal of Classification 3(1): 5–48, DOI: 10.1007/BF01896809.
  17. Handhayani, T. and Hiryanto, L. (2015). Intelligent kernel k-means for clustering gene expression, International Conference on Computer Science and Computational Intelligence (ICCSCI 2015), Jakarta, Indonesia, Vol. 59, pp. 171–177.
  18. Higham, N.J. (1988). Computing a nearest symmetric positive semidefinite matrix, Linear Algebra and Its Applications 103: 103–118, DOI: 10.1016/0024-3795(88)90223-6.
  19. Hofmann, T., Schölkopf, B. and Smola, A.J. (2008). Kernel methods in machine learning, Annals of Statistics 36(3): 1171–1220, DOI: 10.1214/009053607000000677.
  20. Jacobs, D., Weinshall, D. and Gdalyahu, Y. (2000). Classification with nonmetric distances: Image retrieval and class representation, IEEE Transactions on Pattern Analysis and Machine Intelligence 22(6): 583–600, DOI: 10.1109/34.862197.
  21. Jain, A. and Zongker, D. (1998). Representation and recognition of handwritten digits using deformable templates, IEEE Transactions on Pattern Analysis and Machine Intelligence 19(12): 1386–1390, DOI: 10.1109/34.643899.
  22. Jaworski, M. (2018). Regression function and noise variance tracking methods for data streams with concept drift, International Journal of Applied Mathematics and Computer Science 28(3): 559–567, DOI: 10.2478/amcs-2018-0043.
  23. Kashima, H., Hu, J., Ray, B. and Singh, M. (2008). K-means clustering of proportional data using l1 distance, Proceedings of the 19th International Conference on Pattern Recognition, Tampa, FL, USA, DOI: 10.1109/ICPR.2008.4760982.
  24. Kleinberg, J. (2002). An impossibility theorem for clustering, Proceedings of the 16th Neural Information Processing Systems Conference, NIPS 2002, Vancouver, BC, Canada, pp. 446–453.
  25. Kłopotek, M. (2019). On the existence of kernel function for kernel-trick of k-means in the light of Gower theorem, Fundamenta Informaticae 168(1): 25–43, DOI: 10.3233/FI-2019-1822.
  26. Legendre, P. and Legendre, L. (1998). Numerical Ecology, 2nd Edn, Elsevier, Amsterdam.
  27. Li, C., Georgiopoulos, M. and Anagnostopoulos, G.C. (2013). Kernel-based distance metric learning in the output space, International Joint Conference on Neural Networks (IJCNN), Dallas, TX, USA, pp. 1–8.
  28. Lingoes, J. (1971). Some boundary conditions for a monotone analysis of symmetric matrices, Psychometrika 36(2): 195–203, DOI: 10.1007/BF02291398.
  29. Loosli, G., Canu, S. and Ong, C. (2016). Learning SVM in Kreĭn spaces, IEEE Transactions on Pattern Analysis and Machine Intelligence 38(6): 1204–1216, DOI: 10.1109/TPAMI.2015.2477830.
  30. Marin, D., Tang, M., Ayed, I.B. and Boykov, Y. (2019). Kernel clustering: Density biases and solutions, IEEE Transactions on Pattern Analysis and Machine Intelligence 41(1): 136–147, DOI: 10.1109/TPAMI.2017.2780166.
  31. Marteau, P.-F. (2019). Times series averaging and denoising from a probabilistic perspective on time-elastic kernels, International Journal of Applied Mathematics and Computer Science 29(2): 375–392, DOI: 10.2478/amcs-2019-0028.
  32. Pan, V.Y. and Chen, Z.Q. (1999). The complexity of the matrix eigenproblem, Proceedings of the 31st Annual ACM Symposium on Theory of Computing, STOC’99, Atlanta, GA, USA, pp. 507–516.
  33. Pękalska, E. and Duin, R. (2005). The Dissimilarity Representation for Pattern Recognition: Foundations and Applications (Machine Perception and Artificial Intelligence), World Scientific, River Edge, NJ, DOI: 10.1142/5965.
  34. Pękalska, E., Paclík, P. and Duin, R. (2002). A generalized kernel approach to dissimilarity-based classification, Journal of Machine Learning Research 2: 175–211.
  35. Qi, H. (2016). A convex matrix optimization for constant problem in multidimensional scaling with application to LLE, SIAM Journal on Optimization 26(4): 2564–2590, DOI: 10.1137/15M1043133.
  36. Richter, R., Kyprianidis, J., Springborn, B. and Alexa, M. (2017). Constrained modelling of 3-valent meshes using a hyperbolic deformation metric, Computer Graphics Forum 36(6): 62–75, DOI: 10.1111/cgf.12805.
  37. Roth, V., Laub, J., Kawanabe, M. and Buhmann, J. (2003). Optimal cluster preserving embedding of nonmetric proximity data, IEEE Transactions on Pattern Analysis and Machine Intelligence 25(12): 1540–1551, DOI: 10.1109/TPAMI.2003.1251147.
  38. Sarma, T., Vishwanath, P. and Reddy, B. (2013). Single pass kernel k-means clustering method, Sadhana 38(3): 407–419, DOI: 10.1007/s12046-013-0143-3.
  39. Schleif, F. and Tino, P. (2015). Indefinite proximity learning: A review, Neural Computation 27(10): 2039–2096, DOI: 10.1162/NECO_a_00770.
  40. Schoenberg, I.J. (1938). Metric spaces and positive definite functions, Transactions of the American Mathematical Society 44(3): 522–536, DOI: 10.1090/S0002-9947-1938-1501980-0.
  41. Shawe-Taylor, J. and Cristianini, N. (2004). Kernel Methods for Pattern Analysis, Cambridge University Press, Cambridge, DOI: 10.1017/CBO9780511809682.
  42. Szalkai, B. (2013). An implementation of the relational k-means algorithm, CoRR: abs/1304.6899.
  43. Szekely, G. and Rizzo, M. (2014). Partial distance correlation with methods for dissimilarities, CoRR: abs/1310.2926.
  44. Tzortzis, G. and Likas, A.C. (2009). The global kernel k-means algorithm for clustering in feature space, IEEE Transactions on Neural Networks 20(7): 1181–1194, DOI: 10.1109/TNN.2009.2019722.
  45. Villmann, T., Kaden, M., Nebel, D. and Bohnsack, A. (2016). Similarities, dissimilarities and types of inner products for data analysis in the context of machine learning, in L. Rutkowski et al. (Eds), Artificial Intelligence and Soft Computing: 15th International Conference, ICAISC 2016, Springer, Cham, pp. 125–133, DOI: 10.1007/978-3-319-39384-1_11.
  46. Wierzchoń, S. and Kłopotek, M. (2018). Modern Clustering Algorithms, Springer, Cham.
  47. Yao, Y. and Chen, H. (2018). Multiple kernel k-means clustering by selecting representative kernels, CoRR: abs/1811.00264.
DOI: https://doi.org/10.34768/amcs-2020-0052 | Journal eISSN: 2083-8492 | Journal ISSN: 1641-876X
Language: English
Page range: 703 - 715
Submitted on: Feb 17, 2020
Accepted on: Jul 17, 2020
Published on: Dec 31, 2020
Published by: University of Zielona Góra
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2020 Robert Kłopotek, Mieczysław Kłopotek, Sławomir Wierzchoń, published by University of Zielona Góra
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.