Ensembles of instance selection methods: A comparative study
By: Marcin Blachnik  
Open Access
Mar 2019

References

  1. Abdi, H. (2010). Holm’s sequential Bonferroni procedure, Encyclopedia of Research Design 1(8): 620–627.
  2. Aha, D., Kibler, D. and Albert, M. (1991). Instance-based learning algorithms, Machine Learning 6(1): 37–66, DOI: 10.1007/BF00153759.
  3. Alcalá-Fdez, J., Fernández, A., Luengo, J., Derrac, J., García, S., Sanchez, L. and Herrera, F. (2011). KEEL data-mining software tool: Data set repository, integration of algorithms and experimental analysis framework, Journal of Multiple-Valued Logic & Soft Computing 17: 255–287.
  4. Arnaiz-González, Á., Blachnik, M., Kordos, M. and García-Osorio, C. (2016a). Fusion of instance selection methods in regression tasks, Information Fusion 30: 69–79, DOI: 10.1016/j.inffus.2015.12.002.
  5. Arnaiz-González, Á., Díez-Pastor, J., Rodríguez, J.J. and García-Osorio, C.I. (2016b). Instance selection for regression: Adapting DROP, Neurocomputing 201: 66–81, DOI: 10.1016/j.neucom.2016.04.003.
  6. Bauer, E. and Kohavi, R. (1999). An empirical comparison of voting classification algorithms: Bagging, boosting, and variants, Machine Learning 36(1): 105–139, DOI: 10.1023/A:1007515423169.
  7. Bezdek, J.C., Ehrlich, R. and Full, W. (1984). FCM: The fuzzy C-means clustering algorithm, Computers & Geosciences 10(2–3): 191–203, DOI: 10.1016/0098-3004(84)90020-7.
  8. Bhattacharya, B., Poulsen, R. and Toussaint, G. (1984). Application of proximity graphs to editing nearest neighbor decision rules, International Symposium on Information Theory, Santa Monica, CA, USA, pp. 97–108.
  9. Blachnik, M. (2014). Ensembles of instance selection methods based on feature subset, Procedia Computer Science 35: 388–396, DOI: 10.1016/j.procs.2014.08.119.
  10. Blachnik, M. and Kordos, M. (2014). Bagging of instance selection algorithms, International Conference on Artificial Intelligence and Soft Computing, Zakopane, Poland, pp. 40–51, DOI: 10.1007/978-3-319-07176-3_4.
  11. Brighton, H. and Mellish, C. (2002). Advances in instance selection for instance-based learning algorithms, Data Mining and Knowledge Discovery 6(2): 153–172, DOI: 10.1023/A:1014043630878.
  12. Czarnowski, I. and Jędrzejowicz, P. (2015). Ensemble online classifier based on the one-class base classifiers for mining data streams, Cybernetics and Systems 46(1–2): 51–68, DOI: 10.1080/01969722.2015.1007736.
  13. Freund, Y. and Schapire, R.E. (1996). Experiments with a new boosting algorithm, International Conference on Machine Learning, Bari, Italy, pp. 148–156.
  14. Freund, Y. and Schapire, R.E. (1997). A decision-theoretic generalization of on-line learning and an application to boosting, Journal of Computer and System Sciences 55(1): 119–139, DOI: 10.1006/jcss.1997.1504.
  15. Galar, M., Fernández, A., Barrenechea, E., Bustince, H. and Herrera, F. (2011). An overview of ensemble methods for binary classifiers in multi-class problems: Experimental study on one-vs-one and one-vs-all schemes, Pattern Recognition 44(8): 1761–1776, DOI: 10.1016/j.patcog.2011.01.017.
  16. García-Osorio, C., de Haro-García, A. and García-Pedrajas, N. (2010). Democratic instance selection: A linear complexity instance selection algorithm based on classifier ensemble concepts, Artificial Intelligence 174(4–5): 410–441, DOI: 10.1016/j.artint.2010.01.001.
  17. García, S., Derrac, J., Cano, J.R. and Herrera, F. (2012). Prototype selection for nearest neighbor classification: Taxonomy and empirical study, IEEE Transactions on Pattern Analysis and Machine Intelligence 34(3): 417–435, DOI: 10.1109/TPAMI.2011.142.
  18. García-Pedrajas, N. (2009). Constructing ensembles of classifiers by means of weighted instance selection, IEEE Transactions on Neural Networks 20(2): 258–277, DOI: 10.1109/TNN.2008.2005496.
  19. García-Pedrajas, N. and De Haro-García, A. (2014). Boosting instance selection algorithms, Knowledge-Based Systems 67: 342–360, DOI: 10.1016/j.knosys.2014.04.021.
  20. García, S., Luengo, J. and Herrera, F. (2016). Tutorial on practical tips of the most influential data preprocessing algorithms in data mining, Knowledge-Based Systems 98: 1–29, DOI: 10.1016/j.knosys.2015.12.006.
  21. Grochowski, M. and Jankowski, N. (2004). Comparison of instance selection algorithms. II: Results and comments, Lecture Notes in Computer Science, Vol. 3070, pp. 580–585, DOI: 10.1007/978-3-540-24844-6_87.
  22. Gunn, I.A., Arnaiz-González, Á. and Kuncheva, L.I. (2018). A taxonomic look at instance-based stream classifiers, Neurocomputing 286: 167–178, DOI: 10.1016/j.neucom.2018.01.062.
  23. Hart, P. (1968). The condensed nearest neighbor rule, IEEE Transactions on Information Theory 14(3): 515–516, DOI: 10.1109/TIT.1968.1054155.
  24. Herrera, F. (2005). KEEL: Knowledge extraction based on evolutionary learning, Spanish National Projects TIC2002-04036-C05, TIN2005-08386-C05 and TIN2008-06681-C06, http://www.keel.es.
  25. Jacobs, R.A., Jordan, M.I., Nowlan, S.J. and Hinton, G.E. (1991). Adaptive mixtures of local experts, Neural Computation 3(1): 79–87, DOI: 10.1162/neco.1991.3.1.79.
  26. Jankowski, N. and Grochowski, M. (2004). Comparison of instance selection algorithms. I: Algorithms survey, International Conference on Artificial Intelligence and Soft Computing, Zakopane, Poland, Vol. 3070, pp. 598–603, DOI: 10.1007/978-3-540-24844-6_90.
  27. Kordos, M. and Blachnik, M. (2012). Instance selection with neural networks for regression problems, International Conference on Artificial Neural Networks, Lausanne, Switzerland, pp. 263–270, DOI: 10.1007/978-3-642-33266-1_33.
  28. Kordos, M. and Rusiecki, A. (2016). Reducing noise impact on MLP training, Soft Computing 20(1): 49–65, DOI: 10.1007/s00500-015-1690-9.
  29. Kuncheva, L. (2004). Combining Pattern Classifiers: Methods and Algorithms, Wiley, Hoboken, NJ, DOI: 10.1002/0471660264.
  30. Kuncheva, L.I., Bezdek, J.C. and Duin, R.P. (2001). Decision templates for multiple classifier fusion: An experimental comparison, Pattern Recognition 34(2): 299–314, DOI: 10.1016/S0031-3203(99)00223-X.
  31. Marchiori, E. (2008). Hit miss networks with applications to instance selection, Journal of Machine Learning Research 9(Jun): 997–1017.
  32. Marchiori, E. (2010). Class conditional nearest neighbor for large margin instance selection, IEEE Transactions on Pattern Analysis and Machine Intelligence 32(2): 364–370, DOI: 10.1109/TPAMI.2009.164.
  33. Raviv, Y. and Intrator, N. (1996). Bootstrapping with noise: An effective regularization technique, Connection Science 8(3–4): 355–372, DOI: 10.1080/095400996116811.
  34. Rokach, L. (2009). Taxonomy for characterizing ensemble methods in classification tasks: A review and annotated bibliography, Computational Statistics & Data Analysis 53(12): 4046–4072, DOI: 10.1016/j.csda.2009.07.017.
  35. Schapire, R.E. (1990). The strength of weak learnability, Machine Learning 5(2): 197–227, DOI: 10.1007/BF00116037.
  36. Sebban, M., Nock, R. and Lallich, S. (2002). Stopping criterion for boosting-based data reduction techniques: From binary to multiclass problem, Journal of Machine Learning Research 3(Dec): 863–885.
  37. Shaker, A. and Hüllermeier, E. (2012). IBLStreams: A system for instance-based classification and regression on data streams, Evolving Systems 3(4): 235–249, DOI: 10.1007/s12530-012-9059-0.
  38. Skurichina, M. and Duin, R.P. (2001). Bagging and the random subspace method for redundant feature spaces, International Workshop on Multiple Classifier Systems, Cagliari, Italy, pp. 1–10, DOI: 10.1007/3-540-48219-9_1.
  39. Song, Y., Liang, J., Lu, J. and Zhao, X. (2017). An efficient instance selection algorithm for k nearest neighbor regression, Neurocomputing 251: 26–34, DOI: 10.1016/j.neucom.2017.04.018.
  40. Tomek, I. (1976). An experiment with the edited nearest-neighbor rule, IEEE Transactions on Systems, Man, and Cybernetics 6: 448–452, DOI: 10.1109/TSMC.1976.4309523.
  41. Wilson, D. (1972). Asymptotic properties of nearest neighbor rules using edited data, IEEE Transactions on Systems, Man and Cybernetics 2: 408–421, DOI: 10.1109/TSMC.1972.4309137.
  42. Wilson, D. and Martinez, T. (2000). Reduction techniques for instance-based learning algorithms, Machine Learning 38(3): 257–286, DOI: 10.1023/A:1007626913721.
  43. Wolpert, D.H. (1992). Stacked generalization, Neural Networks 5(2): 241–259, DOI: 10.1016/S0893-6080(05)80023-1.
  44. Woźniak, M., Graña, M. and Corchado, E. (2014). A survey of multiple classifier systems as hybrid systems, Information Fusion 16: 3–17, DOI: 10.1016/j.inffus.2013.04.006.
  45. Zhu, J., Zou, H., Rosset, S. and Hastie, T. (2009). Multi-class AdaBoost, Statistics and Its Interface 2(3): 349–360, DOI: 10.4310/SII.2009.v2.n3.a8.
DOI: https://doi.org/10.2478/amcs-2019-0012 | Journal eISSN: 2083-8492 | Journal ISSN: 1641-876X
Language: English
Page range: 151–168
Submitted on: Mar 20, 2018
Accepted on: Nov 7, 2018
Published on: Mar 29, 2019
Published by: University of Zielona Góra
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2019 Marcin Blachnik, published by University of Zielona Góra
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.