
A New Noisy Random Forest Based Method for Feature Selection

Open Access | Jul 2021

References

  1. Akhiat, Y., M. Chahhou, A. Zinedine. Ensemble Feature Selection Algorithm. – International Journal of Intelligent Systems and Applications, Vol. 11, 2019, No 1, p. 24. DOI: 10.5815/ijisa.2019.01.03
  2. Akhiat, Y., M. Chahhou, A. Zinedine. Feature Selection Based on Pairwise Evaluation. – In: Proc. of 2017 Intelligent Systems and Computer Vision (ISCV'17), IEEE, 2017.
  3. Akhiat, Y., M. Chahhou, A. Zinedine. Feature Selection Based on Graph Representation. – In: Proc. of 5th International Congress on Information Science and Technology (CiSt'18), IEEE, 2018. DOI: 10.1109/CIST.2018.8596467
  4. Venkatesh, B., J. Anuradha. A Review of Feature Selection and Its Methods. – Cybernetics and Information Technologies, Vol. 19, 2019, No 1, pp. 3-26. DOI: 10.2478/cait-2019-0001
  5. Li, J., et al. Feature Selection: A Data Perspective. – ACM Computing Surveys (CSUR), Vol. 50, 2017, No 6, pp. 1-45. DOI: 10.1145/3136625
  6. Urbanowicz, R. J., et al. Relief-Based Feature Selection: Introduction and Review. – Journal of Biomedical Informatics, Vol. 85, 2018, pp. 189-203. DOI: 10.1016/j.jbi.2018.07.014
  7. Gu, Q., Z. Li, J. Han. Generalized Fisher Score for Feature Selection. – arXiv preprint arXiv:1202.3725, 2012.
  8. Huang, S. H. Supervised Feature Selection: A Tutorial. – Artif. Intell. Research, Vol. 4, 2015, No 2, pp. 22-37. DOI: 10.5430/air.v4n2p22
  9. Jović, A., K. Brkić, N. Bogunović. A Review of Feature Selection Methods with Applications. – In: Proc. of 38th International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO'15), IEEE, 2015. DOI: 10.1109/MIPRO.2015.7160458
  10. Chandrashekar, G., F. Sahin. A Survey on Feature Selection Methods. – Computers & Electrical Engineering, Vol. 40, 2014, No 1, pp. 16-28. DOI: 10.1016/j.compeleceng.2013.11.024
  11. Breiman, L. Random Forests. – Machine Learning, Vol. 45, 2001, No 1, pp. 5-32. DOI: 10.1023/A:1010933404324
  12. Díaz-Uriarte, R., S. A. de Andres. Gene Selection and Classification of Microarray Data Using Random Forest. – BMC Bioinformatics, Vol. 7, 2006, No 1, p. 3. DOI: 10.1186/1471-2105-7-3
  13. Breiman, L. Bagging Predictors. – Machine Learning, Vol. 24, 1996, No 2, pp. 123-140. DOI: 10.1007/BF00058655
  14. Strobl, C., et al. Bias in Random Forest Variable Importance Measures: Illustrations, Sources and a Solution. – BMC Bioinformatics, Vol. 8, 2007, No 1, p. 25. DOI: 10.1186/1471-2105-8-25
  15. Breiman, L., et al. Classification and Regression Trees. CRC Press, 1984.
  16. Genuer, R., J.-M. Poggi, C. Tuleau-Malot. Variable Selection Using Random Forests. – Pattern Recognition Letters, Vol. 31, 2010, No 14, pp. 2225-2236. DOI: 10.1016/j.patrec.2010.03.014
  17. Poggi, J. M., C. Tuleau. Classification supervisée en grande dimension. Application à l'agrément de conduite automobile. – Revue de Statistique Appliquée, Vol. LIV, 2006, No 4, pp. 39-58.
  18. Rakotomamonjy, A. Variable Selection Using SVM-Based Criteria. – Journal of Machine Learning Research, Vol. 3, March 2003, pp. 1357-1370.
  19. Díaz-Uriarte, R., S. A. de Andres. Gene Selection and Classification of Microarray Data Using Random Forest. – BMC Bioinformatics, Vol. 7, 2006, No 1, p. 3. DOI: 10.1186/1471-2105-7-3
  20. Menze, B. H., et al. A Comparison of Random Forest and Its Gini Importance with Standard Chemometric Methods for the Feature Selection and Classification of Spectral Data. – BMC Bioinformatics, Vol. 10, 2009, No 1, p. 213. DOI: 10.1186/1471-2105-10-213
  21. Jiang, H., et al. Joint Analysis of Two Microarray Gene-Expression Data Sets to Select Lung Adenocarcinoma Marker Genes. – BMC Bioinformatics, Vol. 5, 2004, No 1, p. 81.
  22. Deng, H., G. Runger. Gene Selection with Guided Regularized Random Forest. – Pattern Recognition, Vol. 46, 2013, No 12, pp. 3483-3489. DOI: 10.1016/j.patcog.2013.05.018
  23. Deng, H. Guided Random Forest in the RRF Package. – arXiv preprint arXiv:1306.0237, 2013.
  24. Dua, D., C. Graff. UCI Machine Learning Repository. Irvine, CA, University of California, School of Information and Computer Science, 2019. http://archive.ics.uci.edu/ml
  25. Venkatesh, B., J. Anuradha. A Review of Feature Selection and Its Methods. – Cybernetics and Information Technologies, Vol. 19, 2019, No 1, pp. 3-26. DOI: 10.2478/cait-2019-0001
  26. Louppe, G. Understanding Random Forests: From Theory to Practice. – arXiv preprint arXiv:1407.7502, 2014.
DOI: https://doi.org/10.2478/cait-2021-0016 | Journal eISSN: 1314-4081 | Journal ISSN: 1311-9702
Language: English
Page range: 10 - 28
Submitted on: Aug 27, 2020
Accepted on: Feb 22, 2021
Published on: Jul 1, 2021
Published by: Bulgarian Academy of Sciences, Institute of Information and Communication Technologies
In partnership with: Paradigm Publishing Services
Publication frequency: 4 times per year

© 2021 Yassine Akhiat, Youness Manzali, Mohamed Chahhou, Ahmed Zinedine, published by Bulgarian Academy of Sciences, Institute of Information and Communication Technologies
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.