Selecting the Optimal Classifier for Wrapper-Based Feature Selection Methods

Authors

  • Farzad Zandi Department of Mathematics and Computer Science, Arak Branch, Islamic Azad University, Arak, Iran
  • Parvaneh Mansouri Department of Mathematics and Computer Science, Arak Branch, Islamic Azad University, Arak, Iran
  • Reza Sheibani Department of Computer Engineering, Mashhad Branch, Islamic Azad University, Mashhad, Iran

DOI:

https://doi.org/10.53560/PPASA(61-3)848

Keywords:

Feature Selection, Wrapper-Based Methods, Metaheuristic Algorithms, Roulette Wheel, Optimal Classifier

Abstract

Dimensionality reduction, the elimination of irrelevant features, and the selection of an optimal subset of features are critical components in the construction of an effective machine learning model. Among the various feature selection methodologies, wrapper-based methods tend to yield superior results because they evaluate each candidate subset with the learning algorithm itself. Numerous metaheuristic methods have been employed for this feature selection process. A significant and complex issue in feature selection with these methods is choosing the most suitable classifier. In this study, we propose a novel method for selecting the optimal classifier during the feature selection process. We employ ten distinct classifiers with two swarm intelligence methods, namely the Bat Algorithm and the Grey Wolf Optimizer, and compute their results on four cancer datasets: Leukemia, SRBCT, Prostate, and Colon. Our findings demonstrate that the proposed method identifies the optimal classifier for each of the four datasets. Consequently, when employing wrapper-based methods to select features for a given dataset, the optimal classifier can be identified.
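
As a rough illustration of the two ingredients the abstract and keywords name, the minimal Python sketch below scores a candidate feature subset with several classifiers via cross-validated accuracy (the standard wrapper fitness) and then picks a classifier by roulette-wheel selection. This is not the authors' implementation: scikit-learn is assumed, the classifier pool and the random subset are placeholders, and the names wrapper_fitness and roulette_wheel are hypothetical. The paper's actual procedure, classifier pool, parameters, and datasets may differ.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Hypothetical wrapper fitness: mean cross-validated accuracy of a classifier
# restricted to the candidate feature subset (a boolean mask over columns).
def wrapper_fitness(mask, clf, X, y, cv=5):
    if not mask.any():  # an empty subset cannot be evaluated
        return 0.0
    return cross_val_score(clf, X[:, mask], y, cv=cv).mean()

# Roulette-wheel selection: sample an index with probability proportional
# to its score, so better-scoring classifiers are picked more often.
def roulette_wheel(scores, rng):
    p = np.asarray(scores, dtype=float)
    p = p / p.sum()
    return rng.choice(len(scores), p=p)

rng = np.random.default_rng(0)

# Placeholder data standing in for a microarray-style dataset.
X, y = make_classification(n_samples=80, n_features=50,
                           n_informative=8, random_state=0)

# A small placeholder pool; the paper uses ten classifiers.
classifiers = [KNeighborsClassifier(3), SVC(), DecisionTreeClassifier(random_state=0)]

# A random candidate subset, as a metaheuristic iteration might propose.
mask = rng.random(X.shape[1]) < 0.2

scores = [wrapper_fitness(mask, clf, X, y) for clf in classifiers]
chosen = roulette_wheel(scores, rng)
print(f"scores={np.round(scores, 3)}, pick: {type(classifiers[chosen]).__name__}")
```

In a full metaheuristic loop, the Bat or Grey Wolf update rules would presumably propose new masks at each iteration, with the roulette wheel biasing the search toward classifiers whose wrapper fitness has been highest.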

Published

2024-09-24

How to Cite

Zandi, F., Mansouri, P., & Sheibani, R. (2024). Selecting the Optimal Classifier for Wrapper-Based Feature Selection Methods. Proceedings of the Pakistan Academy of Sciences: A. Physical and Computational Sciences, 61(3). https://doi.org/10.53560/PPASA(61-3)848

Issue

Vol. 61 No. 3 (2024)

Section

Research Articles