International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 187 - Issue 36
Published: September 2025
Authors: Amit Kumar Saxena, Damodar Patel, Umesh Kumar Shriwas, Abhishek Dubey, Gayatri Sahu, Shreya Chinde
Amit Kumar Saxena, Damodar Patel, Umesh Kumar Shriwas, Abhishek Dubey, Gayatri Sahu, Shreya Chinde. A Bio-Inspired Earthworm Optimization Algorithm Combined with PCA for Improved Feature Selection in Machine Learning Models. International Journal of Computer Applications. 187, 36 (September 2025), 43-54. DOI=10.5120/ijca2025925620
@article{ 10.5120/ijca2025925620, author = { Amit Kumar Saxena and Damodar Patel and Umesh Kumar Shriwas and Abhishek Dubey and Gayatri Sahu and Shreya Chinde }, title = { A Bio-Inspired Earthworm Optimization Algorithm Combined with PCA for Improved Feature Selection in Machine Learning Models }, journal = { International Journal of Computer Applications }, year = { 2025 }, volume = { 187 }, number = { 36 }, pages = { 43-54 }, doi = { 10.5120/ijca2025925620 }, publisher = { Foundation of Computer Science (FCS), NY, USA } }
%0 Journal Article %D 2025 %A Amit Kumar Saxena %A Damodar Patel %A Umesh Kumar Shriwas %A Abhishek Dubey %A Gayatri Sahu %A Shreya Chinde %T A Bio-Inspired Earthworm Optimization Algorithm Combined with PCA for Improved Feature Selection in Machine Learning Models %J International Journal of Computer Applications %V 187 %N 36 %P 43-54 %R 10.5120/ijca2025925620 %I Foundation of Computer Science (FCS), NY, USA
High-dimensional data often leads to increased computational complexity and reduced model performance due to the curse of dimensionality. This study introduces an effective feature selection and classification framework that integrates the Earthworm Optimization Algorithm (EWA), Principal Component Analysis (PCA), and two supervised classifiers, K Nearest Neighbors (KNN) and Support Vector Machine (SVM). EWA, a bio-inspired metaheuristic based on the foraging behavior of earthworms, efficiently identifies optimal feature subsets. PCA is then applied to further reduce dimensionality while preserving essential variance. The proposed EWA-PCA framework was evaluated on 19 benchmark datasets using stratified 10-fold cross-validation and standard classification metrics. Averaged over the 19 datasets, KNN achieved 77.65% accuracy on the original feature sets versus 86.56% with EWA-PCA; similarly, SVM achieved 84.43% accuracy on the original features versus 88.10% with EWA-PCA. Results show that EWA-PCA consistently outperforms conventional and modern feature selection techniques, including Chi2, ReliefF, SIFS, mRMR, ATFS, and EmPo. EWA-PCA achieved better classification accuracies with both KNN and SVM, demonstrating high stability and substantial feature reduction. The findings validate EWA-PCA as a scalable, accurate, and efficient solution for high-dimensional data classification.
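The evaluation protocol described above can be sketched in scikit-learn. This is a minimal illustration, not the authors' implementation: the EWA feature-selection step is a bio-inspired metaheuristic with no off-the-shelf library version, so the sketch assumes a feature subset has already been selected and shows only the subsequent PCA-plus-classifier stages, scored with stratified 10-fold cross-validation on a stand-in benchmark dataset. The number of principal components and classifier hyperparameters here are illustrative choices, not values from the paper.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in dataset; in the paper, 19 benchmark datasets are used,
# each first reduced to an EWA-selected feature subset.
X, y = load_breast_cancer(return_X_y=True)

# Stratified 10-fold cross-validation, as in the evaluation protocol.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

results = {}
for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf"))]:
    # Scale features, project onto leading principal components, classify.
    pipe = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    scores = cross_val_score(pipe, X, y, cv=cv, scoring="accuracy")
    results[name] = scores.mean()
    print(f"{name}: mean 10-fold accuracy = {scores.mean():.3f}")
```

Reported accuracies would then be averaged across all datasets, which is how the paper's 86.56% (KNN) and 88.10% (SVM) figures arise.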