Student Performance Prediction with Decision Tree Ensembles and Feature Selection Techniques

https://doi.org/10.1142/S0219649225500169

The prevalence of student dropout in academic settings is a serious issue that affects individuals and society as a whole. Timely intervention and support can be provided to at-risk students if student performance can be predicted accurately. However, class imbalance and data complexity in educational data pose major challenges for traditional predictive analytics. Our research focuses on utilising machine learning techniques to predict student performance while handling imbalanced datasets. To address the class imbalance problem, we employed both oversampling and undersampling techniques in our decision tree ensemble methods for the risk classification of prospective students. The effectiveness of the classifiers was evaluated by varying the ensemble sizes and the oversampling and undersampling ratios. Additionally, we conducted experiments integrating feature selection with the best ensemble classifiers to further enhance prediction. Based on extensive experimentation, we conclude that ensemble methods such as Random Forest, Bagging, and Random Undersampling Boosting (RUSBoost) perform well on measures such as Recall, Precision, F1-score, Area Under the Receiver Operating Characteristic Curve, and Geometric Mean. An F1-score of 0.849 produced by the RUSBoost classifier in conjunction with Least Absolute Shrinkage and Selection Operator (LASSO) feature selection indicates that this combination yields the best results.
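As a rough illustration of the pipeline the abstract describes, and not the authors' actual implementation, the following Python sketch combines LASSO-based feature selection with imbalanced-learn's RUSBoostClassifier and reports the five metrics listed above. The synthetic dataset, the LASSO alpha of 0.01, and the ensemble size of 50 are illustrative assumptions; the paper's dataset and tuned hyperparameters are not given here.

```python
# Illustrative sketch only: LASSO feature selection feeding a RUSBoost
# decision-tree ensemble, evaluated with the metrics named in the abstract.
# The synthetic data, alpha=0.01, and n_estimators=50 are assumptions,
# not the paper's actual dataset or settings.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso
from sklearn.metrics import recall_score, precision_score, f1_score, roc_auc_score
from imblearn.ensemble import RUSBoostClassifier
from imblearn.metrics import geometric_mean_score

# Imbalanced stand-in for an education dataset (minority class = at-risk students).
X, y = make_classification(n_samples=2000, n_features=30, n_informative=10,
                           weights=[0.85, 0.15], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.25, random_state=42)

# LASSO feature selection: keep only features with non-zero coefficients.
selector = SelectFromModel(Lasso(alpha=0.01))
X_train_sel = selector.fit_transform(X_train, y_train)
X_test_sel = selector.transform(X_test)

# RUSBoost: boosting over decision trees, with the majority class randomly
# undersampled before each boosting round to counter the class imbalance.
clf = RUSBoostClassifier(n_estimators=50, random_state=42)
clf.fit(X_train_sel, y_train)

y_pred = clf.predict(X_test_sel)
y_score = clf.predict_proba(X_test_sel)[:, 1]
print("Recall:   ", recall_score(y_test, y_pred))
print("Precision:", precision_score(y_test, y_pred))
print("F1-score: ", f1_score(y_test, y_pred))
print("ROC AUC:  ", roc_auc_score(y_test, y_score))
print("G-Mean:   ", geometric_mean_score(y_test, y_pred))
```

Mirroring the experiments the abstract mentions, one could sweep n_estimators and the RUSBoostClassifier sampling_strategy parameter, or swap in SMOTE-based oversampling from the same library, to study the effect of ensemble size and sampling ratio on these metrics.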