The study illustrates how four feature selection methods ('ReliefF', 'Correlation-based', 'Consistency-based', and 'Wrapper' algorithms) help to improve three aspects of the performance of scoring models: model si...
Siami, M., Gholamian, M.R., and Basiri, J., "An application of locally linear model tree algorithm with combination of feature selection in credit scoring," International Journal of Systems Science, vol. 45, no. 10, pp. 2213-2222, 2014.
In the selection process, the top-m ranked features are selected; however, multiple features can have the same ranking and may therefore be redundant. A redundant feature is one that provides no additional discriminative information to the model for improving classification accuracy....
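As a toy illustration of the redundancy issue with top-m ranking (a sketch, not the paper's actual method; treating tied scores as a redundancy signal is an assumption made here for simplicity):

```python
# Hypothetical sketch: pick up to m top-ranked features, skipping features
# whose relevance score ties with an already selected feature (a crude
# stand-in for redundancy elimination).
def select_top_m(scores, m):
    """scores: dict mapping feature name -> relevance score.
    Returns up to m feature names in descending score order."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    chosen, seen_scores = [], set()
    for name, score in ranked:
        if score in seen_scores:   # tied score: treated as redundant here
            continue
        chosen.append(name)
        seen_scores.add(score)
        if len(chosen) == m:
            break
    return chosen

scores = {"income": 0.9, "age": 0.7, "age_band": 0.7, "zip": 0.1}
print(select_top_m(scores, 3))  # ['income', 'age', 'zip']
```

Note that "age_band" is skipped despite ranking second, because its score ties with "age"; a real redundancy criterion would compare feature content (e.g., correlation), not just scores.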
The most common approach to feature selection in RNA-seq analysis toolboxes such as scanpy [10] and Seurat [11] is to select highly variable features, i.e., those that show excess variability beyond what is expected. This approach assumes that extra variability results from differences in gene expression betwee...
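The core idea behind highly-variable-feature selection can be sketched as ranking genes by their variance across cells and keeping the top-k (a simplified sketch of the general principle only; the actual scanpy/Seurat implementations use dispersion-normalized statistics, not raw variance):

```python
import statistics

def highly_variable(expr, k):
    """expr: dict mapping gene name -> list of expression values across cells.
    Returns the k genes with the highest (population) variance."""
    var = {g: statistics.pvariance(v) for g, v in expr.items()}
    return sorted(var, key=var.get, reverse=True)[:k]

expr = {
    "GeneA": [0, 0, 0, 10],   # bursty expression: high variance
    "GeneB": [5, 5, 5, 5],    # housekeeping-like: zero variance
    "GeneC": [1, 2, 1, 2],    # low variance
}
print(highly_variable(expr, 2))  # ['GeneA', 'GeneC']
```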
In building a predictive credit scoring model, feature selection is an essential pre-processing step that can improve the predictive accuracy and comprehensibility of models. In this study, we select the optimal feature subset based on group feature selection in lieu of the individual feature selectio...
Thus, stakeholder requirements are successfully incorporated into the feature selection process, resulting in a better balance between objectives. These findings extend the research on hybrid metaheuristics for feature selection, as well as on the use of alternative data in credit scoring....
Financial distress prediction (FDP) is a complex task involving both feature selection and model construction. While many studies have addressed these challenges individually, there is a lack of FDP models that integrate feature selection into the overall model building process. To address the issues ...
(c) Feature selection, model training, and evaluation: feature selection from the unbalanced and ADASYN-balanced datasets identified 79 and 35 features, respectively. A Random Forest classifier was used for training and evaluation, with confusion matrices showing classification performance for 20-fold cross-...
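The evaluation step above reports confusion matrices; a minimal sketch of how one is built from fold predictions follows (the classifier itself, e.g. a Random Forest, and the ADASYN balancing are outside this snippet):

```python
# Minimal binary confusion-matrix computation from true and predicted labels.
def confusion_matrix(y_true, y_pred):
    """Returns (tn, fp, fn, tp) for binary labels 0/1."""
    tn = fp = fn = tp = 0
    for t, p in zip(y_true, y_pred):
        if t == 1 and p == 1:
            tp += 1
        elif t == 1 and p == 0:
            fn += 1
        elif t == 0 and p == 1:
            fp += 1
        else:
            tn += 1
    return tn, fp, fn, tp

y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 1, 0]
print(confusion_matrix(y_true, y_pred))  # (2, 1, 1, 2)
```

In a k-fold setting, one such matrix is accumulated per fold (or summed across folds) to summarize classification performance.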
A simple algorithm for feature selection with a filter is a greedy forward selection search that seeks to maximize a feature scoring function J, which is shown in Fig. 1. The search initializes the relevant feature set F to be empty; then, for k iterations, the objective function J is maximized....
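The greedy forward search just described can be sketched as follows. The scoring function J used below (individual relevance minus a pairwise redundancy penalty) is an assumed example for illustration, not the one from the source:

```python
# Greedy forward filter search: start with empty F, and for k iterations add
# the candidate feature that maximizes J(F + [f]).
def greedy_forward_select(features, J, k):
    F = []
    candidates = list(features)
    for _ in range(min(k, len(candidates))):
        best = max(candidates, key=lambda f: J(F + [f]))
        F.append(best)
        candidates.remove(best)
    return F

# Hypothetical scoring function: sum of individual relevance scores, minus a
# penalty for each redundant pair included together.
relevance = {"a": 0.9, "b": 0.8, "c": 0.85}
redundant = {frozenset(("b", "c")): 0.5}  # b and c overlap heavily

def J(F):
    score = sum(relevance[f] for f in F)
    for i in range(len(F)):
        for j in range(i + 1, len(F)):
            score -= redundant.get(frozenset((F[i], F[j])), 0.0)
    return score

print(greedy_forward_select(["a", "b", "c"], J, 2))  # ['a', 'c']
```

Because J is re-evaluated on the growing set F, the search can avoid adding a feature that is individually strong but redundant with one already chosen.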
In this research, we propose a credit risk assessment model for MSEs based on federated learning and feature selection. This paper’s main points are: The remainder of the paper is structured as follows: in Section 2, we outline earlier initiatives to improve MSEs’ credit model per...