We next evaluated a Neural Network model, drawing on the pattern-recognition capabilities of layered artificial neurons.
A hyperparameter tuning process identified the optimal layer configuration for the network, and SMOTE was employed to address the class imbalance in the dataset.
Optimal Layer Configuration:
{'activation': 'relu', 'first_units': 26, 'num_layers': 4, 'units_0': 26, 'units_1': 21, 'units_2': 21, 'units_3': 11, 'units_4': 11, 'tuner/epochs': 20, 'tuner/initial_epoch': 7, 'tuner/bracket': 1, 'tuner/round': 1, 'tuner/trial_id': '0018'}
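As an illustrative sketch (not the original training code), the configuration above can be parsed into the hidden-layer widths the final model actually used. The dictionary follows KerasTuner conventions: keys prefixed with `tuner/` are search bookkeeping, and only the first `num_layers` of the `units_i` entries are active.

```python
# Hypothetical sketch: derive the hidden-layer sizes from the
# best-hyperparameter dictionary reported by the tuner.
best_hps = {
    'activation': 'relu', 'first_units': 26, 'num_layers': 4,
    'units_0': 26, 'units_1': 21, 'units_2': 21, 'units_3': 11,
    'units_4': 11, 'tuner/epochs': 20, 'tuner/initial_epoch': 7,
    'tuner/bracket': 1, 'tuner/round': 1, 'tuner/trial_id': '0018',
}

def hidden_layer_sizes(hps):
    """Return the widths of the hidden layers actually built.

    Extra `units_i` keys beyond `num_layers` are leftovers from
    deeper trials explored during the search and are ignored.
    """
    return [hps[f'units_{i}'] for i in range(hps['num_layers'])]

print(hidden_layer_sizes(best_hps))  # -> [26, 21, 21, 11]
```

So the tuned network has four hidden layers of 26, 21, 21, and 11 units, each with ReLU activation.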
The model achieved an accuracy score of approximately 86.06%.
The resulting confusion matrix is shown below:
Contrary to initial expectations, the Neural Network model did not outperform the XGBoost model. Factors contributing to this outcome include:
SMOTE (Synthetic Minority Over-sampling Technique) addresses class imbalance by generating synthetic samples for the minority class, balancing the class distribution and improving the minority class's representation in the dataset.
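The core idea can be sketched in a few lines of NumPy. This is a simplified illustration, not the library implementation typically used in practice (e.g. imbalanced-learn): for each synthetic point, pick a random minority sample, choose one of its k nearest minority neighbours, and interpolate a random distance along the segment between the two.

```python
import numpy as np

def smote_oversample(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE sketch: create n_new synthetic minority samples.

    X_min : array of shape (n_minority, n_features)
    n_new : number of synthetic samples to generate
    k     : number of nearest minority neighbours to interpolate with
    """
    rng = np.random.default_rng(rng)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # Euclidean distances from sample i to every minority sample
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # skip the point itself
        j = rng.choice(neighbours)
        gap = rng.random()                    # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synthetic)

# Toy minority class: six 2-D points
X_min = np.array([[1.0, 1.0], [1.2, 0.9], [0.8, 1.1],
                  [1.1, 1.2], [0.9, 0.8], [1.0, 1.3]])
X_new = smote_oversample(X_min, n_new=4, k=3, rng=0)
print(X_new.shape)  # (4, 2)
```

Because each synthetic point is a convex combination of two existing minority samples, the new points always lie within the region already occupied by the minority class.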
SMOTE operates by: