AdaBoost
AdaBoost (Adaptive Boosting) is an ensemble learning method that combines multiple weak classifiers into a single strong classifier. At each boosting round it increases the weights of misclassified training instances so that subsequent weak learners focus on the difficult cases, and it weights each weak learner's vote by its accuracy in the final combined prediction. This iterative re-weighting makes AdaBoost effective on complex classification problems.
| Metric        | Value  |
|---------------|--------|
| Precision     | 84.47% |
| Accuracy      | 91.07% |
| Recall        | 88.81% |
| F1 Score      | 86.59% |
| ROC AUC Score | 90.48% |
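
As a minimal sketch, assuming scikit-learn is used: the snippet below shows how an AdaBoost classifier can be trained and how the metrics above can be computed. The synthetic dataset and hyperparameters are illustrative placeholders, not the ones used to produce the table.

```python
# Illustrative sketch: train AdaBoost and compute the reported metrics.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)

# Placeholder synthetic data standing in for the project's real dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Each boosting round fits a shallow decision tree (the default base learner),
# re-weighting misclassified samples so later rounds focus on them.
clf = AdaBoostClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
y_score = clf.predict_proba(X_test)[:, 1]  # positive-class probability for ROC AUC

print(f"Precision:     {precision_score(y_test, y_pred):.2%}")
print(f"Accuracy:      {accuracy_score(y_test, y_pred):.2%}")
print(f"Recall:        {recall_score(y_test, y_pred):.2%}")
print(f"F1 Score:      {f1_score(y_test, y_pred):.2%}")
print(f"ROC AUC Score: {roc_auc_score(y_test, y_score):.2%}")
```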