AdaBoost is a machine learning algorithm that boosts the performance of weak classifiers, models only slightly better than random guessing, by combining them into a single strong classifier. It works by sequentially training multiple models, where each new model focuses on the errors made by the previous ones, improving accuracy by giving more weight to misclassified instances.
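As a quick illustration of the sequential-ensemble idea, here is a minimal sketch using scikit-learn's AdaBoostClassifier with decision stumps as the weak learners. The dataset is synthetic, and the `estimator` parameter name reflects recent scikit-learn releases (older versions call it `base_estimator`), so treat the exact arguments as assumptions about your installed version.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification data, purely for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Depth-1 trees (decision stumps) are the classic weak learner for AdaBoost.
clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),
    n_estimators=100,
    random_state=0,
)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```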
The algorithm assigns a weight to each training sample and adjusts these weights after every iteration: samples that are misclassified receive higher weights, prompting the next model to pay more attention to them. Ultimately, AdaBoost combines the predictions of all models through a weighted vote, where each model's say is proportional to how accurate it was, producing a final prediction that often outperforms any individual classifier.
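To make the sample-weight update and the weighted vote concrete, here is a from-scratch sketch of discrete (binary) AdaBoost. The function names and structure are illustrative rather than taken from any library, and it assumes labels encoded as -1/+1 with scikit-learn decision stumps as weak learners.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Train discrete AdaBoost with decision stumps; y must be in {-1, +1}."""
    y = np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)                    # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)       # weak learner sees current weights
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1 - 1e-10)   # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # more accurate models get a larger say
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified samples
        w /= w.sum()                           # renormalize to a distribution
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Weighted vote of all weak learners: sign of the weighted sum."""
    scores = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    return np.sign(scores)
```

The two lines that update `w` are the heart of the method: misclassified samples (where `y * pred` is negative) have their weights multiplied by a factor greater than one, while correctly classified samples are down-weighted, and the final prediction simply sums each stump's vote scaled by its `alpha`.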