ADABOOST (adaptive boosting), introduced by Freund and Schapire in the mid-1990s, is a method for
improving the accuracy of machine learning algorithms.
It can substantially improve the performance of
classification methods (e.g., decision trees).
It works by repeatedly applying the base procedure to the data,
analyzing the results, and then reweighting the observations so that
misclassified instances receive more weight in the next round.
The final classifier combines all of the intermediate classifiers,
categorizing an observation by a weighted majority vote of their
individual predictions.
It also has the intriguing property of continuing to lower the
generalization error (i.e., the error on a test set) long after the training
set error has stopped dropping or reached zero.
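The loop described above can be sketched in plain Python. This is a minimal illustration, not a production implementation: it assumes one-dimensional data, uses decision stumps as the base classifiers, and the function and variable names (`train_adaboost`, `stump_predict`, etc.) are invented for this example.

```python
import math

def stump_predict(threshold, polarity, x):
    # Decision stump: label +1/-1 depending on which side of the threshold x falls.
    return polarity if x >= threshold else -polarity

def train_adaboost(X, y, n_rounds=3):
    n = len(X)
    w = [1.0 / n] * n          # start with uniform observation weights
    ensemble = []
    for _ in range(n_rounds):
        # Apply the base procedure: pick the stump with lowest weighted error.
        best = None
        for t in set(X):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(t, pol, xi) != yi)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-10)                      # avoid log(0) below
        alpha = 0.5 * math.log((1 - err) / err)    # vote weight of this round's stump
        # Reweight: misclassified observations get more weight next round.
        w = [wi * math.exp(-alpha * yi * stump_predict(t, pol, xi))
             for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]                   # renormalize to a distribution
        ensemble.append((alpha, t, pol))
    return ensemble

def adaboost_predict(ensemble, x):
    # Final classifier: weighted majority vote of all intermediate stumps.
    score = sum(alpha * stump_predict(t, pol, x) for alpha, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

For example, training on a small labeled sample and then calling `adaboost_predict` on each point reproduces the labels once the weighted votes of the stumps are combined.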
See Also:
arcing, Bootstrap AGGregation (bagging)