AdaBoost Learning Rate in MATLAB Documentation
Dario Walter on 13 Aug 2020
Commented: Dario Walter on 23 Aug 2020
Hey,
the description of AdaBoost allows you to set a learning rate. However, a learning rate is typically associated with gradient boosting. Could anyone explain what MATLAB is doing when AdaBoostM1 is applied?
Thanks for your help!
Accepted Answer
Raunak Gupta on 15 Aug 2020
Hi,
The LearnRate option in AdaBoostM1 is a shrinkage parameter: it scales down the contribution of each new base model added to the ensemble, so it controls how much each new learner changes the existing model. A small LearnRate typically needs more boosting iterations to train but often produces a more accurate ensemble. Note that AdaBoostM1 is used for binary classification problems only.
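For example, here is a minimal sketch of setting LearnRate when training an AdaBoostM1 ensemble with fitcensemble (the ionosphere data and the hyperparameter values are illustrative choices only, not recommendations):

load ionosphere                      % X: predictors, Y: binary labels 'b'/'g'
t = templateTree('MaxNumSplits', 1); % decision stumps as weak learners
mdl = fitcensemble(X, Y, ...
    'Method', 'AdaBoostM1', ...
    'NumLearningCycles', 200, ...    % more cycles are usually needed when LearnRate is small
    'LearnRate', 0.1, ...            % shrinkage: each new learner contributes only 10%
    'Learners', t);
err = resubLoss(mdl);                % resubstitution error, just to check the fit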
3 Comments
Raunak Gupta on 21 Aug 2020 (edited 21 Aug 2020)
Hi Dario,
The LearnRate parameter is applied when the weight of each weak hypothesis in the ensemble, α_t, is calculated in the Algorithm mentioned here. It is not part of the original AdaBoost algorithm, but it is widely used in practice because it gives a way to tune the contribution of each subsequent weak learner. So whenever α_t is computed for a learner, it is multiplied by LearnRate, which diminishes or enhances that learner's contribution (depending on whether the value is < 1 or > 1).
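Roughly, the shrinkage step amounts to something like the following sketch (the textbook AdaBoost weight formula is used here purely for illustration; the exact expression MATLAB uses is the one in the Algorithms section of the documentation):

learnRate = 0.1;                           % the LearnRate name-value pair
errT   = 0.2;                              % weighted training error of the t-th weak learner (example value)
alphaT = 0.5 * log((1 - errT) / errT);     % textbook AdaBoost weight for that learner
alphaT = learnRate * alphaT;               % shrinkage: LearnRate < 1 damps, > 1 amplifies the contribution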

