SPLBoost: An Improved Robust Boosting Algorithm Based on Self-Paced Learning

Volume: 51, Issue: 3, Pages: 1556 - 1570
Published: Mar 1, 2021
Abstract
It is known that boosting can be interpreted as an optimization technique that minimizes an underlying loss function. Specifically, the loss minimized by traditional AdaBoost is the exponential loss, which has proved to be very sensitive to random noise/outliers. Therefore, several boosting algorithms, e.g., LogitBoost and SavageBoost, have been proposed to improve the robustness of AdaBoost by replacing the exponential loss with...
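The abstract's point about noise sensitivity can be seen numerically: the exponential loss blows up exponentially on badly misclassified points (large negative margin), whereas the logistic loss underlying LogitBoost grows only roughly linearly there. The sketch below (an illustration of these standard loss functions, not code from the paper) compares the two:

```python
import math

def exponential_loss(margin):
    # AdaBoost's underlying loss: exp(-y * f(x)), where margin = y * f(x)
    return math.exp(-margin)

def logistic_loss(margin):
    # LogitBoost's underlying loss: log(1 + exp(-y * f(x)))
    return math.log(1.0 + math.exp(-margin))

# A correctly classified point, a boundary point, and a noisy outlier
for m in [2.0, 0.0, -5.0]:
    print(f"margin={m:+.1f}  exp={exponential_loss(m):9.3f}  "
          f"logit={logistic_loss(m):7.3f}")
```

At margin -5 the exponential loss is about 148.4 while the logistic loss is only about 5.0, so a single mislabeled example can dominate AdaBoost's objective, which is exactly the weakness the robust variants target.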