The Impossibility of Parallelizing Boosting

Amin Karbasi, Kasper Green Larsen

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research

Abstract

The aim of boosting is to convert a sequence of weak learners into a strong learner. At their heart, these methods are fully sequential. In this paper, we investigate the possibility of parallelizing boosting. Our main contribution is a strong negative result, implying that significant parallelization of boosting requires an exponential blow-up in the total computing resources needed for training.
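To make the "fully sequential" point concrete, here is a minimal AdaBoost-style sketch in Python (an assumed illustration using scikit-learn decision stumps, not the construction or lower-bound argument from the paper): the sample weights used in round t+1 are a function of the hypothesis produced in round t, which is exactly the dependence chain that resists parallelization.

```python
# Minimal AdaBoost sketch (hypothetical example, not the paper's method).
# The reweighting step makes each round depend on the previous one.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, T=50):
    """Boost depth-1 stumps for T rounds; y must take values in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)  # uniform initial distribution over examples
    hypotheses, alphas = [], []
    for _ in range(T):
        # Weak learner trained under the CURRENT distribution w.
        h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = h.predict(X)
        eps = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)  # weighted error
        alpha = 0.5 * np.log((1 - eps) / eps)
        # Sequential bottleneck: w for the next round depends on this round's h.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        hypotheses.append(h)
        alphas.append(alpha)
    return hypotheses, alphas

def predict(hypotheses, alphas, X):
    """Strong learner: sign of the weighted vote over all weak hypotheses."""
    votes = sum(a * h.predict(X) for h, a in zip(hypotheses, alphas))
    return np.sign(votes)
```

Under this reading, the paper's negative result says that any attempt to break this round-by-round dependence into few parallel rounds must pay for it with exponentially more total training work.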

Original language: English
Title of host publication: Proceedings of Machine Learning Research
Number of pages: 19
Volume: 237
Publication date: Feb 2024
Pages: 635-653
Publication status: Published - Feb 2024
Event: 35th International Conference on Algorithmic Learning Theory, ALT 2024 - La Jolla, United States
Duration: 25 Feb 2024 – 28 Feb 2024

Conference

Conference: 35th International Conference on Algorithmic Learning Theory, ALT 2024
Country/Territory: United States
City: La Jolla
Period: 25/02/2024 – 28/02/2024
Series: Proceedings of Machine Learning Research
ISSN: 2640-3498

Keywords

  • Boosting
  • Generalization
  • Parallelization
  • Weak-to-Strong Learning
