Research Post

Precision-Based Boosting


AdaBoost is a highly popular ensemble classification method for which many variants have been published. This paper proposes a generic refinement of all of these AdaBoost variants. Instead of assigning weights based on the total error of the base classifiers (as in AdaBoost), our method uses class-specific error rates. On instance x, it assigns a higher weight to a classifier predicting label y on x if that classifier is less likely to make a mistake when it predicts class y. Like AdaBoost, our method is guaranteed to boost weak learners into strong learners. An empirical study on AdaBoost and one of its multi-class versions, SAMME, demonstrates the superiority of our method on datasets with more than 1,000 instances as well as on datasets with more than three classes.
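The core idea can be illustrated with a small sketch. This is not the paper's implementation: it trains decision stumps with standard AdaBoost reweighting, but at prediction time weights each stump's vote by its precision on the class it predicts (a class-specific error rate) rather than by a single weight derived from total error. All names and the log-odds vote weighting are our illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Toy binary dataset (illustrative only).
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

n_rounds = 20
w = np.full(len(y), 1.0 / len(y))   # instance weights, as in AdaBoost
stumps, precisions = [], []

for _ in range(n_rounds):
    stump = DecisionTreeClassifier(max_depth=1, random_state=0)
    stump.fit(X, y, sample_weight=w)
    pred = stump.predict(X)

    # Standard AdaBoost reweighting of the training instances.
    err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)
    w *= np.exp(alpha * (pred != y))
    w /= w.sum()

    # Class-specific reliability: among instances this stump predicts as
    # class c, the fraction that actually belong to class c (precision).
    prec = {}
    for c in (0, 1):
        mask = pred == c
        prec[c] = (y[mask] == c).mean() if mask.any() else 0.5
    stumps.append(stump)
    precisions.append(prec)

def predict(X_new):
    """Vote with per-class weights: a stump predicting class c contributes
    a vote scaled by the log-odds of its precision for class c."""
    votes = np.zeros((len(X_new), 2))
    for stump, prec in zip(stumps, precisions):
        p = stump.predict(X_new)
        for i, c in enumerate(p):
            pc = np.clip(prec[c], 1e-10, 1 - 1e-10)
            votes[i, c] += np.log(pc / (1 - pc))
    return votes.argmax(axis=1)

acc = (predict(X) == y).mean()
```

The key difference from plain AdaBoost is that a classifier's influence on instance x depends on which label it predicts there: a stump that is very precise on class 1 but sloppy on class 0 gets a strong vote only when it says "1".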
