CSC2515 Fall 2008
Introduction to Machine Learning

Lecture 11a
Boosting and Naïve Bayes

A commonsense way to use limited computational resources

Making weak learners stronger

Boosting (AdaBoost)
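
A sketch of the setup, following the standard AdaBoost formulation (the notation here is assumed, not taken from the slides): given training cases $(\mathbf{x}_n, t_n)$ with targets $t_n \in \{-1, +1\}$, we fit a sequence of weak ("base") classifiers $y_1, \dots, y_M$, each trained on a reweighted version of the training set, starting from uniform case weights:

\[
w_n^{(1)} = \frac{1}{N}, \qquad n = 1, \dots, N.
\]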

How to train each classifier
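In the standard formulation, classifier $m$ is trained to minimize the misclassification error weighted by the current case weights (a sketch, with $I(\cdot)$ the indicator function):

\[
J_m = \sum_{n=1}^{N} w_n^{(m)}\, I\big(y_m(\mathbf{x}_n) \neq t_n\big).
\]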

How to weight each training case for classifier m
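One standard version of the reweighting step: compute the weighted error rate $\epsilon_m$ and the quality $\alpha_m$ of classifier $m$, then up-weight the cases it got wrong, so that the next classifier concentrates on them:

\[
\epsilon_m = \frac{\sum_n w_n^{(m)}\, I\big(y_m(\mathbf{x}_n) \neq t_n\big)}{\sum_n w_n^{(m)}},
\qquad
\alpha_m = \ln \frac{1 - \epsilon_m}{\epsilon_m},
\]
\[
w_n^{(m+1)} = w_n^{(m)} \exp\big(\alpha_m\, I\big(y_m(\mathbf{x}_n) \neq t_n\big)\big).
\]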

How to make predictions using a committee of classifiers
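In the standard formulation, the committee predicts with a weighted majority vote, each classifier's vote scaled by its quality $\alpha_m$:

\[
Y_M(\mathbf{x}) = \operatorname{sign}\!\left( \sum_{m=1}^{M} \alpha_m\, y_m(\mathbf{x}) \right).
\]

Putting the three steps together, a minimal Python sketch (decision stumps as the weak learner and the scikit-learn API are assumptions for illustration, not anything specified in the slides):

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, t, M=50):
    """Fit an AdaBoost committee of up to M weak classifiers.

    X: (N, D) array of inputs; t: (N,) array of targets in {-1, +1}.
    Returns the list of fitted stumps and their vote weights alpha.
    """
    N = len(t)
    w = np.full(N, 1.0 / N)                  # uniform initial case weights
    stumps, alphas = [], []
    for m in range(M):
        # Train classifier m to minimize the weighted error J_m.
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, t, sample_weight=w)
        miss = stump.predict(X) != t         # cases classifier m gets wrong
        eps = np.dot(w, miss) / w.sum()      # weighted error rate
        if eps <= 0 or eps >= 0.5:           # weak learner must beat chance
            break
        alpha = np.log((1 - eps) / eps)      # quality (vote weight) of classifier m
        w = w * np.exp(alpha * miss)         # up-weight the mistakes
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, np.array(alphas)

def adaboost_predict(stumps, alphas, X):
    """Committee prediction: sign of the alpha-weighted vote."""
    votes = np.zeros(len(X))
    for stump, alpha in zip(stumps, alphas):
        votes += alpha * stump.predict(X)
    return np.sign(votes)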

An alternative derivation of AdaBoost
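This is the statistical view of boosting due to Friedman, Hastie, and Tibshirani (2000): AdaBoost performs sequential (stagewise) minimization of an exponential loss on an additive model. A sketch, assuming the usual conventions (the factor of $\tfrac{1}{2}$ is one common choice of scaling):

\[
E = \sum_{n=1}^{N} \exp\big(-t_n\, f_m(\mathbf{x}_n)\big),
\qquad
f_m(\mathbf{x}) = \frac{1}{2} \sum_{l=1}^{m} \alpha_l\, y_l(\mathbf{x}).
\]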

Learning classifier m using exponential loss
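Holding the earlier classifiers $y_1, \dots, y_{m-1}$ and their coefficients fixed and optimizing only the newest term, the loss seen when fitting classifier $m$ becomes (a standard step in this derivation):

\[
E = \sum_{n=1}^{N} \exp\big(-t_n f_{m-1}(\mathbf{x}_n) - \tfrac{1}{2} t_n \alpha_m y_m(\mathbf{x}_n)\big)
  = \sum_{n=1}^{N} w_n^{(m)} \exp\big(-\tfrac{1}{2} t_n \alpha_m y_m(\mathbf{x}_n)\big),
\]

where $w_n^{(m)} = \exp\big(-t_n f_{m-1}(\mathbf{x}_n)\big)$ are constants that play the role of the case weights.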

Rewriting the part of the exponential loss that is relevant when fitting classifier m
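Since $t_n\, y_m(\mathbf{x}_n) = \pm 1$, the sum splits into the cases classifier $m$ gets right and the cases it gets wrong:

\[
E = e^{-\alpha_m/2} \sum_{n:\, y_m(\mathbf{x}_n) = t_n} w_n^{(m)}
  \;+\; e^{\alpha_m/2} \sum_{n:\, y_m(\mathbf{x}_n) \neq t_n} w_n^{(m)}.
\]

So minimizing $E$ over $y_m$ is equivalent to minimizing the weighted error $J_m$ from before, and setting $\partial E / \partial \alpha_m = 0$ recovers $\alpha_m = \ln\big((1 - \epsilon_m)/\epsilon_m\big)$, the same quality weight as in the original algorithm.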

An impressive example of boosting