Learning by dividing and conquering
Re-weighting the data: In boosting, we learn a
sequence of simple models. After learning each model,
we re-weight the data so that the next model learns to
deal with the cases that the previous models found
difficult.
There is a nice guarantee that the overall model
keeps getting better on the training data.
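
A minimal AdaBoost-style sketch of this idea (an illustrative choice of boosting algorithm, not something the slide specifies), assuming NumPy, scikit-learn decision stumps as the simple models, and labels given as a NumPy array of -1/+1:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost sketch; y is a NumPy array with values in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                        # start with uniform data weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)           # simple model sees current weights
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)  # weighted training error
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)      # how much to trust this model
        w *= np.exp(-alpha * y * pred)             # up-weight the cases it got wrong
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict(stumps, alphas, X):
    """Combine the simple models by a weighted vote."""
    votes = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(votes)
```

The line that rescales w is the re-weighting step: each new stump is trained to concentrate on the cases the previous stumps found difficult.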
Projecting the data: In PCA, we find the leading
eigenvector, project the data into the orthogonal
subspace, and then repeat in that subspace to find
the next component.
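
A minimal sketch of this sequential (deflation) view of PCA, assuming NumPy and a data matrix X with one row per case; computing each leading eigenvector with np.linalg.eigh is an illustrative choice:

```python
import numpy as np

def pca_by_deflation(X, n_components):
    """Find principal components one at a time by repeated deflation."""
    X = X - X.mean(axis=0)                  # centre the data
    components = []
    for _ in range(n_components):
        C = X.T @ X / (len(X) - 1)          # covariance of the current (deflated) data
        eigvals, eigvecs = np.linalg.eigh(C)
        v = eigvecs[:, -1]                  # leading eigenvector
        components.append(v)
        X = X - np.outer(X @ v, v)          # project data into the orthogonal subspace
    return np.array(components)
```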
Distorting the data: In projection pursuit, we find a non-
Gaussian direction and then distort the data so that it is
Gaussian along this direction; the search is then repeated
on the distorted data.
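
A minimal sketch of one such step, assuming NumPy and SciPy; using excess kurtosis as the non-Gaussianity measure, a random search over unit directions, and a rank-based Gaussianizing transform are illustrative choices, not anything specified here:

```python
import numpy as np
from scipy.stats import norm

def most_non_gaussian_direction(X, n_candidates=200, seed=None):
    """Crude search: among random unit directions, pick the one whose
    projection deviates most from Gaussian kurtosis."""
    rng = np.random.default_rng(seed)
    best_v, best_score = None, -np.inf
    for _ in range(n_candidates):
        v = rng.normal(size=X.shape[1])
        v /= np.linalg.norm(v)
        z = X @ v
        z = (z - z.mean()) / z.std()
        score = abs(np.mean(z ** 4) - 3.0)   # magnitude of excess kurtosis
        if score > best_score:
            best_v, best_score = v, score
    return best_v

def gaussianize_along(X, v):
    """Distort the data so its marginal along unit direction v becomes
    Gaussian, leaving the orthogonal part unchanged."""
    z = X @ v
    ranks = np.argsort(np.argsort(z))
    u = (ranks + 0.5) / len(z)               # empirical CDF values in (0, 1)
    z_new = norm.ppf(u)                      # map through the inverse Gaussian CDF
    return X + np.outer(z_new - z, v)        # replace the component along v
```

Applying gaussianize_along and then searching again on the distorted data gives the repeated divide-and-conquer loop.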