Fine-tuning with a contrastive version of the "wake-sleep" algorithm
After learning many layers of features, we can fine-tune the features to improve generation (a code sketch of one fine-tuning step follows the steps below).
1. Do a stochastic bottom-up pass
   -- Adjust the top-down weights to be good at reconstructing the feature activities in the layer below.
2. Do a few iterations of sampling in the top-level RBM
   -- Adjust the weights in the top-level RBM.
3. Do a stochastic top-down pass
   -- Adjust the bottom-up weights to be good at reconstructing the feature activities in the layer above.
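A minimal NumPy sketch of one such fine-tuning step, under simplifying assumptions: a small stack of stochastic binary (sigmoid) layers with separate recognition (bottom-up) and generative (top-down) weights, a top-level RBM updated with a few steps of Gibbs sampling, biases omitted, and a single training case per update. The names `rec_w`, `gen_w`, `rbm_w`, `sigmoid`, `sample`, and `fine_tune_step`, and the learning rate and layer sizes, are illustrative rather than taken from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    # Stochastic binary states from Bernoulli probabilities
    return (rng.random(p.shape) < p).astype(float)

def fine_tune_step(v, rec_w, gen_w, rbm_w, lr=0.01, cd_steps=3):
    # v: one data vector as a (1, n_visible) row; biases omitted for brevity.

    # 1. Stochastic bottom-up pass through the recognition weights
    states = [v]
    for w in rec_w:
        states.append(sample(sigmoid(states[-1] @ w)))

    # Adjust top-down weights to reconstruct the layer below (delta rule)
    for i, w in enumerate(gen_w):
        below, above = states[i], states[i + 1]
        w += lr * above.T @ (below - sigmoid(above @ w))

    # 2. A few iterations of sampling in the top-level RBM (CD-style update)
    pen = states[-1]                        # penultimate layer = RBM "visible" units
    top = sample(sigmoid(pen @ rbm_w))
    pos = pen.T @ top                       # positive statistics
    v_neg, h_neg = pen, top
    for _ in range(cd_steps):
        v_neg = sample(sigmoid(h_neg @ rbm_w.T))
        h_neg = sample(sigmoid(v_neg @ rbm_w))
    rbm_w += lr * (pos - v_neg.T @ h_neg)   # adjust the top-level RBM weights

    # 3. Stochastic top-down pass starting from the RBM's visible sample
    down = [v_neg]
    for w in reversed(gen_w):
        down.append(sample(sigmoid(down[-1] @ w)))
    down = down[::-1]                       # reorder bottom-to-top

    # Adjust bottom-up weights to reconstruct the layer above (delta rule)
    for i, w in enumerate(rec_w):
        below, above = down[i], down[i + 1]
        w += lr * below.T @ (above - sigmoid(below @ w))


# Toy usage: a 6-4-3 stack of feature layers with a 3x5 top-level RBM
sizes = [6, 4, 3]
rec_w = [rng.normal(0, 0.1, (a, b)) for a, b in zip(sizes, sizes[1:])]
gen_w = [rng.normal(0, 0.1, (b, a)) for a, b in zip(sizes, sizes[1:])]
rbm_w = rng.normal(0, 0.1, (sizes[-1], 5))
v = sample(np.full((1, sizes[0]), 0.5))
fine_tune_step(v, rec_w, gen_w, rbm_w)
```

Note that the sketch keeps the recognition and generative weights separate: the wake phase updates only the generative (top-down) weights and the sleep phase updates only the recognition (bottom-up) weights, which is what allows the two directions to become untied during fine-tuning.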