A temporary digression
Vapnik and his co-workers developed a very clever type
of perceptron called a Support Vector Machine.
Instead of hand-coding the layer of non-adaptive
features, an SVM uses a fixed recipe to turn each
training example into a new feature.
The feature computes how similar a test example is to that
training example.
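One common choice for that fixed recipe is a radial
basis function kernel, which scores similarity by
distance. A minimal sketch of such a feature (the
function name and the width parameter gamma are my
assumptions, not specified in the text):

    import numpy as np

    def rbf_feature(test_example, training_example, gamma=0.5):
        # Similarity of a test example to one stored training
        # example, via an RBF (Gaussian) kernel. gamma is an
        # assumed width parameter.
        return np.exp(-gamma * np.sum((test_example - training_example) ** 2))

    # One training example defines one feature.
    x_train = np.array([1.0, 2.0])
    x_test = np.array([1.5, 1.8])
    print(rbf_feature(x_test, x_train))  # near 1 when the points are close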
Then a clever optimization technique is used to select
the best subset of the features and to decide how to
weight each feature when classifying a test case.
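The solution of that optimization is sparse: most
features get zero weight, and only the surviving
training examples (the "support vectors") are kept.
A minimal sketch using scikit-learn's SVC (my choice
of library and toy data, not from the text):

    import numpy as np
    from sklearn.svm import SVC

    # Toy two-class data; the optimizer keeps only a subset
    # of these examples (the support vectors) as features.
    X = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.],
                  [3., 3.], [4., 3.], [3., 4.], [4., 4.]])
    y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

    clf = SVC(kernel='rbf', gamma=0.5).fit(X, y)

    print(clf.support_vectors_)  # the selected training examples
    print(clf.dual_coef_)        # the weight on each selected feature

    # Classifying a test case: a weighted sum of its similarities
    # to the support vectors (plus a bias), then take the sign.
    test = np.array([[2.0, 2.0]])
    print(clf.decision_function(test), clf.predict(test))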
But it's just a perceptron, and it has all the same
limitations.
In the 1990s, many researchers abandoned neural
networks with multiple adaptive hidden layers because
Support Vector Machines worked better.