SketchEmbedNet: Learning Novel Concepts by Imitating Drawings
Alexander Wang*, Mengye Ren*, and Richard Zemel
In Proceedings of the 38th International Conference on Machine Learning (ICML), 2021
Sketch drawings capture the salient information of visual concepts. Previous work has shown that neural networks are capable of producing sketches of natural objects drawn from a small number of classes. While earlier approaches focus on generation quality or retrieval, we explore properties of image representations learned by training a model to produce sketches of images. We show that this generative, class-agnostic model produces informative embeddings of images from novel examples, classes, and even novel datasets in a few-shot setting. Additionally, we find that these learned representations exhibit interesting structure and compositionality.