Attentive Recurrent Comparators
Main Authors:
Format: Journal Article
Language: English
Published: 02-03-2017
Summary: Rapid learning requires flexible representations to quickly adapt to new
evidence. We develop a novel class of models called Attentive Recurrent
Comparators (ARCs) that form representations of objects by cycling through them
and making observations. Using the representations extracted by ARCs, we
develop a way of approximating a \textit{dynamic representation space} and use
it for one-shot learning. On the task of one-shot classification on the
Omniglot dataset, we achieve state-of-the-art performance with an error
rate of 1.5\%. This represents the first super-human result achieved for this
task with a generic model that uses only pixel information.

DOI: 10.48550/arxiv.1703.00767
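
As a rough illustration of the architecture described in the summary, the sketch below shows one plausible reading of an ARC; it is not the authors' code. A recurrent controller alternates attention glimpses between the two images being compared, and its final hidden state scores their similarity. The class name, layer sizes, number of glimpses, and the simple affine-crop attention are all assumptions made here for illustration.

```python
# Minimal ARC-style comparator sketch (assumed structure, not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveRecurrentComparator(nn.Module):
    def __init__(self, glimpse_size=8, hidden_size=128, num_glimpses=8):
        super().__init__()
        self.glimpse_size = glimpse_size
        self.num_glimpses = num_glimpses
        # Recurrent controller that integrates a sequence of glimpses.
        self.controller = nn.LSTMCell(glimpse_size * glimpse_size, hidden_size)
        # Predicts attention parameters (centre x, centre y, zoom) from the state.
        self.attend = nn.Linear(hidden_size, 3)
        # Scores how likely the two inputs are of the same class.
        self.classify = nn.Linear(hidden_size, 1)

    def glimpse(self, image, params):
        # Differentiable crop via affine grid sampling; a stand-in for whatever
        # attention window the actual model uses.
        b = image.size(0)
        centre = torch.tanh(params[:, :2])        # glimpse centre in [-1, 1]
        zoom = torch.sigmoid(params[:, 2])        # glimpse scale in (0, 1)
        theta = torch.zeros(b, 2, 3, device=image.device)
        theta[:, 0, 0] = zoom
        theta[:, 1, 1] = zoom
        theta[:, :, 2] = centre
        grid = F.affine_grid(theta, (b, 1, self.glimpse_size, self.glimpse_size),
                             align_corners=False)
        return F.grid_sample(image, grid, align_corners=False).flatten(1)

    def forward(self, img_a, img_b):
        b = img_a.size(0)
        h = img_a.new_zeros(b, self.controller.hidden_size)
        c = torch.zeros_like(h)
        for t in range(self.num_glimpses):
            # Alternate observations between the two images ("cycling through them").
            image = img_a if t % 2 == 0 else img_b
            h, c = self.controller(self.glimpse(image, self.attend(h)), (h, c))
        return self.classify(h)   # similarity logit for the pair

# Usage with random 32x32 grayscale image pairs.
model = AttentiveRecurrentComparator()
a, b = torch.rand(4, 1, 32, 32), torch.rand(4, 1, 32, 32)
print(model(a, b).shape)  # torch.Size([4, 1])
```

For one-shot classification, a comparator along these lines could score a test image against each labelled support image and predict the class of the highest-scoring pair; the summary's \textit{dynamic representation space} approximation is not reproduced in this sketch.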