Backprop as Functor: A compositional perspective on supervised learning
Published in: 2019 34th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS), pp. 1-13
Main Authors: Brendan Fong, David Spivak, Rémy Tuyéras
Format: Conference Proceeding
Language: English
Published: IEEE, 01-06-2019
Summary: A supervised learning algorithm searches over a set of functions $A \rightarrow B$ parametrised by a space $P$ to find the best approximation to some ideal function $f : A \rightarrow B$. It does this by taking examples $(a, f(a)) \in A \times B$, and updating the parameter according to some rule. We define a category where these update rules may be composed, and show that gradient descent, with respect to a fixed step size and an error function satisfying a certain property, defines a monoidal functor from a category of parametrised functions to this category of update rules. A key contribution is the notion of request function. This provides a structural perspective on backpropagation, giving a broad generalisation of neural networks and linking it with structures from bidirectional programming and open games.
DOI: 10.1109/LICS.2019.8785665
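
As an informal companion to the summary above, the sketch below illustrates the compositional reading in Python. A learner is taken to consist of a parameter value, an implementation, an update rule, and a request function, and two learners compose by letting the downstream learner's request act as the training target for the upstream one. The names (`Learner`, `gradient_descent_learner`, `compose`) are invented for this illustration, the models are one-parameter linear maps, and the update and request use a fixed step size and squared error as the summary assumes; this is a minimal sketch, not the paper's categorical construction.

```python
from dataclasses import dataclass
from typing import Callable

# A learner A -> B in the style sketched in the summary: a parameter value,
# an implementation I(p, a) -> b, an update rule U(p, a, b) -> p', and a
# request function r(p, a, b) -> a'. (Class and field names are invented
# for this illustration.)
@dataclass
class Learner:
    implement: Callable
    update: Callable
    request: Callable


def gradient_descent_learner(step: float) -> Learner:
    """A toy one-parameter linear model b = p * a, trained by gradient
    descent on squared error with a fixed step size (as in the summary).
    The request function returns a - dE/da (the quadratic-error case)."""

    def implement(p, a):
        return p * a

    def update(p, a, b):
        # dE/dp for E = (1/2) * (p*a - b)^2
        return p - step * (implement(p, a) - b) * a

    def request(p, a, b):
        # a - dE/da: the "requested" input passed back to an upstream learner
        return a - (implement(p, a) - b) * p

    return Learner(implement, update, request)


def compose(g: Learner, f: Learner) -> Learner:
    """Compose f : A -> B with g : B -> C into a learner A -> C.
    Parameters are pairs (q, p); g's request function supplies the training
    target that f's update and request see, i.e. the backpropagated signal."""

    def implement(qp, a):
        q, p = qp
        return g.implement(q, f.implement(p, a))

    def update(qp, a, c):
        q, p = qp
        b = f.implement(p, a)
        return (g.update(q, b, c), f.update(p, a, g.request(q, b, c)))

    def request(qp, a, c):
        q, p = qp
        b = f.implement(p, a)
        return f.request(p, a, g.request(q, b, c))

    return Learner(implement, update, request)


if __name__ == "__main__":
    # Train the composite of two scalar learners on examples (a, 6 * a);
    # the product of the two learned parameters should approach 6.
    f = gradient_descent_learner(step=0.01)
    g = gradient_descent_learner(step=0.01)
    h = compose(g, f)
    params = (1.0, 1.0)
    for a in [1.0, 2.0, 3.0] * 200:
        params = h.update(params, a, 6.0 * a)
    q, p = params
    print(f"q * p = {q * p:.3f}")
```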