Gradient Networks

Bibliographic Details
Published in: IEEE Transactions on Signal Processing, pp. 1-16
Main Authors: Chaudhari, Shreyas; Pranav, Srinivasa; Moura, Jose M.F.
Format: Journal Article
Language: English
Published: IEEE, 12-11-2024
Description
Summary: Directly parameterizing and learning gradients of functions has widespread significance, with specific applications in inverse problems, generative modeling, and optimal transport. This paper introduces gradient networks (GradNets): novel neural network architectures that parameterize gradients of various function classes. GradNets exhibit specialized architectural constraints that ensure correspondence to gradient functions. We provide a comprehensive GradNet design framework that includes methods for transforming GradNets into monotone gradient networks (mGradNets), which are guaranteed to represent gradients of convex functions. Our results establish that the proposed GradNets (and mGradNets) universally approximate the gradients of (convex) functions. Furthermore, these networks can be customized to correspond to specific spaces of potential functions, including transformed sums of (convex) ridge functions. Our analysis leads to two distinct GradNet architectures, GradNet-C and GradNet-M, and we describe the corresponding monotone versions, mGradNet-C and mGradNet-M. Our empirical results demonstrate that these architectures provide efficient parameterizations and outperform existing methods by up to 15 dB in gradient field tasks and by up to 11 dB in Hamiltonian dynamics learning tasks.
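To make the "gradients of sums of convex ridge functions" idea concrete, the following is a minimal illustrative sketch (not the paper's actual mGradNet architecture): for f(x) = Σ_i softplus(aᵢᵀx + bᵢ), each ridge is convex, so ∇f(x) = Aᵀ σ(Ax + b) with σ the sigmoid is a monotone gradient field, since its Jacobian Aᵀ diag(σ′) A is positive semidefinite. All names and dimensions here are assumptions for illustration.

```python
import numpy as np

def sigmoid(t):
    # Derivative of softplus(t) = log(1 + e^t); nondecreasing in t.
    return 1.0 / (1.0 + np.exp(-t))

rng = np.random.default_rng(0)
d, k = 3, 16                      # input dimension, number of ridge functions
A = rng.standard_normal((k, d))   # ridge directions a_i (rows of A)
b = rng.standard_normal(k)        # ridge offsets b_i

def mgrad(x):
    """Gradient of the convex potential f(x) = sum_i softplus(a_i^T x + b_i).

    grad f(x) = A^T sigmoid(A x + b). This map is monotone because its
    Jacobian, A^T diag(sigmoid'(A x + b)) A, is positive semidefinite.
    """
    return A.T @ sigmoid(A @ x + b)

# Monotonicity check: (grad f(x) - grad f(y))^T (x - y) >= 0 for all x, y.
x, y = rng.standard_normal(d), rng.standard_normal(d)
assert (mgrad(x) - mgrad(y)) @ (x - y) >= 0.0
```

The key design point mirrored from the abstract is that monotonicity is guaranteed by construction (nonnegatively weighted ridges with a nondecreasing activation), rather than enforced by a penalty during training.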
ISSN: 1053-587X, 1941-0476
DOI: 10.1109/TSP.2024.3496692