On Network Science and Mutual Information for Explaining Deep Neural Networks

Bibliographic Details
Published in: ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 8399-8403
Main Authors: Davis, Brian, Bhatt, Umang, Bhardwaj, Kartikeya, Marculescu, Radu, Moura, Jose M.F.
Format: Conference Proceeding
Language: English
Published: IEEE, 01-05-2020
Description
Summary: In this paper, we present a new approach to interpreting deep learning models. By coupling mutual information with network science, we explore how information flows through feedforward networks. We show that efficiently approximating mutual information allows us to create an information measure that quantifies how much information flows between any two neurons of a deep learning model. To that end, we propose NIF, Neural Information Flow, a technique for codifying information flow that exposes deep learning model internals and provides feature attributions.
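The summary rests on estimating mutual information between pairs of neurons from their activations over a batch of inputs. As a minimal illustration (not the authors' NIF estimator), a plug-in histogram estimate of I(X; Y) between two activation vectors can be sketched as follows; the function name, bin count, and toy "neurons" are all assumptions for demonstration:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based plug-in estimate of I(X; Y) in nats.

    Bin the two activation vectors, form the empirical joint
    distribution, then sum p(x,y) * log(p(x,y) / (p(x) p(y))).
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                  # empirical joint p(x, y)
    px = pxy.sum(axis=1, keepdims=True)        # marginal p(x), shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)        # marginal p(y), shape (1, bins)
    nz = pxy > 0                               # zero cells contribute nothing
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Toy example: activations of three hypothetical neurons over 5000 inputs.
rng = np.random.default_rng(0)
a = rng.normal(size=5000)
b = np.tanh(a) + 0.1 * rng.normal(size=5000)   # strongly coupled to a
c = rng.normal(size=5000)                      # independent of a

# More information "flows" from a to b than from a to c.
print(mutual_information(a, b) > mutual_information(a, c))
```

A quantity like this, computed for every connected neuron pair, is the kind of edge weight one would then analyze with network-science tools; the plug-in estimator above is biased for small samples, which is why the paper emphasizes efficient approximation.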
ISSN: 2379-190X
DOI: 10.1109/ICASSP40776.2020.9053078