Evaluating Explanations of Convolutional Neural Network Image Classifications
Published in: 2020 International Joint Conference on Neural Networks (IJCNN), pp. 1 - 8
Format: Conference Proceeding
Language: English
Published: IEEE, 01-07-2020
Summary: In this paper, we seek to automate the evaluation of explanations of image classification decisions made by complex convolutional neural networks (CNNs). Explanation frameworks such as Local Interpretable Model-agnostic Explanations (LIME) treat complex machine learning models, such as deep neural networks, as black boxes and generate human-interpretable explanations of their decisions using linear proxy models. We propose a pair of experiments to quantitatively evaluate the quality of generated explanations by measuring their sufficiency and salience. To test whether a generated explanation contains sufficient information for classification, we test the ability of a trained CNN to classify that explanation correctly. We test explanations for salience by training two new CNNs, one on raw image data and the other on explanations, and comparing their classification precision and recall on a common set of test data. We use our new evaluation framework to test our hypothesis that LIME is able to generate explanations that are both sufficient and salient. Our results show that the generated explanations have the potential to be sufficient and salient, provided the explanations are complex enough to describe the underlying classes.
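The sufficiency test described in the summary can be sketched roughly as follows. This is a toy illustration, not the paper's method: the `hide_rest` helper, the nearest-centroid classifier (standing in for a trained CNN), and the synthetic two-class 8x8 data are all assumptions introduced here for clarity. The idea carried over from the summary is only this: an explanation mask is "sufficient" if the classifier still assigns the original label when every pixel outside the explanation is hidden.

```python
import numpy as np

def hide_rest(image, mask, hide_value=0.0):
    """Keep only the pixels the explanation marks as relevant;
    hide everything else (as LIME-style masked explanations do)."""
    return np.where(mask, image, hide_value)

def nearest_centroid_predict(image, centroids):
    """Toy stand-in for CNN.predict: label of the closest class centroid."""
    dists = [np.linalg.norm(image - c) for c in centroids]
    return int(np.argmin(dists))

def is_sufficient(image, mask, label, centroids):
    """Sufficiency test: does the classifier still recover the
    original label from the explanation alone?"""
    return nearest_centroid_predict(hide_rest(image, mask), centroids) == label

# Two synthetic 8x8 "classes": bright top half (label 0) vs bright bottom half (label 1).
top = np.zeros((8, 8)); top[:4, :] = 1.0
bottom = np.zeros((8, 8)); bottom[4:, :] = 1.0
centroids = [top, bottom]

sample = top.copy()
mask = np.zeros((8, 8), dtype=bool)
mask[:4, :] = True  # explanation: "the bright top half is what matters"
print(is_sufficient(sample, mask, label=0, centroids=centroids))  # True
```

The salience experiment in the summary is the complementary comparison: train one classifier on raw images and another on such masked explanation images, then compare their precision and recall on a shared test set.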
ISSN: 2161-4407
DOI: 10.1109/IJCNN48605.2020.9207129