TAN-NTM: Topic Attention Networks for Neural Topic Modeling
Format: Journal Article
Language: English
Published: 02-12-2020
Summary: Topic models have been widely used to learn text representations and gain insight into document corpora. To perform topic discovery, most existing neural models take either a document bag-of-words (BoW) or a sequence of tokens as input, followed by variational inference and BoW reconstruction to learn the topic-word distribution. However, leveraging the topic-word distribution to learn better features during document encoding has not been explored much. To this end, we develop a framework, TAN-NTM, which processes a document as a sequence of tokens through an LSTM whose contextual outputs are attended in a topic-aware manner. We propose a novel attention mechanism which factors in the topic-word distribution to enable the model to attend to relevant words that convey topic-related cues. The output of the topic attention module is then used to carry out variational inference. We perform extensive ablations and experiments, obtaining a ~9-15 percent improvement in NPMI coherence over the scores of existing SOTA topic models on several benchmark datasets: 20Newsgroups, Yelp Review Polarity and AGNews. Further, we show that our method learns better latent document-topic features than existing topic models, as demonstrated by improvements on two downstream tasks: document classification and topic-guided keyphrase generation.
DOI: 10.48550/arxiv.2012.01524
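
The topic-aware attention described in the summary can be pictured with a minimal PyTorch-style sketch. This is a hypothetical reading, not the authors' released implementation: the `TopicAttention` class, the projection layer, the mean-pooled topic query, and the additive scoring function are all assumptions; only the core idea of scoring LSTM outputs against the topic-word distribution comes from the abstract.

```python
# Hypothetical sketch of topic-aware attention over LSTM outputs.
# Only the high-level idea (scoring tokens against the topic-word
# distribution before variational inference) comes from the abstract;
# every layer and name below is an illustrative assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopicAttention(nn.Module):
    def __init__(self, hidden_dim: int, vocab_size: int, num_topics: int):
        super().__init__()
        # Topic-word distribution (num_topics x vocab_size); in a full NTM
        # this would be tied to the decoder that reconstructs the BoW.
        self.topic_word = nn.Parameter(torch.randn(num_topics, vocab_size))
        # Project topics into the LSTM hidden space (assumed design choice).
        self.proj = nn.Linear(vocab_size, hidden_dim)
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, lstm_outputs: torch.Tensor) -> torch.Tensor:
        # lstm_outputs: (batch, seq_len, hidden_dim) contextual token states.
        topics = torch.softmax(self.topic_word, dim=-1)       # (K, vocab)
        topic_query = self.proj(topics).mean(dim=0)           # (hidden,)
        # Additive attention: score each token against the topic query.
        scores = self.score(torch.tanh(lstm_outputs + topic_query))  # (B, T, 1)
        weights = F.softmax(scores, dim=1)                    # attention over tokens
        # Weighted sum of token states -> document vector fed to the
        # variational encoder that infers the document-topic posterior.
        return (weights * lstm_outputs).sum(dim=1)            # (batch, hidden)
```

In a complete model, the returned document vector would parameterize the approximate posterior over document-topic proportions, trained with the usual KL term and BoW reconstruction loss; those pieces are omitted here.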