Retrieval-Augmented Transformer-XL for Close-Domain Dialog Generation
Main Authors:
Format: Journal Article
Language: English
Published: 19-05-2021
Summary: Transformer-based models have demonstrated excellent capabilities in capturing patterns and structures in natural language generation and have achieved state-of-the-art results in many tasks. In this paper we present a transformer-based model for multi-turn dialog response generation. Our solution is based on a hybrid approach that augments a transformer-based generative model with a novel retrieval mechanism, which leverages the information memorized in the training data via k-Nearest Neighbor search. Our system is evaluated on two datasets of customer/assistant dialogs: Taskmaster-1, released by Google and containing high-quality, goal-oriented conversational data, and a proprietary dataset collected from a real customer service call center. On both, our model achieves better BLEU scores than strong baselines.
DOI: 10.48550/arxiv.2105.09235
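The summary above describes a retrieval mechanism that mixes the generator's predictions with information memorized in the training data via k-Nearest Neighbor search. Below is a minimal, hypothetical sketch of that idea in the style of a kNN-LM interpolation; the datastore layout, distance metric, function name, and interpolation weight `lmbda` are illustrative assumptions, not the paper's exact design.

```python
# Hypothetical sketch: interpolate a transformer generator's next-token
# distribution with a kNN distribution built from cached training contexts.
import torch
import torch.nn.functional as F

def knn_augmented_next_token_probs(hidden, logits, keys, values, vocab_size,
                                    k=8, lmbda=0.25, temperature=1.0):
    """Mix the generator's distribution with a kNN distribution.

    hidden:  (d,) representation of the current dialog context
    logits:  (V,) next-token logits from the transformer generator
    keys:    (N, d) cached context representations from the training data
    values:  (N,) token observed after each cached context (LongTensor)
    """
    # Distances from the current context to every cached training context.
    dists = torch.cdist(hidden.unsqueeze(0), keys).squeeze(0)   # (N,)
    knn_dists, knn_idx = dists.topk(k, largest=False)           # k nearest

    # Turn negative distances into weights over the retrieved tokens,
    # then accumulate them onto the full vocabulary.
    knn_weights = F.softmax(-knn_dists / temperature, dim=0)    # (k,)
    p_knn = torch.zeros(vocab_size)
    p_knn.scatter_add_(0, values[knn_idx], knn_weights)

    # Final distribution: mixture of parametric model and retrieved memory.
    p_model = F.softmax(logits, dim=0)
    return lmbda * p_knn + (1 - lmbda) * p_model

if __name__ == "__main__":
    torch.manual_seed(0)
    V, d, N = 100, 16, 500
    probs = knn_augmented_next_token_probs(
        hidden=torch.randn(d),
        logits=torch.randn(V),
        keys=torch.randn(N, d),
        values=torch.randint(0, V, (N,)),
        vocab_size=V,
    )
    print(float(probs.sum()))  # ~1.0, a valid probability distribution
```

In such a sketch the datastore (`keys`, `values`) would be built offline by running the trained generator over the training dialogs and caching each context representation together with the token that followed it.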