End-to-end hard constrained text generation via incrementally predicting segments
Published in: Knowledge-Based Systems, Vol. 278, p. 110886
Main Authors: , , , , , ,
Format: Journal Article
Language: English
Published: Elsevier B.V., 25-10-2023
Summary: Hard-constrained text generation is an important task that requires generating fluent sentences to include several specific keywords. It has numerous real-world applications, such as advertisement generation, keyword-based summary generation, and query rewriting. Although previous plug-and-play approaches like enhanced beam search and stochastic search have been proven effective, they lack time efficiency and may reduce the quality of generated sentences. While end-to-end methods based on seq2seq models are superior in speed, they cannot guarantee that outputs satisfy all constraints. In this work, we propose a novel end-to-end method for lexically constrained text generation via incrementally predicting segments (IPS) between every two adjacent lexical constraints using seq2seq models. Our approach guarantees that all constrained keywords will be included in the generated sentence. The experimental results show that our method not only satisfies all lexical constraints but also achieves state-of-the-art performance. Our code and data will be available at https://github.com/blcuicall/IPS.
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2023.110886
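The core idea described in the summary, interleaving predicted text segments with the constraint keywords so that every keyword appears verbatim in the output, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the real IPS method predicts each segment with a trained seq2seq model, whereas `predict_segment` below is a hypothetical stand-in that returns canned connective text.

```python
def predict_segment(prefix, next_keyword):
    """Hypothetical stand-in for a seq2seq call that predicts the text
    segment between the current prefix and the next keyword.
    next_keyword is None when predicting the closing segment."""
    if not prefix:
        return ""          # the sentence may begin at the first keyword
    if next_keyword is None:
        return "today."    # closing segment after the last keyword
    return "and"           # connective segment between two keywords

def ips_generate(keywords):
    """Interleave predicted segments with the constraint keywords.
    Because each keyword is copied into the output verbatim, every
    lexical constraint is satisfied by construction."""
    tokens = []
    for kw in keywords:
        seg = predict_segment(" ".join(tokens), kw)
        if seg:
            tokens.append(seg)
        tokens.append(kw)  # the constraint itself, inserted as-is
    tail = predict_segment(" ".join(tokens), None)
    if tail:
        tokens.append(tail)
    return " ".join(tokens)

sentence = ips_generate(["rain", "umbrella"])
assert all(kw in sentence for kw in ["rain", "umbrella"])
```

The guarantee highlighted in the abstract follows directly from this structure: unlike beam-search decoding, where a keyword may fall out of the beam, the keywords here are part of the output skeleton and only the segments between them are generated.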