Vector-to-Sequence Models for Sentence Analogies


Bibliographic Details
Published in: 2020 International Conference on Advanced Computer Science and Information Systems (ICACSIS), pp. 441-446
Main Authors: Wang, Liyan, Lepage, Yves
Format: Conference Proceeding
Language: English
Published: IEEE, 17-10-2020
Summary: We solve sentence analogies by generating the solution rather than identifying the best candidate from a given set, as is usually done. We design a decoder that transforms sentence embedding vectors back into sequences of words. To generate the vector representations of answer sentences, we build a linear regression network that learns the mapping between known and expected vectors. We then leverage this pre-trained decoder to decode sentences from the regressed vectors. Experiments on a set of semantico-formal sentence analogies show that our proposed approach outperforms a state-of-the-art baseline, the vector offset method, which solves analogies directly in embedding space.
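For context, the vector offset baseline mentioned in the summary solves an analogy a : b :: c : d by computing d̂ = b − a + c in embedding space and retrieving the candidate sentence whose embedding is closest to d̂. The sketch below is an illustrative toy implementation of that baseline, not the authors' code; the function names and toy vectors are hypothetical.

```python
import math

def vector_offset(a, b, c):
    # Analogy a : b :: c : d  ->  estimate d_hat = b - a + c (element-wise).
    return [bi - ai + ci for ai, bi, ci in zip(a, b, c)]

def cosine(u, v):
    # Cosine similarity between two dense vectors.
    dot = sum(x * y for x, y in zip(u, v))
    norm_u = math.sqrt(sum(x * x for x in u))
    norm_v = math.sqrt(sum(x * x for x in v))
    return dot / (norm_u * norm_v)

def solve_analogy(a, b, c, candidates):
    # Retrieve the candidate sentence whose embedding is closest
    # (by cosine similarity) to the offset estimate d_hat.
    d_hat = vector_offset(a, b, c)
    return max(candidates, key=lambda sent: cosine(d_hat, candidates[sent]))
```

The paper's contribution, by contrast, skips the candidate set entirely: it regresses the answer vector and decodes it into a word sequence directly.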
DOI: 10.1109/ICACSIS51025.2020.9263191