" id="header">

Publications

A Gated Self-attention Memory Network for Answer Selection

EMNLP 2019

Publication date: November 5, 2019

Tuan Lai, Quan Tran, Trung Bui, Daisuke Kihara

Answer selection is an important research problem, with applications in many areas. Previous deep learning-based approaches for the task mainly adopt the Compare-Aggregate architecture, which performs word-level comparison followed by aggregation. In this work, we depart from the popular Compare-Aggregate architecture and instead propose a new gated self-attention memory network for the task. Combined with a simple transfer learning technique from a large-scale online corpus, our model outperforms previous methods by a large margin, achieving new state-of-the-art results on two standard answer selection datasets: TrecQA and WikiQA.
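To illustrate the gated self-attention idea described in the abstract, below is a minimal PyTorch sketch of a single gated self-attention layer over a shared question-answer memory. The class name, layer structure, and dimensions are illustrative assumptions and do not reproduce the paper's exact model or its transfer learning setup.

```python
# Minimal sketch of a gated self-attention layer (illustrative only;
# not the authors' released implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedSelfAttention(nn.Module):
    """Self-attention over a (question; answer) memory, followed by a gate
    that mixes the attended context back into the original memory."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, memory: torch.Tensor) -> torch.Tensor:
        # memory: (batch, seq_len, hidden_dim)
        q, k, v = self.query(memory), self.key(memory), self.value(memory)
        scores = torch.bmm(q, k.transpose(1, 2)) / (memory.size(-1) ** 0.5)
        attended = torch.bmm(F.softmax(scores, dim=-1), v)
        # The gate decides, per dimension, how much attended context to keep
        # versus how much of the original memory to pass through.
        g = torch.sigmoid(self.gate(torch.cat([memory, attended], dim=-1)))
        return g * attended + (1.0 - g) * memory


if __name__ == "__main__":
    layer = GatedSelfAttention(hidden_dim=128)
    # Toy memory: encoded question and answer tokens concatenated along the
    # sequence dimension (hypothetical shapes for illustration).
    x = torch.randn(2, 40, 128)
    print(layer(x).shape)  # torch.Size([2, 40, 128])
```

In this sketch the gate plays the role of deciding how much of the attended representation to write back into the memory; stacking such layers and reading out a matching score would complete an answer selection model, though the paper's actual architecture should be consulted for details.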