The Context-dependent Additive Recurrent Neural Net

Proceedings of NAACL-HLT 2018

Publication date: June 1, 2018

Quan Hung Tran, Tuan Manh Lai, Gholamreza Haffari, Ingrid Zukerman, Trung Bui, Hung Bui

Contextual sequence mapping is one of the fundamental problems in Natural Language Processing: instead of relying solely on the information presented in the text, the learning agent has access to a strong external signal that assists the learning process. In this paper, we propose a novel family of Recurrent Neural Network units, the Context-dependent Additive Recurrent Neural Network (CARNN), designed specifically to leverage this external signal. Experimental results on public datasets for dialog (bAbI dialog Task 6 and Frames), contextual language modeling (Switchboard and Penn Discourse Treebank), and question answering (TrecQA) show that our CARNN-based architectures outperform previous methods.
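The paper itself gives the precise gating equations; as a rough illustration of the general idea, the sketch below implements a context-dependent, additive recurrent cell in NumPy, where an external context vector modulates a gate that blends a candidate state with the previous hidden state. All weight names and the exact gating form here are illustrative assumptions, not the paper's formulation.

```python
# Illustrative sketch (assumption): a context-dependent additive recurrent
# cell in the spirit of CARNN. This is NOT the paper's exact unit; it only
# shows the general idea of (1) an additive, gated-sum recurrence and
# (2) gates modulated by an external context vector z.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ContextAdditiveCell:
    def __init__(self, input_dim, hidden_dim, context_dim, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        # Candidate state is a context-aware projection of the input only;
        # there is no multiplicative recurrent weight applied to h_{t-1}.
        self.W_x = rng.normal(0, s, (hidden_dim, input_dim))
        self.W_z = rng.normal(0, s, (hidden_dim, context_dim))
        # Gate parameters: the gate reads the input and the context z.
        self.G_x = rng.normal(0, s, (hidden_dim, input_dim))
        self.G_z = rng.normal(0, s, (hidden_dim, context_dim))
        self.b_h = np.zeros(hidden_dim)
        self.b_g = np.zeros(hidden_dim)

    def step(self, x_t, h_prev, z):
        # Candidate hidden state conditioned on the context signal z.
        h_tilde = np.tanh(self.W_x @ x_t + self.W_z @ z + self.b_h)
        # Context-dependent gate alpha in (0, 1).
        alpha = sigmoid(self.G_x @ x_t + self.G_z @ z + self.b_g)
        # Additive recurrence: gated sum of candidate and previous state.
        return alpha * h_tilde + (1.0 - alpha) * h_prev

# Usage: run the cell over a short sequence with a fixed context vector.
cell = ContextAdditiveCell(input_dim=4, hidden_dim=8, context_dim=3)
z = np.ones(3)                      # external context signal, e.g. a
                                    # question or dialog-state encoding
h = np.zeros(8)
for x_t in np.random.default_rng(1).normal(size=(5, 4)):
    h = cell.step(x_t, h, z)
print(h.shape)  # (8,)
```

Because the recurrence is a gated convex combination rather than a matrix multiplication of h_{t-1}, the previous state passes through a near-linear path, a property often cited as easing gradient flow in additive units; the context vector z lets the external signal steer the gates at every step.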