Conference Presentation Series

Improving Slot Filling by Utilizing Contextual Information

Conference: ACL

Date: July 2020

Presenters: Amir Pouran Ben Veyseh, Franck Dernoncourt, Thien Huu Nguyen

Partner Institution(s): University of Oregon

Slot Filling is the task of extracting semantic concepts from a given natural language utterance. It has recently been shown that using contextual information, either in word representations (e.g., BERT embeddings) or in the computation graph of the model, can improve performance. However, recent work uses contextual information in a restricted manner, e.g., by concatenating a word's representation with its context feature vector, which prevents the model from learning any direct association between the context and the label of the word. We introduce a new deep model that utilizes the contextual information for each word in the given sentence in a multi-task setting. Our model enforces consistency between the feature vectors of the context and the word while increasing the expressiveness of the context about the label of the word. Our empirical analysis on a slot filling dataset demonstrates the superiority of the model over the baselines.
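The sketch below illustrates, under our own assumptions, how the multi-task idea described in the abstract could be set up: a per-word context vector is pooled from the other words, the main slot-label classifier uses both word and context features, a consistency term ties the two feature vectors together, and an auxiliary classifier makes the context alone predictive of the word's label. The class name, dimensions, attention mechanism, and loss weights are illustrative placeholders, not the authors' actual implementation.

```python
# Hypothetical sketch (not the published model): multi-task slot filling with
# (1) a main label loss on [word ; context], (2) a consistency loss between the
# word and context feature vectors, and (3) an auxiliary loss so the context
# alone is informative about the word's label.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextAwareSlotFiller(nn.Module):
    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        # Bilinear attention used to pool a per-word context vector
        # from the other words in the sentence.
        self.context_attn = nn.Linear(hidden_dim, hidden_dim)
        # Main slot-label classifier over the concatenated features.
        self.label_head = nn.Linear(2 * hidden_dim, num_labels)
        # Auxiliary classifier: predict the word's label from context alone.
        self.context_head = nn.Linear(hidden_dim, num_labels)

    def forward(self, word_feats, labels=None, alpha=0.1, beta=0.1):
        # word_feats: (batch, seq_len, hidden_dim), e.g. BERT embeddings.
        scores = torch.matmul(self.context_attn(word_feats),
                              word_feats.transpose(1, 2))
        # Mask each word's attention to itself so its context
        # is built only from the other words.
        seq_len = word_feats.size(1)
        self_mask = torch.eye(seq_len, device=word_feats.device).bool()
        scores = scores.masked_fill(self_mask, float('-inf'))
        context = torch.matmul(F.softmax(scores, dim=-1), word_feats)

        logits = self.label_head(torch.cat([word_feats, context], dim=-1))
        if labels is None:
            return logits

        # Main loss + consistency between word and context vectors
        # + auxiliary loss forcing the context to predict the label.
        main_loss = F.cross_entropy(logits.flatten(0, 1), labels.flatten())
        consistency = 1.0 - F.cosine_similarity(word_feats, context, dim=-1).mean()
        aux_loss = F.cross_entropy(self.context_head(context).flatten(0, 1),
                                   labels.flatten())
        return main_loss + alpha * consistency + beta * aux_loss
```

In this sketch the auxiliary head is what gives the model a direct association between a word's context and its label, which plain concatenation alone does not provide; the consistency term keeps the two views of each word from drifting apart.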

View Publication