Hierarchical Recurrent Attention Network for Response Generation
Proceedings of the AAAI Conference on Artificial Intelligence, 2018, Vol. 32(1)
Abstract
We study multi-turn response generation in chatbots, where a response is generated according to a conversation context. Existing work has modeled the hierarchy of the context but pays insufficient attention to the fact that words and utterances in the context are differentially important. As a result, existing models may lose important information in the context and generate irrelevant responses. We propose a hierarchical recurrent attention network (HRAN) to model both the hierarchy and the variance in importance in a unified framework. In HRAN, a hierarchical attention mechanism attends to important parts within and among utterances with word-level attention and utterance-level attention, respectively.
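The two-level attention described above can be illustrated with a minimal NumPy sketch: word-level attention compresses each utterance's hidden states into one vector, and utterance-level attention then compresses those vectors into a single context vector for the decoder. This is a simplified dot-product variant under assumed shapes, not the paper's exact formulation (all function and variable names here are illustrative assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def word_level_attention(word_states, query):
    # word_states: (T, d) RNN hidden states of one utterance; query: (d,)
    weights = softmax(word_states @ query)   # attention weights over the T words
    return weights @ word_states             # (d,) utterance representation

def utterance_level_attention(utterance_vecs, query):
    # utterance_vecs: (U, d) one vector per utterance; query: (d,)
    weights = softmax(utterance_vecs @ query)  # attention weights over utterances
    return weights @ utterance_vecs            # (d,) context vector

rng = np.random.default_rng(0)
d = 8
# A toy context of three utterances with 5, 7, and 4 words respectively,
# each word represented by a random d-dimensional hidden state.
context = [rng.normal(size=(5, d)), rng.normal(size=(7, d)), rng.normal(size=(4, d))]
query = rng.normal(size=d)  # stand-in for the decoder's current hidden state

utt_vecs = np.stack([word_level_attention(u, query) for u in context])
context_vec = utterance_level_attention(utt_vecs, query)
print(context_vec.shape)  # (8,)
```

In the actual model the query comes from the decoder state at each generation step and the attention scores use learned parameters, so both levels are trained end-to-end; the sketch only shows how the hierarchy of the context is preserved through the two attention stages.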
Related Papers
- The Effect of Familiarity of Conversation Partners on Conversation Turns Contributed by Augmented and Typical Speakers (2013)
- Analysis of User Reactions to Turn-Taking Failures in Spoken Dialogue Systems (2007)
- Conversation and the Relevance and Turn Transition in Conversation (2004)
- A Preliminary Study on Conversation Repair (2012)
- How Is In-Depth Opinion Exchange Achieved in Contact Situations? In Relation to Dialogue and Co-Construction (2017)