A Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion
Top 1% of 2015 papers by citations
Abstract
Users may struggle to formulate an adequate textual query for their information need. Search engines assist users by presenting query suggestions. To preserve the original search intent, suggestions should be context-aware and account for the queries the user has already issued. Achieving context awareness is challenging due to data sparsity. We present a novel hierarchical recurrent encoder-decoder architecture that makes it possible to account for sequences of previous queries of arbitrary length. As a result, our suggestions are sensitive to the order of queries in the context while avoiding data sparsity. Additionally, our model can provide suggestions for rare, or long-tail, queries. The produced suggestions are synthetic and are sampled one word at a time, using computationally cheap decoding techniques. This is in contrast to current synthetic suggestion models, which rely on machine-learning pipelines and hand-engineered feature sets. Results show that our model outperforms existing context-aware approaches in a next-query prediction setting. Beyond query suggestion, our architecture is general enough to be used in a variety of other applications.
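To make the two-level design concrete, below is a minimal pure-Python sketch of a hierarchical encoder-decoder: a query-level RNN encodes each query's words into a vector, a session-level RNN folds those vectors in order into a session summary, and a decoder samples a suggestion one word at a time by greedy argmax. This is an illustrative assumption-laden toy, not the paper's implementation; the vocabulary, dimensions, weight scheme, and the example session are all invented for the sketch.

```python
import math

# Toy vocabulary; everything here is illustrative, not the paper's parameters.
VOCAB = ["<eos>", "cleveland", "gallery", "lake", "erie", "art"]
DIM = 4

def _vec(seed, n):
    # Deterministic pseudo-random vector so the example is reproducible.
    return [math.sin(seed * 12.9898 + i * 78.233) for i in range(n)]

EMB = {w: _vec(i + 1, DIM) for i, w in enumerate(VOCAB)}

def rnn_step(state, x, seed):
    # Vanilla tanh RNN cell with a fixed, made-up weight pattern.
    w = _vec(seed, DIM)
    return [math.tanh(s * 0.5 + xi * wi) for s, xi, wi in zip(state, x, w)]

def encode_query(words):
    # Query-level encoder: folds the words of one query into a vector.
    state = [0.0] * DIM
    for w in words:
        state = rnn_step(state, EMB[w], seed=3)
    return state

def encode_session(queries):
    # Session-level encoder: folds the sequence of query vectors in order,
    # so the summary is sensitive to the order of queries in the context.
    state = [0.0] * DIM
    for q in queries:
        state = rnn_step(state, encode_query(q), seed=5)
    return state

def decode(session_state, max_len=5):
    # Decoder: generate a synthetic suggestion one word at a time,
    # here with cheap greedy (argmax) decoding.
    out, state = [], session_state
    for _ in range(max_len):
        scores = [sum(s * e for s, e in zip(state, EMB[w])) for w in VOCAB]
        word = VOCAB[max(range(len(VOCAB)), key=scores.__getitem__)]
        if word == "<eos>":
            break
        out.append(word)
        state = rnn_step(state, EMB[word], seed=7)
    return out

session = [["cleveland", "gallery"], ["lake", "erie", "art"]]
print(decode(encode_session(session)))
```

Because the session encoder conditions the decoder on the whole query sequence rather than on query co-occurrence counts, the same mechanism works for contexts of arbitrary length and for long-tail queries whose words are in the vocabulary.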