Semantic Search With Sentence-BERT for Design Information Retrieval
Top 20% of 2022 papers by citations.
Abstract
Managing and referencing design knowledge is a critical activity in the design process. However, reliably retrieving useful knowledge can be a frustrating experience for users of knowledge management systems due to inherent limitations of standard keyword-based search. In this research, we consider the task of retrieving relevant lessons learned from the NASA Lessons Learned Information System (LLIS). To this end, we apply a state-of-the-art natural language processing (NLP) technique for information retrieval (IR): semantic search with sentence-BERT (sBERT), a modification of a Bidirectional Encoder Representations from Transformers (BERT) model that uses siamese and triplet network architectures to obtain semantically meaningful sentence embeddings. While the pre-trained sBERT model performs well out of the box, we further fine-tune it on data from the LLIS so that it learns design-engineering-relevant vocabulary. We quantify the improvement in query results over a keyword search using both standard sBERT and fine-tuned sBERT. Our use case throughout the paper is querying with information needs drawn from specific requirements of a NASA project. Fine-tuning the sBERT model on LLIS data yields a mean average precision (MAP) of 0.807 on queries based on information needs from a real NASA project. The results indicate that applying state-of-the-art NLP techniques to design information retrieval, especially when models are fine-tuned on engineering data, shows significant promise for modernizing design knowledge management systems.
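The MAP metric reported above follows the standard information-retrieval definition: for each query, average the precision at each rank where a relevant document appears, then average across queries. A minimal Python sketch of that computation (the document IDs and relevance judgments here are illustrative toy values, not LLIS data):

```python
def average_precision(ranked_ids, relevant_ids):
    """Average precision for one query: mean of precision@k over
    the ranks k at which a relevant document is retrieved."""
    if not relevant_ids:
        return 0.0
    hits = 0
    precision_sum = 0.0
    for rank, doc_id in enumerate(ranked_ids, start=1):
        if doc_id in relevant_ids:
            hits += 1
            precision_sum += hits / rank  # precision at this rank
    return precision_sum / len(relevant_ids)

def mean_average_precision(results):
    """MAP over a list of (ranked_ids, relevant_ids) pairs, one per query."""
    return sum(average_precision(r, rel) for r, rel in results) / len(results)

# Toy example: one query where relevant docs appear at ranks 1 and 3.
ap = average_precision(["d1", "d2", "d3", "d4"], {"d1", "d3"})
# AP = (1/1 + 2/3) / 2 = 0.8333...
```

A fine-tuned retriever is scored by ranking the corpus for each benchmark query and feeding the ranked lists, together with the relevance judgments, into `mean_average_precision`.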
Related Papers
- Fine-Tuned Transformer Models for Question Answering (2023), cited 2 times
- Question answering systems: a partial answer (2007)
- Similarity Detection of Natural-Language Questions and Answers using the VANiLLa dataset (2021), cited 2 times
- A Comparative Study of Transformer-Based Language Models on Extractive Question Answering (2021), cited 22 times
- Exploring Neural Net Augmentation to BERT for Question Answering on SQUAD 2.0 (2019), cited 1 time