Efficiency Implications of Term Weighting for Passage Retrieval
Joel Mackenzie, Zhuyun Dai, Luke Gallagher, Jamie Callan
Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval | ACM | Published: 2020
Language model pre-training has attracted a great deal of attention for tasks involving natural language understanding, and has been successfully applied to many downstream tasks with impressive results. Within information retrieval, many of these solutions are too costly to stand on their own, requiring multi-stage ranking architectures. Recent work has begun to consider how to "backport" salient aspects of these computationally expensive models to earlier stages of the retrieval pipeline. One such instance is DeepCT, which uses BERT to re-weight term importance in a given context at the passage level. This process, which is computed offline, results in an augmented inverted index with re-weighted terms.
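The offline re-weighting the abstract describes can be illustrated with a minimal sketch. The `predict_weights` function below is a hypothetical stand-in for the BERT-based term-importance model (not the authors' implementation); the sketch only shows the indexing side, where predicted importance scores are quantized into integer impacts that replace raw term frequencies in the postings.

```python
from collections import defaultdict

def build_reweighted_index(passages, predict_weights, scale=100):
    """Build an inverted index whose postings store re-weighted term
    impacts instead of raw term frequencies (computed offline, in the
    spirit of DeepCT-style term re-weighting).

    predict_weights(passage) -> {term: importance score in [0, 1]}
    (hypothetical interface standing in for the BERT model).
    Scores are quantized to integers so that standard inverted-index
    query processing can consume them unchanged.
    """
    index = defaultdict(list)  # term -> list of (doc_id, impact)
    for doc_id, passage in enumerate(passages):
        for term, score in predict_weights(passage).items():
            # Quantize; floor at 1 so the term stays retrievable.
            impact = max(1, round(score * scale))
            index[term].append((doc_id, impact))
    return index

# Toy predictor for demonstration: uniform weight per unique term.
def uniform_weights(passage):
    terms = passage.lower().split()
    return {t: 1.0 / len(set(terms)) for t in terms}

idx = build_reweighted_index(
    ["the cat sat", "the dog barked loudly"], uniform_weights
)
```

A real deployment would swap `uniform_weights` for the learned model and write the resulting postings through a standard indexer; the key point is that all model inference happens at index time, not query time.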
This work was supported by the Australian Research Council (ARC) Discovery Grant DP170102231, an Australian Government Research Training Program Scholarship, and the National Science Foundation (NSF) grant IIS-1815528.