Improving the estimation of relevance models using large external corpora

F. Diaz and D. Metzler
SIGIR 2006
Information retrieval algorithms leverage various collection statistics to improve performance. Because these statistics are often computed on a relatively small evaluation corpus, we believe using larger, non-evaluation corpora should improve performance. Specifically, we advocate incorporating external corpora based on language modeling. We refer to this process as external expansion. When compared to traditional pseudo-relevance feedback techniques, external expansion is more stable across topics and up to 10% more effective in terms of mean average precision. Our results show that using a high-quality corpus comparable to the evaluation corpus can be as effective as, if not more effective than, using the web. Our results also show that external expansion outperforms simulated relevance feedback. In addition, we propose a method for predicting the extent to which external expansion will improve retrieval performance. Our new measure demonstrates positive correlation with improvements in mean average precision.
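At its core, external expansion estimates a relevance model (in the sense of Lavrenko and Croft's RM1) from the top-ranked documents of an external corpus rather than the evaluation corpus: P(w|R) is proportional to the sum over retrieved documents D of P(w|D)·P(Q|D). The sketch below illustrates that estimation on a toy corpus; the function names, the Dirichlet smoothing parameter, and the tokenization are illustrative assumptions, not the authors' implementation.

```python
from collections import Counter
import math

def query_likelihood(query_terms, doc_counts, doc_len, collection_probs, mu=10.0):
    # Dirichlet-smoothed log P(Q|D); mu=10.0 is an arbitrary choice for this toy example.
    score = 0.0
    for t in query_terms:
        p = (doc_counts.get(t, 0) + mu * collection_probs.get(t, 1e-9)) / (doc_len + mu)
        score += math.log(p)
    return score

def relevance_model(query, external_docs, k=3):
    """Estimate RM1 over the top-k documents of an (external) corpus:
    P(w|R) proportional to sum_D P(w|D) * P(Q|D), normalized to sum to 1."""
    query_terms = query.split()
    tokenized = [d.split() for d in external_docs]

    # Collection language model from the external corpus itself.
    total = Counter()
    for toks in tokenized:
        total.update(toks)
    n = sum(total.values())
    coll = {w: c / n for w, c in total.items()}

    # Rank external documents by query likelihood.
    scored = []
    for toks in tokenized:
        counts = Counter(toks)
        scored.append((query_likelihood(query_terms, counts, len(toks), coll),
                       counts, len(toks)))
    scored.sort(key=lambda x: x[0], reverse=True)

    # Accumulate P(w|D) * P(Q|D) over the top-k documents, then normalize.
    rm = Counter()
    for log_ql, counts, dlen in scored[:k]:
        ql = math.exp(log_ql)
        for w, c in counts.items():
            rm[w] += (c / dlen) * ql
    z = sum(rm.values())
    return {w: v / z for w, v in rm.items()}
```

In practice the resulting distribution would be truncated to its highest-probability terms and interpolated with the original query model before retrieval against the evaluation corpus.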

bibtex

@inproceedings{diaz:ee,
  author    = {Fernando Diaz and Donald Metzler},
  title     = {Improving the estimation of relevance models using large external corpora},
  booktitle = {SIGIR '06: Proceedings of the 29th annual international ACM SIGIR conference on Research and development in information retrieval},
  year      = {2006},
  pages     = {154--161},
  publisher = {ACM Press},
  address   = {New York, NY, USA},
  location  = {Seattle, Washington, USA},
  isbn      = {1-59593-369-7},
  doi       = {http://doi.acm.org/10.1145/1148170.1148200}
}