Abstract
In this paper we present an evaluation resource for geographic information retrieval developed within the Cross-Language Evaluation Forum (CLEF). The GeoCLEF track is dedicated to the evaluation of geographic information retrieval systems. The resource encompasses more than 600,000 documents, 75 topics to date, and more than 100,000 relevance judgements for these topics. Geographic information retrieval requires an evaluation resource that represents realistic information needs and is geographically challenging. Experimental results and an analysis are also reported.