Text REtrieval Conference (TREC) 2011

Entity

Overview | Proceedings | Data | Results | Runs | Participants

The TREC Entity track aimed to build test collections to evaluate entity-oriented search on Web data. In 2011, the track worked with two corpora: the ClueWeb09 web corpus and the new Sindice-2011 dataset.

Track coordinator(s):

  • Krisztian Balog, NTNU
  • Pavel Serdyukov, Yandex
  • Arjen P. de Vries, CWI

Track Web Page: https://web.archive.org/web/20110811014305/http://ilps.science.uva.nl/trec-entity/


Microblog

Overview | Proceedings | Data | Results | Runs | Participants

The Microblog track examines search tasks and evaluation methodologies for information seeking behaviours in microblogging environments such as Twitter. It was first introduced in 2011, addressing a real-time ad hoc search task in which the user wishes to see the most recent relevant information for the query. In particular, systems should respond to a query by providing a list of relevant tweets ordered from newest to oldest, starting from the time the query was issued.
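
The track does not prescribe an implementation, but the required output contract is easy to state in code. Below is a minimal sketch, under stated assumptions: a hypothetical Tweet record carrying a timestamp, and a stand-in is_relevant predicate representing whatever retrieval model a participating system actually uses.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Tweet:
    tweet_id: str
    text: str
    timestamp: float  # posting time, seconds since epoch

def realtime_ranking(tweets: Iterable[Tweet],
                     query_time: float,
                     is_relevant: Callable[[Tweet], bool]) -> List[Tweet]:
    """Return relevant tweets posted at or before query_time, newest first."""
    # No tweet from after the query was issued may appear in the results.
    candidates = [t for t in tweets if t.timestamp <= query_time]
    relevant = [t for t in candidates if is_relevant(t)]
    # Order from newest to oldest, per the task definition.
    return sorted(relevant, key=lambda t: t.timestamp, reverse=True)
```

The two constraints the task definition fixes are the timestamp cutoff (nothing posted after the query time) and the newest-first ordering; everything else is up to the system.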

Track coordinator(s):

  • Iadh Ounis, University of Glasgow
  • Craig Macdonald, University of Glasgow
  • Jimmy Lin, Twitter and University of Maryland, College Park
  • Ian Soboroff, National Institute of Standards and Technology (NIST)

Track Web Page: https://web.archive.org/web/20120604072950/http://sites.google.com/site/microblogtrack/


Web

Overview | Proceedings | Data | Results | Runs | Participants

The TREC Web Track explores and evaluates Web retrieval technology over large collections of Web data. In its current incarnation, the Web Track has been active since TREC 2009, where it included both a traditional ad hoc retrieval task and a new diversity task. The goal of this diversity task is to return a ranked list of pages that together provide complete coverage for a query, while avoiding excessive redundancy in the result list. For TREC 2010 the track introduced a new Web spam task and Web-style, six-level relevance assessment for the ad hoc task. For TREC 2011, as recommended by participants at the track planning session held during TREC 2010, we dropped the spam task but continued the other tasks essentially unchanged. As we did for TREC 2009 and TREC 2010, we based our TREC 2011 experiments on the billion-page ClueWeb09 collection created by the Language Technologies Institute at Carnegie Mellon University.
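
The diversity task description fixes the goal but not the algorithm; one common illustrative approach (not the track's prescribed method) is greedy, MMR-style re-ranking, which trades off a document's relevance against its redundancy with results already selected. The sketch below assumes hypothetical relevance and similarity scoring functions and an interpolation weight lam:

```python
def diversify(candidates, relevance, similarity, k=10, lam=0.7):
    """Greedy MMR-style re-ranking: balance relevance against redundancy.

    relevance(d) and similarity(d1, d2) are assumed scoring functions;
    they are illustrative, not part of the track definition.
    """
    selected = []
    pool = list(candidates)
    while pool and len(selected) < k:
        def mmr_score(d):
            # Redundancy = strongest similarity to anything already chosen.
            redundancy = max((similarity(d, s) for s in selected), default=0.0)
            return lam * relevance(d) - (1.0 - lam) * redundancy
        best = max(pool, key=mmr_score)
        selected.append(best)
        pool.remove(best)
    return selected
```

Setting lam closer to 1 favors raw relevance; lower values penalize redundancy more heavily, pushing the list toward broader coverage of the query's interpretations.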

Track coordinator(s):

  • Charles L. A. Clarke, University of Waterloo
  • Nick Craswell, Microsoft
  • Ian Soboroff, National Institute of Standards and Technology (NIST)
  • Ellen M. Voorhees, National Institute of Standards and Technology (NIST)

Track Web Page: https://plg.uwaterloo.ca/~trecweb/2011.html


Legal

Overview | Proceedings | Data | Runs | Participants

The TREC 2011 Legal Track consisted of a single task: the learning task, which captured elements of both the TREC 2010 learning and interactive tasks. Participants were required to rank the entire corpus of 685,592 documents by their estimate of the probability of responsiveness to each of three topics, and also to provide a quantitative estimate of that probability. Participants were permitted to request up to 1,000 responsiveness determinations from a Topic Authority for each topic. Participants elected either to use only these responsiveness determinations in preparing automatic submissions, or to augment these determinations with their own manual review in preparing technology-assisted submissions.
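
As an illustration only (the track imposed no particular method), a minimal automatic baseline might fit a probabilistic classifier on the Topic Authority's determinations and then score every document in the corpus. The sketch below assumes scikit-learn and hypothetical inputs; nothing about it is specific to the actual submissions:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def rank_by_responsiveness(corpus_texts, labeled_texts, labels):
    """Fit a classifier on the (at most 1,000) Topic Authority
    determinations, then estimate a probability of responsiveness for
    every document and rank the full corpus by it.

    All inputs are hypothetical: corpus_texts is the full document set,
    labeled_texts/labels are the adjudicated examples (1 = responsive).
    """
    vectorizer = TfidfVectorizer(max_features=100_000)
    X_labeled = vectorizer.fit_transform(labeled_texts)
    model = LogisticRegression(max_iter=1000).fit(X_labeled, labels)

    X_corpus = vectorizer.transform(corpus_texts)
    probs = model.predict_proba(X_corpus)[:, 1]
    # Highest estimated probability of responsiveness first.
    return sorted(enumerate(probs), key=lambda pair: pair[1], reverse=True)
```

Returning calibrated probabilities, not just a ranking, matches the task's requirement that participants provide a quantitative estimate alongside each rank.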

Track coordinator(s):

  • Maura R. Grossman, Wachtell, Lipton, Rosen & Katz
  • Gordon V. Cormack, University of Waterloo
  • Bruce Hedin, H5
  • Douglas W. Oard, University of Maryland

Track Web Page: http://trec-legal.umiacs.umd.edu/


Chemical

Overview | Proceedings | Data | Results | Runs | Participants

The third year of the Chemical IR evaluation track benefitted from the support of many more people interested in the domain, as shown by the number of co-authors of this overview paper. We continued the two existing tasks and introduced a new task focused on chemical image recognition. The objective is to move gradually towards systems that are genuinely useful to practitioners, and in chemistry this involves both text and images.

Track coordinator(s):

  • Mihai Lupu, Vienna University of Technology
  • Zhao Jiashu, York University
  • Jimmy Huang, York University
  • Harsha Gurulingappa, Fraunhofer SCAI
  • Juliane Fluck, Fraunhofer SCAI
  • Marc Zimmerman, Fraunhofer SCAI
  • Igor Filippov, National Institutes of Health
  • John Tait, johntait.net Ltd.

Track Web Page: https://trec.nist.gov/data/chemical11.html


Medical

Overview | Proceedings | Data | Results | Runs | Participants

The search task within the TREC 2011 Medical Records track was an ad hoc search task that modeled the clinical task of finding cohorts for comparative effectiveness research. The document set used in the track was based on a set of de-identified clinical reports made available by the University of Pittsburgh's BLULab NLP Repository. This document set is not available to non-participants. There is a many-to-one mapping between reports and "visits", where a visit is an individual patient's single stay at a hospital. The visit was used as the unit of retrieval in the track, so a document for the purposes of the track was the union of the content in all reports mapped to that visit.
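
The report-to-visit collapse described above is simple to express in code. The following sketch assumes hypothetical reports and report_to_visit mappings rather than the actual BLULab data format:

```python
from collections import defaultdict

def build_visit_documents(reports, report_to_visit):
    """Collapse the many-to-one report-to-visit mapping into one
    retrievable document per visit.

    reports: dict mapping report_id -> report text (hypothetical format).
    report_to_visit: dict mapping report_id -> visit_id.
    """
    visit_texts = defaultdict(list)
    for report_id, text in reports.items():
        visit_texts[report_to_visit[report_id]].append(text)
    # The unit of retrieval is the visit: the union of its reports' content.
    return {visit: "\n".join(texts) for visit, texts in visit_texts.items()}
```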

Track Web Page: https://www.trec-cds.org/


Session

Overview | Proceedings | Data | Results | Runs | Participants

The TREC Session track ran for the second time in 2011. The track's primary goal is to provide test collections and evaluation measures for studying information retrieval over user sessions rather than one-time queries. These test collections are meant to be portable, reusable, statistically powerful, and open to anyone who wishes to work on the problem of retrieval over sessions.

Track coordinator(s):

  • Evangelos Kanoulas, University of Sheffield
  • Mark Hall, University of Sheffield
  • Paul Clough, University of Sheffield
  • Ben Carterette, University of Delaware
  • Mark Sanderson, Royal Melbourne Institute of Technology (RMIT University)

Track Web Page: http://ir.cis.udel.edu/sessions


Crowdsourcing

Overview | Proceedings | Runs | Participants

Track Web Page: https://trec.nist.gov/data/crowd.html