|Institutional Office Address
DMIF, Room NS4
|Research Project Title
Economical Evaluation of Information Retrieval Systems
|Research Project Description
The research project is in the Information Retrieval (IR) field and aims to reduce the effort required to evaluate the effectiveness of IR systems.
To evaluate IR effectiveness, a common approach is to use test collections, which are composed of a collection of documents, a set of descriptions of information needs (called topics), and a set of relevance judgments indicating which documents are relevant to each topic.
Test collections are built in a competition scenario: for example, in the TREC initiative, participants run their own retrieval systems over a set of topics and provide a ranked list of retrieved documents; a subset of the retrieved documents constitutes the so-called pool, whose relevance is judged by human assessors; these judgments are then used to compute effectiveness metrics and rank the participating systems.
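As an illustrative sketch of the last step, the snippet below computes Average Precision, a standard effectiveness metric, for one system's ranked list against pooled relevance judgments. The function, run, and judgments are hypothetical examples, not part of the project.

```python
def average_precision(ranked_docs, relevant_docs):
    """Average Precision: mean of precision@k over the ranks k
    at which a relevant document is retrieved."""
    hits = 0
    precision_sum = 0.0
    for rank, doc in enumerate(ranked_docs, start=1):
        if doc in relevant_docs:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant_docs) if relevant_docs else 0.0

# Hypothetical run (ranked list) and pooled judgments (qrels) for one topic
run = ["d3", "d1", "d7", "d2", "d9"]   # system's ranked output
qrels = {"d1", "d2", "d5"}             # documents judged relevant

ap = average_precision(run, qrels)     # ≈ 0.333
```

Averaging this value over all topics gives the system's Mean Average Precision (MAP), one of the metrics commonly used to rank participating systems.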
The whole evaluation process is rather expensive, in terms of both human time and money: the cost of TREC from 1999 to 2009 was about $30M.
An ideal test collection should have a perfect sample of topics, an adequate document collection, etc. In practice, the evaluation setting is not ideal, along several dimensions.
We do not yet have an overall understanding of what happens when we depart from the ideal setting; studies in the literature address only single aspects of this problem. My PhD focuses on studying what happens when we vary the ideal setting. Homepage: https://users.dimi.uniud.it/~kevin.roitero/