Abstract:
The amount of textual information available has grown so much that new techniques must be studied to assist users in information access (IA). In this paper, we propose using a user-directed summarization system in an IA setting to help users decide on document relevance. The summaries are generated by a sentence extraction method that scores sentences with heuristics employed successfully in previous work (keywords, title, and location). User modeling is carried out by exploiting the user's query to an IA system and expanding the query terms using WordNet. We present an objective and systematic evaluation method designed to measure summary effectiveness in two significant IA tasks: ad hoc retrieval and relevance feedback. The results obtained confirm our initial hypothesis, i.e., that user-adapted summaries are a useful tool for assisting users in an IA context.
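To make the extraction method concrete, the following is a minimal sketch (not the authors' implementation) of sentence scoring with the three heuristics named above: keyword overlap, title overlap, and sentence location. The uniform weights, word-level matching, and function names are assumptions for illustration only.

```python
# Hypothetical sketch of heuristic sentence extraction: each sentence is
# scored by keyword overlap, title overlap, and its position in the text.
# Weights and tokenization are assumed, not taken from the paper.

def score_sentence(sentence, index, total, title_words, keywords):
    words = set(sentence.lower().split())
    kw = len(words & keywords)                 # keyword heuristic
    ti = len(words & title_words)              # title heuristic
    loc = 1.0 - index / max(total - 1, 1)      # location: earlier scores higher
    return kw + ti + loc                       # uniform weights (assumption)

def extract_summary(sentences, title, keywords, k=2):
    title_words = set(title.lower().split())
    kws = set(w.lower() for w in keywords)
    scored = [(score_sentence(s, i, len(sentences), title_words, kws), i, s)
              for i, s in enumerate(sentences)]
    top = sorted(scored, reverse=True)[:k]
    # Restore original sentence order for a readable summary.
    return [s for _, i, s in sorted(top, key=lambda t: t[1])]
```

In a user-directed setting, the `keywords` set would come from the user's query (possibly expanded with WordNet synonyms), which is what adapts the summary to the user.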