Improving Performance Support Systems through Information Retrieval Evaluation
Article

S. Schatz, University of Hartford, United States

Journal of Interactive Learning Research, Volume 17, Number 4. ISSN 1093-023X. Publisher: Association for the Advancement of Computing in Education (AACE), Waynesville, NC.

Abstract

This study examines existing and new methods for evaluating the success of information retrieval systems. The theory underlying current methods is not robust enough to support testing retrieval under different meta-tagging schemas. Traditional measures rely on judgments of whether a document is relevant to a particular question: a good system returns all of the relevant documents and no extraneous ones. A rich literature questions the efficacy of relevance judgments; questions such as "Relevant to whom?", "When?", and "To what purpose?" are not well answered by traditional theory. In this study, two new measures (Spink's Information Need and Cooper's Utility) were used to evaluate two search tools (tag-based and text-based), and the new measures were compared with traditional measures and with each other. The open-source Swish text-based search engine and a self-constructed tag-based search tool were used. Thirty-four educators searched for information using both search tools and evaluated the information each retrieved. Construct measures, derived by multiplying each of the three measures (traditional, information need, and utility) by a rating of satisfaction, were compared using a two-way analysis of variance. The study specifically analyzes small information systems; its design concepts would be untenable for large systems. Results indicated a significant correlation among the three measures, suggesting that the new measures provide an equivalent method of evaluating systems while offering notable advantages, including not requiring relevance judgments and being usable in situ.
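To make the contrast concrete, the following is a minimal sketch in Python of the two kinds of scores the abstract distinguishes: a traditional relevance-based measure (precision and recall, which presuppose relevance judgments) and a satisfaction-weighted construct measure of the kind the study builds. The data, variable names, and the 1-5 satisfaction scale are illustrative assumptions, not the paper's actual instruments.

    # Illustrative sketch only: hypothetical data, not the study's instruments.

    def precision_recall(retrieved, relevant):
        # Traditional IR measures; both presuppose a pre-judged relevant set.
        retrieved, relevant = set(retrieved), set(relevant)
        hits = retrieved & relevant
        precision = len(hits) / len(retrieved) if retrieved else 0.0
        recall = len(hits) / len(relevant) if relevant else 0.0
        return precision, recall

    def construct_measure(base_score, satisfaction):
        # Construct measure as described in the abstract: a base measure
        # (traditional, information need, or utility) multiplied by the
        # searcher's satisfaction rating. The 1-5 scale is an assumption.
        if not 1 <= satisfaction <= 5:
            raise ValueError("satisfaction assumed to be on a 1-5 scale")
        return base_score * satisfaction

    # Hypothetical session: five documents retrieved, four judged relevant.
    p, r = precision_recall(retrieved=[1, 2, 3, 4, 5], relevant=[2, 3, 7, 9])
    print(p, r)                      # 0.4 0.5
    print(construct_measure(p, 4))   # 1.6

In the study itself, the resulting construct scores for the two search tools were compared with a two-way analysis of variance, a step any standard statistics package can perform. The practical point of the new measures is visible even in this toy: the information-need and utility base scores can be collected from the searcher in situ, without the pre-judged relevant set that precision and recall require.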

Citation

Schatz, S. (2006). Improving Performance Support Systems through Information Retrieval Evaluation. Journal of Interactive Learning Research, 17(4), 407-423. Waynesville, NC: Association for the Advancement of Computing in Education (AACE).
