Demands on Users for Interpretation of Achievement Test Scores: Implications for the Evaluation Profession

Gabriel Mario Della-Piana
Michael Gardner

Abstract

Background: Professional standards for the validity of achievement tests have long reflected a consensus that validity is the degree to which evidence and theory support the interpretations of test scores entailed by the intended uses of those tests. Yet convincing lines of evidence show that these standards are not adequately followed in practice, that standards alone are insufficient guides to action, and that test reviewers do not call attention to important kinds of validity evidence that might support the demanding process of making sense of, and reasoning from, test scores.

Purpose: This article aims to make more transparent the demands that achievement test interpretation places on users in instructional contexts and to open a dialogue on the implications for the evaluation profession of improving practice along lines already set out by evaluation theorists.

Setting: Not applicable.

Intervention: Not applicable.

Research Design: Not applicable.

Data Collection and Analysis: Review of current practice.

Findings: The article makes transparent the lack of attention to validating achievement tests to support inferences relevant to their intended uses in instruction and project evaluation. Elements of a model of the process of reasoning from test scores are articulated, and the cognitive demands on the test score user are illustrated in achievement test contexts in writing, science, and mathematics. Implications are drawn for deliberation on these issues and for the development of casebooks to guide practice.

Keywords: assessment; test validation; test users; test interpretation

Article Details

How to Cite
Della-Piana, G. M., & Gardner, M. (2011). Demands on Users for Interpretation of Achievement Test Scores: Implications for the Evaluation Profession. Journal of MultiDisciplinary Evaluation, 7(16), 20–31. https://doi.org/10.56645/jmde.v7i16.318
Section
Research Articles