Construction Versus Choice in Cognitive Measurement: Issues in Constructed Response, Performance Testing, and Portfolio Assessment

Published on Oct 12, 2012
Randy Elliot Bennett (Estimated H-index: 29)
William C. Ward (Estimated H-index: 15)
Contents: Preface.
  • R.E. Bennett, On the Meanings of Constructed Response.
  • R.E. Traub, On the Equivalence of the Traits Assessed by Multiple-Choice and Constructed-Response Tests.
  • R.E. Snow, Construct Validity and Constructed-Response Tests.
  • S. Messick, Trait Equivalence as Construct Validity of Score Interpretation Across Multiple Methods of Measurement.
  • R.J. Mislevy, A Framework for Studying Differences Between Multiple-Choice and Free-Response Test Items.
  • K.K. Tatsuoka, Item Construction and Psychometric Models Appropriate for Constructed Responses.
  • N.J. Dorans, A.P. Schmitt, Constructed Response and Differential Item Functioning: A Pragmatic Approach.
  • J. Braswell, J. Kupin, Item Formats for Assessment in Mathematics.
  • R. Camp, The Place of Portfolios in Our Changing Views of Writing Assessment.
  • D.P. Wolf, Assessment as an Episode of Learning.
  • D.H. Gitomer, Performance Assessment and Educational Measurement.
  • C.A. Dwyer, Innovation and Reform: Examples from Teacher Assessment.
  • T.W. Hartle, P.A. Battaglia, The Federal Role in Standardized Testing.
  • S.P. Robinson, The Politics of Multiple-Choice Versus Free-Response Assessment.
  • References (0)
  • Citations (92)
Cited By (92)
#1 Anthony D. Albano (NU: University of Nebraska–Lincoln), H-index: 9
#2 Michael C. Rodriguez (UMN: University of Minnesota), H-index: 21
Recent legislation and federal regulations in education have increased our attention to issues of inclusion, fairness, equity, and access in achievement testing. This has resulted in a growing literature on item writing for accessibility. This chapter provides an overview of the fundamentals of item development, especially as they pertain to accessible assessments. Constructed-response and selected-response items are first introduced and compared, with examples. Next, the item development proces...
ABSTRACT: As evidenced by past research presented in this paper, seafarer students and employers are dissatisfied with the ability of traditional assessment methods to engage students in learning and to develop skills that transfer from classrooms to the workplace. Although authentic assessments conducted in real-world contexts may provide a possible solution, there is a dearth of research on them in the area of seafarer education and training (SET). Moreover, the Standards of Training, Certification,...
#1 Yigal Attali (Princeton University), H-index: 18
#2 Cara Cahalan Laitusis (Princeton University), H-index: 6
Last: Elizabeth Stone (Princeton University), H-index: 6 (3 authors in total)
There are many reasons to believe that open-ended (OE) and multiple-choice (MC) items place different cognitive demands on students. However, empirical evidence supporting this view is lacking. In this study, we investigated the reactions of test takers to an interactive assessment with immediate feedback and answer-revision opportunities for the two item types. Eighth-grade students solved mathematics problems, both MC and OE, with standard instructions and feedback-and-revision opportu...
5 Citations · Source
#1 Nicolas Becker, H-index: 9
#2 Florian Schmitz, H-index: 14
Last: Frank M. Spinath, H-index: 39 (8 authors in total)
Several studies have shown that figural matrices can be solved with one of two strategies: (1) Constructive matching consisting of cognitively generating an idealized response, which is then compared with the options provided by the response format; or (2) Response elimination consisting of comparing the response format with the item stem in order to eliminate incorrect responses. A recent study demonstrated that employing a response format that reduces response elimination strategies results in...
6 Citations · Source
#1 Meghan Rector Federer (OSU: Ohio State University), H-index: 2
#2 Ross H. Nehm (SBU: Stony Brook University), H-index: 23
Last: Dennis K. Pearl (OSU: Ohio State University), H-index: 42 (4 authors in total)
A large body of work has been devoted to reducing assessment biases that distort inferences about students’ science understanding, particularly in multiple-choice instruments (MCI). Constructed-response instruments (CRI), however, have invited much less scrutiny, perhaps because of their reputation for avoiding many of the documented biases of MCIs. In this study we explored whether known biases of MCIs—specifically item sequencing and surface feature effects—were also apparent in a CRI designed...
11 Citations · Source
#1 Lucilene Bender de Sousa (PUCRS: Pontifícia Universidade Católica do Rio Grande do Sul), H-index: 3
#2 Lilian Cristine Hübner (PUCRS: Pontifícia Universidade Católica do Rio Grande do Sul), H-index: 2
This article addresses important challenges in constructing reading comprehension assessment tasks, reviewing two crucial aspects: the cognitive demands present in the tasks and text readability. Throughout the review, we reflect on the complexity involved in assessing reading comprehension, together with the care and criteria to be applied when constructing tasks with respect to the two aspects discussed (cognitive demands and readability). In the first part, the...
Definition of a Coding Procedure of Open-ended Questions Based on the Models of International Studies
The paper presents an in-depth study within the «Problem-solving and geographical skills» project funded by Sapienza University of Rome in 2011. The main research instrument consists of open-ended items, which were used as a basis for designing a coding procedure for student responses in order to maximise coding and coder control and reliability. Bearing in mind that when analyzing open answers, th...
#1 Adewale Adesina (Lboro: Loughborough University), H-index: 1
#2 Roger G. Stone (Lboro: Loughborough University), H-index: 8
Last: Ian Jones (Lboro: Loughborough University), H-index: 69 (4 authors in total)
Technology today offers many new opportunities for innovation in educational assessment and feedback through rich assessment tasks and efficient scoring and reporting. However, many Computer-Aided Assessment (CAA) environments focus on grading and providing feedback on the final product of assessment tasks rather than on the process of problem solving. Focusing on steps and problem-solving processes can help teachers diagnose strengths and weaknesses, discover strategies, and provide appropriate...
6 Citations · Source