Response Time and Learning Progress: Teaching Perspective
PROCEEDINGS
Eugene Gvozdenko, Dianne Chambers, Kaye Stacey, Vicki Steinle, The University of Melbourne, Australia
Global Learn, Melbourne, Australia. ISBN 978-1-880094-85-3. Publisher: Association for the Advancement of Computing in Education (AACE)
Abstract
The study analysed the utility of logging students' response times on a computerised basic mathematics test to monitor the cognitive progress of a group of pre-service teacher education students over a semester. The results suggest that the magnitude of the reduction in mean response time can be a useful indicator of the impact of teaching and learning. Response time was found to be especially valuable for monitoring the progress of high achievers whose answers were correct both at the beginning and at the end of the semester. The findings of the study will enhance the pedagogical practice of Assessment for Learning (formative assessment), which stands at the centre of current education assessment reform.
Citation
Gvozdenko, E., Chambers, D., Stacey, K. & Steinle, V. (2011). Response Time and Learning Progress: Teaching Perspective. In S. Barton, J. Hedberg & K. Suzuki (Eds.), Proceedings of Global Learn Asia Pacific 2011--Global Conference on Learning and Technology (pp. 400-408). Melbourne, Australia: Association for the Advancement of Computing in Education (AACE). Retrieved August 8, 2024 from https://www.learntechlib.org/primary/p/37203/.
© 2011 Association for the Advancement of Computing in Education (AACE)
References
- Bergstrom, B., Gershon, R., & Lunz, M.E. (1994). Computerized adaptive testing: Exploring examinee response time using hierarchical linear modeling. Paper presented at the annual meeting of the National Council on Measurement in Education, New Orleans, LA, USA.
- Bridgeman, B. & Cline, F. (2004). Effects of differentially time-consuming tests on computer-adaptive test scores. Journal of Educational Measurement, 41, 137-148.
- Chang, S., Plake, B.S. & Ferdous, A.A. (2005). Response times for correct and incorrect item responses on computerized adaptive tests as a function of examinee characteristics. Paper presented at the 2005 annual meeting of the American Educational Research Association (AERA), Montréal, Canada.
- Cook, T.D., & Campbell, D.T. (1979). Quasi-experimentation: Design & analysis issues for field settings. Boston: Houghton Mifflin.
- Fox, J.-P., Klein Entink, R.H., & van der Linden, W.J. (2007). Modelling of responses and response times with the package CIRT. Journal of Statistical Software, 20(7), 1-14.
- Gardner, J. (Ed.) (2005). Assessment and learning. London: Sage Publications.
- Gaviria, J.-L. (2005). Increase in precision when estimating parameters in computer assisted testing using response time. Quality & Quantity, 39(1), 45-69.
- González-Espada, W. & Bullock, D. (2007). Innovative applications of classroom response systems: Investigating students' item response times in relation to final course grade, gender, grade point average, and high school ACT scores. Electronic Journal for the
- Gvozdenko, E. (2005). Question response time in computerized testing: applicability for test design and prediction of error. Thesis (M.IT.Ed.), University of Melbourne, Faculty of Education.
- Gvozdenko, E. (2010). Meaning and Potential of Test Response Time and Certainty Data: Teaching Perspective. Thesis (PhD, in progress), University of Melbourne, Melbourne Graduate School of Education.
- Hornke, L.F. (1997). Investigating item response times in computerized adaptive testing. Diagnostica, 43(1), 27-39.
- Hornke, L.F. (2000). Item response times in computerized adaptive testing. Psicológica, 21(1), 175-189.
- Kong, X., Wise, S. & Bhola, D. (2007). Setting the response time threshold parameter to differentiate solution behavior from rapid-guessing behavior. Educational and Psychological Measurement, 67(4), 606-619.
- van der Linden, W.J. (2008). Using response times for item selection in adaptive testing. Journal of Educational and Behavioral Statistics, 33(5), 5-20.
- Meijer, R. (2006, May). Detection of advance item knowledge using response times in computer adaptive testing. LSAC Computerized Testing Report 03-03. Newtown, PA: Law School Admission Council.
- Rammsayer, T. (1999). Timing behaviour in computerized adaptive testing: Response times as a function of correct and incorrect answers. Diagnostica, 45(4), 178-183.
- Schnipke, D.L., & Scrams, D.J. (1999). Exploring issues of test taker behaviour: Insights gained from response time analyses. Princeton, NJ: Law School Admission Council.
- Schnipke, D.L. & Scrams, D.J. (2002). Exploring issues of examinee behavior: Insights gained from response-time analyses. In C.N. Mills, M.T. Potenza, J.J. Fremer, & W.C. Ward (Eds.), Computer-based testing: Building the foundation for future assessments. Mahwah, NJ: Lawrence Erlbaum Associates.
- Stout, W. (2002). Psychometrics: From practice to theory and back. 15 years of nonparametric multidimensional IRT, DIF/test equity, and skills diagnostic assessment. Psychometrika, 67(4), 485-518.
- Thurstone, L.L. (1937). Psychology as a quantitative rational science. Science, 85, 227-232.