Implementing the interactive response system in a high school physics context: Intervention and reflections

Shieh, R., Kainan University; Chang, W., National Changhua University of Education

Australasian Journal of Educational Technology Volume 29, Number 5, ISSN 0814-673X Publisher: Australasian Society for Computers in Learning in Tertiary Education


The interactive response system (IRS) has been widely used to promote student learning since 2003. It is an electronic system connected to handset devices that allows students to transmit their responses by pressing the desired buttons, while allowing the teacher to monitor and track individual students' answers anonymously and statistically. However, little research has examined the challenges teachers may encounter when designing IRS-based questions, or the mediations that may lead them to develop quality questions. The purpose of this study is to address this research gap by investigating one high school teacher's IRS implementation, drawing on both the teacher's and the students' teaching and learning experiences, and by presenting an intervention to help the teacher develop higher quality IRS questions. High quality questions denote questions that help students engage in deeper thinking and eventually lead to a comprehensive understanding of the concepts learned. The data sources consist of tests, classroom observations, interviews, face-to-face meetings, and email correspondence. The findings disclose that enhancing the teacher's content knowledge and capability to recognise students' learning pitfalls is the foundation of developing quality IRS questions. The collaboration established between the teacher and a university physics education expert appears to have effectively helped both participants gain insight and knowledge into designing quality questions aimed at identifying the students' learning bottlenecks.


Shieh, R., & Chang, W. (2013). Implementing the interactive response system in a high school physics context: Intervention and reflections. Australasian Journal of Educational Technology, 29(5). Australasian Society for Computers in Learning in Tertiary Education. Retrieved March 21, 2019.
