Towards appropriate methodologies to research interactive learning: Using a design experiment to assess a learning programme for complex thinking development

J. Botha, D. van der Westhuizen & E. De Swardt, University of Johannesburg, South Africa

IJEDICT Volume 1, Number 2, ISSN 1814-0556 Publisher: Open Campus, The University of the West Indies, West Indies


In this paper, we contend that design experiments are the most appropriate way to research the effectiveness of online learning. We present an exemplar: a recently completed design experiment at a university in Johannesburg, South Africa, in which the researchers explored the extent to which Complex Thinking skills can be facilitated in online learning environments. A learning programme was designed and developed for Masters students, employing specific instructional methodologies and requiring activities that draw on Complex Thinking skills. The extent to which these skills were evident in students' online activities was detected using comprehensive checklists and rubrics, and a rigorous framework for analysis was developed. The findings were integrated with theoretical perspectives on instructional strategies for Complex Thinking development, yielding new criteria for online learning design.


Botha, J., van der Westhuizen, D. & De Swardt, E. (2005). Towards appropriate methodologies to research interactive learning: Using a design experiment to assess a learning programme for complex thinking development. International Journal of Education and Development using ICT, 1(2), 105-117. Open Campus, The University of the West Indies, West Indies. Retrieved February 21, 2019.

