Making Sense of Learning Analytics Dashboards: A Technology Acceptance Perspective of 95 Teachers

B. Rienties, C. Herodotou, T. Olney, M. Schencks, & A. Boroowa — Open University UK, Institute of Educational Technology, Milton Keynes, United Kingdom

IRRODL Volume 19, Number 5, ISSN 1492-3831 Publisher: Athabasca University Press


Teachers are widely acknowledged as important for effectively supporting and stimulating learners in online learning. With the increasing availability of learning analytics data, online teachers might be able to use learning analytics dashboards to support learners with different learning needs. However, deployment of learning analytics visualisations also requires buy-in from teachers. Using the principles of the technology acceptance model (TAM), in this embedded case study we explored readiness for learning analytics visualisations amongst 95 experienced teaching staff at one of the largest distance-learning universities, using an innovative training method called the Analytics4Action Workshop. The findings indicated that participants appreciated the interactive, hands-on approach, but at the same time were sceptical about the perceived ease of use of the learning analytics tools they were offered. Most teachers indicated a need for additional training and follow-up support for working with learning analytics tools. Our results highlight the need for institutions to provide effective professional development opportunities for learning analytics.


Rienties, B., Herodotou, C., Olney, T., Schencks, M., & Boroowa, A. (2018). Making sense of learning analytics dashboards: A technology acceptance perspective of 95 teachers. The International Review of Research in Open and Distributed Learning, 19(5). Athabasca University Press. Retrieved March 21, 2019.
