Sharing Secrets with Robots?
PROCEEDINGS

R. Bhakta, M. Savin-Baden & G. Tombs, Coventry University, United Kingdom

EdMedia + Innovate Learning, Tampere, Finland. ISBN 978-1-939797-08-7. Publisher: Association for the Advancement of Computing in Education (AACE), Waynesville, NC.

Abstract

This paper presents initial findings from a large-scale study that evaluated levels of student disclosure on sensitive topics. Four different conditions of survey delivery were applied, and follow-up interviews were undertaken. Because the data did not satisfy the assumptions of parametric statistical tests, non-parametric tests were used; Wilcoxon signed-rank tests were conducted to examine the differences further. Preliminary data suggest that students disclosed additional information to the chatbot on more sensitive topics when the length of engagement was increased, but that this effect could be negated by the inclusion of the depth-of-engagement questions. These findings suggest that the sensitivity of the student-chatbot conversation is critical in determining the influence of the chatbot, and that particular care should be taken when designing contextually relevant 'icebreaker' questions.

Citation

Bhakta, R., Savin-Baden, M. & Tombs, G. (2014). Sharing Secrets with Robots?. In J. Viteli & M. Leikomaa (Eds.), Proceedings of EdMedia 2014--World Conference on Educational Media and Technology (pp. 2295-2301). Tampere, Finland: Association for the Advancement of Computing in Education (AACE). Retrieved January 22, 2019 from .
