Usage Data as Indicators of STEM OER Utility
Marcia Mardis, Chandrahasa Ambavapuru, Florida State University, United States
Journal of Online Learning Research Volume 3, Number 2, ISSN 2374-1473 Publisher: Association for the Advancement of Computing in Education (AACE), Waynesville, NC USA
Abstract
A key component of online and blended learning content, open educational resources (OER) are heralded in a global movement toward high-quality, affordable, accessible, and personalized education. However, stakeholders have expressed concern about scaling OER use due to a lack of means to ensure a fit between learner, resource, and task. Usage data, or “paradata,” such as reviews, ratings, views, downloads, favorites, and shares, may yield insight into this fit. We examined paradata from the National Science Digital Library (NSDL), the largest extant accessible corpus, for the extent to which K-12 science, technology, engineering, and mathematics (STEM) resource fit can be determined from user- and system-generated data. We conducted sentiment analyses of user reviews and computed correlations between the sentiment scores and other data elements. Some relationships between NSDL paradata elements suggested aspects of resource fit. Although prior research indicated that user reviews tend to be strongly positive or strongly negative, the results of this study showed that educators left feedback containing a blend of sentiments and that users usually downloaded the resources they viewed. These results suggest that while educator feedback is unlikely to serve as a measure of resource quality at present, larger and more robust usage data sets would make nuanced sentiment a fertile area for further research. We conclude with observed data trends and further research directions to inform online learning.
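The abstract's method pairs two standard techniques: scoring review text for sentiment and correlating those scores with other paradata elements (e.g., view counts). The sketch below is illustrative only, not the authors' code: it uses a minimal hypothetical lexicon scorer and a plain Pearson correlation on invented example records, where the word lists, review texts, and view counts are all assumptions.

```python
# Illustrative sketch (not the study's implementation): lexicon-based
# sentiment scoring of review text, then Pearson correlation between
# sentiment scores and view counts. Lexicon and data are hypothetical.
import math

POSITIVE = {"great", "excellent", "useful", "engaging", "clear"}
NEGATIVE = {"broken", "confusing", "outdated", "poor", "dull"}

def sentiment_score(review: str) -> float:
    """Score = (positive hits - negative hits) / total tokens, in [-1, 1]."""
    tokens = [t.strip(".,!?") for t in review.lower().split()]
    if not tokens:
        return 0.0
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / len(tokens)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paradata records: (review text, view count)
records = [
    ("great clear simulation, very engaging", 840),
    ("useful but slightly outdated applet", 310),
    ("broken link and confusing instructions", 45),
]
scores = [sentiment_score(text) for text, _ in records]
views = [count for _, count in records]
print(round(pearson(scores, views), 3))
```

In practice the study's analysis would rest on a full sentiment tool and a much larger corpus; the point here is only the shape of the pipeline, from free-text reviews to a score that can be correlated against system-generated counts.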
Citation
Mardis, M. & Ambavapuru, C. (2017). Usage Data as Indicators of STEM OER Utility. Journal of Online Learning Research, 3(2), 197-221. Waynesville, NC USA: Association for the Advancement of Computing in Education (AACE). Retrieved August 5, 2024 from https://www.learntechlib.org/primary/p/174192/.
© 2017 Association for the Advancement of Computing in Education (AACE)
References
- Abramovich, S., & Schunn, C. (2012). Studying teacher selection of resources in an ultralarge scale interactive system: Does metadata guide the way? Computers & Education, 58(1), 551-559. doi:10.1016/j.compedu.2011.09.001
- Abramovich, S., Schunn, C. D., & Correnti, R. J. (2013). The role of evaluative metadata in an online teacher resource exchange. Educational Technology Research and Development, 61(6), 863-883. doi:10.1007/s11423-013-9317-2
- Bienkowski, M., Brecht, J., & Klo, J. (2012). The Learning Registry: Building a foundation for learning resource analytics. Paper presented at the 2nd International Conference on Learning Analytics and Knowledge, Vancouver, British Columbia, Canada. http://dl.acm.org/citation.cfm?doid=2330601.2330651
- Cambria, E., Schuller, B., Liu, B., Wang, H., & Havasi, C. (2013). Statistical approaches to concept-level sentiment analysis. IEEE Intelligent Systems, 28(3), 6-9.
- Campbell, L.M., & Barker, P. (2013). Activity data and paradata: A briefing paper. Retrieved February 5, 2014, from http://publications.cetis.ac.uk/wp-content/uploads/2013/05/paradataBriefing.pdf
- Chen, C.-M., & Sun, Y.-C. (2012). Assessing the effects of different multimedia materials on emotions and learning performance for visual and verbal style learners. Computers & Education.
- Collins, S., & Levy, P. (2013, March). Guide to the use of Open Educational Resources in K-12 and postsecondary education. Retrieved June 25, 2013, from https://www.siia.net/index.php?option=com_docman&task=doc_download&gid
- Digital Textbook Collaborative. (2012). The digital textbook playbook. Retrieved March 10, 2012, from http://transition.fcc.gov/files/Digital_Textbook_Playbook.pdf
- Essa, A. (2016). A possible future for next generation adaptive learning systems. Smart Learning Environments, 3(1), 16.
- Ganu, G., Kakodkar, Y., & Marian, A. (2013). Improving the quality of predictions using textual information in online user reviews. Information Systems, 38(1), 1-15.
- Gayo-Avello, D. (2012, April 28). “I Wanted to Predict Elections with Twitter and all I got was this Lousy Paper”--A balanced survey on election prediction using Twitter data. Retrieved April 20, 2014, from http://arxiv.org/pdf/1204.6441v1
- Gerlitz, C., & Helmond, A. (2013). The
- Green, S.B., & Salkind, N.J. (2005). Using SPSS for Windows and Macintosh for analyzing and understanding data. Upper Saddle River, NJ: Pearson Prentice Hall.
- Griffin, A. (2013, July). Subjective and objective reviews of instructional materials. Retrieved April 1, 2014, from http://simra.us/wp/news/subjective-and-objective-reviews-of-instructional-materials-2/
- Hanover Research Council. (2011, October). K-12 STEM education overview. Retrieved July 12, 2012, from http://www.hanoverresearch.com/wp-content/uploads/2011/12/K-12-STEM-Education-Overview-Membership.pdf
- Hanover Research Council. (2012, October). Best practices in personalized learning environments (Grades 4 – 9). Retrieved from http://www.hanoverresearch.com/media/Best-Practices-in-Personalized-Learning-Environments.pdf
- Hanson, K., & Carlson, B. (2005). Effective access: Teachers’ use of digital resources in STEM teaching. Newton, MA: Education Development Center, Inc.
- Hewlett Foundation. (n.d.). Open Educational Resources. Retrieved May 5, 2014, from http://www.hewlett.org/programs/education/open-educational-resources
- Hewlett Foundation. (2013, November 22). White paper: Open Education Resources: Breaking the lockbox on education. Retrieved from http://www.hewlett.org/sites/default/files/
- Lagoze, C. J. (2010). Lost identity: The assimilation of digital libraries into the web (PhD dissertation). Cornell University, Ithaca, NY. ProQuest Dissertations & Theses database. (AAT 3396228)
- Lak, P., & Turetken, O. (2014). Star ratings versus sentiment analysis: A comparison of explicit and implicit measures of opinions. Paper presented at the 47th Hawaii International Conference on System Sciences (HICSS), 6-9 Jan. 2014 Waikoloa, HI
- Leutkemeyer, J., & Mardis, M.A. (2016). Applying the quadratic usage framework to research on K–12 STEM digital learning resources. School Library Research, 19. Retrieved from http://www.ala.org/aasl/sites/ala.org.aasl/files/content/Aaslpubsandjournals/slr/vol19/SLR_K-12STEMDigitalLearningResources_V19.pdf
- Liu, Y., Yu, X., An, A., & Huang, X. (2013). Riding the tide of sentiment change: Sentiment analysis with evolving online reviews. World Wide Web, 16(4), 477-496.
- Ludwig, S., de Ruyter, K., Friedman, M., Brüggen, E. C., Wetzels, M., & Pfann, G. (2013). More than words: The influence of affective content and linguistic style matches in online reviews on conversion rates. Journal of Marketing, 77(1), 87-103.
- MacNeill, S., Campbell, L. M., & Hawksey, M. (2014). Analytics for education. Journal of Interactive Media in Education. Retrieved from http://www-jime.open.ac.uk/jime/article/view/2014-07
- Mardis, M.A. (2003). If we build it, will they come? An overview of the issues in K-12 digital libraries. In M. Mardis (Ed.), Developing digital libraries for K-12 education. Syracuse, NY: ERIC Information Technology Clearinghouse.
- Mardis, M.A., ElBasri, T., Norton, S.K., & Newsum, J. (2012). The new digital lives of U.S. Teachers: A research synthesis and trends to watch. School Libraries Worldwide, 18(1), 70-86.
- Mardis, M.A., & Everhart, N. (2013). From paper to pixel: Digital textbooks and Florida schools. In M. Orey, S.A. Jones& R.M. Branch (Eds.), Educational Media and Technology Yearbook (Vol. 37, pp. 93-118). New York, NY: Springer.
- Mardis, M.A., & Howe, K. (2010). STEM for our students: Content or co-conspiracy? Knowledge Quest, 39(2), 8-11.
- Massart, D., & Shulman, E. (2013). Unlocking open educational resources (OERs) interaction data. D-Lib Magazine, 19(5/6). Retrieved from http://www.dlib.org/dlib/may13/massart/05massart.html
- Maull, K. E., Saldivar, M. G., & Sumner, T. (2010). Observing the online behavior of teachers: From Internet usage to personalization for pedagogical practice. Paper presented at the Association for Computing Machinery Conference on Human Factors in Computing Systems, Atlanta, GA. http://communication.ucsd.edu/barry/chiws10/maull_positionpaper_chi2010ws.pdf
- McMartin, F., Iverson, E., Wolf, A., Morrill, J., Morgan, G., & Manduca, C. (2008). The use of online digital resources and educational digital libraries in higher education. International Journal on Digital Libraries, 9(1), 65-79.
- Mickey, K., & Meaney, K. (2010). Simba Information’s 2010 national textbook adoption scorecard and 2011 outlook. Stamford, CT: Simba Information.
- Mickey, K., & Meaney, K. (2011). Simba Information’s 2011 national textbook adoption scorecard and 2012 outlook. Stamford, CT: Simba Information.
- Mickey, K., & Meaney, K. (2013). Getting ready for the Common Core 2013-2014. Stamford, CT: Simba Information, Inc.
- Milan, F. B., Parish, S. J., & Reichgott, M. J. (2006). A model for educational feedback based on clinical communication skills strategies: Beyond the “feedback sandwich”. Teaching and Learning in Medicine, 18(1), 42-47. doi:10.1207/
- Mudambi, S.M., & Schuff, D. (2010). What makes a helpful online review? A study of customer reviews on Amazon.com. MIS Quarterly, 34(1), 185-200.
- Niemann, K., Scheffel, M., & Wolpers, M. (2012). A comparison of usage data formats for recommendations in TEL. Paper presented at the 2nd Workshop on Recommender Systems in Technology Enhanced Learning 2012 in conjunction with the 7th European Conference on Technology Enhanced Learning (EC-TEL 2012), September 18-19, Saarbrücken, Germany.
- Okerson, A. (2000). Are we there yet? Online E-resources ten years after. Library Trends, 48(4), 671-693.
- Padmaja, S., & Fatima, S. (2013). Opinion mining and sentiment analysis – An assessment of peoples’ belief: A survey. International Journal of Ad hoc, Sensor & Ubiquitous Computing (IJASUC), 4(1), 21-33.
- Pang, B., & Lee, L. (2005). Sentence polarity dataset. Retrieved from: http://www.cs.cornell.edu/people/pabo/movie-review-data/rt-polaritydata.tar.gz
- Pang, B., & Lee, L. (2008). Opinion mining and sentiment analysis. Foundations and Trends in Information Retrieval, 2(1–2), 1-135.
- Parkes, J., Abercrombie, S., & McCarty, T. (2013). Feedback sandwiches affect perceptions but not performance. Advances in Health Sciences Education, 18(3), 397-407.
- Porcello, D., & Hsi, S. (2013). Curating and crowdsourcing online education resources. Science, 341(6143), 240-241.
- Price, L. (2007). Lecturers’ vs. students’ perceptions of the accessibility of instructional materials. Instructional Science, 35(4), 317-341.
- Project Tomorrow. (2012a, April). Mapping a personalized learning journey – K-12 students and parents connect the dots with digital learning. Retrieved June 10, 2012, from http://www.tomorrow.org/speakup/pdfs/SU11_PersonalizedLearning_Students.pdf
- Recker, M., Leary, H., Walker, A., Diekema, A., Wetzler, P., Sumner, T., & Martin, J.H. (2011). Modeling teacher ratings of online resources: A human-machine approach to quality. Paper presented at the American Educational Research Association (AERA) Annual Meeting, April 8-11, New Orleans, LA.
- Recker, M., Walker, A., Giersch, S., Mao, X., Halioris, S., Palmer, B., Johnson, D., Leary, H., & Robertshaw, M. B. (2007). A study of teachers’ use of online learning resources to design classroom activities. New Review of Hypermedia and Multimedia, 13(2), 117-134.
- Rosenbaum, S. (2011). Curation nation: Why the future of content is context. New York, NY: McGraw Hill.
- School Library Journal. (2013, December 5). School technology survey: U.S. school libraries 2013. Retrieved from http://www.slj.com/downloads/slj-technology-survey/
- Seaman, J. E., & Allen, I. E. (2014). Listening to sentiment. Quality Progress, 47(2), 48-50.
- Sharifrazi, F., & McCabe, M.B. (2014). Perils of click like, share or comment on social media networks. ASBBS Proceedings, 21(1), 739-745.
- Simba Information, Inc. (2014). 2013 national instructional materials adoption scorecard and 2014 forecast. Stamford, CT: Author.
- Spiegel, D.L. (1989). Instructional resources: Evaluating instructional materials. The Reading Teacher, 43(1), 72-73.
- Springer, N., Pfaffinger, C., & Engelmann, I. (2015). User comments: Motives and inhibitors to write and read. Information, Communication & Society, 18(7), 798-815.
- Stern, L., & Roseman, J.E. (2004). Can middle-school science textbooks help students learn important ideas? Findings from Project 2061’s curriculum evaluation study: Life science. Journal of Research in Science Teaching, 41(6), 538-568.
- Sumner, T., Khoo, M., Recker, M., & Marlino, M. (2003). Understanding educator perceptions of “quality” in digital libraries. Paper presented at the 3rd ACM/IEEE-CS Joint Conference on Digital Libraries (JCDL), Houston, TX.
- Tenenboim, O., & Cohen, A.A. (2015). What prompts users to click and comment: A longitudinal study of online news. Journalism, 16(2), 198-217.
- Tonks, D., Patrick, S., & Bliss, T. (2013, June). OER and collaborative content development. Retrieved from http://www.inacol.org/wp-content/uploads/2015/02/oer-and
- U.S. Department of Education. (2014). Learning Registry: Sharing what we know. Retrieved March 8, 2014, from http://learningregistry.org/
- U.S. Department of Education. (2017a, March). #GoOpen district launch packet, 1.3. Retrieved from https://tech.ed.gov/files/2017/03/GoOpen-District-Launch-Packet-2017-V-1.3.pdf
- Williams, D.A., & Coles, L. (2003). The use of research information by teachers: Information literacy, access and attitudes. A report for the Economic and Social Research Council. Aberdeen, Scotland: Robert Gordon University.
- Wilson, T., Wiebe, J., & Hoffmann, P. (2009). Recognizing contextual polarity: An exploration of features for phrase-level sentiment analysis. Computational Linguistics, 35(3), 399-434.
- YouGov. (2014, January 22). Reviews: Fieldwork Dates: 17th-20th January 2014. Retrieved May 5, 2014, from http://cdn.yougov.com/r/14/YouGov_Omnibus_Online_ Reviews.xls
- Ziegele, M., Breiner, T., & Quiring, O. (2014). What creates interactivity in online news discussions? An exploratory analysis of discussion factors in user comments on