The MOOC and learning analytics innovation cycle (MOLAC): a reflective summary of ongoing research and its challenges
Abstract
The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework for situating ongoing research in the MOOC and learning analytics innovation cycle (the MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, data collection and analytics activities focus on individual reflection and individual prediction. On the meso-level, data from several open courses are combined to support benchmarking and to generate insights about the behaviour of groups of learners rather than individuals; these insights can inform an institution's decision to adapt its educational model. On the macro-level of the framework, cross-institutional learning analytics enables the development of learning and teaching interventions that can be tested across a cluster of educational organizations to analyse their impact beyond contextual factors. The article proposes four areas of future activity needed to enable the MOLAC framework: standardization of the description of the educational design of MOOCs, data-sharing facilities across institutions, joint policymaking and ethical guidelines, and standardized evaluation approaches.