Closing commentary

The MOOC and learning analytics innovation cycle (MOLAC): a reflective summary of ongoing research and its challenges

H. Drachsler

Corresponding Author

Welten Institute – Research Centre for Learning, Teaching and Technology, Open University of the Netherlands, The Netherlands

Correspondence: Hendrik Drachsler, Welten Institute, Open Universiteit, P.O. Box 2960, 6401 DL Heerlen, The Netherlands. Email: hendrik.drachsler@ou.nl

M. Kalz

Welten Institute – Research Centre for Learning, Teaching and Technology, Open University of the Netherlands, The Netherlands

Chair for Open Education, Faculty of Management, Science and Technology, Open University of the Netherlands, The Netherlands

First published: 16 March 2016
Cited by: 11

Abstract

The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research: the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels. On the micro‐level, data collection and analytics activities focus on individual reflection and individual prediction. On the meso‐level, data from several open courses are combined to support benchmarking and to create insights about the behaviour of groups of learners rather than individuals; these insights can inform institutions in adapting their educational model. On the macro‐level of the framework, cross‐institutional learning analytics enables the development of learning and teaching interventions that can be tested in a cluster of educational organizations to analyse their impact beyond contextual factors. The article proposes four areas of future activity needed to enable the MOLAC framework: standardization of the description of the educational design of MOOCs, data‐sharing facilities across institutions, joint policymaking and ethical guidelines, and standardized evaluation approaches.

  • JCAL Special Issue on Multimodal Learning Analytics, Journal of Computer Assisted Learning, 34, 4, (335-337), (2018).
  • Predicting student performance in a blended MOOC, Journal of Computer Assisted Learning, 34, 5, (615-628), (2018).
  • The current landscape of learning analytics in higher education, Computers in Human Behavior, 10.1016/j.chb.2018.07.027, 89, (98-110), (2018).
  • Students' experimentation strategies in design: Is process data enough?, Computer Applications in Engineering Education, 26, 5, (1903-1914), (2018).
  • Towards a Cloud-Based Big Data Infrastructure for Higher Education Institutions, Frontiers of Cyberlearning, 10.1007/978-981-13-0650-1_10, (177-194), (2018).
  • Research on Data Mining of Learning Behaviours of College Students on MOOC Platform, MATEC Web of Conferences, 10.1051/matecconf/201824603025, 246, (03025), (2018).
  • Motivational Design in Chemistry MOOCs: Applying the ARCS Model, Online Approaches to Chemical Education, 10.1021/bk-2017-1261.ch003, (35-45), (2017).
  • Overcoming the MOOC Data Deluge with Learning Analytic Dashboards, Learning Analytics: Fundaments, Applications, and Trends, 10.1007/978-3-319-52977-6_6, (171-198), (2017).
  • Learning Analytics in MOOCs: EMMA Case, Data Science and Social Research, 10.1007/978-3-319-55477-8_18, (193-204), (2017).
  • The Proof of the Pudding: Examining Validity and Reliability of the Evaluation Framework for Learning Analytics, Data Driven Approaches in Digital Education, 10.1007/978-3-319-66610-5_15, (194-208), (2017).
  • A model for collecting and analyzing action data in a learning process based on activity theory, Soft Computing, 10.1007/s00500-017-2969-9, (2017).