Engaging learners in online learning without external incentives: Evidence from a field experiment

Online learning platforms increase opportunities for learners to participate in learning out of interest, without any external incentives such as fulfilling requirements for degree programmes or certificates. However, such forms of online learning often suffer from low sustained learning engagement. Building on theories related to normative influences, this study extends the literature by focusing on the effect of peer information on learning engagement and outcomes in an online learning setting without external incentives. A field experiment was conducted through a leading massive open online course platform in China, and information interventions were manipulated in social media groups associated with the course. Surprisingly, the results revealed that the presence of peers' active learning behaviour information did not always lead to enhanced learning engagement. Specifically, it had a positive influence only when question intervention was applied, that is, when the learners were also presented with questions related to the course content prior to learning. Moreover, question intervention alone was effective in enhancing learning engagement. Our results also showed that the learners with question interventions were more likely to pay attention to their peers' behaviour and align their learning pace with that of their peers. Implications for theory and practice are discussed.


KEYWORDS
normative influence, online learning, peer information, question intervention

| INTRODUCTION
Online learning platforms have shown significant growth over the past decade (Chiu et al., 2007; Gupta & Bostrom, 2009; Huang et al., 2021; Kizilcec et al., 2017). The high accessibility of learning materials online promotes extra-curricular learning activities (Lin et al., 2012). Individuals can enrol in an online course out of personal interest in a particular domain (e.g., Python, cooking, or photography). However, despite learners' interest in learning subjects, learning engagement is typically low on online learning platforms. For example, learners on iMOOC, one of the leading online learning platforms in China, complete only 20% of course content on average (Wang et al., 2019).
On Coursera, for courses without grading or degree requirements, only 10% of learners eventually complete the courses (Zhang et al., 2017). Such low engagement poses significant challenges in terms of guaranteeing good learning outcomes, which in turn negatively influences the operational efficiency and profits of online learning platforms (Aparicio et al., 2019; Coussement et al., 2020; Reich & Ruiperez-Valiente, 2019).
There are several reasons underlying the low engagement problem in online learning. First, while learners may join online learning out of intrinsic interest, that is, the inherent pleasure of gaining knowledge (Deci et al., 1999; Ryan & Deci, 2000), this initial interest can fade over time (Hidi, 2001; Schiefele, 2009). Indeed, learners' interest has been found to decrease significantly from the beginning to the end of a course (Kyewski & Krämer, 2018).
Second, in most online learning scenarios, learners do not have other external incentives,¹ such as gaining rewards or fulfilling the requirements of degree programmes (Chen et al., 2020; Coussement et al., 2020; Lu et al., 2022). For example, for certain courses on Coursera, over 90% of the learners are not graded or evaluated for certification purposes (Zhang et al., 2017). In addition to massive open online course (MOOC) platforms, other independent-learning or skill-sharing platforms (e.g., Datacamp, Skillshare, Udemy) also provide interested learners with courses on diverse subjects (such as cooking and photography) without any evaluation criteria or external incentives. A similar learning context is organisational training, where employees are encouraged to follow their personal interests to join training classes (e.g., in cybersecurity), often with no requirements regarding their participation or performance (Kam et al., 2022). In addition to fading interest and a lack of external incentives, the isolated environment of online learning can negatively affect the learning experience. Online learners are physically separated from others, may learn at different paces, and may barely interact with their instructors and peers. It is difficult for these learners to learn from and obtain support from others in the learning process, thus contributing to low engagement and high dropout rates (Wang et al., 2019).
Many researchers have attempted to address the low engagement problem. One major focus has been to improve the presence of online learning peers during the learning process, as learners are often influenced by their peers. For example, many commercial learning and skill-sharing platforms have adopted gamification elements, such as leaderboards and point systems, to involve learners in common rewards (Kyewski & Krämer, 2018; Landers & Landers, 2014; Leung et al., 2023). Other non-commercial, large-scale MOOC platforms have developed simpler and lower-cost ways of improving peer learners' presence. For example, Coursera and XuetangX display information that reveals the prevalence of active learning behaviours, such as viewing course videos and doing assignments, among all learning peers.
A few recent studies have shown that displaying such peer information leads to longer video viewing durations, more on-time assignment completion, and higher submission rates (Huang et al., 2021; Li et al., 2021). In these study settings, there is often a common external evaluation of the learners' grades or requirements for completion. It has been suggested that featuring peer information may improve learners' engagement because learners tend to conform to their peers when they perceive a shared goal (such as meeting imposed standards or obtaining good course grades) and hence value social acceptance. However, very few studies have examined whether such peer influence holds in a learning context without any external incentives (i.e., no rewards or performance evaluation). In such a setting, learners are likely engaged in a more independent online learning process without any common goal. They may feel less pressure to conform to their peers, as their social identification with the learning group is weaker and social acceptance is less likely to be an issue.
This study aimed to explore whether and when peer information influences learning engagement and outcomes in an online learning context without external incentives. Drawing upon theories related to normative influences, which underscore the importance of peer information in shaping individuals' beliefs about behavioural standards, thereby motivating individuals to change (Barlow et al., 2018; Christensen et al., 2004; Liu et al., 2019), we argue that the presence of peer information may create group norms that encourage learners to align their learning behaviour with that of others. Such conformity is often based on people's identification with their peers and their desire for positive self-evaluation in a valued social group (Christensen et al., 2004; Liu et al., 2019). However, online learners' highly independent learning process, without externally imposed requirements, may diminish their sense of social identification with the group. Unlike in an offline learning context, where a sense of belonging to a learning group can be easily established through the physical presence of peers, in an online context, establishing social identification may require specific guidance. Therefore, we further explored how instructional guidance affects the dynamics of online learning groups and peer influence.
Prior literature has shown that cultivating common interests or values is key to fostering social identification in online contexts. In an online learning context, guidance provided by instructors may readily shape learners' learning interests (Schiefele, 2009; Yi et al., 2022). In particular, we consider a very low-cost way of helping learners establish common goals, namely question interventions. In traditional learning settings, instructors often guide learners' learning process by raising relevant questions prior to learning to trigger the learners' interest in a new concept (De Keersmaecker et al., 2020; Kruglanski & Webster, 1996; Stokhof et al., 2017; van Leeuwen & Janssen, 2019). For instance, before starting a chapter on exercise and a positive mind, instructors can pose a question to their learners, such as why engaging in exercise can reduce stress. In the online learning context, such instructor guidance is less prevalent due to a lack of visible and synchronous interactions between instructors and learners (Xu et al., 2020). We proposed a design of learning-content guidance via social media associated with the learning platform, in the form of question interventions, as a potential way to create common interests in an online learner group. We aimed to explore the effect of such question interventions on learners' engagement in learning, as well as on their tendency to be influenced by their peers.
Overall, this study aimed to resolve two key questions: (1) Does peer information affect learning engagement and outcomes for online learners without external incentives? (2) When does peer information influence learning engagement and outcomes, and what is the role of the question intervention? We partnered with a leading online learning platform in China to conduct a field experiment. Specifically, we invited learners to enrol in an online course with no external incentives. This setting represented a form of online learning in which learners' motivation for self-improvement is a key driver of learning (Kizilcec & Schneider, 2015). We used a 2 (peers' active learning behaviour information: presence vs. no presence) × 2 (question interventions: presence vs. no presence) between-subjects design, involving 306 participants in a 2-week online course. Consistent with previous research findings, we found that the question intervention improves learners' engagement in terms of learning course content and doing related practices, as well as their learning outcomes. However, information on peers' active learning behaviour did not always have a positive influence. It increased learners' engagement and outcomes only when the learners were also presented with the question intervention. This is plausible because guidance in the form of questions helps trigger learners' common interest in the learning content and aligns their learning paces (i.e., the rate at which they view the course videos). Hence, when information about peers' active learning behaviour was present, learners were more likely to perceive common learning goals and thus develop identification with the learning group, leading to enhanced peer influence.

This research makes several contributions. First, we identified that in a general online learning setting where learners lack sustainable intrinsic motivation and external incentives, the effects of peer information on learners' engagement (i.e.,
normative influence) may not always hold. This extends prior work on how peer information may encourage learners, and it deepens our understanding of peer influence. Second, we determined that a form of instructional design, namely question interventions, can effectively engage learners and enhance normative influence. We thus provide further evidence to support theories on normative influence and highlight a form of information intervention that may change social dynamics and, in turn, the peer effect. Third, through our attempt to resolve learning engagement issues on a real online learning platform, we demonstrate how these information interventions can be implemented in a field setting.

| LITERATURE REVIEW
In this section, we first review the literature on normative influence in general, which explains how one's observation of peers' behaviour influences one's own behaviour, and then normative influence in online learning settings. Finally, we review studies on how learning instruction design may affect learning engagement.

| Effects of peer information: Normative influences
Numerous studies in social psychology have demonstrated that individuals change their behaviour under social influence (Dannals & Miller, 2017; Ng et al., 2021; Yamin et al., 2019). One important mechanism of this effect is referred to as 'normative influence', meaning that people tend to align their behaviour with the descriptive norm of their social groups, that is, what most people in a group do (Christensen et al., 2004; Lapinski & Rimal, 2005; Spears, 2021). This is because people expect social approval and positive self-evaluation in their own community. In fact, normative influences are increasingly prevalent in the online context due to the increased accessibility of online peer information (Liu et al., 2019). Studies have observed the effectiveness of descriptive norms in motivating individuals' behaviours in various online contexts, such as online review writing (Burtch et al., 2018; Chen et al., 2010) and engagement in protection behaviours against cybersecurity threats (Ng et al., 2021).
Studies have further shown that normative influence is more likely to be exhibited when users identify with their social group (Christensen et al., 2004; Robinson et al., 2014). In particular, people are likely to identify with a reference group that shares their goals or values (Akerlof & Kranton, 2000). They pay attention to the behaviours of others in the reference group and consider the descriptive norms of this group as standards of behaviour, that is, the behaviours that are supposed to be performed by all individuals in the group (Panteli & Sivunen, 2019; Thielmann et al., 2020; Zhao et al., 2012). In online contexts, users' social identification with a community is often formed through interpersonal interactions, for example, social interactions (e.g., sharing personal stories, supporting others; Panteli & Sivunen, 2019) and building social networks (e.g., following others and being followed; Liu et al., 2019). On many online learning platforms, especially non-commercial MOOC platforms, social relationship building is often not the focus. Nonetheless, learners may have a salient common goal to pursue, such as obtaining certain rewards or grades, which may help foster their identification with the learning community. Next, we review studies on peer influence in the online learning context.

| Peer influence in online learning
Many studies on online learning have explored how to improve learners' engagement via peer influence. One area of research focuses on integrating gamified designs into learning platforms to encourage peer interaction (Amo et al., 2020; Santhanam et al., 2016). Scholars have proposed that the design of gamification elements should align with learners' needs and the learning task in order to foster meaningful engagement (Khan et al., 2020; Liu et al., 2017; Schöbel et al., 2020). For instance, incorporating features such as leaderboards and badges allows learners to observe the progress of their peers and introduces external rewards for all learners to pursue. By motivating learners to pay attention to their peers' performance and fostering a sense of achievement, learning platforms can facilitate positive social influence in the learning process (e.g., Landers & Landers, 2014; Leung et al., 2023).
Most non-commercial MOOC platforms have adopted simpler ways to improve learning peers' presence, such as presenting information about peers' learning behaviours (Günther, 2021; Huang et al., 2021). Studies focusing on this form of peer information presentation have generally demonstrated positive peer influence, mostly in graded courses. For example, Huang et al. (2021) showed that presenting information about peers' on-time assignment completion positively influenced learners' on-time assignment submission rate. Eyink et al. (2020) reported that displaying other students' learning activities made learners do more practice before exams. Table 1 summarises the studies on peer influence in online learning contexts. However, relatively few studies have paid attention to learning contexts in which learners do not have common external incentives (e.g., external rewards or grading requirements). In such contexts, learners' perception of sharing goals with their peers is much weakened. Specifically, as their learning process becomes self-directed and isolated, their identification with a social learning group weakens accordingly. Therefore, whether peer influence holds, and how to enhance its positive influence, in these learning contexts deserves further investigation.
TABLE 1 Studies on peer influence in online learning contexts.

| Online learning instruction design
There is a rich stream of literature on how to engage learners through the design of learning instructions. Such research has focused on two major means by which instruction design can affect learning engagement. One is to design instructions that engage learners via indirect means, such as by enhancing the learners' feeling of social relatedness (e.g., improving instructor-learner communication; Akcaoglu & Bowman, 2016) or competence (e.g., providing instructions that are as complete as possible for solving tasks; Likourezos & Kalyuga, 2017). The other type of instruction design focuses on the delivery of course content (Kam et al., 2022; Krapp, 2005; Renninger et al., 2014), which is a direct way to trigger learners' interest in the learning materials (Schiefele, 2009). For example, storytelling has been shown to be an effective form of instruction that can increase learning interest and attract learners (Hull et al., 2019).
Our work focuses on a specific form of content-related instruction design, namely question intervention, which has been widely adopted in teaching practice. Asking questions related to the learning content before a lesson has been shown to guide learners to become interested in that content (Stokhof et al., 2017). This is because people's motivation to learn and seek more information can be triggered by the presence of unknowns (Loewenstein, 1994; van Dijk & Zeelenberg, 2007). In particular, questions make people aware of the existence of incomplete information. People have been found to have an innate tendency to reduce ambiguity in their environment and the associated feeling of discomfort (Kruglanski et al., 2006; Kruglanski & Webster, 1996; Roets & Van Hiel, 2007). This implies that people desire a firm answer when presented with a question (De Keersmaecker et al., 2020; Kruglanski & Webster, 1996). The anticipated pleasure of obtaining the answer also activates people's cognitive processing when exploring information, leading to enhanced information understanding and memory retention (Gruber et al., 2014; McDaniel et al., 2000).
Extending this literature, this study goes beyond the effects of question interventions on engagement per se to examine how this type of instruction intervention can help form a common pursuit (i.e., seeking answers to the questions) among learners, and thus moderate the effect of peer information on engagement.

| HYPOTHESES DEVELOPMENT
We investigated the effects of learning peers' behaviour information and question interventions on online learners' learning engagement and outcomes. We specifically focused on online learning contexts without external incentives.
In our study, learning engagement is defined as a learner's participation in course-related activities such as studying course materials and doing practice exercises (Fredricks et al., 2004; Kizilcec et al., 2017), and learning outcome is defined as a learner's achievement, measured at the end of the learning process (Leung et al., 2023). Specifically, we focused on learning outcomes in terms of learners' conceptual understanding of the learning content (Santhanam et al., 2008).
Following the extensive literature on normative influence, we first present a general hypothesis that learners' engagement can be influenced by information about peers' behaviour. Specifically, people often observe the actions of their peers and learn about the descriptive norms of their social groups (Christensen et al., 2004; Wu et al., 2021). They follow their peers' behaviour in response to pressure to conform to it (Christensen et al., 2004; Spears, 2021). In this study, we particularly focus on information about peers' active learning behaviour, such as viewing course videos and completing related practice exercises. Learners are likely to consider these active learning behaviours as the norm, setting it as a standard by which to evaluate their own learning processes. Accordingly, learners engage in their learning more actively. In the absence of information about peers' active learning behaviour, however, learners are less likely to learn about the norm and be inspired by their peers to join in course learning. Therefore, we propose the following hypothesis:

Hypothesis 1. The presence of information about peers' active learning behaviour positively influences learning engagement.
We further propose that information on peers' active learning behaviour has a positive influence on learners' learning outcomes. As mentioned above, when individuals observe others' active learning actions, they perceive learning as a normative behaviour and follow others' actions to engage in learning by attending to course-related information (Kuan et al., 2014; Wattal et al., 2010). As they devote more attention and thought to the course content during the learning process, their learning outcomes are likely to improve (Alavi et al., 2002; Glaser & Bassok, 1989). Therefore, we propose the following hypothesis:

Hypothesis 2. The presence of information about peers' active learning behaviour positively influences learning outcomes.
We then propose that presenting question interventions will have a direct effect on learners' learning engagement and outcomes. Presenting questions is an effective way to arouse individuals' awareness of incomplete information and prompt them to collect further information relevant to resolving the unknown (Richland et al., 2009). This is due to humans' natural information exploration behaviour, intended to close any gap in their knowledge of a relevant topic (Loewenstein, 1994; van Dijk & Zeelenberg, 2007). In particular, when online learners are presented with course-related questions, their thoughts are likely to be activated by the unknown information (Hidi & Renninger, 2006; Schmidt et al., 2011), and they tend to make conjectures about the answer based on their prior knowledge (Hmelo-Silver, 2004). They are then motivated to study the course materials to verify their conjectures and find definitive answers to the posed questions (Klein & Fishbach, 2014). In other words, conjectures serve as a relatively explicit goal regarding what information to explore further. As learners can derive pleasure from closing a knowledge gap to achieve their goal, they may also tend to participate in learning activities to strengthen their understanding of the information and fully eliminate their doubts. In contrast, if learners are guided by a declarative statement of course content without questions, they are less likely to generate doubts and develop conjectures prior to learning (Thomas et al., 2014). For example, if instructors only provide a declarative statement regarding the content of the next chapter (e.g., 'the next chapter will cover exercising as a way to reduce stress'), learners may not have an explicit and specific goal to achieve and are less likely to feel an urge to explore information further. They may tend to move through the course content quickly, without a clear intention to probe.
Overall, in the context of online learning with no external incentives, providing question interventions during the learning process may intrigue learners and can be particularly effective in improving their exploration of information about the learning subject and, in turn, their learning engagement. As such, we propose the following hypothesis:

Hypothesis 3. The presence of question interventions positively influences learning engagement.
We also expect that question interventions may positively influence learning outcomes because an urge to obtain an answer to a question prompts learners to actively elaborate on relevant information (Cogliano et al., 2019; Hmelo-Silver, 2004). In particular, in the process of generating and verifying conjectures related to the questions, learners' prior knowledge is activated (Collins et al., 1991; Schmidt et al., 2011). They keep their relevant knowledge and doubts in mind while acquiring more information to eliminate the doubts (Kang et al., 2009). This allows learners to better integrate new knowledge with their prior knowledge, leading to a deep understanding of the subject and enhanced memory (Cogliano et al., 2019; Gottlieb et al., 2013). For instance, Kang et al. (2009) found that question interventions had a positive effect on the subsequent recall of information relevant to the questions. In contrast, when learners are not presented with questions, they are less likely to make conjectures and their prior knowledge is also less likely to be activated (Schmidt et al., 2011). Consequently, they are less likely to develop a deep understanding of new knowledge and to integrate their new and prior knowledge. Therefore, learners who are presented with questions prior to learning are likely to achieve better learning outcomes than those who are not. Thus, we propose the following hypothesis:

Hypothesis 4. The presence of question interventions positively influences learning outcomes.
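To make the logic of the four hypotheses concrete, a 2 × 2 between-subjects design separates two main effects (peer information; question intervention) and their interaction. The following sketch computes these quantities from cell means; the numbers are invented for illustration only and are not the study's data:

```python
# Hypothetical cell means of a learning-engagement measure for the
# 2 x 2 design. Keys are (peer information, question intervention);
# values are invented engagement scores, not the study's results.
cells = {
    ("no_peer", "no_question"): 2.0,
    ("no_peer", "question"): 3.0,
    ("peer", "no_question"): 2.0,  # peer info alone: no gain
    ("peer", "question"): 4.0,     # peer info pays off only with questions
}

def effects(cells):
    """Return (main effect of peer info, main effect of questions,
    interaction), computed as differences of marginal/cell means."""
    peer_absent = (cells[("no_peer", "no_question")] + cells[("no_peer", "question")]) / 2
    peer_present = (cells[("peer", "no_question")] + cells[("peer", "question")]) / 2
    q_absent = (cells[("no_peer", "no_question")] + cells[("peer", "no_question")]) / 2
    q_present = (cells[("no_peer", "question")] + cells[("peer", "question")]) / 2
    # Interaction: does peer info gain more from questions than no-peer does?
    interaction = (cells[("peer", "question")] - cells[("peer", "no_question")]) \
        - (cells[("no_peer", "question")] - cells[("no_peer", "no_question")])
    return peer_present - peer_absent, q_present - q_absent, interaction

main_peer, main_question, interaction = effects(cells)
```

With these invented means, the main effect of questions and the positive interaction dominate, while peer information alone contributes little, mirroring the pattern the paper reports.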
Furthermore, we argue that the effect of the information about peers' active learning behaviour on learners' learning engagement and outcomes would be strengthened if learners are presented with course-related questions.
Specifically, research has suggested that normative influences largely rely on the extent to which learners identify with their peers (Christensen et al., 2004; Wu et al., 2021). When people believe that their peers share common interests with them, they identify more with their peers and have a stronger desire to conform to the peers' behaviour (Christensen et al., 2004).
In this study context, there are no common requirements or rewards for all learners to pursue. However, when learners are presented with questions prior to learning, they develop a salient goal, that is, to explore course information to resolve the questions. Therefore, when a group of learners are presented with the same questions, they develop a common interest in pursuing learning materials to resolve the questions. In such a context, when learners are exposed to information about their peers' active learning behaviour, their perception of pursuing a shared learning goal is reinforced (Naylor et al., 2011). Such a perception of common goals leads learners to consider their peers as a valued reference group; they find what the others do to be relevant and easy to interpret (Al-Natour et al., 2011; Tsai & Pai, 2021). Therefore, learners are likely to use their peers' active learning behaviours as a standard for their own behaviours; they engage in learning because of their willingness to conform to norms. Devoting more effort to studying course materials and completing practice exercises in the learning process provokes deeper thinking, which then leads to better learning outcomes.
In contrast, if learners receive a general declarative statement as guidance for their learning instead of a question, although they are guided to learn the same content, they are less likely to generate doubts prior to learning; thus, it is more difficult for them to form an explicit and specific goal (Klein & Fishbach, 2014;Schmidt et al., 2011).
When there are also no further external incentives, learners are generally less likely to perceive a common learning goal and peers' active learning behaviour is less likely to be perceived as a demonstration of a common interest.
Accordingly, learners are less likely to be concerned about their peers' active learning participation and to follow their peers in engaging actively in learning. The effect of peers' learning behaviour on learners' learning outcomes is thus weakened. Overall, peers' active learning behaviour is more likely to influence learners who are presented with questions prior to learning than those who are not. Thus, we propose the following hypotheses:

| Research context

The platform collaborated with the education bureau of Yunnan province, China, to launch a programme offering various free online courses to primary and secondary school teachers. This programme aimed to help teachers achieve personal improvement, including their psychological well-being, and did not impose any requirements in terms of course completion or grading. This represents a general online learning setting in which learners learn without any external incentives (such as the course requirements of degree programmes or certificates). The courses were offered during summer breaks so that the teachers had sufficient time to learn. The education bureau promoted the programme to local teachers and assigned platform accounts to each teacher so that they could enrol in the courses.
The teachers were informed that participation in the programme was completely voluntary. Almost all of the teachers in schools affiliated with the education bureau have accounts on this platform, and they formed our potential pool of experimental participants. This sample, while consisting primarily of teachers, likely represented general online MOOC learners for the following reasons. These teachers taught basic subjects in primary and secondary schools in an offline setting. Like all other online learners, they chose to study online the subjects they were personally interested in. The focal course in our study was unrelated to what the participants taught in their profession (i.e., the prior knowledge of the participants was less likely to differ from that of other learners).
To investigate how to improve learners' learning engagement and outcomes by designing information interventions, we conducted a field experiment focusing on a major course offered in the abovementioned programme (i.e., Positive Psychology). This course was considered a foundational course in the personal improvement programme. The course aimed to encourage people to look at the positive aspects of life and establish a positive outlook on life. The concepts introduced in the course, such as happiness and grit, were all closely tied to common feelings and experiences in everyday life; therefore, the course was not considered difficult to learn. The course was not graded, and no completion requirements were imposed on the learners.
The course consisted of seven chapters, each of which contained 3 to 6 course videos, for a total of 36 videos.
The average length of each video was 14.44 min. In addition to watching the course videos, the learners were also encouraged to complete one practice exercise each day (see Section 4.2 for a detailed description). The course instructor recommended a 2-week learning process to ensure that the learners could learn the course concepts effectively. Our experiment was conducted during a 2-week learning period upon the release of the course. During this period, learners were advised to spend 2 days learning through the videos of each chapter and to complete one practice every day.
One instructor and two teaching assistants formed the teaching team that delivered the course. The learners were invited to join study groups associated with the course on WeChat²; the teaching assistants posted relevant messages to the groups according to the recommended learning pace described above. This practice of organising learners in the same online course into associated social media study groups managed by teaching assistants has also been adopted by other platforms, such as Xuetangx.com and Schoology.com (Xu et al., 2020). Information about peers' learning behaviour and the question interventions were both manipulated in these messages, as elaborated on in the next section. After the 2-week learning period, the WeChat groups were disbanded.

| Experimental design and manipulations of independent variables
We conducted a field experiment with a 2 (peers' active learning behaviour information: presence vs. absence) × 2 (question interventions: presence vs. absence) between-subjects design. Four WeChat study groups were created for this course. We randomised the participants when inviting each of them to join one of the four groups. The two teaching assistants joined all four of the WeChat groups, applied the same management rules in each group, and were mainly in charge of sending messages according to the experimental design, as detailed below.
Throughout the 2-week learning process, the teaching assistants sent messages to the WeChat groups to remind participants to learn a new chapter and to do the daily practice. Specifically, the daily practice was introduced on the first day of the course, and a reminder on how to submit the practice was posted in the groups every day from the second day onward. The practice was the same each day: it asked participants to write down three good things they had experienced that day, with an explanation of why those things had been positive, using the concepts they had learned from the course videos. These 'good things' could range from the relatively trivial (e.g., 'My co-worker made me a coffee today.') to the significant (e.g., 'I earned a big promotion.'). The purpose of this daily practice was to help the participants review the course content and learn to apply it in real life. Because the course practices reflected personal experiences, no external standards were used to grade them. Furthermore, the submissions were completely voluntary. The specifications of the practice (announced on the first day) were the same for all four groups. After the first day of the course, the presence of peers' active learning behaviour information was manipulated in the daily practice reminders posted in the study groups.
Specifically, in the groups with peers' active learning behaviour information, the practice reminders reminded the learners to submit the practice on that day and informed them that many of the learners had done the practice the day before. In addition, three randomly selected submissions were displayed below the reminder message,³ which was likely to reinforce the learners' perception that their peers were engaging in course learning. These submissions did not contain any personally identifiable information. Submissions of practice work were chosen to demonstrate peers' learning engagement because the practice was a key learning activity in this course, very helpful for understanding important concepts (such as a positive mind) and for developing a positive attitude towards life. In the groups without peers' learning behaviour information, the reminders only asked learners to submit their practices on that day, without providing information about peers' learning behaviour.
The presence of question interventions was manipulated in the messages reminding the participants to learn a new chapter, which were posted every other day (i.e., when a new chapter was scheduled to be introduced). In the groups with the question intervention, these reminder messages contained a few questions related to the content of the chapter to be studied over the following 2 days. All of the questions were created by the course instructor. As the purpose of the question intervention was to trigger the participants' exploration of the course materials for answers, we made sure that the answers would not be obvious to people who had not studied the course materials.⁴ In the groups without the question intervention, the reminder message briefly mentioned the content of the focal chapter but did not raise any questions. We made sure that the length of the messages was approximately the same in both conditions (66 Chinese characters on average).
The reminder messages for all seven chapters (with and without question interventions) are presented in Table B1.
To ensure that the participants paid attention to the reminders posted in the groups, we kept other discussions in the groups to a minimum. In particular, the participants were informed that discussions unrelated to the course were not allowed in the groups. Any participant violating this rule was reminded by the teaching assistants to stop the discussion. Accordingly, very few other messages were posted in the groups throughout the experiment.

| Experimental procedure
The learning platform assigned accounts to the pool of potential users and invited them to join the course. If they accepted the course invitation, they were then invited to the WeChat study group.⁵ The participants were told that a study group had been formed for them to receive timely, course-related announcements during a recommended 2-week learning period. They were also informed that the course videos would still be accessible after the 2-week period and that the course would not be graded. After the experiment, we debriefed the participants on the experiment and the purpose of the study.
Before the 2-week learning process began, the participants were asked to complete an online questionnaire containing questions related to certain important control variables in our study, including the frequency with which the participants had used MOOC platforms in the past (MoocFrequency), their motivation for using MOOC platforms (MoocMotivation), and their motivation for studying Positive Psychology (CourseMotivation).⁶ The participants were told that the purpose of this questionnaire was to help the course instructor better understand the learners and arrange course content accordingly. A total of 319 participants joined the study groups and completed the pre-experiment questionnaire, with 76-82 participants in each group. The teaching assistants then posted a course introduction in the WeChat groups.
The experiment lasted for 2 weeks and consisted of seven learning sessions. Each learning session lasted 2 days and covered one chapter of the course. On the first day of each session, the teaching assistants sent a reminder message to each group to study the relevant chapter. The message was sent at 9:00 AM so that the participants could plan their course learning for that day. The presence of the question intervention was manipulated in this message.
At 6:00 PM on the first day of the experiment, the 'Three Good Things' daily practice was released to every group.
Beginning on the second day, at 6:00 PM every day, the teaching assistants sent a message reminding the participants in each group to submit their practices. The presence of information about peers' learning behaviour was manipulated in this message. In sum, over the course of the experiment (14 days), each participant received seven messages about studying course chapters (one message for each chapter) and 14 messages about the daily practices (one practice specification and 13 submission reminders).
After the 2-week learning period, the participants were informed of an online quiz to test their learning outcomes.⁷ The quiz contained 10 multiple-choice questions devised by the course instructor. It was distributed to the WeChat groups, and the participants were encouraged to complete it. The participants were told that the quiz served as a way to consolidate and deepen their knowledge and that there was no course grading. They could review the answers to the questions immediately upon completion. QuizScore (a participant's score on the quiz) was used as a measure of the participant's learning outcomes. A total of 256 participants who completed the pre-experiment questionnaire also completed the quiz. Table 2 presents an overview of the experimental procedure.
During the experiment, the participants remained in their respective study groups, and no participant withdrew.
We retrieved each participant's video viewing behaviour and practice submissions during the 2-week experiment period. The dependent variable (learning engagement) was measured in two ways: NumVideos, representing the number of videos watched by a participant (out of a total of 36) during the 2-week learning period,⁸ and NumPractices, representing the number of practices submitted by a participant during the 2-week learning period (i.e., with a maximum of 14).

| DATA ANALYSIS
At the end of the experiment, we asked the participants whether they had noticed the reminder messages posted in the groups. Thirteen of the participants failed this manipulation check and were dropped from the data analysis.
Overall, our final sample included all 306 of the participants who joined the experimental groups and completed the pre-experiment questionnaire. Among them, 243 participants who completed the quiz were included in the analysis related to learning outcomes.⁹ On average, the participants completed 33.94% of the course videos and 17.92% of the practices. Only 15.61% of the participants watched all of the course videos, and none completed all of the practices. The overall learning engagement and completion rates were thus quite low, consistent with those in other courses on this platform and on other MOOC platforms. The participants' demographic variables were provided by the MOOC platform. Specifically, 57.8% were female, their ages ranged from 23 to 59 years (average = 41.98), and 53.9% held a bachelor's degree.

⁶ We used a 7-point Likert scale to measure the control variables, including MoocFrequency (e.g., 'I often use MOOC platforms for learning.'), MoocMotivation (e.g., 'I am very motivated to use MOOC platforms for learning.'), and CourseMotivation (e.g., 'I am very motivated to learn about Positive Psychology.').
⁷ The participants did not know about this quiz at the beginning of the experiment.
⁸ Repeated views were not counted, as the platform only recorded whether a participant had accessed and watched a full video.
Because NumVideos (as a measure of learning engagement) was a count variable, we used a Poisson regression model to estimate the effects of peer information and question interventions on this variable. We specify this model in Equations (1) and (2):

ln E(NumVideos_i) = α_0 + α_1 Peer_i + α_2 Question_i + Controls_i θ,  (1)

ln E(NumVideos_i) = α_0 + α_1 Peer_i + α_2 Question_i + α_3 Peer_i × Question_i + Controls_i θ,  (2)

where the independent variables Peer (i.e., the presence of peer information) and Question (i.e., the presence of question interventions) are binary. Specifically, Peer equals 1 for a participant assigned to a group with the peer information intervention, and 0 otherwise. Question equals 1 for a participant assigned to a group with the question intervention, and 0 otherwise. Peer × Question represents the interaction term of Peer and Question. Six control variables were included. As a participant's prior frequency of and motivation for using MOOC platforms were naturally correlated with that participant's engagement in MOOC learning, we controlled for these with the variables MoocFrequency and MoocMotivation, respectively. In addition, the participants' motivation to study the focal course prior to the experiment may also have influenced their learning engagement and outcomes; we therefore controlled for this with the variable CourseMotivation. General demographic variables, namely age (Age), gender (Gender), and level of education (Education), were also controlled for.¹⁰ We also performed a randomisation check and did not find any significant differences across the four groups in terms of these dimensions, attesting to the validity of our randomisation procedure (see Appendix C).

TABLE 2 Experimental procedure.
Another measure of learning engagement, NumPractices, was also a count variable; we thus used a Poisson regression model, presented in Equations (3) and (4):

ln E(NumPractices_i) = β_0 + β_1 Peer_i + β_2 Question_i + Controls_i θ,  (3)

ln E(NumPractices_i) = β_0 + β_1 Peer_i + β_2 Question_i + β_3 Peer_i × Question_i + Controls_i θ.  (4)

We subsequently tested the effects of peer information and question interventions on QuizScore, a measure of learning outcomes, as shown in Equations (5) and (6). As QuizScore (the number of correct answers a participant obtained) is a count variable, a Poisson regression model was applied here as well:

ln E(QuizScore_i) = γ_0 + γ_1 Peer_i + γ_2 Question_i + Controls_i θ,  (5)

ln E(QuizScore_i) = γ_0 + γ_1 Peer_i + γ_2 Question_i + γ_3 Peer_i × Question_i + Controls_i θ.  (6)
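For illustration, an interaction specification like Equation (2) can be estimated with standard tools. The sketch below is not the authors' code: it fits a Poisson regression with a Peer × Question interaction on simulated data, using hypothetical column names (Peer, Question, MoocMotivation, NumVideos) and only one stand-in control variable.

```python
# Minimal sketch of the Poisson model in Equation (2) on simulated data;
# column names and coefficient values are illustrative, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 320  # roughly the size of the study's sample

df = pd.DataFrame({
    "Peer": rng.integers(0, 2, n),        # 1 = peer-information group
    "Question": rng.integers(0, 2, n),    # 1 = question-intervention group
    "MoocMotivation": rng.normal(4, 1, n) # stand-in control variable
})
# Simulate a count outcome with a positive Peer x Question interaction
lam = np.exp(1.0 + 0.0 * df.Peer + 0.3 * df.Question
             + 0.6 * df.Peer * df.Question + 0.05 * df.MoocMotivation)
df["NumVideos"] = rng.poisson(lam)

# Poisson regression with the interaction term, as in Equation (2)
model = smf.poisson("NumVideos ~ Peer * Question + MoocMotivation",
                    data=df).fit(disp=0)
print(model.params["Peer:Question"])  # estimated interaction coefficient
```

Because the model is log-linear, exponentiating a coefficient gives the multiplicative change in the expected count associated with that regressor.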
Moreover, we supplemented our Poisson regression model with a negative binomial model as a robustness check. Table 3 presents the descriptive statistics and variance inflation factors (VIFs). The VIF values for all of the variables were less than 10, indicating that multicollinearity was not likely to be a problem (Chatterjee & Hadi, 2015).
Table 4 presents the correlations of all of the variables.
Furthermore, the interaction effect between peer information and the question intervention was positive and significant for both measures of learning engagement (α_3 = 0.60, p < 0.01; β_3 = 0.38, p < 0.05). Further analyses showed that for the learners who were presented with questions, the peers' learning behaviour information had a positive effect on NumVideos (α = 0.39, p < 0.01) and NumPractices (β = 0.29, p < 0.01). However, this effect was absent for the learners who were not presented with questions (α = −0.19, p > 0.05 for NumVideos; β = −0.11, p > 0.05 for NumPractices). We further illustrate this interaction effect in panels (a) and (b) of Figure 1. Hypothesis 5 is thus supported.
A similar interaction effect was found for learning outcomes. The coefficient of Peer × Question was positive and statistically significant (γ_3 = 0.13, p < 0.05). Further comparisons showed that, when the learners were presented with questions, the peers' learning behaviour information had a positive effect on QuizScore (γ = 0.12, p < 0.01).
However, when the learners were not presented with questions, the peer information intervention did not play a significant role (γ = 0.01, p > 0.05). We further illustrate this interaction effect in Figure 2. Hypothesis 6 is thus supported. A summary of the results of the hypothesis testing is presented in Table 6.
As a robustness check, we repeated the above estimations using a negative binomial model. Table 5 reports the negative binomial regression results for NumVideos (Column 2), NumPractices (Column 4), and QuizScore (Column 6).
All of the effects were consistent across the two regression models.

| Further analysis
We posited that peer information has a larger effect on learning engagement and outcomes when question interventions are presented because questions arouse learners' common interest (i.e., to resolve the question), thereby promoting a heightened sense of social identification when learners are informed about their peers' learning behaviour.
Learners who identify more strongly with the learning group will pay more attention to what their peers do, which makes the presence of peer information more influential. To obtain evidence on whether question interventions improve learners' attention to peer information, we asked the participants who were exposed to peer information how often they read their peers' practices posted in the WeChat groups (using a 5-point semantic scale). A total of 124 participants completed this questionnaire. The results indeed showed that, compared with the case without question interventions (M_without-question = 2.46, SD = 0.80), learners who received question interventions read their peers' practices more frequently (M_with-question = 2.72, SD = 0.90, β = 0.30, p < 0.05).
Furthermore, when the presence of questions prompts learners to pursue a common goal, the pace of their learning is likely to be aligned toward achieving that goal. Therefore, we also looked at the learners' timely learning behaviour, which reflected how well they followed the guidance they received (either with questions or with just a declarative statement). In our context, participants were advised to follow a 2-week learning process in which they were expected to learn specific course content in each 2-day session. We recorded the number of course videos that each participant viewed on schedule (i.e., before the recommended learning session ended) as a supplementary measure of aligned learning engagement. Consistent with our main findings, the results showed that the positive effect of peer information on this measure was present only when question interventions were applied.

TABLE 3 Summary statistics.

TABLE 4 Correlation table.
We also looked into other potential indicators of learning engagement and outcomes. We analysed the quality of the course practices ('Three Good Things') submitted by the participants, which we also viewed as a potential measure of learning engagement. Specifically, while there were no objective criteria by which to assess the quality of the learners' practice content, the length of a submission could reflect the author's active engagement in the practice, as longer submissions often require more effort (Ghose & Ipeirotis, 2011). We used the jieba package in R (a publicly available toolkit for Chinese word segmentation) to segment each submission into individual Chinese words and to count the number of words in each submission after filtering out stop words and punctuation. We used the average number of words per submission for each participant as another indicator of that participant's learning engagement.
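The length measure can be sketched as follows. The segmentation step is stubbed here with pre-tokenized input (the paper used the jieba package in R for segmentation), and the stop-word list and sample submissions are purely illustrative.

```python
# Sketch of the engagement measure: for each submission, drop stop words and
# punctuation, count the remaining words, and average the counts per participant.
from statistics import mean

STOP_WORDS = {"的", "了", "是"}        # illustrative stop-word list
PUNCTUATION = {"，", "。", "！", "？"}  # illustrative punctuation set

def countable_words(tokens):
    """Count tokens that are neither stop words nor punctuation."""
    return sum(1 for t in tokens if t not in STOP_WORDS and t not in PUNCTUATION)

# Hypothetical submissions for one participant, already segmented into words
# (real segmentation would come from a tokenizer such as jieba)
submissions = [
    ["今天", "同事", "给", "我", "买", "了", "咖啡", "。"],
    ["我", "完成", "了", "工作", "，", "很", "开心", "。"],
]
avg_words = mean(countable_words(s) for s in submissions)
print(avg_words)  # → 5.5
```

The per-participant average then serves as a single engagement score that can enter the same regression framework as the other measures.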
The results similarly showed that the interaction effect was positive and significant (β = 16.87, p < 0.05). For the learners who were presented with questions, the peers' learning behaviour information had a positive effect on the average number of words per practice submitted by each participant (β = 13.30, p < 0.01). However, this effect disappeared when learners were not presented with questions (β = −4.76, p > 0.05).
Furthermore, we analysed the learners' life satisfaction at the end of the 2-week course, which is another potential measure of learning outcomes. Our experimental course, Positive Psychology, aimed to help people learn how to establish a positive outlook on life. We adopted Brunstein's (1993) measurement of life satisfaction to gauge the extent to which learners could indeed apply what they had learned in their lives and maintain a positive attitude in daily life (a sample item being 'At present, I am completely satisfied with my life.'). The results showed no significant main effect of peer information (β = 0.07, p > 0.05) or question interventions (β = −0.12, p > 0.05) on learners' life satisfaction. However, the interaction effect was positive and significant (β = 0.36, p < 0.01). For the learners who were presented with questions, the peers' learning behaviour information had a positive effect on life satisfaction after the course (β = 0.44, p < 0.01). However, this effect was not observed for the learners who were not presented with questions (β = 0.09, p > 0.05). Thus, the results were generally in line with our results on learners' conceptual knowledge after learning.

FIGURE 2 Plot of interaction effects on learning outcomes (measured by QuizScore).

TABLE 6 Summary of hypotheses testing.

H1: The presence of information about peers' active learning behaviour positively influences learning engagement. [Supported]

| GENERAL DISCUSSION
This research focuses on an online learning context in which learners learn in isolation without any external incentives and examines the individual and interaction effects of information about peers' learning behaviour and question interventions on learning engagement and outcomes. Our findings show that, first, there was no universal positive effect of peer information on learning engagement and learning outcomes. This is plausible because, in the given context, it was difficult for the learners to identify with the other learners as a social group, as they could not perceive any common learning requirements or goals. As a result, the peers' active learning behaviours may have been perceived as less relevant, meaning that there was little pressure to conform to the others' behaviour (Christensen et al., 2004; Spears, 2021). Second, consistent with the literature, our results reveal that the question intervention positively affected the participants' learning engagement and outcomes (Hidi & Renninger, 2006; Schmidt et al., 2011). Specifically, presenting the learners with content-related questions prior to learning effectively triggered their exploration of information about the course content and activated their thinking during learning.
More importantly, our findings show an interaction effect between peers' learning behaviour information and the question intervention. Peer information had a strong positive influence on learning engagement and outcomes only when the learners were presented with question interventions. These learners were guided by the questions to explore information about the learning content, and they deemed their peers' active learning behaviour a demonstration of a common learning interest. A stronger identification with peers led to more conformity to group norms and a stronger normative influence. Our results thus highlight the key role of instructors' learning guidance, in the form of questions, in strengthening learners' identification with their peers and, in turn, normative influences in online learning.

| Theoretical implications
This work makes several contributions to the academic literature. First, as online learning has become more popular and even central to people's lives (Hew & Cheung, 2014; Huang et al., 2021), low learning engagement and poor learning outcomes have become growing concerns for online learning platforms (Goli et al., 2022; Wang et al., 2019).
Our study extends the online learning literature by focusing on a general online learning context in which learners learn in isolation without external incentives, a setting in which low learning engagement is a particular concern.
Prior research on online learning has mostly focused on promoting interactions among peers to enhance learning engagement, for example, on social or gamified online learning platforms (Wang et al., 2019; Zhang et al., 2017), or on featuring peer information to improve learning engagement in graded courses (Huang et al., 2021; Li et al., 2021).
This research contributes to this stream of literature by revealing that in a learning setting without external incentives, information about peers' active learning behaviour may not always affect learners' engagement. This is plausible because each learner has their own learning goal and, therefore, their own path and evaluation criteria (Eryilmaz et al., 2013; Lu et al., 2022). This extends the research on how peer information may encourage learners, deepening our understanding of peer influence. In line with theories related to normative influences, we found that peer information mattered to the learners only when they deemed peers' active learning a signal of common learning interest and, therefore, viewed it as a standard of behaviour (Straub, 2009; Tsai & Pai, 2021).
Second, our findings indicate that timely question interventions prior to learning may trigger learners' exploration of information about the learning content and significantly improve their learning engagement and outcomes. While guiding learners via content-related questions is a widely adopted practice in teaching, it has rarely been implemented or investigated in online learning contexts, plausibly because it is often difficult for online instructors to communicate with learners in real time (Akcaoglu & Lee, 2018; Xu et al., 2020). Because maintaining learners' interest in the learning content is critical in an isolated online learning context, this research highlights the importance of actively and continually guiding learners and piquing their interest throughout the learning journey. In the context of this study, question interventions were delivered in social media groups as a form of instructors' active communication with learners. This study also responds to calls in both the Information Systems and Education literatures for investigations into how to cultivate learners' intrinsic motivation (Kyewski & Krämer, 2018; Ryan & Deci, 2000). Future studies may further explore potential designs of question interventions, such as the specific format, number, and content of the questions.

| Practical implications
Our findings have practical implications for the design of information interventions for online learning platforms.
First, our results suggest that providing information about peers' learning may not always stimulate learners' engagement and enhance their learning outcomes in an online learning context. When there is no intervention to trigger learners' exploration of the learning materials, the positive effects of information about peers' behaviour may be largely weakened, as the learners have low identification with their learning peers (Chen et al., 2021; Kyewski & Krämer, 2018). Therefore, to achieve effective social interventions in online learning, platform designers should consider how to inspire and preserve learners' interest in the learning content; for example, platforms can incorporate push reminders with question guidance. Forming a learning community with common interests, thereby cultivating a high level of social conformity, is crucial for ensuring the effectiveness of peer information interventions. In addition to online learning settings without external incentives, the findings could also be applied to some offline contexts where learners' sense of learning community (their identification with the group of learners) is generally weak, such as occasional, short-term training. However, the moderating role of question interventions on peer influence may be less evident in cases where external incentives are salient.
Second, we believe that effective communication and guidance from instructors are particularly important in the online learning setting because such learners are socially isolated. Our study suggests that a basic form of guiding learners' learning progress, namely instructors raising timely and relevant questions prior to each learning session, can be an effective means of engaging learners. This can be achieved by organising online learners into social media groups, which makes communicating with learners as easy and efficient as other daily social networking activities. For online courses with a much larger learner base, social media groups may be difficult to manage; in this case, question interventions can also be implemented on learning platforms in the form of push notifications or emails (Huang et al., 2021; Leung et al., 2023). Various other online communication channels could also be used to help instructors guide learners through the course content (Hull et al., 2019; Jung et al., 2020). Nonetheless, the effects of such information may be weakened if it is sent as a system prompt rather than as a message from the teaching team (especially the instructor), as learners can more directly perceive the involvement of the teaching team in the latter case and are hence likely to pay more attention to the information. Therefore, when question interventions are sent by systems on a large scale, specific designs may be needed to make them more attractive.

Hypothesis 5. Presenting information about peers' active learning behaviour has a stronger effect on learning engagement when learners are presented with question interventions than when no question interventions are presented.

Hypothesis 6. Presenting information about peers' active learning behaviour has a stronger effect on learning outcomes when learners are presented with question interventions than when no question interventions are presented.

| RESEARCH METHODOLOGY

| Research context

We collaborated with a large-scale MOOC platform to conduct a field experiment to test our hypotheses. The platform is one of the leading online learning platforms in China. At the time this study was conducted, there were over 30 million registered learners on the platform, more than 2600 courses offered, and over 78 million total course enrolments. Each course is composed of multiple videos pre-recorded by instructors. Once learners enrol in a course, they can access course videos and other learning materials.
Correlation coefficients in bold indicate statistical significance at a 5% level or higher.

FIGURE 1 Plots of interaction effects on learning engagement. (a) NumVideos. (b) NumPractices.
H2: The presence of information about peers' active learning behaviour positively influences learning outcomes. [Not supported]
H3: The presence of question interventions positively influences learning engagement. [Supported]
H4: The presence of question interventions positively influences learning outcomes. [Supported]
H5: Presenting information about peers' active learning behaviour has a stronger effect on learning engagement when learners are presented with question interventions than when no question interventions are presented. [Supported]
H6: Presenting information about peers' active learning behaviour has a stronger effect on learning outcomes when learners are presented with question interventions than when no question interventions are presented. [Supported]
This research is not without limitations. First, this study involved only one course (a psychology course), which may limit the generalisability of our findings. Online learning platforms offer courses that vary in many dimensions, such as the level of difficulty. The effectiveness of question interventions may vary depending on learners' ability to form conjectures about unfamiliar topics or subjects in relatively difficult courses. Future studies may examine whether various characteristics of courses moderate the effects of information interventions, or how to design information interventions to engage learners in relatively challenging courses. Second, in this study, the intervention messages were manually designed by the teaching team and were identical for all learners. With advancements in artificial intelligence technologies, future research can explore the possibilities of designing personalised and adaptive interventions that consider each learner's progress. By dynamically adjusting questions based on individual preferences and past performance, such an adaptive approach could ensure that learners receive content tailored to their unique interests. Further investigation is needed into how data-driven instruction can optimise the effectiveness of question interventions on online learning platforms. Third, our field experiment lasted for only 2 weeks, which aligned with the instructor's recommended learning period. Future studies may explore the effects of information interventions through long-term, multi-stage experimental designs. Learners' responses to interventions may change over time as they progress in their learning journey. Examining the long-term effects would provide valuable insights into the adaptability and robustness of the effects of information interventions in various learning contexts.

APPENDIX B

TABLE B1 Course reminders in social media groups (English translations).
TABLE 5 Regression results. Bootstrapped standard errors are in parentheses (for both the Poisson and negative binomial models).
With question intervention (Chapter 4): Is perfectionism good or bad? Do you think perfectionism can be an enemy of grit? Please continue with Chapter 4 (grit and self-discipline) to find the answer.
Without question intervention (Chapter 4): Perfectionism can be an enemy of grit. Log onto the platform to watch the videos and develop your grit. Please continue to study Chapter 4 (grit and self-discipline).
With question intervention (Chapter 7): What is the value of these lessons in life? Please continue with Chapter 7 (application of Positive Psychology) to find the answer.
Without question intervention (Chapter 7): Taking this class is the first step to a positive life. Log onto the platform to watch the videos and apply the lessons to our daily life. Please continue to study Chapter 7 (application of Positive Psychology).