Applying and Evaluating Visualization Design Guidelines for a MOOC Dashboard to Facilitate Self-Regulated Learning Based on Learning Analytics
Massive Open Online Courses (MOOCs) have brought a paradigm shift to higher education by enabling diverse learners from all over the world to access open, free, high-quality content . One of the main characteristics of MOOCs is that they give diverse learners an opportunity to personalize their learning in terms of topic, time, place, and method. Learners in MOOCs have different learning objectives and plan to utilize online courses according to their personal needs. Such characteristics lead learners to choose their own personalized paths through self-regulated learning (SRL). At the same time, low completion and high drop-out rates have been key issues in MOOCs, even as their popularity has grown .
In fact, obtaining a certificate could be the objective for some students, who would then need to study the entire content of the MOOC. Other students, however, might plan to study only the topics that interest them, or to skip content they have already studied. In other words, students' purposes for enrolling in a MOOC differ. Thus, MOOCs require students to be autonomous and, further, self-regulated during the learning process, since learners start studying alone and keep learning without direct contact with teachers and peers .
As the field of learning analytics and big data for education has been rapidly growing, MOOC platforms should provide personalized learning environments with differentiated feedback for each individual student who sets his or her own learning goals. In a MOOC learning context, all activities of learners and teachers can be digitized and collected as meaningful educational big data for data mining . Through a dashboard, MOOCs can collect, analyze, and utilize students' abundant log data about their conditions and learning styles. Despite this potential, most students do not have suitable SRL competencies, or they easily become bored and drop out . Therefore, MOOC dashboards should support students' continuous, persistent learning and their successful performance by promoting the SRL cycle suggested by Zimmerman . Although a large body of prior research has noted the importance of helping students gain SRL abilities, there has been little research on improving SRL through the dashboard. Recently, Qu and Chen  indicated that visual analytics could play a crucial role during the MOOC learning process, supported by the rapid improvement of MOOC platforms. However, it is still challenging to extract meaningful information from data on students' activities and to visualize the data intuitively and effectively, especially in terms of SRL. Accordingly, this study aims to develop and validate design strategies for visualizing a MOOC dashboard to facilitate SRL based on learning analytics. To achieve this goal, we first systematically reviewed relevant previous studies, analyzed the learning analytics functions of typical extant MOOC dashboards such as edX, K-MOOCs, Coursera, Khan Academy, and FutureLearn, and then derived design guidelines that can be applied to the design and development of MOOC dashboards.
This study focuses on the critical significance of MOOC dashboard visualization as visual feedback that facilitates students' SRL. It thus aims to provide guidelines for visually designing and developing a MOOC dashboard to enhance students' SRL. We conducted our study to address the following research questions:
1) What are the visualization guidelines for a MOOC dashboard that promotes SRL based on learning analytics?
2) How do educational technology experts validate a prototype of a MOOC dashboard that applies such visualization guidelines to promote SRL?
3) How do users/learners evaluate a prototype of a MOOC dashboard that applies such visualization guidelines to promote SRL?
2. Theoretical Background
MOOCs are more interactive than traditional online courses in the sense that MOOC dashboards play an essential role in analyzing students' learning history and offering appropriate instructional interventions, in addition to delivering learning resources such as video lectures, reading materials, textbooks, and exams. Another difference from other forms of online learning and e-learning, which focus on participants' knowledge achievement, is that MOOC learners are motivated by their individual interests and learning experiences . The majority of MOOC learners describe their intentions for taking courses as having fun, enjoying new experiences, and building social relationships around their academic interests. Therefore, an indicator of MOOC completion might be process-focused rather than outcome-focused , since the aims and motivations of MOOC participants vary widely . This is also demonstrated by a Harvard University study which found that 39 percent of its learners had current or former teaching backgrounds . This suggests that a main objective of MOOCs is to promote active learning experiences and SRL.
In fact, SRL strategies have been studied for decades in traditional face-to-face (F2F) classroom settings. A widely accepted definition of SRL is the ability to plan, control, and manage the learning activities and processes that enhance goal achievement . One of the most established models of SRL in the literature is Zimmerman and Martinez-Pons's  model, which comprises ten SRL strategies. Furthermore, Zimmerman and Campillo  suggested the cyclic phase model of SRL, which comprises a forethought phase, a performance phase, and a reflection phase. SRL strategies can help a learner develop and sustain learning for better engagement and learning outcomes. The present research builds on Zimmerman and Martinez-Pons's  model of SRL and Zimmerman and Campillo's  cyclic phase model of SRL. The main reason is that MOOCs are learner-directed: learners plan their own course of learning, carry out that learning by themselves, and evaluate (or at least reflect on) their own learning, especially when they do not aim to obtain a certificate.
Students who appropriately apply SRL skills are more successful in higher-order thinking such as critical thinking, problem solving, and reasoning . Moreover, they tend to show higher academic performance, motivation, and learning interest . In online as well as face-to-face (F2F) learning environments, there is growing empirical evidence for the role of SRL strategies in student engagement and learning outcomes . Taken together, a large body of prior research in online as well as traditional learning environments suggests that prompting learners with SRL strategies can improve their academic performance and success in learning.
Prior studies have investigated applications of SRL strategies in online learning environments. Such online SRL strategies include goal setting (e.g. ), planning (e.g. ), time management (e.g. ), metacognition (e.g. ), motivation (e.g. ), self-monitoring (e.g. ), and effort regulation (e.g. ).
In recent research by Kizilcec and colleagues , the SRL strategies of self-monitoring, self-evaluation, goal setting/planning, time management, effort regulation, and help seeking were identified as strategies learners can use to succeed in MOOCs. Thus, success in learning with MOOCs depends on SRL, which involves setting up students' learning goals, exploring efficient, effective, and appealing ways to learn, and monitoring learning progress toward those goals. The emerging field of learning analytics is related to similar developments in big data, e-science, web analytics, and educational data mining . In MOOC environments, learning analytics is realized mainly through the dashboard. The wide variety of heterogeneous students often find it difficult to use the analytical tools provided to explore the data on MOOC platforms. To tackle this problem, visual dashboard systems for analyzing learning behaviors should be developed for MOOC data based on learning analytics . In line with the results of the research conducted by Schaffer, Huynh, O'Donovan, Hollerer, Xia, and Lin , student behaviors relating to SRL in the context of MOOCs are categorized according to their features.
In recent years, the success of visual dashboard systems and related models has depended on achieving the intended behaviors of students. To visualize learning patterns in MOOCs, Chen, Davis, Lin, Hauff, and Houben  developed statistical models to predict drop-outs based on learner activity logs. Duval  proposed goal-oriented visualizations of learners' activities that helped learners perceive their learning progress and manage their future activities. Visualizing MOOC dashboards based on learning analytics can help instructors, operators, and institutions (i.e., university or company administrators) guide students' behaviors interactively and support their persistent learning. From the perspective of students, well-visualized MOOC dashboards might support SRL by providing appropriate information on students' behaviors. Recently, Authors  suggested design strategies for visualizing information on MOOC dashboards on the basis of learning analytics.
3. Research Method
3.1 LM-Dashboard Design and Visualization Strategies
The LM-Dashboard in this study was designed as a dashboard prototype for MOOC environments based on the cyclic phase model of SRL suggested by Zimmerman and Campillo . In particular, the LM-Dashboard focused on the reflection phase among the three cyclic phases, where a dashboard plays a role in providing and visualizing learning data by analyzing students' behaviors and activities. It also helps students become aware of, reflect on, and make sense of their progress and achievements toward the learning objectives through learning analytics . This study aims to design and develop LM-Dashboard prototypes whose visualization strategies, informed by learning analytics, can promote SRL in MOOC environments, on the premise that goal setting and planning have been done by students in the forethought phase and that learning on the MOOC platform occurs in the performance phase. Therefore, the achievement figures on the LM-Dashboard represent individualized, customized quantitative and qualitative data compared against the learner's plans. The theoretical background of such quantitative and qualitative data lies in the sixth visualization strategy (6.1) to promote SRL on the MOOC dashboard, shown in Table 3.
In the first prototype, the LM-Dashboard consists of three main menus, Course Progress, Learning Activities, and Learning Evaluation, as shown in Fig. 1. The Course Progress page contains (a) Weekly learning achievement progress (compared to the learner's plans); (b) Weekly learning records; (c) Weekly activity records; and (d) a Pop-up menu for weekly activity achievement records. The Learning Activities page includes (a) Activity records for quizzes; (b) Activity records within forums; (c) Activity records for assignments; (d) Activity records for reflection journals; and (e) a Pop-up menu for Content Analysis. On the Learning Evaluation page, students can identify (a) My current status; (b) Course results and comparisons with peers; and (c) My achievement badges.
3.2 Research Procedure and Methods
To achieve this research objective, this study was conducted according to the following procedure. First of all, the LM-Dashboard prototype was developed to embody the visualization strategies of MOOC dashboards for promoting individual students' SRL, based on the design guidelines for visualizing MOOC dashboards from the learning analytics perspective in Table 2. Then, in order to empirically validate the visualization strategies and evaluate the perceived effectiveness of the LM-Dashboard prototype, the Phase I usability study was conducted with eight educational technology professionals to validate the visualization strategies and improve the usability of the LM-Dashboard. The Phase I usability study focused on the validity of each visualization item and on additional comments to improve the visualization strategies.
After gathering comments from the Phase I usability study, the LM-Dashboard prototypes were revised and redesigned based on the Phase I results. The Phase II usability study was then conducted with the revised version of the LM-Dashboard to improve the student user experience and evaluate the perceived SRL effectiveness in an iterative manner. Accordingly, students, as the end users of the LM-Dashboard, participated in the Phase II usability study. From a user-centered design standpoint, iterative usability evaluations with both experts and users contribute to making the final product more user-friendly . Therefore, in this study, user-centered design methods were adopted to validate and improve the user experience of the LM-Dashboard from both expert and user perspectives.
3.3 Participant Selection & Instruments
In the Phase I study, eight educational technology professionals were recruited. All eight participants are experts in their field with a PhD in educational technology or computer science education and have been working in educational technology sectors for more than five years. Most of them also have experiences with MOOC design and development research.
The instrument for evaluating the LM-Dashboard prototype in the Phase I study was a questionnaire focused on validating the visualization strategies integrated into the MOOC dashboard for assisting learners with SRL. The questionnaire asked experts to rate, on a 5-point Likert scale, whether each visualization item on the dashboard pages could help promote SRL, with space for open-ended and additional comments. The prototype pages were captured as screenshots, and the visualization items on each page were explained based on previous studies.
In the Phase II study, students participated in a usability evaluation of the LM-Dashboard prototype revised according to the expert reviews from the Phase I study. Nielsen  found that 5 users could find more than 85% of the usability problems in a design, so he recommends that multiple iterative usability evaluations with 5 users each tend to be more effective than a single usability evaluation with many participants. In this study, to achieve the usability objectives of the LM-Dashboard prototype, two evaluations, with experts and with users, were conducted in an iterative manner, and the number of participants in the Phase II study was set at six users, at which point roughly 90% of usability problems might be found according to Nielsen's  graph. A consent form was collected with the questionnaire, and a small incentive was offered to the participants.
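Nielsen's curve follows a simple model in which each additional evaluator finds a fixed proportion of the remaining problems. As a minimal sketch, assuming Nielsen's reported average single-user detection rate of about 0.31, the expected coverage for five and six users can be computed as follows:

```python
def problems_found(n_users, p_single=0.31):
    """Expected share of usability problems found by n evaluators,
    following Nielsen's model 1 - (1 - L)**n with L ~= 0.31 per user."""
    return 1 - (1 - p_single) ** n_users

for n in (1, 3, 5, 6):
    print(f"{n} users -> {problems_found(n):.0%} of problems found")
# Five users give roughly 84-85% coverage; six users approach 90%.
```

This is why the sixth participant adds only a few percentage points of coverage over five, and why iterating with small groups is more efficient than one large evaluation.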
3.4 Data Analysis
In the Phase I study, the data from the quantitative validity scale were analyzed with descriptive statistics, namely the mean and standard deviation. Content analysis was adopted to analyze the educational technology professionals' qualitative opinions. Since the purpose of this study is to validate the design features for visualizing the LM-Dashboard to promote SRL, the coding scheme focused on the relationship between the visualization strategies and the SRL effects of each feature. Accordingly, the qualitative comments were categorized by the visualization items on the LM-Dashboard shown in Table 3.
In the Phase II study, the questionnaire responses on the working high-fidelity prototype were analyzed from both quantitative and qualitative perspectives. The Phase II study focused on learners' understandability, usefulness, and usability  and their perceived SRL effectiveness . Therefore, in addition to the descriptive statistics, a qualitative analysis was conducted to classify participants' comments into perceived strengths and weaknesses of the LM-Dashboard in terms of SRL and visualization.
3.5 Reliability and Validity
Peer debriefing among the collaborating researchers was adopted as a qualitative verification technique . Throughout the research, the collaborating researchers regularly held in-depth discussions on the features and issues of the research process. In addition, this study validated the visualization strategies integrated into the MOOC dashboard for promoting learners' SRL by triangulating data through both quantitative and qualitative approaches, with diverse methods and with both experts in the field and MOOC users (students) .
This study's limitations include its confined context as well as the small groups of educational professionals and student participants. The context and participants are not representative of MOOC users around the world, so these small samples limit generalizability to all MOOC users. Accordingly, this study did not aim to generalize its results, but to demonstrate how a MOOC dashboard might promote SRL for users and to validate the visualization strategies (principles) and the dashboard (practice) in MOOC environments from the perspectives of educational professionals and users. Directions for future research on developing the visualization guidelines (principles) and dashboard design strategies (practice) for learners' SRL are discussed at the end of this paper.
4.1 Phase I Study: Validity evaluation of the visualization strategies by experts
The educational technology experts' responses on the five-point Likert scale validating the MOOC dashboard designs based on the visualization strategies to promote SRL, together with their qualitative comments, are presented in Table 6. First, the descriptive statistics, including means (M) and standard deviations (SD), on the validation scale showed that the experts rated most of the visualization items for promoting SRL on the LM-Dashboard prototype as highly valid, with means above 4.0, except for four items (1.2, 2.4, 2.5, 3.3).
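This analysis step can be sketched as follows. The item IDs follow the paper, but the rating values here are invented for illustration; the point is the mean/SD summary and the 4.0 threshold used to flag items for a closer look at the qualitative comments.

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert ratings from the eight experts per item;
# the item IDs follow the paper, but the score values are illustrative only.
ratings = {
    "1.1": [5, 4, 5, 4, 4, 5, 4, 5],
    "1.2": [4, 3, 3, 4, 3, 4, 3, 4],  # an example low-rated item
}

for item, scores in ratings.items():
    m, sd = mean(scores), stdev(scores)
    flag = "  <- below 4.0: review qualitative comments" if m < 4.0 else ""
    print(f"Item {item}: M={m:.2f}, SD={sd:.2f}{flag}")
```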
The reasons for the four low-rated items can be derived from the qualitative comments. First, five experts argued that social comparison might not be very effective for the course progress feature, compared with social comparison in terms of achievements. Furthermore, four experts commented that individual comparisons on the social network might not be perceived positively (items 1.2 and 3.3), and could even invade students' private information. The comments on social comparison may have arisen from the lack of accurate information about the criteria for how the social comparison was calculated and who was being compared in the learning analytics. In fact, the experts raised questions about the exact criteria behind the x-axis and y-axis values of the graphs rather than about the visualization features themselves. On the Learning Activities page, the difference between the y-axis values for % of participation and % of effort was ambiguous, so five experts said that the y-axis values should be made clearer.
Second, most of the experts noted that reflective activities might not be performed frequently enough to be meaningfully analyzed and visualized in a word cloud (items 2.4 and 2.5). This is because, in current MOOC environments, reflection is not a main learning activity that students are expected to complete. From this perspective, the analysis again revealed that the experts evaluated the information derived from the learning analytics rather than the visualization features themselves.
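The analysis behind such a word-cloud view is essentially a term-frequency count over the journal texts. A minimal sketch, with invented journal entries and a toy stopword list, might look like this:

```python
import re
from collections import Counter

# Invented reflection-journal entries and stopword list, for illustration only.
STOPWORDS = {"the", "a", "and", "i", "to", "of", "was", "it", "this", "my", "me"}

journals = [
    "This week I reviewed the quiz and planned my next goals.",
    "The video lecture helped me plan my study schedule and goals.",
]

tokens = [
    word
    for text in journals
    for word in re.findall(r"[a-z']+", text.lower())
    if word not in STOPWORDS
]
# The most frequent terms would drive font size in the word cloud.
print(Counter(tokens).most_common(3))
```

With only a handful of short entries, as the experts pointed out, the frequency counts are too sparse for the resulting cloud to be meaningful.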
Most of the experts made concurring comments on the following aspects. First, they thought that the traffic light and digital badges might be very effective in encouraging students' SRL, although some experts were unsure which type of graph would be most intuitive; one expert, for instance, argued that a radar graph might not be very informative. Since the experts were not visualization professionals, it appeared difficult for them to judge the visualization features in a professional way. Second, they recommended that a simple information architecture and design on the dashboard would be more effective and intuitive than presenting too much information. Finally, it is noteworthy that one expert insisted that quantitative learning records in MOOC environments are not as valuable as those in offline and other online learning, because massiveness and openness can lead to less rigorous evaluation by instructors.
The revisions made to the LM-Dashboard prototype after Phase I are as follows:
* On the course progress page, detailed weekly information such as weekly learning records and activity records was moved into a pop-up that appears when a specific week is clicked. Other repetitive features and data were deleted.
* On the learning activity page, the criteria behind the y-axis values of each activity are now shown in an intuitive way, for example, time spent watching video clips and the number of posts in discussions and Q&A, in terms of comparing a student's individual progress to his or her plan.
* On the learning evaluation page, private information about individual comparison was encrypted.
4.2 Phase II Study: Learners' user experience and perceived SRL effectiveness evaluation
The questionnaire items evaluating learners' experiences with the 2nd LM-Dashboard asked whether it is useful, understandable, and usable for their SRL. Table 7 presents learners' ratings on the user experience survey questions for each visualization feature of the LM-Dashboard. The average scores ranged from 4.00 to 4.33, which is relatively high. These results suggest that the visualization features and information on the LM-Dashboard offered a positive learning experience.
Table 8 presents the learners' ratings on the survey questions about their perceived SRL effectiveness when using the LM-Dashboard. These questions consisted of seven summative evaluation items about the LM-Dashboard prototype in terms of perceived SRL effectiveness, developed based on previous studies . The average scores ranged from 2.67 to 4.33. As shown in Table 8, three questions (Q1, Q2, Q7) had means higher than 4.0, indicating that some aspects of SRL, such as monitoring learning progress, evaluating learning outcomes, and managing time effectively, would be improved through the LM-Dashboard experience. However, four items (Q3, Q4, Q5, Q6) had means lower than 4.0, which can be interpreted as these SRL aspects not being sufficiently addressed by the LM-Dashboard. The reasons are analyzed in alignment with the findings from the qualitative comments at the end of this section.
The results of the analysis of the qualitative comments are presented in Appendix 1. Learners' feedback on the open-ended questions about user experience and perceived SRL effectiveness was likewise classified into perceived strengths and weaknesses of the LM-Dashboard in terms of SRL and visualization. These qualitative comments support the validity of some of the design guidelines for visualizing a MOOC learning dashboard to promote SRL based on learning analytics. On the strengths side, most learners discussed the positive impact of the design features of the LM-Dashboard prototype on aspects of their SRL, such as its effectiveness for monitoring and evaluating the status of their learning progress against their plans at a glance, its usefulness in identifying which parts they need to study more, and the sense of accomplishment and motivation fostered by the digital badges and achievement reports.
From the weaknesses of the prototype noted by learners, however, the following suggestions can be made for future improvements of the visualization design guidelines for MOOC dashboards. First, Learners A, C, and F pointed out that social comparison of personal scores can demotivate students; they suggested that the MOOC dashboard should instead provide only the average score of peers, or the achievement level against their own step-by-step goals for SRL. In fact, the social comparison feature was one of the topics on which opinions were divided. Second, on the course progress page, Learner B mentioned that it would be more useful if peer reviews or comments on their postings were added. Third, on the learning activities page, Learner F reported that appropriate instructor tips could help move students' performance from the 'attention' level to the 'excellent' level. Fourth, Learners B and C asked for concrete and accurate criteria for acquiring badges or gaining scores, and Learner E asked for clear, at-a-glance descriptions of the figures in the graphs. With regard to the overall user interface, it was suggested to eliminate complex information, increase readability, and adjust the amount of information on each page appropriately.
In summary, the results showed that most learners reported positive learning experiences with the 2nd LM-Dashboard in terms of understandability, usability, and usefulness, but presented mixed opinions about its perceived SRL effectiveness. Learners gave high ratings to its help in managing their own learning progress, evaluating learning outcomes, and managing time, but low ratings to its help in building confidence to achieve their learning goals and keep their learning on track. Regarding motivation, the average score was 3.67, below 4.0 yet still promising. These results indicate that the 2nd LM-Dashboard prototype provided learners with a positive learning experience in monitoring their learning progress, evaluating themselves, and managing their learning time, which play a role in promoting some SRL competencies . However, learners did not give confident answers about its effectiveness in keeping them on track with their plans and helping them achieve their goals without giving up. They thought that the learning data analyzed and visualized on the LM-Dashboard prototype would not be enough to promote SRL. The reasons can also be found in the qualitative analysis of the open-ended comments: most learners asked for prescriptive tips and recommendations based on their learning progress and status, in addition to the visualized analysis. In this respect, the visualized information might be more helpful for promoting SRL when accompanied by prescriptions and recommendations.
The revised and suggested features in visualizing the LM-Dashboard prototype according to learner evaluation results in Phase II are as follows:
* On the course progress page, prescriptive tips on progress and the recommended next stage are provided to keep learners on track and help them achieve their learning objectives. In addition, detailed information on the status and date of assignment submissions is offered. On the learning activity page, instructor advice or peer tips are added to motivate students whose level of learning activity is significantly lower than others'.
* On the learning evaluation page, self-set goals or step-by-step recommended goals are presented to help students who struggle with confidence. Additionally, fair and clear achievement criteria and related information are offered.
* In the overall interface design, optional toggle buttons for the social comparison and badge features, as well as for the analyzed individual data, are offered so that learners can display only the information they prefer. Duplicate, repetitive, or redundant information should be minimized.
Prescriptive comments on MOOC dashboards constitute a substantial research topic in their own right, so this feature is suggested for future work; the complete feature could not be implemented in our LM-Dashboard prototype.
5. Discussion and Implications
As mentioned in the earlier limitations section, the purpose of this study is not to generalize the findings, but to provide MOOC designers and professionals on learning analytics with design guidelines and visualization strategies on the MOOC dashboard to promote SRL.
However, the results on perceived SRL effectiveness from the student participants revealed that some SRL aspects were not addressed by the LM-Dashboard. Learners judged that the visualization features positively influenced monitoring their learning progress and assessing their achievement with efficient time management, but might not help them complete their learning goals with persistence by overcoming difficulties in learning. They additionally requested prescriptive tips and recommendations to keep on track and achieve their learning objectives.
The implications of this study's results can be discussed as follows. First, deciding which type of graph is most understandable and intuitive to learners was a crucial and difficult task. The experts' opinions differed on which type of graph would be more effective for specific visualization features. They agreed on adopting a line chart for comparisons among two or three values related to a student's activity plan and progress, but a slider chart was recommended instead of the traffic light, and one expert thought that the radar chart was not informative. To decide on the revisions for the second prototype, previous research on dashboard visualization was consulted. The traffic light can be used to emphasize an early warning signaling the learner's status, in contrast to a progress bar , so we decided to retain the traffic light feature, since our purpose in using it is to give learners appropriate signals about their progress. However, we expanded the traffic light from three color steps to five to adopt the advantage of the slider chart. Regarding the radar graph, Lupu-dima, Corbu, and Edelhauser  argued that this type of graph shows multi-dimensional data intuitively: it embodies several metrics in a small space and makes them easily comparable through layered colors. We judged that these advantages of the radar chart might be attractive to learners during the first revision of the prototype, and learners in Phase II indeed evaluated the radar graph as useful for viewing information at a glance.
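The five-step traffic light can be thought of as a thresholding of a learner's progress relative to plan. The sketch below is illustrative only, not the paper's implementation: the threshold values and color names are assumptions chosen to show how three coarse signals become five finer ones.

```python
# Illustrative mapping of progress-vs-plan onto a five-step traffic light,
# as the revised prototype expanded the signal from three to five color steps.
# Thresholds and color names below are assumptions for illustration.
LEVELS = [
    (0.95, "green"),        # on or ahead of plan
    (0.80, "light-green"),
    (0.60, "yellow"),
    (0.40, "orange"),
    (0.00, "red"),          # far behind plan: early warning
]

def traffic_light(progress, planned):
    """Return a color step for progress achieved vs. progress planned."""
    ratio = progress / planned if planned else 1.0
    for threshold, color in LEVELS:
        if ratio >= threshold:
            return color
    return "red"

print(traffic_light(10, 10))  # on plan
print(traffic_light(9, 10))   # slightly behind
print(traffic_light(3, 10))   # early warning
```

The finer gradation borrows the slider chart's advantage (a continuous sense of position) while keeping the traffic light's at-a-glance warning semantics.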
These examples show that each visualization type and method for individual information from learning analytics has its own characteristics and merits, so designers should choose the type of visualization based on its purpose; as Santos, Govaerts, Verbert, and Duval  emphasized, the visualization should be linked to its intended purpose.
Second, the opposing opinions of both experts and students on social comparison were considered. Some experts argued that, because the purpose of the dashboard in this study is to facilitate individual students' SRL, self-defined goals and progress comparison are more important than social comparison; other experts endorsed the merits of social comparison. Student participants were likewise divided on the peer comparison feature. Since this study did not conduct experimental or quantitative research, selecting or generalizing about a specific feature was not the focus. Davis, Jivet, and Kizilcec  recently found that social comparison in MOOC environments contributed to increasing students' completion rates, and Zimmerman and Campillo  also suggested social comparison, even though their studies were conducted in traditional learning environments. However, individual comparison with a specific peer group should be carefully designed: one learner participant suggested that the average score of peers might be more informative than the individual scores of peers. In the research of Davis and colleagues , successful groups of learners and role models among peer groups were adopted to support social comparison and give social cues in the visualization. In addition, it might be better to give accurate criteria and guidelines on how students should interpret the social comparison in terms of SRL. Davis and colleagues  developed their social comparison visualization based on an analysis of role models and successful learners in a personalized feedback system. Our conclusion regarding the social comparison feature is therefore to give students an option to toggle the feature on or off, based on the visualization criteria of Davis and colleagues .
However, in-depth future research is suggested on the criteria for selecting peers and on more effective visualizations of social comparison to promote SRL in MOOC environments.
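The two design decisions discussed above, exposing only the peer average rather than individual peers' scores and letting the student toggle the feature off, can be sketched in a few lines. This is a minimal illustration under assumed names; the function and field names are not from the paper.

```python
from statistics import mean
from typing import Optional

def social_comparison(my_score: float, peer_scores: list,
                      enabled: bool = True) -> Optional[dict]:
    """Return an aggregate peer comparison, or None when the student has
    toggled the feature off. Only the peer-group average is exposed;
    individual peers' scores never leave this function."""
    if not enabled or not peer_scores:
        return None
    peer_avg = mean(peer_scores)
    return {
        "my_score": my_score,
        "peer_average": round(peer_avg, 1),
        "above_average": my_score >= peer_avg,
    }
```

Returning `None` when disabled lets the dashboard simply hide the widget, while the aggregated `peer_average` addresses the privacy concern raised by the learner participant.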
Thirdly, some expert and student participants also discussed the effectiveness of the digital badge feature. In previous studies, Anderson and Staub emphasized the merit of the badge as an authentic assessment tool for performance-based activity, and Fain argued that digital badges represent students' learning experiences. In particular, Mah reported that digital badges positively influenced students' motivation, planning, and retention. Accordingly, even though two experts and one learner questioned the merit of the digital badge, the final prototype retained the feature, given the divided opinions among experts and learners as well as the findings from previous studies. Whether or not the digital badges prove effective, it is more important to visualize accurate criteria for awarding them, as two student participants requested in the usability evaluation; establishing such award criteria and validating the badges' effectiveness in MOOC environments is therefore suggested. Furthermore, personalized menu options should be developed so that students can select specific buttons to see only their preferred information.
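The call for accurate, visible award criteria can be made concrete by expressing each badge as an explicit published rule that the dashboard can both evaluate and display. The badge names and thresholds below are hypothetical assumptions for illustration only.

```python
# Hypothetical badge rules: names and thresholds are assumptions, not taken
# from the paper. Keeping each rule explicit means the same definition can
# drive both the award logic and the criteria shown to students.
BADGE_RULES = {
    "steady_learner": lambda s: s["weeks_active"] >= 4,
    "quiz_master": lambda s: s["avg_quiz_score"] >= 90,
    "discussant": lambda s: s["forum_posts"] >= 10,
}

def award_badges(stats: dict) -> list:
    """Return the badges whose published criteria the learner meets."""
    return [name for name, rule in BADGE_RULES.items() if rule(stats)]
```

Because the criteria live in one declarative table, showing students exactly why a badge was or was not awarded becomes a rendering task rather than a reverse-engineering exercise.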
Fourthly, learner participants asked for prescriptive comments alongside the visualization of learning progress on the dashboard. They argued that intuitive visualization of their learning patterns and analysis data would be more helpful if accompanied by recommendations on their learning. There are many previous studies on learning analytics and recommendations, but their research contexts differed in that they were not MOOC environments. In fact, MOOCs make it possible to collect big data from diverse learners all over the world. In this respect, in-depth studies are suggested on customized prescriptions and recommendations, aligned with individual learning paths and progress based on learning analytics, to promote SRL.
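A minimal form of such a prescriptive comment is a rule that turns the gap between planned and actual progress into a short recommendation. This is a sketch under assumed thresholds; a real system would draw on richer analytics than two percentages.

```python
def prescriptive_comment(planned_pct: float, actual_pct: float) -> str:
    """Turn the gap between planned and actual progress into a short
    recommendation. The 10-point threshold is an illustrative assumption."""
    gap = planned_pct - actual_pct
    if gap <= 0:
        return "You are on track. Keep following your study plan."
    if gap <= 10:
        return "You are slightly behind. Try to finish one extra unit this week."
    return "You are falling behind your plan. Consider revising your weekly goals."
```

Pairing a comment like this with the progress graph addresses the learners' request that the visualization tell them not only where they stand but what to do next.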
Finally, both experts and students asked for simple visualization of the important components from learning analytics, without repetitive content. They argued that vividly colored badges and numerous graphs analyzed in varied ways interfered with intuitive understanding. Therefore, it can be concluded that presenting crucial features with simple and intuitive visualization on the dashboard is more efficient and effective.
6. Conclusion and Future Directions
Most dashboards fail to communicate efficiently and effectively, not because of inadequate technology (at least not primarily), but because of poorly designed implementations. The LM-Dashboard prototype has shown us how important it is to offer instructional interventions such as feedback, advice, and tips, as well as to collect and analyze the learning experiences of MOOCs and visualize individual learners' progress and activities in an intuitive and efficient way for a successful SRL cycle. This study provides useful insights into the visual design of MOOC dashboards and into how expert validation and learner evaluation can inform continuous improvement across iterations. The findings on the LM-Dashboard prototype can help instructional designers, system developers, and UI/UX visual designers establish a fundamental understanding of designing a visualization dashboard in MOOC environments, one that intuitively and effectively offers meaningful quantitative and qualitative information from analyzed data on students' learning progress and activities. In particular, this study showed that some aspects of learners' SRL competencies improved through the visualized information and features of the MOOC dashboard based on learning analytics. The results of this study can therefore provide instructional designers, system developers, and UI/UX visual designers with an exemplary design and development process for MOOC dashboards.
However, challenges remain in accommodating the individual preferences and characteristics of massive numbers of learners and in building learners' confidence to achieve their goals without giving up when they face learning difficulties. In fact, SRL includes (meta-)cognitive, behavioral, and affective aspects of learning. While self-regulated cognitions and behaviors are critically important, self-regulated affective components seem equally significant in determining learners' attitudes and abilities for SRL. In this study, learners' reactions to the LM-Dashboard prototype revealed somewhat low confidence or motivation to persist in their learning. Therefore, in addition to the visualized information, comments that promote students' affective SRL competencies are needed to enhance their self-regulated emotions and motivation in accordance with their learning progress and status.
Future studies can implement a dashboard that promotes SRL based on the design implications derived from this study, collect real data from the system, and evaluate user log data together with completion rates. MOOCs and learning analytics are mutually reinforcing: learning behaviors can give instructors, instructional designers, IT and HCI professionals, and institutional operators a broad sense of the opportunities for personalization and prediction in educational big data. In the near future, customized dashboards based on state-of-the-art AI technology could address individual preferences by offering individualized interfaces and information for features, such as social comparison and digital badges, about which this study found mixed opinions regarding perceived preference and effectiveness. Above all, the dashboard should be designed to promote students' SRL based on quantitative and qualitative data through learning analytics in a more intuitive and effective way. Research on visualizing MOOC dashboards to facilitate SRL is not limited to the field of visual/graphic design but spans a wide range of areas, such as educational psychology, instructional design, computer science, HCI, and engineering. Collaboration and interaction among these research areas are needed for its development.
A. Littlejohn and C. Milligan, "Designing MOOCs for professional learners: Tools and patterns to encourage self-regulated learning," eLearning, vol.42, no.4, pp.1-10, 2015.
S. Halawa, D. Greene, and J. Mitchell, "Dropout prediction in MOOCs using learner activity features," Experiences and Best Practices in and around MOOCs, vol.7, pp.3-12, 2014.
D. F. Onah, J. E. Sinclair, and R. Boyatt, "Exploring the use of MOOC discussion forums," in Proc. of London International Conference on Education, pp.1-4, November, 2014.
R. Rivard, "Measuring the MOOC dropout rate," Inside Higher Ed, vol.8, 2013.
T. Park, H. Cha, and G. Lee, "A study on design guidelines of learning analytics to facilitate self-regulated learning in MOOCs," Educational Technology International, vol.17, no.1, pp.1-34.
K. Jordan, "MOOC completion rates: the data," 2013.
R. F. Kizilcec and E. Schneider, "Motivation as a lens to understand online learners: Toward data-driven design with the OLEI scale," ACM Transactions on Computer-Human Interaction (TOCHI), vol.22, no.2, p.6, 2015.
S. Zheng, M. B. Rosson, P. C. Shih, and J. M. Carroll, "Understanding student motivation, behaviors and perceptions in MOOCs," in Proc. of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing, pp.1882-1895, February, 2015.
B. J. Zimmerman, "Becoming a self-regulated learner: An overview," Theory into Practice, vol.41, no.2, pp.64-70, 2002.
H. Qu and Q. Chen, "Visual analytics for MOOC data," IEEE Computer Graphics and Applications, vol.35, no.6, pp.69-75, 2015.
I. Frolov and S. Johansson, "An adaptable usability checklist for MOOCs: A usability evaluation instrument for Massive Open Online Course," Master's Thesis, Department of Informatics, HCI, UMEA, 2013.
I. de Waard, M. S. Gallagher, R. Zelezny-Green, L. Czerniewicz, S. Downes, A. Kukulska-Hulme, and J. Willems, "Challenges for conceptualising EU MOOC for vulnerable learner groups," in Proc. of the European MOOC Stakeholder Summit 2014, pp.33-42, 2014.
S. Downes, "The quality of Massive Open Online Courses," International Handbook of E-learning, vol.1, pp.65-77, 2013.
A. Creelman, U. Ehlers, and E. Ossiannilsson, "Perspectives on MOOC quality: An account of the EFQUEL MOOC Quality Project," INNOQUAL - International Journal for Innovation and Quality in Learning, vol.2, no.3, pp.78-87, 2014.
A. Ho, I. Chuang, J. Reich, C. A. Coleman, J. Whitehill, C. G. Northcutt, J. J. Williams, J. D. Hansen, G. Lopez, and R. Petersen, "HarvardX and MITx: Two years of open online courses Fall 2012-Summer 2014," March, 2015.
M. Boekaerts, "Self-regulated learning: where we are today," International Journal of Educational Research, vol.31, pp.445-457, 1999.
P. R. Pintrich, "The role of goal orientation in self-regulated learning," Handbook of Self-Regulation, pp.451-502, 2000.
B. J. Zimmerman and M. M. Pons, "Development of a structured interview for assessing student use of self-regulated learning strategies," American Educational Research Journal, vol.23, no.4, pp.614-628, 1986.
B. J. Zimmerman and M. Campillo, "Motivating self-regulated problem solvers," The Psychology of Problem Solving, pp.233-262, 2003.
M. C. English and A. Kitsantas, "Supporting student self-regulated learning in problem- and project-based learning," Interdisciplinary Journal of Problem-Based Learning, vol.7, no.2, p.6, 2013.
R. A. Kuiper and D. J. Pesut, "Promoting cognitive and metacognitive reflective reasoning skills in nursing practice: self-regulated learning theory," Journal of Advanced Nursing, vol.45, no.4, pp.381-391, 2004.
R. Azevedo, D. C. Moos, J. A. Greene, F. I. Winters, and J. G. Cromley, "Why is externally-facilitated regulated learning more effective than self-regulated learning with hypermedia?," Educational Technology Research and Development, vol.56, no.1, pp.45-72, 2008.
M. Bannert and C. Mengelkamp, "Scaffolding hypermedia learning through metacognitive prompts," International Handbook of Metacognition and Learning Technologies, pp.171-186, Springer, New York, NY, 2013.
J. Broadbent and W. L. Poon, "Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review," The Internet and Higher Education, vol.27, pp.1-13, 2015.
X. Lin and J. D. Lehman, "Supporting learning of variable control in a computer-based biology environment: Effects of prompting college students to reflect on their own thinking," Journal of Research in Science Teaching, vol.36, no.7, pp.837-858, 1999.
M. Taub, R. Azevedo, F. Bouchet, and B. Khosravifar, "Can the use of cognitive and metacognitive self-regulated learning strategies be predicted by learners' levels of prior knowledge in hypermedia-learning environments?," Computers in Human Behavior, vol.39, pp.356-367, 2014.
R. F. Kizilcec, M. Perez-Sanagustin, and J. J. Maldonado, "Self-regulated learning strategies predict learner behavior and goal attainment in Massive Open Online Courses," Computers & Education, vol.104, pp.18-33, 2017.
T. Anderson (Ed.), "The theory and practice of online learning," Athabasca University Press, 2008.
A. Croll and S. Power, "Complete web monitoring: watching your visitors, performance, communities, and competitors," O'Reilly Media, Inc., 2009.
T. Hey and A. E. Trefethen, "Cyberinfrastructure for e-Science," Science, vol.308, no.5723, pp.817-821, 2005.
C. Romero and S. Ventura, "Educational data mining: A survey from 1995 to 2005," Expert Systems with Applications, vol.33, no.1, pp.135-146, 2007.
J. Schaffer, B. Huynh, J. O'Donovan, T. Hollerer, Y. Xia, and S. Lin, "An analysis of student behavior in two massive open online courses," in Proc. of 2016 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, pp.380-385, August, 2016.
G. Chen, D. Davis, J. Lin, C. Hauff, and G. J. Houben, "Beyond the MOOC platform: gaining insights about learners from the social web," in Proc. of the 8th ACM Conference on Web Science, pp.15-24, ACM, May, 2016.
E. Duval, "Attention please!: learning analytics for visualization and recommendation," in Proc. of the 1st International Conference on Learning Analytics and Knowledge, pp.9-17, February, 2011.
L. Ali, M. Hatala, D. Gasevic, and J. Jovanovic, "A qualitative evaluation of evolution of a learning analytics tool," Computers & Education, vol.58, no.1, pp.470-489, 2012.
 SAM DashBoard S/W. https://www.samlearning.com
 FutureLearn dashboard. https://www.futurelearn.com
F. Grunewald, C. Meinel, M. Totschnig, and C. Willems, "Designing MOOCs for the support of multiple learning styles," in Proc. of European Conference on Technology Enhanced Learning, pp.371-382, Springer, Berlin, Heidelberg, September, 2013.
B. Rienties, A. Boroowa, S. Cross, C. Kubiak, K. Mayles, and S. Murphy, "Analytics4Action Evaluation Framework: A review of evidence-based learning analytics interventions at the Open University UK," Journal of Interactive Media in Education, vol.2016, no.1, p.2, 2016.
S. B. Shum and R. Ferguson, "Social learning analytics: five approaches," in Proc. of the 2nd International Conference on Learning Analytics and Knowledge, pp.23-33, 2012.
S. Knight and K. Littleton, "Discourse-centric learning analytics: mapping the terrain," Journal of Learning Analytics, vol.2, no.1, pp.185-209, 2015.
K. Verbert, S. Govaerts, E. Duval, J. L. Santos, F. Van Assche, G. Parra, and J. Klerkx, "Learning dashboards: an overview and future research opportunities," Personal and Ubiquitous Computing, vol.18, no.6, pp.1499-1514, 2014.
Y. Ishikawa and M. Hasegawa, "T-scroll: Visualizing trends in a time-series of documents for interactive user exploration," in Proc. of International Conference on Theory and Practice of Digital Libraries, pp.235-246, Springer, Berlin, Heidelberg, September, 2007.
J. Nielsen, "Why you only need to test with 5 users," Nielsen Norman Group, 2000.
D. Henriksen, C. Richardson, and R. Mehta, "Design thinking: A creative approach to educational problems of practice," Thinking Skills and Creativity, vol.26, pp.140-153, 2017.
S. Patton, "Admissions professionals ask: Are graduate schools ready for MOOCs?," The Chronicle of Higher Education, April 26, 2013.
D. Gasevic, S. Dawson, and G. Siemens, "Let's not forget: Learning analytics are about learning," TechTrends, vol.59, no.1, pp.64-71, 2015.
L. Lupu, E. C. Corbu, and E. Edelhauser, "Dashboards and radar charts, performance analytics instruments in higher education," in Proc. of International Conference on Current Economic Trends in Emerging and Developing Countries (TIMTED-2017), Timisoara, May, 2017.
J. L. Santos, S. Govaerts, K. Verbert, and E. Duval, "Goal-oriented visualizations of activity tracking: a case study with engineering students," in Proc. of the 2nd International Conference on Learning Analytics and Knowledge, pp.143-152, Vancouver, Canada, April 29 - May 2, 2012.
D. Davis, I. Jivet, R. F. Kizilcec, G. Chen, C. Hauff, and G. J. Houben, "Follow the successful crowd: raising MOOC completion rates through social comparison at scale," in Proc. of the Seventh International Learning Analytics & Knowledge Conference, pp.454-463, ACM, March, 2017.
D. M. Anderson and S. Staub, "Postgraduate digital badges in higher education: Transforming advanced programs using authentic online instruction and assessment to meet the demands of a global marketplace," Procedia - Social and Behavioral Sciences, vol.195, pp.18-23, 2015.
P. Fain, "Badging from within," Changing Student Pathways, Washington, DC: Inside Higher Ed, 2014.
D. K. Mah, "Learning analytics and digital badges: potential impact on student retention in higher education," Technology, Knowledge and Learning, vol.21, no.3, pp.285-305, 2016.
S. Few, "Information dashboard design: The effective visual communication of data," Sebastopol, CA: O'Reilly Media, Inc., 2006.
E. Panadero, "A review of self-regulated learning: six models and four directions for research," Frontiers in Psychology, vol.8, p.422, 2017.
A. R. Artino and K. D. Jones, "Exploring the complex relations between achievement emotions and self-regulated behaviors in online learning," The Internet and Higher Education, vol.15, no.3, pp.170-175, 2012.
A. Ben-Eliyahu and L. Linnenbrink-Garcia, "Extending self-regulated learning to include self-regulated emotion strategies," Motivation and Emotion, vol.37, no.3, pp.558-573, 2013.
J. Knox, "From MOOCs to Learning Analytics: Scratching the surface of the 'visual'," eLearn, November, 2014.
Appendix 1. The analysis results of learners' comments

1. Course Progress
Strengths:
* Graphs on learning progress compared to initial plans are useful for reflecting on and focusing on the parts requiring further study (A, B, C, D, E, F)
* The weekly study plan and learning status by course index are easy to understand and make it possible to check learning progress at a glance (A, C)
Weaknesses:
* It would be better if it offered proper instructors' tips for moving from the risky stage to the excellent stage, especially for students with a low level of willingness or motivation to learn (F)
* It would be better to be able to check when assignments were submitted as well as whether they were submitted (E)

2. Learning Activities
Strengths:
* It is useful to easily monitor in-depth information on discussions (A, B, C, E)
* It is useful to be able to check my grades against colleagues (A, B, D, E)
* Evaluation feedback on quizzes, assignments, and tests is very useful (A, B, C, D, E)
* Information on the number of video plays and the playing time is very useful (E)
* It is useful to reflect by comparing my own learning plans and progress (F)
Weaknesses:
* It would be more useful if peer reviews of my discussion postings were added (B)
* Tips or comments should be provided to motivate students whose level of learning activities is significantly lower than that of other learners (C)
* Information on the number of video plays and the playing time is not meaningful, and comparing it with others is not necessary (C, E)
* It is questionable whether comparing the number of my comments and postings with those of colleagues will motivate learning (F)

3. Learning Evaluation
Strengths:
* It is easy and useful to compare my learning achievements with peers (A, B, C, E)
* Badges according to the degree of achievement are useful for checking the parts a student needs to study (A, E, F)
* The report card with course status is useful for a sense of accomplishment and motivation (B, C, D, E, F)
Weaknesses:
* It is difficult to understand what the x-axis and y-axis of the "my current position" analysis represent (B, C)
* It is difficult to know the criteria for receiving badges (B, C)
* Providing only the average score of peers rather than each peer's personal score would lessen the burden (F)
* Students would likely be motivated more continuously by presenting their own goal or a step-by-step recommended goal rather than a relative position compared to other students (A, C)
* Badges may cause adult learners to rely on rewards rather than focusing on learning (E)

4. Overall Interface Design
Strengths: No comment
Weaknesses:
* Preferences could vary according to learner characteristics (age, gender, academic level, etc.) (F)
* The amount of information presented on a screen is too much (C, E)
* Some texts should be enlarged for better readability (A, B)
* It would be nice if the visual design used an optimal page structure according to screen size (B)
* Information in the graphs should not be duplicated in text (C, E)
Hyunjin Cha is an invited professor in the School of General Education at Dankook University in Cheonan, Republic of Korea. Dr. Cha previously worked as an assistant professor at Soonchunhyang University, Asan, Korea. Dr. Cha obtained an M.Sc. in Human-Computer Interaction with Ergonomics from University College London and a Ph.D. in Educational Technology from Hanyang University, Seoul, Korea. Her current research areas include ubiquitous teaching and learning environments and human-computer interaction in learning science.
Taejung Park is an assistant professor in the Division of General Studies at the College of Liberal Arts and Interdisciplinary Studies, Kyonggi University in Suwon, Republic of Korea. She received her master's degree in Multimedia English Education from the Ewha Graduate School of Education and her Ph.D. in Educational Technology from Seoul National University. Her current research interests focus on instructional design, MOOCs, AR/VR/MR, maker-based instruction, flipped learning, and future schools. Email: firstname.lastname@example.org (82+1022567542)
Hyun-Jin Cha (1) and Taejung Park (2)
(1) School of General Education, Dankook University 119, Dandae-ro, Dongnam-gu, Cheonan-si, Chungnam, 31116, Republic of Korea [e-mail: email@example.com]
(2) College of Liberal Arts and Interdisciplinary Studies, Kyonggi University 154-42, Gwanggyosan-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16227, Republic of Korea [e-mail: firstname.lastname@example.org]
(*) Corresponding author: Taejung Park
Received September 3, 2018; revised November 18, 2018; accepted December 22, 2018; published June 30, 2019
Table 1. SRL strategies from successful MOOC learners  SRL strategies Example(s) Metacognitive Self-monitoring Preparing summaries and mind strategies maps about what I learned. Self-evaluation I used what I learned in the course back in school classes so that I could internalize what I learned. Goal setting/planning The most important thing is to have clear objectives and to be organized when accomplishing them. Resource Time Taking the course was management management challenging because I have other strategies daily duties. What helped me is devoting a specific time of the day to work on the course. Effort regulation The day I started a chapter, I would also finish it. Help seeking It was a great help and source of motivation to take the course with friends. Study Find some quiet time at home and environment concentrate on the computer. Table 2. Design Guidelines of visualizing MOOC learning dashboards to promote SRL based on learning analytics  Dimensions of Design guidelines To visualize SRL strategies for facilitating guidelines of MOOC SRL in MOOC dashboard for learning environments promoting SRL based on learning analytics 1. Self-evaluation 1.1. Content analysis of To visualize learner's reflections quantitative and qualitative information based on the analysis of learner's contents from discussions, reflections, etc. 1.2. Learning history To visualize compared to others quantitative (achievements, progress, information for activities, e-portfolio, each activity on etc.) learning progress and achievement To visualize quantitative information for comparing a student's learning progress and achievement to those of average or good learners To visualize profile information of peers on social network 2. Organizing and 2.1. Learner's preferred To visualize transforming content types (video quantitative clips, texts, images, information showing voices, etc.) student preferred content types 2.2. 
Student's participant To visualize activity records for quantitative uploading and authoring information such as contents the number of uploading, downloading, or authoring contents 3. Goal-setting 3.1. Setting learning To visualize and planning objectives and plans for quantitative effective time information on management learner's achievement by their plan and execution of learning activities in a timely manner (by day, by week, by month) To visualize quantitative information on the degree of activity participation using graphs each distinguished by color and detailed information 3.2. Monitoring learner's To visualize plans, styles, and patterns quantitative and qualitative information on students' learning patterns, preferences and emotions in accordance with the flow of learning time (e.g., weekly) 4. Keeping 4.1. Records of student's To visualize records and learning activities such quantitative and monitoring as note-taking, searching, qualitative downloading, and information on printing weekly records of activities such as note-taking, uploading or downloading learning materials and printing materials out 5. Rehearsing 5.1. Details about To visualize and participation in the quantitative memorizing exercise, discussion, information on the homework, etc. number of times that students have read and posted in discussions, assignments, collaborative learning tasks and exams 6. Reviewing 6.1. Quantitative and To visualize records qualitative analysis of quantitative learning exercises such as information on the quizzes, discussions and scores of student's exams for reviewing effort, calculated by required time and planning time 7. Seeking 7.1. References and links To visualize information referred to by learners quantitative and and others qualitative information on external learning resources utilized by students 8. Seeking 8.1. 
Q&A to overcome or To visualize social solve problems quantitative and assistance qualitative information on analyzing content of students' Q&A 9. Self-Consequences 9.1. History of To visualize certificates or credits quantitative with invested time and information on earned achievement certification scores history 9.2. Enrolled and To visualize completed rates of quantitative courses monthly or information on annually weekly performance against learning objectives of individual student and overall average among learners 10. Structuring 10.1. Recommending To visualize personalized courses for each learner's quantitative learning level or interest information on environments weekly, monthly and yearly student achievement 10.2. Feedback on To visualize learning success and quantitative failure appropriate for information on the individual learning styles prescription of or patterns learning patterns /styles Dimensions of Design guidelines for References SRL strategies facilitating SRL in MOOC learning environments 1. Self-evaluation 1.1. Content analysis of  learner's reflections 1.2. Learning history   compared to others (achievements, progress, activities, e-portfolio, etc.) 2. Organizing and 2.1. Learner's preferred  transforming content types (video clips, texts, images, voices, etc.) 2.2. Student's participant   activity records for uploading and authoring contents 3. Goal-setting 3.1. Setting learning    and planning objectives and plans for effective time management 3.2. Monitoring learner's    plans, styles, and patterns  4. Keeping 4.1. Records of student's  records and learning activities such as monitoring note-taking, searching, downloading, and printing 5. Rehearsing 5.1. Details about  and participation in the memorizing exercise, discussion, homework, etc. 6. Reviewing 6.1. Quantitative and  records qualitative analysis of learning exercises such as quizzes, discussions and exams for reviewing 7. Seeking 7.1. 
- References and links: information referred to by learners and others
8. Seeking social assistance
   8.1. Q&A to overcome or solve problems
9. Self-consequences
   9.1. History of certificates or credits with invested time and earned achievement scores
   9.2. Enrolled and completed rates of courses, monthly or annually
10. Structuring personalized learning environments
   10.1. Recommending courses for each learner's learning level or interest
   10.2. Feedback on learning success and failure appropriate for individual learning styles or patterns

Table 3. The theoretical backgrounds for the menus of the LM-Dashboard

| Menu | Visualization items on the LM-Dashboard | Design guidelines from Table 2 |
| --- | --- | --- |
| Course progress | Weekly learning achievement progress | 3.1 |
| | Weekly learning records | 1.2, 9.2 |
| Learning activities | Weekly activities records | 2.2 |
| | Weekly activities achievement records | 4.1 |
| | Activity records for quiz | 9.1, 1.2 |
| | Activity records for forum/discussion | 9.1, 1.2 |
| | Activity records for assignment | 5.1, 1.2 |
| | Activity records for reflection journal | 1.2 |
| | Content analysis | 1.1 |
| Learning Evaluation | My current status | 9.2 |
| | My achievement badges | 3.1, 9.1 |
| | Course results and comparison with peers | 9.1, 1.2 |

Table 5. Learners' profile

| ID | Job | Age | Gender | No. of enrolled courses | No. of certified/completed courses | SRL competency level |
| --- | --- | --- | --- | --- | --- | --- |
| A | Graduate student | 20-29 | M | 2 | - | High |
| B | Undergraduate student | 20-29 | M | 1 | - | Mid |
| C | Graduate student | 20-29 | F | 1 | - | Mid |
| D | Graduate student | 20-29 | F | 1 | - | Mid |
| E | Graduate student | 30-39 | F | 5 | 2 | Very high |
| F | Graduate student | 20-29 | F | 2 | - | High |

Table 6. Expert evaluation on the LM-Dashboard prototype to promote SRL

| Menu | Design item visualized on the prototype | M | SD | Expert comments |
| --- | --- | --- | --- | --- |
| 1. Course progress | 1.1 Weekly learning achievement progress | 4.0 | 0.87 | The traffic light is very effective. A slider chart could be used instead of the traffic light to represent the learner's status and progress at more detailed levels. |
| | 1.2 Weekly learning records | 3.86 | 0.6 | In terms of the course progress, the social comparison might not be very useful for promoting SRL. |
| | 1.3 Weekly activities records | 4.0 | 1 | It is not very meaningful simply to visualize the number of activities in which learners participated. |
| | 1.4 Weekly activities achievement records | 4.0 | 1 | A radar graph (spider chart) might not be very informative to learners. Downloading and printing might not be very meaningful SRL activities. |
| | 1. Overall | 4.0 | 0.87 | Too much repetitive information on the graph might not be very effective; it might be better to provide detailed information on request (when the user clicks for details). |
| 2. Learning activities | 2.1 Activity records for quiz | 4.25 | 0.66 | The difference between % of participation and % of effort is not obvious, so guidelines and accurate criteria for interpreting the visualization should be offered. |
| | 2.2 Activity records for forum/discussion | 4.25 | 0.66 | MOOC environments usually do not provide reflection opportunities, so information about the reflection activity might be difficult to collect; this also suggests that the word-cloud analysis might be very effective. |
| | 2.3 Activity records for assignment | 4.13 | 0.6 | No comment |
| | 2.4 Activity records for reflection journal | 3.63 | 1.11 | No comment |
| | 2.5 Content analysis | 3.88 | 0.78 | No comment |
| | 2. Overall | 4.13 | 0.78 | The type of graph should be kept consistent (e.g., a line chart) unless a different type is clearly more suitable. |
| 3. Learning Evaluation | 3.1 My current status | 4.5 | 1 | No comment |
| | 3.2 My achievement badges | 4.38 | 0.7 | Digital badges might be effective, but the appropriate level at which to award them should be considered, since too many digital badges might not be very attractive. |
| | 3.3 Course results and comparison with peers | 3.63 | 1.11 | Social comparison might be effective, but individual comparisons on the social network need to be validated. In particular, private information in the social comparison should be handled carefully so as not to reveal sensitive user information. |
| | 3. Overall | 4.5 | 0.5 | Accurate criteria for the social comparison should be offered. |

Table 7. Learners' ratings of user experience evaluation questions on the LM-Dashboard

| Menu | Learning experience | M | SD |
| --- | --- | --- | --- |
| 1.1 Course Progress | Understandability | 4.33 | 0.47 |
| | Usefulness | 4.33 | 0.75 |
| | Usability | 4.00 | 0.82 |
| 1.2 Learning Activities | Understandability | 4.33 | 0.75 |
| | Usefulness | 4.00 | 0.82 |
| | Usability | 4.17 | 0.69 |
| 1.3 Learning Evaluation | Understandability | 4.00 | 0.82 |
| | Usefulness | 4.33 | 0.75 |
| | Usability | 4.33 | 0.75 |

Table 8. Learners' ratings of perceived SRL effectiveness evaluation on the LM-Dashboard

| Perceived SRL effectiveness | M | SD |
| --- | --- | --- |
| 1. This dashboard can help you manage and check your own learning process. | 4.33 | 0.75 |
| 2. This dashboard can help you identify and evaluate your own learning outcomes. | 4.33 | 0.47 |
| 3. This dashboard can help you build and maintain learning motivation. | 3.67 | 0.75 |
| 4. This dashboard can help you be confident that you will achieve your goals successfully in a given learning situation. | 3.33 | 0.94 |
| 5. This dashboard can help you continue your learning without giving up, even if you have difficulties in continuing your studies. | 2.67 | 0.75 |
| 6. This dashboard can help you and your colleagues get assistance. | 3.67 | 0.94 |
| 7. This dashboard can help you use your limited study time efficiently. | 4.00 | 1.15 |
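Tables 6-8 summarize 5-point Likert ratings as a mean (M) and standard deviation (SD); the reported SD values are consistent with the population formula. A minimal sketch of that computation follows; the ratings below are hypothetical values chosen only to illustrate the calculation, not the study's raw expert data.

```python
from statistics import mean, pstdev

# Hypothetical 5-point Likert ratings from eight experts for one design item.
# NOT the study's raw data; chosen only to illustrate how M and SD are derived.
ratings = [5, 5, 5, 4, 4, 3, 3, 3]

m = mean(ratings)     # arithmetic mean (M)
sd = pstdev(ratings)  # population standard deviation (SD)

print(f"M = {m:.2f}, SD = {sd:.2f}")  # M = 4.00, SD = 0.87
```

Note the use of `pstdev` (population SD) rather than `stdev` (sample SD); with only a handful of raters the two can differ noticeably.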
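The expert comment on item 2.2 suggests word-cloud analysis of forum/discussion activity. A word cloud is driven by term frequencies, so the core step is counting words after stop-word filtering. A standard-library sketch of that counting step is shown below; the sample posts and stop-word list are illustrative assumptions, not data or code from the study.

```python
import re
from collections import Counter

# Illustrative forum posts (not real MOOC data).
posts = [
    "I found the quiz on regression really difficult",
    "The regression lecture helped me finish the quiz",
    "Can anyone share notes on the regression assignment?",
]

# A minimal, illustrative stop-word list; a real dashboard would use a fuller one.
STOPWORDS = {"i", "the", "on", "me", "can", "anyone", "really"}

tokens = [
    word
    for post in posts
    for word in re.findall(r"[a-z']+", post.lower())
    if word not in STOPWORDS
]

# The most frequent terms would be rendered largest in the word cloud.
print(Counter(tokens).most_common(2))  # [('regression', 3), ('quiz', 2)]
```

A rendering library could then map these counts to font sizes; the counting logic above is independent of the renderer.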
Author: Cha, Hyun-Jin; Park, Taejung
Publication: KSII Transactions on Internet and Information Systems
Date: Jun 1, 2019