This post was co-authored by Dr. Stephanie Teasley, Research Professor, School of Information, University of Michigan

Last year, Blackboard developed new Learning Analytics functionality for the Ultra experience of Blackboard Learn that included student-facing data dashboards. While dashboards are commonplace in some fields (e.g., finance, healthcare), they are not widely used in educational technologies, and when they are present, they are primarily intended for faculty, administrators, and other professionals. Little is known from educational research about the impact of student-facing dashboards. To improve our understanding, Blackboard collaborated on a research study with Dr. Stephanie Teasley, Research Professor at the School of Information at the University of Michigan and Director of the Learning, Education and Design Lab.

The study sought to answer the following questions:

  1. Are students able to interpret the information provided by such systems, and do they know what to do with it?
  2. Which students find this information motivating versus demotivating, and under which circumstances?

We addressed these questions through an experimental study with a stratified random sample of undergraduate students. The final sample of 47 students was randomly assigned to either a “high” or a “low” performance feedback condition. Students attended a 20-30 minute in-person session in which they were interviewed about their demographic background and prior educational experience, viewed the dashboards and Ultra visualizations, and provided their feedback. At the end of the session, students completed a survey. You can access the full report here.
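To make the assignment step concrete, the sketch below shows one way participants could be randomly assigned to the two feedback conditions within each GPA stratum. This is a minimal illustration assuming hypothetical participant records; the field names, values, and random seed are not drawn from the study itself.

```python
import random

# Minimal sketch of stratified random assignment to feedback conditions.
# The participant records and field names here are hypothetical.
participants = [
    {"id": 1, "gpa_group": "high"},
    {"id": 2, "gpa_group": "high"},
    {"id": 3, "gpa_group": "low"},
    {"id": 4, "gpa_group": "low"},
    # ... remaining participants
]

random.seed(42)  # fixed seed so the assignment is reproducible
for gpa_group in ("high", "low"):
    stratum = [p for p in participants if p["gpa_group"] == gpa_group]
    random.shuffle(stratum)
    half = len(stratum) // 2
    for p in stratum[:half]:
        p["feedback"] = "high"  # "high" performance feedback condition
    for p in stratum[half:]:
        p["feedback"] = "low"   # "low" performance feedback condition
```

Splitting each stratum in half before assignment keeps the two feedback conditions balanced on GPA group, which is what makes the 2 x 2 comparison described below possible.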

Example of a Course Activity dashboard

Students were analyzed in two distinct groups, those with relatively “high” GPAs and those with relatively “low” GPAs, to understand how reactions might differ based on a student’s academic history. Given the environment where the study was conducted, “low” GPA was defined as a GPA below 3.0; these students were not in a traditional “risk” category, but they were low achieving relative to their peers at the institution.

Students were also balanced across the experimental conditions on other demographic criteria. Using a 2 (feedback condition) by 2 (GPA group) design, an ANOVA was used to test for significant differences among the four experimental conditions.
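For readers curious what that analysis looks like in practice, here is a minimal sketch of a two-way ANOVA in Python using pandas and statsmodels. The data frame, column names, and ratings are placeholders for illustration only; they are not the study’s variables or results.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical example data: one survey rating per participant, with
# feedback condition and GPA group as between-subjects factors.
df = pd.DataFrame({
    "feedback":   ["high", "high", "high", "high", "low", "low", "low", "low"],
    "gpa_group":  ["high", "high", "low",  "low",  "high", "high", "low", "low"],
    "usefulness": [4, 5, 5, 4, 3, 4, 5, 5],  # e.g., a 1-5 agreement rating
})

# Two-way ANOVA: main effects of feedback condition and GPA group,
# plus their interaction.
model = ols("usefulness ~ C(feedback) * C(gpa_group)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```

With the study’s actual measures in place of the placeholder rating, the same model reports whether responses differ by feedback condition, by GPA group, and by the combination of the two.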

Some of the results confirmed what we had hoped to find: overall, students found the dashboards to be useful, clear, and informative. The most surprising result was that low GPA students found the dashboard feedback the most meaningful and were more likely than high GPA students to act on the information provided. This means that the students who have the most to gain academically from these visualizations are also the most likely to use them, a fantastic combination for making a difference in student learning. It is also quite possible that some high achieving students fall outside the typical relationship between activity and grade.

We also discovered that students’ confidence in the results shown in the visualizations increased over time, and that students were unfamiliar with the idea of using behavioral data as an indicator of study practices and outcomes. These findings taught us that explaining the dashboards to students is important if the dashboards are to be used effectively. Finally, the differences between students suggested that it would be ideal to provide different visualizations based on students’ backgrounds and other criteria. See the image below (or the text alternative version) for more details on the results and key findings.
Chart 2 (see text alternative below)

Of course, this is a single study with a specific student population, and we need to be careful not to overgeneralize these results. In particular, it will be important to understand whether these results extend to students with lower GPAs and other indicators that place them at risk of not achieving their academic aspirations. If you’re interested in conducting a similar study, we’d be happy to share our protocol and compare results. Blackboard will be conducting follow-on studies and behavioral data analysis, and the University of Michigan team is continuing research in this area.

If you’re interested in the details, please download the report here; you can also come to our session “Analytics Research: Effective Tool Use Patterns and Student Responses to Data-Driven Interventions” at the ELI Annual Meeting.

We welcome your feedback, interpretations, and challenges for what we should do next to improve our understanding of educational technologies and effective practices to improve student learning.

Download the full report: The impact of student-facing dashboards

Text alternative

Key Findings: Low GPA students are more motivated to take immediate action based on alerts. High GPA students are less likely to check dashboards and turn on notifications. All students who received low feedback were significantly more likely to find follow-up actions useful.

Chart 2: Students who received feedback consistent with their prior academic performance (High Feedback/High GPA and Low Feedback/Low GPA) tended to agree more that the graphs helped them understand their class position and make decisions, as compared to the students who received feedback that was inconsistent with their prior performance (High Feedback/Low GPA and Low Feedback/High GPA).

 
