Article originally published on E-Learn Magazine on Jul 24, 2018
“The purpose of learning analytics is the strategic application of the data toward the goals education institutions have,” says Rachel Scherer, director of analytics at Blackboard. Explore three learning analytics stories from Australia, Canada and the Netherlands for a better understanding of how learning analytics can impact colleges and universities and help them achieve their strategic goals.
Charles Sturt University: Using Learning Analytics to Share Information at Scale
When Charles Sturt University, in Australia, began using Blackboard Analytics for Learn, they had a singular goal: empower faculty to see how students were engaging (or not) when studying online. This was a critical matter for faculty, especially when considering that more than half of the university’s 43,140 students are online learners.
“Just as in a classroom a teacher can see who is present, who is engaging and who is staring out the window, we wanted our faculty to be able to get the same information from Blackboard Analytics for Learn,” says Ian Holder, adaptive learning and teaching analyst at Charles Sturt University (CSU).
In order to do that, CSU’s Learning & Teaching (L&T) specialists implemented three of the standard reports available from Blackboard Analytics for Learn: Course at a Glance, Activity and Grade Scatter Plot, and Activity Matrix, as well as Content Access Statistics, a customized report in use at another Australian university, which shows statistical information on student course content usage in the learning environment.
Additionally, CSU created charts and visualizations of blog and discussion forum activity in courses. “Discussion forums are an area of increasing interest that we hope to continue to work on and enhance our reporting in,” says Holder.
A Few Outcomes of Learning Analytics Implementation at CSU
Learning analytics helps institutions gather evidence and then leverage this information for decision-making. Today, CSU is still at a stage where learning analytics are focused at the individual course level. In the future, they aim to bring in additional data sets (such as demographics and course survey results) and look across colleges or the university as a whole to make bigger decisions. That said, CSU has achieved some early results.
“One such outcome, across departments, is instructional designers using their dashboard to review Blackboard course site structure and student engagement within this structure,” Holder explains. “The structures of the course sites were reviewed to ensure a reasonable amount of content and the appropriate number of folders and folder depth. Student engagement was also compared to the course and course site design to determine if students were engaging as and when expected, and if required, changes were implemented,” he adds.
Instructional designers leveraged course analytics to inform course site redesign, resulting in better access to content and improved engagement through optional tests and discussion forums. “We rejoice in this success, but course site design and the actual content and tools in the course site both change regularly, so we see it as a continuous improvement process,” Holder points out.
Leveraging learning analytics data, CSU’s College of Science proactively reached out to students who had not yet accessed their course site before the census date, the last date to withdraw without penalty. As a result, a number of students withdrew, saving them the time and money they would otherwise have spent on courses for which they may not have been suited. According to Holder, between 30% and 46% of students in the campaign withdrew before the census date in the subjects piloted.
CSU aggregates learning analytics data across multiple courses to enhance their knowledge, challenge their thinking, and stimulate conversation regarding educational technology in a project they call “The Pulse.” This information is now shared publicly through an infographic on learning analytics that is published three times a year.
In the future, Holder foresees artificial intelligence and increased data visualization taking a much larger role in learning analytics. “For many tasks, we currently need to extract the data out and process it through some other tool; I see in the short term these will increasingly become embedded in the learning analytics tools themselves. As will the areas of cognitive computing and natural language analytics,” he believes.
For CSU in particular, Holder believes learning analytics will become more popular among faculty over time, considering it still remains a niche area. “As learning analytics becomes easier to use, more powerful, and increasingly customizable to individual needs, the future seems to be one where more staff will take up the opportunities provided by it to reflect on their teaching and to provide personalized support to students. This engagement, we hope, will lead to more and better actions towards students to improve both their learning and success,” Holder concludes.
Identifying Students at Risk: How the University of Groningen Began Using Learning Analytics
When the University of Groningen, a traditional 400-year-old educational institution located in the Netherlands, decided to use Blackboard Analytics for Learn, they had a clear vision in mind. For the Dutch institution, learning analytics offered an opportunity to identify at-risk students at an early stage in their studies.
Currently at Groningen, the student advisor usually waits for the first exam results – normally at least 10 weeks after classes have started – before reaching out to students and offering support. “We feel that it takes too long to ensure a good start of their studies,” says Hans Beldhuis, program and change manager of Educational Innovation and Strategy at the University of Groningen.
For their pilot program with Blackboard Analytics for Learn, the university decided to change the approach by using login exceptions, minutes spent in the LMS, and grades in the Grade Center to identify at-risk students during the first three weeks of any given course. Both grades and login exceptions proved to be good measurements for identifying at-risk students.
However, the goal has not yet been achieved for all courses. Beldhuis believes learning analytics must be fed with valid information before one can draw any kind of conclusion “on the numbers,” and faculty need to be aware of that. “The course needs to contain intermediate tests in its early stages and the grading should be incorporated in the grade book,” explains Beldhuis. If instructors fail to meet these demands, learning analytics is less effective.
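The three signals the pilot relied on – login exceptions, time spent in the LMS, and early grades – can be combined into a simple flagging rule. The sketch below is purely illustrative: the class, field names, and thresholds are assumptions for the example, not Blackboard Analytics for Learn’s actual data model or logic.

```python
from dataclasses import dataclass, field

@dataclass
class StudentActivity:
    """Illustrative record of one student's first-weeks activity (not Blackboard's schema)."""
    student_id: str
    days_since_last_login: int          # proxy for a "login exception"
    minutes_in_lms: int                 # time spent in the learning environment
    intermediate_grades: list = field(default_factory=list)  # early test scores, 0-100

def flag_at_risk(students, max_login_gap=7, min_minutes=60, passing=55):
    """Flag students who trip any of the three signals described in the pilot.

    Thresholds are hypothetical; a real early-warning system would tune them
    per course and, as Beldhuis notes, depends on intermediate tests actually
    being graded in the grade book.
    """
    flagged = []
    for s in students:
        reasons = []
        if s.days_since_last_login > max_login_gap:
            reasons.append("login gap")
        if s.minutes_in_lms < min_minutes:
            reasons.append("low activity")
        if s.intermediate_grades:
            avg = sum(s.intermediate_grades) / len(s.intermediate_grades)
            if avg < passing:
                reasons.append("low grades")
        if reasons:
            flagged.append((s.student_id, reasons))
    return flagged
```

Note that a student with no intermediate grades yet is only flagged on the activity signals, which mirrors the pilot’s finding that courses without early graded tests make the analytics less effective.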
The Ethics of Reaching Out to At-Risk Students
At Groningen, the university is not only concerned about providing at-risk students with the right support, but also about ensuring their identities are not available to everyone on campus. Privacy issues, in particular, are of high importance in Europe due to the General Data Protection Regulation (GDPR), which took effect in May 2018.
“We do not think that a professor needs to know at what time students logged on to the system. For faculty, it would be sufficient to see, on a group level, how many students have logged on in a certain time period. In class, the professor can refer to that number and the students might draw their own conclusion on what actions to take,” points out Jan Tjeerd Groenewoud, project lead EWS at the University of Groningen.
“Having said that, it is true that a professor’s general remarks on the engagement of the group as a whole have far less impact than very specific, private feedback to individual students. The protection of privacy can be a higher priority than the efficacy of occasional professor-student feedback,” Groenewoud admits.
Although the university has not yet been able to measure and analyze project outcomes, a qualitative analysis was performed through questionnaires. Students were asked to share their opinion of the personal course report and their comfort level with being contacted by the department based on learning analytics, and the results were promising.
Outcomes of Learning Analytics Usage at Groningen
- 70% of students knew how to access their learning analytics report, and 75% of those respondents reported consulting the reports two or more times.
- 6.5% of students were contacted by the department based on learning analytics information, with 80% of them feeling “positive” or “very positive” about being contacted.
Building Curriculum Maps with Learning Analytics at the University of Windsor
Curriculum maps are a way to visualize and analyze the structure of a program for both summative program reporting and formative planning and enhancement. They illustrate how individual courses work together to support student success in achieving outcomes, providing information that is very useful for improving both program and course design.
The University of Windsor had three goals in mind when they decided to give learning analytics a try: (1) to inform program review; (2) to facilitate accreditation reporting, demonstrating where and how frequently outcomes are being assessed throughout the programs; and (3) to chart student performance with respect to specific learning outcomes throughout a program. “Although we have been doing all of these things for years, the work was manual,” remembers Allyson Skene, teaching & learning specialist at the University of Windsor.
Their first experience was a pilot at the university’s Faculty of Engineering. Work that was traditionally done by sifting through course syllabi, analyzing course grades, re-assessing achievement in assignment artifacts gleaned from various stages in student degree programs, and finally documenting and analyzing all the information on spreadsheets, was streamlined. Canadian Engineering Accreditation Board (CEAB) outcomes were aligned to specific assignments or grade center columns in Blackboard, and the information was pulled into a series of reports in A4L.
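The core of that streamlining – aligning each assessment to one or more outcomes and then rolling student performance up per outcome – can be sketched as a small aggregation. The outcome codes, assessment names, and data shapes below are hypothetical stand-ins for illustration; they are not the actual CEAB taxonomy mapping or A4L’s report schema.

```python
# Hypothetical alignment of grade-center columns to outcome codes
# (CEAB-style abbreviations used only for illustration).
ALIGNMENT = {
    "Lab 1": ["KB"],                  # Knowledge Base
    "Design Project": ["DE", "CO"],   # Design, Communication
    "Final Exam": ["KB", "PA"],       # Knowledge Base, Problem Analysis
}

def curriculum_map(grades):
    """Aggregate student scores per outcome across aligned assessments.

    grades: {assessment_name: [scores 0-100]}
    returns: {outcome: (times_assessed, average_score)} -- i.e., where/how
    often an outcome is assessed, and how students performed on it.
    """
    per_outcome = {}
    for assessment, scores in grades.items():
        for outcome in ALIGNMENT.get(assessment, []):
            entry = per_outcome.setdefault(outcome, {"n": 0, "scores": []})
            entry["n"] += 1                # one more assessment hits this outcome
            entry["scores"].extend(scores) # pool student scores for the outcome
    return {o: (e["n"], sum(e["scores"]) / len(e["scores"]))
            for o, e in per_outcome.items()}
```

The two numbers per outcome correspond to the two accreditation questions the pilot answered: where and how frequently an outcome is assessed, and how students performed on it.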
“A4L is the University of Windsor’s first learning analytics solution. One of the main goals in purchasing it was to automate at least parts of what has long been an arduous and time-consuming task,” says Skene.
The pilot was a success: the resulting reports were consistent with CEAB reporting requirements and illustrated where and how frequently students were being assessed, as well as their performance on key outcomes.
The Future of Learning Analytics: Customers Speak
“Blackboard Analytics for Learn has given us the capability to answer many faculty questions and to provide them the evidence they require before making decisions. Whether it is seeing who has logged on or how many times they have logged on to the course site through the Course at a Glance report, or seeing which students have posted to discussion forums through a dashboard (rather than having to go through each forum and count!), faculty now have powerful analytics they can apply to their course and their course site’s specific structure.”
Ian Holder, Adaptive Learning and Teaching Analyst, Learning Technologies Unit, Division of Learning and Teaching at Charles Sturt University
“Teachers could benefit from learning analytics if they see it as their role to warn students about procrastinating behavior, based on group level log-in information. Learning analytics could play a role in selecting and inviting low achievers, based on intermediate testing. With analytics, we could also support management to identify courses that need attention from the instructor, together with instructional designers, assessment experts, and others.”
Hans Beldhuis, Program and Change Manager Educational Innovation and Strategy at the University of Groningen
“We are currently working on integrating other data sources into the A4L platform, with the hopes of being able to generate more robust reporting that doesn’t rely exclusively on that which is mined from the LMS.”
Allyson Skene, Teaching & Learning Specialist at the University of Windsor
Photos by: AFP – William Philpott