I recently came across an interesting learning analytics infographic developed by Andrianes Pinantoan and his team at Open Colleges.
There is growing interest in the use, and potential abuse, of learning analytics in higher education. Educause, for example, has taken an active role in educating its membership on the topic. My own interest in learning analytics grew when I had the pleasure of attending the second Learning Analytics and Knowledge conference, LAK12, in Vancouver this year. The first LAK conference defined learning analytics as “… the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” Could this definition mean that, given the right data, we could help learners learn? And if we can help students achieve learning outcomes, can we assume that we could increase retention as well?
This definition begins with the assumption that institutions will have data to analyze. What types of data should an institution use? Most learning management systems (LMSs) allow instructors to track student behaviors, such as how often students access the system and how they interact with it (e.g., download a document, post to a discussion board, post to a blog). LMS providers such as Blackboard, Instructure, and Desire2Learn have released learning analytics tools to help educators, students, and administrators sift through LMS data linked to an institution’s student information system (e.g., grade point average, demographic data).
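To make the idea of linking these sources concrete, here is a minimal sketch in Python of joining per-student LMS activity counts to student information system records. Every field name, student ID, and value here is invented for illustration; real systems expose this data through their own reporting APIs and schemas.

```python
# Hypothetical illustration: combining LMS behavior counts with
# student information system (SIS) records, keyed by student ID.
# All identifiers and values are made up for the example.

lms_activity = {
    # student_id -> counts of tracked behaviors in the LMS
    "s001": {"logins": 42, "downloads": 17, "forum_posts": 9},
    "s002": {"logins": 5,  "downloads": 2,  "forum_posts": 0},
}

sis_records = {
    # student_id -> data from the student information system
    "s001": {"gpa": 3.4, "enrollment_status": "full-time"},
    "s002": {"gpa": 2.1, "enrollment_status": "part-time"},
}

def merge_student_data(lms, sis):
    """Join LMS counts to SIS fields for students present in both."""
    merged = {}
    for student_id in lms.keys() & sis.keys():  # set intersection of IDs
        merged[student_id] = {**lms[student_id], **sis[student_id]}
    return merged

combined = merge_student_data(lms_activity, sis_records)
print(combined["s002"]["logins"], combined["s002"]["gpa"])  # 5 2.1
```

Only students appearing in both systems are kept here; a real pipeline would also have to decide how to handle students missing from one source, which is itself a data governance question.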
The use of learning analytics is a process that requires careful planning from beginning to end. Linda Baer and John Campbell remind us of this in the chapter they wrote on learning analytics in Educause’s “Game Changers” book. An institution will need to identify the specific questions it hopes learning analytics can help answer, and to create cross-functional committees that include student, faculty, and administrative staff representatives. This group will need to discuss the types of data to collect from student information systems, LMSs, and other sources so that the institution is more likely to have data that collectively describes its students’ learning experiences. Such work will need to be guided by appropriate data governance policies and procedures that communicate, internally and externally, how data will be collected and how student identities will be protected.
A challenge, as well as a great opportunity, is interpreting learning analytics data and using it to make key improvements in student learning. Committees charged with learning analytics activities will need to include members with the knowledge and skills to use learning analytics tools and statistical techniques effectively to analyze, and make predictions from, the data selected.
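As a toy illustration of one statistical technique such a committee might apply, the sketch below fits a small logistic regression, written from scratch in Python, to predict course completion from two invented LMS-derived features (weekly logins and forum posts). The data, features, and thresholds are all assumptions made for the example; real predictive work requires far larger samples, validation, and fairness and privacy review.

```python
import math

# Invented toy data: (weekly_logins, forum_posts) -> completed (1) or not (0)
data = [
    ((9.0, 4.0), 1), ((8.0, 3.0), 1), ((7.0, 5.0), 1), ((6.0, 2.0), 1),
    ((2.0, 0.0), 0), ((1.0, 1.0), 0), ((3.0, 0.0), 0), ((2.0, 1.0), 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, lr=0.1, epochs=2000):
    """Fit logistic-regression weights by simple gradient descent."""
    w = [0.0, 0.0]  # one weight per feature
    b = 0.0
    for _ in range(epochs):
        for (x, y) in samples:
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = p - y  # gradient of the log-loss w.r.t. the logit
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

def completion_probability(w, b, logins, posts):
    """Predicted probability that a student completes the course."""
    return sigmoid(w[0] * logins + w[1] * posts + b)

w, b = train(data)
# An inactive student should score as higher risk (lower predicted
# completion probability) than a highly active one.
print(completion_probability(w, b, 1.0, 0.0) < 0.5)  # True
print(completion_probability(w, b, 8.0, 4.0) > 0.5)  # True
```

The point of the sketch is not the particular model but the workflow: the committee has to choose features, fit and validate a model, and then decide what a “high risk” score should trigger for the student.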
Institutions will also need to think carefully about how to use the statistical findings in ways that benefit student learning. Designing, developing, and running pilot studies will give an institution the opportunity to test the effectiveness of its proposed strategies, and pilot findings will help it adjust its interventions before scaling them up.
Learning analytics is not a panacea. If we are to use it to improve learning outcomes, institutions will need to ensure their internal processes are well planned and communicated clearly throughout the institution. They will also need strategies in place to tackle any problems the data reveal.