First up, I’d like to thank Simon Buckingham-Shum for his recent post on learning analytics. Rarely do I read a blog post which not only considers an ‘entry level’ question, but also then follows through multiple steps in the analysis of the topic in the way this post did. When I read the post I had the unnerving sense that the author had already thought through every question I might have considered, and answered it before I could pause to take a breath and even form the question properly in my head. I’ve been mulling the post over for the last week, and I decided that I really needed to put his post in context alongside some of my own thoughts (even if this is, in academic terms, like parking a Bugatti Veyron ‘in context alongside’ a Toyota Corolla, but never mind).

In my recent post-ASCILITE blog post, I hypothesised that one of the gaps we currently have in the field of learning analytics is interpretation as a discipline (or an art form). To quote Buckingham-Shum’s post:

‘who is supposed to make sense of the dazzling dashboards embedded in every e-learning product pitch?’
Regardless of how much effort has been put into constructing interpretive models of how successful students are in their learning journey, there exists somewhere in the mix either a ‘black box’ attempting to draw inferences from the data, or a person (or people) serving the same purpose. This poses a risk, again from Buckingham-Shum’s post, that “…data points in a graph are tiny portholes onto a rich human world…proxy indicators that do not do justice to the complexity of real people, and the rich forms that learning take.” This concept captured me, because somewhere, some time, someone needs to draw some kind of inference from the data in order for it to be put to effective use. Who that is, and how it is done – therein lies the question.
As I think back to Buckingham-Shum’s post, and the final paragraph where he states a need to focus on the upper right corner of the Knowledge-Agency Window, I start to look ahead and consider what characteristics a learning analytics function would need in order to support this kind of self-directed, open-ended enquiry learning. Three things stand out to me as necessary (but not necessarily sufficient) characteristics of such a system:

It must start by providing simple, easy-to-understand, and almost impossible-to-misinterpret measures to consumers of the data. This acknowledges that the journey towards analysis in the upper right-hand quadrant of the Knowledge-Agency Window starts at the beginning – in the lower left-hand quadrant – but it starts there as nothing more than a ‘seed’ of curiosity for the consumer.

It must be flexible enough to allow the consumers of the data – both students and institutions – to build on this simple start, asking follow-up questions and ‘following the white rabbit’ by digging deeper into the information, in a way which allows them to create the measures that are meaningful for their context. If I think back to the Moodle Engagement Analytics tool designed by Phillip Dawson and developed by the team at NetSpot (now a part of Blackboard), it did this to an extent: it gave users a starting point of a common framework of typical interaction measures on a course, but it also supported (if not encouraged) data consumers to tweak these measures in order to get a better result based on the specifics of their course. Granted, this was purely driven by the course designer, but it did at least support the concept – extend it to a truly user-driven model of analytics construction and I think the end result could be immensely powerful.
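To make the ‘tweakable common framework’ idea concrete, here is a minimal sketch in Python. It is not the Engagement Analytics tool’s actual implementation – the indicator names, thresholds and weights are all invented for illustration – but it shows the general pattern: a default set of interaction measures that a course designer (or, in a user-driven model, any data consumer) can re-weight and re-threshold for their own context.

```python
# Hypothetical sketch of configurable engagement measures.
# Indicator names, thresholds and weights are illustrative assumptions,
# not the real Moodle Engagement Analytics configuration.

from dataclasses import dataclass


@dataclass
class Indicator:
    name: str         # activity measure tracked for each student
    threshold: float  # minimum level counted as 'engaged' on this measure
    weight: float     # relative importance of this measure in this course


def engagement_score(activity: dict, indicators: list) -> float:
    """Weighted fraction (0-1) of indicators whose threshold is met."""
    total = sum(i.weight for i in indicators)
    met = sum(i.weight for i in indicators
              if activity.get(i.name, 0) >= i.threshold)
    return met / total if total else 0.0


# A default framework of typical interaction measures...
defaults = [
    Indicator("logins_per_week", 2, 1.0),
    Indicator("forum_posts", 1, 1.0),
    Indicator("assessments_submitted", 1, 2.0),
]

# ...tweaked by a designer for a discussion-heavy course,
# where forum participation matters far more than logins.
tweaked = [
    Indicator("logins_per_week", 3, 0.5),
    Indicator("forum_posts", 4, 2.5),
    Indicator("assessments_submitted", 1, 1.0),
]

student = {"logins_per_week": 3, "forum_posts": 2,
           "assessments_submitted": 1}

print(engagement_score(student, defaults))  # all thresholds met -> 1.0
print(engagement_score(student, tweaked))   # forum threshold missed -> 0.375
```

The same student looks fully engaged under the default measures but at risk under the tweaked ones – which is exactly the point: the meaningful measure depends on the course context, and the consumer of the data is best placed to define it.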

Finally, it must be designed in such a way that it helps the consumers of the information – whether they be a student, teacher or organisation – to build their own interpretation skills as they follow their own white rabbit: teaching them to question the inferences they can draw, the limitations of the data, and what else they should be considering as part of the broader picture. Like my son learning how to critically evaluate the information he finds on the internet, all consumers of learning analytics information must learn how to critically interpret their own data, and then to plan actions based on it.

(To read the entire post, go to
