Outcome-based learning, or outcome-based education, is a way of assessing learning efficacy. It’s a logical way to approach, understand, and measure the value of education by examining what a student can do after they’ve completed their studies. In higher education, there are typically three large buckets of outcomes that are tracked and discussed:

  1. Outcomes to support accreditation compliance. In this context, student output is examined against a standard rubric in order to identify the percentage of students performing at various levels. The output is tied to larger learning objectives, implying a causal chain: what is intended to be learned => what is taught => what is actually learned => what is demonstrated.
  2. Outcomes related to graduation rates. In this context, primarily in response to public outcry over poor graduation rates and demands for accountability in how tax dollars are spent, student “output” is taken to be the number of students who actually graduate, which makes it easy to issue public statements about the rise or decline of quality against historical data.
  3. Outcomes related to career placement. In this context, the “success” of an educational program is measured by whether it translates directly into vocational placement. Parents and learners searching for a return on educational investment typically use job placement data to justify the high cost of a college education, and admissions departments proactively communicate this outcome to prospective students.

It makes sense that people would want to quantify and measure higher education, given the amount of money at stake and the politics associated with state-funded institutions. But a desire to objectively track outcomes becomes problematic when these outcomes are used narrowly and retrospectively, to describe, in hindsight, best practice or positive learning methodology. This use of quantitative data ignores the complexity of the system of education: the countless experiences an individual student has over the course of their studies, both in and out of school, that shape their academic achievement, their likelihood of graduating, and their career placement. A backwards-looking outcome approach attempts to identify causality where there may be only correlation, and it ignores the emotional context in which learning occurs. As I discussed in a previous post, that emotional context plays a central role in outcomes like completion, and I suggested educators look to interaction designers for inspiration on customer journey mapping. I also think it is useful to integrate the systems perspective that designers bring to their work, particularly their focus on participatory design activities.

Designers sometimes talk about “systems thinking” as a way of acknowledging the complexity of a situation with multiple inputs and an ill-defined sense of causality. Rather than trying to identify a “root cause”, systems thinking embraces the richness of experience while recognizing that experiences can be “shepherded” in a certain direction. It’s a way of thinking about relationships between entities that are harmonious but not necessarily logical, and that elude causality. This is a fuzzier place to be, and it can feel uncomfortable, because it trades control and predictability for a much more participatory style of thinking.

In an educational context, this form of participatory systems thinking means that the professor is no longer viewed as the central, authoritative source of information; they don’t “have the knowledge”, and their role is not “to give the student the knowledge.” Instead, they exist to establish scaffolds for positive learning experiences, where knowledge may come from other students, field experts, textbooks, online resources, and so on. One reason the “flipped classroom” has found so much success, I think, is its relationship to experiential systems thinking: it recognizes that the professor’s role is to drive action within a framework, not simply to disseminate information in a top-down fashion. This has long been the model design schools use, where the majority of learning takes place in a studio environment: learning by doing, with constraints around experiences, but with a flexible way of thinking about the system of knowledge acquisition.

Outcomes assessment is a mechanism for providing insight into what’s happening in the classroom, quantifying progress, and supporting collaborative, data-informed decision making. It’s a valid way of identifying progress on a local level: a single assignment, a single class, a single course. But it ignores the systems perspective of education, which recognizes that every influence on a student will impact how they learn (and every means every, including the cultural context in which they learn, the socioeconomic background in which they grew up, their family, and so on). There’s a gestalt principle in design that the whole is greater than the sum of the parts. This holds true in education, and so a forward-thinking outcomes platform should recognize this, allowing for tracking and understanding across all parts of the academic journey.

In developing a future outcomes tool, we should focus on understanding and tracking highly subjective qualities, like sentiment and engagement. A successful outcomes instrument of the future will realize a form of “abstraction layer” between the outcome and the learning method, implying that there are countless possibilities for teaching and that teaching need not be cookie-cutter or rote. It should leverage predictive analytics during the student journey to help students actually change course midstream, rather than simply reporting compliance in retrospect.
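
To make the “abstraction layer” idea slightly more concrete, here is a minimal sketch in Python of what such a separation might look like: outcomes are defined independently of the activities used to reach them, and a deliberately simple rule flags a student for a mid-course conversation based on subjective signals like engagement and sentiment. The names (Outcome, LearningActivity, StudentSignal, needs_midstream_checkin) and the thresholds are illustrative assumptions, not the API of any existing platform.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Outcome:
    """What a student should be able to demonstrate, independent of how it is taught."""
    name: str
    description: str


@dataclass
class LearningActivity:
    """One of many possible routes to an outcome (studio critique, flipped lecture,
    peer review, internship, ...). The outcome does not prescribe the method."""
    name: str
    outcome: Outcome


@dataclass
class StudentSignal:
    """A subjective, in-flight signal gathered during the journey, not after it."""
    week: int
    engagement: float  # 0.0 (disengaged) to 1.0 (highly engaged)
    sentiment: float   # -1.0 (negative) to 1.0 (positive)


def needs_midstream_checkin(signals: List[StudentSignal],
                            engagement_floor: float = 0.4,
                            sentiment_floor: float = -0.2) -> bool:
    """A placeholder 'predictive' rule: flag a student for a conversation while they
    can still change course, rather than reporting failure afterward. The thresholds
    are illustrative, not validated cut-offs."""
    if len(signals) < 2:
        return False
    recent = signals[-2:]
    low_engagement = all(s.engagement < engagement_floor for s in recent)
    souring_sentiment = (recent[-1].sentiment < sentiment_floor
                         and recent[-1].sentiment < recent[0].sentiment)
    return low_engagement or souring_sentiment


if __name__ == "__main__":
    outcome = Outcome("systems analysis", "Frame a messy problem as a system of relationships")
    # The same outcome can be pursued through very different activities.
    activities = [
        LearningActivity("studio critique", outcome),
        LearningActivity("flipped-classroom discussion", outcome),
    ]
    print(f"{len(activities)} activities currently target '{outcome.name}'.")

    journey = [
        StudentSignal(week=3, engagement=0.70, sentiment=0.4),
        StudentSignal(week=6, engagement=0.35, sentiment=-0.1),
        StudentSignal(week=9, engagement=0.30, sentiment=-0.5),
    ]
    if needs_midstream_checkin(journey):
        print(f"Check in now, while '{outcome.name}' can still be reached another way.")
```

In a real tool the flagging rule would be replaced by a validated predictive model, but the structural point stands: the outcome never dictates the method, and the signals are gathered while there is still time to act.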

Outcome tracking in the future should attempt to understand the educational experience as a whole, not simply through politically charged metrics like accreditation compliance, job placement, or graduation rates. We should look to the ecological roots of systems thinking, to participatory design, and to the study of experience in order to better structure outcome-tracking tools and policies.
