I have never blogged before. I use Twitter . . . more as a social experiment and to keep my girlfriend up to date with my business travels, but I don’t really consider that blogging. Sending a text message to my Twitter account that I’m in the process of eating a cheese sandwich just doesn’t seem to be very interesting to anyone . . . including me.
My goal in posting to EducateInnovate this year is for this blogging experience to be a little different, because 2008 is the year we at Blackboard take a new look at “Client Success.” In my role as Blackboard’s senior vice president for Client Success, I continually interact with students, faculty members and administrators who explain to me the impact Blackboard products are having at their institutions and the benefits they are receiving.
It’s easy to see that e-Learning, in general, is scaling rapidly across the world. New clients in new markets are coming online with Blackboard software every day. Institutional and political leaders are supporting major investments in e-Learning infrastructure. Online teaching and learning using Blackboard has clearly become mission-critical to institutions around the globe, and its importance continues to grow.
For Blackboard this means our clients’ expectations of us are growing, as well. Implementations of our software are more complex than ever, and the stakes are certainly higher. We’ll focus much of our time, energy and resources this year on improving the quality of our product development and the experiences our users have with Client Support, making Blackboard an easier company to do business with, and measuring the success of our efforts in these areas.
Those of you who know me know I’m a metrics nerd. I don’t start anything unless I know exactly how I’m going to measure how well it works. Here’s an example of why:
I read an article recently about Continental Airlines. The company decided several years back that it would give financial bonuses to its pilots for using less fuel on their flights. Seems like a simple enough idea: Save money by having pilots cut back on unnecessary fuel use and then pass some of the savings on to the pilots.
Of course, something else happened that Continental hadn’t predicted.
The pilots started waiting to turn on the air conditioning in their planes until they were pulling back from the gates. That meant passengers sitting on the planes got hot, sweaty and angry. Then the pilots started slowing the planes down, as this uses less fuel . . . and also means planes were arriving at their destinations late, and passengers were missing their connections. The hot, sweaty and angry passengers naturally took their rage out on Continental’s gate agents, who had to work overtime to rebook those passengers on other airlines. What a disaster. Continental’s pilots received their bonuses, but they sure didn’t make their passengers happy, nor did the company save the money it was hoping to.
An interesting aspect of this story is that I would bet the person who originally proposed this less-fuel program did so with the very best intentions. Yet the impact of the program on both the airline’s clients and employees was awful.
Those of us at companies like Blackboard must understand thoroughly what we’re doing; how we’re doing it; our impact on clients; and, most importantly, how we’re measuring the success of our efforts. Communicating what is working and what isn’t is how a company – or learning institution – builds credibility with both staff members and clients.
I interact with many companies that say they measure quality in their organizations, but I wonder how often the data they capture is shared with their internal business units, let alone their client bases.
Recently, I hired Matt Painter to be Blackboard’s Business Intelligence Analyst. Matt previously worked in Client Support and, after a brief stint away from Blackboard, is back and dedicated to helping us measure client success throughout our organization.
Matt’s role will be to ensure we have control over the day-to-day analytical reporting of the client experience. Specifically, this means measuring the impact our programs have on our clients and employees. We’ll look to share this data with our clients as well, so you can see how we’re measuring up . . . pun intended.
Here’s looking forward to a measurable 2008. Pass the peanuts.