“What is this madness? Giving data to everybody, all the time. Madness, right?” –Kara Bosch (Director of Financial Systems and Business Intelligence Solutions, Central Piedmont Community College)

It is a common expression among athletes that “experience is what you get the moment after you needed it.” The same can be said for data governance.

The importance of institutional data is now so widely accepted within higher education that it has become almost cliché to talk about it. The role of institutional research offices has grown in both intensity and scope: they work to meet rapidly increasing reporting demands while also serving the information needs of administrators engaged in evidence-based decision-making. And yet most institutions, including many large public and R1 universities, still lack strong data governance policies. In the absence of the right people, policies, practices, and technologies, the question is not whether you will experience a data crisis. It’s when.

Supporting good data governance with powerful technology

Data governance involves the policies, systems, and practices meant to ensure that an institution’s data is accurate, complete, consistent, reliable, and available. Humans have a way of introducing error, so effective data governance is largely about meeting the information needs of people while removing them from the data management process as much as possible. No matter how good a data governance policy looks on paper, it is all for naught if it is inconsistently applied. That’s where technology comes in. Extract, transform, and load (ETL) procedures like those built into the Blackboard Intelligence data warehouse automate the process of obtaining, cleaning, and archiving data. In doing so, they force institutions to think carefully about what their data means, and they establish a single source of truth so that reporting out of the university is always consistent.
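To make this concrete, here is a minimal ETL sketch in Python. It is illustrative only: the table names, columns, and cleaning rules are hypothetical and do not reflect how Blackboard Intelligence is implemented. What it shows is that extraction, cleaning, and loading are encoded once and run the same way every time, rather than living in any one person’s head.

# Illustrative only: a minimal extract-transform-load (ETL) sketch.
# Table and column names are hypothetical, not Blackboard Intelligence internals.
import sqlite3


def extract(source):
    """Pull raw enrollment rows from a (hypothetical) transactional system."""
    return source.execute(
        "SELECT student_id, term_code, credit_hours, status FROM raw_enrollments"
    ).fetchall()


def transform(rows):
    """Standardize codes and drop rows that fail basic validation."""
    cleaned = []
    for student_id, term_code, credit_hours, status in rows:
        if credit_hours is None or credit_hours < 0:
            continue                                   # reject bad rows rather than guess
        cleaned.append((student_id, term_code, credit_hours,
                        (status or "").strip().upper()))   # one canonical status coding
    return cleaned


def load(warehouse, rows):
    """Write cleaned rows to the warehouse table that all reporting reads from."""
    warehouse.executemany("INSERT INTO fact_enrollment VALUES (?, ?, ?, ?)", rows)
    warehouse.commit()


if __name__ == "__main__":
    # Stand-in databases so the sketch runs end to end.
    source = sqlite3.connect(":memory:")
    warehouse = sqlite3.connect(":memory:")
    source.execute("CREATE TABLE raw_enrollments (student_id, term_code, credit_hours, status)")
    source.executemany(
        "INSERT INTO raw_enrollments VALUES (?, ?, ?, ?)",
        [("S1", "2017FA", 12, " enrolled "), ("S2", "2017FA", -3, "enrolled")],
    )
    warehouse.execute("CREATE TABLE fact_enrollment (student_id, term_code, credit_hours, status)")
    load(warehouse, transform(extract(source)))
    print(warehouse.execute("SELECT * FROM fact_enrollment").fetchall())
    # [('S1', '2017FA', 12, 'ENROLLED')] -- the invalid row was excluded

Because the same rules run on every load, a question like “what counts as a valid enrollment” is answered once, in one place, which is exactly what a single source of truth requires.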

It is perhaps easy to think of a data warehouse as merely a repository of stored data: the raw material out of which users derive additional variables and conduct analyses according to their specific purposes. From a data governance perspective, however, it is vital that an institution’s data warehouse strive to be as complete as possible. I know of institutions, for example, that have built their infrastructure from the ground up to meet only a limited number of use cases. When it comes time to do more than the most basic reporting, analysts are forced to supplement data from the warehouse with additional information pulled from transactional systems and to write custom queries to create derived variables. When those queries and transformations live on an analyst’s desktop computer, reliability becomes a serious issue. Trying to reverse-engineer the complex, undocumented logic of a former employee can be a nightmare, and it is unnerving to realize that you cannot reproduce publicly reported figures.
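One way to avoid the desktop-query problem is to define derived variables once, inside the warehouse, where they are documented and shared. The sketch below continues the hypothetical schema from the earlier example and puts a made-up full-time/part-time definition into a database view; the threshold and all names are assumptions for illustration, not a standard definition.

# Keep derived-variable logic in the warehouse, not on an analyst's desktop.
# The full-time threshold and every name here are hypothetical.
import sqlite3

DERIVED_ENROLLMENT_VIEW = """
CREATE VIEW IF NOT EXISTS enrollment_reporting AS
SELECT
    student_id,
    term_code,
    credit_hours,
    -- Derived variable: documented, versioned, and identical for every report.
    CASE WHEN credit_hours >= 12 THEN 'FULL_TIME' ELSE 'PART_TIME' END
        AS enrollment_intensity
FROM fact_enrollment;
"""


def publish_derived_variables(warehouse: sqlite3.Connection) -> None:
    """Apply the shared definitions so every analyst queries the same logic."""
    warehouse.executescript(DERIVED_ENROLLMENT_VIEW)

With the view in place, an analyst’s report is just a query against enrollment_reporting, and changing the definition later is one visible edit to a shared object rather than a hunt through personal query files.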

Doing amazing things

Of course, data governance is about much more than mitigating the risk of public embarrassment. If you have confidence in the accuracy, completeness, consistency, and reliability of your institution’s data, and it is made available centrally as a single source of truth, it becomes possible to do amazing things. A good and consistently applied data governance policy breeds trust, which is vital if you are going to promote a culture of evidence-based decision-making. When you know what data you have and are confident in your role-based security rules, you can democratize your data. Like Coppin State University, you can put your student success data in everyone’s hands in a way that orients your entire campus around a specific goal. By democratizing its data, Coppin State was able to increase its enrollment by 50% after several years of decline. But opening up your institution’s data without the right policies and technologies in place would be incredibly unwise.
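As a rough illustration of what role-based security rules can mean in practice, here is a small Python sketch in which each role sees only the columns and rows its policy allows. The roles, fields, and rules are invented for the example and are not Blackboard Intelligence’s actual security model.

# Hypothetical role-based access sketch: filter rows and columns by role.
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class RolePolicy:
    allowed_columns: tuple                    # columns this role may see
    department_filter: Optional[str] = None   # None means all departments


POLICIES = {
    "institutional_research": RolePolicy(("student_id", "department", "gpa")),
    "biology_chair": RolePolicy(("department", "gpa"), department_filter="BIO"),
}


def apply_policy(rows, role):
    """Return only the rows and columns the given role is entitled to see."""
    policy = POLICIES[role]
    visible = []
    for row in rows:
        if policy.department_filter and row["department"] != policy.department_filter:
            continue
        visible.append({col: row[col] for col in policy.allowed_columns})
    return visible


if __name__ == "__main__":
    records = [
        {"student_id": "S1", "department": "BIO", "gpa": 3.4},
        {"student_id": "S2", "department": "ENG", "gpa": 2.9},
    ]
    print(apply_policy(records, "biology_chair"))
    # [{'department': 'BIO', 'gpa': 3.4}] -- no student IDs, no other departments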

Creating and implementing a data governance policy can sound like a lot of hard work. And it is. Fortunately, you don’t have to boil the ocean overnight. As Kara Bosch remarked at the 2017 Blackboard Analytics Symposium, a policy begins with a plan. And a plan begins with creating opportunities to get the right people in the room to agree on its importance.

Making a case for data governance

A comprehensive data warehouse can be incredibly helpful in making a case for data governance in the first place. For many of our customers, simply installing Blackboard Intelligence and going through the process of data validation has brought to light significant data integrity issues, and has forced conversations that have led to common data definitions.
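As a small example of the kind of check data validation involves, the sketch below, again using the hypothetical schema from the earlier examples, looks for enrollment rows that reference a student missing from the student table. Surfacing rows like these is exactly the sort of finding that forces a conversation about definitions and data-entry practice.

# A sample integrity check: enrollments whose student is missing from the
# student dimension. All table and column names are hypothetical.
import sqlite3

ORPHAN_ENROLLMENTS = """
SELECT f.student_id, f.term_code
FROM fact_enrollment AS f
LEFT JOIN dim_student AS s ON s.student_id = f.student_id
WHERE s.student_id IS NULL;
"""


def find_orphan_enrollments(warehouse: sqlite3.Connection):
    """Return enrollment rows whose student has no row in dim_student."""
    return warehouse.execute(ORPHAN_ENROLLMENTS).fetchall()


if __name__ == "__main__":
    warehouse = sqlite3.connect(":memory:")
    warehouse.execute("CREATE TABLE dim_student (student_id)")
    warehouse.execute("CREATE TABLE fact_enrollment (student_id, term_code, credit_hours, status)")
    warehouse.execute("INSERT INTO dim_student VALUES ('S1')")
    warehouse.executemany(
        "INSERT INTO fact_enrollment VALUES (?, ?, ?, ?)",
        [("S1", "2017FA", 12, "ENROLLED"),
         ("S9", "2017FA", 9, "ENROLLED")],    # 'S9' has no matching student
    )
    print(find_orphan_enrollments(warehouse))   # [('S9', '2017FA')]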

There are many ways to approach data governance. Developing an effective strategy is a campus-wide effort involving people and culture as much as it involves technology. It is a long process that may take years, and that requires constant feeding and nurturing. Establishing a strong data governance policy overnight is unrealistic. Starting now, however, is essential.

To learn more about how Blackboard’s data warehouse solution, Blackboard Intelligence, can support your institution’s data governance policy, please visit our product web page or contact us to schedule a demo.
