Data governance maturity models – Stanford
As an educator, speaker, and data governance professional I sometimes get asked,
“How can you measure the effectiveness of your data governance program?”
From my experience, there are two main ways to do so:
- Through a data governance scorecard
- By evaluating it against a data governance maturity model
This series of articles focuses on the second option by offering an introduction to some of the existing maturity models. So far I’ve covered IBM’s maturity model, and it makes the most sense to continue with Stanford’s, since it uses the same maturity levels as IBM’s.
Stanford Maturity Model
Overview: Developed in 2011 by Stanford University’s Data Governance Office, the model was adapted from other models, such as IBM’s and the Capability Maturity Model (CMM). It is based on the structure of Stanford’s own data governance program, with a focus on both the foundational and the project aspects of data governance.
The foundational aspects focus on measuring core data governance competencies and development of critical program resources, as follows:
- Awareness: The extent to which individuals within the organization have knowledge of the roles, rules, and technologies associated with the data governance program.
- Formalization: The extent to which roles are structured in an organization and the activities of the employees are governed by rules and procedures.
- Metadata: Data that describes other data and IT assets (such as databases, tables, and applications) by relating essential business and technical information, and that facilitates a consistent understanding of the characteristics and usage of data. Technical metadata describes data elements and other IT assets as well as their use, representation, context, and interrelations. Business metadata answers the who, what, where, when, why, and how for users of the data and other IT assets.
The project components measure how effectively data governance concepts are applied in the course of funded projects (Stanford, 2011):
- Stewardship: The formalization of accountability for the definition, usage, and quality standards of specific data assets within a defined organizational scope.
- Data Quality: The continuous process for defining the parameters for specifying acceptable levels of data quality to meet business needs, and for ensuring that data quality meets these levels.
- Master Data: Business-critical data that is highly shared across the organization. Master data is often codified data, data describing the structure of the organization, or key data entities (such as “patient”, “employee”, or “student”).
If you want to know about the different types of data stewards, please read our other article, too.
Next, each of the six maturity components above is further subdivided along three dimensions:
- People: Roles and organization structures.
- Policies: Development, auditing and enforcement of data policies, standards and best practices.
- Capabilities: Enabling technologies and techniques.
Stanford also provides guiding questions for each of the six components across the three dimensions; these are very useful for guiding your assessment.
To gauge the maturity of the qualitative aspects of the program, use a table similar to the one presented below to record your scores in the component/dimension matrix. The average attained across each component and dimension is your organization’s maturity level in that respective area.
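To make the averaging concrete, here is a minimal sketch of that component/dimension matrix in Python. The six components and three dimensions come from the Stanford model; the scores themselves are made-up example values on a 1–5 scale, not Stanford’s data.

```python
# scores[component][dimension] = assessed maturity level (1 = lowest, 5 = highest)
# Example values only; replace with your own assessment results.
scores = {
    "Awareness":     {"People": 3, "Policies": 2, "Capabilities": 2},
    "Formalization": {"People": 2, "Policies": 3, "Capabilities": 1},
    "Metadata":      {"People": 1, "Policies": 2, "Capabilities": 3},
    "Stewardship":   {"People": 3, "Policies": 3, "Capabilities": 2},
    "Data Quality":  {"People": 2, "Policies": 2, "Capabilities": 2},
    "Master Data":   {"People": 1, "Policies": 1, "Capabilities": 2},
}

def component_averages(scores):
    """Average each component's scores across the three dimensions."""
    return {c: sum(d.values()) / len(d) for c, d in scores.items()}

def dimension_averages(scores):
    """Average each dimension's scores across the six components."""
    dims = {}
    for row in scores.values():
        for name, value in row.items():
            dims.setdefault(name, []).append(value)
    return {name: sum(vals) / len(vals) for name, vals in dims.items()}

print(component_averages(scores))
print(dimension_averages(scores))
```

Each row average tells you how mature a single component (e.g. Metadata) is overall, while each column average tells you how mature a single dimension (e.g. Policies) is across the whole program.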
Takeaway: The model was designed with Stanford’s own goals, priorities, and competencies in mind, though it can also be customized to meet the needs of your organization. An initial assessment early in your data governance program is recommended, followed by annual remeasurement.
- Data Governance at Stanford: The Stanford DG Maturity Model (DG newsletter from October 2011)
- Measuring Data Governance Maturity at Stanford (presentation from November 2011)
- DG Maturity Model and Descriptions (document from October 2011)
The next maturity model I’ll cover will be from DataFlux.