Data Governance Star Wars: Balancing Bureaucracy and Agility

I was recently discussing data governance best practices with Rob Karel, the well-respected analyst at Forrester Research, and our conversation migrated to one of data governance’s biggest challenges — how to balance bureaucracy and business agility.

So Rob and I thought it would be fun to tackle this dilemma in a Star Wars-themed debate across our individual blog platforms, with Rob taking the position for Bureaucracy as the Empire, and me taking the opposing position for Agility as the Rebellion.

(Yes, the cliché is true, conversations between self-proclaimed data geeks tend to result in Star Wars or Star Trek parallels.)

Disclaimer: Remember that this is a true debate format, where Rob and I are intentionally arguing polar-opposite positions, with full knowledge that, in reality, data governance success requires effectively balancing bureaucracy and agility.

Please take the time to read both of our blog posts, then we encourage your comments — and your votes (see the poll below).

Data Governance Star Wars


The Force is Too Strong with This One

“Don’t give in to Bureaucracy—that is the path to the Dark Side of Data Governance.”

Data governance requires coordinating a myriad of factors, including executive sponsorship, funding, decision rights, arbitration of conflicting priorities, policy definition, policy implementation, data quality remediation, data stewardship, business process optimization, technology enablement, and, perhaps most notably, policy enforcement.

When confronted by this phantom menace of complexity, many organizations believe that the only path to success must be command and control—institute a rigid bureaucracy to dictate policies, demand compliance, and dole out punishments.  This approach to data governance often makes policy compliance feel like imperial rule, and policy enforcement feel like martial law.

But beware.  Bureaucracy, command, control—the Dark Side of Data Governance are they.  Once you start down the dark path, forever will it dominate your destiny, consume your organization it will.

No Time to Discuss this as a Committee

“There is a great disturbance in the Data, as if millions of voices suddenly cried out for Governance but were suddenly silenced.  I fear something terrible has happened.  I fear another organization has started by creating a Data Governance Committee.”

Yes, it’s true—at some point, an official Data Governance Committee (or Council, or Board, or Galactic Senate) will be necessary.

However, one of the surest ways to guarantee the failure of a new data governance program is to start by creating a committee.  This is often done with the best of intentions, bringing together key stakeholders from all around the organization, representatives of each business unit and business function, as well as data and technology stakeholders.  But when you start by discussing data governance as a committee, you often never get data governance out of the committee (i.e., all talk, mostly arguing, no action).

Successful data governance programs often start with a small band of rebels (aka change agents) struggling to restore quality to some business-critical data, or struggling to resolve inefficiencies in a key business process.  Once news of their successful pilot project spreads, more change agents will rally to the cause—because that’s what data governance truly requires, not a committee, but a cause to believe in and fight for—especially after the Empire of Bureaucracy strikes back and tries to put down the rebellion.

Collaboration is the Data Governance Force

“Collaboration is what gives a data governance program its power.  Its energy binds us together.  Cooperative beings are we.  You must feel the Collaboration all around you, among the people, the data, the business process, the technology, everywhere.”

Many rightly lament the misleading term “data governance” because it appears to put the emphasis on “governing data.”

Data governance actually governs the interactions among business processes, data, technology and, most important—people.  It is the organization’s people, empowered by high quality data and enabled by technology, who optimize business processes for superior corporate performance.  Data governance reveals how truly interconnected and interdependent the organization is, showing how everything that happens within the enterprise happens as a result of the interactions occurring among its people.

Data governance provides the framework for communication and collaboration among business, data, and technical stakeholders.  It establishes an enterprise-wide understanding of the roles, responsibilities, and accountability required to support the organization’s business activities, and to materialize the value of the enterprise’s data as positive business impacts.

Enforcing data governance policies with command and control is the quick and easy path—to failure.  Principles, not policies, are what truly give a data governance program its power.  Communication and collaboration are the two most powerful principles.

“May the Collaboration be with your Data Governance program.  Always.”

Always in Motion is the Future

“Be mindful of the future, but not at the expense of the moment.  Keep your concentration here and now, where it belongs.”

Perhaps the strongest case against bureaucracy in data governance is the business agility that is necessary for an organization to survive and thrive in today’s highly competitive and rapidly evolving marketplace.  The organization must follow what works for as long as it works, but without being afraid to adjust as necessary when circumstances inevitably change.

Change is the only galactic constant, which is why data governance policies can never be cast in stone (or frozen in carbonite).

Will a well-implemented data governance strategy continue to be successful?  Difficult to see.  Always in motion is the future.  And this is why, when it comes to deliberately designing a data governance program for agility: “Do or do not.  There is no try.”

Click here to read Rob “Darth” Karel’s blog post entry in this data governance debate

Please feel free to also post a comment below and explain your vote or simply share your opinions and experiences.

Listen to Data Governance Star Wars on OCDQ Radio — In Part 1, Rob Karel and I discuss our blog mock debate, which is followed by a brief Star Wars themed intermission, and then in Part 2, Gwen Thomas joins us to provide her excellent insights.

Related Posts

DQ-View: Roman Ruts on the Road to Data Governance

The Data Governance Oratorio

Zig-Zag-Diagonal Data Governance

Data Governance and the Buttered Cat Paradox

Beware the Data Governance Ides of March

The Collaborative Culture of Data Governance

Connect Four and Data Governance

The Role Of Data Quality Monitoring In Data Governance

Podcast: Data Governance is Mission Possible

Jack Bauer and Enforcing Data Governance Policies

The Prince of Data Governance

MacGyver: Data Governance and Duct Tape

Beyond a “Single Version of the Truth”

This post is involved in a good-natured contest (i.e., a blog-bout) with two additional bloggers: Henrik Liliendahl Sørensen and Charles Blyth.  Our contest is a Blogging Olympics of sorts, with the United States, Denmark, and England competing for the Gold, Silver, and Bronze medals in an event we are calling “Three Single Versions of a Shared Version of the Truth.” 

Please take the time to read all three posts and then vote for who you think has won the debate (see poll below).  Thanks!


The “Point of View” Paradox

In the early 20th century, within his Special Theory of Relativity, Albert Einstein introduced the concept that space and time are interrelated entities forming a single continuum, and that the passage of time can therefore vary for each individual observer.

One of the many brilliant insights of special relativity was that it could explain why different observers can make validly different observations – it was a scientifically justifiable matter of perspective. 

It was Einstein's apprentice, Obi-Wan Kenobi (to whom Albert explained “Gravity will be with you, always”), who stated:

“You're going to find that many of the truths we cling to depend greatly on our own point of view.”

The Data-Information Continuum

In the early 21st century, within his popular blog post The Data-Information Continuum, Jim Harris introduced the concept that data and information are interrelated entities forming a single continuum, and that speaking of oneself in the third person is the path to the dark side.

I use the Dragnet definition for data – it is “just the facts” collected as an abstract description of the real-world entities that the enterprise does business with (e.g., customers, vendors, suppliers).

Although a common definition for data quality is fitness for the purpose of use, the common challenge is that data has multiple uses – each with its own fitness requirements.  Viewing each intended use as the information that is derived from data, I define information as data in use or data in action.

Quality within the Data-Information Continuum has both objective and subjective dimensions.  Data's quality is objectively measured separate from its many uses, while information's quality is subjectively measured according to its specific use.


Objective Data Quality

Data quality standards provide a highest common denominator to be used by all business units throughout the enterprise as an objective data foundation for their operational, tactical, and strategic initiatives. 

In order to lay this foundation, raw data is extracted directly from its sources, then profiled, analyzed, transformed, cleansed, documented, and monitored by data quality processes designed to provide and maintain universal data sources for the enterprise's information needs. 

At this phase of the architecture, the manipulations of raw data must be limited to objective standards and not be customized for any subjective use.  From this perspective, data is now fit to serve (as at least the basis for) each and every purpose.
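This objective phase can be pictured as a small set of use-agnostic validation rules — checks that must hold for every downstream use of the data, with no use-specific customization.  The sketch below is a minimal illustration of the idea; the field names and rules are hypothetical examples, not a prescription.

```python
# A minimal sketch of objective data quality checks: rules that hold for
# every downstream use of the data, with no use-specific customization.
# Field names and rules are hypothetical examples.

from datetime import date

def profile_record(record):
    """Return a list of objective quality issues found in one raw record."""
    issues = []
    # Completeness: core identifying fields must be populated for any use.
    for field in ("customer_id", "name"):
        if not record.get(field):
            issues.append(f"missing {field}")
    # Validity: a birth date cannot be in the future, regardless of use.
    birth = record.get("birth_date")
    if birth is not None and birth > date.today():
        issues.append("birth_date in the future")
    return issues

def profile_dataset(records):
    """Objective profiling: measure quality separate from any specific use."""
    flagged = {i: profile_record(r) for i, r in enumerate(records)}
    return {i: problems for i, problems in flagged.items() if problems}

records = [
    {"customer_id": "C001", "name": "Ada", "birth_date": date(1980, 5, 1)},
    {"customer_id": "", "name": "Grace", "birth_date": None},
]
print(profile_dataset(records))  # only the second record fails a rule
```

The point of the sketch is that none of these rules mention marketing, finance, or any other consumer — they measure the data's quality separate from its many uses.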


Subjective Information Quality

Information quality standards (starting from the objective data foundation) are customized to meet the subjective needs of each business unit and initiative.  This approach leverages a consistent enterprise understanding of data while also providing the information necessary for day-to-day operations.

But please understand: customization should not be performed simply for the sake of it.  You must always define your information quality standards by using the enterprise-wide data quality standards as your initial framework. 

Whenever possible, enterprise-wide standards should be enforced without customization.  The key word within the phrase “subjective information quality standards” is standards — as opposed to subjective, which can quite often be misinterpreted as “you can do whatever you want.”  Yes you can – just as long as you have justifiable business reasons for doing so.

This approach to implementing information quality standards has three primary advantages.  First, it reinforces a consistent understanding and usage of data throughout the enterprise.  Second, it requires each business unit and initiative to clearly explain exactly how they are using data differently from the rest of your organization, and more important, justify why.  Finally, all deviations from enterprise-wide data quality standards will be fully documented. 
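One way to picture this layering is a derivation function: each business unit starts from the enterprise-wide standards and may override a rule only by recording a business justification, which doubles as the documentation of the deviation.  Everything in this sketch — the rule names, the marketing override, the justification text — is a hypothetical illustration.

```python
# Sketch: subjective information quality standards derived from the
# enterprise-wide (objective) data quality standards, where every
# deviation requires, and records, a business justification.
# Rule names and values are hypothetical examples.

ENTERPRISE_STANDARDS = {
    "phone_format": "E.164",
    "name_case": "title",
    "completeness_threshold": 0.98,
}

def derive_information_standards(business_unit, overrides):
    """Start from the enterprise framework; apply only justified overrides."""
    standards = dict(ENTERPRISE_STANDARDS)  # the initial framework
    deviations = []
    for rule, (value, justification) in overrides.items():
        if not justification:
            raise ValueError(
                f"{business_unit}: override of '{rule}' needs a justification")
        standards[rule] = value
        deviations.append({"unit": business_unit, "rule": rule,
                           "from": ENTERPRISE_STANDARDS.get(rule),
                           "to": value, "why": justification})
    # The deviations list is the documentation of every departure from
    # the enterprise-wide standards.
    return standards, deviations

marketing, log = derive_information_standards(
    "marketing",
    {"phone_format": ("national", "local call-center dialing lists")},
)
print(marketing["phone_format"])  # a documented deviation
print(marketing["name_case"])     # inherited, unchanged, from the enterprise
```

Note that the enterprise standards themselves are never modified — each business unit gets its own derived copy, so the consistent enterprise understanding of data is preserved alongside the justified exceptions.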


The “One Lie Strategy”

A common objection to separating quality standards into objective data quality and subjective information quality is the enterprise's significant interest in creating what is commonly referred to as a “Single Version of the Truth.”

However, in his excellent book Data Driven: Profiting from Your Most Important Business Asset, Thomas Redman explains:

“A fiendishly attractive concept is...'a single version of the truth'...the logic is compelling...unfortunately, there is no single version of the truth. 

For all important data, there are...too many uses, too many viewpoints, and too much nuance for a single version to have any hope of success. 

This does not imply malfeasance on anyone's part; it is simply a fact of life. 

Getting everyone to work from a single version of the truth may be a noble goal, but it is better to call this the 'one lie strategy' than anything resembling truth.”

Beyond a “Single Version of the Truth”

In the classic 1985 film Mad Max Beyond Thunderdome, the title character arrives in Bartertown, ruled by the evil Auntie Entity, where people living in the post-apocalyptic Australian outback go to trade for food, water, weapons, and supplies.  Auntie Entity forces Mad Max to fight her rival Master Blaster to the death within a gladiator-like arena known as Thunderdome, which is governed by one simple rule:

“Two men enter, one man leaves.”

I have always struggled with the concept of creating a “Single Version of the Truth.”  I imagine all of the key stakeholders from throughout the enterprise arriving in Corporatetown, ruled by the Machiavellian CEO known only as Veritas, where all business units and initiatives must go to request funding, staffing, and continued employment.  Veritas forces all of them to fight their Master Data Management rivals within a gladiator-like arena known as Meetingdome, which is governed by one simple rule:

“Many versions of the truth enter, a Single Version of the Truth leaves.”

For any attempted “version of the truth” to truly be successfully implemented within your organization, it must take into account both the objective and subjective dimensions of quality within the Data-Information Continuum. 

Both aspects of this shared perspective of quality must be incorporated into a “Shared Version of the Truth” that enforces a consistent enterprise understanding of data, but that also provides the information necessary to support day-to-day operations.

The Data-Information Continuum is governed by one simple rule:

“All validly different points of view must be allowed to enter,

In order for an all encompassing Shared Version of the Truth to be achieved.”


You are the Judge

This post is involved in a good-natured contest (i.e., a blog-bout) with two additional bloggers: Henrik Liliendahl Sørensen and Charles Blyth.  Our contest is a Blogging Olympics of sorts, with the United States, Denmark, and England competing for the Gold, Silver, and Bronze medals in an event we are calling “Three Single Versions of a Shared Version of the Truth.” 

Please take the time to read all three posts and then vote for who you think has won the debate.  A link to the same poll is provided on all three blogs.  Therefore, wherever you choose to cast your vote, you will be able to view an accurate tally of the current totals. 

The poll will remain open for one week, closing at midnight on November 19 so that the “medal ceremony” can be conducted via Twitter on Friday, November 20.  Additionally, please share your thoughts and perspectives on this debate by posting a comment below.  Your comment may be copied (with full attribution) into the comments section of all of the blogs involved in this debate.


Related Posts

Poor Data Quality is a Virus

The General Theory of Data Quality

The Data-Information Continuum

Blog-Bout: “Risk” versus “Monopoly”

A “blog-bout” is a good-natured debate between two bloggers.  This blog-bout is between Jim Harris and Phil Simon, where they debate which board game is the better metaphor for an Information Technology (IT) project: “Risk” or “Monopoly.”


Why “Risk” is a better metaphor for an IT Project

By Jim Harris

IT projects and “Risk” have a great deal in common.  I thought long and hard about this while screaming obscenities and watching professional sports on television, the source of all of my great thinking.  I came up with five world dominating reasons.

1. Both things start with the players marking their territory.  In Risk, the game begins with the players placing their “armies” on the territories they will initially occupy.  On IT projects, the different groups within the organization will initially claim their turf. 

Please note that the term “Information Technology” is being used in a general sense to describe a project (e.g. Data Quality, Master Data Management, etc.) and should not be confused with the IT group within an organization.  At a very high level, the Business and IT are the internal groups representing the business and technical stakeholders on a project.

The Business usually owns the data and understands its meaning and use in the day-to-day operation of the enterprise.  IT usually owns the hardware and software infrastructure of the enterprise's technical architecture. 

Both groups can claim they are only responsible for what they own, resist collaborating with the “other side” and therefore create organizational barriers as fiercely defended as the continental borders of Europe and Asia in Risk.

2. In both, there are many competing strategies.  In Risk, the official rules of the game include some basic strategies, and over the years many players have developed their own foolproof plans to guarantee victory.  Some strategies advocate focusing on controlling entire continents, while others advise fortifying your borders by invading and occupying neighboring territories.  And my blog-bout competitor Phil Simon half-jokingly claims that the key to winning Risk is securing the island nation of Madagascar.

On IT projects, you often hear a lot of buzzwords and strategies bandied about, such as Lean, Agile, Six Sigma, and Kaizen, to name but a few.  Please understand – I am an advocate for methodology and best practices, and there are certainly many excellent frameworks out there, including the paradigms I just mentioned.

However, a general problem that I have with most frameworks is their tendency to adopt a one-size-fits-all strategy, which I believe is an approach that is doomed to fail.  Any implemented framework must be customized to adapt to an organization’s unique culture. 

In part, this is necessary because implementing changes of any kind will be met with initial resistance, but an attempt at forcing a one-size-fits-all approach almost sends a message to the organization that everything they are currently doing is wrong, which will of course only increase the resistance to change. 

Starting with a framework simply provides a reference of best practices and recommended options of what has worked on successful IT projects.  The framework should be reviewed in order to determine what can be learned from it and to select what will work in the current environment and what simply won't.     

3. Pyrrhic victories are common during both endeavors.  In Risk, sacrificing everything to win a single battle or to defend your favorite territory can ultimately lead you to lose the war.  On IT projects, political fiefdoms can undermine what could otherwise have been a successful effort.  Do not underestimate the unique challenges of your corporate culture.

Obviously, business, technical and data issues will all come up from time to time, and there will likely be disagreements regarding how these issues should be prioritized.  Some issues will likely affect certain stakeholders more than others. 

Keeping data and technology aligned with business processes requires getting people aligned and free to communicate their concerns.  Coordinating discussions with all of the stakeholders and maintaining open communication can prevent a Pyrrhic victory for one stakeholder from causing the overall project to fail.

4. Alliances are the key to true victory.  In Risk, it is common for players to form alliances by combining their resources and coordinating their efforts in order to defend their shared borders or to eliminate a common enemy. 

On IT projects, knowledge about data, business processes, and supporting technology is spread throughout the organization.  Neither the Business nor IT alone has all of the information required to achieve success. 

Successful projects are driven by an executive management mandate for the Business and IT to forge an alliance of ongoing and iterative collaboration throughout the entire project.

5. The outcomes of both are too often left to chance.  IT projects are complex, time-consuming, and expensive enterprise initiatives.  Success requires people taking on the challenge united by collaboration, guided by an effective methodology, and implementing a solution using powerful technology.

But the complexity of an IT project can sometimes work against your best intentions.  It is easy to get pulled into the mechanics of documenting the business requirements and functional specifications, drafting the project plan and then charging ahead on the common mantra: “We planned the work, now we work the plan.”

Once an IT project achieves some momentum, it can take on a life of its own and the focus becomes more and more about making progress against the tasks in the project plan, and less and less on the project's actual business goals.  Typically, this leads to another all too common mantra: “Code it, test it, implement it into production, and then declare victory.”

In Risk, the outcomes are literally determined by a roll of the dice.  If you allow your IT project to lose sight of its business goals, then you treat it like a game of chance.  And to paraphrase Albert Einstein:

“Do not play dice with IT Projects.”

Why “Monopoly” is a better metaphor for an IT Project

By Phil Simon

IT projects and “Monopoly” have a great deal in common.  I thought long and hard about this at the gym, the source of all of my great thinking.  I came up with six really smashing reasons.

1. Both things take much longer than originally expected.  IT projects typically take much longer than expected for a wide variety of reasons.  Rare is the project that finishes on time (with expected functionality delivered).

The same holds true for Monopoly.  Remember when you were a kid and you wanted to play a quick game?  Now, I consider the term “a quick game of Monopoly” to be the very definition of an oxymoron.  You’d better block off about four to six hours for a proper game.  Unforeseen complexities will doubtlessly delay even the best intentions.

2. During both endeavors, screaming matches typically erupt.  Many projects become tense.  I remember one in which two participants nearly came to blows.  Most projects have key players engage in very heated debates over strategic vision and execution.

With Monopoly, especially after the properties are divvied up, players scream and yell over what constitutes a “fair” deal.  “What do you mean Boardwalk for Ventnor Avenue and Pennsylvania Railroad isn’t reasonable?  IT’S COMPLETELY FAIR!”  Debates like this are the rule, not the exception.

3. While the basic rules may be the same, different people play by different rules.  The vast majority of projects on which I have worked have had the usual suspects: steering committees, executive sponsors, PMOs, different stages of testing, and ultimately system activation.  However, different organizations often try to do things in vastly different ways.  For example, on two similar projects in different organizations, you are likely to find differences with respect to:

  • the number of internal and external folks assigned to a project
  • the project’s timeline and budget
  • project objectives

By the same token, people play Monopoly in somewhat different ways.  Many don’t know about the auction rule.  Others replenish Free Parking with a new $500 bill after someone lands on it.  Also, many people disregard altogether the property assessment card while sticklers like me assess penalties when that vaunted red card appears.

4. Personal relationships can largely determine the outcome in both.  Negotiation is key on IT projects.  Clients negotiate rates, prices, and responsibilities with consulting vendors and/or software vendors.

In Monopoly, personal rivalries play a big part in who makes a deal with whom.  Often players chime in (uninvited, of course) with their opinions on potential deals, no doubt hoping to affect the outcome.

5. Little things really matter, especially at the end.  Towards the end of an IT project, snakes in the woodwork often come out to bite people when they least expect it.  A tightly staffed or planned project may not be able to withstand a relatively minor problem, especially if the go-live date is non-negotiable.

In Monopoly, the same holds true.  Laugh all you want when your opponent builds hotels on Mediterranean Avenue and Baltic Avenue, but at the end of the game those $250 and $450 charges can really hurt, especially when you’re low on cash.

6. Many times, each does not end; it is merely abandoned.  A good percentage of projects have their plugs pulled prior to completion.  A CIO may grow tired of an interminable project and decide to simply end it before costs skyrocket even further.

I’d say that about half of the Monopoly games that I’ve played in the last fifteen years have also been called by “executive decision.”  The writing is on the board, as 1 a.m. rolls around and only two players remain.  Often player X simply cedes the game to player Y.


You are the Referee

All bouts require a referee.  Blog-bouts are refereed by the readers.  Therefore, please cast your vote in the poll and also weigh in on this debate by sharing your thoughts by posting a comment below.  Since a blog-bout is co-posted, your comments will be copied (with full attribution) into the comments section of both of the blogs co-hosting this blog-bout.


About Jim Harris

Jim Harris is the Blogger-in-Chief at Obsessive-Compulsive Data Quality (OCDQ), which is an independent blog offering a vendor-neutral perspective on data quality.  Jim is also an independent consultant, speaker, writer and blogger with over 15 years of professional services and application development experience in data quality (DQ), data integration, data warehousing (DW), business intelligence (BI), customer data integration (CDI), and master data management (MDM).  Jim is also a contributing writer to Data Quality Pro, the leading online magazine and community resource dedicated to data quality professionals.


About Phil Simon

Phil Simon is the author of the acclaimed book Why New Systems Fail: Theory and Practice Collide and the highly anticipated upcoming book The Next Wave of Technologies: Opportunities from Chaos.  Phil is also an independent systems consultant and a dynamic public speaker for hire focusing on how organizations use technology.  Phil also writes for a number of technology media outlets.