Finding Data Quality


Have you ever experienced that sinking feeling, where you sense if you don’t find data quality, then data quality will find you?

In the spring of 2003, Pixar Animation Studios released one of my all-time favorite Walt Disney Pictures—Finding Nemo.

This blog post is an homage not only to the film, but also to the critically important role that data quality plays in all of your enterprise information initiatives, including business intelligence, master data management, and data governance.

I hope that you enjoy reading this blog post, but most important, I hope you always remember: “Data are friends, not food.”

Data Silos


“Mine!  Mine!  Mine!  Mine!  Mine!”

That’s the Data Silo Mantra—and it is also the bane of successful enterprise information management.  Many organizations persist in their reliance on vertical data silos, where each business unit acts as the custodian of its own private data—thereby maintaining its own version of the truth.

Impressive business growth can cause an organization to become a victim of its own success.  This success can inflict significant collateral damage, most notably on the organization’s burgeoning information architecture.

Early in its history, an organization usually has fewer systems and easily manageable volumes of data, which makes managing data quality and delivering the critical information required to make informed business decisions every day a relatively easy task—technology can serve business needs well, especially when the business and its needs are small.

However, as the organization grows, it trades effectiveness for efficiency, prioritizing short-term tactics over long-term strategy.  Seeing power in the hoarding of data rather than in the sharing of information, the organization chooses business unit autonomy over enterprise-wide collaboration—and without that collaboration, successful enterprise information management is impossible.

A data silo often merely represents a microcosm of an enterprise-wide problem—and this truth is neither convenient nor kind.

Data Profiling


“I see a light—I’m feeling good about my data . . . Good feeling’s gone—AHH!”

Although it’s not exactly a riddle wrapped in a mystery inside an enigma, understanding your data is essential to using it effectively and improving its quality—to achieve these goals, there is simply no substitute for data analysis.

Data profiling can provide a reality check for the perceptions and assumptions you may have about the quality of your data.  A data profiling tool can help you by automating some of the grunt work needed to begin your analysis.

However, it is important to remember that the analysis itself cannot be automated—you need to translate your findings into the meaningful reports and questions that will facilitate more effective communication and help establish tangible business context.

Ultimately, I believe the goal of data profiling is not to find answers, but instead, to discover the right questions. 

Discovering the right questions requires talking with data’s best friends—its stewards, analysts, and subject matter experts.  These discussions are a critical prerequisite for determining data usage, standards, and the business-relevant metrics for measuring and improving data quality.  Always remember that well-performed data profiling is a highly interactive and iterative process.
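To make the grunt work concrete, here is a minimal data profiling sketch in Python—the column data and the handful of statistics shown are hypothetical illustrations, not the output of any particular profiling tool.  It summarizes the completeness, cardinality, and format patterns that typically seed those discussions with the data’s stewards, analysts, and subject matter experts:

```python
# A minimal per-column profiling sketch (hypothetical sample data).
# Real profiling tools automate far more, but these statistics are
# the typical starting point for the analysis you cannot automate.
from collections import Counter

def profile_column(values):
    """Summarize completeness, cardinality, and format patterns for one column."""
    total = len(values)
    nulls = sum(1 for v in values if v in (None, ""))
    distinct = {v for v in values if v not in (None, "")}

    def pattern(v):
        # Reduce each value to a crude format pattern: digit -> 9, letter -> A
        return "".join("9" if c.isdigit() else "A" if c.isalpha() else c for c in str(v))

    patterns = Counter(pattern(v) for v in values if v not in (None, ""))
    return {
        "count": total,
        "null_rate": nulls / total if total else 0.0,
        "distinct": len(distinct),
        "top_patterns": patterns.most_common(3),
    }

# Hypothetical sample: phone numbers of inconsistent quality
phones = ["555-867-5309", "5558675309", None, "555-867-5309", "(555) 867-5309", ""]
print(profile_column(phones))
```

Even this toy profile surfaces the right questions rather than answers: Why do three different formats exist?  Which one is the standard?  Who owns the processes that produced the nulls?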

Defect Prevention


“You, Data-Dude, takin’ on the defects.

“You’ve got serious data quality issues, dude.”


Even though it is impossible to prevent every problem before it happens, proactive defect prevention is a highly recommended data quality best practice because the more control enforced where data originates, the better the overall quality of enterprise information will be.
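As a sketch of what enforcing control where data originates can look like, here is a minimal validation-at-entry example in Python—the field names, rules, and reference list of country codes are all hypothetical, chosen only to illustrate the technique:

```python
# A minimal defect prevention sketch: validate records at the point of
# entry so defects never reach downstream systems. All field names and
# rules below are hypothetical illustrations.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
KNOWN_COUNTRIES = {"US", "CA", "GB"}  # hypothetical reference list

def validate_customer(record):
    """Return a list of rule violations; an empty list means the record is accepted."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name is required")
    if not EMAIL_RE.match(record.get("email", "")):
        errors.append("email is not well formed")
    if record.get("country") not in KNOWN_COUNTRIES:
        errors.append("country is not a recognized code")
    return errors

good = {"name": "Dory", "email": "dory@example.com", "country": "US"}
bad = {"name": "  ", "email": "not-an-email", "country": "Ocean"}
print(validate_customer(good))  # []
print(validate_customer(bad))
```

Rejecting (or routing for correction) the bad record at its origin is far cheaper than cleansing it after it has propagated into reports and downstream applications.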

Although defect prevention is most commonly associated with business and technical process improvements, after identifying the root cause of your data defects, you will likely need to apply some of the principles of behavioral data quality.

In other words, understanding the complex human dynamics often underlying data defects is necessary for developing far more effective tactics and strategies for implementing successful and sustainable data quality improvements.

Data Cleansing


“Just keep cleansing.  Just keep cleansing.

“Just keep cleansing, cleansing, cleansing.

“What do we do?  We cleanse, cleanse.”

That’s not the Data Cleansing Theme Song—but it can sometimes feel like it.  Especially whenever poor data quality negatively impacts decision-critical information, the organization may legitimately prioritize a reactive short-term response, where the only remediation will be fixing the immediate problems.

Balancing the demands of this data triage mentality with the best practice of implementing defect prevention wherever possible will often create a very challenging situation that you must contend with on an almost daily basis.

Therefore, although comprehensive data remediation will require combining reactive and proactive approaches to data quality, you need to be willing and able to put data cleansing tools to good use whenever necessary.
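As a minimal sketch of what a reactive data cleansing pass can look like—the names and standardization rules below are hypothetical illustrations—the example standardizes formatting and then removes the duplicates that standardization reveals:

```python
# A minimal data cleansing sketch (hypothetical data): standardize
# formatting first, then remove the duplicates that only become
# visible once the values are standardized.
def standardize_name(name):
    """Trim whitespace, collapse internal runs of spaces, and fix casing."""
    return " ".join(name.split()).title()

def cleanse(names):
    seen, cleansed = set(), []
    for name in names:
        std = standardize_name(name)
        if std not in seen:  # duplicates only match after standardizing
            seen.add(std)
            cleansed.append(std)
    return cleansed

raw = ["  jim  harris ", "JIM HARRIS", "Jim Harris", "Steve Sarsfield"]
print(cleanse(raw))  # ['Jim Harris', 'Steve Sarsfield']
```

Note what this sketch does not do: it fixes the symptoms, not the process that keeps producing inconsistent entries—which is exactly why cleansing must be balanced with defect prevention.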

Communication

“It’s like he’s trying to speak to me, I know it.

Look, you’re really cute, but I can’t understand what you’re saying.

Say that data quality thing again.”

I hear this kind of thing all the time (well, not the “you’re really cute” part).

Effective communication improves everyone’s understanding of data quality, establishes a tangible business context, and helps prioritize critical data issues. 

Keep in mind that communication is mostly about listening.  Also, be prepared to face “data denial” when data quality problems are discussed.  Most often, this is a natural self-defense mechanism for the people responsible for business processes, technology, and data—after all, nobody likes to feel blamed for causing, or failing to fix, data quality problems.

The key to effective communication is clarity.  You should always make sure that all data quality concepts are clearly defined and in a language that everyone can understand.  I am not just talking about translating the techno-mumbojumbo, because even business-speak can sound more like business-babbling—and not just to the technical folks.

Additionally, don’t be afraid to ask questions or admit when you don’t know the answers.  Many costly mistakes can be made when people assume that others know (or pretend to know themselves) what key concepts and other terminology actually mean.

Never underestimate the potential negative impacts that the point of view paradox can have on communication.  For example, the perspectives of the business and technical stakeholders can often appear to be diametrically opposed.

Practicing effective communication requires shutting our mouths, opening our ears, and empathically listening to each other, instead of continuing to practice ineffective communication, where we merely take turns throwing word-darts at each other.

Collaboration

“Oh and one more thing:

When facing the daunting challenge of collaboration,

Work through it together, don't avoid it.

Come on, trust each other on this one.

Yes—trust—it’s what successful teams do.”

Most organizations suffer from a lack of collaboration, and as noted earlier, without true enterprise-wide collaboration, true success is impossible.

Beyond the data silo problem, the most common challenge for collaboration is the divide perceived to exist between the Business and IT, where the Business usually owns the data and understands its meaning and use in the day-to-day operation of the enterprise, and IT usually owns the hardware and software infrastructure of the enterprise’s technical architecture.

However, neither the Business nor IT alone has all of the necessary knowledge and resources required to truly be successful.  Data quality requires that the Business and IT forge an ongoing and iterative collaboration.

You must rally the team that will work together to improve the quality of your data.  A cross-disciplinary team is necessary because data quality is neither a business issue nor a technical issue—it is both, making it an enterprise issue.

Executive sponsors, business and technical stakeholders, business analysts, data stewards, technology experts, and yes, even consultants and contractors—only when all of you are truly working together as a collaborative team, can the enterprise truly achieve great things, both tactically and strategically.

Successful enterprise information management is spelled E—A—C.

Of course, that stands for Enterprises—Always—Collaborate.  The EAC can be one seriously challenging place, dude.

You don’t know if you know what they know, or if they know what you know, but when you know, then they know, you know?

It’s like first you are all like “Whoa!” and they are all like “Whoaaa!” then you are like “Sweet!” and then they are like “Totally!”

This critical need for collaboration might seem rather obvious.  However, as all of the great philosophers have taught us, sometimes the hardest thing to learn is the least complicated.

Okay.  Squirt will now give you a rundown of the proper collaboration technique:

“Good afternoon. We’re gonna have a great collaboration today.

Okay, first crank a hard cutback as you hit the wall.

There’s a screaming bottom curve, so watch out.

Remember: rip it, roll it, and punch it.”

Finding Data Quality


As more and more organizations realize the critical importance of viewing data as a strategic corporate asset, data quality is becoming an increasingly prevalent topic of discussion.

However, and somewhat understandably, data quality is sometimes viewed as a small fish—albeit with a “lucky fin”—in a much larger pond.

In other words, data quality is often discussed only in its relation to enterprise information initiatives such as data integration, master data management, data warehousing, business intelligence, and data governance.

There is nothing wrong with this perspective, and as a data quality expert, I admit to my general tendency to see data quality in everything.  However, regardless of the perspective from which you begin your journey, I believe that eventually you will be Finding Data Quality wherever you look as well.


An Enterprise Carol

This blog post is sponsored by the Enterprise CIO Forum and HP.

Since ‘tis the season for reflecting on the past year and predicting the year ahead, while pondering this post my mind wandered to the reflections and predictions provided by the ghosts of A Christmas Carol by Charles Dickens.  So, I decided to let the spirit of Jacob Marley revisit my previous Enterprise CIO Forum posts to bring you the Ghosts of Enterprise Past, Present, and Future.


The Ghost of Enterprise Past

Legacy applications have a way of haunting the enterprise long after they should have been sunset.  The reason most of them do not go gentle into that good night, but instead rage against the dying of their light, is that some users continue to use some of the functionality they provide, as well as the data trapped inside them, to support the enterprise’s daily business activities.

This freaky feature fracture (i.e., technology supporting business needs being splintered across new and legacy applications) leaves many IT departments overburdened with maintaining a lot of technology and data that’s not being used all that much.

The Ghost of Enterprise Past warns us that IT can’t enable the enterprise’s future if it’s stuck still supporting its past.


The Ghost of Enterprise Present

While IT was busy battling the Ghost of Enterprise Past, a familiar, but fainter, specter suddenly became empowered by the diffusion of the consumerization of IT.  The rapid ascent of the cloud and mobility, spirited by service-oriented solutions that were more focused on the user experience, promised to quickly deliver only the functionality required right now to support the speed and agility requirements driving the enterprise’s business needs in the present moment.

Gifted by this New Prometheus, Shadow IT emerged from the shadows as the Ghost of Enterprise Present, with business-driven and decentralized IT solutions becoming more commonplace, as well as begrudgingly accepted by IT leaders.

All of which creates quite the IT Conundrum, forming yet another front in the war against Business-IT collaboration.  Although, in the short-term, the consumerization of IT usually better services the technology needs of the enterprise, in the long-term, if it’s not integrated into a cohesive strategy, it creates a complex web of IT that entangles the enterprise much more than it enables it.

And with the enterprise becoming much more of a conceptual, rather than a physical, entity due to the cloud and mobile devices enabling us to take the enterprise with us wherever we go, the evolution of enterprise security is now facing far more daunting challenges than the external security threats we focused on in the past.  This more open business environment is here to stay, and it requires a modern data security model, despite the fact that such a model could become the weakest link in enterprise security.

The Ghost of Enterprise Present asks many questions, but none more frightening than: Can the enterprise really be secured?


The Ghost of Enterprise Future

Of course, the T in IT wasn’t the only apparition previously invisible outside of the IT department to recently break through the veil in a big way.  The I in IT had its own coming-out party this year as well since, as many predicted, 2012 was the year of Big Data.

Although neither the I nor the T is magic, instead of sugar plums, Data Psychics and Magic Elephants appear to be dancing in everyone’s heads this holiday season.  In other words, the predictive power of big data and the technological wizardry of Hadoop (as well as other NoSQL techniques) seem to be on the wish list of every enterprise for the foreseeable future.

However, despite its unquestionable potential, as its hype starts to settle down, the sobering realities of big data analytics will begin to sink in.  Data’s value comes from data’s usefulness.  If all we do is hoard data, then we’ll become so lost in the details that we’ll be unable to connect enough of the dots to discover meaningful patterns and convert big data into useful information that enables the enterprise to take action, make better decisions, or otherwise support its business activities.

Big data will force us to revisit information overload as we are occasionally confronted with the limitations of historical analysis, and blindsided by how our biases and preconceptions could silence the signal and amplify the noise, which will also force us to realize that data quality still matters in big data and that bigger data needs better data management.

As the Ghost of Enterprise Future, big data may haunt us with more questions than the many answers it will no doubt provide.


“Bah, Humbug!”

I realize that this post lacks the happy ending of A Christmas Carol.  To paraphrase Dickens, I endeavored in this ghostly little post to raise the ghosts of a few ideas, not to put my readers out of humor with themselves, with each other, or with the season, but simply to give them thoughts to consider about how to keep the Enterprise well in the new year.  Happy Holidays Everyone!

This blog post is sponsored by the Enterprise CIO Forum and HP.


Related Posts

Why does the sun never set on legacy applications?

Are Applications the La Brea Tar Pits for Data?

The Diffusion of the Consumerization of IT

The Cloud is shifting our Center of Gravity

More Tethered by the Untethered Enterprise?

A Swift Kick in the AAS

The UX Factor

Sometimes all you Need is a Hammer

Shadow IT and the New Prometheus

The IT Consumerization Conundrum

OCDQ Radio - The Evolution of Enterprise Security

The Cloud Security Paradox

The Good, the Bad, and the Secure

The Weakest Link in Enterprise Security

Can the Enterprise really be Secured?

Magic Elephants, Data Psychics, and Invisible Gorillas

Big Data el Memorioso

Information Overload Revisited

The Limitations of Historical Analysis

Data Silence

The Data Governance Imperative

OCDQ Radio is a vendor-neutral podcast about data quality and its related disciplines, produced and hosted by Jim Harris.

During this episode, Steve Sarsfield and I discuss how data governance is about changing the hearts and minds of your company to see the value of data quality, the characteristics of a data champion, and creating effective data quality scorecards.

Steve Sarsfield is a leading author and expert in data quality and data governance.  His book The Data Governance Imperative is a comprehensive exploration of data governance focusing on the business perspectives that are important to data champions, front-office employees, and executives.  He runs the Data Governance and Data Quality Insider, which is an award-winning and world-recognized blog.  Steve Sarsfield is the Product Marketing Manager for Data Governance and Data Quality at Talend.




Win a copy of the Book

Steve Sarsfield wants to give one OCDQ Radio listener a free copy of The Data Governance Imperative


Here is how the book contest will work:


(1) Book Contest Question — Name at least one of the characteristics of a data champion that Steve Sarsfield described during this OCDQ Radio episode.


(2) Book Contest Deadline — By or before April 30, 2012, email Jim Harris with your answer to the book contest question.


(3) Book Contest Winner — In May 2012, one winner will be randomly selected from the emails containing the correct answer to the contest question, and Steve Sarsfield (or his publisher) will email the winner requesting a shipping address for the book.


Related Posts

Data Governance and Data Quality

MacGyver: Data Governance and Duct Tape

Data Governance Frameworks are like Jigsaw Puzzles

The Three Most Important Letters in Data Governance

Data Governance and the Adjacent Possible

Data Governance Star Wars: Balancing Bureaucracy and Agility

Beware the Data Governance Ides of March

Aristotle, Data Governance, and Lead Rulers

Data Governance and the Buttered Cat Paradox

The Data Governance Oratorio

Video: Declaration of Data Governance

The Collaborative Culture of Data Governance


Related OCDQ Radio Episodes

Clicking on the link will take you to the episode’s blog post:

  • Data Governance Star Wars — Special Guests Rob Karel and Gwen Thomas joined this extended, and Star Wars themed, discussion about how to balance bureaucracy and business agility during the execution of data governance programs.
  • The Johari Window of Data Quality — Guest Martin Doyle discusses helping people better understand their data and assess its business impacts, not just the negative impacts of bad data quality, but also the positive impacts of good data quality.

The Algebra of Collaboration

Most organizations have a vertical orientation, which creates a division of labor between functional areas where daily operations are carried out by people who have been trained in a specific type of business activity (e.g., Product Manufacturing, Marketing, Sales, Finance, Customer Service).  However, according to the most basic enterprise arithmetic, the sum of all vertical functions is one horizontal organization.  For example, in an organization with five vertical functions, 1 + 1 + 1 + 1 + 1 = 1 (and not 5).

Other times, it seems like division is the only mathematics the enterprise understands, creating perceived organizational divides based on geography (e.g., the Boston office versus the London office), or hierarchy (e.g., management versus front-line workers), or the Great Rift known as the Business versus IT.

However, enterprise-wide initiatives, such as data quality and data governance, require a cross-functional alignment reaching horizontally across the organization’s vertical functions, fostering a culture of collaboration that combines collective ownership with shared responsibility and individual accountability—a culture requiring a branch of mathematics I call the Algebra of Collaboration.

For starters, as James Kakalios explained in his super book The Physics of Superheroes, “there is a trick to algebra: If one has an equation describing a true statement, such as 1 = 1, then one can add, subtract, multiply, or divide (excepting division by zero) the equation by any number we wish, and as long as we do it to both the left and right sides of the equation, the correctness of the equation is unchanged.  So if we add 2 to both sides of 1 = 1, we obtain 1 + 2 = 1 + 2 or 3 = 3, which is still a true statement.”

So, in the Algebra of Collaboration, we first establish one of the organization’s base equations—its true statements—for example, the higher-order collaborative equation that attempts to close the Great Rift otherwise known as the IT-Business Chasm:

Business = IT

Then we keep this base equation balanced by performing the same operation on both the left and right sides, for example:

Business + Data Quality + Data Governance = IT + Data Quality + Data Governance

The point is that everyone, regardless of their primary role or vertical function, must accept a shared responsibility for preventing data quality lapses and for responding appropriately to mitigate the associated business risks when issues occur.

Now, of course, as I blogged about in The Stakeholder’s Dilemma, this equation does not always remain perfectly balanced.  The realities of the fiscal calendar effect, conflicting interests, and changing business priorities will mean that the amount of resources (money, time, people) added to the equation by a particular stakeholder, vertical function, or group will vary.

But it’s important to remember the true statement that the base equation represents.  The trick of algebra is just one of the tricks of the collaboration trade.  Organizations that are successful with data quality and data governance view collaboration not just as a guiding principle, but also as a call to action in their daily practices.

Is your organization practicing the Algebra of Collaboration?


Related Posts

The Business versus IT—Tear down this wall!

The Road of Collaboration

The Collaborative Culture of Data Governance

Collaboration isn’t Brain Surgery

Finding Data Quality

Being Horizontally Vertical

The Year of the Datechnibus

Dot Collectors and Dot Connectors

No Datum is an Island of Serendip

The Three Most Important Letters in Data Governance

The Stakeholder’s Dilemma

Are you Building Bridges or Digging Moats?

Has Data Become a Four-Letter Word?

The Data Governance Oratorio

Video: Declaration of Data Governance

The Fall Back Recap Show

OCDQ Radio is a vendor-neutral podcast about data quality and its related disciplines, produced and hosted by Jim Harris.

On this episode, I celebrate the autumnal equinox by falling back to look at the Best of OCDQ Radio, including discussions about Data, Information, Business-IT Collaboration, Change Management, Big Analytics, Data Governance, and the Data Revolution.

Thank you for listening to OCDQ Radio.  Your listenership is deeply appreciated.

Special thanks to all OCDQ Radio guests.  If you missed any of their great appearances, check out the full episode list below.






Good-Enough Data for Fast-Enough Decisions

OCDQ Radio is a vendor-neutral podcast about data quality and its related disciplines, produced and hosted by Jim Harris.

On this episode, Julie Hunt and I discuss the intersection of data quality and business intelligence, especially the strategy of good-enough data for fast-enough decisions, a necessity for surviving and thriving in the constantly changing business world.

Julie Hunt is an accomplished software industry analyst and business technology strategist, providing market and competitive insights for software vendors.  Julie Hunt has the unique perspective of a hybrid, which means she has extensive experience in the technology, business, and customer/people-oriented aspects of creating, marketing and selling software.  Working in the B2B software industry for more than 25 years, she has hands-on experience for multiple solution spaces including data integration, business intelligence, analytics, content management, and collaboration.  She is also a member of the Boulder BI Brain Trust.

Julie Hunt regularly shares her insights about the software industry on Twitter as well as via her highly recommended blog.




Related Posts

Thaler’s Apples and Data Quality Oranges

Data Confabulation in Business Intelligence

Data In, Decision Out

The Data-Decision Symphony

The Business versus IT—Tear down this wall!

The Real Data Value is Business Insight

Is your data complete and accurate, but useless to your business?

Data, Information, and Knowledge Management

DQ-View: From Data to Decision

OCDQ Radio - Big Data and Big Analytics

OCDQ Radio - Data Profiling Early and Often

OCDQ Radio - A Brave New Data World

The IT Consumerization Conundrum

This blog post is sponsored by the Enterprise CIO Forum and HP.

The consumerization of IT is a disruptive force that many organizations are struggling to come to terms with, especially their IT departments.  As R "Ray" Wang recently blogged about this challenge, “technologies available to consumers at low cost, or even for free, are increasingly pushing aside enterprise applications.  For IT leaders accustomed to having control over corporate technology, this represents a huge challenge — and it’s one they’re not meeting very well.”

Speed and agility are the most common business drivers for implementing new technology.  The consumer technology trifecta of cloud computing, SaaS, and mobility has enabled business users to directly purchase off-premises applications that quickly provide only the features they currently need.  Meanwhile, on-premises applications, although feature-rich, become user-poor because of their slower time to implement and their less-than-agile reputation for dealing with change requests and customizations.

However, the organization still relies on some of the functionality, and especially the data, provided by legacy applications, which IT is required to continue to support.  IT is also responsible for assisting the organization with any technology challenges encountered when using modern applications.  This feature fracture (i.e., the technology supporting business needs being splintered across legacy and modern applications) often leaves IT departments overburdened, and causes them to battle against the disruptive force of business-driven consumer technology.

“IT and business leaders need to work together and operate in parallel,” Wang concludes.  “If IT slows down the business capability to innovate, then the company will suffer as new business models emerge and infrastructure will fail to keep up.  If business moves ahead of IT in technology, then the company fails because IT will spend years cleaning up technology messes.”

This is the IT Consumerization Conundrum.  Although, in the short-term, it usually better services the technology needs of the organization, in the long-term, if it’s not properly managed and integrated into the IT Delivery strategy of the organization, then it can create a complex web of technology that entangles the organization much more than it enables it.

Or to borrow the words of Ralph Loura, it can “cause technology to become a business disabler instead of a business enabler.”

This blog post is sponsored by the Enterprise CIO Forum and HP.


Related Posts

The IT Prime Directive of Business First Contact

A Sadie Hawkins Dance of Business Transformation

Are Applications the La Brea Tar Pits for Data?

Why does the sun never set on legacy applications?

The Partly Cloudy CIO

The IT Pendulum and the Federated Future of IT

Suburban Flight, Technology Sprawl, and Garage IT

The IT Prime Directive of Business First Contact

This blog post is sponsored by the Enterprise CIO Forum and HP.

Every enterprise requires, as Ralph Loura explains, “end to end business insight to generate competitive advantage, and it’s hard to gain insight if the business is arm’s length away from the data and the systems and the processes that support business insight.”

Loura explains that one of the historical challenges with technology has been that most IT systems have traditionally taken years to deploy and are supported on timelines and lifecycles that are inconsistent with the dynamic business needs of the organization, which has, in some cases, caused technology to become a business disabler instead of a business enabler.

The change-averse nature of most legacy applications is the antithesis of the agile nature of most modern applications.

“It wasn’t too long ago,” explains John Dodge, “when speed didn’t matter, or was considered an enemy of a carefully laid out IT strategy based largely on lowest cost.”  However, speed and agility are now “a competitive imperative.  You have to be fast in today’s marketplace and no department feels the heat more than IT, according to the Enterprise CIO Forum Council members.”

“If you think in terms of speed and the dynamic nature of business,” explains Joseph Spagnoletti, “clearly the organization couldn’t operate at that pace or make the necessary changes without IT woven very deeply into the work that the business does.”

Spagnoletti believes that cloud computing, mobility, and analytics are the three technology enablers for the timely delivery of the information that the organization requires to support its constantly evolving business needs.

“Embedding IT into an organization optimizes a business’s competitive edge,” explains Bill Laberis, “because it empowers the people right at the front lines of the enterprise to make better, faster and more informed decisions — right at the point of contact with customers, partners and clients.”

Historically, IT had a technology-first mindset.  However, the new IT prime directive must become business first contact, embedding advanced technology right at the point of contact with the organization’s business needs, enabling the enterprise to continue its mission to explore new business opportunities with the agility to boldly go where no competitor has gone before.

This blog post is sponsored by the Enterprise CIO Forum and HP.


Related Posts

A Sadie Hawkins Dance of Business Transformation

Are Applications the La Brea Tar Pits for Data?

Why does the sun never set on legacy applications?

The Partly Cloudy CIO

The IT Pendulum and the Federated Future of IT

Suburban Flight, Technology Sprawl, and Garage IT

A Sadie Hawkins Dance of Business Transformation

This blog post is sponsored by the Enterprise CIO Forum and HP.

In the United States, a Sadie Hawkins Dance is a school-sponsored semi-formal dance, in which, contrary to the usual custom, female students invite male students.  In the world of information technology (IT), a Sadie Hawkins Dance is an enterprise-wide initiative, in which, contrary to the usual custom, a strategic business transformation is driven by IT.

Although IT-driven business transformation might seem like an oxymoron, the reality is a centralized IT department is one of the few organizational functions that regularly interacts with the entire enterprise.  Therefore, IT is strategically positioned to influence enterprise-wide business transformation—and CIOs might be able to take a business leadership role in those activities.

Wayne Shurts, the CIO of Supervalu, recently discussed how CIOs can make the transition to business leader by “approaching things from a business point of view, as opposed to a technology point of view.  IT must become intensely business driven.”

One thing Shurts emphasized as necessary for this shift in the perception of the CIO is that other C-level executives must realize “technology can be transformative for the organization, especially since it is transforming the consumer behavior of customers.”


Business Transformation through IT

David Steiner and Puneet Bhasin, the CEO and CIO of Waste Management, recently recorded a great two-part video interview called Business Transformation through IT, which you can check out using the following links: Part 1, Part 2, and Transcript.

“From day one,” explained Steiner, “I knew that the one way we could transform our company was through technology.”  Steiner then set out to find a CIO that could help him realize this vision of technology being transformative for the organization.

“If you’re going to be a true business partner,” explained Steiner, “which is what every CEO is looking for from their CIO, you have to go understand the business.”  Steiner explained that one of the first things that Bhasin did after he was hired as CIO was go out into the field and live the life of a customer service rep, a driver, a dispatcher, and a route manager—so that before Bhasin tried to do anything with technology, he first sought to understand the business so that he could become a true business partner.

“So the best advice I could give to any CIO would be,” concluded Steiner, “be a business partner, not a technologist.  Know the technology.  You’ve got to know how to apply the technology.  But be a business partner.”

“My advice to CEOs,” explained Bhasin, “would be look for a business person first and a technologist second.  And make sure that your CIO is a part of the decision-making strategic body within the organization.  If you are looking at IT purely as an area to reduce cost, that’s probably the wrong thing.  To me the value of IT is certainly in the area of efficiencies and cost reduction.  I think it has a huge role to play in that.  But I think it has an even greater role to play in product design, and growing customers, and expanding segments, and driving profitability.”

John Dodge recently blogged that business transformation is the CIO’s responsibility and opportunity.  Even though CIOs will eventually need their business partners to take the lead once they get out on the dance floor, CIOs may need to initiate things by inviting their business partners to A Sadie Hawkins Dance of Business Transformation.



Related Posts

Are Applications the La Brea Tar Pits for Data?

Why does the sun never set on legacy applications?

The Partly Cloudy CIO

The IT Pendulum and the Federated Future of IT

Suburban Flight, Technology Sprawl, and Garage IT

Has Data Become a Four-Letter Word?

In her excellent blog post 'The Bad Data Ate My Homework' and Other IT Scapegoating, Loraine Lawson explained how “there are a lot of problems that can be blamed on bad data.  I suspect it would be fair to say that there’s a good percentage of problems we don’t even know about that can be blamed on bad data and a lack of data integration, quality and governance.”

Lawson examined whether bad data could have been the cause of the bank foreclosure fiasco, as opposed to what she concludes were the more realistic causes: bad business practices and negligence, which, if not addressed, could lead to another global financial crisis.

“Bad data,” Lawson explained, “might be the most ubiquitous excuse since ‘the dog ate my homework.’  But while most of us would laugh at the idea of blaming the dog for missing homework, when someone blames the data, we all nod our heads in sympathy, because we all know how troublesome computers are.  And then the buck gets (unfairly) passed to IT.”

Unfairly blaming IT, or technology in general, when poor data quality negatively impacts business performance ignores the organization’s collective ownership of its problems and its shared responsibility for the solutions to those problems.  It also causes what Lawson, in Data’s Conundrum: Everybody Wants Control, Nobody Wants Responsibility, described as an “unresolved conflict on both the business and the IT side over data ownership and its related issues, from stewardship to governance.”

In organizations suffering from this unresolved conflict between IT and the Business—a dysfunctional divide also known as the IT-Business Chasm—bad data becomes the default scapegoat used by both sides.

Perhaps, in a strange way, placing the blame on bad data is progress when compared with the historical notions of data denial, when an organization’s default was to claim that it had no data quality issues whatsoever.

However, admitting not only that bad data exists, but that it is also having a tangible negative impact on business performance, doesn’t seem to have motivated organizations to take action.  Instead, many appear to prefer practicing bad data blamestorming, where the Business blames bad data on IT and its technology, and IT blames bad data on the Business and its business processes.

Or perhaps, by default, everyone just claims that “the bad data ate my homework.”

Are your efforts to convince executive management that data needs to be treated like a five-letter word (“asset”) being undermined by the fact that data has become a four-letter word in your organization?


Related Posts

The Business versus IT—Tear down this wall!

Quality and Governance are Beyond the Data

Data In, Decision Out

The Data-Decision Symphony

The Reptilian Anti-Data Brain

Hell is other people’s data

Promoting Poor Data Quality

Who Framed Data Entry?

Data, data everywhere, but where is data quality?

The Circle of Quality

The Business versus IT—Tear down this wall!

Business Information Technology

This diagram was published in the July 2009 blog post Business Information Technology by Steve Tuck of Datanomic, and was based on a conference conversation with Gwen Thomas of the Data Governance Institute.  It depicts the figurative wall, prevalent in most organizations, that literally separates the Business, who usually own the data and understand its use in making critical daily business decisions, from Information Technology (IT), who usually own and maintain the hardware and software infrastructure of the enterprise data architecture.

The success of all enterprise information initiatives requires that this wall be torn down, ending the conflict between the Business and IT, and forging a new collaborative union that Steve and Gwen called Business Information Technology.


Isn’t IT a part of the Business?

In his recent blog post Isn’t IT a Part of “the Business”?, Winston Chen of Kalido examined this common challenge, remarking how “IT is often a cost center playing a supporting role for the frontline functions.  But Finance is a cost center, too.  Is Finance really the Business?  How about Human Resources?  We don’t hear HR people talk about the Business versus HR, do we?”

“Key words are important in setting the tone for communication,” Winston explained.  “When our language suggests IT is not a part of the Business, it cements a damaging us-versus-them mentality.”

“It leads to isolation.  What we need today, more than ever, is close collaboration.”


Purple People

Earlier this year in his blog post “Purple People”: The Key to BI Success, Wayne Eckerson of TDWI used a colorful analogy to discuss this common challenge within the context of business intelligence (BI) programs.

Wayne explained that the color purple is formed by mixing two primary colors: red and blue.  These colors symbolize strong, distinct, and independent perspectives.  Wayne used red to represent IT and blue to represent the Business.

Purple People, according to Wayne, “are key intermediaries who can reconcile the Business and IT and forge a strong and lasting partnership that delivers real value to the organization.”

“Pure technologists or pure business people can’t harness BI successfully.  BI needs Purple People to forge tight partnerships between business people and technologists and harness information for business gain.”

I agree with Wayne, but I believe all enterprise information initiatives, and not just BI, need Purple People for success.


Tearing down the Business-IT Wall

My overly dramatic blog post title is obviously a reference to the famous speech by United States President Ronald Reagan at the Berlin Wall on June 12, 1987.  For more than 25 years, the Berlin Wall had stood as a symbol of not only a divided Germany and divided political ideologies, but more importantly, it was both a figurative and literal symbol of a deeper human divide.

Although Reagan’s speech was merely symbolic of the numerous and complex factors that eventually led to the dismantling of the Berlin Wall and the end of the Cold War, symbolism is a powerful aspect of human culture—including corporate culture.

The Business-IT Wall is only a figurative wall, but it literally separates the Business and IT in most organizations today.

So much has been written about the need for Business-IT Collaboration on successful enterprise information initiatives that the message is often ignored because people are sick and tired of hearing about it.

However, although there are other barriers to success, and people, process, and technology are all important, by far the most important factor for true and lasting success to be possible is—people collaborating.

Organizations must remove all symbolic obstacles, both figurative and literal, which contribute to the human divide preventing enterprise-wide collaboration within their unique corporate culture.

As for the Business-IT Wall, and all other similar barriers to our collaboration and success, the time is long overdue for us to:

Tear down this wall!

Related Posts

The Road of Collaboration

Finding Data Quality

Data Transcendentalism

Declaration of Data Governance

Podcast: Business Technology and Human-Speak

Not So Strange Case of Dr. Technology and Mr. Business

Data Quality is People!

You're So Vain, You Probably Think Data Quality Is About You

The Road of Collaboration

The Road Not Taken by Robert Frost

I grew up and lived most of my life in the suburbs of Boston, Massachusetts.  But just prior to relocating to the Midwest for work seven years ago, I lived in Derry, New Hampshire, just down the road from the historic landmark where Robert Frost, the famous American poet and four-time recipient of the Pulitzer Prize for Poetry, wrote many of his best poems, including The Road Not Taken, which has always remained one of my favorite poems—and which also provides the inspiration for this blog post.

Historically, there have been only two “roads” diverged in the corporate world, two well-traveled ways: The Road of Business and The Road of Technology.

Although these two roads have a common starting point near the center of an organization, they will almost always extend away from each other, and in completely opposite directions, leaving most employees to choose which road they wish to travel—often without being sorry that they could not travel both.

I don’t believe that I am taking too much of a poetic license in describing this common calamity as an organization being “a house divided against itself,” which, to paraphrase Abraham Lincoln, cannot succeed.  I believe that no organization can succeed as half business and half technical.  But I also do not believe that any organization must become either all business or all technical.

There is a third option—there is a third road diverged in the corporate world.

Organizations struggle with the business/technical divided house because they believe the corporate world is composed of technical workers delivering and maintaining the things that enable business workers to do their things.

And of course, there can be an almost Lincoln–Douglas debate about what exactly each of those things are because, in part, it is commonly perceived that they operate independently of one another—whereas the truth is that they are highly interdependent.

However, it’s no debate that organizations suffer from this perception of a deep divide separating the business side of the house, who usually own its data and understand its use in making critical daily business decisions, from the technical side of the house, who usually own and maintain its hardware and software infrastructure, which comprise its enterprise data architecture.

The success of all enterprise information initiatives is highly dependent upon enterprise-wide interdependence—aka collaboration.

Therefore, in order for success to be possible with data quality, data integration, master data management, data warehousing, business intelligence, data governance, etc., your organization needs to travel the third road diverged in the corporate world.

The Road of Collaboration is long and winding, a seemingly strange and unfamiliar road, quite distinct from the well-traveled, long, but straight and narrow, and somewhat easily foreseeable paths of The Road of Business and The Road of Technology.

Your organization must abandon the comforts of the familiar roads and embrace the discomfort of the unfamiliar road, the road that although less traveled by, definitely makes all the difference between whether your entire house will succeed or fail.

But if The Road of Collaboration does not yet exist within your organization, then you cannot afford to settle for continuing to travel down whatever path you currently follow.  Instead, you must follow the trailblazing advice of Ralph Waldo Emerson:

“Do not go where the path may lead; go instead where there is no path and leave a trail.”

Neither trailblazing, nor taking the road less traveled by, will be an easy journey.  And there is no escaping the harsh reality that The Road of Collaboration will always be the path of the greatest resistance.

But which story do you want to be telling—and without a sigh—somewhere ages and ages hence?

Do you want to tell the story about how your organization continued to walk away from each other by traveling separately down The Road of Business and The Road of Technology—leaving The Road of Collaboration as The Road Not Taken?

Or do you want to tell the story about how your organization chose to walk together by traveling The Road of Collaboration?

Three roads diverged in the corporate world, and our organization—
Our organization took the one less traveled by,
And that has made all the difference.

Related Posts

Scrum Screwed Up

The Idea of Order in Data

Finding Data Quality

Data Transcendentalism

Declaration of Data Governance

The Prince of Data Governance

Jack Bauer and Enforcing Data Governance Policies

Podcast: Business Technology and Human-Speak

The Dumb and Dumber Guide to Data Quality

Not So Strange Case of Dr. Technology and Mr. Business

Scrum Screwed Up

This was the inaugural cartoon on Implementing Scrum by Michael Vizdos and Tony Clark, which does a great job of illustrating the fable of The Chicken and the Pig used to describe the two types of roles involved in Scrum, which, a rarity in our industry, is not an acronym, but one common approach among many iterative, incremental frameworks for agile software development.

Scrum is also sometimes used as a generic synonym for any agile framework.  Although I’m not an expert, I’ve worked on more than a few agile programs.  And since I am fond of metaphors, I will use the Chicken and the Pig to describe two common ways that scrums of all kinds can easily get screwed up:

  1. All Chicken and No Pig
  2. All Pig and No Chicken

However, let’s first establish a more specific context for agile development using one provided by a recent blog post on the topic.


A Contrarian’s View of Agile BI

In her excellent blog post A Contrarian’s View of Agile BI, Jill Dyché took a somewhat unpopular view of a popular view, which is something that Jill excels at—not simply for the sake of doing it—because she’s always been well-known for telling it like it is.

In preparation for the upcoming TDWI World Conference in San Diego, Jill was pondering the utilization of agile methodologies in business intelligence (aka BI—ah, there’s one of those oh so common industry acronyms straight out of The Acronymicon).

The provocative TDWI conference theme is: “Creating an Agile BI Environment—Delivering Data at the Speed of Thought.”

Now, please don’t misunderstand.  Jill is an advocate for doing agile BI the right way.  And it’s certainly understandable why so many organizations love the idea of agile BI, especially when you consider the slower time to value of most other approaches compared with agile BI, which, following Jill’s rule of thumb, would have “either new BI functionality or new data deployed (at least) every 60-90 days.  This approach establishes BI as a program, greater than the sum of its parts.”

“But in my experience,” Jill explained, “if the organization embracing agile BI never had established BI development processes in the first place, agile BI can be a road to nowhere.  In fact, the dirty little secret of agile BI is this: It’s companies that don’t have the discipline to enforce BI development rigor in the first place that hurl themselves toward agile BI.”

“Peek under the covers of an agile BI shop,” Jill continued, “and you’ll often find dozens or even hundreds of repeatable canned BI reports, but nary an advanced analytics capability. You’ll probably discover an IT organization that failed to cultivate solid relationships with business users and is now hiding behind an agile vocabulary to justify its own organizational ADD. It’s lack of accountability, failure to manage a deliberate pipeline, and shifting work priorities packaged up as so much scrum.”

I really love the term Organizational Attention Deficit Disorder, and in spite of myself, I can’t help but render it acronymically as OADD—which should be pronounced as “odd” because the “a” is silent, as in: “Our organization is really quite OADD, isn’t it?”


Scrum Screwed Up: All Chicken and No Pig

Returning to the metaphor of the Scrum roles, the pigs are the people with their bacon in the game performing the actual work, and the chickens are the people to whom the results are being delivered.  Most commonly, the pigs are IT or the technical team, and the chickens are the users or the business team.  But these scrum lines are drawn in the sand, and therefore easily crossed.

Many organizations love the idea of agile BI because they are thinking like chickens and not like pigs.  And the agile life is always easier for the chicken, because the chicken is only involved, whereas the pig is committed.

OADD organizations often “hurl themselves toward agile BI” because they’re enamored with the theory, but unrealistic about what the practice truly requires.  They’re all-in when it comes to the planning, but bacon-less when it comes to the execution.

This is one common way that OADD organizations can get Scrum Screwed Up—they are All Chicken and No Pig.


Scrum Screwed Up: All Pig and No Chicken

Closer to the point being made in Jill’s blog post, IT can pretend to be pigs making seemingly impressive progress, but although they’re bringing home the bacon, it lacks any real sizzle because it’s not delivering any real advanced analytics to business users. 

Although they appear to be scrumming, IT is really just screwing around with technology, albeit in an agile manner.  However, what good is “delivering data at the speed of thought” when that data is neither what the business is thinking, nor truly needs?

This is another common way that OADD organizations can get Scrum Screwed Up—they are All Pig and No Chicken.


Scrum is NOT a Silver Bullet

Scrum—and any other agile framework—is not a silver bullet.  However, agile methodologies can work—and not just for BI.

But whether you want to call it Chicken-Pig Collaboration, or Business-IT Collaboration, or Shiny Happy People Holding Hands, a true enterprise-wide collaboration facilitated by a cross-disciplinary team is necessary for any success—agile or otherwise.

Agile frameworks, when implemented properly, help organizations realistically embrace complexity and avoid oversimplification, by leveraging recurring iterations of relatively short duration that always deliver data-driven solutions to business problems. 

Agile frameworks are successful when people take on the challenge united by collaboration, guided by effective methodology, and supported by enabling technology.  Agile frameworks allow the enterprise to follow what works, for as long as it works, and without being afraid to adjust as necessary when circumstances inevitably change.

For more information about Agile BI, follow Jill Dyché and TDWI World Conference in San Diego, August 15-20 via Twitter.

The Acronymicon

Image created under a Creative Commons Attribution License using: Wordle

“The beginning of wisdom is the definition of terms.” – Socrates

“The end of wisdom is the definition of acronyms.” – Jim Harris


The Necronomicon

The Necronomicon is a fictional grimoire (i.e., a textbook containing instructions on how to perform magic), which first appeared in the classic horror stories written by H. P. Lovecraft, and later appeared in other works, including some films, such as Army of Darkness, starring Bruce Campbell, which is one of my favorites—it’s a comedy and it’s highly recommended.

Therefore, the explanation for the rather unusual title of this blog post is that I could think of no better term to describe the fictional textbook containing instructions on how to discuss enterprise information initiatives by using acronyms, and only acronyms, other than:

The Acronymicon


Acronyms Gone Wild

For whatever reason, enterprise information initiatives (EIIs?)  have a great fondness for TLAs (three-letter acronyms, though some of these manage with only two letters): ERP (Enterprise Resource Planning), DW (Data Warehousing), BI (Business Intelligence), MDM (Master Data Management), DG (Data Governance), DQ (Data Quality), CDI (Customer Data Integration), CRM (Customer Relationship Management), PIM (Product Information Management), BPM (Business Process Management), and so many more—truly too many to list.

Additionally, we have apparently become so accustomed to TLAs that we needed to take it to the next level with Acronyms 2.0 by starting the fun new trend of FLAs (four-letter acronyms), such as software as a service (SaaS), platform as a service (PaaS), data as a service (DaaS), service-oriented development of applications (SODA), and so many frakking more four-letter acronyms.

I also have it on very good authority that by the end of this decade, the Semantic Web will deliver Acronyms 3.0 by creating an Ontology of Unambiguous Acronyms (OOUA), which will be written using an RDFS (Resource Description Framework Schema), in the FOAF (Friend of a Friend) vocabulary, which we will obviously query using SPARQL, which is itself a recursive acronym for SPARQL Protocol and RDF Query Language.



Now, don’t get me wrong.  I do appreciate how acronyms and other lexicons of terminology can be used as a convenient way of more efficiently discussing the complex concepts often underlying enterprise information initiatives. 

However, too often acronyms are used without ever being defined, which can lead to conversations like the scene in the movie Good Morning, Vietnam where Adrian Cronauer (played by Robin Williams) responds to an officer’s acronym-laden description of an upcoming press conference by then-former Vice President Richard Nixon with the question:

“Excuse me, sir.  Seeing as how the VP is such a VIP, shouldn’t we keep the PC on the QT?  Because if it leaks to the VC, he could end up MIA, and then we’d all be put out in KP.”

An even worse offense than not defining what the acronym stands for is only providing what it stands for as the definition.

For example, when someone asks you the question “what is MDM?” and you respond by stating “Master Data Management,” that really doesn’t help all that much, does it?

Even when you use a better definition, such as the following one from the book Master Data Management by David Loshin:

“Master Data Management (MDM) incorporates business applications, information management methods, and data management tools to implement the policies, procedures, and infrastructures that support the capture, integration, and subsequent shared use of accurate, timely, consistent, and complete master data.”

This is only the beginning of a more detailed discussion, the specifics of which will vary based on your particular circumstances, including the unique corporate culture of your organization, which will greatly influence such things as how exactly the “policies, procedures, and infrastructures” are defined, and what “accurate, timely, consistent, and complete” actually mean.

For that matter, you shouldn’t even assume that everyone knows what you are referring to when you say “master data.”

My point is that you should always make sure that the key concepts of your enterprise information initiatives are clearly defined and in a language that everyone can understand.  I am not just talking about translating the techno-mumbojumbo, because even business-speak can sound more like business-babbling—and not just to the technical folks.

Additionally, don’t be afraid to ask questions or admit when you don’t know the answers.  Many costly mistakes can be made when people assume that others know (or pretend to know themselves) what acronyms and other terminology actually mean.


Instructions for using The Acronymicon

If you absolutely insist on using The Acronymicon to discuss enterprise information initiatives at your organization, please just remember that before you even open the book, you must first carefully recite the following words:

“Clatto Verata Nicto!”

No, wait—that’s not quite right.  I think it’s something more like, you must first carefully recite the following words:

“Klaatu Barada Nikto!” 

No, that doesn’t sound right either.  Somebody should just create an acronym for it—they’re much easier to recite and remember.


Related Posts

Podcast: Business Technology and Human-Speak

Not So Strange Case of Dr. Technology and Mr. Business

Podcast: Open Your Ears

Shut Your Mouth

Hailing Frequencies Open

The Game of Darts – An Allegory

