Common Change

I recently finished reading the great book Switch: How to Change Things When Change Is Hard by Chip Heath and Dan Heath, which examines why it can be so difficult for us to make lasting changes—both professional changes and personal changes.

“For anything to change,” the Heaths explain, “someone has to start acting differently.  Ultimately, all change efforts boil down to the same mission: Can you get people to start behaving in a new way?”

Their metaphor for change of all kinds is making a Switch, which they explain requires the following three things:

  1. Directing the Rider, which is a metaphor for the rational aspect of our decisions and behavior.
  2. Motivating the Elephant, which is a metaphor for the emotional aspect of our decisions and behavior.
  3. Shaping the Path, which is a metaphor for the situational aspect of our decisions and behavior.

Despite being the most common phenomenon in the universe, change is almost universally resisted, making most of us act as if change is anything but common.  Therefore, in this blog post, I will discuss the Heaths’ three key concepts using some common terminology: Common Sense, Common Feeling, and Common Place—which, when working together, lead to Common Change.

 

Common Sense

“What looks like resistance is often a lack of clarity,” the Heaths explain.  “Ambiguity is the enemy.  Change begins at the level of individual decisions and behaviors.  To spark movement in a new direction, you need to provide crystal-clear guidance.”

Unfortunately, changes are usually communicated in ways that cause confusion instead of providing clarity.  Many change efforts fail at the outset because of either ambiguous goals or a lack of specific instructions explaining exactly how to get started.

One personal change example would be: Eat Healthier.

Although the goal makes sense, what exactly should I do?  Should I eat smaller amounts of the same food, or eat different food?  Should I start eating two large meals a day while eliminating snacks, or start eating several smaller meals throughout the day?

One professional example would be: Streamline Inefficient Processes.

This goal is even more ambiguous.  Does it mean all of the existing processes are inefficient?  What does streamline really mean?  What exactly should I do?  Should I be spending less time on certain tasks, or eliminating some tasks from my daily schedule?

Ambiguity is the enemy.  For any chance of success to be possible, both the change itself and the plan for making it happen must sound like Common Sense.

More specifically, the following two things must be clearly defined and effectively communicated:

  1. Long-term Goal – What exactly is the change that we are going to make—what is our destination?
  2. Short-term Critical Moves – What are the first few things we need to do—how do we begin our journey?

“What is essential,” as the Heaths explain, “is to marry your long-term goal with short-term critical moves.”

“What you don’t need to do is anticipate every turn in the road between today and the destination.  It’s not that plotting the whole journey is undesirable; it’s that it’s impossible.  When you’re at the beginning, don’t obsess about the middle, because the middle is going to look different once you get there.  Just look for a strong beginning and a strong ending and get moving.”

 

Common Feeling

I just emphasized the critical importance of envisioning both the beginning and the end of our journey toward change.

However, what happens in the middle is the change.  So, if common sense can help us understand where we are going and how to get started, what can help keep us going during the really challenging aspects of the middle?

There’s really only one thing that can carry us through the middle—we need to get hooked on a Common Feeling.

Some people—and especially within a professional setting—will balk at discussing the role that feeling (i.e., emotion) plays in our decision making and behavior because it is commonly believed that rational analysis must protect us from irrational emotions.

However, relatively recent advancements in the fields of psychology and neuroscience have shown that good decision making requires the flexibility to know when to rely on rational analysis and when to rely on emotions—and to always consider not only how we’re thinking, but also how we’re feeling.

In their book The Heart of Change: Real-Life Stories of How People Change Their Organizations, John Kotter and Dan Cohen explained that “the core of the matter is always about changing the behavior of people, and behavior change happens mostly by speaking to people’s feelings.  In highly successful change efforts, people find ways to help others see the problems or solutions in ways that influence emotions, not just thought.”

Kotter and Cohen wrote that most people think change happens in this order: ANALYZE—THINK—CHANGE. 

However, from interviewing over 400 people across more than 130 large organizations in the United States, Europe, Australia, and South Africa, they observed that in almost all successful change efforts, the sequence of change is: SEE—FEEL—CHANGE.

“We know there’s a difference between knowing how to act and being motivated to act,” the Heaths explain.  “But when it comes time to change the behavior of other people, our first instinct is to teach them something.”

Making only a rational argument for change without an emotional appeal results in understanding without motivation, and making only an emotional appeal for change without a rational plan results in passion without direction.

Therefore, making the case for lasting change requires that you effectively combine common sense with common feeling.

 

Common Place

“That is NOT how we do things around here” is the most common objection to change.  This is the Oath of Change Resistance, which maintains the status quo—the current situation that is so commonplace that it seems like “these people will never change.”

But as the Heaths explain, “what looks like a people problem is often a situation problem.”

Stanford psychologist Lee Ross coined the term fundamental attribution error to describe our tendency to ignore the situational forces that shape other people’s behavior.  The error lies in our inclination to attribute people’s behavior to the way they are rather than to the situation they are in.

When we lament that “these people will never change,” we have convinced ourselves that change-resistant behavior equates to a change-resistant personal character, and we discount the possibility that it could simply be a reflection of the current situation.

The great analogy used by the Heaths is water.  When boiling in a pot on the stove, it’s a scalding-hot liquid, but when cooling in a tray in the freezer, it’s an icy-cold solid.  However, declaring either scalding-hot or icy-cold as a fundamental attribute of water and not a situational attribute of water would obviously be absurd—but we do this with people and their behavior all the time.

This doesn’t mean that people’s behavior is always a result of their situation—nor does it excuse inappropriate behavior. 

The fundamental point is that the situation that people are currently in (i.e., their environment) can always be changed, and most important, it can be tweaked in ways that influence their behavior and encourage them to change for the better.

“Tweaking the environment,” the Heaths explain, “is about making the right behaviors a little bit easier and the wrong behaviors a little bit harder.  It’s that simple.”  The status quo is sometimes described as the path of least resistance.  So consider how you could tweak the environment in order to transform the path of least resistance into the path of change.

Therefore, in order to facilitate lasting change, you must create a new Common Place where the change becomes accepted as: “That IS how we do things around here—from now on.”  This is the Oath of Change, which redefines the status quo.

 

Common Change

“When change happens,” the Heaths explain, “it tends to follow a pattern.”  Although it is far easier to recognize than to embrace, in order for any of the changes we need to make to be successful, “we’ve got to stop ignoring that pattern and start embracing it.”

Change begins when our behavior changes.  In order for this to happen, we have to think that the change makes common sense, we have to feel that the change evokes a common feeling, and we have to accept that the change creates a new common place.

When all three of these rational, emotional, and situational forces are in complete alignment, then instead of resisting change, we will experience it as Common Change.

 

Related Posts

The Winning Curve

The Balancing Act of Awareness

The Importance of Envelopes

The Point of View Paradox

Persistence

Data Quality and the Cupertino Effect

The Cupertino Effect can occur when you accept the suggestion of a spellchecker program, which was attempting to assist you with a misspelled word (or what it “thinks” is a misspelling because it cannot find an exact match for the word in its dictionary). 

Although the suggestion (or in most cases, a list of possible words is suggested) is indeed spelled correctly, it might not be the word you were trying to spell, and in some cases, by accepting the suggestion, you create a contextually inappropriate result.

It’s called the “Cupertino” effect because with older programs the word “cooperation” was only listed in the spellchecking dictionary in hyphenated form (i.e., “co-operation”), making the spellchecker suggest “Cupertino” (i.e., the California city that is home to the worldwide headquarters of Apple, Inc., thereby essentially guaranteeing its presence in all spellchecking dictionaries).

By accepting the suggestion of a spellchecker program (and if there’s only one suggested word listed, don’t we always accept it?), a sentence where we intended to write something like:

“Cooperation is vital to our mutual success.”

Becomes instead:

“Cupertino is vital to our mutual success.”

And then confusion ensues (or hilarity—or both).
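
For the curious, here is a minimal sketch (in Python) of the kind of nearest-match suggestion logic involved.  The toy dictionary and the rule of skipping hyphenated entries are hypothetical stand-ins for the quirk in those older programs; real spellcheckers are far more sophisticated:

    import difflib

    # Toy dictionary mimicking the older programs: "cooperation" is only
    # listed in hyphenated form, while "Cupertino" is present because it
    # is a proper noun (a California city name).
    DICTIONARY = ["co-operation", "Cupertino", "is", "vital",
                  "to", "our", "mutual", "success"]

    def suggest(word):
        # Hypothetical stand-in for the old quirk: hyphenated entries
        # are never offered as suggestions.
        candidates = {w.lower(): w for w in DICTIONARY if "-" not in w}
        hits = difflib.get_close_matches(word.lower(), list(candidates),
                                         n=1, cutoff=0.6)
        return candidates[hits[0]] if hits else None

    print(suggest("cooperation"))  # -> "Cupertino"

Auto-accepting that lone suggestion is all it takes to turn cooperation into Cupertino.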

Beyond being a data quality issue for unstructured data (e.g., documents, e-mail messages, blog posts, etc.), the Cupertino Effect reminded me of the accuracy versus context debate.

 

“Data quality is primarily about context not accuracy...”

This Data Quality (DQ) Tip from last September sparked a nice little debate in the comments section.  The complete DQ-Tip was:

“Data quality is primarily about context not accuracy. 

Accuracy is part of the equation, but only a very small portion.”

Therefore, the key point wasn’t that accuracy is unimportant, but simply that context is more important.

In her fantastic book Executing Data Quality Projects, Danette McGilvray defines accuracy as “a measure of the correctness of the content of the data (which requires an authoritative source of reference to be identified and accessible).”

Returning to the Cupertino Effect for a moment, the spellchecking dictionary provides an identified, accessible, and somewhat authoritative source of reference—and “Cupertino” is correct data content for representing the name of a city in California. 

However, absent a context within which to evaluate accuracy, how can we determine the correctness of the content of the data?

 

The Free-Form Effect

Let’s use a different example.  A common root cause of poor quality for structured data is: free-form text fields.

Regardless of how well the metadata description is written or how well the user interface is designed, if a free-form text field is provided, then you will essentially be allowed to enter whatever you want for the content of the data (i.e., the data value).

For example, a free-form text field is provided for entering the Country associated with your postal address.

Therefore, you could enter data values such as:

Brazil
United States of America
Portugal
United States
República Federativa do Brasil
USA
Canada
Federative Republic of Brazil
Mexico
República Portuguesa
U.S.A.
Portuguese Republic

However, you could also enter data values such as:

Gondor
Gnarnia
Rohan
Citizen of the World
The Land of Oz
The Island of Sodor
Berzerkistan
Lilliput
Brobdingnag
Teletubbyland
Poketopia
Florin

The first list contains real countries, but a lack of standard values introduces needless variations.  The second list contains fictional countries, which people like me enter into free-form fields either to prove a point or simply to amuse themselves (well okay—both).

The most common solution is to provide a drop-down box of standard values, such as those provided by an identified, accessible, and authoritative source of reference—the ISO 3166 standard country codes.

Problem solved—right?  Maybe—but maybe not. 

Yes, I could now choose BR, US, PT, CA, MX (the ISO 3166 alpha-2 codes for Brazil, United States, Portugal, Canada, Mexico), which are the valid and standardized country code values for the countries from my first list above—and I would not be able to find any of my fictional countries listed in the new drop-down box.

However, I could also choose DO, RE, ME, FI, SO, LA, TT, DE (Dominican Republic, Réunion, Montenegro, Finland, Somalia, Lao People’s Democratic Republic, Trinidad and Tobago, Germany), all of which are valid and standardized country code values, yet every one of them is contextually invalid for my postal address.
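
To make that distinction concrete, here is a minimal sketch of the drop-down-style validity check, assuming an abbreviated subset of the ISO 3166 alpha-2 codes; it correctly rejects the fictional countries, yet happily accepts a real code that is wrong for this particular address:

    # Abbreviated subset of the ISO 3166 alpha-2 country codes.
    ISO_3166_ALPHA_2 = {"BR", "US", "PT", "CA", "MX",
                        "DO", "RE", "ME", "FI", "SO", "LA", "TT", "DE"}

    def is_valid_country_code(value):
        # Validity check only: is this a standardized country code?
        return value.strip().upper() in ISO_3166_ALPHA_2

    print(is_valid_country_code("Gondor"))  # False: fictional, correctly rejected
    print(is_valid_country_code("US"))      # True: valid, and correct for my address
    print(is_valid_country_code("DE"))      # True: valid, yet contextually
                                            # invalid for my postal address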

 

Accuracy: With or Without Context?

Accuracy is only one of the many dimensions of data quality—and you may have a completely different definition for it. 

Paraphrasing Danette McGilvray, accuracy is a measure of the validity of data values, as verified by an authoritative reference. 

My question is what about context?  Or more specifically, should accuracy be defined as a measure of the validity of data values, as verified by an authoritative reference, and within a specific context?
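
Framed as code, the question looks something like the following sketch, where the second function adds a context parameter.  The ZIP-code rule is purely hypothetical, just to show what a context-aware accuracy check might consider (ISO_3166_ALPHA_2 is the set from the sketch above):

    def is_accurate(value, reference):
        # Accuracy without context: correctness of the data content,
        # verified against an authoritative source of reference.
        return value in reference

    def is_accurate_in_context(value, reference, record):
        # Accuracy within a specific context: the same check, plus
        # consistency with the rest of the record (hypothetical rule).
        if not is_accurate(value, reference):
            return False
        postal_code = record.get("postal_code", "")
        if len(postal_code) == 5 and postal_code.isdigit():
            return value == "US"  # a US-style ZIP code implies a US address
        return True

    record = {"postal_code": "90210"}
    print(is_accurate("DE", ISO_3166_ALPHA_2))                     # True
    print(is_accurate_in_context("DE", ISO_3166_ALPHA_2, record))  # False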

Please note that I am only trying to define the accuracy dimension of data quality, and not data quality itself.

Therefore, please resist the urge to respond with “fitness for the purpose of use,” since even if you want to argue that “context” is just another word meaning “use,” then next we will have to argue over the meaning of the word “fitness,” and before you know it, we will be arguing over the meaning of the word “meaning.”

Please accurately share your thoughts (with or without context) about accuracy and context—by posting a comment below.

The 2010 Data Quality Blogging All-Stars

The 2010 Major League Baseball (MLB) All-Star Game is being held tonight (July 13) at Angel Stadium in Anaheim, California.

For those readers who are not baseball fans, the All-Star Game is an annual exhibition held in mid-July that showcases the players with (for the most part) the best statistical performances during the first half of the MLB season.

Last summer, I began my own annual exhibition of showcasing the bloggers whose posts I have personally most enjoyed reading during the first half of the data quality blogging season. 

Therefore, this post provides links to stellar data quality blog posts that were published between January 1 and June 30 of 2010.  My definition of a “data quality blog post” also includes Data Governance, Master Data Management, and Business Intelligence. 

Please Note: There is no implied ranking in the order that bloggers or blogs are listed, other than that Individual Blog All-Stars are listed first, followed by Vendor Blog All-Stars, and the blog posts are listed in reverse chronological order by publication date.

 

Henrik Liliendahl Sørensen

From Liliendahl on Data Quality:

 

Dylan Jones

From Data Quality Pro:

 

Julian Schwarzenbach

From Data and Process Advantage Blog:

 

Rich Murnane

From Rich Murnane's Blog:

 

Phil Wright

From Data Factotum:

 

Initiate – an IBM Company

From Mastering Data Management:

 

Baseline Consulting

From their three blogs: Inside the Biz with Jill Dyché, Inside IT with Evan Levy, and In the Field with our Experts:

 

DataFlux – a SAS Company

From Community of Experts:

 

Related Posts

Recently Read: May 15, 2010

Recently Read: March 22, 2010

Recently Read: March 6, 2010

Recently Read: January 23, 2010

The 2009 Data Quality Blogging All-Stars

 

Additional Resources

From the IAIDQ, read the 2010 issues of the Blog Carnival for Information/Data Quality:

The Winning Curve

Illustrated above is what I am calling The Winning Curve, which combines ideas from three books I have recently read:

  1. Switch: How to Change Things When Change Is Hard by Chip Heath and Dan Heath
  2. The Dip by Seth Godin
  3. Linchpin by Seth Godin

The Winning Curve is applicable to any type of project or the current iteration of an ongoing program—professional or personal.

 

Insight

The Winning Curve starts with the Design Phase, the characteristics of which are inspired by Tim Brown (quoted in Switch).  Brown explains how every design phase goes through “foggy periods.”  He uses a U-shaped curve called a “project mood chart” that predicts how people will feel at different stages of the design phase.

The design phase starts with a peak of positive emotion, labeled “Hope,” and ends with a second peak of positive emotion, labeled “Confidence.”  In between these two great heights is a deep valley of negative emotion, labeled “Insight.”

The design phase, according to Brown, is “rarely a graceful leap from height to height,” and as Harvard Business School professor Rosabeth Moss Kanter explains, “everything can look like a failure in the middle.”

Therefore, the design phase is really exciting—at the beginning.

After the reality of all the research, as well as the necessary communication and collaboration with others, has a chance to set in, the hope you started out with quickly dissipates, and insight is the last thing you would expect to find “down in the valley.”

During this stage, “it’s easy to get depressed, because insight doesn’t always strike immediately,” explain Chip and Dan Heath.  “But if the team persists through this valley of angst and doubt, it eventually emerges with a growing sense of momentum.”

 

“The Dip”

After The Winning Curve has finally reached the exhilarating summit of Confidence Mountain (i.e., your design is completed), you are then faced with yet another descent, since now the Development Phase is ready to begin.

Separating the start of the development phase from the delivery date is another daunting valley, otherwise known as “The Dip.”

The development phase can be downright brutal.  It is where the grand conceptual theory of your design’s insight meets the grunt-work practice required by your development’s far-from-conceptual daily realities.

Everything sounds easier on paper (or on a computer screen).  Although completing the design phase was definitely a challenge, completing the development phase is almost always more challenging.

However, as Seth Godin explains, “The Dip is where success happens.  Successful people don’t just ride out The Dip.  They don’t just buckle down and survive it.  No, they lean into The Dip.”

“All our successes are the same.  All our failures, too,” explains Godin in the closing remarks of The Dip.  “We succeed when we do something remarkable.  We fail when we give up too soon.”

 

“Real Artists Ship”

When Steve Jobs said “real artists ship,” he was calling the bluff of a recalcitrant engineer who couldn’t let go of some programming code.  In Linchpin, Seth Godin quotes poet Bruce Ario to explain that “creativity is an instinct to produce.”

Toward the end of the development phase, the Delivery Date forebodingly looms.  The delivery date is when your definition of success will be judged by others, which is why some people prefer the term Judgment Day since it seems far more appropriate.

“The only purpose of starting,” writes Godin, “is to finish, and while the projects we do are never really finished, they must ship.”

Godin explains that the primary challenge to shipping (i.e., completing development by or before your delivery date) is thrashing.

“Thrashing is the apparently productive brainstorming and tweaking we do for a project as it develops.  Thrashing is essential.  The question is: when to thrash?  Professional creators thrash early.  The closer the project gets to completion, the fewer people see it and the fewer changes are permitted.”

Thrashing is mostly about the pursuit of perfection. 

We believe that if what we deliver isn’t perfect, then our efforts will be judged a failure.  Of course, we know that perfection is impossible.  However, our fear of failure is often based on our false belief that perfection was the actual expectation of others. 

Therefore, our fear of failure offers this simple and comforting advice: if you don’t deliver, then you can’t fail.

However, real artists realize that success or failure—or even worse, mediocrity—could be the judgment that they receive after they have delivered.  Success rocks and failure sucks—but only if you don’t learn from it.  That’s why real artists always ship. 

 

The Winning Curve

I named it “The Winning Curve” both because its shape resembles a “W” and because it sounds better than calling it “The Failing Curve.”

However, the key point is that failure often (if not always) precedes success, and in both our professional and personal lives, most (if not all) of us are pursuing one or more kinds of success—and in these pursuits, we generally view failure as the enemy.

Failure is not the enemy.  In fact, the most successful people realize failure is their greatest ally.

As Thomas Edison famously said, “I didn’t find a way to make a light bulb, I found a thousand ways how not to make one.”

“Even in failure, there is success,” explain Chip and Dan Heath.  Whenever you fail, it’s extremely rare that everything you did was a failure.  Your approach almost always creates a few small sparks in your quest to find a way to make your own light bulb.

“These flashes of success—these bright spots—can illuminate the road map for action,” according to the Heaths, who also explain that “we will struggle, we will fail, we will be knocked down—but throughout, we’ll get better, and we’ll succeed in the end.”

The Winning Curve can’t guarantee success—only learning.  Unfortunately, the name “The Learning Curve” was already taken.

 

Related Posts

Persistence

Thinking along the edges of the box

The HedgeFoxian Hypothesis

The Once and Future Data Quality Expert

Mistake Driven Learning

The Fragility of Knowledge

The Wisdom of Failure

A Portrait of the Data Quality Expert as a Young Idiot

The Diffusion of Data Governance

Marty Moseley of Initiate recently blogged Are We There Yet? Results of the Data Governance Survey, and the blog post includes a link to the survey, which is freely available—no registration required.

The Initiate survey says that although data governance dates back to the late 1980s, it is experiencing a resurgence because of initiatives such as business intelligence, data quality, and master data management—as well as the universal need to make better data-driven business decisions “in less time than ever before, often culling data from more structured and unstructured sources, with more transparency required.”

Winston Chen of Kalido recently blogged A Brief History of Data Governance, which provides a brief overview of three distinct eras in data management: Application Era (1960-1990), Enterprise Repository Era (1990-2010), and Policy Era (2010-?).

As I commented on Winston’s post, I began my career at the tail-end of the Application Era, and my career has been about a 50/50 split between applications and enterprise repositories, since history does not move forward at the same pace for all organizations, including software vendors.  By that I mean my professional experience was influenced more by working for vendors selling application-based solutions than by working with clients who were, let’s just say, less than progressive.

Diffusion of innovations (illustrated above) is a theory developed by Everett Rogers for describing the five stages and the rate at which innovations (e.g., new ideas or technology) spread through markets (or “cultures”), starting with the Innovators and the Early Adopters, then progressing through the Early Majority and the Late Majority, and finally ending with the Laggards.

Therefore, the exact starting points of the three eras Winston described in his post can easily be debated because progress can be painfully slow until a significant percentage of the Early Majority begins to embrace the innovation—thereby causing the so-called Tipping Point where progress begins to accelerate enough for the mainstream to take it seriously. 

Please Note: I am not talking about crossing “The Chasm”—which, as Geoffrey A. Moore rightfully discusses, is the critical, but much earlier, phenomenon occurring when enough of the Early Adopters have embraced the innovation so that the beginning of the Early Majority becomes almost a certainty—but true mainstream adoption of the innovation is still far from guaranteed.

The tipping point that I am describing occurs within the Early Majority and before the top of the adoption curve is reached. 

The Early Majority begins at 16% market share (or “cultural awareness”)—and is reached only after successfully crossing the chasm (which I approximate occurs somewhere around 8% market share).  However, the difference between a fad and a true innovation emerges somewhere around 25% market share—and this is the tipping point that I am describing.

The Late Majority (and the top of the adoption curve) doesn’t begin until 50% market share, and it’s all downhill from there, meaning that the necessary momentum has been achieved to almost guarantee that the innovation will be fully adopted.
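
As a back-of-the-envelope summary, here is a small sketch encoding the thresholds discussed above; the chasm and tipping point percentages are my own approximations rather than part of Rogers’ theory:

    def adoption_stage(market_share):
        # Classify cumulative market share (0 to 100 percent) against
        # Rogers' segment boundaries, annotated with the approximated
        # chasm (~8%) and tipping point (~25%) discussed above.
        if market_share < 2.5:
            return "Innovators"
        if market_share < 8:
            return "Early Adopters (approaching the chasm)"
        if market_share < 16:
            return "Early Adopters (chasm crossed)"
        if market_share < 25:
            return "Early Majority (before the tipping point)"
        if market_share < 50:
            return "Early Majority (tipping point reached)"
        if market_share < 84:
            return "Late Majority"
        return "Laggards"

    print(adoption_stage(20))  # -> "Early Majority (before the tipping point)"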

For example, it could be argued that master data management (MDM) reached its tipping point in late 2009, and with the wave of acquisitions in early 2010, MDM stepped firmly on the gas pedal of the Early Majority, and we are perhaps just beginning to see the start of MDM’s Late Majority.

It is much harder to estimate where we are within the diffusion of data governance.  Of course, corporate cultural awareness always plays a significant role in determining the adoption of new ideas and the market share of emerging technologies.

The Initiate survey concludes that “the state of data governance initiatives is still rather immature in most organizations” and reveals “a surprising lack of perceived executive interest in data governance initiatives.”

Rob Karel of Forrester Research recently blogged about how Data Governance Remains Immature, but he is “optimistic that we might finally see some real momentum building for data governance to be embraced as a legitimate competency.”

“It will likely be a number of years before best practices outnumber worst practices,” as Rob concludes, “but any momentum in data governance adoption is good momentum!”

From my perspective, data governance is still in the Early Adopter phase.  Perhaps 2011 will be “The Year of Data Governance” in much the same way that some have declared 2010 to be “The Year of MDM.”

In other words, it may be another six to twelve months before we can claim the Early Majority has truly embraced not just the idea of data governance, but has also realistically begun the journey toward making it happen.

 

What Say You?

Please share your thoughts about the diffusion of data governance, as well as your overall perspectives on data governance.

 

Related Posts

MacGyver: Data Governance and Duct Tape

The Prince of Data Governance

Jack Bauer and Enforcing Data Governance Policies

 



New Time Human Business

The song “Old Time Rock and Roll” by Bob Seger was perhaps immortalized by that famous scene in the film Risky Business, which has itself become immortalized by its many parodies including the television commercials for the video game Guitar Hero.

As I recently blogged about in my post The Great Rift, the real risky business in the new economy of the 21st century is when organizations prioritize the value of things over the value of people.

Since we here in the United States are preparing for a long holiday weekend in celebration of the Fourth of July, and because I am (as usual) in a musical state of mind, I wrote my own parody song called “New Time Human Business.”

 

New Time Human Business

Just get that Old School way of doing business off your mind,
And listen to me sing about the Human Side of Business, because it’s time.
Today’s business world ain’t got no damn soul,
I like how the New Time Human Business rolls!

Don’t try to take my message to an executive boardroom,
You’ll find they stopped listening to their people a long time before.
I don’t know how they manage to even get their fat heads through the door,
I like how the New Time Human Business rolls!

I’ve always liked how the Human Side of Business rolls,
That kind of business just soothes the soul.
I reminisce about the days of old,
When “Mom and Pop” knew the real business goal,
Relationship, rapport, and trust—yeah, that’s what sold!

Today’s business world ain’t got no damn soul,
I like how the New Time Human Business rolls!

Won’t go to a Big Business Rally to hear them toot their own horn,
I’d rather hear real people sing some classic blues or funky old soul.
There’s only one sure way to get me to listen to your goals,
Start singing like how the New Time Human Business rolls!

Call me a rebel, call me a dreamer, call me what you will,
Say I’m an idiot, say doing business this way, I’ll never pay my damn bills.
But today’s business world ain’t got no damn soul,
I like how the New Time Human Business rolls!

I’ve always liked how the Human Side of Business rolls,
That kind of business just soothes the soul.
I reminisce about the days of old,
When “Mom and Pop” knew the real business goal,
Relationship, rapport, and trust—yeah, that’s what sold!

Today’s business world ain’t got no damn soul,
I like how the New Time Human Business rolls!

Do you believe in Magic (Quadrants)?


If you follow Data Quality on Twitter like I do, then you are probably already well aware that the 2010 Gartner Magic Quadrant for Data Quality Tools was released this week (surprisingly, it did not qualify as a Twitter trending topic).

The five vendors that were selected as the “data quality market leaders” were SAS DataFlux, IBM, Informatica, SAP Business Objects, and Trillium.

Disclosure: I am a former IBM employee, former IBM Information Champion, and I blog for the Data Roundtable, which is sponsored by SAS.

Please let me stress that I have the highest respect for both Ted Friedman and Andy Bitterer, as well as their in-depth knowledge of the data quality industry and their insightful analysis of the market for data quality tools.

In this blog post, I simply want to encourage a good-natured debate, and not about the Gartner Magic Quadrant specifically, but rather about market research in general.  Gartner is used as the example because they are perhaps the most well-known and the source most commonly cited by data quality vendors during the sales cycle—and obviously, especially by the “leading vendors.”

I would like to debate how much of an impact market research really has on a prospect’s decision to purchase a data quality tool.

Let’s agree to keep this to a very informal debate about how research can affect both the perception and the reality of the market.

Therefore—for the love of all high quality data everywhere—please, oh please, data quality vendors, do NOT send me your quarterly sales figures, or have your PR firm mercilessly spam either my comments section or my e-mail inbox with all the marketing collateral “proving” how Supercalifragilisticexpialidocious your data quality tool is—I said please, so play nice.

 

The OCDQ View on OOBE-DQ

In a previous post, I used the term OOBE-DQ to refer to the out-of-box-experience (OOBE) provided by data quality (DQ) tools, which usually becomes a debate between “ease of use” and “powerful functionality” after you ignore the Magic Beans sales pitch that guarantees you the data quality tool is both remarkably easy to use and incredibly powerful.

However, the data quality market continues to evolve away from esoteric technical tools and toward business-empowering suites providing robust functionality with easier-to-use, role-based interfaces that are tailored to the specific needs of different users, such as business analysts, data stewards, application developers, and system administrators.

The major players are still the large vendors, who have assembled (mostly via acquisition and consolidation) enterprise application development platforms with components integrated to varying degrees, which provide not only data quality functionality, but also data integration and master data management (MDM).

Many of these vendors also offer service-oriented deployments delivering the same functionality within more loosely coupled technical architectures, which includes leveraging real-time services to prevent (or at least greatly minimize) poor data quality at the multiple points of origin within the data ecosystem.

Many vendors are also beginning to provide better built-in reporting and data visualization capabilities, which is helping to make the correlation between poor data quality and suboptimal business processes more tangible, especially for executive management.

It must be noted that many vendors (including the “market leaders”) continue to struggle with their International OOBE-DQ. 

Many (if not most) data quality tools are strongest in their native country or their native language, but their OOBE-DQ declines significantly when they travel abroad.  Especially outside of the United States, smaller vendors with local linguistic and cultural expertise built into their data quality tools have continued to remain fiercely competitive with the larger vendors.

Market research certainly has a role to play in making a purchasing decision, and perhaps most notably as an aid in comparing and contrasting features and benefits, which of course, always have to be evaluated against your specific requirements, including both your current and future needs. 

Now let’s shift our focus to examining some of the inherent challenges of evaluating market research, perception, and reality.

 

Confirmation Bias

First of all, I realize that this debate will suffer from a considerable—and completely understandable—confirmation bias.

If you are a customer, employee, or consultant for one of the “High Five” (not an “official” Gartner Magic Quadrant term for the Leaders), then obviously you have a vested interest in getting inebriated on your own Kool-Aid (as noted in my disclosure above, I used to get drunk on the yummy Big Blue Kool-Aid).  Now, this doesn’t mean that you are a “yes man” (or a “yes woman”).  It simply means it is logical for you to claim that market research, market perception, and market reality are in perfect alignment.

Likewise, if you are a customer, employee, or consultant for one of the “It Isn’t Easy Being Niche-y” (rather surprisingly, not an “official” Gartner Magic Quadrant term for the Niche Players), then obviously you have a somewhat vested interest in claiming that market research is from Mars, market perception is from Venus, and market reality is really no better than reality television.

And, if you are a customer, employee, or consultant for one of the “We are on the outside looking in, flipping both Gartner and their Magic Quadrant the bird for excluding us” (I think that you can figure out on your own whether or not that is an “official” Gartner Magic Quadrant term), then obviously you have a vested interest in saying that market research can “Kiss My ASCII!”

My only point is that your opinion of market research will obviously be influenced by what it says about your data quality tool. 

Therefore, should it really surprise anyone when, during the sales cycle, one of the High Five uses the Truly Awesome Syllogism:

“Well, of course, we say our data quality tool is awesome.
However, the Gartner Magic Quadrant also says our data quality tool is awesome.
Therefore, our data quality tool is Truly Awesome.”

Okay, so technically, that’s not even a syllogism—but who said any form of logical argument is ever used during a sales cycle?

On a more serious note, and to stop having too much fun at Gartner’s expense, they do advise against simply selecting vendors from their “Leaders quadrant,” and instead always advise selecting the vendor that is the best match for your specific requirements.

 

Features and Benefits: The Game Nobody Wins

As noted earlier, a features and benefits comparison is not only the most common technique used by prospects, but it is also the most common—if not the only—way that the vendors themselves position their so-called “competitive differentiation.”

The problem with this approach—and not just for data quality tools—is that there are far more similarities than differences to be found when comparing features and benefits. 

Practically every single data quality tool on the market today will include functionality for data profiling, data quality assessment, data standardization, data matching, data consolidation, data integration, data enrichment, and data quality monitoring.

Therefore, running down a checklist of features is like playing a game of Buzzword Bingo, or constantly playing Musical Chairs, but without removing any of the chairs in between rounds—in other words, the Features Game almost always ends in a tie.

So then next we play the Benefits Game, which is usually equally pointless because it comes down to silly arguments such as “our data matching engine is better than yours.”  This is the data quality tool vendor equivalent of:

Vendor D: “My Dad can beat up your Dad!”

Vendor Q: “Nah-huh!”

Vendor D: “Yah-huh!”

Vendor Q: “NAH-HUH!”

Vendor D: “YAH-HUH!”

Vendor Q: “NAH-HUH!”

Vendor D: “Yah-huh!  Stamp it!  No Erasies!  Quitsies!”

Vendor Q: “No fair!  You can’t do that!”

After both vendors have returned from their “timeout,” a slightly more mature approach is to run a vendor “bake-off” where the dueling data quality tools participate in a head-to-head competition processing a copy of the same data provided by the prospect. 

However, a bake-off often produces misleading results because the vendors—and not the prospect—perform the competition, making it mostly about vendor expertise, not OOBE-DQ.  Also, the data used rarely exemplifies the prospect’s data challenges.

If competitive differentiation based on features and benefits is a game that nobody wins, then what is the alternative?

 

The Golden Circle


I recently read the book Start with Why by Simon Sinek, which explains that “people don’t buy WHAT you do, they buy WHY you do it.” 

The illustration shows what Simon Sinek calls The Golden Circle.

WHY is your purpose—your driving motivation for action.

HOW is your principles—the specific actions you take to realize your WHY.

WHAT is your results—the tangible ways in which you bring your WHY to life.

It’s a circle when viewed from above, but in reality it forms a megaphone for broadcasting your message to the marketplace. 

When you rely only on the approach of attempting to differentiate your data quality tool by discussing its features and benefits, you are focusing on only your WHAT, and absent your WHY and HOW, you sound just like everyone else to the marketplace.

When, as is often the case, nobody wins the Features and Benefits Game, a data quality tool sounds more like a commodity, which will focus the marketplace’s attention on aspects such as your price—and not on aspects such as your value.

Due to the considerable length of this blog post, I have been forced to greatly oversimplify the message of this book, which a future blog post will discuss in more detail.  I highly recommend the book (and no, I am not an affiliate).

At the very least, consider this question:

If there truly was one data quality tool on the market today that, without question, had the very best features and benefits, then why wouldn’t everyone simply buy that one? 

Of course your data quality tool has solid features and benefits—just like every other data quality tool does.

I believe that the hardest thing for our industry to accept is—the best technology hardly ever wins the sale. 

As most of the best salespeople will tell you, what wins the sale is when a relationship is formed between vendor and customer, a strategic partnership built upon a solid foundation of rapport, respect, and trust.

And that has more to do with WHY you would make a great partner—and less to do with WHAT your data quality tool does.

 

Do you believe in Magic (Quadrants)?


How much of an impact do you think market research has on the purchasing decision of a data quality tool?  How much do you think research affects both the perception and the reality of the data quality tool market?  How much do you think the features and benefits of a data quality tool affect the purchasing decision?

All perspectives on this debate are welcome without bias.  Therefore, please post a comment below.

PLEASE NOTE

Comments advertising your products and services (or bashing competitors) will not be approved.

 

 

The Great Rift

I recently read a great article about social collaboration in the enterprise by Julie Hunt, which includes the excellent insight:

“Most enterprises have failed to engender a ‘collaboration culture’ based on real human interaction.  The executive management of many companies does not even understand what a ‘collaboration culture’ is.  Frankly, executive management of many companies is hard put to authentically value employees—these companies want to de-humanize employees with such terms as ‘resources’ and ‘human capital’, and think that it is enough if they sling around a few ‘mission statements’ claiming that they ‘value’ employees.”

Even though the article was specifically discussing the reason why companies struggle to effectively use social media in business, it reminded me of the reason that many enterprise initiatives struggle—if not fail—to live up to their rather lofty expectations.

The most common root cause for the failure of enterprise initiatives is what I like to refer to as The Great Rift.

 

The Great Rift

In astronomy, the Great Rift—also known as the Dark Rift—is a series of overlapping and non-luminous molecular dust clouds, which appear to create a dark divide in the otherwise bright band of stars and other luminous objects comprising our Milky Way.

Within the intergalactic empires of the business world, The Great Rift is a metaphor for the dark divide separating how most of these organizations would list and prioritize their corporate assets:

Please note that a list of things is on the left side of The Great Rift and on the right side is a list of people. 

Although the order of importance given to the items within each of these lists is debatable, I would argue what is not debatable is that the list of things is what most organizations prioritize as their most important corporate assets.

It is precisely this prioritization of the value of things over the value of people that creates and sustains The Great Rift.

Of course, the message delivered by corporate mission statements, employee rallies, and customer conferences would lead you to believe the exact opposite is true—and in fairness, some organizations do prioritize the value of people over the value of things.

However, the harsh reality of the business world is that the message “we value our people” is often only a Machiavellian illusion.

I believe that as long as The Great Rift exists, then no enterprise initiative can be successful—or remain successful for very long. 

The enterprise-wide communication and collaboration that is so critical to achieving and sustaining success on initiatives such as Master Data Management (MDM) and Data Governance, can definitely not escape the effects of The Great Rift. 

Eventually, The Great Rift becomes the enterprise equivalent of a black hole, where not even the light shining from your very brightest stars will be able to escape its gravitational pull.

“Returning to the human side of business won’t happen magically,” Julie Hunt concluded her article.  “It will take real work and real commitment, from the executive level through all levels of management and employee departments.”

I wholeheartedly agree with Julie.  I will therefore conclude this blog post by paraphrasing the lyrics of “Yellow” by Coldplay into a song I am simply calling “People,” because repairing The Great Rift and “returning to the human side of business” can only be accomplished by acknowledging that every organization’s truly most important corporate asset is their people.

Rumor has it that The Rolling Forecasts might even add the song to their playlist for the Data Rock Star World Tour 2010.

 

People

Look at your people
Look how they shine for you
And in everything they do
Yeah, they’re all stars

They came along 
They wrote a song for you
About all the Things they do
And it was called People

So they each took their turn 
And sung about all the things they’ve done
And it was all for you

Your business
Oh yeah, your technology and your data too
They turned it all into something beautiful
Did you know they did it for you?
They did it all for you

Now what are you going to do for them?

They crossed The Great Rift
They jumped across for you 
Because all the things you do
Are all done by your people

Look at your stars
Look how they shine
And in everything they do
Look how they shine for you

They crossed the line
The imaginary line drawn by you
Oh what a wonderful thing to do
And it was all for you

Your business
Oh yeah, your technology and your data too
They turned it all into something beautiful
Did you know they did it for you?
They did it all for you

Now what are you going to do for them?

Look at your people, they’re your stars, it’s true
Look how they shine
And in everything they do
Look how they shine for you

Look at your people
Look at your stars
Look how they shine
And in everything they do
Look how they shine for you

Now what are you going to do for them?

Channeling My Inner Beagle: The Case for Hyperactivity

UnderDog

Phil Simon, who is a Bulldog’s best friend and is a good friend of mine, recently blogged Channeling My Inner Bulldog: The Case for Stubbornness, in which he described how the distracting nature of multitasking can impair our ability to solve complex problems.

Although I understood every single word he wrote, after three dog nights, I can’t help but take the time to share my joy to the world by channeling my inner beagle and making the case for hyperactivity—in other words, our need to simply become better multitaskers.

The beloved mascot of my blog post is Bailey, not only a great example of a typical Beagle, but also my brother’s family dog, who is striking a heroic pose in this picture while proudly sporting his all-time favorite Halloween costume—Underdog.

I could think of no better hero to champion my underdog of a cause:

“There’s no need to fear . . . hyperactivity!”

 

Please Note: Just because Phil Simon coincidentally uses “Simon Says” as the heading for all his blog conclusions, doesn’t mean Phil is Simon Bar Sinister, who coincidentally used “Simon Says” to explain his diabolical plans—that’s completely coincidental.

 

The Power of Less

I recently read The Power of Less, the remarkable book by Leo Babauta, which provides practical advice on simplifying both our professional and personal lives.  The book has a powerfully simple message—identify the essential, eliminate the rest.

I believe that the primary reason multitasking gets such a bad reputation is the numerous non-essential tasks typically included. 

Many daily tasks are simply “busy work” that we either don’t really need to do at all, or don’t need to do as frequently.  We have allowed ourselves to become conditioned to perform certain tasks, such as constantly checking our e-mail and voice mail. 

Additionally, whenever we do find a break in our otherwise hectic day, “nervous energy” often causes us to feel like we should be doing something with our time—and so the vicious cycle of busy work begins all over again.

“Doing nothing is better than being busy doing nothing,” explained Lao Tzu.

I personally find that whenever I am feeling overwhelmed by multitasking, it’s not because I am trying to distribute my time among a series of essential tasks—it’s because I am really just busy doing a whole lot of nothing.  “Doing a huge number of things,” explains Babauta, “doesn’t mean you’re getting anything meaningful done.”

Meaningful accomplishment requires limiting our focus to only essential tasks.  Unlimited focus, according to Babauta, is like “taking a cup of red dye and pouring it into the ocean, and watching the color dilute into nothingness.  Limited focus is putting that same cup of dye into a gallon of water.”

Only you can decide which tasks are essential.  Look at your “to do list” and first identify the essential—then eliminate the rest.

 

It’s about the journey—not the destination

Once you have eliminated the non-essential tasks, your next challenge is limiting your focus to only the essential tasks. 

Perhaps the simplest way to limit your focus and avoid the temptation of multitasking altogether is to hyper-focus on only one task at a time.  So let’s use reading a non-fiction book as an example of one of the tasks you identified as essential.

Some people would read this non-fiction book as fast as they possibly can—hyper-focused and not at all distracted—as if they’re trying to win “the reading marathon” by finishing the book in the shortest time possible. 

They claim that this gives them both a sense of accomplishment and allows them to move on to their next essential task, thereby always maintaining their vigilant hyper-focus of performing only one task at a time. 

However, what did they actually accomplish other than simply completing the task of reading the book?

I find that people—myself included—who voraciously read non-fiction books often struggle when attempting to explain a book, and in fact, they usually can’t tell you anything more than what you would get from simply reading its jacket cover.

Furthermore, they often can’t demonstrate any proof of having learned anything from reading the book.  Now, if they were reading fiction, I would argue that’s not a problem.  However, their “undistracted productivity” of reading a non-fiction book can easily amount to nothing more than productive entertainment. 

They didn’t mind the gap between the acquisition of new information and its timely and practical application.  Therefore, they didn’t develop valuable knowledge.  They didn’t move forward on their personal journey toward wisdom. 

All they did was productively move the hands of the clock forward—all they did was pass the time.

Although by eliminating distractions and focusing on only essential tasks, you’ll get more done and reach your destination faster, in my humble opinion, a meaningful life is not a marathon—a meaningful life is a race not to run.

It’s about the journey—not the destination.  In the words of Ralph Waldo Emerson:

“With the past, I have nothing to do; nor with the future.  I live now.”

Hyperactivity is Simply Better Multitasking

Although I do definitely believe in the power of less, the need to eliminate non-essential tasks, and the need to focus my attention, I am far more productive when hyper-active (i.e., intermittently alternating my attention among multiple simultaneous tasks).

Hyperactively collecting small pieces of meaningful information from multiple sources, as well as from the scattered scraps of knowledge whirling around inside my head, is more challenging, and more stressful, than focusing on only one task at a time.

However, at the end of most days, I find that I have made far more meaningful progress on my essential tasks. 

Although, in all fairness, I often break down and organize essential tasks into smaller sub-tasks, group similar sub-tasks together, and then multitask within only one group at a time.  This lower-level multitasking minimizes what I call the plate spinning effect, where an interruption can easily cause a disastrous disruption in productivity.

Additionally, I believe that not all distractions are created equal.  Some, in fact, can be quite serendipitous.  Therefore, I usually allow myself to include one “creative distraction” in my work routine.  (Typically, I use either Twitter or some source of music.)

By eliminating non-essential tasks, grouping together related sub-tasks, and truly embracing the chaos of creative distraction, hyperactivity is simply better multitasking—and I think that in the Digital Age, this is a required skill we all must master.

 

The Rumble in the Dog Park

So which is better?  Stubbornness or Hyperactivity?  In the so-called Rumble in the Dog Park, who wins?  Bulldogs or Beagles? 

I know that I am a Beagle.  Phil knows he is a Bulldog.  I would be unhappy as a Bulldog.  Phil would be unhappy as a Beagle. 

And that is the most important point.

There is absolutely no better way to make yourself unhappy than by trying to live by someone else’s definition of happiness.

You should be whatever kind of dog that truly makes you happy.  In other words, if you prefer single-tasking, then be a Bulldog, and if you prefer multitasking, then be a Beagle—and obviously, Bulldogs and Beagles are not the only doggone choices.

Maybe you’re one of those people who prefers cats—that’s cool too—just be whatever kind of cool cat truly makes you happy. 

Or maybe you’re neither a dog person nor a cat person.  Maybe you’re more of a Red-Eared Slider kind of person—that’s cool too.

And who ever said that you had to choose to be only one kind of person anyway? 

Maybe some days you’re a Beagle, other days you’re a Bulldog, and on weekends and vacation days you’re a Red-Eared Slider. 

It’s all good.

Just remember—no matter what—always be you.

Twitter, Meaningful Conversations, and #FollowFriday

In social media, one of the most common features of social networking services is allowing users to share brief status updates.  Twitter is currently built on only this feature and uses status updates (referred to as tweets) that are limited to a maximum of 140 characters, which creates a rather pithy platform that many people argue is incompatible with meaningful communication.

Although I use Twitter for a variety of reasons, one of them is sharing quotes that I find thought-provoking.  For example:

 

This George Santayana quote was shared by James Geary, whom I follow on Twitter because he uses his account to provide the “recommended daily dose of aphorisms.”  My re-tweet (i.e., “forwarding” of another user’s status update) triggered the following meaningful conversation with Augusto Albeghi, the founder of StraySoft, who is known as @Stray__Cat on Twitter:

 

Now of course, I realize that what exactly constitutes a “meaningful conversation” is debatable regardless of the format.

Therefore, let me first provide my definition, which consists of the following three simple requirements:

  1. At least two people discussing a topic, which is of interest to all parties involved
  2. Allowing all parties involved to have an equal chance to speak (or otherwise share their thoughts)
  3. Attentively listening to the current speaker—as opposed to merely waiting for your turn to speak

Next, let’s examine why Twitter’s format can be somewhat advantageous to satisfying these requirements:

  1. Although many (if not most) tweets are not necessarily attempting to start a conversation, at the very least they do provide a possible topic for any interested parties
  2. Everyone involved has an equal chance to speak, but time lags and multiple simultaneous speakers can occur, which in all fairness can happen in any other format
  3. Tweets provide somewhat of a running transcript (again, time lags can occur) for the conversation, making it easier to “listen” to the other speaker (or speakers)

Now, let’s address the most common objection to Twitter being used as a conversation medium:

“How can you have a meaningful conversation when constrained to only 140 characters at a time?”

I admit to being a long-winded talker or, as a favorite (canceled) television show would say, “conversationally anal-retentive.”  In the past (slightly less now), I was also known for e-mail messages even Leo Tolstoy would declare to be far too long.

However, I wholeheartedly agree with Jennifer Blanchard, who explained how Twitter makes you a better writer.  When forced to be concise, you have to focus on exactly what you want to say, using as few words as possible.

I call this reduction of your message to its bare essence—the power of pith.  In order to engage in truly meaningful conversations, this is a required skill we all must master, and not just for tweeting—but Twitter does provide a great practice environment.

 

At least that’s my 140 characters worth on this common debate—well okay, it’s more like my 5,000 characters worth.

 

Great folks to follow on Twitter

Since this blog post was published on a Friday, which for Twitter users like me means it’s FollowFriday, I would like to conclude by providing a brief list of some great folks to follow on Twitter. 

Although by no means a comprehensive list, and listed in no particular order whatsoever, here are some great tweeps, especially if you are interested in Data Quality, Data Governance, Master Data Management, and Business Intelligence:

[Embedded list of recommended Twitter accounts]

PLEASE NOTE: No offense is intended to any of my tweeps not listed above.  However, if you feel that I have made a glaring omission of an obviously Twitterific Tweep, then please feel free to post a comment below and add them to the list.  Thanks!

I hope that everyone has a great FollowFriday and an even greater weekend.  See you all around the Twittersphere.

 

Related Posts

Wordless Wednesday: June 16, 2010

Data Rock Stars: The Rolling Forecasts

The Fellowship of #FollowFriday

Social Karma (Part 7)

The Wisdom of the Social Media Crowd

The Twitter Clockwork is NOT Orange

Video: Twitter #FollowFriday – January 15, 2010

Video: Twitter Search Tutorial

Live-Tweeting: Data Governance

Brevity is the Soul of Social Media

If you tweet away, I will follow

Tweet 2001: A Social Media Odyssey

MacGyver: Data Governance and Duct Tape

One of my favorite 1980s television shows was MacGyver, which starred Richard Dean Anderson as an extremely intelligent and endlessly resourceful secret agent, known for his practical application of scientific knowledge and inventive use of common items.

While I was thinking about the role of both data stewards and data cleansing within a successful data governance program, the two things that immediately came to mind were MacGyver and that other equally versatile metaphor—duct tape.

I decided to combine these two excellent metaphors by envisioning MacGyver as a data steward and duct tape as data cleansing.

 

Data Steward: The MacGyver of Data Governance

Since “always prepared for adventure” was one of the show’s taglines, I think MacGyver would make an excellent data steward.

The fact that the activities associated with the role can vary greatly almost qualifies “data steward” as a MacGyverism.  Your particular circumstances, and especially the unique corporate culture of your organization, will determine the responsibilities of your data stewardship function, but the general principles of data stewardship, as defined by Jill Dyché, include the following:

  • Stewardship is the practice of managing or looking after the well-being of something.
  • Data is an asset owned by the enterprise.
  • Data stewards do not necessarily “own” the data assigned to them.
  • Data stewards care for data assets on behalf of the enterprise.

Just like MacGyver’s trusted sidekick—his Swiss Army knife—a data steward’s most common trait may be versatility.

I am not suggesting that a data steward is a jack of all trades, but master of none.  Rather, a data steward often has a HedgeFoxian personality, thereby possessing the versatility necessary to integrate disparate disciplines into practical solutions.

In her excellent article Data Stewardship Strategy, Jill Dyché outlined six tried-and-true techniques that can help you avoid some common mistakes and successfully establish a data stewardship function within your organization.  The second technique provides a few examples of typical data stewardship activities, which often include assessing and correcting data quality issues.

 

Data Cleansing: The Duct Tape of Data Quality

About poor data quality, MacGyver says, “If I had some duct tape, I could fix that.”  (Okay—so he says that about everything.)

Data cleansing is the duct tape of data quality.

Proactive defect prevention is highly recommended because the more control enforced where data originates, the better the overall quality of enterprise information will be, even though it is impossible to truly prevent every problem before it happens.

However, when poor data quality negatively impacts decision-critical information, the organization may legitimately prioritize a reactive short-term response—where the only remediation will be finding and fixing the immediate problems. 

Of course, remediation limited to data cleansing alone will neither identify nor address the burning root cause of those problems. 

Effectively balancing the demands of a triage mentality with the best practice of implementing defect prevention wherever possible will often create a very challenging situation for data stewards to contend with on a daily basis.  However, as MacGyver says:

“When it comes down to me against a situation, I don’t like the situation to win.”

Therefore, although comprehensive data remediation will require combining reactive and proactive approaches to data quality, data stewards need to always keep plenty of duct tape on hand (i.e., put data cleansing tools to good use whenever necessary).
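To give the duct tape metaphor one concrete (and purely illustrative) form, here is a minimal sketch in Python of reactive data cleansing; the field and the rules are invented, and notice that it repairs the records in hand without touching whatever upstream process keeps producing them:

    import re

    def cleanse_phone(raw):
        """Normalize a US phone number to 10 digits, or return None if unsalvageable."""
        digits = re.sub(r"\D", "", raw or "")
        if len(digits) == 11 and digits.startswith("1"):
            digits = digits[1:]  # strip the country code
        return digits if len(digits) == 10 else None

    # Reactive duct tape: fix what we can, flag what we cannot
    for raw in ["(555) 867-5309", "1-555-867-5309", "555.867.5309", "N/A"]:
        print(repr(raw), "->", cleanse_phone(raw))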

 

The Data Governance Foundation

In the television series, MacGyver eventually left the clandestine service and went to work for the Phoenix Foundation.

Similarly, in the world of data quality, many data stewards don’t formally receive that specific title until they go to work helping to establish their organization’s overall Data Governance Foundation.

Although it may be what the function is initially known for, as Jill Dyché explains, “data stewardship is bigger than data quality.”

“Data stewards establish themselves as adept at executing new data governance policies and consequently, vital to ongoing information management, they become ambassadors on data’s behalf, proselytizing the concept of data as a corporate asset.”

Of course, you must remember that many of the specifics of the data stewardship function will be determined by your unique corporate culture and where your organization currently is in terms of its overall data governance maturity.

Although not an easy mission to undertake, the evolving role of a data steward is of vital importance to data governance.

The primary focus of data governance is the strategic alignment of people throughout the organization through the definition, and enforcement, of policies in relation to data access, data sharing, data quality, and effective data usage, all for the purposes of supporting critical business decisions and enabling optimal business performance. 

I know that sounds like a daunting challenge (and it definitely is) but always remember the wise words of Angus MacGyver:

“Brace yourself.  This could be fun.”

Related Posts

The Prince of Data Governance

Jack Bauer and Enforcing Data Governance Policies

The Circle of Quality

A Tale of Two Q’s

Live-Tweeting: Data Governance

 

Follow OCDQ

If you enjoyed this blog post, then please subscribe to OCDQ via my RSS feed, my E-mail updates, or Google Reader.

You can also follow OCDQ on Twitter, fan the Facebook page for OCDQ, and connect with me on LinkedIn.


Wednesday Word: June 23, 2010

Wednesday Word is an OCDQ regular segment intended to provide an occasional alternative to my Wordless Wednesday posts.  Wednesday Word provides a word (or words) of the day, including both my definition and an example of recommended usage.

 

Referential Narcissisity

Definition – When referential integrity is enforced, a relational database table’s foreign key columns must only contain data values from their parent table’s primary key column.  Referential narcissisity occurs when a table’s foreign key columns refuse to acknowledge data values from their alleged parent table—especially when the parent table was created by another DBA.

Example – The following scene is set on the eighth floor of the Nemesis Corporation, where within the vast cubicle farm of the data architecture group, Bob, a Business Analyst struggling with an ad hoc report, seeks the assistance of Doug, a Senior DBA.

Bob: “Excuse me, Doug.  I don’t mean to bother you, I know you are a very busy and important man, but I am trying to join the Sales Transaction table to the Customer Master table using Customer Key, and my queries always return zero rows.”

Doug: “That is because although Doug created the Sales Transaction table, the Customer Master table was created by Craig.  Doug’s tables do not acknowledge any foreign key relationships with Craig’s tables.  Doug is superior to Craig in every way.  Doug’s Kung Fu is the best—and until Craig publicly acknowledges this, your joins will not return any rows.”

Bob: “Uh, why do you keep referring to yourself in the third person?”

Doug: “Doug is bored with this conversation now.  Be gone from my sight, lowly business analyst.  You should be happy that Doug even acknowledged your presence at all.” 
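For anyone who would rather see referential integrity actually enforced (referential narcissisity, thankfully, remains unimplemented in every database engine I know of), here is a minimal sketch using Python's built-in sqlite3 module; the table and column names follow Bob's example, but the data is invented:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces foreign keys only on request

    conn.execute("CREATE TABLE customer_master (customer_key INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("""CREATE TABLE sales_transaction (
                        transaction_id INTEGER PRIMARY KEY,
                        customer_key INTEGER REFERENCES customer_master (customer_key),
                        amount REAL)""")

    conn.execute("INSERT INTO customer_master VALUES (1, 'Acme Corporation')")
    conn.execute("INSERT INTO sales_transaction VALUES (100, 1, 250.00)")  # parent row exists

    try:
        # With integrity enforced, a child row pointing at a nonexistent parent is rejected
        conn.execute("INSERT INTO sales_transaction VALUES (101, 99, 125.00)")
    except sqlite3.IntegrityError as error:
        print("Rejected:", error)  # FOREIGN KEY constraint failed

    # Bob's join: every remaining foreign key value has a matching parent row,
    # so this returns data instead of the zero rows that Doug's tables produced
    for row in conn.execute("""SELECT t.transaction_id, c.name, t.amount
                               FROM sales_transaction t
                               JOIN customer_master c ON c.customer_key = t.customer_key"""):
        print(row)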

 

Related Posts

Wednesday Word: June 9, 2010 – C.O.E.R.C.E.

Wednesday Word: April 28, 2010 – Antidisillusionmentarianism

Wednesday Word: April 21, 2010 – Enterpricification

Wednesday Word: April 7, 2010 – Vendor Asskisstic

The Balancing Act of Awareness

This is my sixth blog post tagged Karma since I promised to discuss it directly and indirectly on my blog throughout the year after declaring KARMA my theme word for 2010 back on the first day of January—surprisingly now almost six months ago.

Lately I have been contemplating the importance of awareness, and far more specifically, the constant challenge involved in maintaining the balance between our self-awareness and our awareness of others.

The three sections below are each prefaced by a chapter from Witter Bynner’s “American poetic” translation of the Tao Te Ching.  I certainly do not wish to offend anyone’s religious sensibilities—I am using these references in a philosophical and secular sense.

Since I also try to balance my philosophy between Eastern and Western influences, Lao Tzu won’t be the only “old master” cited.

Additionally, please note that the masculine language (e.g., “he” and “man”) used in the selected quotes below is a by-product of the age of the original texts (e.g., the Tao Te Ching is over 2,500 years old).  Therefore, absolutely no gender bias is intended.

 

Self-Awareness

“Nothing can bring you peace but yourself.”  Ralph Waldo Emerson wrote this sentence in the closing lines of his wonderful essay on Self-Reliance, which is one of my all-time favorites even though I first read it over 25 years ago.  My favorite passage is:

“What I must do is all that concerns me, not what the people think.  This rule, equally arduous in actual and in intellectual life, may serve for the whole distinction between greatness and meanness.  It is the harder because you will always find those who think they know what is your duty better than you know it.  It is easy in the world to live after the world’s opinion; it is easy in solitude to live after our own; but the great man is he who in the midst of the crowd keeps with perfect sweetness the independence of solitude.”

Emerson’s belief in the primacy of the individual was certainly not an anti-social sentiment.

Emerson believed society is best served whenever individuals possess a healthy sense of self and a well-grounded self-confidence, both of which can only be achieved if we truly come to know who we are on our own terms.

Writing more than 150 years later, and in one of my all-time favorite non-fiction books, The 7 Habits of Highly Effective People, Stephen Covey explains the importance of first achieving independence through self-mastery before successful interdependence with others is possible.  “Interdependence is a choice only independent people can make,” Covey explained.  “Dependent people cannot choose to become interdependent.  They don’t have the character to do it; they don’t own enough of themselves.”

“Private victories precede public victories,” wrote Covey, explaining that the private victories of independence are the essence of our character growth, and provide the prerequisite foundation necessary for the public victories of interdependence.

Of course, the reality is that self-awareness and independence cannot be developed only during our moments of solitude.

We must interact with others even before we have achieved self-mastery.  Furthermore, self-mastery is a continuous process.  Although self-awareness is essential for effectively interacting with others, it provides no guarantee for social success.

However, as William Shakespeare taught us by way of the character Polonius in Hamlet:

“This above all—to thine own self be true;
And it must follow, as the night the day,
Thou canst not then be false to any man.”

Other-Awareness

Empathy, which is central to our awareness of others (i.e., other-awareness), is often confused with sympathy.

Sympathy is an agreement of feeling that we express by providing support or showing compassion for the suffering of others.  Empathy is an identification with the emotions, thoughts, or perspectives expressed by others.

The key difference is found between the words agreement and identification.

Sympathy is the ability to relate oneself to others.  Empathy is the ability to see the self in others—not your self, but the unique self within each individual.  Sympathy is about trying to comfort others.  Empathy is about trying to understand others.

“Empathy is not sympathy,” explains Covey.  “Sympathy is a form of agreement, a form of judgment.  And it is sometimes the more appropriate response.  But people often feed on sympathy.  It makes them dependent.  The essence of empathy is not that you agree with someone; it’s that you fully, deeply, understand that person, emotionally as well as intellectually.”

Although both sympathy and empathy are important, empathy is more crucial for other-awareness.

We often simply act sympathetic when in the presence of others.  Therefore, sympathy is sometimes all too easy to feign and can easily remain superficial.  Empathy is less ostentatious, but can exert a far more powerfully positive influence over others.

In the words of Roy Schafer, who emphasized the role of narrative (i.e., the interpretation of our life stories) in psychoanalysis:

“Empathy involves the inner experience of sharing in and comprehending the momentary psychological state of another person.”

Balanced Awareness

Although it is easy to be aware of only our own good qualities while, at the same time, being aware of only the bad qualities of others, these convenient blind spots in our awareness can also become our greatest teachers.

Borrowing the wise words of Socrates, which thankfully were recorded for us by Plato:

“The unexamined life is not worth living.”

Examining our awareness, and shifting its focus when appropriate between self-awareness and other-awareness, truly requires a delicate balancing act.

When we become preoccupied with self-awareness, our consideration for others suffers.  Likewise, if we become too focused on other-awareness, we can neglect our own basic needs.

Aristotle wrote about such challenges using what he called the Golden Mean, which is usually simplified into the sage advice:

“Moderation in all things.” 

Obviously, there will be times when self-awareness must be our priority, and other times when it must become other-awareness. 

I believe that there is no such thing as achieving a perfect balance, but if we remain true to our own character, then hopefully a consistency will flow freely throughout all of our behaviors, our actions, and our communication and collaboration with others.

 

Related Posts

The Importance of Envelopes

The Point of View Paradox

The Challenging Gift of Social Media

True Service

The Game of Darts – An Allegory

“I can make glass tubes”

My #ThemeWord for 2010: KARMA

Promoting Poor Data Quality

A few months ago, during an e-mail correspondence with one of my blog readers from Brazil (I’ll let him decide if he wishes to remain anonymous or identify himself in the comments section), I was asked the following intriguing question:

“Who profits from poor data quality?”

The specific choice of verb (i.e., “profits”) may have been a linguistic issue, by which I mean that since I don’t know Portuguese, our correspondence had to be conducted in English. 

Please don’t misunderstand me—his writing was perfectly understandable. 

As I discussed in my blog post Can Social Media become a Universal Translator?, my native language is English, and like many people from the United States, it is the only language I am fluent in.  My friends from Great Britain would most likely point out that I am only fluent in the American “version” of the English language, but that’s a topic for another day—and another blog post.

When anyone communicates in another language—and especially in writing—not every word may be exactly right. 

For example: Muito obrigado por sua pergunta!

Hopefully (and with help from Google Translate), I just wrote “thank you for your question” in Portuguese.

My point is that I believe he was asking why poor data quality continues to persist as an extremely prevalent issue, especially when its detrimental effects on effective business decisions have become painfully obvious given the recent global financial crisis.

However, being mentally stuck on my literal interpretation of the word “profit” has delayed my blog post response—until now.

 

Promoting Poor Data Quality

In economics, the term “flight to quality” describes the aftermath of a financial crisis (e.g., a stock market crash) when people become highly risk-averse and move their money into safer, more reliable investments.  A similar “flight to data quality” often occurs in the aftermath of an event when poor data quality negatively impacted decision-critical enterprise information. 

The recent recession provides many examples of the financial aspect of this negative impact.  Therefore, even companies that may not have viewed poor data quality as a major risk—and a huge cost greatly decreasing their profits—are doing so now.

However, the retail industry has always been known for its paper-thin profit margins, which are due, in large part, to often being forced into the highly competitive game of pricing.  Although dropping the price is the easiest way to sell just about any product, it is also virtually impossible to sustain this rather effective, but short-term, tactic as a viable long-term business strategy.

Therefore, a common approach used to compete on price without risking too much on profit is to promote sales using a rebate, which I believe is a business strategy intentionally promoting poor data quality for the purposes of increasing profits.

 

You break it, you slip it—either way—you buy it, we profit

The most common form of a rebate is a mail-in rebate.  The basic premise is simple.  Instead of reducing the in-store price of a product, it is sold at full price, but a rebate form is provided that the customer can fill out and mail to the product’s manufacturer, which will then mail a rebate check to the customer—usually within a few weeks after the rebate form is approved.

For example, you could purchase a new mobile phone for $250 with a $125 mail-in rebate, which would make the “sale price” only $125—which is what the store will advertise as the actual sale price with “after a $125 mail-in rebate” written in small print.

Two key statistics significantly impact the profitability of these types of rebate programs: breakage and slippage.

Breakage is the percentage of customers who, for reasons I will get to in a moment, fail to take advantage of the rebate, and therefore end up paying full price for the product.  Returning to my example, the mobile phone that would have cost $125 if you received the $125 mail-in rebate instead costs exactly what you paid for it—$250 (plus applicable taxes, of course).

Slippage is the percentage of customers who either don’t mail in the rebate form at all, or don’t cash their received rebate check.  The former is the most common “slip,” while the latter is usually caused by failing to cash the rebate check before it expires, which is typically 30 to 90 days after it is processed (i.e., expiration dated)—and regardless of when it is actually received.
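To make the arithmetic behind breakage and slippage concrete, here is a minimal sketch in Python based on the mobile phone example above; the individual rates are invented for illustration, chosen only so that they sum to the 40% combined average cited in the conclusion below:

    full_price = 250.00   # in-store price of the mobile phone
    rebate = 125.00       # advertised mail-in rebate
    breakage = 0.25       # illustrative rate: customers who never attempt the rebate
    slippage = 0.15       # illustrative rate: customers who mail the form but never cash the check

    failed = breakage + slippage  # 40% of customers end up paying full price
    average_price_paid = failed * full_price + (1 - failed) * (full_price - rebate)

    print("Advertised sale price: $%.2f" % (full_price - rebate))     # $125.00
    print("Average price actually paid: $%.2f" % average_price_paid)  # $175.00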

Breakage, and the most common form of slippage, are generally the result of making the rebate process intentionally complex. 

Rebate forms often require you to provide a significant amount of information, both about yourself and the product, as well as attach several “proofs of purchase” such as a copy of the receipt and the barcode cut out of the product’s package. 

Data entry errors are perhaps the most commonly cited root cause of poor data quality. 

Rebates seem designed to guarantee data entry errors (by encouraging the customer to fill out the rebate form incorrectly). 

In this particular situation, the manufacturer is hyper-vigilant about data quality and for an excellent reason—poor data quality will either delay or void the customer’s rebate. 

Additionally, the fine print of the rebate form can include other “terms and conditions” voiding the rebate—even if the form is filled out perfectly.  A common example is the limitation of “only one rebate per postal address.”  This sounds reasonable, right? 

Well, one major electronics manufacturer used this disclaimer to disqualify all customers who lived in multiple unit dwellings, such as an apartment building, where another customer “at the same postal address” had already applied for a rebate.

 

Conclusion

Statistics vary by product and region, but estimates show that breakage and slippage combine on average to result in 40% of retail customers paying full price when making a purchasing decision based on a promotional price requiring a mail-in rebate.

So who profits from poor data quality?  Apparently, the retail industry does—sometimes. 

Poor data quality (and poor information quality in the case of intentionally confusing fine print) definitely has a role to play with mail-in rebates—and it’s a supporting role that can definitely lead to increased profits. 

Of course, the long-term risks and costs associated with alienating the marketplace with gimmicky promotions take their toll. 

In fact, the major electronics manufacturer mentioned above was actually substantially fined in the United States and forced to pay hundreds of thousands of dollars’ worth of denied mail-in rebates to customers.

Therefore, poor data quality, much like crime, doesn’t pay—at least not for very long.

I am not trying to demonize the retail industry. 

Excluding criminal acts of intentional fraud, such as identity theft and money laundering, this was the best example I could think of that allowed me to respond to a reader’s request—without using the far more complex example of the mortgage crisis.

 

What Say You?

Can you think of any other examples of the possible benefits—intentional or accidental—derived from poor data quality?