Freemium is the future – and the future is now

Earlier this week, two excellent blog posts—Three Ways to Start a Revolution by James Chartrand on Men with Pens, and Your Dream is Under Attack by Nathan Hangen on Copyblogger—discussed the somewhat polarizing debate about making money from blogging.  Charging for content is one of many examples of the so-called “freemium” business model, which was first articulated in 2006 by venture capitalist Fred Wilson:

“Give your service away for free, acquire a lot of customers very efficiently through word of mouth and referral networks, then offer premium priced, value added services or an enhanced version of your service to your customer base.”

In 2009, Chris Anderson published the book Free: The Future of a Radical Price, which, among much other coverage, was critically reviewed in the article Priced to Sell by Malcolm Gladwell and discussed in an interview conducted by Charlie Rose.

 

Isn't everything on the Internet supposed to be free?

The freemium model, as well as the concept expressed in Anderson's book, is not entirely about the Internet.  However, it is most often at the center of polarized debates because more and more businesses, in varying degrees, are becoming online businesses.

The general public perception is that the Internet is free.  Getting on the Internet does have a cost (sometimes conveniently ignored), in terms of electricity, ISPs, and the various computer and mobile devices used to access it.  However, once you are connected, the content on the Internet is either free or supposed to be free—according to the “logic” of a very common perspective.

To be fair, this is somewhat understandable, especially given that many of the most popular online services, such as Twitter, Facebook, and YouTube, to name but three examples from countless others, are, in fact, free – and their users often defiantly claim that they would never pay any amount of money for such a service.

 

So how does the Internet make money?

The Internet has traditionally made money the same way broadcast television (also “free” when you conveniently ignore the cost of electricity, cable and satellite providers, and the various devices used to access it) has traditionally made money – advertising.

Paraphrasing (and oversimplifying) the words of Chris Anderson, these are the three generations of making money on the Internet:

  1. Pop-Up Ads – in the beginning was the Pop-up Ad—and it was not good.  Do you still remember (or are you old enough to remember) the early days of the Internet?  Nearly every website you visited brought the seemingly random attack of pop-up ads.  Even after the invention of pop-up blockers and the advent of alternatives to pop-up ads, online advertising was not very context-sensitive, which made it not only annoying, but also largely ineffective.

     

  2. Google AdSense – the next generation of advertising was basically pioneered by Google (or companies it now owns).  Exemplified by the now somewhat ubiquitous Google AdSense, ads matched to website content made online advertising both less annoying and seemingly far more effective.

     

  3. Freemium – we are just entering the third generation of making money on the Internet, and the first one not ruled by advertising—at least not advertising in the “traditional” sense.  Under this new model, free online content is made available to everyone—providing the opportunity to “up-sell” premium content to a (typically small) percentage of your audience.

 

Freemium is NOT a new concept

Although many Internet users become seemingly outraged by the very notion of the option to purchase premium content, the idea of giving away something for free in order to facilitate a potential purchase is by no means a new concept.

Just a few simple examples include:

  • Samples at the mall food court are free, but you have to pay to eat a full meal
  • Movie previews are free, but you have to pay to watch an entire movie
  • Broadcast television shows are free, but you have to pay for the DVD box sets

The Internet, however, has seemingly always been viewed as a special case.

I believe this is mostly due to the ratio of free to premium.  Food samples, movie previews, and an individual episode of a television show are small compared to the size of a full meal, a full-length movie, and a full season (or series) of episodes.

In other words, what we get for free isn't much, so paying for the rest makes more sense.  On the Internet, this ratio is reversed. 

Since almost everything on the Internet is free (again, after the cost of connection), we are genuinely, and perhaps really quite understandably, surprised or even annoyed when we encounter something that we are asked to pay for.

In other words, since we get so much for free, paying just to get a little more simply doesn't seem to make sense. 

After all, if the full meals at the mall food court were free, we certainly wouldn't pay just to eat samples.

(And yes—I do realize that was a terrible analogy on so many levels—so please stop yelling at me.)

 

Isn't freemium the end of the world as we know it?

Obviously, the real issue is not the ratio of free to premium, or how much you should (or should not) expect to get for free. 

The fundamental argument is that anything you pay for should be worth the price.

Historically, price has been the indicator of value, meaning something has value only if people are willing to pay for it.  Higher prices, in theory at least, indicate higher value, especially if people are willing to purchase at the higher price.

So, if people are willing to pay for it, then this indicates there is a demand for it, and a supply of it must be produced to meet that demand. 

(And yes—I do realize that was a huge oversimplification of economic theory—so yet again, please stop yelling at me.)

One of the most common counter-arguments to the freemium model is that if price is allowed to essentially drop to zero, then there will be no way to accurately measure demand, which means there will be no way for content producers to determine what to supply.  Furthermore, if almost everything is free, then why would content consumers be willing to pay for anything at all?

If nobody is willing to pay, then nobody can possibly get paid, and all online content will be completely user-generated.  Following Andrew Keen's argument in The Cult of the Amateur, a cultural apocalypse then occurs, which results in not only the Internet, but the entirety of human expression, being reduced to us hurling our feces at each other just like our primate cousins.

(You may feel free to resume yelling at me now.)

 

Freemium is the future—and the future is now

Obviously, the freemium business model doesn't only apply to blogging.  By the way, it is totally understandable if you had forgotten that this lunatic rant of mine was ignited by the debate over making money from blogging.

Freemium is the future of most of the business world—and the harsh reality is that the future has already arrived.

In my opinion, too many people, companies, and in some cases, entire industries, are wasting their time, effort, and money trying to fight the unrelenting reality of freemium.  Instead of refusing to accept that the price of what you are now offering may be falling essentially to zero—focus on creating something new that people would be willing to pay for.

Once again, to paraphrase Chris Anderson, “free” is only one of many markets—and only one of many additional pricing levels. 

Don't stop at thinking about just two versions of each individual product or service—one free version and one premium version.  You should be thinking about one free version and multiple tiers of premium.  Value still drives price.  Therefore, if you can truly add more value at each tier, then you can successfully demand a higher price.

Freemium works as a viable model because people will always be willing to pay a premium for something worth its price.

If you can't (or can no longer) produce something your customers are willing to pay for—that's your problem, not theirs.

The War of Word Craft

After publishing my previous post, I watched Empire of the Word Part 4: The Future of Reading, which was a panel discussion on The Agenda with Steve Paikin, featuring Cynthia Good, Keith Oatley, Mark Federman, Bob Stein, and Bill Buxton.

Please let me stress that I highly respect all of the panelists who were involved in this discussion.  My selective paraphrasing of their quotes, which I have woven into the tapestry of this blog post, doesn't come close to doing justice to the full range of excellent insights they shared.  Therefore, although it is 53 minutes long, I highly recommend watching the full video.

 

The War of Word Craft

Bob Stein used the extremely popular multi-player online game World of Warcraft, where the players collaboratively create the narrative in real-time, as an example of the type of interactive multimedia experience that may be the true future of reading.

This analogy inspired my post title—since the debate seems to be about not only the future of reading, but also the future of how what we read (by whatever means we “read” it) will be produced—or, with far more dramatic flourish, this debate is about:

The War of Word Craft

e-Books are the end of anything worth reading?

When the financial implications of electronic publishing were briefly discussed, Bill Buxton explained that when things go digital and there is no cost of goods (i.e., producing an e-book), there is a law of economics that states the price drops essentially to zero.

Buxton argued this would mean the end of anything worth reading: when professional writers are no longer able to make a living from writing (i.e., because e-books are “free”), only amateurs will write.  This will cause a dramatic drop in the overall quality of writing, and therefore no new writing will be worth reading.

 

Publishing companies are the gatekeepers of standards?

A somewhat similar sentiment was expressed by Cynthia Good, in defending what have traditionally been considered the gatekeepers for the standards of high quality, professional writing—publishing companies. 

(Please note: Good was formerly the president of a publishing company, and is now an academic director of publishing.)

Good argues that historically it has been publishers and editors who select and perfect the books to be published, thereby guaranteeing high standards for quality writing—and that society still requires these standards.

 

The Cult of the Amateur

In 2007, Andrew Keen wrote the controversial book The Cult of the Amateur, which has the provocative sub-title: “how blogs, MySpace, YouTube, and the rest of today's user-generated media are destroying our economy, our culture, and our values.”

I am definitely not suggesting Buxton and Good are advocating a similar perspective.  However, I find both the notion that only “professional” writers can write anything worth reading, and the notion that we require gatekeepers of “standards” to protect us from ourselves, to be incredibly pretentious and outdated ideas.

Writing is not an esoteric skill possessed by only a select few—and the best writers are not motivated (only) by money.

Publishing companies publish books that guarantee a high profit margin—and not high standards for quality writing.

 

The New Word Order

Bob Stein discussed the differences between the old-school and new-school mentality of authors.

The commitment of old-school authors is to engage with the subject matter on behalf of future readers.

By contrast, the commitment of new-school authors is to engage with readers in the context of the subject matter.

Stein believes the future role of the publisher is to develop a community around the subject matter, and to bring the content to the community that wants to read it, instead of pushing the community toward the content it is told it should read.

Mark Federman agreed, and sees the role of the publisher changing into one of creating an environment of engagement for genres and niche communities, which bring together writers and readers.

Federman also sees the roles of writers and readers becoming interchangeable within these communities. 

Quoting Finnegans Wake by James Joyce: “my consumers, are they not my producers?”

Pardon the pun, but I believe this will become the new order of the publishing world, or more simply: The New Word Order.

 

A Different Kind of Social Media

Bob Stein explained that solitary reading is really a recent development in human history.  Previously, most reading was a very social activity, where groups of people came together to listen to books (and poetry and other works) being read out loud.

Books (and reading as we know it) will not go away.  However, Stein believes we are at the very beginning of the explosion of new forms of written (and other creative) expression. 

The idea of reading (and writing) with others is going to become commonplace again, because we value the input of others, which greatly improves our individual experience and understanding, and unleashes the true joy of reading.

In what Stein describes, I see the future of reading and writing as a different kind of social media—a better kind of social media.

 

New Medium, New Message

In his book Understanding Media: The Extensions of Man, Marshall McLuhan coined the phrase: “the medium is the message.”

Steve Paikin asked what, within this new medium we have been discussing, is the message?

Mark Federman responded:

“Connection—the ability to connect readers and writers and interchange their roles.  The ability to collaborate as we construct knowledge, as we engage with one another's experiences, as we bring multiple contexts into understanding what it is we are reading and creating simultaneously—that's the message.”


Will people still read in the future?

This question and debate was motivated by my comments on the recent blog post The Future of Reading by Phil Simon.

In the following OCDQ Video, I share some of my perspectives on the future of reading, specifically covering three key points:

  1. Books vs. e-Books
  2. Print Media vs. Social Media
  3. Reading vs. Multimedia

If you are having trouble viewing this video, then you can watch it on Vimeo by clicking on this link: OCDQ Video

 

A Very Brief History of Human Communication

Long before written language evolved, humans communicated using hand and facial gestures, monosyllabic and polysyllabic grunting, as well as crude drawings and other symbols, all in an attempt to share our thoughts and feelings with each other.

First, improved spoken language increased our ability to communicate by using words as verbal symbols for emotions and ideas.  Listening to stories, and retelling them to others, became the predominant means of education and “recording” our history.

Improved symbolism via more elaborate drawings, sculptures, and other physical and lyrical works of artistic expression, greatly enhanced our ability to not only communicate, but also leave a lasting legacy beyond the limits of our individual lives.

Later, written language would provide a quantum leap in human evolution.  Writing (and reading) greatly improved our ability to communicate, educate, record our history, and thereby pass on our knowledge and wisdom to future generations.

 

The Times They Are a-Changin’

The pervasiveness of the Internet and the rapid proliferation of powerful mobile technology are transforming the very nature of human communication—some purists might even argue they are regressing human communication.

I believe there is already a declining interest in reading throughout society in general, and more specifically, a marked decline across current generation gaps, which will become even more dramatic in the coming decades.

 

Books vs. e-Books

People are reading fewer books—and fewer people are reading books.  The highly polarized “book versus e-book debate” is really only a debate within the shrinking segment of the population that still reads books. 

So, yes, between us book lovers, some of us will not exchange our personal tactile relationship with printed books for an e-book reader made of the finest plastic, glass, and metal, and equipped with all the bells and whistles of the latest technology. 

However, e-book readers simply aren't going to make non-book readers want to read books.  I am truly sorry, Amazon and Barnes & Noble, but the truth is—the Kindle and Nook are not going to make reading books cool—they will simply provide an alternative for people who already enjoy reading books, and mostly for those who also love having the latest techno-gadgets.

 

Print Media vs. Social Media

We continue to see print media (newspapers, magazines, and books) either offering electronic alternatives, or transitioning into online publications—or in some cases, simply going out of business.

I believe the primary reason for this media transition is our increasing interest in exchanging what has traditionally been only a broadcast medium (print media) for a conversation medium (social media).

Social media can engage us in conversation and enable communication between content creators and their consumers.

We are constantly communicating with other people via phone calls, text messages, e-mails, and status updates on Twitter and Facebook.  We are also sharing more of our lives visually through the photos we post on Flickr and the videos we post on YouTube.  More and more, we are creating—and not just consuming—content that we want to share with others.

We are also gaining more control over how we filter communication.  Google real-time searches and e-mail alerts, RSS readers, and hashtagged Twitter streams—these are just a few examples of the many tools currently allowing us to customize and personalize the content we create and consume.

We are becoming an increasingly digital society, and through social media, we are living more and more of both our personal and professional lives online, blurring—if not eliminating—the distinction between the two.

 

Reading vs. Multimedia

I believe the future of human communication will be a return to the more direct social interactions that existed before the evolution of written language.  I am not predicting a return to polysyllabic grunting and interpretive dance. 

Instead, I believe we will rely less and less on reading and writing, and more and more on watching, listening, and speaking.

The future of human communication may become short digital bursts of multimedia experiences, seamlessly blending an economy of words with audio and video elements.  Eventually, even digitally written words may themselves disappear—and we will communicate via interactive digital video and audio—and the very notion of “literacy” may become meaningless.

But fear not—I don't predict this will happen until the end of the century—and I am probably completely wrong anyway.

 

Please Share Your Thoughts

Do you read a lot of books?  If so, have you purchased an e-book reader (e.g., Amazon Kindle, Barnes & Noble Nook) or are you planning to in the near-future?  If you have an e-book reader, how would you compare it to reading a printed book?

Do you read newspapers and/or magazines?  If so, are you reading them in print or online? 

How often do you read blogs and other publications that are only available as online content?

How often do you listen to podcasts or watch video blogs or other online videos (excluding television and movies)?

What is the future of reading?


Beyond a “Single Version of the Truth”

This post is involved in a good-natured contest (i.e., a blog-bout) with two additional bloggers: Henrik Liliendahl Sørensen and Charles Blyth.  Our contest is a Blogging Olympics of sorts, with the United States, Denmark, and England competing for the Gold, Silver, and Bronze medals in an event we are calling “Three Single Versions of a Shared Version of the Truth.” 

Please take the time to read all three posts and then vote for who you think has won the debate (see poll below).  Thanks!

 

The “Point of View” Paradox

In the early 20th century, within his Special Theory of Relativity, Albert Einstein introduced the concept that space and time are interrelated entities forming a single continuum, and therefore the passage of time can be a variable that could change for each individual observer.

One of the many brilliant insights of special relativity was that it could explain why different observers can make validly different observations – it was a scientifically justifiable matter of perspective. 

It was Einstein's apprentice, Obi-Wan Kenobi (to whom Albert explained “Gravity will be with you, always”), who stated:

“You're going to find that many of the truths we cling to depend greatly on our own point of view.”

The Data-Information Continuum

In the early 21st century, within his popular blog post The Data-Information Continuum, Jim Harris introduced the concept that data and information are interrelated entities forming a single continuum, and that speaking of oneself in the third person is the path to the dark side.

I use the Dragnet definition for data – it is “just the facts” collected as an abstract description of the real-world entities that the enterprise does business with (e.g., customers, vendors, suppliers).

Although a common definition for data quality is fitness for the purpose of use, the common challenge is that data has multiple uses – each with its own fitness requirements.  Viewing each intended use as the information that is derived from data, I define information as data in use or data in action.

Quality within the Data-Information Continuum has both objective and subjective dimensions.  Data's quality is objectively measured separate from its many uses, while information's quality is subjectively measured according to its specific use.

 

Objective Data Quality

Data quality standards provide a highest common denominator to be used by all business units throughout the enterprise as an objective data foundation for their operational, tactical, and strategic initiatives. 

In order to lay this foundation, raw data is extracted directly from its sources, profiled, analyzed, transformed, cleansed, documented, and monitored by data quality processes designed to provide and maintain universal data sources for the enterprise's information needs. 

At this phase of the architecture, the manipulations of raw data must be limited to objective standards and not be customized for any subjective use.  From this perspective, data is now fit to serve (as at least the basis for) each and every purpose.
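To make this phase more concrete, here is a minimal sketch in Python of what objective data quality measurement might look like.  The field names, validation rules, and sample records are all hypothetical illustrations—not an implementation prescribed by any particular tool:

```python
import re

# Enterprise-wide (objective) data quality rules: they validate the raw
# data itself, without reference to any specific business use.
OBJECTIVE_RULES = {
    "customer_id": lambda v: bool(v) and str(v).isdigit(),
    "postal_code": lambda v: bool(re.fullmatch(r"\d{5}(-\d{4})?", str(v))),
    "email": lambda v: "@" in str(v) and "." in str(v).split("@")[-1],
}

def profile(records):
    """Measure objective quality: the pass rate of each rule across all records."""
    totals = {field: 0 for field in OBJECTIVE_RULES}
    for record in records:
        for field, rule in OBJECTIVE_RULES.items():
            if rule(record.get(field)):
                totals[field] += 1
    return {field: totals[field] / len(records) for field in OBJECTIVE_RULES}

raw = [
    {"customer_id": "1001", "postal_code": "02134", "email": "ann@example.com"},
    {"customer_id": "",     "postal_code": "ABCDE", "email": "bob@example.com"},
]
print(profile(raw))  # {'customer_id': 0.5, 'postal_code': 0.5, 'email': 1.0}
```

Because these rules validate the raw data itself, rather than any specific use of it, the resulting pass rates can serve every business unit as a shared, objective baseline.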

 

Subjective Information Quality

Information quality standards (starting from the objective data foundation) are customized to meet the subjective needs of each business unit and initiative.  This approach leverages a consistent enterprise understanding of data while also providing the information necessary for day-to-day operations.

But please understand: customization should not be performed simply for the sake of it.  You must always define your information quality standards by using the enterprise-wide data quality standards as your initial framework. 

Whenever possible, enterprise-wide standards should be enforced without customization.  The key word within the phrase “subjective information quality standards” is standards — as opposed to subjective, which can quite often be misinterpreted as “you can do whatever you want.”  Yes you can – just as long as you have justifiable business reasons for doing so.

This approach to implementing information quality standards has three primary advantages.  First, it reinforces a consistent understanding and usage of data throughout the enterprise.  Second, it requires each business unit and initiative to clearly explain exactly how they are using data differently from the rest of your organization, and more important, justify why.  Finally, all deviations from enterprise-wide data quality standards will be fully documented. 
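As a purely hypothetical sketch of this approach (the class, field names, and rules below are illustrative assumptions, not a prescribed design), each business unit could start from the enterprise-wide standard and record any deviation together with its business justification:

```python
# Enterprise-wide (objective) standard: the shared starting framework.
ENTERPRISE_STANDARD = {
    "phone": "digits only, 10 characters",
    "country": "ISO 3166-1 alpha-2 code",
}

class InformationQualityStandard:
    """A business unit's subjective standard, derived from the enterprise standard.

    Deviations are accepted only with a documented business justification,
    so every customization is explained and recorded."""

    def __init__(self, business_unit):
        self.business_unit = business_unit
        self.rules = dict(ENTERPRISE_STANDARD)  # inherit the objective baseline
        self.deviations = []                    # audit trail of customizations

    def customize(self, field, new_rule, justification):
        if not justification:
            raise ValueError("Customization requires a business justification")
        self.deviations.append(
            {"field": field, "was": self.rules.get(field),
             "now": new_rule, "why": justification}
        )
        self.rules[field] = new_rule

marketing = InformationQualityStandard("Marketing")
marketing.customize(
    "phone", "formatted as (NNN) NNN-NNNN",
    "Phone numbers appear verbatim in printed campaign materials",
)
print(len(marketing.deviations))  # 1 documented deviation from the baseline
```

This sketch mirrors the three advantages above: the baseline is inherited consistently, every customization must explain how and why it differs, and the deviations list doubles as documentation.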

 

The “One Lie Strategy”

A common objection to separating quality standards into objective data quality and subjective information quality is the enterprise's significant interest in creating what is commonly referred to as a “Single Version of the Truth.”

However, in his excellent book Data Driven: Profiting from Your Most Important Business Asset, Thomas Redman explains:

“A fiendishly attractive concept is...'a single version of the truth'...the logic is compelling...unfortunately, there is no single version of the truth. 

For all important data, there are...too many uses, too many viewpoints, and too much nuance for a single version to have any hope of success. 

This does not imply malfeasance on anyone's part; it is simply a fact of life. 

Getting everyone to work from a single version of the truth may be a noble goal, but it is better to call this the 'one lie strategy' than anything resembling truth.”

Beyond a “Single Version of the Truth”

In the classic 1985 film Mad Max Beyond Thunderdome, the title character arrives in Bartertown, ruled by the evil Auntie Entity, where people living in the post-apocalyptic Australian outback go to trade for food, water, weapons, and supplies.  Auntie Entity forces Mad Max to fight her rival Master Blaster to the death within a gladiator-like arena known as Thunderdome, which is governed by one simple rule:

“Two men enter, one man leaves.”

I have always struggled with the concept of creating a “Single Version of the Truth.”  I imagine all of the key stakeholders from throughout the enterprise arriving in Corporatetown, ruled by the Machiavellian CEO known only as Veritas, where all business units and initiatives must go to request funding, staffing, and continued employment.  Veritas forces all of them to fight their Master Data Management rivals within a gladiator-like arena known as Meetingdome, which is governed by one simple rule:

“Many versions of the truth enter, a Single Version of the Truth leaves.”

For any attempted “version of the truth” to truly be successfully implemented within your organization, it must take into account both the objective and subjective dimensions of quality within the Data-Information Continuum. 

Both aspects of this shared perspective of quality must be incorporated into a “Shared Version of the Truth” that enforces a consistent enterprise understanding of data, but that also provides the information necessary to support day-to-day operations.

The Data-Information Continuum is governed by one simple rule:

“All validly different points of view must be allowed to enter,

In order for an all-encompassing Shared Version of the Truth to be achieved.”

 

You are the Judge

As explained at the beginning of this post, this is a good-natured contest (i.e., a blog-bout) with two additional bloggers: Henrik Liliendahl Sørensen and Charles Blyth.

Please take the time to read all three posts and then vote for who you think has won the debate.  A link to the same poll is provided on all three blogs.  Therefore, wherever you choose to cast your vote, you will be able to view an accurate tally of the current totals. 

The poll will remain open for one week, closing at midnight on November 19 so that the “medal ceremony” can be conducted via Twitter on Friday, November 20.  Additionally, please share your thoughts and perspectives on this debate by posting a comment below.  Your comment may be copied (with full attribution) into the comments section of all of the blogs involved in this debate.

 

Related Posts

Poor Data Quality is a Virus

The General Theory of Data Quality

The Data-Information Continuum

The Once and Future Data Quality Expert

World Quality Day 2009

Wednesday, November 11 is World Quality Day 2009.

World Quality Day was established by the United Nations in 1990 as a focal point for the quality management profession and as a celebration of the contribution that quality makes to the growth and prosperity of nations and organizations.  The goal of World Quality Day is to raise awareness of how quality approaches (including data quality best practices) can have a tangible effect on business success, as well as contribute towards world-wide economic prosperity.

 

IAIDQ

The International Association for Information and Data Quality (IAIDQ) was chartered in January 2004 and is a not-for-profit, vendor-neutral professional association whose purpose is to create a world-wide community of people who desire to reduce the high costs of low quality information and data by applying sound quality management principles to the processes that create, maintain and deliver data and information.

Since 2007 the IAIDQ has celebrated World Quality Day as a springboard for improvement and a celebration of successes.  Please join us to celebrate World Quality Day by participating in our interactive webinar in which the Board of Directors of the IAIDQ will share with you stories and experiences to promote data quality improvements within your organization.

In my recent Data Quality Pro article The Future of Information and Data Quality, I reported on the IAIDQ Ask The Expert Webinar with co-founders Larry English and Tom Redman, two of the industry pioneers for data quality and two of the most well-known data quality experts.

 

Data Quality Expert

As World Quality Day 2009 approaches, my personal reflections are focused on what the title data quality expert has meant in the past, what it means today, and most important, what it will mean in the future.

With over 15 years of professional services and application development experience, I consider myself to be a data quality expert.  However, my experience is paltry by comparison to English, Redman, and other industry luminaries such as David Loshin, to use one additional example from many. 

Experience is popularly believed to be the path that separates knowledge from wisdom, which is usually accepted as another way of defining expertise. 

Oscar Wilde once wrote that “experience is simply the name we give our mistakes.”  I agree.  I have found that the sooner I can recognize my mistakes, the sooner I can learn from the lessons they provide, and hopefully prevent myself from making the same mistakes again. 

The key is early detection.  As I gain experience, I gain an improved ability to more quickly recognize my mistakes and thereby expedite the learning process.

James Joyce wrote that “mistakes are the portals of discovery” and T.S. Eliot wrote that “we must not cease from exploration and the end of all our exploring will be to arrive where we began and to know the place for the first time.”

What I find in the wisdom of these sages is the need to acknowledge the favor our faults do for us.  Therefore, although experience is the path that separates knowledge from wisdom, the true wisdom of experience is the wisdom of failure.

As Jonah Lehrer explained: “Becoming an expert just takes time and practice.  Once you have developed expertise in a particular area, you have made the requisite mistakes.”

But expertise in any discipline is more than simply an accumulation of mistakes and birthdays.  And expertise is not a static state that once achieved, allows you to simply rest on your laurels.

In addition to my real-world experience working on data quality initiatives for my clients, I also read all of the latest books, articles, whitepapers, and blogs, as well as attend as many conferences as possible.

 

The Times They Are a-Changin'

Much of the discussion that I have heard regarding the future of the data quality profession has been focused on the need for the increased maturity of both practitioners and organizations.  Although I do not dispute this need, I am concerned about the apparent lack of attention being paid to how fast the world around us is changing.

Rapid advancements in technology, coupled with the meteoric rise of the Internet and social media (blogs, wikis, Twitter, Facebook, LinkedIn, etc.), have created an amazing medium that is enabling people separated by vast distances and disparate cultures to come together, communicate, and collaborate in ways few would have thought possible just a few decades ago.

I don't believe that it is an exaggeration to state that we are now living in an age where the contrast between the recent past and the near future is greater than perhaps it has ever been in human history.  This brave new world has such people and technology in it that practically every new day brings the possibility of another quantum leap forward.

Although it has been argued by some that the core principles of data quality management are timeless, I must express my doubt.  The daunting challenges of dramatically increasing data volumes and the unrelenting progress of cloud computing, software as a service (SaaS), and mobile computing architectures, would appear to be racing toward a high-speed collision with our time-tested (but time-consuming to implement properly) data quality management principles.

The times they are indeed changing, and I believe we must stop using terms like Six Sigma and Kaizen as if they were shibboleths.  If these or any other disciplines are to remain relevant, then we must honestly assess them in the harsh and unforgiving light of our brave new world, which seems to be changing faster than the speed of light.

Expertise is not static.  Wisdom is not timeless.  The only constant is change.  For the data quality profession to truly mature, our guiding principles must change with the times, or be relegated to a past that is all too quickly becoming distant.

 

Share Your Perspectives

In celebration of World Quality Day, please share your perspectives regarding the past, present, and most important, the future of the data quality profession.  With apologies to T. H. White, I declare this debate to be about the difference between:

The Once and Future Data Quality Expert

Related Posts

Mistake Driven Learning

The Fragility of Knowledge

The Wisdom of Failure

A Portrait of the Data Quality Expert as a Young Idiot

The Nine Circles of Data Quality Hell

 

Additional IAIDQ Links

IAIDQ Ask The Expert Webinar: World Quality Day 2009

IAIDQ Ask The Expert Webinar with Larry English and Tom Redman

INTERVIEW: Larry English - IAIDQ Co-Founder

INTERVIEW: Tom Redman - IAIDQ Co-Founder

IAIDQ Publications Portal

Blog-Bout: “Risk” versus “Monopoly”

A “blog-bout” is a good-natured debate between two bloggers.  This blog-bout is between Jim Harris and Phil Simon, where they debate which board game is the better metaphor for an Information Technology (IT) project: “Risk” or “Monopoly.”

 

Why “Risk” is a better metaphor for an IT Project

By Jim Harris

IT projects and “Risk” have a great deal in common.  I thought long and hard about this while screaming obscenities and watching professional sports on television, the source of all of my great thinking.  I came up with five world dominating reasons.

1. Both things start with the players marking their territory.  In Risk, the game begins with the players placing their “armies” on the territories they will initially occupy.  On IT projects, the different groups within the organization will initially claim their turf. 

Please note that the term “Information Technology” is being used in a general sense to describe a project (e.g., Data Quality, Master Data Management) and should not be confused with the IT group within an organization.  At a very high level, the Business and IT are the internal groups representing the business and technical stakeholders on a project.

The Business usually owns the data and understands its meaning and use in the day-to-day operation of the enterprise.  IT usually owns the hardware and software infrastructure of the enterprise's technical architecture. 

Both groups can claim they are only responsible for what they own, resist collaborating with the “other side,” and thereby create organizational barriers as fiercely defended as the continental borders of Europe and Asia in Risk.

2. In both, there are many competing strategies.  In Risk, the official rules of the game include some basic strategies, and over the years many players have developed their own foolproof plans to guarantee victory.  Some strategies advocate focusing on controlling entire continents, while others advise fortifying your borders by invading and occupying neighboring territories.  And my blog-bout competitor Phil Simon half-jokingly claims that the key to winning Risk is securing the island nation of Madagascar.

On IT projects, you often hear a lot of buzzwords and strategies bandied about, such as Lean, Agile, Six Sigma, and Kaizen, to name but a few.  Please understand – I am an advocate for methodology and best practices, and there are certainly many excellent frameworks out there, including the paradigms I just mentioned.

However, a general problem that I have with most frameworks is their tendency to adopt a one-size-fits-all strategy, an approach I believe is doomed to fail.  Any implemented framework must be customized to adapt to an organization’s unique culture.

In part, this is necessary because changes of any kind will be met with initial resistance, but forcing a one-size-fits-all approach also sends a message to the organization that everything they are currently doing is wrong, which will only increase the resistance to change.

Starting with a framework simply provides a reference of best practices and recommended options that have worked on successful IT projects.  Review the framework to determine what can be learned from it, and then select what will work in the current environment and what simply won't.

3. Pyrrhic victories are common during both endeavors.  In Risk, sacrificing everything to win a single battle or to defend your favorite territory can ultimately lead you to lose the war.  On IT projects, political fiefdoms can undermine what could otherwise have been a successful effort.  Do not underestimate the unique challenges of your corporate culture.

Obviously, business, technical and data issues will all come up from time to time, and there will likely be disagreements regarding how these issues should be prioritized.  Some issues will likely affect certain stakeholders more than others. 

Keeping data and technology aligned with business processes requires getting people aligned and free to communicate their concerns.  Coordinating discussions with all of the stakeholders and maintaining open communication can prevent a Pyrrhic victory for one stakeholder from causing the overall project to fail.

4. Alliances are the key to true victory.  In Risk, it is common for players to form alliances by combining their resources and coordinating their efforts in order to defend their shared borders or to eliminate a common enemy. 

On IT projects, knowledge about data, business processes and supporting technology are spread throughout the organization.  Neither the Business nor IT alone has all of the necessary information required to achieve success. 

Successful projects are driven by an executive management mandate for the Business and IT to forge an alliance of ongoing and iterative collaboration throughout the entire project.

5. The outcomes of both are too often left to chance.  IT projects are complex, time-consuming, and expensive enterprise initiatives.  Success requires people who take on the challenge united by collaboration, guided by an effective methodology, and supported by powerful technology.

But the complexity of an IT project can sometimes work against your best intentions.  It is easy to get pulled into the mechanics of documenting the business requirements and functional specifications, drafting the project plan and then charging ahead on the common mantra: “We planned the work, now we work the plan.”

Once an IT project achieves some momentum, it can take on a life of its own and the focus becomes more and more about making progress against the tasks in the project plan, and less and less on the project's actual business goals.  Typically, this leads to another all too common mantra: “Code it, test it, implement it into production, and then declare victory.”

In Risk, the outcomes are literally determined by a roll of the dice.  If you allow your IT project to lose sight of its business goals, then you treat it like a game of chance.  And to paraphrase Albert Einstein:

“Do not play dice with IT Projects.”

Why “Monopoly” is a better metaphor for an IT Project

By Phil Simon

IT projects and “Monopoly” have a great deal in common.  I thought long and hard about this at the gym, the source of all of my great thinking.  I came up with six really smashing reasons.

1. Both things take much longer than originally expected.  IT projects typically take much longer than expected for a wide variety of reasons.  Rare is the project that finishes on time (with expected functionality delivered).

The same holds true for Monopoly.  Remember when you were a kid and you wanted to play a quick game?  Now, I consider the term “a quick game of Monopoly” to be the very definition of an oxymoron.  You’d better block off about four to six hours for a proper game.  Unforeseen complexities will doubtless delay even the best intentions.

2. During both endeavors, screaming matches typically erupt.  Many projects become tense.  I remember one in which two participants nearly came to blows.  Most projects have key players engage in very heated debates over strategic vision and execution.

With Monopoly, especially after the properties are divvied up, players scream and yell over what constitutes a “fair” deal.  “What do you mean Boardwalk for Ventnor Avenue and Pennsylvania Railroad isn’t reasonable?  IT’S COMPLETELY FAIR!”  Debates like this are the rule, not the exception.

3. While the basic rules may be the same, different people play by different rules.  The vast majority of projects on which I have worked have had the usual suspects: steering committees, executive sponsors, PMOs, different stages of testing, and ultimately system activation.  However, different organizations often try to do things in vastly different ways.  For example, on two similar projects in different organizations, you are likely to find differences with respect to:

  • the number of internal and external folks assigned to a project
  • the project’s timeline and budget
  • project objectives

By the same token, people play Monopoly in somewhat different ways.  Many don’t know about the auction rule.  Others replenish Free Parking with a new $500 bill after someone lands on it.  Also, many people disregard the property assessment card altogether, while sticklers like me assess penalties when that vaunted red card appears.

4. Personal relationships can largely determine the outcome in both.  Negotiation is key on IT projects.  Clients negotiate rates, prices, and responsibilities with consulting vendors and/or software vendors.

In Monopoly, personal rivalries play a big part in who makes a deal with whom.  Often players chime in (uninvited, of course) with their opinions on potential deals, no doubt in an attempt to affect the outcome.

5. Little things really matter, especially at the end.  Towards the end of an IT project, snakes in the woodwork often come out to bite people when they least expect it.  A tightly staffed or planned project may not be able to withstand a relatively minor problem, especially if the go-live date is non-negotiable.

In Monopoly, the same holds true.  Laugh all you want when your opponent builds hotels on Mediterranean Avenue and Baltic Avenue, but at the end of the game those $250 and $450 charges can really hurt, especially when you’re low on cash.

6. Many times, each does not end; it is merely abandoned.  A good percentage of projects have their plugs pulled prior to completion.  A CIO may become tired of an interminable project and decide to simply end it before costs skyrocket even further.

I’d say that about half of the Monopoly games that I’ve played in the last fifteen years have also been called by “executive decision.”  The writing is on the board, as 1 a.m. rolls around and only two players remain.  Often player X simply cedes the game to player Y.

 

You are the Referee

All bouts require a referee.  Blog-bouts are refereed by the readers.  Therefore, please cast your vote in the poll, and also weigh in on this debate by posting a comment below.  Since a blog-bout is co-posted, your comments will be copied (with full attribution) into the comments sections of both blogs co-hosting this blog-bout.

 

About Jim Harris

Jim Harris is the Blogger-in-Chief at Obsessive-Compulsive Data Quality (OCDQ), which is an independent blog offering a vendor-neutral perspective on data quality.  Jim is also an independent consultant, speaker, writer and blogger with over 15 years of professional services and application development experience in data quality (DQ), data integration, data warehousing (DW), business intelligence (BI), customer data integration (CDI), and master data management (MDM).  Jim is also a contributing writer to Data Quality Pro, the leading online magazine and community resource dedicated to data quality professionals.

 

About Phil Simon

Phil Simon is the author of the acclaimed book Why New Systems Fail: Theory and Practice Collide and the highly anticipated upcoming book The Next Wave of Technologies: Opportunities from Chaos.  Phil is also an independent systems consultant and a dynamic public speaker for hire focusing on how organizations use technology.  Phil also writes for a number of technology media outlets.