Availability Bias and Data Quality Improvement

The availability heuristic is a mental shortcut that occurs when people make judgments based on the ease with which examples come to mind.  Although this heuristic can be beneficial, such as when it helps us recall examples of a dangerous activity to avoid, sometimes it leads to availability bias, where we’re affected more strongly by the ease of retrieval than by the content retrieved.

In his thought-provoking book Thinking, Fast and Slow, Daniel Kahneman explained how availability bias works by recounting an experiment where different groups of college students were asked to rate a course they had taken the previous semester by listing ways to improve the course — while varying the number of improvements that different groups were required to list.

Counterintuitively, students in the group required to list more necessary improvements gave the course a higher rating, whereas students in the group required to list fewer necessary improvements gave the course a lower rating.

According to Kahneman, the extra cognitive effort expended by the students required to list more improvements biased them into believing it was difficult to list necessary improvements, leading them to conclude that the course didn’t need much improvement, and conversely, the little cognitive effort expended by the students required to list few improvements biased them into concluding, since it was so easy to list necessary improvements, that the course obviously needed improvement.

This is counterintuitive because you’d think that the students would rate the course based on an assessment of the information retrieved from their memory, regardless of how easy that information was to retrieve.  It would have made more sense for the course to be rated higher for needing fewer improvements, but availability bias led the students to the opposite conclusion.

Availability bias can also affect an organization’s discussions about the need for data quality improvement.

If you asked stakeholders to rate the organization’s data quality by listing business-impacting incidents of poor data quality, would they reach a different conclusion if you asked them to list one incident versus asking them to list at least ten incidents?

In my experience, an event where poor data quality negatively impacted the organization, such as a regulatory compliance failure, is often easily dismissed by stakeholders as an isolated incident to be corrected by a one-time data cleansing project.

But would forcing stakeholders to list ten business-impacting incidents of poor data quality make them concede that data quality improvement should be supported by an ongoing program?  Or would the extra cognitive effort bias them into concluding, since it was so difficult to list ten incidents, that the organization’s data quality doesn’t really need much improvement?

I think that the availability heuristic helps explain why most organizations easily approve reactive data cleansing projects, and availability bias helps explain why most organizations usually resist proactively initiating a data quality improvement program.

 

Related Posts

DQ-View: The Five Stages of Data Quality

Data Quality: Quo Vadimus?

Data Quality and Chicken Little Syndrome

The Data Quality Wager

You only get a Return from something you actually Invest in

“Some is not a number and soon is not a time”

Why isn’t our data quality worse?

Data Quality and the Bystander Effect

Data Quality and the Q Test

Perception Filters and Data Quality

Predictably Poor Data Quality

WYSIWYG and WYSIATI

 

Related OCDQ Radio Episodes

Clicking on the link will take you to the episode’s blog post:

  • Organizing for Data Quality — Guest Tom Redman (aka the “Data Doc”) discusses how your organization should approach data quality, including his call to action for your role in the data revolution.
  • The Johari Window of Data Quality — Guest Martin Doyle discusses helping people better understand their data and assess its business impacts, not just the negative impacts of bad data quality, but also the positive impacts of good data quality.
  • Redefining Data Quality — Guest Peter Perera discusses his proposed redefinition of data quality, as well as his perspective on the relationship of data quality to master data management and data governance.
  • Studying Data Quality — Guest Gordon Hamilton discusses the key concepts from recommended data quality books, including those which he has implemented in his career as a data quality practitioner.

Open MIKE Podcast — Episode 05

Method for an Integrated Knowledge Environment (MIKE2.0) is an open source delivery framework for Enterprise Information Management, which provides a comprehensive methodology that can be applied across a number of different projects within the Information Management space.  For more information, click on this link: openmethodology.org/wiki/What_is_MIKE2.0

The Open MIKE Podcast is a video podcast show, hosted by Jim Harris, which discusses aspects of the MIKE2.0 framework, and features content contributed to MIKE2.0 Wiki Articles, Blog Posts, and Discussion Forums.

 

Episode 05: Defining Big Data

If you’re having trouble viewing this video, you can watch it on Vimeo by clicking on this link: Open MIKE Podcast on Vimeo

 

MIKE2.0 Content Featured in or Related to this Podcast

Big Data Definition: openmethodology.org/wiki/Big_Data_Definition

Big Sensor Data: openmethodology.org/wiki/Big_sensor_data

Hadoop and the Enterprise Debates: openmethodology.org/wiki/Hadoop_and_the_Enterprise_Debates

Preparing for NoSQL: openmethodology.org/wiki/Preparing_for_NoSQL

Big Data Solution Offering: openmethodology.org/wiki/Big_Data_Solution_Offering

You can also find the videos and blog post summaries for every episode of the Open MIKE Podcast at: ocdqblog.com/MIKE

 

Related Posts

Our Increasingly Data-Constructed World

Dot Collectors and Dot Connectors

HoardaBytes and the Big Data Lebowski

OCDQ Radio - Data Quality and Big Data

Exercise Better Data Management

A Tale of Two Datas

Big Data Lessons from Orbitz

The Graystone Effects of Big Data

Will Big Data be Blinded by Data Science?

Magic Elephants, Data Psychics, and Invisible Gorillas

Big Data el Memorioso

Information Overload Revisited

Finding a Needle in a Needle Stack

Darth Vader, Big Data, and Predictive Analytics

Why Can’t We Predict the Weather?

Swimming in Big Data

The Big Data Theory

Big Data: Structure and Quality

Sometimes it’s Okay to be Shallow

Small Data and VRM

Can the Enterprise really be Secured?

This blog post is sponsored by the Enterprise CIO Forum and HP.

Over the last two months, I have been blogging a lot about how enterprise security has become an even more important, and more complex, topic of discussion than it already was.  The days of the perimeter fence model being sufficient are long gone, and social media is helping social engineering more effectively attack the weakest links in an otherwise sound security model.

With the consumerization of IT allowing Shadow IT to emerge from the shadows, and with the cloud and mobile devices untethering the enterprise from the physical boundaries that historically defined where the enterprise stopped and the outside world began, I have been more frequently pondering the question: Can the enterprise really be secured?

The cloud presents the conundrum of relying on non-enterprise resources for some aspects of enterprise security.  However, “one advantage of the cloud,” Judy Redman recently blogged, “is that it drives the organization to take a more comprehensive, and effective, approach to risk governance.”  Redman’s post includes four recommended best practices for stronger cloud security.

With the growing popularity of the mobile-app-portal-to-the-cloud business model, more enterprises are embracing mobile app development for deploying services to better support both their customers and their employees.  “Mobile apps,” John Jeremiah recently blogged, “are increasingly dependent on cloud services that the apps team didn’t build, the organization doesn’t own, and the ops team doesn’t even know about.”  Jeremiah’s post includes four things to consider for stronger mobile security.

Although it is essential for every enterprise to have a well-articulated security strategy, “it is important to understand that strategy is not policy,” John Burke recently blogged.  “Security strategy links corporate strategy overall to specific security policies; policies implement strategy.”  Burke’s post includes five concrete steps to take to build a security strategy and implement security policies.

With the very notion of an enterprise increasingly becoming more of a conceptual entity than a physical entity, enterprise security is becoming a bit of a misnomer.  However, the underlying concepts of enterprise security still need to be put into practice, and even more so now: since the enterprise has no physical boundaries, the enterprise is everywhere, which means that everyone (employees, partners, suppliers, service providers, customers) will have to work together for “the enterprise” to really be secured.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Enterprise Security and Social Engineering

The Weakest Link in Enterprise Security

Enterprise Security is on Red Alert

Securing your Digital Fortress

The Good, the Bad, and the Secure

The Data Encryption Keeper

The Cloud Security Paradox

The Cloud is shifting our Center of Gravity

Are Cloud Providers the Bounty Hunters of IT?

The Return of the Dumb Terminal

More Tethered by the Untethered Enterprise?

A Swift Kick in the AAS

Sometimes all you Need is a Hammer

Shadow IT and the New Prometheus

The Diffusion of the Consumerization of IT

Cloud Computing for Midsize Businesses

OCDQ Radio is a vendor-neutral podcast about data quality and its related disciplines, produced and hosted by Jim Harris.

During this episode, Ed Abrams and I discuss cloud computing for midsize businesses, and, more specifically, we discuss aspects of the recently launched IBM global initiatives to help Managed Service Providers (MSP) deliver cloud-based service offerings.

Ed Abrams is the Vice President of Marketing, IBM Midmarket.  In this role, Ed is responsible for leading a diverse team that supports IBM’s business objectives with small and midsize businesses by developing, planning, and executing offerings and go-to-market strategies designed to help midsize businesses grow.  He works closely and collaboratively with sales and channel teams and agency partners to deliver high-quality and effective marketing strategies, offerings, and campaigns.

A Tale of Two Datas

Is big data more than just lots and lots of data?  Is big data unstructured and not-so-big data structured?  Malcolm Chisholm explored these questions in his recent Information Management column, where he posited that there are, in fact, two datas.

“One type of data,” Chisholm explained,  “represents non-material entities in vast computerized ecosystems that humans create and manage.  The other data consists of observations of events, which may concern material or non-material entities.”

Providing an example of the first type, Chisholm explained, “my bank account is not a physical thing at all; it is essentially an agreed upon idea between myself, the bank, the legal system, and the regulatory authorities.  It only exists insofar as it is represented, and it is represented in data.  The balance in my bank account is not some estimate with a positive and negative tolerance; it is exact.  The non-material entities of the financial sector are orderly human constructs.  Because they are orderly, we can more easily manage them in computerized environments.”

The orderly human constructs that are represented in data, and in the stories told by data (including the stories data tell about us and the stories we tell data), are one of my favorite topics.  In our increasingly data-constructed world, it’s important to occasionally remind ourselves that data and the real world are not the same thing, especially when data represents non-material entities since, with the possible exception of Makers using 3-D printers, data-represented entities do not re-materialize into the real world.

Describing the second type, Chisholm explained, “a measurement is usually a comparison of a characteristic using some criteria, a count of certain instances, or the comparison of two characteristics.  A measurement can generally be quantified, although sometimes it’s expressed in a qualitative manner.  I think that big data goes beyond mere measurement, to observations.”

Chisholm called the first type the Data of Representation, and the second type the Data of Observation.

The data of representation tends to be structured, in the relational sense, but doesn’t need to be (e.g., graph databases), and the data of observation tends to be unstructured, but it can also be structured (e.g., the structured observations generated by either a data profiling tool analyzing structured relational tables or flat files, or a word-counting algorithm analyzing unstructured text).
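To make that distinction concrete, here is a minimal, purely illustrative Python sketch (not taken from Chisholm’s column or any particular data profiling tool) of the word-counting case: the input is unstructured prose, yet the observations the algorithm produces are fully structured records of term and frequency.

```python
from collections import Counter
import re


def word_count_observations(text):
    """Turn unstructured text into structured observations of (term, frequency)."""
    # A deliberately trivial "observation generator": the input is unstructured
    # prose, but the output is a structured, sortable set of observations.
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words).most_common()


if __name__ == "__main__":
    sample = ("It was the age of Structured Data, "
              "it was the age of Unstructured Data.")
    for term, frequency in word_count_observations(sample):
        print(f"{term}\t{frequency}")
```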

“Structured and unstructured,” Chisholm concluded, “describe form, not essence, and I suggest that representation and observation describe the essences of the two datas.  I would also submit that both datas need different data management approaches.  We have a good idea what these are for the data of representation, but much less so for the data of observation.”

I agree that there are two types of data (i.e., representation and observation, not big and not-so-big) and that different data uses will require different data management approaches.  Although data modeling is still important and data quality still matters, how much data modeling and data quality is needed before data can be effectively used for specific business purposes will vary.

In order to move our discussions forward regarding “big data” and its data management and business intelligence challenges, we have to stop fiercely defending our traditional perspectives about structure and quality in order to effectively manage both the form and essence of the two datas.  We also have to stop fiercely defending our traditional perspectives about data analytics, since there will be some data use cases where depth and detailed analysis may not be necessary to provide business insight.

 

A Tale of Two Datas

In conclusion, and with apologies to Charles Dickens and his A Tale of Two Cities, I offer the following A Tale of Two Datas:

It was the best of times, it was the worst of times.
It was the age of Structured Data, it was the age of Unstructured Data.
It was the epoch of SQL, it was the epoch of NoSQL.
It was the season of Representation, it was the season of Observation.
It was the spring of Big Data Myth, it was the winter of Big Data Reality.
We had everything before us, we had nothing before us,
We were all going direct to hoarding data, we were all going direct the other way.
In short, the period was so far like the present period, that some of its noisiest authorities insisted on its being signaled, for Big Data or for not-so-big data, in the superlative degree of comparison only.

Related Posts

HoardaBytes and the Big Data Lebowski

The Idea of Order in Data

The Most August Imagination

Song of My Data

The Lies We Tell Data

Our Increasingly Data-Constructed World

Plato’s Data

OCDQ Radio - Demystifying Master Data Management

OCDQ Radio - Data Quality and Big Data

Big Data: Structure and Quality

Swimming in Big Data

Sometimes it’s Okay to be Shallow

Darth Vader, Big Data, and Predictive Analytics

The Big Data Theory

Finding a Needle in a Needle Stack

Exercise Better Data Management

Magic Elephants, Data Psychics, and Invisible Gorillas

Why Can’t We Predict the Weather?

Data and its Relationships with Quality

A Tale of Two Q’s

A Tale of Two G’s

Social Media Marketing: From Monologues to Dialogues

“With social media analytics,” Ed Abrams and Jay Hakami recently blogged, midsize businesses “can calculate the ROI and assess the effectiveness of their social media marketing campaigns by tracking both the actions of consumers and the influence of their top commenters and re-tweeters providing a level of insight on the individual never seen before . . . allowing them to make informed business decisions on how to best leverage their online presence.”

But perhaps the most challenging aspect for businesses trying to best leverage their online presence by using social media in their marketing campaigns is that it involves the presence of voices that they don’t have control over — their customers’ voices.

Social media has transformed word of mouth into word of data.  And, as more companies are being forced to acknowledge, the digital mouths of customers speak volumes.  Social media is empowering customers to add their voice to marketing messages.

“Everyone loves to talk about customers engaging with brands,” Rick Robinson recently blogged.  “But in the process, customers are also taking over brands.  The message for midsize firms is that they can no longer count on shaping the conversation.”

“Social media offers,” Dan Berthiaume recently blogged, “the opportunity to directly engage with customers for real-time feedback.  Social media marketing at its core is a relatively inexpensive and fast way of conducting marketing.”  True; however, as Paul Gillin explained during our recent podcast discussion about social media for midsize businesses, the fundamental difference between traditional marketing and social media marketing is that the former is one-way, whereas the latter is two-way.

In other words, marketing has not historically looked to engage with customers to receive feedback.  Marketing has traditionally broadcast messages at customers — and marketing’s early use of social media has been as just another broadcast channel.

However, “social media is a process of continual conversation,” Gillin explained.  “It’s a very different way to go about marketing, but the natural tendency for people when they see something new is to apply the old metaphors to it.  What you’ve seen during the first five years of social media’s popularity is a lot of use of these platforms as essentially the same old marketing channels.”

“We see more companies every year that are getting the idea that social media is a two-way conversation,” Gillin continued, “but it’s a difficult skill to develop.  Marketers are not taught in school or at work to converse — they’re taught to deliver messages.  So that’s turning around a pretty big battleship trying to convince and teach all these people the skills of two-way engagement.”

Marketing has long been accustomed to controlling a conversation that was never really a conversation — since marketers did all the talking, and customers could only listen or ignore them.  Social media is evolving marketing from monologues to dialogues.

Is your midsize business ready and, more importantly, willing to engage customers in an actual conversation?

 

This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.

 

Enterprise Security and Social Engineering

This blog post is sponsored by the Enterprise CIO Forum and HP.

“100 percent security no longer exists in the digital world,” Christian Verstraete recently blogged.  “Many companies have to recognize that they have not developed a proactive enough security strategy.  They also have to recognize that they have not put the appropriate procedures in place to cope with a security breach when it happens.  Instead, they are in reactive mode.”

In my previous post, I blogged about how although any proactive security strategy can only be as strong as its weakest link, the weakest link in your enterprise security could actually be the protocols enacted in the event of an apparent security breach.

“We are confronted with a world where employees bring their own devices and use them for both their private and their business lives,” Verstraete continued.  “As our world is getting increasingly integrated, and as social media is used by enterprises to reach their customers and prospects, we need to train our people to ensure they are watchful for social engineering.”

The book Social Engineering: The Art of Human Hacking by Chris Hadnagy, the lead developer of Social-Engineer.org, defines social engineering as “the act of manipulating a person to take an action that may or may not be in their best interest.”

“While software companies are learning how to strengthen their programs,” Hadnagy explained, “hackers and malicious social engineers are turning to the weakest part of the infrastructure — the people.  The motivation is all about return on investment.  No self-respecting hacker is going to spend 100 hours to get the same results from a simple attack that takes one hour, or less.”

“Denial, ignorance, or the overwhelming nature of threats and vulnerabilities are all causes of a lack of focus,” Ken Larson recently blogged.  “In this age of IT, the threats and vulnerabilities raised by mobility, social networking, cloud computing, and the sharing of IT resources between enterprises must be added to the traditional threats that we’ve focused on for years.”

As I have previously blogged, traditional approaches focus mainly on external security threats, which nowadays is like fortifying your physical barriers while ignoring the cloud floating over them and the mobile devices walking around them.  The more open business environment enabled by cloud and mobile technologies is here to stay, and it requires a modern data security model.

“Proactively define your security strategy,” Verstraete concluded.  “Decide what an acceptable risk level is.  Choose and implement tools and procedures accordingly, and train, train, train your employees.”  I definitely agree that employee training is essential to strengthening your enterprise security, and especially training your employees to understand the principles of social engineering.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

The Weakest Link in Enterprise Security

Enterprise Security is on Red Alert

Securing your Digital Fortress

The Good, the Bad, and the Secure

The Data Encryption Keeper

The Cloud Security Paradox

The Cloud is shifting our Center of Gravity

Are Cloud Providers the Bounty Hunters of IT?

The Return of the Dumb Terminal

A Swift Kick in the AAS

Sometimes all you Need is a Hammer

Shadow IT and the New Prometheus

Turning the M Upside Down

I am often asked about the critical success factors for enterprise initiatives, such as data quality, master data management, and data governance.

Although there is no one thing that can guarantee success, if forced to choose one critical success factor to rule them all, I would choose collaboration.

But, of course, when I say this, everyone rolls their eyes at me (yes, I can see you doing it now through the computer) since it sounds like I’m avoiding the complex concepts underlying enterprise initiatives by choosing collaboration.

The importance of collaboration is a very simple concept but, as Amy Ray and Emily Saliers taught me, “the hardest to learn was the least complicated.”

 

The Pronoun Test

Although all organizations must define the success of enterprise initiatives in business terms (e.g., mitigated risks, reduced costs, or increased revenue), collaborative organizations understand that the most important factor for enduring business success is the willingness of people all across the enterprise to mutually pledge to each other their communication, cooperation, and trust.

These organizations pass what Robert Reich calls the Pronoun Test.  When their employees make references to the company, it’s done with the pronoun We and not They.  The latter suggests at least some amount of disengagement, and perhaps even alienation, whereas the former suggests the opposite — employees feel like part of something significant and meaningful.

An even more basic form of the Pronoun Test is whether or not people can look beyond their too often self-centered motivations and selflessly include themselves in a collaborative effort.  “It’s amazing how much can be accomplished if no one cares who gets the credit” is an old quote for which, with an appropriate irony, it is rather difficult to identify the original source.

Collaboration requires a simple, but powerful, paradigm shift that I call Turning the M Upside Down — turning Me into We.

 

Related Posts

The Algebra of Collaboration

The Business versus IT—Tear down this wall!

The Road of Collaboration

Dot Collectors and Dot Connectors

No Datum is an Island of Serendip

The Three Most Important Letters in Data Governance

The Stakeholder’s Dilemma

Shining a Social Light on Data Quality

Data Quality and the Bystander Effect

The Family Circus and Data Quality

The Year of the Datechnibus

Being Horizontally Vertical

The Collaborative Culture of Data Governance

Collaboration isn’t Brain Surgery

Are you Building Bridges or Digging Moats?

Open MIKE Podcast — Episode 03

Method for an Integrated Knowledge Environment (MIKE2.0) is an open source delivery framework for Enterprise Information Management, which provides a comprehensive methodology that can be applied across a number of different projects within the Information Management space.  For more information, click on this link: openmethodology.org/wiki/What_is_MIKE2.0

The Open MIKE Podcast is a video podcast show, hosted by Jim Harris, which discusses aspects of the MIKE2.0 framework, and features content contributed to MIKE2.0 Wiki Articles, Blog Posts, and Discussion Forums.

 

Episode 03: Data Quality Improvement and Data Investigation

If you’re having trouble viewing this video, you can watch it on Vimeo by clicking on this link: Open MIKE Podcast on Vimeo

 

MIKE2.0 Content Featured in or Related to this Podcast

Enterprise Data Management: openmethodology.org/wiki/Enterprise_Data_Management_Offering_Group

Data Quality Improvement: openmethodology.org/wiki/Data_Quality_Improvement_Solution_Offering

Data Investigation: openmethodology.org/wiki/Category:Data_Investigation_and_Re-Engineering

You can also find the videos and blog post summaries for every episode of the Open MIKE Podcast at: ocdqblog.com/MIKE

Social Media for Midsize Businesses

OCDQ Radio is a vendor-neutral podcast about data quality and its related disciplines, produced and hosted by Jim Harris.

During this episode, Paul Gillin and I discuss social media for midsize businesses, including how the less marketing you do, the more effective you will be with social media marketing; the war of generosity, where the more you give, the more you get; and the importance of the trust equation: the more people trust you, the more they will want to do business with you.

Paul Gillin is a veteran technology journalist and a thought leader in new media.  Since 2005, he has advised marketers and business executives on strategies to optimize their use of social media and online channels to reach buyers cost-effectively.  He is a popular speaker who is known for his ability to simplify complex concepts using plain talk, anecdotes, and humor.

Paul Gillin is the author of four books about social marketing: The New Influencers (2007), Secrets of Social Media Marketing (2008), Social Marketing to the Business Customer (2011), co-authored with Eric Schwartzman, and the forthcoming book Attack of the Customers (2012), co-authored with Greg Gianforte.

Paul Gillin was previously the founding editor of TechTarget and editor-in-chief of Computerworld.  He writes a monthly column for BtoB magazine and is an active blogger and media commentator.  He has appeared as an expert commentator on CNN, PBS, Fox News, MSNBC, and other television outlets.  He has also been quoted or interviewed for hundreds of news and radio reports in outlets such as The Wall Street Journal, The New York Times, NPR, and the BBC.  Paul Gillin is a Senior Research Fellow and member of the board of directors at the Society for New Communications Research.

The Weakest Link in Enterprise Security

This blog post is sponsored by the Enterprise CIO Forum and HP.

As a recent Techopedia article noted, one of the biggest challenges for IT security these days is finding a balance among three overarching principles: availability (i.e., that information is accessible when authorized users need it), confidentiality (i.e., that information is only being seen or used by people who are authorized to access it), and integrity (i.e., that any changes to information by an unauthorized user are impossible — or at least detected — and changes by authorized users are tracked).

Finding this balance has always been a complex challenge for enterprise security since the tighter you lock an IT system down, the harder it can become to use for daily business activities, which sometimes causes usability to be prioritized over security.

“I believe those who think security isn’t a general IT priority are wrong,” Rafal Los recently blogged in a post about the role of Chief Information Security Officer (CISO).  “Pushing the security agenda ahead of doing business seems to be something poor CISOs are known for, which creates a backlash of executive push-back against security in many organizations.”

According to Los, IT leaders need to balance the business enablement of IT with the need to keep information secure, which requires better understanding both business risks and IT threats, and allowing the organization to execute its business goals in a tactical fashion while simultaneously working out the long-term enterprise security strategy.

Although any security strategy is only as strong as its weakest link, the weakest link in enterprise security might not be where you’d expect to find it.  A good example of this came from perhaps the biggest personal data security disaster story of the year, the epic hacking of Mat Honan, during which, as he described it, “in the space of one hour, my entire digital life was destroyed.”

The biggest lesson learned was not the lack of a good security strategy (though that obviously played a part, not only with Honan personally, but also with the vendors involved).  Instead, the lesson was that the weakest link in any security strategy might be its recovery procedures — and that hackers don’t need to rely on Hollywood-style techno-wizardry to overcome security protocols.

Organizations are rightfully concerned about mobile devices containing sensitive data getting stolen — in fact, many make use of the feature provided by Apple that enables you to remotely delete data on your iPhone, iPad, and MacBook in the event of theft.

In Honan’s case, the hackers exploited this feature by accessing his Apple iCloud account (for the details of how that happened, read his blog post), wiping clean his not-stolen mobile devices, and resetting his passwords, including for his email accounts, which prevented him from receiving any security warnings or password reset notifications and bought the hackers the time they needed to redirect everything — essentially all by doing what Honan would have done if his mobile devices had actually been stolen.

The hackers also deleted all of Honan’s data stored in the cloud, which was devastating since he had no off-line backups (yes, he admits that’s his fault).  Before you’re tempted to use this as a cloud-bashing story, as Honan blogged in a follow-up post about how he resurrected his digital life, “when my data died, it was the cloud that killed it.  The triggers hackers used to break into my accounts and delete my files were all cloud-based services — iCloud, Google, and Amazon.  Some pundits have latched onto this detail to indict our era of cloud computing.  Yet just as the cloud enabled my disaster, so too was it my salvation.”

Although most security strategies are focused on preventing a security breach from happening, as the Honan story exemplifies, the weakest link in your enterprise security could actually be the protocols enacted in the event of an apparent security breach.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Enterprise Security is on Red Alert

Securing your Digital Fortress

The Good, the Bad, and the Secure

The Data Encryption Keeper

The Cloud Security Paradox

The Cloud is shifting our Center of Gravity

Are Cloud Providers the Bounty Hunters of IT?

The Return of the Dumb Terminal

The UX Factor

A Swift Kick in the AAS

Sometimes all you Need is a Hammer

Shadow IT and the New Prometheus

Cooks, Chefs, and Data Governance

In their book Practical Wisdom, Barry Schwartz and Kenneth Sharpe quoted retired Lieutenant Colonel Leonard Wong, who is a Research Professor of Military Strategy in the Strategic Studies Institute at the United States Army War College, focusing on the human and organizational dimensions of the military.

“Innovation,” Wong explained, “develops when an officer is given a minimal number of parameters (e.g., task, condition, and standards) and the requisite time to plan and execute the training.  Giving the commanders time to create their own training develops confidence in operating within the boundaries of a higher commander’s intent without constant supervision.”

According to Wong, too many rules and requirements “remove all discretion, resulting in reactive instead of proactive thought, compliance instead of creativity, and adherence instead of audacity.”  Wong believed that it came down to a difference between cooks, those who are quite adept at carrying out a recipe, and chefs, those who can look at the ingredients available to them and create a meal.  A successful military strategy is executed by officers who are trained to be chefs, not cooks.

Data Governance’s Kitchen

Data governance requires coordinating a complex combination of factors, including executive sponsorship, funding, decision rights, arbitration of conflicting priorities, policy definition, policy implementation, data quality remediation, data stewardship, business process optimization, technology enablement, and, perhaps most notably, policy enforcement.

Because of this complexity, many organizations think the only way to run data governance’s kitchen is to institute a bureaucracy that dictates policies and demands compliance.  In other words, data governance policies are recipes and employees are cooks.

Although implementing data governance policies does occasionally require a cook-adept-at-carrying-out-a-recipe mindset, the long-term success of a data governance program will also require chefs, since the dynamic challenges faced, and overcome daily, by business analysts, data stewards, technical architects, and others exemplify today’s constantly changing business world, which cannot be successfully governed by forcing employees to systematically apply rules or follow rigid procedures.

Data governance requires chefs who are empowered with an understanding of the principles of the policies, and who are trusted to figure out how to best implement the policies in a particular business context by combining rules with the organizational ingredients available to them, and creating a flexible procedure that operates within the boundaries of the policy’s principles.

But, of course, just like a military cannot be staffed entirely by officers, and a kitchen cannot be staffed entirely by chefs, an organization needs both cooks and chefs to implement a data governance program successfully.

Similar to how data governance is neither all-top-down nor all-bottom-up, it’s also neither all-cook nor all-chef.

Only the unique corporate culture of your organization can determine how to best staff your data governance kitchen.

Open MIKE Podcast — Episode 02

Method for an Integrated Knowledge Environment (MIKE2.0) is an open source delivery framework for Enterprise Information Management, which provides a comprehensive methodology that can be applied across a number of different projects within the Information Management space.  For more information, click on this link: openmethodology.org/wiki/What_is_MIKE2.0

The Open MIKE Podcast is a video podcast show, hosted by Jim Harris, which discusses aspects of the MIKE2.0 framework, and features content contributed to MIKE2.0 Wiki Articles, Blog Posts, and Discussion Forums.

 

Episode 02: Information Governance and Distributing Power

If you’re having trouble viewing this video, you can watch it on Vimeo by clicking on this link: Open MIKE Podcast on Vimeo

 

MIKE2.0 Content Featured in or Related to this Podcast

Information Governance: openmethodology.org/wiki/Information_Governance_Solution_Offering

Governance 2.0: openmethodology.org/wiki/Governance_2.0_Solution_Offering

You can also find the videos and blog post summaries for every episode of the Open MIKE Podcast at: ocdqblog.com/MIKE

Cloud Computing is the New Nimbyism

NIMBY is an acronym for “Not In My Back Yard,” and its derivative term Nimbyism usually refers to the philosophy of opposing construction projects or other new developments that would be built too close to your residence or your business because, even though those new developments could provide widespread benefits, they might just be a disruption to you.  So, for example, yes, please build that new airport or hospital or power plant that our city needs — just don’t build it too close to my back yard.

For a long time, midsize businesses viewed their information technology (IT) department as a Nimbyistic disruption: a necessary cost of doing business, but one that also took up valuable space and time, and distracted their focus away from their core competencies, which, for most midsize businesses, are supported by, but not directly related to, IT.

Nowadays, cloud computing is providing a new — and far more positive — spin on Nimbyism by allowing midsize businesses to free up space in their back yard (where, in my experience, many midsize businesses keep their IT department), as well as free up their time to focus on mission-critical business activities, by leveraging more cloud-based IT services.  Cloud computing also allows them to scale up their IT during peak business periods without requiring them to first spend time and money building a bigger back yard.

Shifting to a weather analogy, stratus clouds are characterized by horizontal layering with a uniform base, and nimbostratus clouds are stratus clouds of moderate vertical development, signifying the onset of steady, moderate to heavy, precipitation.

We could say cloud computing is the nimby-stratus of IT clouds, providing midsize businesses with a uniform base of IT services that can quickly scale horizontally and/or vertically, with the agility to adapt to best serve their evolving business needs.

The nimbleness of the new Nimbyism facilitated by cloud computing is providing another weather-related business insight that’s helping midsize businesses forecast a promising future, hopefully signifying the onset of steady, moderate to heavy, profitability.

 

This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.