Word of Mouth has become Word of Data

In a previous post about overcoming information asymmetry, I discussed one of the ways that customers are changing the balance of power in the retail industry.  During last week’s Mid-Market Smarter Commerce Tweet Chat, the first question was:

Why does measuring social media matter for the retail industry today?

My response was: Word of Mouth has become Word of Data.  In this blog post, I want to explain what I meant by that.

Historically, information reached customers in one of two ways: through advertising or through word of mouth.  The latter usually meant words coming from the influential mouths of family and friends, but sometimes from strangers with relevant experience or expertise.  Either way, those words were considered more credible than advertising based on the assumption that the mouths saying them didn’t stand to gain anything personally from sharing their opinions about a company, product, or service.

The biggest challenge facing word of mouth was that you either had to be there to hear the words when they were spoken, or you needed a large enough network of people who could pass those words along to you.  The latter was like playing the children’s game of broken telephone: relying on information transmitted only verbally, about any subject, and perhaps especially about a purchasing decision, was dubious when it arrived via one or more intermediaries.

But the rise of social networking services, like Twitter, Facebook, and Google Plus, has changed the game, especially now that our broken telephones have been replaced with smartphones.  Not only is our social network larger (albeit still mostly composed of intermediate connections), but, more importantly, our conversations are essentially being transcribed — our words no longer just leave our mouths, but are also exchanged in short bursts of social data via tweets, status updates, online reviews, and blog posts.

And it could be argued that our social data has a more active social life than we do, since all of our data interacts with the data from other users within and across our social networks, participating in conversations that keep on going long after we have logged out.  Influential tweets get re-tweeted.  Meaningful status updates and blog posts receive comments.  Votes determine which online reviews are most helpful.  This ongoing conversation enriches the information customers have available to them.

Although listening to customers has always been important, gathering customer feedback used to be a challenge.  But nowadays, customers provide their feedback to retailers, and share their experiences with other customers, via social media.  Word of mouth has become word of data.  The digital mouths of customers speak volumes.  The voice of the customer has become empowered by social media, changing the balance of power in the retail industry, and putting customers in control of the conversation.

 

This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.

 

More Tethered by the Untethered Enterprise?

This blog post is sponsored by the Enterprise CIO Forum and HP.

A new term I have been hearing more frequently lately is the Untethered Enterprise.  Like many new terms, definitions vary, but for me at least, it conjures up images of cutting the cords and wires that tether the enterprise to a specific physical location, and tether the business activities of its employees to specific time frames during specific days of the week.

There was a time, not too long ago, when the hard-wired phone lines for desk phones and the Ethernet cables for desktop PCs, used by employees between the hours of 9 AM and 5 PM, Monday through Friday, within the office spaces of the organization, were how, when, and where the vast majority of the business activities of the enterprise were conducted.

Then came the first generation of mobile phones — the ones that only made phone calls.  And laptop computers, which initially supplemented desktop PCs, but typically only for those employees with a job requiring them to regularly work outside the office, such as traveling salespeople.  Eventually, laptops became the primary work computer with docking stations allowing them to connect to keyboards and monitors while working in the office, and providing most employees with the option of taking their work home with them.  Then the next generations of mobile phones brought text messaging, e-mail, and as Wi-Fi networks became more prevalent, full Internet access, which completed the education of the mobile phone, graduating it to a smartphone.

These smartphones are now supplemented by either a laptop or a tablet, or sometimes both.  These devices are either provided by the enterprise, or, with the burgeoning Bring Your Own Device (BYOD) movement, employees are allowed to use their personal smartphones, laptops, and tablets for business purposes.  Either way, enabled by the growing availability of cloud-based services, many employees of most organizations are now capable of conducting business anywhere at any time.  And beyond a capability, some enterprises foster the expectation that their employees demonstrate a willingness to conduct business anywhere at any time.

I acknowledge its potential for increasing productivity and better supporting the demands of today’s fast-paced business world, but I can’t help but wonder if the enterprise and its employees will feel more tethered by the untethered enterprise.  When we can no longer unplug because there’s nothing left to unplug, our always precarious work-life balance seems to surrender to the pervasive work-is-life feeling the untethered enterprise enables.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

The Diffusion of the Consumerization of IT

Serving IT with a Side of Hash Browns

The IT Consumerization Conundrum

The IT Prime Directive of Business First Contact

The UX Factor

A Swift Kick in the AAS

Shadow IT and the New Prometheus

The Diderot Effect of New Technology

Are Cloud Providers the Bounty Hunters of IT?

The IT Pendulum and the Federated Future of IT

Information Asymmetry versus Empowered Customers

Information asymmetry is a term from economics describing how one party involved in a transaction typically has more or better information than the other party.  Perhaps the easiest example of information asymmetry is retail sales, where historically the retailer has always had more or better information than the customer about a product that is about to be purchased.

Generally speaking, information asymmetry is advantageous for the retailer, allowing them to manipulate the customer into purchasing products that benefit the retailer’s goals (e.g., maximizing profit margins or unloading excess inventory) more than the customer’s goals (e.g., paying a fair price or buying the product that best suits their needs).  I don’t mean to demonize the retail industry, but for a long time, I’m pretty sure its unofficial motto was: “An uninformed customer is the best customer.”

Let’s consider the example of purchasing a high-definition television (HDTV) since it demonstrates how information asymmetry is not always about holding back useful information, but can also be about bombarding customers with useless information.  In this example, it’s about bombarding customers with useless technical jargon, such as refresh rate, resolution, and contrast ratio.

To an uninformed customer, it certainly sounds like it makes sense that the HDTV with a 240Hz refresh rate, 1080p resolution, and 2,000,000:1 contrast ratio is better than the one with a 120Hz refresh rate, 720p resolution, and 1,000,000:1 contrast ratio.

After all, 240 > 120, 1080 > 720, and 2,000,000 > 1,000,000, right?  Yes — but what do any of those numbers actually mean?

The reality is that refresh rate, resolution, and contrast ratio are just three examples of useless HDTV specifications because they essentially provide no meaningful information about the video quality of the television.  This information is advantageous to only one party involved in the transaction — the retailer — since it appears to justify the higher price of an allegedly better product.

But nowadays fewer customers are falling for these tricks.  Performing a quick Internet search, either before going shopping or on their mobile phone while at the store, is balancing out some of the information asymmetry in retail sales and empowering customers to make better purchasing decisions.  With the increasing availability of broadband Internet and mobile connectivity, today’s empowered customer arrives at the retail front lines armed and ready to do battle with information asymmetry.

 

This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.

 

The Diffusion of the Consumerization of IT

This blog post is sponsored by the Enterprise CIO Forum and HP.

On a previous post about the consumerization of IT, Paul Calento commented: “Clearly, it’s time to move IT out of a discrete, defined department and out into the field, even more than already.  Likewise, solutions used to power an organization need to do the same thing.  Problem is, though, that it’s easy to say that embedding IT makes sense (it does), but there’s little experience with managing it (like reporting and measurement).  Services integration is a goal, but cross-department, cross-business-unit integration remains a thorn in the side of many attempts.”

Embedding IT does make sense, but it is easier said than done, let alone done well.  Part of the problem within many organizations is that IT became partially self-embedded within some business units while the IT department was resisting the consumerization of IT, treating it like a fad and not an innovation.  And now those business units are resisting the efforts of the redefined IT department because they fear losing the IT capabilities that consumerization has already given them.

This growing IT challenge brings to mind the Diffusion of Innovations theory developed by Everett Rogers, which describes the rate at which innovations (e.g., new ideas or technology trends) spread within cultures, such as organizations, across five adopter categories: starting with the Innovators and Early Adopters, progressing through the Early and Late Majority, and trailed by the Laggards.

A related concept called Crossing the Chasm was developed by Geoffrey Moore to describe the critical phenomenon that occurs when enough of the Early Adopters have embraced the innovation that the beginning of the Early Majority becomes almost a certainty, even though mainstream adoption of the innovation is still far from guaranteed.

From my perspective, traditional IT departments are just now crossing the chasm of the diffusion of the consumerization of IT, and are conflicting with the business units that crossed the chasm long ago with their direct adoption of cloud computing, SaaS, and mobility solutions not provided by the IT department.  This divergence, caused by the IT department and some business units being on different sides of the chasm, has damaged, potentially irreparably, some aspects of the IT-Business partnership.

The longer the duration of this divergence, the more difficult it will be for an IT department that has finally crossed the chasm to redefine its role and remain a relevant partner to those business units that, perhaps for the first time in the organization’s history, were ahead of the information technology adoption curve.  Additionally, even communication and collaboration across business units are negatively affected by different business units crossing the IT consumerization chasm at different times, which often, as Paul Calento noted, complicates the organization’s attempts to integrate cross-business-unit IT services.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Serving IT with a Side of Hash Browns

The IT Consumerization Conundrum

The IT Prime Directive of Business First Contact

The UX Factor

A Swift Kick in the AAS

Shadow IT and the New Prometheus

The Diderot Effect of New Technology

Are Cloud Providers the Bounty Hunters of IT?

The IT Pendulum and the Federated Future of IT

Suburban Flight, Technology Sprawl, and Garage IT

Talking Business about the Weather

Businesses of all sizes are always looking for ways to increase revenue, decrease costs, and operate more efficiently.  When I talk with midsize business owners, I hear the typical questions.  Should we hire a developer to update our website and improve our SEO rankings?  Should we invest less money in traditional advertising and invest more time in social media?  After discussing these and other business topics for a while, we drift into that standard conversational filler — talking about the weather.

But since I am always interested in analyzing data from as many different perspectives as possible, when I talk about the weather, I ask midsize business owners how much of a variable the weather plays in their business.  Does the weather affect the number of customers that visit your business on a daily basis?  Do customers purchase different items when the weather is good versus bad?

I usually receive quick responses, but when I ask if those responses were based on analyzing sales data alongside weather data, the answer is usually no.  That is understandable, since businesses are successful when they can focus on their core competencies, and for most businesses, analytics is not a core competency.  The demands of daily operations often prevent midsize businesses from stepping back and looking at things differently, like whether or not there’s a hidden connection between weather and sales.
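For a midsize business curious about that question, the first look doesn’t require an analytics department.  Here is a minimal sketch in Python (using pandas), assuming hypothetical file and column names, that joins daily sales to daily weather and checks whether they move together:

```python
import pandas as pd

# Hypothetical inputs: daily_sales.csv (date, revenue) and
# daily_weather.csv (date, avg_temp, precipitation).
# File and column names are assumptions for illustration.
sales = pd.read_csv("daily_sales.csv", parse_dates=["date"])
weather = pd.read_csv("daily_weather.csv", parse_dates=["date"])

# Join the two datasets on the calendar date.
merged = sales.merge(weather, on="date")

# A simple correlation matrix is a first look at whether
# weather and sales are related at all.
print(merged[["revenue", "avg_temp", "precipitation"]].corr())
```

A strong correlation here proves nothing by itself, but it is exactly the kind of quick check that can tell a business owner whether the weather question is worth a closer look.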

One of my favorite books is Freakonomics: A Rogue Economist Explores the Hidden Side of Everything by Steven Levitt and Stephen Dubner.  The book, as well as its sequel, podcast, and movie, provides good examples of one of the common challenges facing data science, and more specifically predictive analytics, since its predictions often seem counterintuitive to business leaders, whose intuition is rightfully based on the business expertise that has guided their success to date.  The reality is that even organizations that pride themselves on being data driven naturally resist any counterintuitive insights found in their data.

Dubner was recently interviewed by Crysta Anderson about how organizations can find insights in their data if they are willing and able to ask good questions.  Of course, it’s not always easy to determine what a good question would be.  But sometimes something as simple as talking about the weather when you’re talking business could lead to a meaningful business insight.

 

This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.

 

Will Big Data be Blinded by Data Science?

All of the hype about Big Data is also causing quite the hullabaloo about hiring Data Scientists in order to help your organization derive business value from big data analytics.  But even though we are still in the hype and hullabaloo stages, these unrelenting trends are starting to rightfully draw the attention of businesses of all sizes.  After all, the key word in big data isn’t big because, in our increasingly data-constructed world, big data is no longer just for big companies and high-tech firms.

And since the key word in data scientist isn’t data, in this post I want to focus on the second word in today’s hottest job title.

When I think of a scientist of any kind, I immediately think of the scientific method, which has been the standard operating procedure of scientific discovery since the 17th century.  First, you define a question, gather some initial data, and form a hypothesis, which is some idea about how to answer your question.  Next, you perform an experiment to test the hypothesis, during which more data is collected.  Then, you analyze the experimental data and evaluate your results.  Whether the experiment confirmed or contradicted your hypothesis, you do the same thing: repeat the experiment.  A hypothesis can be promoted to a theory only after repeated experimentation (including by others) consistently produces the same result.

During experimentation, failure happens just as often as, if not more often than, success.  However, both failure and success have long played an important role in scientific discovery because progress in either direction is still progress.

Therefore, experimentation is an essential component of scientific discovery — and data science is certainly no exception.

“Designed experiments,” Melinda Thielbar recently blogged, “is where we’ll make our next big leap for data science.”  I agree, but with the notable exception of A/B testing in marketing, most business activities generally don’t embrace data experimentation.
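To make that A/B testing exception concrete, here is a minimal sketch in Python (using scipy) of how a marketer might check whether two page versions convert differently.  The visitor and conversion counts are invented for illustration:

```python
from scipy.stats import chi2_contingency

# Invented example data: visitors who saw each page version,
# split into converted vs. not converted.
#            converted  not converted
observed = [[200,       9800],   # version A
            [260,       9740]]   # version B

chi2, p_value, dof, expected = chi2_contingency(observed)

# A small p-value suggests the difference in conversion rates
# is unlikely to be due to chance alone.
print(f"p-value: {p_value:.4f}")
```

The point of the experiment is the same as in any science: a single favorable result is not a theory, and the test has to be repeatable before the business should act on it.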

“The purpose of science,” Tom Redman recently explained, “is to discover fundamental truths about the universe.  But we don’t run our businesses to discover fundamental truths.  We run our businesses to serve a customer, gain marketplace advantage, or make money.”  In other words, the commercial application of science has more to do with commerce than it does with science.

One example of the challenges inherent in the commercial application of science is the misconception that predictive analytics can predict what is going to happen with certainty, when what it actually does is predict some of the possible things that could happen, each with a certain probability.  Although predictive analytics can be a valuable tool for many business activities, especially decision making, as Steve Miller recently blogged, most of us are not good at using probabilities to make decisions.
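A minimal sketch, using scikit-learn and a made-up toy dataset, shows the distinction: a predictive model returns a probability for each possible outcome, not a single certain answer:

```python
from sklearn.linear_model import LogisticRegression

# Made-up toy data: hours of product research (X) vs.
# whether a purchase happened (y).
X = [[0.5], [1.0], [2.0], [3.0], [4.0], [5.0]]
y = [0, 0, 0, 1, 1, 1]

model = LogisticRegression().fit(X, y)

# The model does not say "this customer WILL buy" -- it
# estimates a probability for each possible outcome.
print(model.predict_proba([[2.5]]))
```

Reading that output correctly (a likelihood, not a prophecy) is precisely the probabilistic reasoning that most of us, per Miller, are not naturally good at.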

So, with apologies to Thomas Dolby, I can’t help but wonder, will big data be blinded by data science?  Will the business leaders being told to hire data scientists to derive business value from big data analytics be blind to what data science tries to show them?

 

This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.

 

Serving IT with a Side of Hash Browns

This blog post is sponsored by the Enterprise CIO Forum and HP.

Since it’s where I started my career, I often ponder what it would be like to work in the IT department today.  This morning, instead of sitting in a cubicle with no window view other than the one Bill Gates gave us, I’m sitting in a booth by a real window, albeit one with a partially obstructed view of the parking lot, at a diner eating a two-egg omelette with a side of hash browns.

But nowadays, it’s possible that I’m still sitting amongst my fellow IT workers.  Perhaps the older gentleman to my left is verifying last night’s database load using his laptop.  Maybe the younger woman to my right is talking into her Bluetooth earpiece with a business analyst working on an ad hoc report.  And the couple in the corner could be struggling to understand the technology requirements of the C-level executive they’re meeting with, who’s now vocalizing his displeasure about sitting in the high chair.

It’s possible that everyone thinks I am updating the status of an IT support ticket on my tablet based on the mobile text alert I just received.  Of course, it’s also possible that all of us are just eating breakfast while I’m also writing this blog post about IT.

However, as Joel Dobbs recently blogged, the IT times are a-changin’ — and faster than ever before since, thanks to the two-egg IT omelette of mobile technologies and cloud providers, IT no longer only happens in the IT department.  IT is everywhere now.

“There is a tendency to compartmentalize various types of IT,” Bruce Guptill recently blogged, “in order to make them more understandable and conform to budgeting practices.  But the core concept/theme/result of mobility really is ubiquity of IT — the same technology, services, and capabilities regardless of user and asset location.”

Regardless of how much you have embraced the consumerization of IT, some of your IT happens outside of your IT department, and some IT tasks are performed by people who not only don’t work in IT, but possibly don’t even work for your organization.

“While systems integration was once the big concern,” Judy Redman recently blogged, “today’s CIOs need to look to services integration.  Companies today need to obtain services from multiple vendors so that they can get best-of-breed solutions, cost efficiencies, and the flexibility needed to meet ever-changing and ever-more-demanding business needs.”

With its increasingly service-oriented and ubiquitous nature, it’s not too far-fetched to imagine that in the near future of IT, the patrons of a Wi-Fi-enabled diner could be your organization’s new IT department, serving your IT with a side of hash browns.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

The IT Consumerization Conundrum

Shadow IT and the New Prometheus

A Swift Kick in the AAS

The UX Factor

Are Cloud Providers the Bounty Hunters of IT?

The Cloud Security Paradox

Are Applications the La Brea Tar Pits for Data?

Why does the sun never set on legacy applications?

The IT Pendulum and the Federated Future of IT

Suburban Flight, Technology Sprawl, and Garage IT

The UX Factor

This blog post is sponsored by the Enterprise CIO Forum and HP.

In his book The Most Human Human, Brian Christian explained that “UX — short for User Experience — refers to the experience a given user has using a piece of software or technology, rather than the purely technical capacities of that device.”

But since its inception, the computer industry has been primarily concerned with technical capacities.  Computer advancements have followed the oft-cited Moore’s Law, a trend accurately described by Intel co-founder Gordon Moore in 1965, which states that the number of transistors that can be placed inexpensively on an integrated circuit, thereby increasing processing speed and memory capacity, doubles approximately every two years.
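As a quick back-of-the-envelope sketch of that doubling, treating “every two years” as exact (which real chip history only approximates) and starting from the roughly 2,300 transistors of the 1971 Intel 4004:

```python
# Transistor count doubling approximately every two years,
# starting from the roughly 2,300-transistor Intel 4004 (1971).
base_year, base_count = 1971, 2300

for year in (1981, 1991, 2001, 2011):
    doublings = (year - base_year) / 2
    print(year, round(base_count * 2 ** doublings))
```

Four decades of doubling turns thousands of transistors into billions, which is why raw technical capacity eventually stopped being the scarce resource.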

However, as Christian explained, for a while in the computer industry, “an arms race between hardware and software created the odd situation that computers were getting exponentially faster but not faster at all to use, as software made ever-larger demands on systems resources, at a rate that matched and sometimes outpaced hardware improvements.”  This was sometimes called “Andy and Bill’s Law,” referring to Andy Grove of Intel and Bill Gates of Microsoft.  “What Andy giveth, Bill taketh away.”

But these advancements in computational power, along with increased network bandwidth, parallel processing frameworks (e.g., Hadoop), scalable and distributed models (e.g., cloud computing), and other advancements (e.g., in-memory technology) are making powerful technical capacities so much more commonplace, and so much less expensive, that the computer industry is responding to consumers demanding that the primary concern be user experience — hence the so-called Consumerization of IT.

“As computing technology moves increasingly toward mobile devices,” Christian noted, “product development becomes less about the raw computing horsepower and more about the overall design of the product and its fluidity, reactivity, and ease of use.”

David Snow and Alex Bakker have recently blogged about the challenges and opportunities facing enterprises and vendors with respect to the Bring Your Own Device (BYOD) movement, where more employees, and employers, are embracing mobile devices.

Although the old mantra of function over form is not getting replaced by form over function, form factor, interface design, and the many other aspects of User Experience are becoming the unrelenting UX Factor of the continuing consumerization trend.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

The Diderot Effect of New Technology

A Swift Kick in the AAS

Shadow IT and the New Prometheus

The IT Consumerization Conundrum

The IT Prime Directive of Business First Contact

Are Cloud Providers the Bounty Hunters of IT?

Are Applications the La Brea Tar Pits for Data?

Why does the sun never set on legacy applications?

The IT Pendulum and the Federated Future of IT

Suburban Flight, Technology Sprawl, and Garage IT

Magic Elephants, Data Psychics, and Invisible Gorillas

This blog post is sponsored by the Enterprise CIO Forum and HP.

A recent Forbes article predicts Big Data will be a $50 billion market by 2017, and Michael Friedenberg recently blogged how the rise of big data is generating buzz about Hadoop (which I call the Magic Elephant): “It certainly looks like the Holy Grail for organizing unstructured data, so it’s no wonder everyone is jumping on this bandwagon.  So get ready for Hadoopalooza 2012.”

John Burke recently blogged about the role of big data helping CIOs “figure out how to handle the new, the unusual, and the unexpected as an opportunity to focus more clearly on how to bring new levels of order to their traditional structured data.”

As I have previously blogged, many big data proponents (especially the Big Data Lebowski vendors selling Hadoop solutions) extol its virtues as if big data provides clairvoyant business insight, as if big data was the Data Psychic of the Information Age.

But a recent New York Times article opened with the story of a statistician working for a large retail chain being asked by his marketing colleagues: “If we wanted to figure out if a customer is pregnant, even if she didn’t want us to know, can you do that?” As Eric Siegel of Predictive Analytics World is quoted in the article, “we’re living through a golden age of behavioral research.  It’s amazing how much we can figure out about how people think now.”

So, perhaps calling big data psychic is not so far-fetched after all.  However, the potential of predictive analytics exemplifies why one of the biggest implications of big data is the data privacy concerns it raises.

Although it’s amazing (and scary) how much the Data Psychic can figure out about how we think (and work, shop, vote, love), it’s equally amazing (and scary) how much Psychology is figuring out about how we think, how we behave, and how we decide.

As I recently blogged about WYSIATI (“what you see is all there is” from Daniel Kahneman’s book Thinking, Fast and Slow), when you are using big data to make business decisions, what you are looking for can greatly influence what you are looking at (and vice versa).  But this natural human tendency could cause you to miss the Invisible Gorilla walking across your screen.

If you are unfamiliar with that psychology experiment, which was created by Christopher Chabris and Daniel Simons, authors of the book The Invisible Gorilla: How Our Intuitions Deceive Us, then I recommend going to theinvisiblegorilla.com/videos.html. (By the way, before I was familiar with its premise, the first time I watched the video, I did not see the guy in the gorilla suit, and now when I watch the video, seeing the “invisible gorilla” distracts me, causing me to not count the number of passes correctly.)

In his book Incognito: The Secret Lives of the Brain, David Eagleman explained how our brain samples just a small bit of the physical world, making time-saving assumptions and seeing only as well as it needs to.  As our eyes interrogate the world, they optimize their strategy for the incoming data, arbitrating a battle between the conflicting information.  What we see is not what is really out there, but instead only a moment-by-moment version of which perception is winning over the others.  Our perception works not by building up bits of captured data, but instead by matching our expectations to the incoming sensory data.

I don’t doubt the Magic Elephants and Data Psychics provide the potential to envision and analyze almost anything happening within the complex and constantly changing business world — as well as the professional and personal lives of the people in it.

But I am concerned that information optimization driven by the biases of our human intuition and perception will only match our expectations to those fast-moving large volumes of various data, thereby causing us to not see many of the Invisible Gorillas.

Although this has always been a business intelligence concern, as technological advancements improve our data analytical tools, we must not lose sight of the fact that tools and data remain only as effective (and as beneficent) as the humans who wield them.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Big Data el Memorioso

Neither the I Nor the T is Magic

Information Overload Revisited

HoardaBytes and the Big Data Lebowski

WYSIWYG and WYSIATI

The Speed of Decision

The Data-Decision Symphony

A Decision Needle in a Data Haystack

The Big Data Collider

Dot Collectors and Dot Connectors

DQ-View: Data Is as Data Does

Data, Information, and Knowledge Management

A Swift Kick in the AAS

This blog post is sponsored by the Enterprise CIO Forum and HP.

Appending the phrase “as a Service” (AAS) to almost every word (e.g., Software, Platform, Infrastructure, Data, Analytics) has become increasingly prevalent due to the world-wide-webification of IT by cloud computing and other consumerization trends.

Rick Blaisdell recently blogged about the benefits of the cloud, which include fully featured services, monthly subscription costs, 24/7 support, high availability, and financially-backed service level agreements.  “Look at the cloud,” Blaisdell recommended, “as a logical extension of your IT capabilities, and take advantage of all the benefits of cloud services.”

Judy Redman has blogged about how cloud computing is one of three IT delivery trends (along with agile development and composite applications) that are allowing IT leaders to reduce costs, deliver better applications faster, and provide results that are more aligned with, and more responsive to, the business.

And with more existing applications migrating to the cloud, it is all too easy to ponder whether these services raining down from the cloud forecast the end of the reign of the centralized IT department — and, perhaps by extension, the end of the reign of the traditional IT vendor that remains off-premises-resistant (i.e., vendors continuing to exclusively sell on-premises solutions, which they positively call enterprise-class solutions, but their customers often come to negatively call legacy applications).

However, “cloud (or public cloud at least) is not the only enabler,” Adrian Bridgwater recently blogged, explaining how a converged infrastructure acknowledges that “existing systems need to be consolidated and brought into line in a harmonious, interconnected, and interoperable way.  This is where private clouds (and/or a mix of hybrid clouds) come to the fore and a firm manages its own internal systems in a hyper-efficient manner.  From this point, we see IT infrastructure working to a) save money, b) run parallel with strategic business objectives for profit and growth, and c) become a business enabler in its own right.”

No matter how much of it is cloud-oriented (or public/private clouded), the future of IT is definitely going to be service-oriented.

Now, of course, the role of IT has always been to deliver to the enterprise a fast and agile business-enabling service.  But perhaps what is refreshingly new about the unrelenting “as a Service” trend is that it reminds the IT department of their prime directive, and it enables the enterprise to deliver to the IT industry as a whole a (sometimes sorely needed) Swift Kick in the AAS.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Can Enterprise-Class Solutions Ever Deliver ROI?

Why does the sun never set on legacy applications?

Are Applications the La Brea Tar Pits for Data?

The Cloud Security Paradox

Are Cloud Providers the Bounty Hunters of IT?

The Partly Cloudy CIO

Shadow IT and the New Prometheus

The IT Consumerization Conundrum

The IT Prime Directive of Business First Contact

The IT Pendulum and the Federated Future of IT

Big Data el Memorioso

This blog post is sponsored by the Enterprise CIO Forum and HP.

Funes el memorioso is a short story by Jorge Luis Borges, which describes a young man named Ireneo Funes who, as a result of a horseback riding accident, has lost his ability to forget.  Although Funes has a tremendous memory, he is so lost in the details of everything he knows that he is unable to convert the information into knowledge and unable, as a result, to achieve wisdom.

In Spanish, the word memorioso means “having a vast memory.”  Without question, Big Data has a vast memory comprised of fast-moving large volumes of varying data seemingly providing details about everything your organization could ever want to know about our increasingly digitized and pixelated world.  But what if Big Data is the Ireneo Funes of the Information Age?

What if Big Data el Memorioso is the not-so-short story in which your organization becomes so lost in the details of everything big data delivers that you’re unable to connect enough of the dots to convert the information into knowledge and unable, as a result, to achieve the wisdom necessary to satisfice specific business needs?

Adrian Bridgwater recently compared this challenge to “trying to balance a stack of papers on a moving walkway, in a breeze, without knowing the full length or speed of the walkway itself.  If you want to extend the metaphor one step further — there are other passengers on our walkway and they could bump into us and/or add papers to our stack.  Oh, did I mention that the pieces of paper might not even all be the same size, shape, or color — and some may have tattered edges and coffee stains?”

In other words, as Bridgwater went on to explain, “our information optimization goals will typically include the need to manage information and assess its quantitative and qualitative values.  We will also need to analyze streams of both structured and unstructured data, the latter including video, emails, and other less ‘straight edged’ data.”

While examining some of the technology options that can assist with this challenge, Paul Muller recently remarked “whether it be structured, unstructured, big, small, real-time, or historical — data of all kinds are top-of-mind for business executives.  It may already feel like you’re drowning in data, but it’s important to get to grips with the changing technology landscape to ensure you’re not drowning in an incoherent mess of information management architectures too.”

Edd Dumbill recently wrote an introduction to the big data landscape, which concluded that “big data is no panacea.  You can find patterns and clues in your data, but then what?”  As Dumbill recommends, you need to know where you want to go.  You need to know what problem you want to solve, i.e., you need to pick a real business problem to guide your implementation.

Without this implementation guide, big data will have, as Borges said of Funes, “a certain stammering greatness,” but amount to, as William Shakespeare said in The Tragedy of Macbeth, “a tale told by an idiot, full of sound and fury, signifying nothing.”

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Neither the I Nor the T is Magic

Information Overload Revisited

The Speed of Decision

The Data-Decision Symphony

A Decision Needle in a Data Haystack

The Big Data Collider

Dot Collectors and Dot Connectors

DQ-View: Data Is as Data Does

Data, Information, and Knowledge Management

Is your data complete and accurate, but useless to your business?

The Real Data Value is Business Insight

Data, data everywhere, but where is data quality?

Neither the I Nor the T is Magic

This blog post is sponsored by the Enterprise CIO Forum and HP.

It’s that time when we reflect on the past year and try to predict the future, such as Paul Muller, Joel Rothman, and Pearl Zhu did with their recent blog posts.  Although I have previously written about why most predictions don’t come true, in this post, I throw my fortune-telling hat into the 2012 prediction ring.

The information technology (IT) trends of 2011 included consumerization and decentralization, application modernization and information optimization, and cloud computing and cloud security (and, by extension, enterprise security).  However, perhaps the biggest IT trend of all was that 2011 went out with a Big Bang of buzz about Big Data in 2012 and beyond.

Since its inception, the IT industry has both benefited from and battled against the principle known as Clarke’s Third Law:

“Any sufficiently advanced technology is indistinguishable from magic.”

This principle often fuels the Diderot Effect of New Technology, enchanting our organizations with the mad desire to stock up on new technologically magic things.  As such, many are predicting 2012 will be the Year of the Magic Elephant named Hadoop because, as Gartner Research predicts about big data, “the size, complexity of formats, and speed of delivery exceeds the capabilities of traditional data management technologies; it requires the use of new or exotic technologies simply to manage the volume alone.  Many new technologies are emerging, with the potential to be disruptive.  Analytics has become a major driving application.”  As a corollary, the potential business value of integrating big data into business analytics seems to be conjuring up an alternative version of Clarke’s Third Law:

“Any sufficiently advanced information is indistinguishable from magic.”

In other words, many big data proponents (especially IT vendors selling Hadoop-based solutions) extol its virtues as if its information is capable of providing clairvoyant business insight, as if big data was the Data Psychic of the Information Age.

Although both sufficiently advanced information and technology will have important business-enabling IT roles to play in 2012, never forget that neither the I nor the T is magic — no matter what the Data Psychics and Magic Elephants may say.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Information Overload Revisited

The Data Encryption Keeper

The Cloud Security Paradox

The Good, the Bad, and the Secure

Securing your Digital Fortress

Shadow IT and the New Prometheus

Are Cloud Providers the Bounty Hunters of IT?

The Diderot Effect of New Technology

The IT Consumerization Conundrum

The IT Prime Directive of Business First Contact

A Sadie Hawkins Dance of Business Transformation

Are Applications the La Brea Tar Pits for Data?

Why does the sun never set on legacy applications?

The Partly Cloudy CIO

The IT Pendulum and the Federated Future of IT

Suburban Flight, Technology Sprawl, and Garage IT

Information Overload Revisited

This blog post is sponsored by the Enterprise CIO Forum and HP.

Information Overload is a term invoked regularly during discussions about the data deluge of the Information Age, which has created a 24 hours a day, 7 days a week, 365 days a year, world-wide whirlwind of constant information flow, where the very air we breathe is teeming with digital data streams — continually inundating us with new, and new types of, information.

Information overload generally refers to how too much information can overwhelm our ability to understand an issue, and can even disable our decision making in regard to that issue (this latter aspect is generally referred to as Analysis Paralysis).

But we often forget that the term is over 40 years old.  It was popularized by Alvin Toffler in his bestselling book Future Shock, which was published in 1970, back when the Internet was still in its infancy, and long before the Internet’s progeny would give birth to the clouds contributing to the present, potentially perpetual, forecast for data precipitation.

A related term that has become big in the data management industry is Big Data.  As Gartner Research explains, although the term acknowledges the exponential growth, availability, and use of information in today’s data-rich landscape, big data is about more than just data volume.  Data variety (i.e., structured, semi-structured, and unstructured data, as well as other types, such as the sensor data emanating from the Internet of Things) and data velocity (i.e., how fast data is being produced and how fast the data must be processed to meet demand) are also key characteristics of the big challenges of big data.

John Dodge and Bob Gourley recently discussed big data on Enterprise CIO Forum Radio, where Gourley explained that big data is essentially “the data that your enterprise is not currently able to do analysis over.”  This point resonates with a similar one made by Bill Laberis, who recently discussed new global research where half of the companies polled responded that they cannot effectively deal with analyzing the rising tide of data available to them.

Most of the big angst about big data comes from this fear that organizations are not tapping the potential business value of all that data not currently being included in their analytics and decision making.  This reminds me of psychologist Herbert Simon, who won the 1978 Nobel Prize in Economics for his pioneering research on decision making, which included comparing and contrasting the decision-making strategies of maximizing and satisficing (a term that combines satisfying with sufficing).

Simon explained that a maximizer is like a perfectionist who considers all the data they can find because they need to be assured that their decision was the best that could be made.  This creates a psychologically daunting task, especially as the amount of available data constantly increases (again, note that this observation was made over 40 years ago).  The alternative is to be a satisficer, someone who attempts to meet criteria for adequacy rather than identify an optimal solution, which is especially useful when time is a critical factor, as it is with the real-time decision making demanded by a constantly changing business world.
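The contrast between the two strategies is easy to sketch in code; the options and quality scores below are invented for illustration:

```python
# Toy options with invented quality scores.
options = [("A", 62), ("B", 85), ("C", 78), ("D", 91), ("E", 70)]

def score(option):
    return option[1]

# Maximizer: examine every option to guarantee the best choice.
best = max(options, key=score)

# Satisficer: stop at the first option that is good enough.
threshold = 75
good_enough = next(o for o in options if score(o) >= threshold)

print("maximizer picks:", best)          # ('D', 91), after seeing all 5
print("satisficer picks:", good_enough)  # ('B', 85), after seeing only 2
```

The satisficer trades a slightly better answer for a much earlier one, which is the whole point when the data never stops arriving.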

Big data strategies will also have to compare and contrast maximizing and satisficing.  Maximizers, if driven by their angst about all that data they are not analyzing, might succumb to information overload.  Satisficers, if driven by information optimization, might sufficiently integrate just enough of big data into their business analytics in a way that satisfies specific business needs.

As big data forces us to revisit information overload, it may be useful for us to remember that originally the primary concern was not the increasing amount of information, but the increasing access to information.  As Clay Shirky succinctly stated, “It’s not information overload, it’s filter failure.”  So, to harness the business value of big data, we will need better filters, which may ultimately make the entire difference between information overload and information optimization.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

The Data Encryption Keeper

The Cloud Security Paradox

The Good, the Bad, and the Secure

Securing your Digital Fortress

Shadow IT and the New Prometheus

Are Cloud Providers the Bounty Hunters of IT?

The Diderot Effect of New Technology

The IT Consumerization Conundrum

The IT Prime Directive of Business First Contact

A Sadie Hawkins Dance of Business Transformation

Are Applications the La Brea Tar Pits for Data?

Why does the sun never set on legacy applications?

The Partly Cloudy CIO

The IT Pendulum and the Federated Future of IT

Suburban Flight, Technology Sprawl, and Garage IT

The Data Encryption Keeper

This blog post is sponsored by the Enterprise CIO Forum and HP.

Since next week is Halloween, and Rafal Los recently blogged about how most enterprise security discussions are FUD-filled (i.e., filled with Fear, Uncertainty, and Doubt) horror stories, I decided to use Tales from the Crypt as the theme for this blog post.

 

Tales from the Encrypted

One frightening consequence of the unrelenting trend of the consumerization of IT, especially cloud computing and mobility, is that not all of the organization’s data is stored within its on-premises technology infrastructure, or accessed using devices under its control.  With an increasing percentage of enterprise data constantly in motion as a moving target in a sometimes horrifyingly hyper-connected world, data protection and data privacy are legitimate concerns and increasingly complex challenges.

Cryptography has a long history that predates the Information Age, but data encryption via cryptographic computer algorithms has played a key (sorry, I couldn’t resist the pun) role in the history of securing the organization’s data.  But instead of trying to fight the future of business being enabled by cloud and mobile technologies like it was the Zombie Data-pocalypse, we need a modern data security model that can remain good for business, but ghoulish for the gremlins, goblins, and goons of cyber crime.

Although some rightfully emphasize the need for stronger authentication to minimize cloud breaches, data encryption is often overlooked—especially who should be responsible for it.  Most cloud providers use vendor-side encryption models, meaning that their customers transfer non-encrypted data to the cloud, where the cloud vendor then becomes responsible for data encryption.

 

The Data Encryption Keeper

However, as Richard Jarvis commented on my previous post, “it’s only a matter of time before there’s a highly public breakdown in the vendor-side encryption model.  Long term, I expect to see an increase in premium, client-side encryption services targeted at corporate clients.  To me, this will offer the best of both worlds, and will benefit both cloud vendors and their clients.”

I have to admit that in my own security assessments of cloud computing solutions, I have verified that the cloud vendor was using strong data encryption methods, but I didn’t consider that the responsibility for cloud data encryption might be misplaced.
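To make the client-side model concrete, here is a minimal sketch using the Python cryptography library, in which the key is generated and kept on-premises and only ciphertext ever reaches the cloud vendor (the upload call is hypothetical):

```python
from cryptography.fernet import Fernet

# The key is generated and stored on-premises; it is never
# transferred to the cloud vendor.
key = Fernet.generate_key()
f = Fernet(key)

# Encrypt before upload, so the vendor only ever stores ciphertext.
ciphertext = f.encrypt(b"confidential customer record")
# upload_to_cloud(ciphertext)  # hypothetical upload call

# Only the client, holding the key, can decrypt what comes back.
plaintext = f.decrypt(ciphertext)
```

In the vendor-side model, by contrast, the plaintext leaves your premises and the vendor holds the keys, so a single breach at the vendor can expose both.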

So perhaps one way to prevent the cloud from becoming a haunted house for data is to pay more attention to who is cast to play the role of the Data Encryption Keeper.  And perhaps the casting call for this data security role should stay on-premises.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

The Cloud Security Paradox

The Good, the Bad, and the Secure

Securing your Digital Fortress

Shadow IT and the New Prometheus

Are Cloud Providers the Bounty Hunters of IT?

The Diderot Effect of New Technology

The IT Consumerization Conundrum

The IT Prime Directive of Business First Contact

A Sadie Hawkins Dance of Business Transformation

Are Applications the La Brea Tar Pits for Data?

Why does the sun never set on legacy applications?

The Partly Cloudy CIO

The IT Pendulum and the Federated Future of IT

Suburban Flight, Technology Sprawl, and Garage IT

The Cloud Security Paradox

This blog post is sponsored by the Enterprise CIO Forum and HP.

Nowadays it seems like any discussion about enterprise security inevitably becomes a discussion about cloud security.  Last week, as I was listening to John Dodge and Bob Gourley discuss recent top cloud security tweets on Enterprise CIO Forum Radio, the story that caught my attention was the Network World article by Christine Burns, part of a six-part series on cloud computing, which had a provocative title declaring that public cloud security remains Mission Impossible.

“Cloud security vendors and cloud services providers have a long way to go,” Burns wrote, “before enterprise customers will be able to find a comfort zone in the public cloud, or even in a public/private hybrid deployment.”  Although I agree with Burns, and I highly recommend reading her entire excellent article, I have always been puzzled by debates over cloud security.

A common opinion is that cloud-based solutions are fundamentally less secure than on-premises solutions.  Some critics even suggest cloud-based solutions can never be secure.  I don’t agree with either opinion because to me it’s all a matter of perspective.

Let’s imagine that I am a cloud-based service provider selling solutions leveraging my own on-premises resources, meaning that I own and operate all of the technology infrastructure within the walls of my one corporate office.  Let’s also imagine that in addition to the public cloud solution that I sell to my customers, I have built a private cloud solution for some of my employees (e.g., salespeople in the field), and that I also have other on-premises systems (e.g., accounting) not connected to any cloud.

Since all of my solutions are leveraging the exact same technology infrastructure, if it is impossible to secure my public cloud, then it logically follows that it is also impossible to secure my private cloud and my on-premises systems.  Therefore, all of my security must be Mission Impossible.  I refer to this as the Cloud Security Paradox.

Some of you will argue that my scenario was oversimplified, since most cloud-based solutions, whether public or private, may include technology infrastructure that is not under my control, and may be accessed using devices that are not under my control.

Although those are valid security concerns, they are not limited to—nor were they created by—cloud computing, because with the prevalence of smartphones and other mobile devices, those security concerns exist for entirely on-premises solutions as well.

In my opinion, cloud-based versus on-premises, public cloud versus private cloud, and customer access versus employee access are all oversimplified arguments.  Regardless of the implementation strategy, your technology infrastructure, and especially your data, need to be secured wherever they are, however they are accessed, and with the appropriate levels of control over who can access what.

Fundamentally, the real problem is a lack of well-defined, well-implemented, and well-enforced security practices.  As Burns rightfully points out, a significant challenge with cloud-based solutions is that “public cloud providers are notoriously unwilling to provide good levels of visibility into their underlying security practices.”

However, when the cost savings and convenience of cloud-based solutions are accepted without a detailed security assessment, that is not a fundamental flaw of cloud computing—that is simply a bad business decision.

Let’s stop blaming poor enterprise security practices on the adoption of cloud computing.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

The Good, the Bad, and the Secure

Securing your Digital Fortress

Shadow IT and the New Prometheus

Are Cloud Providers the Bounty Hunters of IT?

The Diderot Effect of New Technology

The IT Consumerization Conundrum

The IT Prime Directive of Business First Contact

A Sadie Hawkins Dance of Business Transformation

Are Applications the La Brea Tar Pits for Data?

Why does the sun never set on legacy applications?

The Partly Cloudy CIO

The IT Pendulum and the Federated Future of IT

Suburban Flight, Technology Sprawl, and Garage IT