Open MIKE Podcast — Episode 03

Method for an Integrated Knowledge Environment (MIKE2.0) is an open-source delivery framework for Enterprise Information Management that provides a comprehensive methodology applicable across a wide range of projects within the Information Management space.  For more information, click on this link: openmethodology.org/wiki/What_is_MIKE2.0

The Open MIKE Podcast is a video podcast, hosted by Jim Harris, that discusses aspects of the MIKE2.0 framework and features content contributed to MIKE2.0 wiki articles, blog posts, and discussion forums.

 

Episode 03: Data Quality Improvement and Data Investigation

If you’re having trouble viewing this video, you can watch it on Vimeo by clicking on this link: Open MIKE Podcast on Vimeo

 

MIKE2.0 Content Featured in or Related to this Podcast

Enterprise Data Management: openmethodology.org/wiki/Enterprise_Data_Management_Offering_Group

Data Quality Improvement: openmethodology.org/wiki/Data_Quality_Improvement_Solution_Offering

Data Investigation: openmethodology.org/wiki/Category:Data_Investigation_and_Re-Engineering

You can also find the videos and blog post summaries for every episode of the Open MIKE Podcast at: ocdqblog.com/MIKE

Social Media for Midsize Businesses

OCDQ Radio is a vendor-neutral podcast about data quality and its related disciplines, produced and hosted by Jim Harris.

During this episode, Paul Gillin and I discuss social media for midsize businesses, including how the less marketing you do, the more effective your social media marketing will be; the war of generosity, where the more you give, the more you get; and the importance of the trust equation, whereby the more people trust you, the more they will want to do business with you.

Paul Gillin is a veteran technology journalist and a thought leader in new media.  Since 2005, he has advised marketers and business executives on strategies to optimize their use of social media and online channels to reach buyers cost-effectively.  He is a popular speaker who is known for his ability to simplify complex concepts using plain talk, anecdotes, and humor.

Paul Gillin is the author of four books about social marketing: The New Influencers (2007), Secrets of Social Media Marketing (2008), Social Marketing to the Business Customer (2011), co-authored with Eric Schwartzman, and the forthcoming book Attack of the Customers (2012), co-authored with Greg Gianforte.

Paul Gillin was previously the founding editor of TechTarget and editor-in-chief of Computerworld.  He writes a monthly column for BtoB magazine and is an active blogger and media commentator.  He has appeared as an expert commentator on CNN, PBS, Fox News, MSNBC, and other television outlets.  He has also been quoted or interviewed for hundreds of news and radio reports in outlets such as The Wall Street Journal, The New York Times, NPR, and the BBC.  Paul Gillin is a Senior Research Fellow and member of the board of directors at the Society for New Communications Research.

The Weakest Link in Enterprise Security

This blog post is sponsored by the Enterprise CIO Forum and HP.

As a recent Techopedia article noted, one of the biggest challenges for IT security these days is finding a balance among three overarching principles: availability (i.e., that information is accessible when authorized users need it), confidentiality (i.e., that information is only being seen or used by people who are authorized to access it), and integrity (i.e., that any changes to information by an unauthorized user are impossible — or at least detected — and changes by authorized users are tracked).
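These three principles can be made concrete with a toy example.  The following sketch is purely illustrative — the class and method names are invented for this post, not drawn from any security framework — showing confidentiality as an authorization check, integrity as checksums plus an audit log, and availability as simply returning the data to authorized users on demand:

```python
import hashlib
from datetime import datetime, timezone

class SecureRecord:
    """Toy record enforcing the three principles in miniature."""

    def __init__(self, data, authorized_users):
        self._data = data
        self._authorized = set(authorized_users)  # confidentiality: who may access
        self._audit_log = []                      # integrity: track every change
        self._checksum = self._digest(data)

    @staticmethod
    def _digest(data):
        return hashlib.sha256(data.encode()).hexdigest()

    def read(self, user):
        # Confidentiality: only authorized users may see the data.
        if user not in self._authorized:
            raise PermissionError(f"{user} is not authorized to read")
        # Availability: authorized users get the data whenever they ask.
        return self._data

    def write(self, user, new_data):
        if user not in self._authorized:
            raise PermissionError(f"{user} is not authorized to write")
        # Integrity: record who changed what, and when.
        self._audit_log.append((datetime.now(timezone.utc), user, self._digest(new_data)))
        self._data = new_data
        self._checksum = self._digest(new_data)

    def verify(self):
        # Integrity: detect out-of-band (unauthorized) modification.
        return self._checksum == self._digest(self._data)
```

The tension among the three is visible even here: every authorization check that protects confidentiality is one more hurdle between an authorized user and the availability they expect.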

Finding this balance has always been a complex challenge for enterprise security, since the tighter you lock down an IT system, the harder it can become to use for daily business activities, which sometimes causes usability to be prioritized over security.

“I believe those who think security isn’t a general IT priority are wrong,” Rafal Los recently blogged in a post about the role of Chief Information Security Officer (CISO).  “Pushing the security agenda ahead of doing business seems to be something poor CISOs are known for, which creates a backlash of executive push-back against security in many organizations.”

According to Los, IT leaders need to balance the business enablement of IT with the need to keep information secure, which requires better understanding both business risks and IT threats, and allowing the organization to execute its business goals in a tactical fashion while simultaneously working out the long-term enterprise security strategy.

Although any security strategy is only as strong as its weakest link, the weakest link in enterprise security might not be where you’d expect to find it.  A good example of this came from perhaps the biggest personal data security disaster story of the year, the epic hacking of Mat Honan, during which, as he described it, “in the space of one hour, my entire digital life was destroyed.”

The biggest lesson learned was not the lack of a good security strategy (though that obviously played a part, not only with Honan personally, but also with the vendors involved).  Instead, the lesson was that the weakest link in any security strategy might be its recovery procedures — and that hackers don’t need to rely on Hollywood-style techno-wizardry to overcome security protocols.

Organizations are rightfully concerned about mobile devices containing sensitive data getting stolen — in fact, many make use of the feature provided by Apple that enables you to remotely delete data on your iPhone, iPad, and MacBook in the event of theft.

In Honan’s case, the hackers exploited this feature by accessing his Apple iCloud account (for the details of how that happened, read his blog post), wiping clean his not-stolen mobile devices, and resetting his passwords, including those for his email accounts, which prevented him from receiving security warnings and password reset notifications and bought the hackers the time they needed to redirect everything — essentially all by doing what Honan would have done if his mobile devices had actually been stolen.

The hackers also deleted all of Honan’s data stored in the cloud, which was devastating since he had no offline backups (yes, he admits that’s his fault).  Before you’re tempted to use this as a cloud-bashing story, as Honan blogged in a follow-up post about how he resurrected his digital life, “when my data died, it was the cloud that killed it.  The triggers hackers used to break into my accounts and delete my files were all cloud-based services — iCloud, Google, and Amazon.  Some pundits have latched onto this detail to indict our era of cloud computing.  Yet just as the cloud enabled my disaster, so too was it my salvation.”

Although most security strategies are focused on preventing a security breach from happening, as the Honan story exemplifies, the weakest link in your enterprise security could actually be the protocols enacted in the event of an apparent security breach.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Enterprise Security is on Red Alert

Securing your Digital Fortress

The Good, the Bad, and the Secure

The Data Encryption Keeper

The Cloud Security Paradox

The Cloud is shifting our Center of Gravity

Are Cloud Providers the Bounty Hunters of IT?

The Return of the Dumb Terminal

The UX Factor

A Swift Kick in the AAS

Sometimes all you Need is a Hammer

Shadow IT and the New Prometheus

Open MIKE Podcast — Episode 02

Method for an Integrated Knowledge Environment (MIKE2.0) is an open-source delivery framework for Enterprise Information Management that provides a comprehensive methodology applicable across a wide range of projects within the Information Management space.  For more information, click on this link: openmethodology.org/wiki/What_is_MIKE2.0

The Open MIKE Podcast is a video podcast, hosted by Jim Harris, that discusses aspects of the MIKE2.0 framework and features content contributed to MIKE2.0 wiki articles, blog posts, and discussion forums.

 

Episode 02: Information Governance and Distributing Power

If you’re having trouble viewing this video, you can watch it on Vimeo by clicking on this link: Open MIKE Podcast on Vimeo

 

MIKE2.0 Content Featured in or Related to this Podcast

Information Governance: openmethodology.org/wiki/Information_Governance_Solution_Offering

Governance 2.0: openmethodology.org/wiki/Governance_2.0_Solution_Offering

You can also find the videos and blog post summaries for every episode of the Open MIKE Podcast at: ocdqblog.com/MIKE

Cloud Computing is the New Nimbyism

NIMBY is an acronym for “Not In My Back Yard,” and its derivative term Nimbyism usually refers to the philosophy of opposing construction projects or other new developments performed too close to your residence or business because, even though those developments could provide widespread benefits, they might be a disruption to you.  So, for example: yes, please build that new airport or hospital or power plant that our city needs — just don’t build it too close to my back yard.

For a long time, midsize businesses viewed their information technology (IT) department as a Nimbyistic disruption: a necessary cost of doing business, but one that also took up valuable space and time and distracted them from their core competencies, which, for most midsize businesses, are supported by but not directly related to IT.

Nowadays, cloud computing is putting a new — and far more positive — spin on Nimbyism.  By leveraging more cloud-based IT services, midsize businesses can free up space in their back yard (where, in my experience, many midsize businesses keep their IT department) and free up their time to focus on mission-critical business activities, while also scaling up their IT during peak business periods without first spending the time and money to build a bigger back yard.

Shifting to a weather analogy, stratus clouds are characterized by horizontal layering with a uniform base, and nimbostratus clouds are stratus clouds of moderate vertical development, signifying the onset of steady, moderate to heavy, precipitation.

We could say cloud computing is the nimby-stratus of IT clouds, providing midsize businesses with a uniform base of IT services that can quickly scale horizontally and/or vertically, with the agility to adapt to best serve their evolving business needs.

The nimbleness of the new Nimbyism facilitated by cloud computing is providing another weather-related business insight that’s helping midsize businesses forecast a promising future, hopefully signifying the onset of steady, moderate to heavy, profitability.

 

This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.

 

Enterprise Security is on Red Alert

This blog post is sponsored by the Enterprise CIO Forum and HP.

Enterprise security is becoming an even more important, and more complex, topic of discussion than it already was, especially when an organization focuses mostly on preventing external security threats, which is somewhat like, as in the photo to the left, telling employees to keep the gate closed but ignore the cloud floating over the gate and the mobile devices walking around it.

But that doesn’t mean we need to build bigger and better gates.  The more open business environment enabled by cloud and mobile technologies is here to stay, and it requires a modern data security model that can protect us from the bad without being overprotective to the point of inhibiting the good.

“Security controls cost money and have an impact on the bottom line,” Gideon Rasmussen recently blogged.  Therefore, “business management may question the need for controls beyond minimum compliance requirements.  However, adherence to compliance requirements, control frameworks, and best practices may not adequately protect sensitive or valuable information because they are not customized to the unique aspects of your organization.”

This lack of a customized security solution can also be introduced when leveraging cloud providers.  “Transparency is the capability to look inside the operational day-to-day activity of your cloud provider,” Rafal Los recently blogged.  “As a consumer, transparency means that I have audit-ability of the controls, systems, and capabilities that directly impact my consumed service.”

A further complication for enterprise security is that many cloud-based services are initiated as Shadow IT projects.  “There are actually good reasons why you may want to take a hard look at Shadow IT, as it may fundamentally put you at risk of breaching compliance,” Christian Verstraete recently blogged.  “Talking to business users, I’m often flabbergasted by how little they know of the potential risks encountered by putting information in the public cloud.”

In the science fiction universe of Star Trek, the security officers aboard the starship Enterprise, who wore red shirts, often quickly died on away missions.  Protecting your data, especially when it goes on away missions in the cloud or on mobile devices, requires your enterprise security to be on red alert — otherwise everyone in your organization might as well be wearing a red shirt.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Securing your Digital Fortress

The Good, the Bad, and the Secure

The Data Encryption Keeper

The Cloud Security Paradox

The Cloud is shifting our Center of Gravity

Are Cloud Providers the Bounty Hunters of IT?

The Return of the Dumb Terminal

The UX Factor

A Swift Kick in the AAS

Sometimes all you Need is a Hammer

Shadow IT and the New Prometheus

The Diffusion of the Consumerization of IT

Open MIKE Podcast — Episode 01

Method for an Integrated Knowledge Environment (MIKE2.0) is an open-source delivery framework for Enterprise Information Management that provides a comprehensive methodology applicable across a wide range of projects within the Information Management space.  For more information, click on this link: openmethodology.org/wiki/What_is_MIKE2.0

The Open MIKE Podcast is a video podcast, hosted by Jim Harris, that discusses aspects of the MIKE2.0 framework and features content contributed to MIKE2.0 wiki articles, blog posts, and discussion forums.

 

Episode 01: Information Management Principles

If you’re having trouble viewing this video, you can watch it on Vimeo by clicking on this link: Open MIKE Podcast on Vimeo

 

MIKE2.0 Content Featured in or Related to this Podcast

Information Management Principles: openmethodology.org/wiki/Economic_Value_of_Information

Information Economics: openmethodology.org/wiki/Information_Economics

You can also find the videos and blog post summaries for every episode of the Open MIKE Podcast at: ocdqblog.com/MIKE

The Age of the Mobile Device

Bob Sutor recently blogged about mobile devices, noting that “the power of these gadgets isn’t in their touchscreens or their elegant design.  It’s in the variety of apps and communication services we can use on them to stay connected.  By thinking beyond the device, companies can prepare themselves and figure out how to make the most of this age of the mobile device.”

The disruptiveness of mobile devices to existing business models — even Internet-based ones — is difficult to overstate.  In fact, I believe the age of the mobile device will be even more disruptive than the age of the Internet, which, during the 1990s and early 2000s, disrupted entire industries and professions — the three most obvious examples being music, journalism, and publishing.

However, during those disruptions, mobile devices were in their nascent phase.  Laptops were still the dominant mobile devices and most mobile phones only made phone calls, though text messaging and e-mail soon followed.  It’s only been about five years — with the notable arrivals of the iPhone and the Kindle in 2007, the Android operating system in 2008, and the iPad in 2010 — since mobile devices started to hit their stride.  The widespread availability of connectivity options (Wi-Fi and 3G/4G broadband), the shift to more cloud-based services, and the fact that, as Sutor noted, shipments of smartphones exceeded total PC shipments in 2011 for the first time ever all appear to forecast that the age of the mobile device will be an age of massive — and rapid — disruption.

The IBM Midmarket white paper A Smarter Approach to Customer Relationship Management (CRM) notes that “mobile is becoming the customers’ preferred communications means for multiple channels.  As customers go mobile and sales teams strive to meet customers’ needs, midsize companies are enabling mobile CRM.  They are optimizing Web sites for wireless devices and deploying mobile apps directly linked into the contact centers.  They are purchasing apps for particular devices and are buying solutions that store CRM data on them when offline, and update the information when Internet access is restored.  This enables sales teams to quickly acquire customer histories and respond with offerings tailored to their desires.”

As Sutor concluded, “mobile devices are a springboard into the future, where the apps can significantly improve the quality of our personal or business lives by allowing us to do things we have never done before.”  I agree that mobile devices are a springboard into a future that allows us, as well as our businesses and our customers, to do things we have never done before.

The age of the mobile device is the future — and the future is now.  Is your midsize business ready?

 

This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.

 

Balancing the IT Budget

This blog post is sponsored by the Enterprise CIO Forum and HP.

While checking out the new Knowledge Vaults on the Enterprise CIO Forum, I came across the Genefa Murphy blog post How IT Debt is Crippling the Enterprise, which included three recommendations for alleviating some of that crippling IT debt.

The first recommendation was application retirement.  As I have previously blogged, applications become retirement-resistant because applications and data have historically been so tightly coupled, making most of what are referred to as data silos actually application silos.  Therefore, in order to help de-cripple IT debt, organizations need to decouple applications and data, not only by allowing more data to float up into the cloud, but also, as Murphy noted, by instituting better procedures for data archival, which make it easier to identify retirement candidates among applications that have become merely containers for unused data.
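To illustrate the idea, here is a minimal sketch — with invented names and a made-up one-year dormancy threshold — of how better data archival procedures could flag retirement candidates: applications whose data hasn’t been accessed within some window are likely just containers for unused data:

```python
from datetime import datetime, timedelta

def retirement_candidates(applications, today, dormancy=timedelta(days=365)):
    """Flag applications whose data has not been touched within the
    dormancy window -- likely containers for unused data."""
    return [app["name"] for app in applications
            if today - app["last_data_access"] > dormancy]

# Hypothetical inventory drawn from archival metadata.
apps = [
    {"name": "legacy_billing", "last_data_access": datetime(2010, 3, 1)},
    {"name": "crm",            "last_data_access": datetime(2012, 6, 1)},
]

print(retirement_candidates(apps, today=datetime(2012, 7, 1)))
# -> ['legacy_billing']
```

In practice the inventory would come from archival and access-log metadata rather than a hand-built list, but the principle is the same: archival data makes dormancy visible, and dormancy makes retirement decisions easy.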

The second recommendation was cutting the IT backlog.  “One of the main reasons for IT debt,” Murphy explained, “is the fact that the enterprise is always trying to keep up with the latest and greatest trends, technologies and changes.”  I have previously blogged about this as The Diderot Effect of New Technology.  By better identifying how up-to-date the IT backlog is, and how well — if at all — it still reflects current business needs, an organization can skip needless upgrades and enhancement requests, and not only eliminate some of the IT debt, but also better prioritize efforts so that IT functions as a business enabler.

The third recommendation was performing more architectural reviews, which, Murphy explained, “is less about getting rid of old debt and more about making sure new debt does not accumulate.  Since IT teams don’t often have the time to do this (as they are concerned with getting a working solution to the customer ASAP), it is a good idea to have this as a parallel effort led by a technology or architectural review group outside of the project teams but still closely linked.”

Although it’s impossible to completely balance the IT budget, and IT debt doesn’t cause an overall budget deficit, reducing costs associated with business-enabling technology does increase the potential for a surplus of financial success for the enterprise.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Why does the sun never set on legacy applications?

Are Applications the La Brea Tar Pits for Data?

The Diffusion of the Consumerization of IT

Sometimes all you Need is a Hammer

Shadow IT and the New Prometheus

The UX Factor

The Return of the Dumb Terminal

A Swift Kick in the AAS

The Cloud is shifting our Center of Gravity

Lightning Strikes the Cloud

The Partly Cloudy CIO

Are Cloud Providers the Bounty Hunters of IT?

The Cloud Security Paradox

The Good, the Bad, and the Secure

The Diderot Effect of New Technology

The Cloud is shifting our Center of Gravity

This blog post is sponsored by the Enterprise CIO Forum and HP.

Since more organizations are embracing cloud computing and cloud-based services, and some analysts are even predicting that personal clouds will soon replace personal computers, the cloudy future of our data has been weighing on my mind.

I recently discovered the website DataGravity.org, which contains many interesting illustrations and formulas about data gravity, a concept which Dave McCrory blogged about in his December 2010 post Data Gravity in the Clouds.

“Consider data as if it were a planet or other object with sufficient mass,” McCrory wrote.  “As data accumulates (builds mass) there is a greater likelihood that additional services and applications will be attracted to this data.  This is the same effect gravity has on objects around a planet.  As the mass or density increases, so does the strength of gravitational pull.  As things get closer to the mass, they accelerate toward the mass at an increasingly faster velocity.”

In my blog post What is Weighing Down your Data?, I explained the often misunderstood difference between mass, which is an intrinsic property of matter based on atomic composition, and weight, which is a gravitational force acting on matter.  By using these concepts metaphorically, we could say that mass is an intrinsic property of data, representing objective data quality, and weight is a gravitational force acting on data, representing subjective data quality.

I used a related analogy in my blog post Quality is the Higgs Field of Data.  By using data, we give data its quality, i.e., its mass.  We give data mass so that it can become the basic building blocks of what matters to us.

Historically, most of what we referred to as data silos were actually application silos because data and applications became tightly coupled due to the strong gravitational force that legacy applications exerted, preventing most data from achieving the escape velocity needed to free itself from an application.  But the laudable goal of storing your data in one easily accessible place, and then building services and applications around your data, is one of the fundamental value propositions of cloud computing.

With data accumulating in the cloud, as McCrory explained, although “services and applications have their own gravity, data is the most massive and dense, therefore it has the most gravity.  Data, if large enough, can be virtually impossible to move.”
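McCrory’s Newtonian analogy can even be rendered literally.  The sketch below is purely metaphorical — the constant and units are made up, and this is not the actual formula from DataGravity.org — but it captures the shape of the idea: attraction grows with the “mass” of data and applications, and falls off with the square of the “distance” (think network latency) between them:

```python
def data_gravity(data_mass, app_mass, distance):
    """Newtonian analogy: the pull between a data store and an
    application grows with their 'masses' and shrinks with the
    square of the 'distance' between them.  Illustrative units only."""
    G = 1.0  # arbitrary constant for the metaphor
    return G * data_mass * app_mass / distance ** 2
```

Halving the distance between an application and its data quadruples the pull — one way to picture why services keep migrating toward where the data lives, and why data, once massive enough, becomes virtually impossible to move.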

The cloud is shifting our center of gravity because of the gravitational field exerted by the massive amount of data being stored in the cloud.  The information technology universe, the business world, and our personal (often egocentric) solar systems are just beginning to feel the effects of this massive gravitational shift.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Quality is the Higgs Field of Data

What is Weighing Down your Data?

A Swift Kick in the AAS

Lightning Strikes the Cloud

The Partly Cloudy CIO

Are Cloud Providers the Bounty Hunters of IT?

The Cloud Security Paradox

Are Applications the La Brea Tar Pits for Data?

Why does the sun never set on legacy applications?

The Good, the Bad, and the Secure

The Return of the Dumb Terminal

The UX Factor

Sometimes all you Need is a Hammer

Shadow IT and the New Prometheus

The Diffusion of the Consumerization of IT

Lightning Strikes the Cloud

This blog post is sponsored by the Enterprise CIO Forum and HP.

Recent bad storms in the United States caused power outages as well as outages of a different sort for some of the companies relying on cloud computing and cloud-based services.  As the poster child for cloud providers, Amazon Web Services always makes headlines when it suffers a major outage, as it did last Friday when its Virginia cloud computing facility was struck by lightning, an incident which John Dodge examined in his recent blog post: Has Amazon's cloud grown too big, too fast?

Another thing that commonly coincides with a cloud outage is renewed pondering of the nebulous definition of “the cloud.”

In his recent article for The Washington Post, How a storm revealed the myth of the ‘cloud’, Dominic Basulto pondered “of all the metaphors and analogies used to describe the Internet, perhaps none is less understood than the cloud.  A term that started nearly a decade ago to describe pay-as-you-go computing power and IT infrastructure-for-rent has crossed over to the consumer realm.  It’s now to the point where many of the Internet’s most prolific companies make it a key selling point to describe their embrace of the cloud.  The only problem, as we found out this weekend, is that there really isn’t a ‘cloud’ – there’s a bunch of rooms with servers hooked up with wires and tubes.”

One of the biggest benefits of cloud computing, especially for many small businesses and start-up companies, is that it provides an organization with the ability to focus on its core competencies, allowing non-IT companies to be more business-focused.

As Basulto explained, “instead of having to devote resources and time to figuring out the computing back-end, young Internet companies like Instagram and Pinterest could concentrate on hiring the right people and developing business models worth billions.  Hooking up to the Internet became as easy as plugging into the local electricity provider, even as users uploaded millions of photos or streamed millions of videos at a time.”

But these benefits are not just for Internet companies.  In his book The Big Switch: Rewiring the World, from Edison to Google, Nicholas Carr used the history of electric grid power utilities as a backdrop and analogy for examining the potential benefits that all organizations can gain from adopting Internet-based utility (i.e., cloud) computing.

The benefits of a utility, however, whether it’s electricity or cloud computing, can only be realized if the utility operates reliably.

“A temporary glitch while watching a Netflix movie is annoying,” Basulto noted, but “imagine what happens when there’s a cloud outage that affects airports, hospitals, or yes, the real-world utility grid.”  And so, whenever any utility suffers an outage, it draws attention to something we’ve become dependent on — but, in fairness, it’s also something we take for granted when it’s working.

“Maybe the late Alaska Senator Ted Stevens was right,” Basulto concluded, “maybe the Internet really is a series of tubes rather than a cloud.  If so, the company with the best plumbing wins.”  A few years ago, I published a satirical post about the cloud, which facetiously recommended that instead of beaming your data up into the cloud, bury your data down underground.

However, if plumbing, not electricity, is the better metaphor for cloud computing infrastructure, then perhaps cloud providers should start striking ground on subterranean data centers built deep enough to prevent lightning from striking the cloud again.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

The Partly Cloudy CIO

Are Cloud Providers the Bounty Hunters of IT?

The Cloud Security Paradox

The Good, the Bad, and the Secure

The Return of the Dumb Terminal

A Swift Kick in the AAS

The UX Factor

Sometimes all you Need is a Hammer

Shadow IT and the New Prometheus

The Diffusion of the Consumerization of IT

Big Data Lessons from Orbitz

One of the week’s interesting technology stories was On Orbitz, Mac Users Steered to Pricier Hotels, an article by Dana Mattioli in The Wall Street Journal, about how online travel company Orbitz used data mining to discover significant spending differences between their Mac and PC customers (who were identified by the operating system of the computer used to book reservations).

Orbitz discovered that Mac users are 40% more likely to book a four- or five-star hotel and tend to stay in more expensive rooms, spending on average $20 to $30 more a night on hotels.  Based on this discovery, Orbitz has been experimenting with showing different hotel offers to Mac and PC visitors, ranking the more expensive hotels on the first page of search results for Mac users.

This Orbitz story is interesting because I think it provides two important lessons about big data for businesses of all sizes.

The first lesson is, as Mattioli reported, “the sort of targeting undertaken by Orbitz is likely to become more commonplace as online retailers scramble to identify new ways in which people’s browsing data can be used to boost online sales.  Orbitz lost $37 million in 2011 and its stock has fallen by more than 74% since its 2007 IPO.  The effort underscores how retailers are becoming bigger users of so-called predictive analytics, crunching reams of data to guess the future shopping habits of customers.  The goal is to tailor offerings to people believed to have the highest lifetime value to the retailer.”

The second lesson is a good example of how word of mouth has become word of data.  Shortly after the article was published, Orbitz became a trending topic on Twitter — but not in a way that the company would have hoped.  A lot of negative sentiment was expressed by Mac users claiming that they would no longer use Orbitz since they charged Mac users more than PC users.

However, this common misunderstanding was clarified in the article by an Orbitz spokesperson, who explained that Orbitz is not charging Mac users more money for the same hotels; it is simply setting the default search rank to show Mac users the more expensive hotels first.  Mac users can always re-sort the results ascending by price to see the same less expensive hotels that would be displayed in the default search rank used for PC users.  Orbitz is attempting to offer a customized (albeit generalized, not personalized) user experience, but some users see it as gaming the system against them.
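To make the distinction concrete, here is a hypothetical re-creation of the mechanism — Orbitz’s actual implementation is not public, and the function and data below are invented for illustration.  The prices are identical for everyone; only the default sort order depends on the visitor’s operating system:

```python
def default_hotel_ranking(hotels, user_agent):
    """Return hotels in their default display order.  Mac visitors see
    pricier hotels first by default; the prices themselves are the
    same for everyone.  (Illustrative sketch, not Orbitz's code.)"""
    is_mac = "Macintosh" in user_agent
    return sorted(hotels, key=lambda h: h["price"], reverse=is_mac)

hotels = [{"name": "Budget Inn", "price": 80},
          {"name": "Grand Plaza", "price": 240}]

mac_view = default_hotel_ranking(hotels, "Mozilla/5.0 (Macintosh; Intel Mac OS X)")
pc_view  = default_hotel_ranking(hotels, "Mozilla/5.0 (Windows NT 6.1)")
# Same hotels, same prices -- only the default ordering differs,
# and either visitor can re-sort ascending by price.
```

Notice that nothing in the sketch changes a price: the controversy was entirely about the default ordering, which is exactly why the “Mac users pay more” complaint was a misunderstanding.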

This Orbitz story provides two lessons about the brave new business world brought to us by big data and data science, where more companies are using predictive analytics to discover business insights, and more customers are empowering themselves with data.

Business has always resembled a battlefield.  But nowadays, data is the weapon of choice for companies and customers alike, since, in our increasing data-constructed world, big data is no longer just for big companies, and everyone is a data geek now.

 

This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.

 

The Return of the Dumb Terminal

This blog post is sponsored by the Enterprise CIO Forum and HP.

In his book What Technology Wants, Kevin Kelly observed “computers are becoming ever more general-purpose machines as they swallow more and more functions.  Entire occupations and their workers’ tools have been subsumed by the contraptions of computation and networks.  You can no longer tell what a person does by looking at their workplace, because 90 percent of employees are using the same tool — a personal computer.  Is that the desk of the CEO, the accountant, the designer, or the receptionist?  This is amplified by cloud computing, where the actual work is done on the net as a whole and the tool at hand merely becomes a portal to the work.  All portals have become the simplest possible window — a flat screen of some size.”

Although I am an advocate for cloud computing and cloud-based services, sometimes I can’t help but wonder if cloud computing is turning our personal computers back into that simplest of all possible windows that we called the dumb terminal.

Twenty years ago, at the beginning of my IT career, when I was a mainframe production support specialist, my employer gave me a dumb terminal to take home for connecting to the mainframe via my dial-up modem.  Since I used it late at night when dealing with nightly production issues, the aptly nicknamed green machine (its entirely text-based display used bright green characters) would make my small apartment eerily glow green, which convinced my roommate and my neighbors that I was some kind of mad scientist performing unsanctioned midnight experiments with radioactive materials.

The dumb terminal was so-called because, when not connected to the mainframe, it was essentially a giant paperweight, providing no offline functionality.  Nowadays, our terminals (smartphones, tablets, and laptops) are smarter, and they provide varying degrees of offline functionality.  But in some sense, with more functionality moving to the cloud, our terminals get dumbed back down when they’re not connected to the web or a mobile network, because most of what we really need is online.

It can even be argued that smartphones and tablets were actually designed to be dumb terminals because they intentionally offer limited offline data storage and computing power, and are mostly based on a mobile-app-portal-to-the-cloud computing model, which is well-supported by the widespread availability of high-speed network connectivity options (broadband, mobile, Wi-Fi).

Laptops (and the dwindling number of desktops) are the last bastions of offline data storage and computing power.  Moving more of those applications and data to the cloud would help eliminate redundant applications and duplicated data, and make it easier to use the right technology for a specific business problem.  And if most of our personal computers were dumb terminals, then our smart people could concentrate more on the user experience aspects of business-enabling information technology.

Perhaps the return of the dumb terminal is a smart idea after all.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

A Swift Kick in the AAS

The UX Factor

The Partly Cloudy CIO

Are Cloud Providers the Bounty Hunters of IT?

The Cloud Security Paradox

Sometimes all you Need is a Hammer

Why does the sun never set on legacy applications?

Are Applications the La Brea Tar Pits for Data?

The Diffusion of the Consumerization of IT

More Tethered by the Untethered Enterprise?

The Graystone Effects of Big Data

As a big data geek and a big fan of science fiction, I was intrigued by Zoe Graystone, the central character of the science fiction television show Caprica, which was a spin-off prequel of the re-imagined Battlestar Galactica television show.

Zoe Graystone was a teenage computer programming genius who created a virtual reality avatar of herself based on all of the available data about her own life, leveraging roughly 100 terabytes of personal data from numerous databases.  This allowed her avatar to access data from her medical files, DNA profiles, genetic typing, CAT scans, synaptic records, psychological evaluations, school records, emails, text messages, phone calls, audio and video recordings, security camera footage, talent shows, sports, restaurant bills, shopping receipts, online search history, music lists, movie tickets, and television shows.  The avatar transformed that big data into personality and memory, and believably mimicked the real Zoe Graystone within a virtual reality environment.

The best science fiction reveals just how thin the line is that separates imagination from reality.  Over thirty years ago, around the time of the original Battlestar Galactica television show, virtual reality avatars based on massive amounts of personal data would likely have been dismissed as pure fantasy.  But nowadays, during the era of big data and data science, the idea of Zoe Graystone creating a virtual reality avatar of herself doesn’t sound so far-fetched, nor is it pure data science fiction.

“On Facebook,” Ellis Hamburger recently blogged, “you’re the sum of all your interactions and photos with others.  Foursquare began its life as a way to see what your friends are up to, but it has quickly evolved into a life-logging tool / artificial intelligence that knows you like an old friend does.”

Facebook and Foursquare are just two social media examples of our increasingly data-constructed world, which is creating a virtual reality environment where our data has become our avatar and our digital mouths are speaking volumes about us.

Big data and real data science are enabling people and businesses of all sizes to put this virtual reality environment to good use, such as customers empowering themselves with data and companies using predictive analytics to discover business insights.

I refer to the positive aspects of Big Data as the Zoe Graystone Effect.

But there are also negative aspects to the virtual reality created by our big data avatars.  For example, in his recent blog post Rethinking Privacy in an Era of Big Data, Quentin Hardy explained “by triangulating different sets of data (you are suddenly asking lots of people on LinkedIn for endorsements on you as a worker, and on Foursquare you seem to be checking in at midday near a competitor’s location), people can now conclude things about you (you’re probably interviewing for a job there).”
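Hardy’s triangulation example can be made concrete with a small sketch.  The person, signals, and threshold below are all hypothetical stand-ins, not anything Hardy describes implementing; the sketch only illustrates how individually innocuous signals, combined across data sets, support an inference neither supports alone.

```python
# Hypothetical sketch of triangulating separate data sets:
# each signal alone says little; together they suggest a conclusion.

linkedin_activity = {"alice": {"endorsement_requests": 7}}
foursquare_checkins = {"alice": {"midday_checkin_near_competitor": True}}

def likely_job_hunting(person):
    endorsements = linkedin_activity.get(person, {}).get("endorsement_requests", 0)
    near_competitor = foursquare_checkins.get(person, {}).get(
        "midday_checkin_near_competitor", False)
    # A burst of endorsement requests OR a lunchtime check-in is unremarkable;
    # the combination is what makes the inference plausible.
    return endorsements >= 5 and near_competitor
```

Calling `likely_job_hunting("alice")` returns `True` here — the kind of conclusion about you that, as Hardy notes, other people can now reach from data you never thought of as private.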

On the Caprica television show, Daniel Graystone (her father) used Zoe’s avatar as the basis for an operating system for a race of sentient machines known as Cylons, which ultimately led to the Cylon Wars and the destruction of most of humanity.  A far less dramatic example from the real world, which I explained in my blog post The Data Cold War, is how companies like Google use the virtual reality created by our big data avatars against us by selling our personal data (albeit indirectly) to advertisers.

I refer to the negative aspects of Big Data as the Daniel Graystone Effect.

How have your personal life and your business activities been affected by the Graystone Effects of Big Data?

 

This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.

 

Sometimes all you Need is a Hammer

This blog post is sponsored by the Enterprise CIO Forum and HP.

“If all you have is a hammer, everything looks like a nail” is a popular phrase, also known as the law of the instrument, which describes an over-reliance on a familiar tool, as opposed to using “the right tool for the job.”  In information technology (IT), the law of the instrument is often invoked to justify the need to purchase the right technology to solve a specific business problem.

However, within the IT industry, it has become increasingly difficult over the years to buy the right tool for the job since many leading vendors make it nearly impossible to buy just an individual tool.  Instead, vendors want you to buy their entire tool box, filled with many tools for which you have no immediate need, and some tools which you have no idea why you would ever need.

It’d be like going to a hardware store to buy just a hammer, only to have the store refuse to sell you a hammer without also selling you a 10-piece set of screwdrivers, a 4-piece set of pliers, an 18-piece set of wrenches, and an industrial-strength nail gun.

My point is that many new IT innovations originate from small, entrepreneurial vendors, which tend to be specialists with a very narrow focus that can provide a great source of rapid innovation.  This is in sharp contrast to the large, enterprise-class vendors, which tend to innovate via acquisition and consolidation, embedding tools and other technology components within generalized IT platforms, allowing these mega-vendors to offer end-to-end solutions and the convenience of one-vendor IT shopping.

But the consumerization of IT, driven by the unrelenting trends of cloud computing, SaaS, and mobility, is fostering a return to specialization, a return to being able to buy only the information technology that you currently need — the right tool for the job, and often at the right price, precisely because it’s almost always more cost-effective to buy only what you need right now.

I am not trying to criticize traditional IT vendors that remain off-premises-resistant by exclusively selling on-premises solutions, which the vendors positively call enterprise-class solutions, but their customers often come to negatively call legacy applications.

I understand the economics of the IT industry.  Vendors can make more money with fewer customers by selling on-premises IT platforms with six-or-seven-figure licenses plus five-figure annual maintenance fees, as opposed to selling cloud-based services with three-or-four-figure pay-as-you-go-cancel-anytime monthly subscriptions.  The former is the big-ticket business model of the vendorization of IT.  The latter is the big-volume business model of the consumerization of IT.  Essentially, this is a paradigm shift that makes IT more of a consumer-driven marketplace, and less of the vendor-driven marketplace it has historically been.

Although it remains true that if all you have is a hammer, everything looks like a nail, sometimes all you need is a hammer.  And when all you need is a hammer, you shouldn’t get nailed by vendors selling you more information technology than you need.

This blog post is sponsored by the Enterprise CIO Forum and HP.

 

Related Posts

Can Enterprise-Class Solutions Ever Deliver ROI?

Why does the sun never set on legacy applications?

The Diffusion of the Consumerization of IT

The IT Consumerization Conundrum

The UX Factor

A Swift Kick in the AAS

Shadow IT and the New Prometheus

The Cloud Security Paradox

Are Cloud Providers the Bounty Hunters of IT?

The Partly Cloudy CIO