The Big Data Collider

As I mentioned in a previous post, I am reading the book Where Good Ideas Come From by Steven Johnson, which examines recurring patterns in the history of innovation.  The chapter that I am currently reading dispels the traditional notion of the eureka effect by explaining that the evolution of ideas, like all evolution, stumbles its way toward the next good idea, which eventually, though not immediately, leads to a significant breakthrough.

One example is how the encyclopedic book Enquire Within Upon Everything, the first edition of which was published in 1856, influenced a young British scientist, who in his childhood in the 1960s was drawn to the “suggestion of magic in the book’s title, and who spent hours exploring this portal to the world of information, along with the wondrous feeling of exploring an immense trove of data.”  His childhood fascination with data and information influenced a personal project that he started in 1980, which ten years later became a professional project while he was working at the Swiss particle physics lab CERN.

The scientist was Tim Berners-Lee, and his now-famous project created the World Wide Web.

“Journalists always ask me,” Berners-Lee explained, “what the crucial idea was, or what the singular event was, that allowed the Web to exist one day when it hadn’t the day before.  They are frustrated when I tell them there was no eureka moment.”

“Inventing the World Wide Web involved my growing realization that there was a power in arranging ideas in an unconstrained, web-like way.  And that awareness came to me through precisely that kind of process.”

CERN is famous for its Large Hadron Collider, which uses high-velocity particle collisions to explore some of the open questions in physics concerning the basic laws governing the interactions and forces among elementary particles, in an attempt to understand the deep structure of space and time and, in particular, the intersection of quantum mechanics and general relativity.

 

The Big Data Collider

While reading this chapter, I stumbled toward an idea about Big Data, which, as Gartner Research explains, is about more than just data volume, even though the term acknowledges the exponential growth, availability, and use of information in today’s data-rich landscape.  Data variety (i.e., structured, semi-structured, and unstructured data, as well as other types of data, such as sensor data) and data velocity (i.e., how fast data is being produced and how fast it must be processed to meet demand) are also key characteristics of Big Data.

David Loshin’s recent blog post about Hadoop and Big Data provides a straightforward explanation and simple example of using MapReduce for not only processing fast-moving large volumes of various data, but also deriving meaningful insights from it.
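
To make that pattern a bit more concrete, here is a minimal, single-machine sketch in Python of the classic MapReduce word-count example.  It is my own illustration rather than the example from Loshin’s post, and a real Hadoop job would distribute the map and reduce phases across a cluster of machines.

from collections import defaultdict

def map_phase(record):
    # Map: emit (key, value) pairs -- here, (word, 1) for every word in a record
    for word in record.lower().split():
        yield word, 1

def reduce_phase(key, values):
    # Reduce: combine all of the values emitted for a single key -- here, a simple sum
    return key, sum(values)

def map_reduce(records):
    grouped = defaultdict(list)
    # Shuffle: group every emitted value by its key
    for record in records:
        for key, value in map_phase(record):
            grouped[key].append(value)
    # Reduce each group independently; this independence is what lets Hadoop
    # run the work in parallel across many commodity machines
    return dict(reduce_phase(key, values) for key, values in grouped.items())

log_lines = ["error disk full", "warning disk slow", "error network down"]
print(map_reduce(log_lines))
# {'error': 2, 'disk': 2, 'full': 1, 'warning': 1, 'slow': 1, 'network': 1, 'down': 1}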

My idea was that Big Analytics uses the Big Data Collider to allow large volumes of various data particles to bounce off one another in high-velocity collisions.  Although a common criticism of Big Data is that it contains more noise than signal, smashing data particles together in the Big Data Collider may destroy most of the noise in the collision, allowing the signals that survive that creative destruction to potentially be reduced to an elementary particle of business intelligence.
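
Purely as an illustration of the metaphor (a toy sketch of my own, not an established analytics technique, and the data is made up), one way to read a “collision” in code is as the intersection of two noisy observation sets: only the signals corroborated by both sources survive, while the uncorroborated noise annihilates.

def collide(observations_a, observations_b):
    # The "collision": only observations that both noisy sources agree on survive
    return set(observations_a) & set(observations_b)

# Hypothetical trending products inferred from two independent, noisy sources
web_trending = {"product 7", "product 9", "product 3"}    # from clickstream data
sales_trending = {"product 7", "product 2", "product 3"}  # from transaction data

print(collide(web_trending, sales_trending))
# Only the products both sources agree on survive the collision; the rest was noise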

Admittedly, it’s not the greatest metaphor, but as we enquire within data about everything in the Information Age, I thought it might be useful to share my idea so that it might stumble its way toward the next good idea by colliding with an idea of your own.

 

Related Posts

OCDQ Radio - Big Data and Big Analytics

OCDQ Radio - Good-Enough Data for Fast-Enough Decisions

OCDQ Radio - A Brave New Data World

Data, Information, and Knowledge Management

Thaler’s Apples and Data Quality Oranges

Data Confabulation in Business Intelligence

Data In, Decision Out

The Data-Decision Symphony

The Real Data Value is Business Insight

Is your data complete and accurate, but useless to your business?

Beyond a “Single Version of the Truth”

The General Theory of Data Quality

The Data-Information Continuum

Schrödinger’s Data Quality

Data Governance and the Buttered Cat Paradox