This is a lovely article over at the Wall Street Journal by Christian Madsbjerg and Mikkel B. Rasmussen about what they characterize as “Thick Data”. Here is the basic premise:
In fact, companies that rely too much on the numbers, graphs and factoids of Big Data risk insulating themselves from the rich, qualitative reality of their customers’ everyday lives. They can lose the ability to imagine and intuit how the world—and their own businesses—might be evolving. By outsourcing our thinking to Big Data, our ability to make sense of the world by careful observation begins to wither, just as you miss the feel and texture of a new city by navigating it only with the help of a GPS.
Successful companies and executives work to understand the emotional, even visceral context in which people encounter their product or service, and they are able to adapt when circumstances change. They are able to use what we like to call Thick Data.
Read on for the excellent examples of the Lego and Samsung turnarounds.
But I’m a little uncomfortable with Thick Data. I think that’s fundamentally the problem with Big Data today: the data is too thick, and the ability to analyze it is way too thin. I prefer reducing today’s thick data to Thin Data, coupled with Thick Insights.
The reality is that the nuggets of genius are buried deep inside mounds of data. It’s the tiny hidden insights that matter most and we need thick analytics, discovery and the power of collaboration to find and act on these hidden insights. Here’s more on this from an earlier post I wrote on this topic:
Me? I’m waiting for Big Data to become Tiny Insights. Tangible bites of intelligence that help me make better decisions and improve outcomes. Make no mistake: Tiny Insights doesn’t mean tiny value. Tiny insights inform massive decisions for business or important decisions for individuals. Alert me when I walk into a restaurant that just got panned consistently across many social networks, point me to an employee I follow on my enterprise social network who might be able to help with my presentation for next week, or trigger a real-time reset of which component supplier is best suited the minute my production requirements or S&OP assumptions change. There’s very little of this discussion and too much chest thumping. We need to make billions of consumers and end users of enterprise wares give a hoot.
But these might just be nuances we can get past. Thick or thin, the larger issue is that Big Data doesn’t have a face, which is what I characterized as “Big Data needs its beef burrito” in the post I reference above. And Christian and Mikkel brilliantly assert the power of data to really understand “the visceral context in which people encounter their product or service” and how they can adapt. Now that’s, without question, a big smiley face on Big Data that any C-suite can get behind.
This makes incredible sense. Combine it with the velocity of adaptation that cloud-based delivery provides and the intelligent wrapping of your network of experts that well-designed social collaboration tools can facilitate, and you create a far more cognizant and adaptable organization that can level with how end consumers really feel.
Digital Transformation is now upon us. The ability to understand the end consumer’s emotions that govern preferences at this ridiculously micro level of granularity offers an incredible value proposition for CEOs to ensure that their organizations can start to understand purchasing behavior.
But this is where we can get into trouble by focusing too much on the data versus what we do about it. I would suggest that too much focus on the size or even density of the data will get us wrapped around a pole.
Instead, we must focus on finding a hundred tiny needles rather than just building larger warehouses to store our growing haystacks.
(Cross-posted @ Pretzel Logic)