Wed. Oct 27th, 2021

Back in 2012, Steve Lohr writing for the New York Times did average folks a favor and introduced us to “Big Data.”

Yes, analysts, data scientists and the people in Silicon Valley had already heard of it, but thanks to Steve and the Times, the rest of us found out that it’s a “meme and a marketing term, for sure, but also shorthand for advancing trends in technology that open the door to a new approach to understanding the world and making decisions.”

Sounds intriguing, huh?

Despite a few years of boom for the concept of “Big Data,” the term itself is old news. Big Data came with a “measure it all” attitude that eventually gave the phrase a bad rap, but smart companies didn’t throw the baby out with the bathwater. Data plays an expanded role in our daily lives, including how we do business. Now, instead of worrying about how big the data is, the winning trend is to focus on only what’s actually relevant.

Now fast forward to today: leveraging data for higher productivity and reduced costs in the oil patch is a reality. Not Big Data, but the right data. And to capitalize on this reality, we must think much, much smaller.

Forget Big Data

Why small data? In part, that’s because Big Data is a pain in the you-know-what. It’s hard work, plagued by data quality issues, and it’s expensive to boot.

Matt Turck points out that an organization that buys into Big Data (and yes, it has to be the whole organization) needs to “capture data, store data, clean data, query data, analyze data and visualize data.” And while software can handle much of it, “some of it will be done by humans,” and all of it needs to be part of a seamless integration.

If this doesn’t sound easy, it’s because it’s not.

After you analyze anything and everything to wring out whatever info is available, the correlations you find are sometimes not just unexpected but off the wall. Too many businesses peg coincidences on cause and effect, sending themselves on expensive wild goose chases.
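To see how easily chance alone manufactures “insights” at scale, here is a minimal Python sketch. The metric count, sample size, and 0.6 correlation threshold are arbitrary illustrative choices, not anything from the article: it generates a few hundred streams of pure noise and counts how many pairs of them look strongly correlated anyway.

```python
import random
import statistics


def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


random.seed(0)

# 200 "metrics", each just 20 samples of pure noise -- no real relationships.
metrics = [[random.gauss(0, 1) for _ in range(20)] for _ in range(200)]

# Count pairs that look "strongly correlated" (|r| > 0.6) by chance alone.
spurious = sum(
    1
    for i in range(len(metrics))
    for j in range(i + 1, len(metrics))
    if abs(pearson(metrics[i], metrics[j])) > 0.6
)
print(f"{spurious} 'strong' correlations found in pure noise")
```

With this many metric pairs and so few samples per metric, dozens of pairs typically clear the threshold even though every series is random — exactly the kind of coincidence that gets mistaken for cause and effect.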

At Slate, Will Oremus points out that Big Data’s problem isn’t that the data is bad; it’s the over-enthusiastic, fetishistic application of data to everything, as often as possible. During Big Data’s short-lived boom, data simply wasn’t being used in a careful, critical way.

Really, all of that collected data was hard to interpret. When you’re gathering billions of data points – clicks or cursor positions on a website, turns of the drill bit or strokes of the pump – the actual importance of any single data point is lost.

So, what may seem to be a big, important high-level trend might not be a trend at all. There could be problems in the data, an issue with the methodology or some kind of human error at the well site.


Article Credit: Rigzone
