ERP News

How to solve your big data problem


Big data brings new challenges for business, but Intel’s latest technology can help you meet them and make the most of its potential.

Big data should mean big opportunities for every business, yet for many the potential is never realised. Data that could be mined for value ends up sitting unused in archives. Insights that could be fuelling sales or enhancing customer experiences are never uncovered, or never make it to the screen of someone who could make the difference. Every day, some 2.5 quintillion bytes of data are created, and the volume of data being produced is expected to be 44 times greater in 2020 than it was in 2009.

Yet in 2016 the analysts at Forrester estimated that, on average, between 60% and 73% of all the data within an organisation went untouched by analytics or BI applications, and in some industries the figure may be 95% or higher. To call this a shame is an understatement. EU studies have shown that companies that adopt big data analytics can increase productivity by 5% to 10% over companies that don’t, and that big data practices in Europe could add 1.9% to GDP by 2020.

For many companies, some of this will come down to a lack of processes to capture, refine and structure the right data, or to a lack of the skilled workers needed to extract the most value. Yet technological issues also have a large part to play.

For a start, too many big data initiatives are held back by poor performance and underwhelming results. Big data analytics applications are extremely hardware-intensive, pushing not just compute resources to their limit – running complex operations on huge datasets is never easy – but also storage and network resources. Processor cores can be sitting ready to go, yet they are bottlenecked by the slow transfer of ‘cold’ data from mass storage devices into ‘hot’ data resources, specifically DRAM, which sits in more direct contact with the CPU. Today’s business thrives on speed and agility, and when mining data for insight takes too long, the excitement around these new applications dwindles. BI and analytics projects that should be powering growth become unloved and under-used.
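To put that hot-versus-cold gap in concrete terms, the short Python sketch below (not from the article; the file name and size are arbitrary illustrative choices) times a read of data from mass storage against a scan of the same bytes once they are already resident in DRAM. On a typical system the cold read is the slower step, which is exactly the bottleneck described above, although the operating system’s page cache can narrow the gap on repeated runs.

# Minimal sketch (illustrative only): compare pulling 'cold' data off disk
# with scanning the same 'hot' data once it is already held in memory.
import os
import time

PATH = "cold_data.bin"        # hypothetical scratch file
SIZE = 256 * 1024 * 1024      # 256 MB of sample data

# Create the sample file once, standing in for data in the cold tier.
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(os.urandom(SIZE))

# Cold path: the data must first be transferred from mass storage.
start = time.perf_counter()
with open(PATH, "rb") as f:
    data = f.read()
cold_seconds = time.perf_counter() - start

# Hot path: the same bytes are now resident in DRAM, close to the CPU.
start = time.perf_counter()
checksum = sum(data[::4096])  # touch a sample of the in-memory buffer
hot_seconds = time.perf_counter() - start

print(f"cold read: {cold_seconds:.3f}s  hot scan: {hot_seconds:.3f}s")
# Note: the OS page cache may mask the cold cost on repeated runs, and real
# analytics workloads hit storage far harder than this toy example does.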

What’s more, these initiatives are expensive. Smaller and medium-sized enterprises baulk at the cost of the hardware needed to run these applications, and of the storage and supporting infrastructure required to hold and move all that data. Worse, the real-time, in-memory applications being used by larger enterprises to analyse data at the point it flows into the business are, financially speaking, out of reach. This isn’t simply because of licensing costs – there are less-expensive and open-source alternatives to the big names – but because the processing power, storage and high-capacity DRAM required don’t come cheap. It’s a sizable investment – perhaps too sizable for many smaller businesses.

Read More Here

Article Credit: ITPro
