Are you going to power and cool the lot?
For all of its advances, the IT sector's first five decades could be characterised as the electronic storage of systems of record.
The move to the electronic era saw paper ledgers, tax returns and bank statements stored and archived safely and legally on machines instead of in paper files.
Rows of data were worked on in spreadsheets and stored in SQL relational databases. Since then, data has been everywhere: in data warehouses, in data lakes, up mountains where it has been mined, and in pools. It is now so voluminous that it can even be measured in something called a brontobyte, though it is generally accepted that we are in the zettabyte era.
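For a sense of scale, here is a minimal Python sketch of the decimal data-volume units in question. The values are the commonly cited SI-style figures; note that "brontobyte" is an informal name, not an officially standardised prefix.

```python
# Decimal (SI-style) data-volume units; "brontobyte" is informal.
UNITS = {
    "gigabyte": 10**9,
    "terabyte": 10**12,
    "petabyte": 10**15,
    "exabyte": 10**18,
    "zettabyte": 10**21,
    "yottabyte": 10**24,
    "brontobyte": 10**27,  # informal, not an official SI prefix
}

def zettabytes_in(unit: str) -> float:
    """How many zettabytes fit into one of the given unit."""
    return UNITS[unit] / UNITS["zettabyte"]

print(zettabytes_in("brontobyte"))  # one brontobyte is a million zettabytes
```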
More recently, this digital era of data and big data has seen enterprises embark on a quest to extract value from the information in order to make accurate forecasts.
The Holy Grail of data value was first pursued within a branch of mathematics: probability and statistical analysis.
Specialists interrogated the data for patterns in order to detect fraud, measure marketing campaign effectiveness or grade insurance claim assessments.
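One classic statistical pattern hunt of this kind is outlier detection: transactions that sit many standard deviations from the mean are flagged for manual fraud review. A minimal Python sketch, with made-up transaction amounts and an illustrative threshold:

```python
import statistics

def flag_outliers(amounts, threshold=2.5):
    """Flag values whose z-score exceeds the threshold.

    A very simple statistical approach to anomaly detection:
    amounts far from the mean (measured in standard deviations)
    become candidates for manual fraud review. The threshold is
    illustrative, not a recommended production setting.
    """
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    if stdev == 0:
        return []
    return [a for a in amounts if abs(a - mean) / stdev > threshold]

# Hypothetical card transactions: nine routine purchases and one spike.
transactions = [12.5, 9.9, 11.2, 10.4, 13.1, 10.8, 950.0, 11.7, 12.0, 10.1]
print(flag_outliers(transactions))  # → [950.0]
```

Real fraud-detection systems layer far more sophisticated models on top, but the underlying idea of scoring deviations from an expected distribution is the same.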
Probability and statistical analysis (a tough and not very popular career choice) became Business Intelligence, which in turn evolved into Data Science (highly sought after and well paid).
What a data scientist is and does has been described in many ways. The role requires a deep understanding of probability and statistics, domain expertise in a field such as finance or health, and a high level of expertise in machine learning and in the workings of big data frameworks, whether commercial such as SAP HANA or open source such as Hadoop, along with their associated platforms, languages and methods.
Analytics now spans five categories: descriptive, diagnostic, predictive, prescriptive, and cognitive, with each building on the last.
The current effort to gain value from business data is called advanced analytics.