Over the past few years, much has been made of the rise of big data. Yet research from TDWI finds that even at organizations where 50% of employees have access to business intelligence tools, only 20% of that group actually use them. Part of the problem is that the systems are often hard to use. Another challenge is low rates of data literacy.
To get around these issues, many organizations have relied on visualizations to display information gleaned from data. While a picture may be worth a thousand words, the same can't always be said for these charts and graphs. Data misinterpretation has a range of causes, including insufficient domain expertise and lack of training in statistical thinking.
All of this suggests that trying to force people to become data literate is an uphill battle. But this is actually becoming less necessary thanks to the rise of artificial intelligence (AI) — and, in particular, advanced natural language generation (advanced NLG), a subfield of AI. Advanced NLG platforms — including, full disclosure, the one my company has built — start by understanding what the user wants to communicate. Then these systems perform the relevant analysis to highlight what is most interesting and important, identify and access the data necessary to tell the story, and finally deliver the analysis in a personalized, easy-to-consume way: as a narrative. Gartner predicts that by 2018, advanced NLG will be integrated into the majority of smart data discovery platforms and that 20% of business content will be generated by machines.
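To make that three-step pipeline concrete — analyze the data for what is most notable, then deliver the finding as a narrative rather than a chart — here is a minimal sketch in Python. The function names, the sample sales data, and the template-based sentence rendering are my own illustrative assumptions, not any particular platform's API:

```python
# Hypothetical sketch of an NLG-style pipeline: surface the most notable
# fact in the data, then render it as a plain-language sentence.

def analyze(sales_by_region):
    """Find the most interesting fact: the largest change vs. the prior period."""
    changes = {
        region: (current - prior) / prior
        for region, (prior, current) in sales_by_region.items()
    }
    region = max(changes, key=lambda r: abs(changes[r]))
    return region, changes[region]

def narrate(region, change):
    """Deliver the analysis as an easy-to-consume narrative sentence."""
    direction = "rose" if change >= 0 else "fell"
    return f"Sales in {region} {direction} {abs(change):.0%} versus the prior period."

sales = {"East": (100, 130), "West": (120, 114), "North": (90, 92)}
print(narrate(*analyze(sales)))  # → Sales in East rose 30% versus the prior period.
```

A real advanced NLG system would, of course, go far beyond a single template — ranking many candidate insights and tailoring language to the reader — but the analyze-then-narrate shape is the same.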