Follow the Money: Big Data ROI and Inline Analytics

Earlier Wikibon research showed that the average return on Big Data projects was 55¢ for each $1 spent. During 2014, Wikibon focused on in-depth interviews with organizations that had achieved success and high rates of return. These interviews surfaced an important pattern: winners focused on operationalizing and automating their projects. They used analytics to drive algorithms that connected directly to, and triggered automatic change in, the operational systems of record. These algorithms were usually developed and supported by data tables derived using DeepData Analytics from Hadoop systems and/or data warehouses. Instead of focusing on enlightening the few with pretty historical graphs, successful players focused on changing the operational systems for everybody, and managed the feedback and improvement process across the company as a whole.

The technical requirement of Inline Analytics systems is to enable real-time decisions within current or new operational systems, without the traditional ETL (Extract, Transform & Load) processes that take hours, days or weeks to migrate operational and other source data to data warehouse(s) and/or Hadoop systems. The software solutions developed by the winners deployed some or all of several advanced techniques, including parallelism, data-in-memory techniques and high-speed flash storage. Wikibon has researched different Inline Analytics technology approaches, including Aerospike, IBM BLU, the Oracle 12c in-memory option and SAP HANA.
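The pattern can be made concrete with a minimal sketch. The example below is hypothetical (the table name, categories and thresholds are all invented for illustration, and Wikibon's research does not prescribe this design): a decision table produced offline by batch DeepData analytics is held in memory inside the operational system, so each transaction is scored with a lookup at transaction time rather than a round-trip through an ETL pipeline.

```python
# Hypothetical sketch of an inline analytic decision embedded in an
# operational transaction path. RISK_THRESHOLDS stands in for a data
# table derived offline by DeepData analytics (e.g. from Hadoop or a
# warehouse) and refreshed periodically; values are illustrative only.
RISK_THRESHOLDS = {
    "electronics": 500.0,
    "groceries": 2000.0,
    "travel": 3000.0,
}

DEFAULT_THRESHOLD = 1000.0  # fallback for categories the batch job hasn't scored

def approve_transaction(category: str, amount: float) -> bool:
    """Real-time decision inside the system of record: an in-memory
    lookup against analytics-derived data, with no ETL hop."""
    threshold = RISK_THRESHOLDS.get(category, DEFAULT_THRESHOLD)
    return amount <= threshold

# A $700 electronics purchase exceeds its category threshold and is
# flagged; the same amount in groceries is well under its threshold.
print(approve_transaction("electronics", 700.0))  # False
print(approve_transaction("groceries", 700.0))    # True
```

Improving the Inline Analytics then means regenerating the table from fresher or richer data streams, which is exactly the feedback cycle the success metrics below measure.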

The key finding of this research is that the sponsors of Big Data projects should measure success by:

The time taken to create inline analytic algorithms that automate decision-making directly within the operational systems of record, and

The effectiveness of supporting DeepData analytics in reducing the cycle time for improving the inline analytic algorithms and in finding new data streams to drive them.

