IT’s Dirty Little Secret: Data, Damn Data And Big Data!

Big Data – i.e. ridiculous amounts of information – has taken the IT market by storm, with organisations drowning in data and vendors throwing out an increasingly broad portfolio of lifebuoys. The overall objective is to deal with the three (or four) ‘Vs’ of data: volume, variety and velocity (and value), but there are a number of barriers, ranging from lack of budget to lack of skills.

Then there’s the additional challenge that not all data is created equal. According to the 2012 Compliance, Governance and Oversight Council Summit, 25% of data in an enterprise has current business value; 1% has to be preserved for litigation hold; 5% has to be managed to cover compliance requirements; and the remaining 69% has no value whatsoever.

So getting business value out of data is a big and growing problem, which is exacerbated when you don’t get useful information quickly, said Terracotta CEO Robin Gilthorpe in a recent interview. “If you want to extract value from Big Data, then you need to be able to address it in a meaningful timeframe, and those meaningful timeframes are getting shorter and shorter.”

A Software AG company, Terracotta develops in-memory technologies for Big Data, and has more than 2.5 million installations globally, including the majority of Global 1000 companies. Its focus is to improve cycle times and unleash innovation based on that improved speed, he said. The in-memory data grid (IMDG) market is small, but Gartner predicts it is likely to grow fast and to reach $1 billion by 2016.
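The appeal of an in-memory data grid is simple: keep hot data in RAM so reads don't pay a disk or database round trip. As a rough illustration only – this is a toy sketch, not Terracotta's actual API – a minimal read-through cache with a time-to-live might look like:

```python
import time

class InMemoryCache:
    """Toy key-value cache with per-entry TTL, illustrating the
    'keep hot data in RAM' idea behind in-memory data grids."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def put(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, loader):
        """Return the cached value if it is still fresh; otherwise fall
        back to `loader` (standing in for a slow database read) and
        cache the result."""
        entry = self._store.get(key)
        if entry is not None and entry[1] > time.monotonic():
            return entry[0]
        value = loader(key)
        self.put(key, value)
        return value

cache = InMemoryCache(ttl_seconds=5.0)
calls = []

def slow_lookup(key):
    calls.append(key)  # stands in for an expensive backend round trip
    return key.upper()

first = cache.get("order-42", slow_lookup)   # miss: hits the "database"
second = cache.get("order-42", slow_lookup)  # hit: served from memory
print(first, second, len(calls))             # the loader ran only once
```

A production grid adds what this sketch omits: distribution across nodes, replication, and eviction under memory pressure – but the latency win comes from the same memory-first read path.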

Most business leaders are only just beginning to grasp the concept of data velocity, noted Narendra Mulani, managing director of Accenture Analytics, in a recent article. ‘The pace at which data can be gathered, sorted and analyzed to produce actionable insights is increasingly becoming a determinant of success. Those who take too long to generate insights from the data they acquire – to both identify and exploit opportunities at speed – will fall behind their agile rivals.’

Companies have jumped on the Big Data bandwagon but too many forget to focus on velocity, stated another Accenture executive, Nick Millman, at the beginning of August. The need for speed should be obvious, but a lot of Big Data projects produce stale insights because managers have forgotten the importance of “time to insight.”

Gilthorpe sees three Big Data drivers among Terracotta's customers: speed, scale and simplicity. The world is simply moving faster, but that speed also presents a greater opportunity, because you can beat the competition to the punch, serve a customer at the point of interest, or mitigate a fast-breaking risk. As for scale, he said that's a moving target, but whichever approach organisations take, they want one that won't expose them to risk. And complexity has become an unavoidable fact of life, so simplifying things is another key motivator.

Extracting value from useful data in a meaningful timeframe will require a major overhaul of the entire data management construct, said Gilthorpe. Organisations will have to determine what to do with data closer to the ingest point, “bringing it much more to the front end of the data management process.”
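Processing data at the ingest point can be as simple as classifying each incoming record before it lands in expensive analytics storage – echoing the CGOC breakdown above, where roughly 69% of data has no value at all. The sketch below is hypothetical; the field names and rules are illustrative, not drawn from any specific product:

```python
# Hypothetical ingest-point triage: route each incoming record to a
# disposition before it reaches the analytics tier, so no-value data
# is never stored or indexed there. Field names are illustrative.

def classify(record):
    """Return a disposition for an incoming record."""
    if record.get("legal_hold"):
        return "preserve"      # litigation hold: must be kept
    if record.get("regulated"):
        return "compliance"    # retained to meet regulatory rules
    if record.get("business_value", 0) > 0:
        return "analytics"     # worth loading into the fast tier
    return "discard"           # the ~69% with no value

stream = [
    {"id": 1, "business_value": 3},
    {"id": 2, "legal_hold": True},
    {"id": 3},                 # no flags at all
    {"id": 4, "regulated": True},
]

routed = {}
for rec in stream:
    routed.setdefault(classify(rec), []).append(rec["id"])

print(routed)
# {'analytics': [1], 'preserve': [2], 'discard': [3], 'compliance': [4]}
```

The design point is where the check runs, not what it checks: applying even crude rules like these at ingest keeps the downstream, memory-resident tier small enough to deliver answers in the "meaningful timeframe" Gilthorpe describes.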

Author: Steve Wexler
