IoT, Analytics, Fog & Other Data Mythconceptions

Not all data has the same value, or should be treated the same, and with the Internet of Things (IoT) barreling down the Information Highway, how you handle that data deluge has got everybody scrambling – IT vendors, carriers, and the consumers of that data. According to Jim McHugh, VP of Cisco’s UCS and Data Center business, the Internet of Everything is going to be a $19 trillion business, and analytics will be a key component of that.

However, 40% of the data is going to be created at the edge, and it’s not going to make sense to bring all of that data back to the data center and do the analysis there, he told IT Trends & Analysis. Cisco’s alternative is to do it at the edge, by way of a concept it calls “fog computing”, which outgoing CEO John Chambers unveiled at last year’s Cisco Live developer conference.

“The simple concept, as you move forward with the Internet of Everything, is that you have to get the right information at the right time to the right device to the right person to make the right decision,” he said. “It sounds simple, but it is very, very difficult to do, and is almost impossible to do without our architectures and technology.”

Towards the end of last year, Cisco Consulting Services surveyed 1,230 people from seven IoT-intensive industries – Manufacturing, Public Sector, Transportation, Retail, Oil & Gas, Utilities, and Metals & Mining – who identified three key challenges of dealing with IoT-generated data (a rough pipeline sketch follows the list):

- automating the collection of data;

- integrating the data from multiple sources; and

- analyzing the data to derive actionable insights.
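
As a rough illustration of those three steps, here is a toy Python pipeline that collects raw device payloads, integrates them into a common schema, and derives a simple actionable flag. The device names, fields, and the 75°C threshold are assumptions for illustration, not anything from the Cisco survey.

```python
# Minimal sketch of the three challenges: collect, integrate, analyze.
# Device names, field names, and the 75 C threshold are illustrative assumptions.
import json
import statistics
from datetime import datetime, timezone


def collect(raw_messages):
    """Automate collection: parse whatever the devices emit into dicts."""
    readings = []
    for msg in raw_messages:
        try:
            readings.append(json.loads(msg))
        except json.JSONDecodeError:
            continue  # skip malformed payloads rather than halting the pipeline
    return readings


def integrate(readings):
    """Integrate multiple sources: normalize field names and units."""
    normalized = []
    for r in readings:
        temp_c = (r["temp_f"] - 32) * 5 / 9 if "temp_f" in r else r.get("temp_c")
        normalized.append({
            "device": r.get("device_id", "unknown"),
            "temp_c": temp_c,
            "ts": r.get("timestamp", datetime.now(timezone.utc).isoformat()),
        })
    return normalized


def analyze(normalized, threshold_c=75.0):
    """Derive an actionable insight: flag devices running hot on average."""
    by_device = {}
    for r in normalized:
        if r["temp_c"] is not None:
            by_device.setdefault(r["device"], []).append(r["temp_c"])
    return {dev: statistics.mean(vals)
            for dev, vals in by_device.items()
            if statistics.mean(vals) > threshold_c}


if __name__ == "__main__":
    raw = ['{"device_id": "pump-1", "temp_f": 180}',
           '{"device_id": "pump-2", "temp_c": 40}',
           'not-json']
    print(analyze(integrate(collect(raw))))  # -> {'pump-1': 82.2...}
```

The specifics don’t matter; the shape does. Each stage – collect, integrate, analyze – is exactly where the survey respondents reported friction.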

At the company’s 2014 Global Editors Conference in December, Chambers said Cisco was now focused on analytics. “We aren’t there yet, but boy, this (Cisco Connected Analytics for the Internet of Everything) is one big step.” Cisco describes it as a ‘comprehensive data and analytics strategy and solutions portfolio’ that includes ‘easy-to-deploy software packages that bring analytics to data regardless of its location’.

Chambers said data analytics was “the one area we were missing”, and combining it with the Internet of Everything is intended to position the company for a big chunk of the $19 trillion market expected to be available during the next 10 years. Which is not to say that connecting 50 billion things – “on its way to 500 billion” – isn’t good news for either the networking or data center infrastructure portions of Cisco’s not-so-little ($47.1 billion FY14) IT kingdom.

However, the magic – the margins – isn’t in the plumbing, the hardware, but in turning data into knowledge, said Chambers. “$7.3 trillion of the $19 trillion is tied to data, analytics and data in motion.”

The key is to bring the analytics to the edge, to the data. Cisco is pulling it all together – mobility, cloud, social, analytics – “in a way our competitors are not.”

In a recent blog post, McHugh told Cisco partners that analytics represents a truly transformational business opportunity for them, enabling enterprises to make dramatic changes in their business processes that will significantly strengthen their competitive edge.

In March the company announced resale agreements with a number of data-management partners. “Achieving the business outcomes of big data requires an analytics-ready infrastructure that enables a broad range of joint solutions with ecosystem partners. That’s why Cisco is excited to be working with Cloudera, Hortonworks and MapR and reselling their Hadoop-based data management solutions to customers,” said McHugh.

There’s no shortage of data to analyze, said Cisco, even though we already have more data than we know what to do with and IoT is still in its infancy. According to IDC, less than 1% of data is ever analyzed. Then there’s the whole issue of outdated and useless data, a digital landfill that accounts for up to 80% of all digital data – and that’s pre-IoT.

In addition to too much data, there are other IoT pain points, stated Cisco: the network between the edge and the cloud can be relatively expensive (especially if you send all data to the cloud) or has limited capacity (capacity is, of course, correlated with price), and latency to the cloud can be relatively high and often lacks determinism (i.e. changing the color of traffic lights via the cloud can take too long). On the other side, fog, or edge computing, has its own challenges, including limited resources, limited network capacity, security challenges, and resource distribution.
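
That cost-and-latency tradeoff is what fog processing is meant to address: keep the time-critical logic close to the device and push only compact summaries upstream. Below is a minimal Python sketch of that pattern; the window size, the alert threshold, and the local/uplink stubs are assumptions for illustration, not anything Cisco ships.

```python
# Sketch of fog-style pre-processing: handle latency-sensitive logic locally and
# forward only compact summaries upstream. The window size, the 90-unit threshold,
# and the print-based stubs are illustrative assumptions, not a Cisco API.
from collections import deque
from statistics import mean

WINDOW = 60             # readings aggregated at the edge before one uplink message
ALERT_THRESHOLD = 90.0  # act locally, with no cloud round-trip, above this value


class FogNode:
    def __init__(self):
        self.window = deque(maxlen=WINDOW)

    def on_reading(self, value):
        """Called for every raw sensor reading arriving at the edge."""
        self.window.append(value)
        if value > ALERT_THRESHOLD:
            self.act_locally(value)                # deterministic low-latency path
        if len(self.window) == self.window.maxlen:
            self.send_to_cloud(mean(self.window))  # 1 summary instead of 60 points
            self.window.clear()

    def act_locally(self, value):
        print(f"local action: reading {value} exceeded {ALERT_THRESHOLD}")

    def send_to_cloud(self, summary):
        print(f"uplink: windowed average {summary:.1f}")


if __name__ == "__main__":
    node = FogNode()
    for v in [70.0, 95.0, 72.0] * 30:  # simulated sensor stream
        node.on_reading(v)
```

The design choice is the same one the pain points imply: the deterministic, latency-sensitive action never crosses the WAN, and the expensive uplink carries one summary per window instead of every raw reading.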

While numbers like 30-50 billion devices are starting to circulate, the research companies are trying to be a little more guarded in their forecasts. Gartner recently predicted that there will be 25 billion Internet-connected things by 2020, producing close to $2 trillion of economic benefit globally. Currently, the top two verticals using IoT are manufacturing (307 million installed units) and utilities (299 million).

IDC expects installed service provider datacenter capacity consumed by IoT workloads to increase nearly 750% between 2014 and 2019. It noted that the agility and scale required in IoT deployments will ensure that much of that datacenter capacity ends up residing in service provider datacenters, but that IoT will also emerge as the leading driver of new compute/storage deployment at the edge.

Much like the initial response to SDN (software defined networking) – “we don’t know what it is, but we want it” – Gartner recently reported that more than 40% of organizations expect the IoT to have a significant impact over the next three years. However, fewer than 25% of respondents have established clear business leadership for the IoT, either in the form of a single organization unit owning the issue or multiple business units taking ownership of separate IoT efforts.

McHugh agreed that his customers are struggling to analyze this increasing amount of data. As they work out their analytics strategies, some want to bring the data back to the data center and some want to analyze it in real time. Examples include retail, where checkout lines are getting long; healthcare, with patient sensors and monitoring; and oil and gas, where drilling demands immediate analysis while longer-term research can be done back in the data center.
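
That split – real time at the edge versus batch work back in the data center – comes down to a placement decision. The sketch below is a hypothetical illustration of that choice; the job list and the one-second latency cutoff are assumptions, not Cisco guidance.

```python
# Illustrative sketch of the "edge or data center" placement decision.
# The job definitions and the one-second cutoff are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class AnalyticsJob:
    name: str
    max_latency_s: float   # how quickly a usable result is needed
    data_volume_gb: float  # how much data the job touches


EDGE_LATENCY_CUTOFF_S = 1.0


def placement(job: AnalyticsJob) -> str:
    """Latency-sensitive work runs at the edge; bulk research runs centrally."""
    return "edge (fog)" if job.max_latency_s <= EDGE_LATENCY_CUTOFF_S else "data center"


jobs = [
    AnalyticsJob("retail checkout queue alert", 0.5, 0.01),
    AnalyticsJob("patient vital-sign monitoring", 0.2, 0.05),
    AnalyticsJob("drilling telemetry anomaly check", 1.0, 0.1),
    AnalyticsJob("long-term reservoir research", 3600.0, 500.0),
]

for job in jobs:
    print(f"{job.name}: {placement(job)}")
```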

Either way – fog or cloud/data center – Cisco stands to win, and win big, if it can convince the rest of the world to come along for the ride, and can execute its vision. It already connects most of the world: 85% of global IP traffic touches Cisco equipment. “What we need to do at the core is get that infrastructure which is easy to manage… and get vertical expertise to our customers to help them make that decision”, said McHugh.

Win or lose, Cisco will not be alone on the fog computing bandwagon. IBM, along with Cisco, is one of the “thought leaders” in fog computing, according to UBS analyst Steven Milunovich.

And where there are multiple vendors rallying around a new fad, there must be events. Fog computing already has at least one dedicated conference, with the second event – the Fog Computing Conference – scheduled for August 17-20 in Las Vegas. No worries about ground fog interrupting the proceedings, but heat stroke is a definite concern.

Author: Steve Wexler
