Digital Transformation: Innovation With A Body Count
Apr27


For the majority of the IT industry’s history the focus has been on efficiency: how to do more with less. More recently, under the catchphrase of ‘Digital Transformation (DT/DX)’, the focus has shifted to effectiveness: it’s no longer just a case of doing things right; the emphasis is moving to doing the right things. Increasingly, DT is an extinction-level event — it’s ‘go digital or die’ — and a new survey from Dell EMC reinforces this dire forecast (or incredible opportunity). The business phenomenon of Digital Transformation (AKA digitization or Industry 4.0) and its related technologies — cloud computing, Internet of Things (IoT), big data and analytics (BDA), mobility, social media and security — change everything… and nothing. New tools and new applications drive new ways of doing things, but ultimately, it’s still about selling more goods and services with acceptable margins. According to the ESG 2017 IT Transformation Maturity Curve study, conducted by Enterprise Strategy Group and commissioned by Dell EMC, only 5% of large companies are prepared to meet the IT requirements of the digital business era. Like so many similar studies, this one found that 95% of survey respondents are falling behind best-of-breed competitors who are accelerating their digital business goals through IT transformation, while 71% agree that they will not be competitive without it. Given that 96% of the more mature organizations exceeded revenue targets last year and are more than twice as likely to meet revenue goals, I have to wonder why only 71% seem worried.
As Dell EMC President David Goulden noted in the press release, “… the research shows that most respondents are falling behind a small and elite set of competitors who have cracked the IT Transformation code, and they’re competing more vigorously because of it.” Trey Layton, VP and CTO with Dell EMC’s CPSD, told IT Trends & Analysis the study reinforces the company’s belief that this “is more than a business agenda, it is a digital transformation at the foundation.” A major concern is that enterprises’ foundations typically consist of separate silos, and many employees and executives feel trapped. “If you look at the IT organizations we deal with around the world, they’re in various stages of their journey to transformation… but the power centers are siloed… in compute, storage and network silos…” The biggest concern they’re finding when they talk to customers “is that the future space doesn’t have a place for them from a skill-set perspective,” he said. “CIOs are trying to break down those barriers.” Global Knowledge’s 10th annual IT Skills and Salary Survey, released earlier this month, reported that more...

Read More
Portability Is Essential In The Multi-Cloud Future
Apr20


Pretty much everybody agrees the world is moving to the cloud — public, private (which includes managed as a service), but predominantly the combination of both (hybrid) — and the primary questions are what to move where, and when (how is also a huge concern, but while not easy, it’s really just fiddly bits). Four years ago, Cisco started using the concept of the ‘world of many clouds’ to describe its customer-choice model, and earlier this month data and analytics leader Teradata unveiled database license flexibility across hybrid cloud deployments. There has been an “aggressive uptick in interest, if not deployment, of public cloud” among the company’s Global 1000 customers, said Brian Wood, Director of Cloud Marketing at Teradata. He told IT Trends & Analysis that over 90% of their customers plan to have hybrid IT by 2020, and “85% want to consume as a service.” The company has 100 customers in the multi-petabyte range, with the largest in the 90PB range, so licensing becomes critical, smoothing out the investments, he said. With portability, “it’s have your cake and eat it too.” This massive move to the cloud, with a mix of public, private, hybrid and on-premises resources, means portability — of data, software and licenses — is a critical component. Cloud lock-in is no more palatable than vendor lock-in, and while Teradata is only one vendor with a limited — albeit significant — set of offerings, it says its newest capability, an industry first, gives its data management solution for analytics the ‘very best value proposition.’ “Not only is the database license portable across the hybrid cloud options, but so are workloads, enabled by a common code base in all deployments,” said John Dinning, EVP and Chief Business Officer, Teradata, in a prepared statement.
“This flexibility is a first in our industry and means that data models, applications, and development efforts can be migrated or transferred unchanged across any ecosystem.” Looking ahead reinforces the growing cloud-first future, although, as Gartner stated, this cloud shift is not just about cloud. “This cloud-first orientation will continue to increase the rate of cloud adoption and, consequently, cloud shift,” said Ed Anderson, research vice president. “Organizations embracing dynamic, cloud-based operating models position themselves for cost optimization and increased competitiveness.” Spending on datacenter systems is forecast to be $175 billion in 2017, growing to $181 billion through 2020. However, while DC budgets will be relatively flat, spending on cloud system infrastructure services (IaaS) will grow from $34 billion in 2017 to $71 billion through 2020, accounting for 39% of total spending on datacenter systems. The latest market data/forecasts demonstrate the headlong rush to...
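A quick back-of-the-envelope check of the Gartner figures quoted above (a sketch using only the numbers in this post, not Gartner’s own methodology) shows why the shift matters: datacenter spending is growing at barely 1% a year while IaaS compounds at nearly 28%.

```python
# Back-of-the-envelope check of the Gartner figures quoted above.
dc_2017, dc_2020 = 175e9, 181e9      # datacenter systems spending, USD
iaas_2017, iaas_2020 = 34e9, 71e9    # cloud IaaS spending, USD

def cagr(start, end, years):
    """Compound annual growth rate over the given span."""
    return (end / start) ** (1 / years) - 1

print(f"Datacenter CAGR: {cagr(dc_2017, dc_2020, 3):.1%}")    # ~1.1%
print(f"IaaS CAGR:       {cagr(iaas_2017, iaas_2020, 3):.1%}")  # ~27.8%
print(f"IaaS share of DC spend in 2020: {iaas_2020 / dc_2020:.0%}")  # ~39%
```

The last line also confirms the article’s 39% figure: $71 billion against $181 billion.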

Read More
5G Is Going To Be Huuuuuge… Eventually
Apr13


With almost 40 years of IT reporting experience under my — sadly expanded — belt I’ve covered a number of profound developments and countless others of less import, but the eventual emergence of 5G is expected to CHANGE EVERYTHING. Yes, 5G is just a bigger, faster pipeline, but to paraphrase POTUS, it’s going to be huuuuuge: speeds of 10 to 100 gigabits per second (1,000 times faster than the current US 4G average); latency of less than a millisecond (compared to 4G’s 40ms to 60ms); and support for a million connected devices per square kilometer [that’s about 0.39 of a square mile for the metrically challenged]. 5G use cases include: Internet of Things (IoT); extreme video and gaming applications; explosive data density usage; public safety; Public Switched Telephone Network (PSTN) sunset; and context-aware services. User-driven requirements include: battery life; per-user data rate and latency; robustness and resiliency; mobility; seamless user experience; and a context-aware network. And from the infrastructure perspective, network-driven requirements include: scalability; network capacity; cost efficiency; automated system management & configuration; network flexibility; energy efficiency; coverage; security; diverse spectrum operation; and a unified system framework. However, it is very early in the hype cycle, with final standards 12-18 months away, and products and services expected to trickle out over the next couple of years. The market should become relevant by 2021-22, and forecasts call for 1 billion 5G connections by 2025. So what does that mean to IT and CXOs today? “This is going to be a transformative change even though a couple of years away from mainstream adoption,” said Varun Chhabra, unstructured data expert at Dell EMC. He told IT Trends & Analysis it’s going to be a “gamechanger”.
It will enable enterprises and businesses to provide their customers with “a completely different way to engage with their brands.” While still a work in progress, 5G needs to be a “chameleon” technology that can adapt to the differing demands of wireless services — whether to support high bandwidth, low latency, bursty traffic, ultra-reliable services, or a combination of these capabilities, according to a recent report from the Telecommunications Industry Association. The TIA survey found that operators are uncertain how 5G might prove to be transformative, but ‘history suggests that while it may underachieve relative to expectations in the short term, it will overachieve in the long term.’ As with any significant technology transition, billions of dollars are being spent to either lead the change or at least minimize the threat of becoming roadkill on the faster, broader information highway. Some proof points include: -5G commercial services will launch in 2020 and there will be 24 million 5G subscriptions...
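To put the speed figures quoted above in perspective, here is a quick illustrative calculation using only the numbers in this post (the 5 GB file size is my own example, not from any of the cited reports):

```python
# Illustrative arithmetic using the 5G figures quoted above.
gbit = 1e9                               # bits per gigabit
low_5g = 10 * gbit                       # low end of the 10-100 Gbit/s range
speedup = 1000                           # "1,000 times faster than 4G"

# Implied current US 4G average, taking the low end of the 5G range:
avg_4g = low_5g / speedup                # 10 Mbit/s
print(f"Implied 4G average: {avg_4g / 1e6:.0f} Mbit/s")

# Time to move a hypothetical 5 GB file (40 gigabits) at each rate:
file_bits = 5 * 8 * gbit
print(f"4G: {file_bits / avg_4g:.0f} s")   # 4000 s, roughly 67 minutes
print(f"5G: {file_bits / low_5g:.1f} s")   # 4.0 s
```

The three-orders-of-magnitude gap, not the absolute numbers, is the point: workloads that are impractical over 4G become interactive over 5G.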

Read More
Compuware Revs Up Mainframe Threat Detection By 30%
Apr06


It is generally accepted that the mainframe, AKA Big Iron, is the most secure IT platform available, and a significant reason why: 55% of enterprise apps need the mainframe; 70% of enterprise transactions touch a mainframe; and 70-80% of the world’s corporate data resides on a mainframe. However, the things that are driving today’s dominant IT paradigm, digital transformation — cloud computing, Internet of Things (IoT), big data and analytics (BDA), mobility, social media and security — are also expanding the mainframe threatscape, and Compuware is trying to do something about that. “It is the most secure platform by far,” said Compuware CEO Chris O’Malley. But breaches happen, he told IT Trends & Analysis, and most of them can be prevented. “Most of the breaches are from the inside.” That was the challenge a customer presented to Compuware: identify where and how a recurring breach was taking place. The mainframe software vendor’s response led to Compuware Application Audit, a cybersecurity and compliance solution that ‘enhances the ability of enterprises to stop insider threats by fully capturing and analyzing start-to-finish mainframe application session user activity.’ The new standalone solution is a one-stop shop to: -detect, investigate and respond to inappropriate activity by internal users with access; -detect, investigate and respond to hacked or illegally purchased user accounts; -support criminal/legal investigations with complete and credible forensics; and -fulfill compliance mandates regarding protection of sensitive data. A year ago the company partnered with CorreLog to provide a similar set of capabilities by integrating Compuware’s Hiperstation Application Auditing solution with CorreLog SIEM Agent for z/OS.
The new solution brings a number of advantages, including collaborations with CorreLog, Syncsort and Splunk that enable it to be integrated with popular SIEM solutions such as Splunk, IBM QRadar SIEM and HPE Security ArcSight ESM. While cybersecurity is not and won’t be a core focus of the company, Compuware Application Audit continues the company’s recent practice of making a major product introduction every 90 days. “We’ve put in more innovation in the last 10 quarters than our competitors have done in the last 10 years,” said O’Malley. The mainframe computing environment, with protocols dating back decades, is a new frontier of exploration for both White Hat (ethical) and Black Hat (criminal) hackers. “Ultimately we want people to understand that, because of its widespread usage as a core system in many critical infrastructures from finance to air travel; its relative obscurity; and lack of real wide-spread exposure to the hacking public; this system is rife with opportunities to be further secured and hardened,” said Chad Rikansrud (@bigendiansmalls). What he’s saying is that mainframe computing environments...

Read More
Data Monetization: Big Potential, Bigger Challenges
Feb09


The concept of data monetization — the act of turning corporate data into currency (either actual dollars, or data used as a bartering device or a product or service enhancement) — has been around for the better part of a decade, but it’s still very early days in transforming this concept into a reality, Dell EMC’s Steve Todd, Dell Technologies Fellow, tells IT Trends & Analysis. The power of monetization relies on a variety of data sources brought together into a fluid data lake that facilitates data sharing between lines of business. “We’re seeing a lot of customers that don’t have that data lake strategy.” That’s problem number one, he says. The second big problem is that business executives have not considered business data to have an asset value. “Everybody is trying to get to data monetization but nobody is thinking about tracking that value.” So the second issue is that “data needs to be treated as a capital asset”. Also referred to as infonomics, the economics of information, data monetization is predicted to be huge in the not-too-distant future. IDC figures that revenue growth from information-based products will double that of the rest of the product/service portfolio for one-third of Fortune 500 companies by the end of this year. Gartner predicts that 10% of organizations will have a highly profitable business unit specifically for productizing and commercializing their information assets by 2020. However, the road to data monetization will be bumpy, as Todd noted. While more than 85% of respondents report that their firms have started programs to create data-driven cultures, only 37% report success so far, with key roadblocks including management understanding, organizational alignment, and general organizational resistance. The range of ways to do information monetization is endless, says Gartner VP and distinguished analyst Doug Laney, but the first and biggest vision roadblock is a failure to think beyond selling information.
Rather than limit the economic potential of your information, he advises businesses to think more broadly about “methods utilized to generate profit,” which can range from indirect methods, in which information contributes to some economic gain, to more direct methods, in which information generates an actual revenue stream. Todd collaborated on joint research into data value and data monetization between EMC and the University of San Diego. Back in 2014, EMC did a Big Data survey with Capgemini that found 61% of the more than 1,000 C-suite and senior decision makers acknowledged that Big Data is now a driver of revenue in its own right and is becoming as valuable to their businesses as their existing products and services. “The fact that monetization...

Read More
CybSec Scores An ‘F’
Feb02


With the RSA Conference 2017 just a week away, cybersecurity surveys are showing up everywhere, including Cisco’s 10th study, the 2017 Annual Cybersecurity Report. However, while the networking giant wants to paint a more positive picture, my big takeaway is that the bad guys are winning. There are a number of positive developments in the survey — with input from 3,000 CISOs and security operations (SecOps) leaders from 15 countries, as well as telemetry data — but the key findings are, if not surprising, at the very least cause for increased concern. The key findings Cisco focused on were: -over one-third of organizations that experienced a breach in 2016 reported substantial customer, opportunity and revenue loss of more than 20%; and -90% of these organizations are improving threat defense technologies and processes after attacks by separating IT and security functions (38%), increasing security awareness training for employees (38%), and implementing risk mitigation techniques (37%). The Cisco findings that concerned me were: -just 56% of security alerts are investigated and less than half of legitimate alerts are remediated; -more than 50% of organizations faced public scrutiny after a security breach, with operations and finance systems the most affected, followed by brand reputation and customer retention; -for organizations that experienced an attack, the effect was substantial: 22% of breached organizations lost customers — 40% of them lost more than 20% of their customer base; 29% lost revenue, with 38% of that group losing more than 20% of revenue; and 23% lost business opportunities, with 42% of them losing more than 20%. Cisco is also touting (justifiably) that it has reduced the ‘time to detection’, the window of time between a compromise and the detection of a new threat, from a median of 14 hours in early 2016 to as low as six hours in the last half of the year.
That’s good, but hardly good enough: while the industry average for TTD is 201 days (with a range of 20 to 569 days), in almost all breaches (93%) it took attackers minutes or less to compromise systems, and data exfiltration occurred within minutes in 28% of cases. These issues are not a new story, said Franc Artes, Architect in Cisco’s Security Business Group. He told IT Trends & Analysis that there are ongoing issues around budgets, trained personnel and the complexity of security environments, “but at the end of the day it’s really a human issue. We’re leaving a lot on the cutting room floor.” People are a big problem when it comes to CybSec. They cause most of the security vulnerabilities — 55% of all attacks were carried out by either...
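The nested percentages in Cisco’s breach-impact findings are easy to misread, so here is a quick sketch that combines them (using only the figures quoted above) to show what share of all breached organizations took a heavy, greater-than-20% hit in each category:

```python
# Combining the Cisco percentages quoted above: share of all breached
# organizations that lost more than 20% in each category.
losses = {
    # category: (fraction that lost anything, fraction of those losing >20%)
    "customers":     (0.22, 0.40),
    "revenue":       (0.29, 0.38),
    "opportunities": (0.23, 0.42),
}
for category, (lost_any, lost_heavy) in losses.items():
    share = lost_any * lost_heavy
    print(f"{category}: {share:.1%} of breached orgs lost >20%")
```

Multiplying through, roughly 9-11% of breached organizations suffered a greater-than-20% loss in each category — a material fraction, which is the point the report is making.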

Read More