HPE InfoSight Brings Autonomous DC (i.e. Skynet) Closer
Dec07

The upcoming end of Meg Whitman’s reign is not the only Big Bang due out of Hewlett Packard Enterprise early next year: in January the drastically slimmed-down enterprise IT powerhouse will roll out a 3PAR-enabled artificial intelligence recommendation engine (InfoSight AIRE) that will take HPE closer to the autonomous datacenter, according to company officials.

“InfoSight is AI for the datacenter,” HPE’s Gavin Cohen, VP, Product and Solutions Marketing, Storage, told IT Trends & Analysis. “That’s something Nimble started building on from the start.” HPE announced the completion of its $1.2 billion acquisition of Nimble Storage in April, and while that significantly beefed up its flash and cloud storage assets, the company said it would be leveraging InfoSight across both its storage and server portfolios.

HPE President Antonio Neri, Whitman’s CEO successor-to-be (as of February 1), called InfoSight the “crown jewels” of the Nimble acquisition and said the AI power of the platform gives HPE and its partners a big competitive advantage over any and all rivals. “Nobody has this,” he said in a recent interview. The predictive analytics capabilities are sure to power dramatic reductions in storage total cost of ownership (TCO) for businesses of all sizes, he said. “It delivers the best performance with the best uptime and lowest TCO, optimized for the specific workloads that run on the platform. The customer gets the best experience at the lowest cost.”

Beyond storage lie servers and ultimately the datacenter, and bringing AI and predictive analytics to the datacenter is not only necessary for protecting existing revenue streams, but essential to the autonomous datacenter. While we hopefully won’t get a Skynet-style rise (and fall) of the machines à la Terminator, AI in the datacenter is coming quickly.
By 2019, 40% of digital transformation initiatives will use AI services; by 2021, 75% of commercial enterprise apps will use AI; and the majority of adopters have seen quantified returns that meet or exceed expectations. “AI is a positive force for change,” stated Mark Purdy, Managing Director, Economic Research, Accenture Research. “It has the potential to markedly increase growth rates and substantially raise economic output across industries, while helping organizations to more easily rotate to the new way of doing business.” A recent survey found that AI could boost average profitability rates by 38% and add $14 trillion to economic output by 2035.

But all that remains in the future; today we have AI-powered storage, or at least Nimble, and shortly 3PAR, and the benefits are equally compelling. The AI and predictive analytics capabilities of InfoSight reduce the time spent troubleshooting issues by up to 85% and help to deliver greater than 99.9999% of guaranteed...
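For context on what a greater-than-99.9999% (“six nines”) availability figure implies, here is a quick back-of-the-envelope calculation (ours, not from the article) that converts an availability percentage into the downtime it permits per year:

```python
# Convert an availability percentage into its annual downtime budget.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000 (ignoring leap years)

def downtime_seconds_per_year(availability_pct: float) -> float:
    """Seconds of downtime allowed per year at a given availability %."""
    return SECONDS_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.9, 99.999, 99.9999):
    print(f"{pct}% availability -> {downtime_seconds_per_year(pct):,.2f} s/year")
```

At six nines, the budget works out to roughly 31.5 seconds of downtime per year, versus about 8.8 hours per year at the more common 99.9%.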

Read More

IBM Advances Cluster Virtualization…

On the classic Groucho Marx quiz show You Bet Your Life, if a contestant accidentally said the “secret word” of the day, he or she would win a prize. There’s no prize included in this commentary, but the secret word of the day is virtualization, especially as it relates to IBM’s new HPC and AI solutions.

IBM defines virtualization as “a technology that makes a set of shared resources appear to each workload as if they were dedicated only to it.” IT is very familiar with this concept, what with operating system-level virtualization, server virtualization, network virtualization, and storage virtualization all continuing to permeate computing infrastructures and the collective consciousness. So it should come as no surprise that IBM is advancing the concept of cluster virtualization in its latest announcement, tying it closely to cloud and cognitive computing.

IBM’s cluster virtualization initiative combines products from its Spectrum Computing family, namely Spectrum LSF, Spectrum Symphony, and Spectrum Conductor, along with overall cluster virtualization software (Spectrum Cluster Foundation) to manage the whole process. That includes the storage delivered through IBM Spectrum Scale, another member of the IBM Spectrum Storage family. The goal of this approach is to automate the self-service provisioning of multiple heterogeneous high-performance computing (HPC) and analytics (AI and big data) clusters on a shared, secure, multi-tenant compute and storage infrastructure. Doing so delivers multiple benefits to numerous technical computing end users, including data scientists and HPC professionals.

The announcement focuses on these products: IBM Spectrum LSF, IBM Spectrum Conductor, and IBM Spectrum Scale. For more information, CLICK HERE. NOTE: This column was originally published in the Pund-IT...

Read More

Storage…The Impact of CI/HCI…

ESG recently completed in-depth research on the state of the storage market: its own technologies and market trends, as well as its key intersections with other notable IT implementations and shifts. We are presenting some of the extended highlights from the findings in multiple ESG Briefs (each focused on a particular topic), as well as tighter summaries of those Briefs in accompanying ESG videos. These will be rolling out over the next few weeks, and we’ll capture all the available links in these blogs each time a new piece is posted. To read the complete article, CLICK...

Read More

Storage Trends Research – Flash Storage…

ESG recently completed in-depth research on the state of the storage market: its own technologies and market trends, as well as its key intersections with other notable IT implementations and shifts. We are presenting some of the extended highlights from the findings in multiple ESG Briefs (each focused on a particular topic), as well as tighter summaries of those Briefs in accompanying ESG videos. These will be rolling out over the next few weeks, and we’ll capture all the available links in these blogs each time a new piece is posted. To read the complete article, CLICK...

Read More

IBM’s LTO-8 – Building a Bright Future for Tape Storage

Hang around the IT industry long enough and you notice that rumors of the impending demise of some product or class of products are always making the rounds. Sometimes they’re honest opinions expressed by canny industry-watchers. More often they reflect the hopes of desperate vendors trying to poke holes in competitors’ cash cows and/or businesses. Most importantly, they’re generally wrong.

Why do I say that? Because if you examine the evidence, you find that technologies tend to die for one of two reasons. The first is vendor-led extinction, where a vendor decides to pull the plug on a given technology (or the market pulls the plug on the vendor). For example, HP’s 2002 acquisition of Compaq and its subsequent adoption of Intel’s Itanium CPUs resulted in the company killing its own PA-RISC processors, as well as Compaq’s Alpha and Tandem silicon. Technologies also die when they fail to keep pace with alternatives or lose the faith of core customers. Data storage technologies provide a rich smorgasbord of examples, including the appearance/disappearance of 8-inch, 5¼-inch and 3½-inch floppy disks, and Iomega’s Zip and Jaz drives, all of which were driven under by decreasingly costly/increasingly popular HDD and CD-RW technologies.

Which brings us to tape storage, particularly datacenter-focused tape technologies. Those have been under a death watch since 2002, when EMC introduced its Centera platform, the industry’s first HDD-based solution for data archiving, long a tape bastion. More to the point, despite surviving well beyond competitors’ hopes and expectations, tape storage continues to evolve, as evidenced by the new-generation LTO-8 offerings just announced by IBM. To read the complete article, CLICK HERE. NOTE: This column was originally published in the Pund-IT...

Read More