Understanding The Hype Around Hyperconverged Infrastructure

There is a lot of hype around hyperconverged infrastructure (HCI). All the big vendors, and a number of lesser-known smaller ones, are in the game. Dell EMC has doubled down on its HCI portfolio investments; NetApp is entering the market by leveraging its SolidFire technology; HPE is investing in growing its SimpliVity line; Cisco acquired Springpath so it could offer its own line, but it also partners with Nutanix, HPE, and just about everyone else! Speaking of Nutanix, it was a category pioneer (along with SimpliVity), and its Dell EMC-branded business is still growing, even though Dell EMC has somewhat competing products in VxRack and VxRail (the three HCI products serve different use cases – a topic for another blog!). Nutanix is also doing a healthy business through Lenovo and its channel partners, and it has an agreement with IBM to offer its HCI on Power Systems. Lesser-known (but fast-growing) Pivot3 just announced 50% growth in bookings! Hitachi Vantara has a product it is also leveraging for Lumada IoT, and VMware sells vSAN for HCI use cases. I'm still just scratching the surface – I know I've left some vendors out; it's a long list!

What's behind all this vendor investment and noise? Lots of user interest. Edwin Yuen and I recently sat down and dug into our new HCI research. In this video, we define what HCI is, discuss why IT organizations are so interested, and look at how HCI will impact more traditional approaches to IT infrastructure. Please watch – I would love to hear your feedback! To read the complete article, CLICK...

Read More

IBM & Anaconda… Cognitive Development

Solutions designed to reduce or eliminate complexity and risk are mainstays in the IT industry, especially when it comes to emerging technologies. The short- and long-term intentions of such efforts are pretty straightforward: vendors aim to ease customers into trying new offerings in hopes that the experience will result in future sales opportunities. That's been the case with a number of efforts IBM has initiated around its Power Systems and technologies, including open-sourcing its POWER processor architecture and helping to found the OpenPOWER Foundation with vendors interested in leveraging Power across a range of innovative new enterprise and data center solutions. But it's also apparent in the company's recently announced partnership with Continuum Analytics to offer the Anaconda Open Data Science platform on IBM's POWER-based Cognitive Systems and to integrate Anaconda with the PowerAI distribution for machine learning. To read the complete article, CLICK HERE. NOTE: This column was originally published in the Pund-IT...

Read More

SAP/IBM Announce T&E Program for HANA on Power

Last week’s SAP Sapphire NOW user and partner conference in Orlando was something of a love fest for the company’s HANA in-memory database technology. That was hardly surprising, since SAP has made it clear that HANA represents far more than just an innovative solution for a variety of big data, analytics, and business intelligence workloads. In fact, the HANA platform, along with an evolving number of complementary solutions, including the new cloud-based PaaS and subscription offerings SAP announced in March and April, represents the elemental future for the company and its customers and partners. That’s a significant commitment by most any measure, so we were intrigued by a particular element in one of the press releases SAP published during the conference: the company and IBM have begun a testing and evaluation (T&E) program for HANA running on IBM’s POWER7+ and POWER8-based systems. Specifically, the program allows select IBM and SAP clients to deploy HANA solutions on Power Systems in SUSE Linux-based environments. Why is this of interest? Because until now, SAP HANA deployments have all been on Intel x86-based systems and appliances. For more information, CLICK HERE. NOTE: This column was originally published in the Pund-IT...

Read More

Even if Disk Were Free You’d Still Want Tape

The cost of disk capacity has come down dramatically over the last two decades, and, thanks to technologies like scale-out NAS and object storage, managing petabytes of data is now a realistic proposition. But the cost to power and cool that storage, plus the cost of data center floor space, has increased dramatically. The net impact is that even if storage vendors gave disk capacity away, the cost to maintain that capacity could bankrupt you. To read the complete article, CLICK HERE. NOTE: This column was originally published in the Storage Switzerland Weekly...
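The core claim here – that operating costs can dwarf acquisition costs – can be illustrated with a back-of-envelope sketch. Every input below (drive capacity, wattage, cooling overhead, electricity price, rack density, floor-space cost) is a hypothetical assumption chosen purely for illustration, not a figure from the article:

```python
# Back-of-envelope: annual cost to operate 1 PB of "free" disk.
# All inputs are hypothetical assumptions for illustration only.

DRIVE_TB = 8              # assumed capacity per drive (TB)
DRIVE_WATTS = 9           # assumed active power draw per drive (W)
PUE = 1.8                 # assumed power usage effectiveness (cooling overhead)
KWH_PRICE = 0.12          # assumed electricity price ($/kWh)
DRIVES_PER_RACK = 480     # assumed dense-storage rack
RACK_SPACE_COST = 3000    # assumed annual floor-space cost per rack ($)

petabytes = 1
drives = petabytes * 1000 / DRIVE_TB            # drives needed for 1 PB raw
power_kw = drives * DRIVE_WATTS * PUE / 1000    # IT load plus cooling
energy_cost = power_kw * 24 * 365 * KWH_PRICE   # annual electricity bill
racks = drives / DRIVES_PER_RACK
space_cost = racks * RACK_SPACE_COST
total = energy_cost + space_cost

print(f"drives needed: {drives:.0f}")
print(f"annual power + cooling: ${energy_cost:,.0f}")
print(f"annual floor space:     ${space_cost:,.0f}")
print(f"annual operating cost per PB: ${total:,.0f}")
```

The per-petabyte number looks modest in isolation; the article's point is that it recurs every year, scales linearly with retained capacity, and includes none of the admin, replication, or refresh costs – which is why cold data belongs on tape even when disk capacity itself is cheap.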

Read More

Emerson… Forecasts Major Data Center Challenges & Changes

Emerson Network Power released “Data Center 2025: Exploring the Possibilities,” a report summarizing four months of global research designed to identify industry experts’ vision of the data center in the year 2025. Based on interviews with more than 800 data center professionals from around the world (and contributions from dozens of others who shared their thoughts via email and video), the results ranged from the expected to the ambitious. What was clear throughout is that IT experts believe modern data centers will undergo significant, even massive, changes over the next decade. Viewed collectively, the survey results indicate that most in the field remain bullish on the data center industry and continued IT innovation. For example, on average, experts predict density in 2025 will climb to an extremely ambitious (some would say unrealistic) 52 kW per rack. According to the Data Center Users’ Group (sponsored by Emerson Network Power), average density has remained relatively flat since peaking around 6 kW nearly a decade ago, but some experts anticipate a radical upswing. For more information, CLICK HERE. NOTE: This column was originally published in the Pund-IT...

Read More