HCI: A Cure For IT Complexity?
Feb 08

All-in-one computing, or IT in a box, is experiencing huge growth under the hyperconverged infrastructure (HCI) label. But while HCI has quickly moved from hype to mainstream, it still has a long way to go before the software-centric architecture – which integrates compute, storage and virtualization resources in a single system, typically on x86 hardware – becomes the preferred way to build IT infrastructure.

HCI first showed up on the Gartner Hype Cycle in 2015, paired with Integrated Systems and taking the first step of its Hype Cycle journey, the Innovation Trigger, with the expectation of reaching the Plateau of Productivity in 5-10 years. Just a year later, in Gartner’s 2016 Hype Cycle for Storage Technologies, HCI was poised atop the Peak of Inflated Expectations, with mainstream adoption estimated at less than two years away. On Tuesday Gartner released its inaugural Magic Quadrant for Hyperconverged Infrastructure, which placed Nutanix, along with Dell EMC, VMware and HPE, in its Leaders category. Honorable mentions went to Cisco, Huawei and Pivot3 (Challengers); Stratoscale and Microsoft (Visionaries); and Scale Computing, DataCore and HTBase (Niche Players). The research giant predicts that by 2020, 20% of business-critical applications currently deployed on three-tier IT infrastructure will transition to hyperconverged infrastructure.

According to the latest numbers from IDC, converged systems market revenue increased 10.8% year over year to $2.99 billion during the third quarter of 2017 (3Q17), but hyperconverged systems sales grew 68.0% year over year to $1 billion, or 33.5% of the total market. Dell was the HCI leader – $306.8 million in revenue and a 30.6% share – followed by Nutanix in second place, with $207.4 million in revenue and a 20.7% share. IDC’s list of key players included Atlantis Computing, Cisco, Fujitsu, Gridstore, HPE, SimpliVity, Maxta, Nimboxx, Pivot3, Scale Computing, NetApp, DataCore and VMware.
Another company with HCI aspirations is Microsoft, which entered the HCI space in late 2016 when it made its datacenter OS, Windows Server 2016, generally available. “Hyperconverged infrastructure is a key part of our Windows Server 2016 software-defined strategy spanning software-defined compute, storage, network and assurance,” noted Siddhartha Roy, principal group program manager for high availability and storage in Windows Server. “The converged systems market expanded on multiple fronts, most notably within hyperconverged solutions,” said IDC’s Eric Sheppard, research director, Enterprise Storage & Converged Systems. “While hyperconvergence is not the sole source of market growth, it has undeniably driven an expansion of this market into new environments at a very rapid pace.” 451 Research predicts the HCI market will expand at a compound annual growth rate (CAGR) of 41% through 2020 to just under $6 billion, while Technology Business Research estimated that the...
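The 451 Research projection can be sanity-checked with simple compound growth. In the sketch below, the ~$2.1 billion 2017 base is an illustrative assumption back-solved from the prediction, not a number from the report:

```python
def project(base: float, cagr: float, years: int) -> float:
    """Compound a starting market size forward at a fixed annual growth rate."""
    return base * (1.0 + cagr) ** years

# An assumed ~$2.1B base in 2017, grown at 451 Research's 41% CAGR for
# three years, lands just under the predicted $6B for 2020.
projected_2020 = project(2.1e9, 0.41, 3)
```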

Showdown At The DCIM Corral
Oct 08

Data center infrastructure management (DCIM) specialist Nlyte Software, which just announced the integration of its data center service management (DCSM) solution with Tableau, believes it’s time the data center got its services act together. “The data center in many ways is sort of a cowboy going its own way,” said CMO Mark Gaydos. When it comes to ITSM, the data center is the last frontier, he said. In a lot of places data center administrators are still using spreadsheets and Visio drawing tools. Nlyte calls this Phase 0 of its DCIM Maturity Model – Managed Chaos, keeping the data center running tactically – and it is where most companies are today. The other phases are: Phase 1 – Information & Application Consolidation; Phase 2 – Process Optimization; Phase 3 – Strategic Data Center Planning & ITSM Integration; and Phase 4 – Automation.

Gaydos said Nlyte’s customers drove it to this integration. “At a high level, people are saying the data center has to be part of ITSM and ITIL.” In July Nlyte, which has been selling DCIM software since before the term came into existence, introduced three DCSM solutions – Nlyte for BMC ITSM, Nlyte for HP ITSM and Nlyte for ServiceNow ITSM – that combine DCIM, workflow capabilities and integration into ITSM solutions. “DCSM not only bridges the divide between DCIM and ITSM, but it gives enterprises full control and visibility of data center processes, not just from the data center perspective, but even more importantly, from the purview of IT,” said Rob Neave, Nlyte CTO and co-founder, in a prepared statement.

CA Technologies has been working on the next stage of DCIM for more than a year, what 451 Research calls datacenter service optimization, or DCSO. Last October analyst Rhonda Ascierto noted that CA wanted to take DCIM higher by combining it with other management tools.
“We and others believe this type of DCSO approach will become increasingly common for end-to-end datacenter service management, including costing and best-execution management. CA is developing these and other converged capabilities and interfaces as part of a substantial, multi-year investment. It is a strategy that is ahead of the market. CA could be early with an integrated DCIM-ITSM tool, but it is not likely to be alone.”

According to 451, there are more than 60 DCIM players in the market (and another 100-plus ITSM vendors), with the large datacenter equipment companies – Emerson Network Power, Schneider Electric and Panduit – leading the pack. In addition to CA and Nlyte, competitors include ABB, Siemens, Raritan, Huawei, CommScope (iTRACS) and RiT Technologies. CommScope and HP recently partnered to blend DCIM and ITSM. Raritan even...
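As a rough illustration of what bridging DCIM into ITSM involves, here is a minimal sketch: a piece of DCIM telemetry (rack power utilization) is turned into an ITSM change request rather than left in a spreadsheet. The record types, field names and threshold are invented for illustration and do not reflect Nlyte’s, CA’s or any ITSM vendor’s actual schemas or APIs:

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical record types; not any vendor's actual schema.
@dataclass
class DcimAsset:
    asset_id: str
    rack: str
    power_draw_watts: float
    rack_power_budget_watts: float

@dataclass
class ChangeRequest:
    summary: str
    details: dict = field(default_factory=dict)

def raise_capacity_change(asset: DcimAsset,
                          threshold: float = 0.9) -> Optional[ChangeRequest]:
    """Bridge DCIM telemetry into an ITSM workflow: open a change request
    when a rack approaches its power budget."""
    utilization = asset.power_draw_watts / asset.rack_power_budget_watts
    if utilization < threshold:
        return None
    return ChangeRequest(
        summary=f"Rack {asset.rack} at {utilization:.0%} of power budget",
        details={"asset": asset.asset_id, "utilization": utilization},
    )
```

In a real integration the `ChangeRequest` would be posted to the ITSM system’s API instead of returned; the point is that the event originates in DCIM data but lives its lifecycle in ITSM.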

SDN/NFV A Work In Progress… And What Progress!
Jul 28

The math is simple: mobility plus Big Data plus the Internet of Things/Everything plus analytics means networks – datacenter, cloud and at the edge – must handle bigger workloads faster, and IT budgets can’t even come close to addressing these requirements with current technologies. Which brings us to this week’s OpenDaylight Summit, where software-defined networking (SDN) and network function virtualization (NFV) will be trumpeted as the technologies that can solve this equation. Whose vision(s) of SDN and NFV will prevail is still very much in question, but what isn’t is the need, and the progress that has been made so far.

There are four use cases for SDN/NFV, said Neela Jacques, Executive Director, OpenDaylight, in a phone interview with IT Trends & Analysis. The first is visibility and a better level of unification and orchestration, and while it’s the “least sexy”, it represents the biggest opportunity over the next 3-4 years. Customers “are frustrated with existing network management”. The other three use cases are: “trying to do real time management of your network, which is closest to what we consider traditional SDN”; NFV; and cloud. Each of these use cases bleeds into the others, he said. “At the same time that SDN and NFV are coming up, you’re seeing a shift from proprietary to open-based solutions.” Which leads us to ODL: “OpenDaylight is a highly available, modular, extensible, scalable and multi-protocol controller infrastructure built for SDN deployments on heterogeneous multi-vendor networks. In English, instead of jargon, OpenDaylight is meant to handle any level of networking with pretty much any software or hardware. With top backers such as Brocade, Cisco, Intel, and Juniper, OpenDaylight has the business support needed to back up its technical boasts.”

Back in May Jacques stated that the networking industry has embraced open source as the right path forward for SDN, and that OpenDaylight has become the industry’s “de facto standard” open source SDN project. There are over 300 developers working across company lines to deliver a common and interoperable SDN and NFV platform that anyone can see, contribute to and use. ODL members include Brocade, Cisco, Dell, HP, Intel, IBM, Ericsson, Huawei, Oracle, NEC, Microsoft and VMware. A month ago ODL announced Lithium, its third open SDN software release. It also announced the OpenDaylight Advisory Group (AG), consisting of enterprise, telco and academic users who will provide technical input to the OpenDaylight developer community. Foundational members include representatives from Telefónica I+D, AT&T, Orange, CableLabs, Arizona State University, Comcast, Caltech, China Telecom, Nasdaq, Deutsche Telekom, T-Mobile and China Mobile. According to recent numbers from IHS Infonetics, the global NFV hardware, software and services...
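In practice, a controller like OpenDaylight is consumed through its northbound REST interface. The sketch below builds (but does not send) a GET against a RESTCONF-style topology resource; the host, port, credentials and exact resource path are placeholder assumptions, and the controller’s own RESTCONF documentation should be consulted for real resource names:

```python
import base64
import urllib.request

def topology_request(host: str, user: str, password: str) -> urllib.request.Request:
    """Build (but do not send) a GET for a controller's operational topology
    via a RESTCONF-style northbound API. All endpoint details are
    illustrative placeholders."""
    url = f"http://{host}:8181/restconf/operational/network-topology:network-topology"
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(url, headers={
        "Accept": "application/json",
        "Authorization": f"Basic {token}",
    })

# To actually issue the call against a live controller:
# urllib.request.urlopen(topology_request("odl.example.net", "admin", "admin"))
```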


SAP TechEd 2014 – Taking HANA Further Into the Market

IT solutions seldom follow an entirely linear path, either technologically or commercially. Instead, they proceed in fits and spurts – overcoming points of resistance, adding key new features and innovations, adapting to marketplace dynamics and pursuing new opportunities when and where they emerge. These points were on clear display at SAP’s recent TechEd 2014 conference, the company’s annual get-together for technically inclined customers, partners and IT professionals, where improvements to and presentations concerning SAP’s HANA technologies abounded. But what was particularly interesting about the gathering was a significant shift in how SAP is talking about HANA and explaining its capabilities and value to businesses. Let’s take a closer look at that.

To read the complete article, CLICK HERE.

NOTE: This column was originally published in the Pund-IT...

Object Storage Addresses Runaway Unstructured Data Growth
Jan 15

While not as sexy as the flash mobs surrounding the solid-state memory bandwagon, it looks like 2014 will be a good year for object-based storage too. Also known as object storage, this architecture, which first surfaced in 1996, manages data as objects, as opposed to files (NAS) or blocks (SAN), and it is generating a lot of interest. IDC said the OBS market is still in its infancy but offers a promising future for organizations trying to balance scale, complexity, and costs. The leaders include Amplidata, Cleversafe, Data Direct Networks, EMC, and Scality, with other notables such as Caringo, Cloudian, Hitachi Data Systems, NetApp, Basho, Huawei, NEC, and Tarmin.

Last year OBS solutions were expected to account for nearly 37% of file-and-object-based storage (FOBS) market revenues, with the overall FOBS market projected to be worth $23 billion and to reach $38 billion in 2017, according to IDC. At a compound annual growth rate (CAGR) of 24.5% from 2012 to 2017, scale-out FOBS – delivered as software, virtual storage appliances, hardware appliances, or self-built for delivering cloud-based offerings – is taking advantage of storage’s evolution toward being software-based. “FOBS solutions are much more versatile and will quickly outpace more rigid, hardware-based options,” said Ashish Nadkarni, Research Director, Storage Systems, IDC. “Scale-up solutions, including unitary file servers and scale-up appliances and gateways, will fall on hard times throughout the forecast period, experiencing sluggish growth through 2016 before beginning to decline in 2017.” IDC said emerging OBS technologies include Compustorage (hyperconverged), the Seagate Open Storage platform, and Intel’s efforts with OpenStack. The combined revenue of all OBS vendors is relatively small (but expected to grow rapidly), with a total addressable market (TAM) expected to be in the billions, noted Nadkarni.
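The objects-versus-files distinction comes down to a flat index of opaque IDs with attached metadata, searched by attribute rather than by walking a directory tree. A minimal, purely illustrative sketch of that model (not any vendor’s API):

```python
import uuid

class ObjectStore:
    """Toy model of object storage: a flat namespace mapping opaque IDs
    to data plus metadata, with no directory hierarchy."""

    def __init__(self):
        self._objects = {}  # flat index: object_id -> (data, metadata)

    def put(self, data: bytes, **metadata) -> str:
        """Store data under a generated opaque ID, with arbitrary metadata."""
        object_id = uuid.uuid4().hex
        self._objects[object_id] = (data, metadata)
        return object_id

    def get(self, object_id: str) -> bytes:
        return self._objects[object_id][0]

    def search(self, **criteria):
        """Scan the flat index for objects whose metadata matches."""
        return [oid for oid, (_, md) in self._objects.items()
                if all(md.get(k) == v for k, v in criteria.items())]
```

A file system answers “what is at this path?”; this model answers “which objects have these attributes?”, which is the property Slack points to below.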
“Vendors like EMC and NetApp have not ignored this market – if anything they have laid the groundwork for it.” In December Storage Switzerland analyst Eric Slack wrote that one of the drivers of OBS is the huge growth in unstructured data. “This is one of the primary benefits of object storage, that its flat index is more easily searched by a computer than a traditional file system.” Then there are the economic benefits, he added. Traditional RAID-based storage systems use data replication, creating multiple copies to protect a given data set; this capacity overhead can reach as high as 300%, making such systems unfeasible for large unstructured data stores. “Object storage with erasure coding can actually provide better data protection with overhead under 50%.” In addition, because many object storage systems are software solutions that can run on nodes using low-cost server hardware and high-capacity disk drives,...
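The overhead figures Slack cites are easy to verify. Assuming, for illustration, four total copies of each object for replication and a hypothetical 10-data/4-parity erasure-coding scheme:

```python
def replication_overhead(total_copies: int) -> float:
    """Extra raw capacity beyond the usable data, as a percentage.
    Four total copies of each object means 300% overhead."""
    return (total_copies - 1) * 100.0

def erasure_overhead(data_shards: int, parity_shards: int) -> float:
    """Overhead of a k-data/m-parity erasure code: m extra shards per k."""
    return parity_shards / data_shards * 100.0

# 4-way replication costs 300% extra capacity but survives only 3 lost
# copies; a 10+4 erasure code costs 40% extra and survives the loss of
# any 4 shards -- better protection at a fraction of the overhead.
```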
