IBM Enterprise2014: Life After Intel, Disks & (Storage) Hardware
Oct07

LAS VEGAS: It’s the second annual IBM Enterprise event, which combines the IBM System z (mainframe, or Big Iron) Technical University, the IBM Power Systems (the mini-me platform) Technical University and the Enterprise Executive Summit, featuring heavy doses of education, training and certification. However, for an event billed as the premier enterprise infrastructure (i.e. hardware) conference, one that drew an audience of 3,600 customers and partners, up 35% from last year’s debut, software featured prominently in the Day 1 announcements. Technically, the hardware announcements weren’t really made at the conference; they were released on Friday, the day after the announcement that the initial closing of Lenovo’s acquisition of IBM’s x86 server business had been completed.

Now that it has completely severed its ties to the Wintel duopoly that handcuffed its PC and server businesses, Big Blue has announced what it calls a “superior alternative” to x86-based commodity servers, with nearly 20% better price/performance. Built on the IBM POWER8 CPU and OpenPOWER stack, and integrating IBM and other OpenPOWER member technologies, including NVIDIA’s GPU accelerator technology, the Power S824L servers enable clients to run data-intensive tasks on the CPU while offloading other compute-intensive Big Data workloads to GPU accelerators. IBM said these accelerators are capable of running millions of data computations in parallel and are designed to significantly speed up compute-intensive applications. It plans to optimize applications like DB2 to take advantage of GPU acceleration on Power Systems. In addition, future Power systems, due out in 2016, will feature NVIDIA NVLink technology, eliminating the need to transfer data between the CPU and GPUs over the PCI Express interface. Other POWER8 announcements included the IBM Data Engine for NoSQL, the IBM Data Engine for Analytics – Power Systems Edition, Power Enterprise Systems and Power Enterprise Pools. All offerings are scheduled for GA on October 31.

Unlike its commodity server business, which IBM was happy to unload on — excuse me, sell to — Lenovo, storage appears to be a much more attractive opportunity, at least from the flash and software-defined perspectives. IDC reported that sales of Software-Defined Storage Platforms grew more than 15% in the second quarter, and that IBM was the SDS-P leader in its Worldwide Storage Software QView for 2Q14, based on software revenue. Meanwhile, over at Gartner, IBM was crowned the 2013 worldwide leader in flash storage arrays, based on revenue. Being first is good, but when it comes to enterprise storage, disk is still king, and EMC still wears the crown (although Dell claims the top spot for combined external and internal storage shipments, based on capacity). Given the negligible shares flash and SDS account for in the enterprise storage...
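
For readers curious what this CPU/GPU division of labor looks like in practice, here is a minimal Python sketch (using the numba package and assuming an NVIDIA GPU is available) in which the host CPU prepares the data set and a compute-heavy kernel is offloaded to the GPU to run across many threads in parallel. It is a generic illustration of GPU offload, not IBM’s DB2 or Data Engine code.

import numpy as np
from numba import cuda

# Compute-intensive portion: one GPU thread per element, launched in parallel.
@cuda.jit
def scale_and_square(values, out):
    i = cuda.grid(1)
    if i < values.size:
        out[i] = (values[i] * 2.0) ** 2

# Data-intensive portion stays on the CPU: build and prepare the working set.
values = np.random.rand(1_000_000).astype(np.float32)

# Offload: copy to the GPU, launch the kernel, copy the results back.
d_values = cuda.to_device(values)
d_out = cuda.device_array_like(d_values)
threads_per_block = 256
blocks = (values.size + threads_per_block - 1) // threads_per_block
scale_and_square[blocks, threads_per_block](d_values, d_out)
result = d_out.copy_to_host()

The explicit to_device and copy_to_host steps are the PCI Express transfers that the NVLink-equipped systems mentioned above are intended to relieve.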

Read More

Making Virtual Server Recovery-In-Place Viable

Backup technology for virtualized environments has become increasingly advanced. Many organizations have implemented backup applications that are specifically designed to efficiently back up data in a virtualized environment without causing any disruption to application performance. In addition, some backup applications, like Veeam, now allow data residing on a disk-based backup target to be used as a boot device to support instant VM recoveries. To read the complete article, CLICK HERE. NOTE: This column was originally published in the Storage Switzerland Weekly...
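
For illustration, here is a minimal, vendor-neutral Python sketch of the recovery-in-place workflow described above: the VM is booted directly from the disk-based backup target, and its storage is migrated back to production afterward. Every class and function name below is hypothetical and exists only for this sketch; it is not Veeam’s actual API.

from dataclasses import dataclass

@dataclass
class VMRestorePoint:
    vm_name: str
    backup_datastore: str   # disk-based backup target exposed as a datastore

def instant_recover(point: VMRestorePoint, production_datastore: str) -> None:
    # 1. Present the backup repository to the hypervisor as a datastore.
    print(f"Mounting {point.backup_datastore} on the hypervisor")
    # 2. Register and power on the VM straight from the backup copy.
    print(f"Booting {point.vm_name} directly from {point.backup_datastore}")
    # 3. Migrate the running VM's disks back to production storage.
    print(f"Storage-migrating {point.vm_name} to {production_datastore}")
    # 4. Release the backup datastore once the migration completes.
    print(f"Unmounting {point.backup_datastore}")

instant_recover(VMRestorePoint("sql01", "backup-repo-01"), "prod-datastore-01")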

Read More

Abstracting Disk/Tape To Solve The Unstructured Data Problem

Unstructured data is consuming vast amounts of disk capacity in data centers, breaking IT budgets. The sheer number of files that make up that unstructured data also breaks the data protection process. Most of this data has not been accessed in years and should be archived. But the mere mention of “archive” conjures up thoughts of a complex, hard-to-manage collection of components that eventually becomes almost as expensive as the production storage it was designed to replace. As we discuss in our video “Protecting Unstructured Data,” part of this complexity comes from the fact that IT professionals have to choose between two imperfect options when selecting the storage device to hold archive data: disk or tape. They also face the difficult challenge of identifying and actually moving this old data from production storage to archive storage. To read the complete article, CLICK HERE. NOTE: This column was originally published in the Storage Switzerland Weekly...
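
For illustration, here is a minimal Python sketch of the “identify and actually move the old data” step described above: it walks a production share, finds files whose last access time is older than a threshold, and relocates them to an archive mount (which could front disk, tape, or both). The paths and the three-year threshold are assumptions made for the example, not details from the article.

import os
import shutil
import time

PRODUCTION_ROOT = "/mnt/production/share"   # assumed production NAS mount
ARCHIVE_ROOT = "/mnt/archive/share"         # assumed archive target mount
MAX_AGE_SECONDS = 3 * 365 * 24 * 3600       # "not accessed in three years"

def archive_cold_files(src_root, dst_root, max_age):
    cutoff = time.time() - max_age
    for dirpath, _dirnames, filenames in os.walk(src_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if os.stat(src).st_atime < cutoff:       # last access before the cutoff
                rel = os.path.relpath(src, src_root)
                dst = os.path.join(dst_root, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)                 # relocate to the archive tier

if __name__ == "__main__":
    archive_cold_files(PRODUCTION_ROOT, ARCHIVE_ROOT, MAX_AGE_SECONDS)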

Read More

Even if Disk Were Free You’d Still Want Tape

The cost of disk capacity has come down dramatically over the last two decades and, thanks to technologies like scale-out NAS and object storage, managing petabytes of data is now a realistic proposition. But the cost to power and cool this storage, plus the cost of data center floor space, has increased dramatically. The net impact is that even if storage vendors gave disk capacity away, the cost to maintain that capacity could bankrupt you. To read the complete article, CLICK HERE. NOTE: This column was originally published in the Storage Switzerland Weekly...
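
As a back-of-the-envelope illustration of that argument, the short Python sketch below estimates the annual power and cooling bill for keeping a petabyte on spinning disk. Every figure in it (watts per terabyte, cooling overhead, electricity rate) is an assumed placeholder rather than a number from the article; idle tape cartridges, by contrast, draw essentially nothing.

CAPACITY_TB = 1_000            # 1 PB of retained data
WATTS_PER_TB_DISK = 8.0        # assumed power draw of spinning disk per TB
COOLING_OVERHEAD = 1.5         # assumed multiplier for cooling
PRICE_PER_KWH = 0.12           # assumed electricity rate, USD per kWh
HOURS_PER_YEAR = 24 * 365

annual_kwh = CAPACITY_TB * WATTS_PER_TB_DISK * COOLING_OVERHEAD * HOURS_PER_YEAR / 1000.0
annual_power_cost = annual_kwh * PRICE_PER_KWH

print(f"Annual power/cooling cost to keep {CAPACITY_TB} TB on disk: ${annual_power_cost:,.0f}")

Under these assumptions the bill lands around $12,600 a year per petabyte before floor space is even counted, which is the article’s point: the capacity could be free and the carrying cost would remain.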

Read More
American Megatrends’ Limited-Time Free Storage Assessments
Apr14

Although American Megatrends (AMI) products ship in more than half of all computers in the world today, its storage offerings (disk, SSD hybrid, and full-flash SAN and NAS units) account for a much more modest base of 1,126 installations. The company has much bigger ambitions with the launch of StorTrends iDATA (Intelligent Data Analysis Tracking Application), a free software tool designed to assess IT infrastructure performance, capacity and throughput requirements.

“This is something we stumbled on last year when we were introducing our 3500i (storage area network),” said AMI’s Justin Bagby, Director, StorTrends Division. As the company was prepping for the end-of-January release of the SAN, which combined solid state drive caching and SSD tiering in a single storage appliance, it discovered that customers and prospects typically don’t know the details of their infrastructure. A survey of more than 900 of its customers revealed the problem. “They understand the capacity play, but they don’t understand what lies underneath,” he said. AMI’s response was to create the tool, which is up and running in 200 customer betas. “As the market continues to move to flash, this is a tool to help the market understand so they don’t underspend or overspend.”

For a seven-day period, iDATA collects customer data and then reports on critical metrics, including capacity utilization, IOPS usage, reads versus writes for volumes, network bandwidth performance, server statistics and application information. This data is then used by an AMI engineer to help develop a short- and long-term storage strategy customized to each user, said Bagby. When it comes to storage infrastructure, the unknowns can have a serious impact on your environment’s success, he said. “With the StorTrends iDATA software tool, we are eliminating those unknown factors that can cause critical storage headaches down the road.”

“Our research finds that one of the key reasons storage installations fail is that IT administrators lack the accurate information needed to properly provision the performance, capacity and throughput the environment requires to deliver optimum storage efficiency,” said Deni Connor, founding analyst, SSG-NOW, in a canned quote. “Are you considering SSDs in an All-Flash or Hybrid Storage Array? How much hot data or active data do you have within your existing environment and how much SSD or Flash space do you need? Don’t worry, most IT Directors and IT Admins don’t know the answer either. The StorTrends iDATA tool is a free and powerful way to answer this question and many more. It provides the key details organizations need, such as the amount of IOPS a SAN requires, the environment’s expected growth rate, and the amount of hot data or active data – to...
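
As a rough illustration of the kind of sampling such an assessment tool performs (over a far shorter window than iDATA’s seven days), the Python sketch below uses the psutil library to report average IOPS, the read/write mix and capacity utilization on a single host. The sample interval and mount point are assumptions made for the example; this is not AMI’s tool.

import time
import psutil

SAMPLE_SECONDS = 60     # assumed sampling window
MOUNT_POINT = "/"       # assumed volume to report capacity on

before = psutil.disk_io_counters()
time.sleep(SAMPLE_SECONDS)
after = psutil.disk_io_counters()

reads = after.read_count - before.read_count
writes = after.write_count - before.write_count
iops = (reads + writes) / SAMPLE_SECONDS
read_pct = 100.0 * reads / max(reads + writes, 1)

usage = psutil.disk_usage(MOUNT_POINT)

print(f"Average IOPS over {SAMPLE_SECONDS}s: {iops:.1f}")
print(f"Read/write mix: {read_pct:.0f}% reads / {100 - read_pct:.0f}% writes")
print(f"Capacity utilization on {MOUNT_POINT}: {usage.percent:.1f}%")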

Read More