IBM Advances Cluster Virtualization…

On the classic Groucho Marx quiz show You Bet Your Life, if a contestant accidentally said the “secret word” of the day, he or she would win a prize. There’s no prize included in this commentary, but the secret word of the day is virtualization, especially as it relates to IBM’s new HPC and AI solutions. IBM defines virtualization as “a technology that makes a set of shared resources appear to each workload as if they were dedicated only to it.” IT is very familiar with this concept, with operating system-level virtualization, server virtualization, network virtualization, and storage virtualization all continuing to permeate computing infrastructures and the collective consciousness. So, it should come as no surprise that IBM is advancing the concept of cluster virtualization in its latest announcement, tying it closely to cloud and cognitive computing.

IBM’s cluster virtualization initiative combines products from its Spectrum Computing family, namely Spectrum LSF, Spectrum Symphony, and Spectrum Conductor, along with overall cluster virtualization software (Spectrum Cluster Foundation) to manage the whole process. That includes the storage delivered through IBM Spectrum Scale, a member of the IBM Spectrum Storage family. The goal of this approach is to automate the self-service provisioning of multiple heterogeneous high-performance computing (HPC) and analytics (AI and big data) clusters on a shared, secure, multi-tenant compute and storage infrastructure. Doing so delivers multiple benefits to technical computing end users, including data scientists and HPC professionals.

The announcement focuses on these products: IBM Spectrum LSF, IBM Spectrum Conductor, and IBM Spectrum Scale.

NOTE: This column was originally published in the Pund-IT Review.


IBM Introduces Transparent Cloud Tiering for DS8880…

Archiving data from mainframe storage systems has traditionally been limited to an on-premises physical or virtual tape tier. However, IBM has overcome that limitation with the introduction of Transparent Cloud Tiering (TCT) software that runs on DS8880 storage systems for z Systems. TCT widens the archiving storage targets to cloud environments, and that brings with it the benefits of hybrid cloud, such as creating more and better options for managing both capital and operating expenses.

Why is IBM doing this? Data tends to change in value over time. Keeping older data on primary production storage is expensive not only in terms of storage costs, but also in terms of the resources needed to manage that data (such as for backup and disaster recovery). The solution is to archive less frequently used data to a different (and less expensive) tier of storage, while making sure that the information can be easily recalled upon request. In the mainframe world, archiving has been optimized only for the use of tape. That means an on-premises solution, which, while useful, lacks some of the benefits of a hybrid cloud solution such as the one IBM TCT supports. Let’s consider that more closely.
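The age-based tiering idea described above (move cold data to a cheaper tier, keep it recallable) can be sketched in a few lines of Python. This is a minimal illustrative policy, not IBM's TCT implementation or API; the 180-day threshold, the DataSet class, and the function names are all hypothetical.

```python
import time
from dataclasses import dataclass

ARCHIVE_AGE_DAYS = 180  # hypothetical policy threshold, not an IBM default


@dataclass
class DataSet:
    name: str
    last_access_epoch: float  # seconds since the Unix epoch
    tier: str = "primary"


def tier_candidates(datasets, now=None):
    """Return primary-tier datasets whose last access is older than the
    policy threshold, making them candidates for the archive tier."""
    now = time.time() if now is None else now
    cutoff = now - ARCHIVE_AGE_DAYS * 86400
    return [d for d in datasets
            if d.tier == "primary" and d.last_access_epoch < cutoff]


def archive(datasets, now=None):
    """Move cold candidates to the cheaper archive tier; recall would
    simply reverse the move when the data is requested again."""
    moved = tier_candidates(datasets, now)
    for d in moved:
        d.tier = "archive"
    return moved
```

In a real tiering product the "move" is a data transfer to tape or cloud object storage with metadata left behind for transparent recall; the sketch only captures the policy decision that the column describes.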


DEW17: Emphasizing Fundamental Storage Principles

Most large IT vendor conferences — especially those held in Las Vegas — tend to resemble a three-ring information-overload circus. Attendees can easily be overwhelmed by the breadth and depth of what is being presented. At times, focusing on the tried-and-true basics helps to refresh and clear one’s mind. As an example, let’s turn to the storage solutions that were highlighted this week at Dell EMC World.

Naturally, Dell EMC continues to evolve its storage portfolio, but it is not doing so by abandoning the core storage products and principles that have been fundamental to its success for more than two decades. Dell Technologies is now the rubric under which all the businesses of the company fall. Dell EMC is the data center infrastructure business; it incorporates Dell’s business servers and a midrange storage product line, as well as EMC’s traditional high-end, midrange, and scale-out (unstructured data) storage systems. In addition, converged infrastructure solutions, including hyperconverged infrastructure (HCI) offerings, fall within the purview of Dell EMC.


IBM Continues its Leadership in Software-Defined Storage

“We’re #1!” is the proud cry that every team and organization would like to make, and IBM can claim that distinction for software-defined storage. The evidence comes from market research firm International Data Corporation (IDC), which has ranked IBM #1 in the worldwide software-defined storage (SDS) market for the third straight year.

This is a meaningful distinction, as the SDS market is large and is expected to continue its rapid growth. IDC estimates that the market for SDS will grow at a 40% CAGR over 2015-2020, reaching $1 billion in 2016. That is the fastest growth of any of the seven storage software functional markets that IDC tracks, and it shines in comparison to what IDC describes as the low growth of storage replication and infrastructure solutions. In short, IBM has chosen the right functional storage market horse to ride (although, of course, it also participates in the other functional markets, where it is among the leaders in all storage software categories, as well as being a large full-spectrum IT vendor).


IBM Continues to Advance Its Strategic Storage Investments

In 2015, IBM announced that it would spend $1 billion on software-defined storage (SDS) R&D over the coming five years. Recent enhancements in its SDS portfolio, namely the IBM Spectrum Storage family, reflect how that ongoing investment is benefiting storage users and IBM customers.

IBM Spectrum Storage family: Responding to changing times

Regarding IBM’s Spectrum Storage family, recall what SDS is and why just one product won’t do. SDS decouples the software that manages storage from the underlying physical storage hardware, which increases the flexibility of deployment. Customers can choose to use the software alone with virtually any heterogeneous storage systems, i.e., not necessarily IBM storage, although all or part of the mix could include IBM equipment.

A second SDS deployment model is as an appliance. In the case of selected IBM Spectrum Storage products, the software can be sold with specific IBM hardware. That makes it a more traditional approach, but it also means that the software can take fuller advantage of the underlying physical hardware. An example is the tight coupling of the IBM DeepFlash 150 with IBM Spectrum Scale, which results in a high-capacity, all-flash (meaning high-performance) system (called DeepFlash Elastic Storage Server) with scale-out file management capabilities.

A third SDS deployment model is as the foundation of a cloud service. Since the “cloud,” in its many permutations and manifestations, continues to generate ever more applications and data, SDS can provide the support needed for the accompanying storage systems.

But why the need for multiple products? The answer is that the variety of applications and data types continues to explode in numerous dimensions, all of them additive, with none taken away. Traditional block-based, structured-data online transaction processing systems and file-based systems, such as those managing semi-structured data for document management, are still critically important. But now big data, the Internet of Things, Web-based applications, and mobile applications are taking center stage as well.
