IBM Eases Customers’ Path to AI and Hadoop

Hardware/software/data integration and interdependencies are important for enterprise workloads, but they are especially critical for performance-sensitive applications such as artificial intelligence (AI). Unfortunately, they are also easy points to misunderstand when highly complex technologies are in the early stages of commercial development and deployment, as AI is today. As a result, if or when organizations move to or beyond AI proof of concept (PoC) exercises without clearly understanding the challenges and risks they face, it's all too easy for them to run into problems, fail unnecessarily, then scale back or abandon their efforts. So it's great when vendors help customers anticipate and steer clear of avoidable pitfalls with solutions designed to contend with and overcome fundamental technological complexities. Those points came to mind regarding new offerings from the IBM Cognitive Systems and IBM Analytics groups. Let's consider those announcements separately, along with how they'll affect the company's analytics, AI and other offerings, and customers' related efforts.

IBM Power Systems – A reference architecture for AI

Despite the considerable hype directed at AI's commercial possibilities, the vast majority of businesses are still in the very early stages of exploring the technology and its potential impact on their operations. There are multiple reasons for this, but prime among them is the complexity of most solutions, including hardware/software stacks and workflow/data flow processes. As a result, pursuing and succeeding in AI requires technological sophistication and "roll your own" IT skills that are beyond the capabilities of many, if not most, companies. To help address those issues, IBM's Cognitive Systems group introduced the first iteration (v 1.1) of PowerAI Enterprise and a related reference architecture for on-premises AI deployments.

To read the complete article, CLICK HERE

NOTE: This column was originally published in the Pund-IT...

Read More
CA’s BTCS2.1: Where Do We GrOw From Here?
Jun14

At last week's second annual Built to Change Summit, CA Technologies updated analysts and journalists on where it and the markets it's pursuing (primarily DevSecOps, with a heaping helping of mainframe) are, where they're going, and how the software toolmaker will grab a bigger slice of the rapidly growing digital transformation (DT) pie, which is being driven largely by software and, more specifically, applications. While the money being lavished on DT and DevSecOps is staggering, CA's ability to grow with this opportunity remains at best a work in progress, with relatively flat sales and forecasts. Based on the market data, CA should be in the DT/DevSecOps sweet spot and poised for rapid, sustainable growth. According to a new report, IDC's forecast for the global DevOps software market (in excess of $5.6 billion by 2021) was way off. MarketsandMarkets predicts a much bigger potential upside for CA: a $10.31 billion market by 2023, up from $3.42 billion in 2018. Even better for CA, that growth will be powered 'due to the increase in the adoption rate of Artificial Intelligence (AI) and machine learning among enterprises.' So all that remains to be seen is whether CA can continue to grow with the software-enabled, data-driven digital transformation phenomenon that will run on DevSecOps, while reducing, if not eliminating, the shackles of its legacy businesses and embracing software-as-a-service and more flexible pay-as-you-go consumption models. It faces many competitors, including IBM, Micro Focus (HPE), Puppet, Red Hat, Microsoft and Chef Software, and must continue to innovate at speed and execute with precision and agility. That's a lot to ask, but for a company that's been around since 1976, probably not too much.
Automation, AI and ML were front and center at BTCS 2, and while the company didn't coin the phrase "Software is eating the world, but AI is eating software," it is critical to the company's future, said Ashok Reddy, Group GM, DevOps. He and other company execs made it clear that artificial intelligence and machine learning are being aggressively pursued in a multitude of initiatives and products. Just prior to the summit, CA CTO and EVP Otto Berkes said there is "massive potential" to apply machine learning and machine intelligence, that the company has some "very pragmatic solutions" already in the market, and that it is doing a "lot of experimentation" on machine learning and machine intelligence. They figured prominently in last week's product initiatives, as well as in a number of the company's boundary-stretching efforts, e.g., CA Accelerator, its internal fail-fast venture-capital program, and its Strategic Research initiative, under which a...

Read More

Converged vs Hyperconverged – What’s Driving the Decision

Organizations are being told to digitally transform, become more agile, and respond to the business faster in order to survive in a highly competitive market. One way organizations are digitally transforming is by modernizing their infrastructures, which means shifting from a traditional 3-tier architecture to a solution that integrates compute, storage, networking, and virtualization. Such a solution must deliver a more cloud-like experience on-premises, making the eventual transition to the cloud easier or, better yet, enabling organizations to confidently move cloud-native applications from the public cloud back to an on-premises private cloud. Both converged infrastructure (CI) and hyperconverged infrastructure (HCI) fit the bill, but what is driving organizations to pick one over the other? In our latest ESG research covering both CI and HCI, we asked midmarket (100-999 employees) and enterprise-class (1,000+ employees) organizations why they chose one over the other, and the results were interesting, to say the least.

To read the complete article, CLICK...

Read More

Lenovo Wants to Simplify Conference Room Collaboration…

Equipping a conference room used to be really easy. You'd specify a speakerphone for the room, maybe select a couple of whiteboards and a flip chart, specify a conference table and chairs, and that would be about it. Video conferencing attempted to disrupt this several times, but a lack of compatibility, poor ease of use, and extreme expense tended to keep it from reaching true critical mass. The bigger problem was that the systems tended to be underutilized once installed. The same went for digital whiteboards: there was a bit of excitement around them, but folks didn't seem to want to learn how to use them, so they too never really got to critical mass. The choice seemed to be: keep it simple and field complaints about not having the really expensive tools; make a guess about the advanced technology and then try to defend the expense against little subsequent usage; or pass the task of equipping the conference rooms to someone you really don't like. Generally, the last choice tended to be the best for you, but it hardly put you in the running for best co-worker of the year. What makes the Lenovo ThinkSmart Hub 700 interesting is that it addresses most of the pain points I know of in conference room technology without adding a ton of complexity. It seems to follow the KISS rule of "Keep It Simple, Stupid," which is something we all should have had engraved on our foreheads years ago. Let's talk about conference room solutions this week.

To read the complete article, CLICK HERE

NOTE: This column was originally published in the Pund-IT...

Read More

Cisco Security Synopsis from CiscoLive

Cisco held its annual customer event this week in Orlando, FL, and invited industry analysts to attend. CEO Chuck Robbins highlighted the company's commitment to security in his CiscoLive keynote, while other executives elaborated on security product and service details. After a few days of meetings, I believe Cisco's cybersecurity strategy focuses on:

Product integration. Cisco wants a common cybersecurity product architecture that spans endpoints, networks, data centers, and the public cloud, and that can service most of its customers' cybersecurity technology needs. As a result, Cisco is busy integrating products and services like AMP, Umbrella, Firepower, Talos, etc. Cisco demonstrated its platform and discussed its future roadmap in detail.

Openness and programmability. Beyond gluing its own products together, Cisco's cybersecurity platform is built with connectors and APIs for third-party integration and programmability. To illustrate its technology alliance partner ecosystem, Cisco crowed about dozens of partners, including Anomali, IBM, LogRhythm, and McAfee. Cisco's intent-based networking programmability also extends to security, with service providers taking advantage of APIs and building value-added services on top of Cisco security tools.

To read the complete article, CLICK...

Read More

IBM, NVIDIA, Oak Ridge Labs and the Summit of Supercomputing

Supercomputers and other top-end high performance computing (HPC) installations have long defined and delivered the bleeding edge of compute performance. However, the underlying systems in those projects often reflect and portend broader changes in the commercial IT marketplace. That was certainly the case during the steady move away from the proprietary technologies and highly customized systems that once ruled supercomputing toward servers leveraging Intel and AMD x86 CPUs and other industry-standard components. As a result of those changes, supercomputing and HPC have become increasingly affordable and available for mainstream use cases. A similar fundamental shift is relevant to the new Summit installation revealed this week by the Department of Energy's (DoE's) Oak Ridge National Laboratory and IBM, which now qualifies as the world's leading supercomputer. Let's take a closer look at that announcement.

To read the complete article, CLICK HERE

NOTE: This column was originally published in the Pund-IT...

Read More