IBM Z Systems Software…

The interactions of computing software and hardware have long provided one of the best examples of truly synergistic relationships. Without hardware, software code is a mass of commands to which nothing listens. Without software, computers (especially enterprise systems) are little more than expensive, ungainly doorstops. Working together, however, they can make magic greater than the sum of their parts.

Despite those interdependencies, public attention has long focused more on hardware than software. That may be because it’s simpler to get your head around physical objects than the abstract code that gives them life: easier to grasp and understand the machine than its soul. Whatever the case, the imbalance is unfair. Fortunately, that situation has been changing for the better as the critical importance of enterprise developers and operations professionals comes into ever sharper focus. Developers, after all, have instigated and helped to drive the success of numerous new technologies and behaviors, including public cloud computing. They’ve also been the core audience for, and interpreters of, the data- and data center-centric solutions inspiring new business growth and market opportunities. At the same time, enterprise operations personnel have never had a higher profile. As enterprises demand ever more efficiency and value from compute resources, IT admins and managers are the frontline troops responsible for meeting those requirements and delivering positive results.

So, it made complete sense for IBM to host an analyst forum, IBM Z: Software for Digital Business Transformation, at its campus in Durham, NC. Along with highlighting the work and insights of its Z mainframe software teams, the event examined how those groups are interacting with developers and operations professionals to deliver strategic value to their enterprise employers. The event was also unique: in the dozens of IBM mainframe events that I’ve attended over the years, software has always been a supporting player, never the star. So, the meeting in Durham also qualified as IBM Z software’s first turn in the spotlight.

To read the complete article, CLICK HERE

NOTE: This column was originally published in the Pund-IT...


Cisco and Collaboration

I’m at C-Scape, Cisco’s big analyst event held during Cisco Live, this week. One of the more interesting sessions came from Jonathan Rosenberg, VP and CTO of Cisco’s collaboration business. What caught my attention is that he opened with Metcalfe’s law, which states that the value of a network is proportional to the square of the number of people on it, and he suggested the law also applies to communications tools. The reason this caught my attention is that most of the folks building collaboration/communications tools seem to believe that just building the tool is all you need. But, as Jonathan pointed out, if you don’t have a critical mass of folks actually using the tool, it is worthless. (A quick arithmetic sketch of the law follows this excerpt.) He made a number of additional interesting observations; let’s cover a few of them.

Tools Are Gaining Communications/Collaboration Features

According to Jonathan, a ton of developer tools are gaining communications and collaboration features, which may be causing some confusion about the purpose of those tools. This doesn’t change them into an alternative to email; the features just enhance the tools. However, they are creating (along with social media) a huge problem with regard to tracking and managing the related conversations. The implication is that there is an increasing need for a tool that can aggregate all of these conversations for the user, kind of like the BlackBerry Hub or Hootsuite for social media, but with far more reach. Cisco is developing just such a tool, one that can aggregate all communications, with WebEx Teams.

To read the complete article, CLICK HERE

NOTE: This column was originally published in the Pund-IT...
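As an aside, the arithmetic behind that critical-mass point is worth spelling out. Here is a back-of-the-envelope sketch of Metcalfe’s law (my framing, not anything Rosenberg presented): the value of a network is usually modeled on the number of possible pairwise connections among its n users, so

$$V(n) \;\propto\; \frac{n(n-1)}{2} \;\approx\; \frac{n^{2}}{2}, \qquad \frac{V(2n)}{V(n)} \;\approx\; \frac{(2n)^{2}}{n^{2}} = 4.$$

In other words, doubling the number of active users roughly quadruples a tool’s value, and a tool with only a handful of users is worth next to nothing, no matter how polished it is.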


Lenovo Wants to Simplify Conference Room Collaboration…

Equipping a conference room used to be really easy. You’d specify a speakerphone for the room, maybe select a couple of whiteboards and a flip chart, specify a conference table and chairs, and that’d be about it. Video conferencing attempted to disrupt this several times, but a lack of compatibility, poor ease of use, and extreme expense tended to keep it from reaching true critical mass. The bigger problem was that the systems tended to be underutilized once installed. The same was true of digital whiteboards: there was a bit of excitement around them, but folks didn’t seem to want to learn how to use them, so they too never really got to critical mass.

The choice seemed to be: keep it simple and field complaints about not having the really expensive tools; make a guess about the advanced technology and then try to defend the expense against little subsequent usage; or pass the task of equipping the conference rooms to someone you really don’t like. Generally, the last choice tended to be the best for you, but it hardly put you in the running for best co-worker of the year.

What makes the Lenovo ThinkSmart Hub 700 interesting is that it addresses most of the pain points I know of in conference room technology without adding a ton of complexity. It seems to follow the KISS rule of “Keep It Simple, Stupid,” which is something we all should have had engraved on our foreheads years ago. Let’s talk about conference room solutions this week.

To read the complete article, CLICK HERE

NOTE: This column was originally published in the Pund-IT...


IBM, NVIDIA, Oak Ridge Labs and the Summit of Supercomputing

Supercomputers and other top-end high performance computing (HPC) installations have long defined and delivered the bleeding edge of compute performance. However, the underlying systems in those projects often reflect and portend broader changes in the commercial IT marketplace. That was certainly the case during the steady move away from the proprietary technologies and highly customized systems that once ruled supercomputing toward servers leveraging Intel and AMD x86 CPUs and other industry-standard components. As a result of those changes, supercomputing and HPC have become increasingly affordable and available for mainstream use cases.

A similar fundamental shift is evident in the new Summit installation revealed this week by the Department of Energy’s (DoE’s) Oak Ridge National Laboratory and IBM, a system that now qualifies as the world’s leading supercomputer. Let’s take a closer look at that announcement.

To read the complete article, CLICK HERE

NOTE: This column was originally published in the Pund-IT...


Intel Makes AI Understandable and Accessible…

Though they populate an industry that prides itself on tackling and solving complex puzzles, many IT vendors prefer simplistic storytelling. That’s partly due to simplicity being easier to sell than complexity, even if it fails to address many or even most of the issues related to complicated engineering efforts. But simple tales also feed the industry’s love of self-promotional mythologies, including the triumph and massive remuneration of plucky entrepreneurs.

I raise this issue because storytelling shorthand also tends to infect areas where accuracy is a critical component, like still-emerging technologies. Keeping things easy may seem beneficial in terms of helping an audience initially understand difficult subjects. But relying on simplistic exposition can also mask over-inflated claims and promote questionable reports about a technology’s potential for commercial success. We’ve seen this dynamic occur many times in the past; virtual reality headsets and associated technologies are just one good example. More than four years after Facebook paid an unprecedented $2B for VR start-up Oculus, a deal that was supposed to rapidly propel VR into the commercial mainstream, the industry and its vendors continue to be hindered by many of the same core technological barriers that existed in 2014.

So, it’s a pleasure to find vendors that are willing and able to discuss complex work in both realistic and understandable terms. That was certainly the case at Intel AI DevCon 2018, the inaugural conference for artificial intelligence (AI) developers that Intel hosted recently in San Francisco.

To read the complete article, CLICK HERE

NOTE: This column was originally published in the Pund-IT...
