Why Lenovo Dominates the SAP HANA Market

It is kind of amazing how much Lenovo has changed in the last 15 years or so. In 2001, I doubt most of us had even heard of the company; then they bought the IBM PC product group, along with one of the most iconic PC brands: ThinkPad. Most recently they bought IBM's System x x86 server business, and on a call last week SAP confirmed that Lenovo sells over 50% of the solutions for SAP HANA. SAP HANA is one of the leading analytics engines, and it has been designed and tuned to run on x86 platforms. These implementations tend to be large and sell well into the enterprise space, which, outside of PCs, hasn't historically been a Lenovo strength. Consider also that IBM's System x business was under-resourced to a near-starvation level and carried massive IBM overhead, so it is a wonder it even operated, let alone came to dominate a critical market segment like SAP HANA. I think it was the result of three things: applied stress, an unusually close relationship with SAP, and Intel. Read more at http://www.tgdaily.com/enterprise/157256-why-lenovo-dominates-the-sap-hana-market#M56KgYATXqHaJmD8.99
NOTE: This column was originally published in the Pund-IT Review.

Read More

In-The-Moment OI Poised For Rapid Growth

In an increasingly data-driven economy, making the most effective use of information in a timely fashion can be the difference between success and failure. Real-time analytics is table stakes today, says in-memory computing technology vendor ScaleOut Software, but the trend you should be paying attention to is in-the-moment operational intelligence, according to CEO and founder Dr. Bill Bain. 'While business intelligence provides insights for static datasets, usually identifying long-term trends based on historical data, operational intelligence targets short-lived business opportunities, offering timely, actionable insights. Operational intelligence tracks the behavior of live systems, integrating streaming data with customer preferences and historical information to create a comprehensive view and generate immediate feedback.' It's still early days for OI, Bain told IT Trends & Analysis, with little in the way of ROI or TCO studies, but a number of his customers are starting to see the benefits. "What we see is that customers recognize the need to be able to respond in a personalized way to customers… and don't have the tools… they typically go to look in the analytics community for solutions."

Some ScaleOut OI customer examples include:
- enabling a cable TV provider to ingest, correlate, and enrich real-time events from cable boxes to provide immediate upsell offers, manage services, and compete with OTT players (e.g., Netflix);
- helping a large telecom company track bandwidth demand and adjust cable box bandwidth in real time to prevent hot spots;
- assisting a financial services company in handling multiple incoming data streams and computing indexes in real time with a flexible architecture that avoids data silos; and
- enabling a large hospital to track real-time patient data, generate alerts, and feed it into a Hadoop backend system for analysis.

Big data and analytics (BDA) is drawing a lot of attention, according to IDC:
- by 2020, 75% of databases (relational and non-relational) will be based on memory-optimized technology;
- data monetization efforts will result in enterprises pursuing digital transformation initiatives, increasing the marketplace's consumption of their own data by 100-fold or more; and
- the high-value data (part of the Digital Universe) that is worth analyzing to achieve actionable intelligence will double.

Predictive analytics will see a Compound Annual Growth Rate (CAGR) of 27.4% between 2015 and 2020, growing from $2.74 billion to $9.20 billion. A Teradata study last fall reported that 59% of respondents consider BDA either a top-five issue or the single most important way to achieve a competitive advantage, 90% reported medium to high levels of investment, and about a third called their investment "very significant." Last spring Cisco's John Chambers stated that turning data into knowledge was critical over the next decade, especially with regard to the...
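To make the operational intelligence loop Bain describes more concrete, here is a minimal, generic Python sketch of the pattern: streaming events are correlated against an in-memory store of customer preferences and recent behavior, and an immediate action (such as an upsell offer) is generated per event. This is an illustration under assumptions, not ScaleOut's actual API; the event shape, `CustomerState`, `recommend_offer`, and the thresholds are hypothetical.

```python
# Generic sketch of in-the-moment operational intelligence:
# correlate live events with in-memory customer state and react immediately.
# All names (CustomerState, recommend_offer, thresholds) are hypothetical.

from dataclasses import dataclass, field
from typing import Iterable, Optional

@dataclass
class CustomerState:
    customer_id: str
    preferences: set = field(default_factory=set)      # e.g. {"sports", "movies"}
    recent_views: list = field(default_factory=list)   # short rolling window of events

# In-memory store stand-in: keyed customer state held in RAM for low latency.
STATE: dict[str, CustomerState] = {}

def enrich(event: dict) -> CustomerState:
    """Merge a streaming event into the customer's in-memory state."""
    state = STATE.setdefault(event["customer_id"], CustomerState(event["customer_id"]))
    state.recent_views.append(event["channel_genre"])
    state.recent_views = state.recent_views[-20:]       # keep only short-lived history
    return state

def recommend_offer(state: CustomerState) -> Optional[str]:
    """Trigger an immediate upsell when live behavior matches stored preferences."""
    if state.recent_views.count("sports") >= 3 and "sports" in state.preferences:
        return f"Offer premium sports package to {state.customer_id}"
    return None

def process_stream(events: Iterable[dict]):
    """The OI loop: ingest -> correlate/enrich -> act, per event, in the moment."""
    for event in events:
        offer = recommend_offer(enrich(event))
        if offer:
            yield offer   # in practice: push to an offer engine or alerting system

if __name__ == "__main__":
    # Hypothetical cable-box event stream for one subscriber.
    STATE["c42"] = CustomerState("c42", preferences={"sports"})
    stream = [{"customer_id": "c42", "channel_genre": "sports"} for _ in range(3)]
    for action in process_stream(stream):
        print(action)
```

The point of the sketch is the shape of the loop, not the rules: each event is enriched and acted on as it arrives, rather than being landed in a warehouse or Hadoop cluster and analyzed later.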

Read More

Perishable Data: Not Just What, But Where (and When)

We are drowning in data, and with the rise of the machines (sensors and the Internet of Things, not Skynet) we will no doubt look back upon merely 100% data-volume growth per year with nostalgia. However, in addition to Big Data's Vs (volume, velocity, variety, veracity and value; also, the more Vs, the varier: verbality, verbosity, versatility, viscosity and visibility), we have a relatively new Big Data phenomenon called perishable data: information that can substantially decrease in value over a period of time.

A decade ago, high-value data was put into data warehouses, said Mike Flannagan, VP/GM, Data & Analytics, Cisco. Over the last five years, that data has been dumped into Hadoop, but today, "data is not going to be stored in Hadoop or data warehouses at all," he said. "We will have data in three locations… data warehouse… Hadoop… and real-time data that will likely have to be processed and stored very near to the location." Industries like oil and gas that collect a lot of data locally, at wells, can use that data to extend the life of those assets. Perishable data is not for everyone, but Flannagan said a recent Cisco survey found that 37% of customers said that three years from now most data generated by the IoT will be processed at the edge. The IoT alone is expected to be a $19 trillion business, with analytics a key component of that ($7.3 trillion is tied to data, analytics and data in motion). Currently, according to IDC, less than 1% of data is analyzed.

The volume of data is one challenge, he said; e.g., sensors in oil and gas wells generate 1-10TB a day. Another challenge "is the timeframe I need to analyze and get value from that data." For many applications, days, weeks, months, or a quarter can be sufficient, but for others, you need to know right away. "If you have very low latency for getting insight from your data, and very high volumes… moving that data back to the datacenter is unfeasible. It just becomes impossible." Edge analytics is very focused, very low-latency or real-time-sensitive, said Flannagan. He believes this will also create a big access problem, with data stored both at the edge and in the datacenter, and the best results generated by combining that data. "When it comes to edge analytics… no solution will operate entirely at the edge." More data, whether processed locally, at the datacenter, or some combination of the two, is a huge opportunity for Cisco and its networking, datacenter and relatively new analytics businesses. Addressing the perishable data issue means the company can solve customer problems that are really high-value, said...
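A minimal sketch of the edge-analytics pattern Flannagan describes, assuming a stream of well-pressure readings: high-volume, perishable sensor data is reduced locally to compact summaries and immediate alerts, and only those travel back to the datacenter. The sensor values, window size, and alert threshold below are hypothetical, chosen purely for illustration.

```python
# Edge-analytics sketch: process perishable sensor data near its source and
# forward only compact summaries and alerts upstream. Values are hypothetical.

import statistics
from typing import Iterable

PRESSURE_ALERT_PSI = 5000.0   # hypothetical threshold for an immediate, local action

def summarize_at_edge(readings: Iterable[float], window: int = 1000):
    """Reduce a raw stream of well-pressure readings to per-window summaries.

    Raw data (1-10TB/day per the article) stays local; only a few numbers per
    window, plus any alerts, need to be sent back to the datacenter.
    """
    buffer = []
    for value in readings:
        if value >= PRESSURE_ALERT_PSI:
            yield {"type": "alert", "pressure_psi": value}   # act on perishable data now
        buffer.append(value)
        if len(buffer) == window:
            yield {
                "type": "summary",
                "count": len(buffer),
                "mean_psi": statistics.fmean(buffer),
                "max_psi": max(buffer),
            }
            buffer = []

if __name__ == "__main__":
    # Synthetic readings: one spike among otherwise normal values.
    readings = [3000.0] * 999 + [5200.0]
    for message in summarize_at_edge(readings):
        print(message)   # in practice: publish upstream, e.g. over MQTT or Kafka
```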

Read More

IBM, Apple, and Watson: You are about to become obsolete

I used to work for IBM, and it is always a bit of a kick when I'm at an IBM event. IBM was not only the greatest company I ever worked for, it was the most frustrating. It was a firm defined by processes that were obsolete at the time and by decisions based on a worldview that seemed decades out of date. This is what makes analytics in particular so interesting for IBM. If anything could address fixing a problem like this, it would be analytics, and IBM is all in. One of my biggest takeaways is how much the entire focus of this event has changed from last year, and I think it is largely because IBM isn't just "selling" analytics. It is using analytics, and instead of making decisions based on old (and often false) data, it is increasingly making decisions based on current, accurate data. To read the complete article, CLICK HERE
NOTE: This column was originally published in the Pund-IT Review.

Read More

Dell World 2015: The Fiddly Bits

AUSTIN, TEXAS:  Although pretty much overshadowed by the EMC acquisition and a significantly expanded CDW sales relationship, i.e. joining IT's 800-pound gorilla club and confronting the elephant in the room, there were a number of product and service announcements made at Dell World 2015. They covered cloud, big data and analytics, datacenter, IoT, mobility and security. Here's a brief overview of those announcements and their implications.

The products I found most interesting were the 'first' Datacenter Scalable Solutions (DSS) products, targeted at service providers, telecommunications providers and web tech customers. An extension, or expansion, of the nine-year-old Data Center Solutions (DCS) business unit, which serves the dozen biggest Internet users (the global hyperscale organizations), DSS is going after the next tier of customers: web tech, telecommunications service providers, hosting companies, oil and gas, and research organizations. Dell said this segment is growing three times faster than the traditional x86 server market and represents a $6 billion-plus total addressable market. Officially unveiled back in August, it had been operating under the radar for the previous 12 months, said Jyeh Gan, Director, Product Management and Strategy, DSS. The TAM for DSS is $6.6 billion, but it will be worth $25 billion, he said. "Because of that growth… we wanted to bring all of that learning from DCS… to all those customers who weren't the 10-12 hyperscale."  Customer interest has been high, said Gan. At the time of the August announcement, DSS had grown 460% year-over-year; since then, growth has passed 800%. "We are working with 250-plus customers."

The announcements included the DSS 7000, what Dell calls the industry's densest storage server, capable of delivering up to 720 terabytes of storage in a single 4U chassis. The DSS 1500, DSS 1510 and DSS 2500 are 1U and 2U servers that feature a minimalistic design, flexible storage and I/O options, industry-standard baseboard management controller (BMC) systems management and the latest Intel Xeon processors.

Alan Atkinson, VP & GM, Dell Storage, calls the SC9000 storage array controller, with all-flash and hybrid flash configurations, "a gamechanger as far as price." It offers the industry's lowest cost-per-gigabyte for SSD storage, as low as 65 cents per gigabyte of net effective capacity. "At those price points I think we pretty much have eliminated disk." An essential part of the SC9000 announcement is the updated release of Storage Center array software (V6.7). New capabilities include Live Volume auto-failover for built-in disaster recovery with zero workload downtime, integrated host-side data protection for Oracle, Microsoft and VMware environments, and active data compression that offers up to 93% flash capacity savings.

There appeared to be some confusion over the pending EMC acquisition and what it would...
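For readers wondering how a "net effective" 65 cents per gigabyte relates to raw flash pricing and the quoted capacity savings, here is a back-of-the-envelope sketch. Only the $0.65/GB figure and the "up to 93% flash capacity savings" come from the announcement; the raw SSD price below is a hypothetical input used to show the arithmetic.

```python
# Back-of-the-envelope: how data-reduction savings turn raw flash $/GB into
# "net effective" $/GB. The raw price is hypothetical; only the 93% savings
# and the ~$0.65/GB effective figure come from the article.

def effective_cost_per_gb(raw_cost_per_gb: float, capacity_savings: float) -> float:
    """93% capacity savings means each raw GB holds ~1/(1-0.93) ≈ 14.3 GB of data."""
    return raw_cost_per_gb * (1.0 - capacity_savings)

if __name__ == "__main__":
    raw = 9.30       # hypothetical raw SSD cost in $/GB
    savings = 0.93   # up to 93% flash capacity savings (compression, etc.)
    print(f"Effective cost: ${effective_cost_per_gb(raw, savings):.2f}/GB")  # ≈ $0.65/GB
```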

Read More