Digital Business Platforms – The API is the Product

Modern business is evolving to a point where the interaction between a company and the marketplace increasingly takes place through a set of programmatic, digital interfaces. By creating a digital interface for the company, a business can begin to interact with the marketplace in ways that were never possible before.
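
As a rough sketch of what such a digital interface can look like, the snippet below exposes a product catalog as a versioned HTTP API using Flask. The endpoint, SKU and catalog data are illustrative assumptions, not anything described in the article:

```python
# A minimal sketch of a "business capability as an API": a product catalog
# that partners and customers can query programmatically. All names and
# data below are hypothetical.
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory catalog standing in for a real backend system.
CATALOG = {
    "sku-1001": {"name": "Example Widget", "price_usd": 19.99, "in_stock": True},
}

@app.route("/v1/products/<sku>", methods=["GET"])
def get_product(sku):
    """The marketplace interacts with the business through this endpoint
    instead of through a human-mediated channel."""
    product = CATALOG.get(sku)
    if product is None:
        return jsonify({"error": "unknown sku"}), 404
    return jsonify(product)

if __name__ == "__main__":
    app.run(port=8080)
```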

Follow the Money: Big Data ROI and Inline Analytics

Earlier Wikibon research showed that the ROI on Big Data projects was 55¢ for each $1 spent. During 2014, Wikibon focused on in-depth interviews with organizations that had achieved Big Data success and high rates of return. These interviews surfaced an important pattern: the Big Data winners focused on operationalizing and automating their Big Data projects. They used Inline Analytics to drive algorithms that connected directly to the operational systems of record and drove automatic change in them. These algorithms were usually developed and supported by data tables derived, using DeepData analytics, from Big Data Hadoop systems and/or data warehouses. Instead of focusing on enlightening the few with pretty historical graphs, successful players focused on changing the operational systems for everybody, and managed the feedback and improvement process for the company as a whole.

The technical requirement of Inline Analytic systems is to enable real-time decisions within current or new operational systems, without the traditional ETL (Extract, Transform & Load) processes that take hours, days or weeks to migrate operational and other data sources to data warehouses and/or Hadoop systems. The software solutions developed by the winners deployed some or all of several advanced techniques, including parallelism, data-in-memory techniques and high-speed flash storage. Wikibon has researched different Inline Analytics technology approaches, including Aerospike, IBM BLU, the Oracle 12c in-memory option and SAP HANA.

The key finding of this research is that sponsors of Big Data projects should measure success by:

- the time taken to create inline analytic algorithms that automate decision-making directly in the operational systems of record, and
- the effectiveness of the supporting DeepData analytics in reducing the cycle time for improving the Inline Analytic algorithms and in finding new data streams to drive them.
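
The following is a minimal sketch of the inline-analytics pattern described above, not any vendor's implementation: an in-memory scoring rule applied inside the operational transaction path, with its parameters refreshed periodically from an offline DeepData-style batch job rather than via a slow ETL round trip. All names, weights and the refresh interval are hypothetical:

```python
import threading
import time

# Model parameters for the inline decision. In practice these would be
# derived by batch (DeepData-style) analytics over Hadoop or warehouse
# history and pushed to the operational tier; the values here are made up.
model = {"weight_amount": 0.001, "weight_velocity": 0.05, "threshold": 0.5}
model_lock = threading.Lock()

def refresh_model_from_deep_analytics():
    """Periodically reload parameters produced by the offline analytics job,
    replacing a slow ETL round trip with a lightweight parameter push."""
    while True:
        time.sleep(3600)  # hourly refresh; an ETL cycle would take far longer
        new_params = {"weight_amount": 0.0012, "weight_velocity": 0.04,
                      "threshold": 0.5}  # stand-in for fetching real output
        with model_lock:
            model.update(new_params)

def score_transaction(amount, txns_last_hour):
    """Inline decision: runs in memory, inside the operational transaction
    path, on every transaction the system of record processes."""
    with model_lock:
        m = dict(model)
    risk = m["weight_amount"] * amount + m["weight_velocity"] * txns_last_hour
    return "review" if risk > m["threshold"] else "approve"

# Background refresh driven by the deep-analytics side.
threading.Thread(target=refresh_model_from_deep_analytics, daemon=True).start()

# Called directly from the system of record, e.g. at payment authorization.
print(score_transaction(amount=250.0, txns_last_hour=1))   # -> approve
print(score_transaction(amount=5000.0, txns_last_hour=8))  # -> review
```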

Primary Data Comes Out of Stealth and into Big MetaData

Problem: Zettabytes of Data

Primary Data came out of stealth in November 2014. David Flynn, a co-founder, CTO and architect of the Primary Data solution, was the CTO (later CEO) and chief architect of Fusion-io, where he made major contributions to flash as an extension of DRAM technologies before being ousted by investors who wanted to cash out to SanDisk for about $1 billion. The problem Primary Data is addressing is the exabytes of data locked up in storage arrays, cloud services and tapes. Each storage array family and cloud service is different, with unique data services, and the data itself includes all the information about the data, the metadata. Each storage array is an island, and each file a rock on the island. Sure, you can connect arrays together in NetApp’s ONTAP 8 storage virtualization network and move the rocks around the island, but it is still an island, and the data still includes all the metadata. Sure, EMC’s ViPR allows file systems to be created across different storage arrays, but the data services remain within the storage array, and the data itself still includes all the metadata.
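
As an illustration of the idea (assuming nothing about Primary Data's actual design), the sketch below keeps the namespace and placement metadata in a catalog separate from the backends that hold the bytes, so data can migrate between an array, a cloud and tape without changing the logical name clients use. Every class, path and tier name here is hypothetical:

```python
# A minimal sketch of separating metadata (namespace + placement) from the
# storage silos that hold the data. Not Primary Data's implementation.
from dataclasses import dataclass

@dataclass
class Placement:
    backend: str    # e.g. "netapp-array-3", "aws-s3", "tape-vault"
    object_id: str  # backend-specific locator

class GlobalNamespace:
    """Logical path -> physical placement, managed outside any one array."""
    def __init__(self):
        self._catalog = {}

    def put(self, path, placement):
        self._catalog[path] = placement

    def resolve(self, path):
        return self._catalog[path]

    def migrate(self, path, new_placement):
        # Data moves between tiers; the logical name clients use is unchanged.
        self._catalog[path] = new_placement

ns = GlobalNamespace()
ns.put("/finance/q3.parquet", Placement("netapp-array-3", "vol7/obj-112"))
ns.migrate("/finance/q3.parquet", Placement("aws-s3", "bucket/finance/q3"))
print(ns.resolve("/finance/q3.parquet").backend)  # -> "aws-s3"
```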

Data Service Providers Find a Home in AWS

Amazon Web Services didn’t make any Big Data-related announcements at its annual customer conference last week, choosing instead to focus on a slew of developer-targeted services and a new relational database offering. But the topic of Big Data was nonetheless top of mind for many AWS enterprise customers. This goes for direct enterprise customers (Philips Healthcare took the re:Invent keynote stage to discuss how it is using data analytics in the cloud to better diagnose and coordinate treatment for cancer patients) as well as for enterprise Big Data and analytics vendors. Splunk and SAP, for example, both said they are seeing customers move their deployments from on-premises infrastructure to the AWS cloud with increasing frequency.

The New IT Normal and AWS

At the third annual Amazon Web Services user conference (AWS re:Invent 2014), SVP Andy Jassy stated that “cloud has become the new normal”. AWS claims more than 1 million active customers and the fastest revenue growth rate (>40%) of any multi-billion-dollar enterprise IT vendor. As the trailblazer and leader in IaaS, AWS cannot be ignored by IT: there are only two types of companies, those officially using AWS and those whose users are going around IT to consume AWS (see “Stealth IT”). CIOs must either be using AWS solutions or benchmarking themselves against what can be done with AWS. The ascendancy of AWS has been compared to that of Microsoft and VMware. As in those earlier cases, it will take time before we know how large AWS will become, and growth headwinds emerge when an IT supplier becomes too powerful in a partner ecosystem or takes too large a share of revenue from customers.
