…Buyers & Sellers Must Learn to Compete in the Amazon…

Premise: Amazon has turned the data center into an API. This trend is having a profound impact on enterprise IT customers. In particular, the economics of infrastructure outsourcing (i.e. deployment, provisioning, management and orchestration), which formerly exhibited diseconomies of scale at volume, are beginning to track the marginal economics of software – i.e. incremental costs approach $0. To compete with these cost structures, IT organizations and competing cloud vendors will either have to achieve massive scale or become highly vertically integrated. To read the complete article, CLICK...
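
As a concrete (hypothetical) illustration of what "the data center as an API" means in practice, the sketch below uses the AWS SDK for Python (boto3) to provision a server with a single call; the region, AMI ID and instance type are illustrative placeholders, not details from the article.

    # A minimal sketch: provisioning a server is just an API call.
    # Assumes boto3 is installed and AWS credentials are configured;
    # the AMI ID and instance type are illustrative placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.run_instances(
        ImageId="ami-12345678",   # placeholder image
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
    )
    print("Provisioned", response["Instances"][0]["InstanceId"])

The same call pattern extends across compute, storage and networking services, which is why provisioning, deployment and orchestration begin to look like software rather than capital projects.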

Read More

Evaluating Hadoop Vendor Partnership Strategies

Look at the data management architecture and technology portfolio of any large enterprise and you will more than likely find a heterogeneous collection of databases, data warehouses, data integration tools and business intelligence applications from multiple vendors. Over the years, most large enterprises have spent many millions of dollars procuring, deploying and operating these data management technologies, which today support many mission-critical business processes and revenue-generating applications. While many of these technologies are growing long in the tooth and cost enterprise customers millions of dollars a year in maintenance and support, they nonetheless form the backbone of many enterprise operations and are not going away any time soon. It is against this backdrop that Hadoop makes its appearance. The open source Big Data platform began life as a set of related data storage and processing approaches pioneered by web giants such as Google and Yahoo to solve specific tasks (first among them, indexing the World Wide Web). But Hadoop quickly evolved into a general-purpose framework supporting multiple analytic use cases. A number of forward-thinking enterprises took notice as, simultaneously, the ever-increasing volume, variety and velocity of data (a.k.a. Big Data) raining down on the enterprise began to overwhelm the traditional data management stack. According to feedback from the Wikibon community, many data management practitioners today are eager to leverage Hadoop both to relieve some of this pressure on existing data management infrastructure and to develop new, differentiating analytic capabilities. To read the complete article, CLICK...
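
To make the processing model concrete, here is the canonical word-count job written as a single Python script runnable under Hadoop Streaming; it is a generic illustration of MapReduce, not code from the article.

    # wordcount.py - a minimal MapReduce example for Hadoop Streaming.
    # The framework handles data distribution, sorting and fault tolerance;
    # the analytic logic below only reads stdin and writes stdout.
    import sys

    def mapper():
        # Emit "<word>\t1" for every word in the input split.
        for line in sys.stdin:
            for word in line.split():
                print(word + "\t1")

    def reducer():
        # Hadoop sorts mapper output by key, so counts for a word arrive together.
        current_word, count = None, 0
        for line in sys.stdin:
            word, value = line.rstrip("\n").split("\t")
            if word != current_word:
                if current_word is not None:
                    print(current_word + "\t" + str(count))
                current_word, count = word, 0
            count += int(value)
        if current_word is not None:
            print(current_word + "\t" + str(count))

    if __name__ == "__main__":
        mapper() if sys.argv[1] == "map" else reducer()

A streaming job would pass this script as both the -mapper and the -reducer; the point is how little code general-purpose analytics on Hadoop can require once the framework takes care of scale.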

Read More

Catalog Software Solves Copy Chaos

One of the greatest operational challenges in modern data centers is copy management. The number of copies of data is proliferating. Part of the reason is the availability of snapshots, in particular space-efficient snapshots, on all storage array platforms. Space-efficient snapshots, first introduced by NetApp, are logical copies of data based on metadata held in WAFL (Write Anywhere File Layout); by transferring just the delta changes between two snapshots, they enable much more efficient replication of data, either locally or to remote sites. Another reason for so many copies is that while current disk drives have increased radically in density (with 4TB drives becoming the norm), access density (the number of IOs and the amount of data that can be extracted from these drives in a unit of time) has remained the same or declined. To ensure that copies of data can actually be used, physical copies have to be made. The average number of copies of data exceeds 10 in even a well-run data center. The cost implications of these copies are great; the challenge of managing them all is even greater. Finding snapshots uses the same principles as paper files – the newest one is the one on top, with the least amount of dust. Keeping track of when snapshots were taken, which developers or end-users have used them, whether they have been deleted, and the provenance of snapshots used in downstream processing and data warehousing is extremely suspect in most organizations. This leaves security and compliance less than adequate. The solution put forward by catalog software vendors is to keep track of every copy of data made and of the usage of each copy. The resultant metadata about the copies can be used to: To read the complete article, CLICK...
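
A rough sketch of the idea (hypothetical, not any particular vendor's product): a catalog records, for every copy, when it was taken, what it was taken from, who has used it and whether it has been deleted, which is enough to answer the provenance and compliance questions above.

    # Hypothetical copy-data catalog: the value is in the metadata, not the data.
    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import Optional

    @dataclass
    class CopyRecord:
        copy_id: str
        source_id: Optional[str]        # parent copy/volume; None for the original
        created_at: datetime
        used_by: set = field(default_factory=set)
        deleted: bool = False

    class CopyCatalog:
        def __init__(self):
            self._records = {}

        def register(self, copy_id, source_id=None):
            self._records[copy_id] = CopyRecord(copy_id, source_id, datetime.utcnow())

        def record_usage(self, copy_id, user):
            self._records[copy_id].used_by.add(user)

        def mark_deleted(self, copy_id):
            self._records[copy_id].deleted = True

        def lineage(self, copy_id):
            # Walk back through parents to establish provenance.
            chain = []
            while copy_id is not None:
                record = self._records[copy_id]
                chain.append(record)
                copy_id = record.source_id
            return chain

    # A snapshot taken from production, handed to a developer, traced back later.
    catalog = CopyCatalog()
    catalog.register("vol-prod")
    catalog.register("snap-001", source_id="vol-prod")
    catalog.record_usage("snap-001", "developer-a")
    print([r.copy_id for r in catalog.lineage("snap-001")])  # ['snap-001', 'vol-prod']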

Read More

VCE returns to EMC as a full member of the Federation

The joint venture between Cisco and EMC to create and sell Vblock converged infrastructure – known over the years as Acadia, the VCE Coalition and the VCE Company, and now simply as VCE – has been difficult to fully understand. A Vblock takes the pieces from the parent companies, integrates them and wraps them with software and services to deliver a solution that simplifies IT. The money trail between the parent companies – who funds the salaries of more than 1,400 employees, and who absorbs the profit or loss – is a bit more complex. While it is often said that the investments are losses, Wikibon Chief Analyst Dave Vellante said years ago that EMC only bleeds green. VCE is without question in the vanguard of pushing convergence into the marketplace, yet despite all of the buzz this segment of the market gets, VCE is one of the quietest $1B companies in tech. The company's ownership and governance model is now simplified as VCE moves into the EMC Federation, answering any questions about dissolution or an exit strategy. To read the complete article, CLICK...

Read More

Evaluating Hybrid Cloud Strategic Options For…

Recently Wikibon looked at the state of play for hybrid clouds in a research posting called “Beyond Virtualization: from Consolidation to Orchestration and Automation”. The general conclusions were:

- The key business driver behind a hybrid cloud strategy is to increase the value of applications to the end-user and the business. As a consequence, hybrid cloud projects require:
  - buy-in and leadership as business-led initiatives, with an application focus;
  - clear strategic milestones and goals for each stage, with an expectation of constant re-evaluation and pre-planning.
- Hybrid cloud is early in the development cycle and very early in the adoption cycle.
- Hybrid cloud adoption is a strategic imperative, in order to remain competitive with applications deployed on in-house private clouds while taking advantage of public clouds for workloads better suited to them, with their better access to big data and the Internet of Things (the industrial Internet).
- One key component is seamless migration between private and public clouds, and between public clouds.
- Virtualization, orchestration and automation are essential elements of effective migration strategies.
- Virtualization, orchestration and automation of storage is one of the biggest challenges for hybrid cloud: server virtualization with hypervisors and containerization is advanced, and network virtualization is technically simpler than storage virtualization. (A minimal sketch of the kind of placement decision such orchestration automates appears below.)

To read the complete article, CLICK...
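
As a rough illustration of the orchestration and automation point, the sketch below shows the kind of per-workload placement decision a hybrid cloud orchestrator automates; the attributes and thresholds are invented for illustration, not taken from the research.

    # Hypothetical placement policy for a hybrid cloud orchestrator.
    from dataclasses import dataclass

    @dataclass
    class Workload:
        name: str
        data_sensitivity: str          # "regulated" data stays on the private cloud
        needs_public_data: bool        # e.g. big data or internet-of-things feeds
        peak_to_average_ratio: float   # bursty workloads favour public capacity

    def place(workload: Workload) -> str:
        if workload.data_sensitivity == "regulated":
            return "private"
        if workload.needs_public_data or workload.peak_to_average_ratio > 3.0:
            return "public"
        return "private"

    print(place(Workload("iot-analytics", "internal", True, 5.0)))   # public
    print(place(Workload("payroll", "regulated", False, 1.2)))       # private

Seamless migration matters precisely because these answers change over time; the orchestration layer has to be able to move a workload when the policy's answer changes.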

Read More