Evaluating Hadoop Vendor Partnership Strategies
Look at the data management architecture and technology portfolio of any large enterprise and you will more than likely find a heterogeneous collection of databases, data warehouses, data integration tools and business intelligence applications from multiple vendors. Over the years, most large enterprises have spent many millions of dollars procuring, deploying and operating these data management technologies, which today support mission-critical business processes and revenue-generating applications. While many of these technologies are growing long in the tooth and cost enterprise customers millions of dollars a year in maintenance and support, they nonetheless form the backbone of enterprise operations and are not going away any time soon.
It is against this backdrop that Hadoop makes its appearance. The open source Big Data platform began life as a set of related data storage and processing techniques pioneered at web giants like Google and Yahoo to solve specific problems (first among them, indexing the world wide web). But Hadoop quickly evolved into a general-purpose framework supporting multiple analytic use cases. A number of forward-thinking enterprises took notice just as the ever-increasing volume, variety and velocity of data (a.k.a. Big Data) raining down on the enterprise began to overwhelm the traditional data management stack. According to feedback from the Wikibon community, many data management practitioners today are eager to leverage Hadoop both to relieve some of this pressure on existing data management infrastructure and to develop new, differentiating analytic capabilities.