Monday 17 November 2014

Hadoop - An Analytical Tool



What is Hadoop?

I often came across this term in my Business Analytics classes, especially when we talked about Big Data, so I have collected some information about Hadoop from the internet. First of all, what is Hadoop? Hadoop is a free, Java-based programming framework that supports the processing of large data sets in a distributed computing environment. It is part of the Apache project sponsored by the Apache Software Foundation. (Margaret Rouse)

                                       

The yellow elephant image is the project's official mascot, and it is interesting to know that Hadoop took its name from the toy elephant belonging to the son of its creator, Doug Cutting.
As the saying goes, necessity is the mother of invention. Hadoop was born out of the need to process big data, because the amount of generated data kept growing rapidly. As the Web and other sources produced more and more information, indexing that content became a serious challenge, so Google published its MapReduce paper in 2004, and Doug Cutting and Mike Cafarella (with heavy backing from Yahoo!) built Hadoop as an open-source implementation of MapReduce. Hadoop is now a top-level open-source Apache project.
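To make the MapReduce idea concrete: a job is split into a map step, which turns input records into key–value pairs, and a reduce step, which combines all the values sharing a key. Below is a minimal single-machine sketch of the classic word-count example, written in plain Python (the function names are my own; in real Hadoop the map and reduce tasks run in parallel across a cluster):

```python
from collections import defaultdict

def map_phase(lines):
    """Mapper: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reducer: sum the counts for each distinct word (key)."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

if __name__ == "__main__":
    text = ["the web generates data", "the web grows"]
    print(reduce_phase(map_phase(text)))
```

Hadoop's contribution is not this logic, which is trivial, but running the map and reduce phases reliably over terabytes of data spread across many machines.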


When asked about it, Mike Olson (CEO of Cloudera) explained that the Hadoop platform was designed to solve problems where you have a lot of data — perhaps a mixture of complex and structured data — and it doesn't fit nicely into tables. It's for situations where you want to run analytics that are deep and computationally extensive, like clustering and targeting. Hadoop applies to a bunch of markets. In finance, if you want to do accurate portfolio evaluation and risk analysis, you can build sophisticated models that are hard to jam into a database engine, but Hadoop can handle it. In online retail, if you want to deliver better search answers to your customers so they're more likely to buy the thing you show them, that sort of problem is well addressed by this platform.
