Apache Hadoop


Apache Hadoop is an open-source software framework that supports the distributed processing of large data sets across computer clusters. The framework is designed to scale up from single servers to thousands of machines, each offering local computation and storage. At the same time, all Hadoop modules are designed with the basic assumption that hardware failures are common and should be detected and handled automatically by the framework.

The core of the Hadoop framework consists of a storage part, known as HDFS (Hadoop Distributed File System), and a processing part, called MapReduce. Hadoop splits files into large blocks and distributes them across the nodes of a cluster. To process the data, Hadoop ships the packaged code to the nodes, so that each node processes, in parallel, the portion of the data it stores locally.
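To make the MapReduce model concrete, the following is a minimal sketch of the canonical word-count job written against Hadoop's Java MapReduce API; the input and output paths are supplied as command-line arguments.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Mapper: runs on the nodes holding the input blocks and emits (word, 1) pairs.
        public static class TokenizerMapper
                extends Mapper<Object, Text, Text, IntWritable> {
            private final static IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Reducer: sums the counts for each word after the shuffle phase.
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = Job.getInstance(conf, "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The job is packaged as a JAR and submitted to the cluster with, for example, hadoop jar wordcount.jar WordCount /input /output; the mappers run where the input blocks reside, which is exactly the code-to-data pattern described above.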

Advantages

  • Highly scalable storage platform - it can store and distribute very large data sets across hundreds of servers operating in parallel
  • Cost effective - its scale-out architecture runs on commodity hardware, so a company can afford to store all of its data for later use
  • Flexible - it enables businesses to easily access new data sources and tap into different types of data to generate value from them
  • Fast - the storage method is based on a distributed file system that "maps" data wherever it is located on the cluster, so processing runs on the nodes where the data resides
  • Resilient to failure - data sent to an individual node is also replicated to other nodes in the cluster, so in the event of a failure there is another copy available for use (see the replication sketch after this list)
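To illustrate how that replication is controlled, here is a minimal sketch using Hadoop's Java file-system API; the file path and replication factor are hypothetical examples, and the cluster-wide default is normally set through the dfs.replication property in hdfs-site.xml.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ReplicationExample {
        public static void main(String[] args) throws Exception {
            // Assumes the cluster configuration (core-site.xml / hdfs-site.xml) is on the classpath.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Hypothetical file; each of its blocks will be kept on three DataNodes.
            Path file = new Path("/data/important.log");
            fs.setReplication(file, (short) 3);

            // Report the replication factor HDFS now records for the file.
            short actual = fs.getFileStatus(file).getReplication();
            System.out.println("Replication factor: " + actual);
        }
    }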

Disadvantages

  • Security concerns - out of the box, Hadoop lacks encryption at the storage and network levels, so sensitive data requires additional protection
  • Vulnerable - it is written almost entirely in Java, one of the most widely used languages, and Hadoop deployments have repeatedly been targeted and exploited by cyber-criminals
  • Not fit for small data - because it is designed for high capacity and throughput, it cannot efficiently support random reads of many small files
  • General limitations - on its own, Hadoop provides little support for efficient and reliable data collection, aggregation and integration, so complementary tools are usually required

Components

  • Hadoop Common - the shared libraries and utilities used by the other modules
  • HDFS - the distributed file system that stores data across the cluster's nodes
  • MapReduce - the programming model for parallel, distributed processing
  • YARN - the resource-management layer that schedules jobs and allocates cluster resources
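As a brief illustration of how an application talks to the HDFS component through the Java API shipped with Hadoop Common, the following is a minimal sketch; the NameNode address and file path are hypothetical placeholders.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsReadExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Hypothetical NameNode address; on a real cluster this comes from core-site.xml.
            conf.set("fs.defaultFS", "hdfs://namenode:9000");

            FileSystem fs = FileSystem.get(conf);
            Path file = new Path("/data/example.txt"); // hypothetical path

            // Read the file line by line; HDFS streams the blocks from the DataNodes.
            try (BufferedReader reader =
                    new BufferedReader(new InputStreamReader(fs.open(file)))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }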

Development tools

  • Hadoop Development Tools (HDT) plugin
  • Eclipse IDE
  • HUE web interface
  • Azkaban
  • Hortonworks Sandbox

Versions

  • Version 0.20
  • Version 0.22
  • Version 1.0.0
  • Version 1.2.0
  • Version 2.0.0
  • Version 2.2.0
  • Version 2.5.0
  • Version 2.6.0
  • Version 2.7.0


