HDFS Hadoop Distributed File System
HDFS stands for Hadoop Distributed File System. Apache Hadoop offers not only a data processing engine but also a well-designed distributed file system based on data replication: large data sets are split into blocks that are stored, with several copies, on different nodes of the cluster. This allows Hadoop to run processing tasks close to where the data resides instead of transferring big data sets to a central processing system.
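As a concrete illustration, the short Java sketch below copies a local file into HDFS and sets its replication factor to three, so that several nodes hold a copy of each block. This is a minimal example, not taken from the video: the NameNode address hdfs://namenode:9000, the file names, and the replication factor are assumptions you would adjust for your own cluster.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPutExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode address -- replace with your cluster's value.
        conf.set("fs.defaultFS", "hdfs://namenode:9000");

        FileSystem fs = FileSystem.get(conf);

        // Hypothetical local file and HDFS target path.
        Path local = new Path("data/bigdataset.csv");
        Path remote = new Path("/user/hadoop/bigdataset.csv");

        // Upload the file; HDFS splits it into blocks behind the scenes.
        fs.copyFromLocalFile(local, remote);

        // Keep three replicas of each block, so the data is available on
        // several nodes and tasks can be scheduled close to it.
        fs.setReplication(remote, (short) 3);

        fs.close();
    }
}
```

To try it, compile against the hadoop-client dependency and run it on a machine that can reach the cluster, for example with the hadoop jar command.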
Introduction video about HDFS
The following video provides more information about this topic:
Use HDFS to effectively store big data sets: http://goo.gl/nwShNY