
Hadoop: Intermediate

Learn about the major changes and services within the Hadoop framework with this free online Hadoop course.

Publisher: Workforce Academy Partnership
This free online Hadoop Intermediate course introduces you to one of the most significant changes made to Hadoop: the introduction of YARN (Yet Another Resource Negotiator). The course provides practical knowledge and hands-on training in defining Spark applications relevant to programming. Topics covered include YARN, NiFi, HBase, MapReduce and creating HBase applications.
  • Duration: 5-6 Hours


Hadoop Intermediate is a free online course that offers a comprehensive guide to understanding how YARN improves the multiprocessing capability of Hadoop. The course explains how Hadoop works with the Hadoop Distributed File System (HDFS) to provide a more complete parallel environment for applications. The content explains terms related to MapReduce, HBase, Tez, Spark's shell and NiFi, and covers one of Hadoop's most significant changes in recent years: Yet Another Resource Negotiator (YARN). With the help of YARN, Hadoop can easily maintain a multi-tenant environment with better security controls and better availability. In this course, you will discover how NiFi lets you build dataflows visually in real time and how you can use the HBase shell from the command-line interface to communicate with HBase.
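As a taste of the command-line interaction described above, a typical HBase shell session might look like the following. This is a hedged sketch: the `users` table, its `info` column family and the sample row are hypothetical examples, and actually executing the commands requires a local HBase installation.

```shell
# Hypothetical HBase shell commands: create a table, insert a cell, and scan it.
# The table name 'users' and column family 'info' are illustrative only.
HBASE_CMDS=$(cat <<'EOF'
create 'users', 'info'
put 'users', 'row1', 'info:name', 'Ada'
scan 'users'
EOF
)

# Pipe the commands into the HBase shell only if HBase is installed.
if command -v hbase >/dev/null 2>&1; then
  echo "$HBASE_CMDS" | hbase shell
fi
```

The guard around `command -v hbase` means the snippet degrades gracefully on a machine without Hadoop, which is convenient when experimenting with course material locally.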

The following section of this course will teach you about two Hadoop ecosystem newcomers, Tez and Spark. Tez improves efficiency over MapReduce for certain kinds of functions, while Spark expands Hadoop's development capabilities by supporting additional languages such as Python. Next, you will examine the Spark shell and how to run a series of commands in the Scala language.

You will then discover some additional tools for ETL, such as Flume, Linux and NiFi. Flume provides a way to capture and stream data in real time, and Linux provides a number of command-line utilities for data capture, transformation and storage. NiFi, on the other hand, is a recently introduced generic workflow tool designed to make it easier to perform ETL operations over a wide variety of data sources and types. This course explores the functionality of the workflow tasks, often referred to as processors.

The final part of this course shows you how to start the NiFi service within Hadoop and explains how to set up a NiFi workflow together with the various processors or tasks needed to build a full workflow process. The course demonstrates all the components of the ETL process and how NiFi can play a significant role in developing ETL processes within Hadoop. HBase is the next theme covered: you will analyze how to start the HBase shell, its architecture, its various commands, design practices, and the role it plays in delivering fast, robust processing within Hadoop.
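The Spark shell commands mentioned above can be sketched roughly as follows. This is a hedged example: the numbers and the doubling operation are arbitrary illustrations, and actually running the snippet requires a Spark installation that provides `spark-shell`.

```shell
# Hypothetical Scala snippet for the Spark shell; the RDD contents are illustrative.
SPARK_CMDS=$(cat <<'EOF'
val nums = sc.parallelize(1 to 10)   // distribute a small range as an RDD
val doubled = nums.map(_ * 2)        // transform each element lazily
println(doubled.sum)                 // action: triggers the computation
EOF
)

# Feed the snippet to the Spark shell non-interactively, only if it is installed.
if command -v spark-shell >/dev/null 2>&1; then
  echo "$SPARK_CMDS" | spark-shell
fi
```

Note the transformation/action split: `map` builds up the computation, and only the `sum` action forces Spark to run it, which is the pattern the Spark shell section of the course walks through.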
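As an illustration of the Linux command-line ETL utilities the course refers to, a minimal capture-transform-store pipeline might look like this. The sample data and file names are invented for the example.

```shell
# Extract: create a small sample CSV (name,score) to stand in for captured data.
printf 'alice,90\nbob,75\ncarol,88\n' > scores.csv

# Transform: keep rows with score >= 80 and uppercase the names with awk.
awk -F',' '$2 >= 80 { print toupper($1) "," $2 }' scores.csv > high_scores.csv

# Load/store: sort the result into its final location and display it.
sort high_scores.csv > high_scores_sorted.csv
cat high_scores_sorted.csv
```

Each stage writes a plain file, so any step can be inspected or swapped out independently, which is exactly the composability that makes Linux utilities useful for ad-hoc ETL.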
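Starting the NiFi service, as covered in the final part of the course, typically uses NiFi's own control script. This is a sketch assuming a standard Apache NiFi installation; the `NIFI_HOME` location is a hypothetical example.

```shell
# NIFI_HOME is a hypothetical path to your NiFi installation.
NIFI_HOME="${NIFI_HOME:-/opt/nifi}"

# Start the service in the background and check that it is running,
# but only if the control script actually exists on this machine.
if [ -x "$NIFI_HOME/bin/nifi.sh" ]; then
  "$NIFI_HOME/bin/nifi.sh" start
  "$NIFI_HOME/bin/nifi.sh" status
fi
```

Once the service reports as running, the NiFi web UI is where you drag processors onto the canvas and wire them into the workflow the course describes.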

Hadoop is a great Big Data solution that can benefit many professions because it is flexible, scalable and fast. As an IT professional, you could enrol in this Hadoop Intermediate course to learn MapReduce programming along with many other skills. If you have a bit of Big Data experience or are comfortable with the basics of Hadoop, then this course is your best next step.

Start Course Now