From 0 to 1: The Oozie Orchestration Framework

A first-principles guide to working with Workflows, Coordinators and Bundles in Oozie

What's Inside

Course Description

Prerequisites: Working with Oozie requires some basic knowledge of the Hadoop ecosystem and experience running MapReduce jobs

Taught by a team that includes 2 Stanford-educated ex-Googlers and 2 ex-Flipkart Lead Analysts. This team has decades of practical experience working with large-scale data processing jobs.

Oozie is like the formidable, yet super-efficient admin assistant who can get things done for you, if you know how to ask

Let's parse that

formidable, yet super-efficient: Oozie is formidable because its workflows, coordinators and bundles are specified entirely in XML, which is hard to debug when things go wrong. However, once you've figured out how to work with it, it's like magic: complex dependencies, a multitude of jobs running on different schedules, and entire data pipelines all become easy to manage.

get things done for you: Oozie lets you manage Hadoop jobs as well as Java programs, scripts and any other executable, all with the same basic setup. It manages your dependencies cleanly and logically.

if you know how to ask: This is the key. Knowing the right configuration parameters to get the job done is what mastering Oozie comes down to.
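
For example, a typical Oozie job is driven by a small job.properties file handed to the Oozie command-line client. Here is a minimal, illustrative configuration; the host names and paths are placeholders for your own cluster, not files from this course:

    # job.properties - a minimal, illustrative configuration
    # HDFS NameNode URI (placeholder for your cluster)
    nameNode=hdfs://localhost:8020
    # YARN ResourceManager address (placeholder for your cluster)
    jobTracker=localhost:8032
    queueName=default
    # HDFS directory that contains workflow.xml
    oozie.wf.application.path=${nameNode}/user/${user.name}/demo-wf

You would then submit the job with the Oozie client (the server URL is again a placeholder):

    oozie job -oozie http://localhost:11000/oozie -config job.properties -run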

What's Covered:

Workflow Management: Workflow specifications, Action nodes, Control nodes, Global configuration, real examples with MapReduce and Shell actions that you can run and tweak
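
To give a flavor of what a workflow specification looks like, here is a minimal sketch of a workflow.xml with one Shell action plus the start, kill and end control nodes; the names "demo-wf" and "hello-shell" and the echo command are illustrative, not examples from the course:

    <workflow-app name="demo-wf" xmlns="uri:oozie:workflow:0.5">
        <!-- Control node: every workflow begins at start -->
        <start to="hello-shell"/>
        <!-- Action node: runs a shell command on the cluster -->
        <action name="hello-shell">
            <shell xmlns="uri:oozie:shell-action:0.2">
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <exec>echo</exec>
                <argument>Hello from Oozie</argument>
            </shell>
            <!-- Transitions: on success go to end, on failure go to the kill node -->
            <ok to="end"/>
            <error to="fail"/>
        </action>
        <!-- Control node: kill aborts the workflow with an error message -->
        <kill name="fail">
            <message>Shell action failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
        </kill>
        <end name="end"/>
    </workflow-app>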

Time-based and data-based triggers for Workflows: Coordinator specification, mimicking simple cron jobs, specifying time and data-availability triggers for Workflows, dealing with backlog, running time-triggered and data-triggered coordinator actions
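
As a sketch, a time-triggered coordinator that mimics a daily cron job might look like the following; the dates, path and the name "demo-coord" are placeholders:

    <coordinator-app name="demo-coord"
                     frequency="${coord:days(1)}"
                     start="2020-01-01T00:00Z" end="2020-12-31T00:00Z"
                     timezone="UTC"
                     xmlns="uri:oozie:coordinator:0.4">
        <action>
            <!-- The workflow to run once per day -->
            <workflow>
                <app-path>${nameNode}/user/${user.name}/demo-wf</app-path>
            </workflow>
        </action>
    </coordinator-app>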

Data Pipelines using Bundles: Bundle specification, the kick-off time for bundles, running a bundle on Oozie
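
For illustration, a minimal bundle that groups a single coordinator and sets its kick-off time might look like this; the names and timestamp are again placeholders:

    <bundle-app name="demo-bundle" xmlns="uri:oozie:bundle:0.2">
        <controls>
            <!-- The bundle starts none of its coordinators before this time -->
            <kick-off-time>2020-01-01T00:00Z</kick-off-time>
        </controls>
        <coordinator name="demo-coord">
            <app-path>${nameNode}/user/${user.name}/demo-coord</app-path>
        </coordinator>
    </bundle-app>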

Talk to us!

  • Mail us about anything - anything! - and we will always reply :-)

What are the requirements?

  • Students should have basic knowledge of the Hadoop ecosystem and should be able to run MapReduce jobs on Hadoop

What am I going to get from this course?

  • Install and set up Oozie
  • Configure Workflows to run jobs on Hadoop
  • Configure time-triggered and data-triggered Workflows
  • Configure data pipelines using Bundles

What is the target audience?

  • Yep! Engineers, analysts and sysadmins who are interested in big data processing on Hadoop
  • Nope! Beginners who have no knowledge of the Hadoop ecosystem

Get started now!



Certificate Available
48114+ Students
24 Lectures
4+ Hours of Video
Lifetime Access
24/7 Support
Loonycorn

Loonycorn is the team of Janani Ravi and Vitthal Srinivasan, who honed their tech expertise at Google and Stanford. The team distills complicated tech concepts into funny, practical, engaging courses, and is excited to share its content with eager students.
