Running an Oozie Workflow From the Command Line

Hands On Steps

  1. Log in to the web console.

  2. Copy the Oozie examples to your home directory in the web console:
     cp /usr/hdp/current/oozie-client/doc/oozie-examples.tar.gz ./

  3. Extract the files from the tarball:
     tar -zxvf oozie-examples.tar.gz

  4. Edit examples/apps/map-reduce/job.properties and set:
     nameNode=hdfs://10.142.1.1:8020
     jobTracker=10.142.1.2:8050
     queueName=default
     examplesRoot=examples

  5. Copy the examples directory to HDFS:
     hadoop fs -copyFromLocal examples

  6. Run the job:
     oozie job -oozie http://10.142.1.2:11000/oozie -config examples/apps/map-reduce/job.properties -run

  7. Check the status of the job, using the job_id printed by the previous step:
     oozie job -oozie http://10.142.1.2:11000/oozie -info job_id

Script

Let’s run an Oozie job with a MapReduce action. Log in to the CloudxLab Linux console. Copy the Oozie examples to your home directory in the console. Extract the files from the tarball. Edit examples/apps/map-reduce/job.properties and set the values of nameNode and jobTracker. We can find the namenode host in Ambari under the “HDFS” section.

We will be running examples/apps/map-reduce/workflow.xml in our job. Copy the examples directory to HDFS and run the job using the command displayed on the screen. cxln2.c.thelab-240901.internal:11000 is the host and port where the Oozie server is running.
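For orientation, the map-reduce workflow in the examples has roughly the following shape. This is an abridged sketch, not the exact file shipped with your Oozie version; the ${...} variables (jobTracker, nameNode, queueName) are filled in from job.properties at submission time:

```xml
<workflow-app xmlns="uri:oozie:workflow:0.5" name="map-reduce-wf">
    <start to="mr-node"/>
    <action name="mr-node">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
                <!-- mapper/reducer classes and input/output paths go here -->
            </configuration>
        </map-reduce>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Map/Reduce action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

The workflow starts at the mr-node action; on success it transitions to end, and on error it transitions to the kill node, which marks the job as failed.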

Press enter. We will get the job ID in the command prompt. To check the status of the job, type the command displayed on the screen. The job status is "RUNNING".
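The same status check can also be done over Oozie's web-services (REST) API instead of the CLI: a GET request to /v1/job/&lt;job_id&gt;?show=info on the Oozie server returns the job description as JSON, including its status. A minimal Python sketch follows; the host cxln2.c.thelab-240901.internal:11000 and the job ID in the usage comment are placeholders for your own values:

```python
import json
from urllib.request import urlopen


def job_info_url(oozie_base, job_id):
    """Build the Oozie web-services URL that returns a job's info as JSON."""
    return "%s/v1/job/%s?show=info" % (oozie_base.rstrip("/"), job_id)


def job_status(oozie_base, job_id):
    """Fetch the job's JSON description and return its status field
    (e.g. RUNNING, SUCCEEDED, KILLED)."""
    with urlopen(job_info_url(oozie_base, job_id)) as resp:
        return json.load(resp)["status"]


# Usage (placeholder host and job ID — substitute your own):
# print(job_status("http://cxln2.c.thelab-240901.internal:11000/oozie",
#                  "0000000-190901123456789-oozie-oozi-W"))
```

This is equivalent to running oozie job -info from the command line, but is easier to call from scripts that poll until the workflow finishes.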

