Login to Web Console
Copy the Oozie examples to your home directory in the web console: cp /usr/hdp/current/oozie-client/doc/oozie-examples.tar.gz .
Extract the files from the tar archive
tar -zxvf oozie-examples.tar.gz
Edit examples/apps/map-reduce/job.properties and set:
nameNode=hdfs://10.142.1.1:8020
jobTracker=10.142.1.2:8050
queueName=default
examplesRoot=examples
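The nameNode and jobTracker values above are specific to this cluster; yours may differ. A quick, optional way to confirm them from the console is to query the Hadoop client configuration. This is only a sketch: the /etc/hadoop/conf path is the usual HDP location and is an assumption about your setup.
# Print the NameNode URI the client is configured with (e.g. hdfs://host:8020)
hdfs getconf -confKey fs.defaultFS
# Look up the ResourceManager (jobTracker) address; adjust the path if your
# configuration directory differs
grep -A1 "yarn.resourcemanager.address" /etc/hadoop/conf/yarn-site.xml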
Copy the examples directory to HDFS
hadoop fs -copyFromLocal examples
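As an optional sanity check, you can list the uploaded directory to confirm the workflow files are now in your HDFS home directory:
# Verify that the map-reduce example (workflow.xml and its lib directory) is in HDFS
hadoop fs -ls examples/apps/map-reduce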
Run the job
oozie job -oozie http://10.142.1.2:11000/oozie -config examples/apps/map-reduce/job.properties -run
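If you would rather capture the job ID than copy it by hand, a small shell sketch like the one below works; it assumes the -run command prints its usual line of the form "job: <id>".
# Capture the workflow ID from the "job: <id>" line (an assumption about the output format)
JOB_ID=$(oozie job -oozie http://10.142.1.2:11000/oozie \
  -config examples/apps/map-reduce/job.properties -run | awk '/^job:/ {print $2}')
echo "Submitted workflow: $JOB_ID"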
Check the job status for the job_id printed in the previous step
oozie job -oozie http://10.142.1.2:11000/oozie -info job_id
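To avoid re-running the -info command by hand, here is a rough polling sketch using the same CLI; replace $JOB_ID with the ID from the previous step if you did not capture it in a variable. The grep simply looks for the word RUNNING anywhere in the output, which is an assumption about the -info format.
# Poll every 10 seconds until the workflow is no longer reported as RUNNING
while oozie job -oozie http://10.142.1.2:11000/oozie -info "$JOB_ID" | grep -q RUNNING; do
  sleep 10
done
oozie job -oozie http://10.142.1.2:11000/oozie -info "$JOB_ID"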
Let’s run an Oozie job for a MapReduce action. Log in to the CloudxLab Linux console. Copy the Oozie examples to your home directory in the console and extract the files from the tar archive. Edit examples/apps/map-reduce/job.properties and set the values of nameNode and jobTracker. We can find the NameNode host in Ambari under the "HDFS" section.
We will be running examples/apps/map-reduce/workflow.xml in our job. Copy the examples directory to HDFS and run the job using the command displayed on the screen. cxln2.c.thelab-240901.internal:11000 is the host and port where the Oozie server is running.
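The workflow Oozie executes is the copy of examples/apps/map-reduce/workflow.xml that we uploaded to HDFS (the application path in job.properties normally points at that location). If you want to see exactly what will run, you can view it straight from HDFS:
# Show the workflow definition as it exists in your HDFS home directory
hadoop fs -cat examples/apps/map-reduce/workflow.xml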
Press Enter. We will get the job ID in the command prompt. To check the status of the job, type the command displayed on the screen. The job status is "Running".
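Once the status moves from RUNNING to SUCCEEDED (or FAILED/KILLED), the same CLI can fetch the job log, which is the first place to look if something goes wrong; substitute the job ID printed at submission for job_id.
# Retrieve the Oozie log for this workflow run
oozie job -oozie http://cxln2.c.thelab-240901.internal:11000/oozie -log job_id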