I am trying to run a sqoop action in Oozie on a Hadoop cluster.
Whenever I check the job's status, Oozie returns the following status update:
Actions
ID                                                    Status  Ext ID                  Ext Status     Err Code
0000037-140930230740727-oozie-oozi-W#:start:          OK      -                       OK             -
0000037-140930230740727-oozie-oozi-W#sqoop-load       ERROR   job_1412278758569_0002  FAILED/KILLED  JA018
0000037-140930230740727-oozie-oozi-W#sqoop-load-fail  OK      -                       OK             E0729
This leads me to believe that there is nothing wrong with my workflow itself, and that the problem is instead some permission I am missing.
My jobs.properties config:
nameNode=hdfs://mynamenode.demo.com:8020
jobTracker=mysnamenode.demo.com:8050
queueName=default
workingRoot=working_dir
jobOutput=/user/test/out
oozie.use.system.libpath=true
oozie.libpath=/user/oozie/share/lib
oozie.wf.application.path=${nameNode}/user/test/${workingRoot}
MyWorkFlow.xml :
<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns='uri:oozie:workflow:0.4' name='sqoop-workflow'>
<start to='sqoop-load' />
<action name="sqoop-load">
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<prepare>
<delete path="${nameNode}/user/test/${workingRoot}/out-data/sqoop" />
<mkdir path="${nameNode}/user/test/${workingRoot}/out-data"/>
</prepare>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${queueName}</value>
</property>
</configuration>
<command>import --connect jdbc:oracle:thin:#10.100.50.102:1521/db --username myID --password myPass --table SomeTable -target-dir /user/test/${workingRoot}/out-data/sqoop </command>
</sqoop>
<ok to="end"/>
<error to="sqoop-load-fail"/>
</action>
<kill name="sqoop-load-fail">
<message>Sqoop export failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name='end' />
</workflow-app>
Steps I have taken:
Looking up the error... didn't find much beyond what I mentioned previously.
Checking that the required ojdbc.jar file was executable, and that the /user/oozie/share/lib/sqoop directory is accessible on HDFS (a quick check is sketched after this list).
Checking whether I have any preexisting directories that might be causing a problem.
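A quick way to run the sharelib check from the list above (assuming the default sharelib location on HDFS):

# List the sqoop sharelib; ojdbc.jar should appear here and be readable:
hadoop fs -ls /user/oozie/share/lib/sqoop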
I have been searching the internet and my log files for an answer... any help provided would be much appreciated.
Update:
OK... so I added ALL of the jars within /usr/lib/sqoop/lib to /user/oozie/share/lib/sqoop. I am still getting the same errors. Checking the job log, there is something I did not post previously:
2014-10-03 11:16:35,586 WARN CoordActionUpdateXCommand:542 - USER[ambari-qa] GROUP[-] TOKEN[] APP[sqoop-workflow] JOB[0000015-141002171510902-oozie-oozi-W] ACTION[-] E1100: Command precondition does not hold before execution, [, coord action is null], Error Code: E1100
As you can see, I am running the job as the "Super User", and the error is exactly the same, so it cannot be a permission issue. I am thinking there is a jar required beyond those in the /user/oozie/share/lib/sqoop directory... perhaps I need to copy the MapReduce jars into /user/oozie/share/lib/mapreduce?
OK... problem solved.
Apparently EVERY component of the Oozie workflow/job must have its corresponding *.jar dependencies uploaded to the Oozie SharedLib (/user/oozie/share/lib/) directories corresponding to those components.
I copied ALL the *.jar files in /usr/lib/sqoop/lib into /user/oozie/share/lib.
I copied ALL the *.jar files in /usr/lib/oozie/lib into /user/oozie/share/lib/oozie.
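In shell terms, the fix was roughly the following (a sketch assuming the paths above; depending on your Oozie version, the sharelib may also need to be refreshed with oozie admin -sharelibupdate, or Oozie restarted, before the new jars are picked up):

# Upload the Sqoop and Oozie jar dependencies into the Oozie sharelib on HDFS:
hadoop fs -put /usr/lib/sqoop/lib/*.jar /user/oozie/share/lib/sqoop/
hadoop fs -put /usr/lib/oozie/lib/*.jar /user/oozie/share/lib/oozie/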
After running the job again, the workflow stalled, and the error given was different from the last one: this time the workflow was trying to create a directory on HDFS that already existed. So I removed that directory and ran the job again...
SUCCESS!
Side note: people really need to write better exception messages. If this were just an issue a few people were having, then fine, but that is simply not the case. This particular error is giving more than a few people fits, if the requests for help online are any indication.
I faced the same problem. Just adding an
<archive>path/in/hdfs/ojdbc6.jar#ojdbc6.jar</archive>
element to my workflow.xml within the <sqoop> </sqoop> tags worked for me. Got the reference here.
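Note that the driver jar has to exist at that HDFS path before the action runs; a minimal upload sketch (keeping the placeholder path from the snippet above):

# Upload the Oracle JDBC driver to the HDFS path referenced by <archive>:
hadoop fs -mkdir -p path/in/hdfs
hadoop fs -put ojdbc6.jar path/in/hdfs/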
Related
I am trying to copy a file from one HDFS directory to another HDFS directory with the help of a shell script, as part of an Oozie job, but I am not able to copy it through Oozie.
Can we copy a file from one HDFS directory to another in HDFS using Oozie?
When I run the Oozie job, I do not get any error.
It shows status SUCCEEDED, but the file is not copied to the destination directory.
The Oozie files are below.
test.sh
#!/bin/bash
echo "listing files in the current directory, $PWD"
sudo hadoop fs -cp /user/cloudera/RAVIOOZIE/input/* /user/cloudera/RAVIOOZIE/output/
ls # list files
my workflow.xml is
<workflow-app name="RAMA" xmlns="uri:oozie:workflow:0.5">
<start to="shell-381c"/>
<kill name="Kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<action name="shell-381c">
<shell xmlns="uri:oozie:shell-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<exec>/user/cloudera/test.sh</exec>
<file>/user/cloudera/test.sh#test.sh</file>
<capture-output/>
</shell>
<ok to="End"/>
<error to="Kill"/>
</action>
<end name="End"/>
and my job.properties
oozie.use.system.libpath=True
security_enabled=False
dryrun=True
jobTracker=localhost:8032
nameNode=hdfs://quickstart.cloudera:8020
oozie.wf.application.path=${nameNode}/user/cloudera/test/
Please help with this: why is the file not copied to my destination directory?
Please let me know if there is anything I missed.
As mentioned in the comments by @Samson:
If you want to do Hadoop actions with Oozie, you should use an fs (HDFS) action rather than a shell action for that.
I am not sure why you don't get an error, but here is some speculation on what might be happening:
You give Oozie the task of starting a shell action; it successfully starts the shell action and reports success. Then the shell action fails, but that's not Oozie's problem.
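If you do stick with a shell action, note that sudo will generally not work on the cluster nodes, and Oozie only marks the action failed when the script exits non-zero. A minimal revision of test.sh along those lines (a sketch, assuming the hadoop client is on the PATH of the container's user):

#!/bin/bash
# Exit with a non-zero status if any command fails, so Oozie sees the failure:
set -e
echo "listing files in the current directory, $PWD"
# No sudo: the script runs as the YARN container user on an arbitrary node
hadoop fs -cp /user/cloudera/RAVIOOZIE/input/* /user/cloudera/RAVIOOZIE/output/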
Can I write a sqoop import command in a script and execute it in Oozie as a coordinator workflow?
I have tried to do so, and found an error saying sqoop command not found, even if I give the absolute path for sqoop to execute.
script.sh is as follows
sqoop import --connect 'jdbc:sqlserver://xx.xx.xx.xx' -username=sa -password -table materials --fields-terminated-by '^' -- --schema dbo -target-dir /user/hadoop/CFFC/oozie_materials
I have placed the file in HDFS and gave Oozie its path. The workflow is as follows:
<workflow-app xmlns='uri:oozie:workflow:0.3' name='shell-wf'>
    <start to='shell1' />
    <action name='shell1'>
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <exec>script.sh</exec>
            <file>script.sh#script.sh</file>
        </shell>
        <ok to="end" />
        <error to="fail" />
    </action>
    <kill name="fail">
        <message>Script failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name='end' />
</workflow-app>
Oozie returns an error saying sqoop command not found in the MapReduce log.
So is that a good practice?
Thanks
The shell action will be running as a mapper task, as you have observed. The sqoop command needs to be present on each data node where the mapper may run. If you make sure the sqoop command line is there and has the proper permissions for the user who submitted the job, it should work.
The way to verify could be (sketched below):
ssh to a datanode as the specific user
run the sqoop command line to see if it works
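A rough version of that check (the hostname and user are placeholders):

# Log in to a data node as the user who submits the Oozie job:
ssh myuser@datanode1.example.com
# Then, on the data node, confirm the sqoop client is on the PATH and runnable:
which sqoop
sqoop version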
Try adding the sqljdbc41.jar SQL Server driver to your HDFS, add an archive tag to your workflow.xml as below, and then run the Oozie workflow command again:
<archive>${HDFSAPATH}/sqljdbc41.jar#sqljdbc41.jar</archive>
If the problem persists, then add a hive-site.xml with the properties below:
javax.jdo.option.ConnectionURL
hive.metastore.uris
Keep hive-site.xml in HDFS, add a file tag in workflow.xml, and rerun the Oozie workflow.
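The two uploads might look like this (treating ${HDFSAPATH} and the local hive-site.xml location as assumptions):

# Upload the SQL Server JDBC driver referenced by the <archive> tag:
hadoop fs -put sqljdbc41.jar ${HDFSAPATH}/sqljdbc41.jar
# Upload a hive-site.xml with the metastore properties, to be referenced by a <file> tag:
hadoop fs -put /etc/hive/conf/hive-site.xml ${HDFSAPATH}/hive-site.xml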
The following is my workflow.xml
<workflow-app xmlns="uri:oozie:workflow:0.3" name="import-job">
<start to="createtimelinetable" />
<action name="createtimelinetable">
<sqoop xmlns="uri:oozie:sqoop-action:0.3">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<configuration>
<property>
<name>mapred.compress.map.output</name>
<value>true</value>
</property>
</configuration>
<command>import --connect jdbc:mysql://10.65.220.75:3306/automation --table ABC --username root</command>
</sqoop>
<ok to="end"/>
<error to="end"/>
</action>
<end name="end"/>
</workflow-app>
Getting the following error on trying to submit the job:
Error: E0701 : E0701: XML schema error, cvc-elt.1.a: Cannot find the declaration of element 'action'.
However, oozie validate workflow.xml returns:
Valid worflow-app
Has anyone faced and resolved a similar issue in the past?
Confirm that you have copied your workflow.xml to HDFS. You need not copy job.properties to HDFS, but you do have to copy all the other files and libraries to HDFS, roughly as sketched below.
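A sketch of the upload (the application path is hypothetical; use the one from oozie.wf.application.path in your job.properties):

# Create the application directory and copy the workflow definition:
hadoop fs -mkdir -p /user/myuser/import-job/lib
hadoop fs -put workflow.xml /user/myuser/import-job/
# Jars the action depends on (e.g. the MySQL JDBC driver) go into lib/:
hadoop fs -put mysql-connector-java.jar /user/myuser/import-job/lib/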
For those who reached here by googling the error message, below is the general way to resolve Oozie schema issues:
Once your workflow.xml is complete, it's a best practice to validate it against the Oozie XSD schema file rather than submitting the Oozie job and facing the issue later.
A note on what an XSD schema is:
An XSD schema is a kind of validation file which describes:
a. the sequence of tags
b. whether a tag must be present or not
c. what the valid sub-tags of a tag are, etc.
How to validate a workflow XML against the XSD?
a. Find out the specific XSD; this is seen in the xmlns (XML namespace) property:
<workflow-app name='FooBarWorkFlow' xmlns="uri:oozie:workflow:0.4">
In this case, it is uri:oozie:workflow:0.4. Find the XSD file for uri:oozie:workflow:0.4 (get it from the appendix of the official Oozie site, or it can be found easily by googling).
b. There are numerous XML validation sites (for example https://www.liquid-technologies.com/online-xsd-validator); provide your workflow XML file and the XSD file, and validate.
Errors in the workflow XML file will be listed out with line and column info. Rectify these, then use the valid workflow XML file to avoid schema validation errors in Oozie.
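The same check can be done locally with xmllint (assuming you have downloaded the schema as oozie-workflow-0.4.xsd):

# Validate workflow.xml against the Oozie workflow 0.4 schema; errors are reported with line/column info:
xmllint --noout --schema oozie-workflow-0.4.xsd workflow.xml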
oozie validate some_workflow.xml
Tells you line numbers and is much easier to understand than logging output.
Oozie has a config property called oozie.launcher.action.main.class where you can pass in the name of a "main class" for a map-reduce action (or a shell action), like so:
<configuration>
    <property>
        <name>oozie.launcher.action.main.class</name>
        <value>com.company.MyCascadingClass</value>
    </property>
</configuration>
But I need to pass arguments to my main class and can't see a way to do it. Any ideas?
I'm asking because I'm trying to launch a Cascading class/flow from within Oozie and all options I've tried so far have failed. If anyone has gotten Cascading to work from Oozie, let me know and I'll post another question asking that in particular.
As of Oozie 3 (I haven't tried Oozie 4 yet), the answer to my main question is: you can't. There is (strangely) no facility for specifying arguments to the main class defined with the oozie.launcher.action.main.class property.
@Dmitry's suggestion in the comments to just use the Oozie java action works for a Cascading job (or any Hadoop-dependent job), because Oozie puts all the Hadoop jars on the classpath when it launches the job.
I've documented a working example of launching a Cascading job from Oozie at my blog here: http://thornydev.blogspot.com/2013/10/launching-cascading-job-from-apache.html
Here is the workflow.xml file that worked for me:
<workflow-app xmlns='uri:oozie:workflow:0.2' name='cascading-wf'>
    <start to='stage1' />
    <action name='stage1'>
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <main-class>com.mycompany.MyCascade</main-class>
            <java-opts></java-opts>
            <arg>/user/myuser/dir1/dir2</arg>
            <arg>my-arg-2</arg>
            <arg>my-arg-3</arg>
            <file>lib/${EXEC}#${EXEC}</file>
            <capture-output />
        </java>
        <ok to="end" />
        <error to="fail" />
    </action>
    <kill name="fail">
        <message>FAIL: Oh, the huge manatee!</message>
    </kill>
    <end name="end"/>
</workflow-app>
In the job.properties file that accompanies the workflow.xml, the EXEC property is defined as:
EXEC=mybig-shaded-0.0.1-SNAPSHOT.jar
and the jar is placed in the lib directory below the directory containing these two definition files, as sketched below.
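A sketch of that layout on HDFS (the application path is hypothetical):

# workflow.xml sits at the application root, the shaded jar under lib/:
hadoop fs -put workflow.xml /user/myuser/cascading-wf/
hadoop fs -put mybig-shaded-0.0.1-SNAPSHOT.jar /user/myuser/cascading-wf/lib/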
I'm trying to run a hive action through Oozie. My workflow.xml is as follows:
<workflow-app name='edu-apollogrp-dfe' xmlns="uri:oozie:workflow:0.1">
    <start to="HiveEvent"/>
    <action name="HiveEvent">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>oozie.hive.defaults</name>
                    <value>${hiveConfigDefaultXml}</value>
                </property>
            </configuration>
            <script>${hiveQuery}</script>
            <param>OUTPUT=${StagingDir}</param>
        </hive>
        <ok to="end"/>
        <error to="end"/>
    </action>
    <kill name='kill'>
        <message>Hive failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name='end'/>
</workflow-app>
And here is my job.properties file:
oozie.wf.application.path=${nameNode}/user/${user.name}/hiveQuery
oozie.libpath=${nameNode}/user/${user.name}/hiveQuery/lib
queueName=interactive
#QA
nameNode=hdfs://hdfs.bravo.hadoop.apollogrp.edu
jobTracker=mapred.bravo.hadoop.apollogrp.edu:8021
# Hive
hiveConfigDefaultXml=/etc/hive/conf/hive-default.xml
hiveQuery=hiveQuery.hql
StagingDir=${nameNode}/user/${user.name}/hiveQuery/Output
When I run this workflow, I end up with this error:
ACTION[0126944-130726213131121-oozie-oozi-W#HiveEvent] Launcher exception: org/apache/hadoop/hive/cli/CliDriver
java.lang.NoClassDefFoundError: org/apache/hadoop/hive/cli/CliDriver
Error Code: JA018
Error Message: org/apache/hadoop/hive/cli/CliDriver
I'm not sure what this error means. Where am I going wrong?
EDIT
This link says error code JA018 is: "JA018 is output directory exists error in workflow map-reduce action." But in my case the output directory does not exist, which makes it all the more confusing.
I figured out what was going wrong!
The class org/apache/hadoop/hive/cli/CliDriver is required for the execution of a Hive action; this much is obvious from the error message. This class is within the jar file hive-cli-0.7.1-cdh3u5.jar (cdh3u5 being the Cloudera version in my case).
Oozie checks for this jar in the sharelib directory. The location of this directory is usually configured in oozie-site.xml, with the property name oozie.service.WorkflowAppService.system.libpath, so Oozie should find the jar easily.
But in my case this property was not set, so Oozie didn't know where to look for this jar, hence the java.lang.NoClassDefFoundError.
To resolve this, I had to include a parameter in my job.properties file to point Oozie to the location of the sharelib directory, as follows:
oozie.libpath=${nameNode}/user/oozie/share/lib
(This depends on where the sharelib directory is configured on your cluster.)
This got rid of the error!
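A quick way to confirm the jar is visible in the sharelib before resubmitting (the path below is a common default, so treat it as an assumption):

# The hive sharelib should contain a hive-cli-*.jar matching your distribution:
hadoop fs -ls /user/oozie/share/lib/hive | grep hive-cli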