Could not locate Oozie sharelib - hadoop

The details of my oozie job are given below:
I am getting the following error:
57-oozie-oozi-W#MR] Error starting action [MR]. ErrorType [FAILED], ErrorCode [EJ001], Message [Could not locate Oozie sharelib]
org.apache.oozie.action.ActionExecutorException: Could not locate Oozie sharelib
at org.apache.oozie.action.hadoop.JavaActionExecutor.addSystemShareLibForAction(JavaActionExecutor.java:603)
at org.apache.oozie.action.hadoop.JavaActionExecutor.addAllShareLibs(JavaActionExecutor.java:698)
at org.apache.oozie.action.hadoop.JavaActionExecutor.setLibFilesArchives(JavaActionExecutor.java:689)
at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:884)
at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1135)
at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:228)
at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:63)
I have put the lib in /user/root and /user/oozie, but it is still not able to find it.

Quoting the Oozie documentation, a MapReduce action should not need the sharelib at all:
The Oozie sharelib TAR.GZ file bundled with the distribution contains
the necessary files to run Oozie map-reduce streaming, pig, hive,
sqoop, and distcp actions ... other actions (mapreduce, shell, ssh,
and java) do not require the sharelib to be installed.
Anyway, the root directory for the shared libs should be something like
/user/oozie/share/lib/
Your setup is probably missing the "share" part (see the Quick Start docs).
And if you want to try a non-default location, look at the documentation for the "sharelib create" command, and also at the specifics of the Oozie server config (a minimal example is sketched after the links below).
http://oozie.apache.org/docs/4.1.0/AG_Install.html#Oozie_Share_Lib
plus #Oozie_Server_Setup
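For reference, a minimal sketch of creating the sharelib and telling the running server about it; the oozie-setup.sh location, namenode URI, and Oozie host are assumptions you will need to adjust for your cluster:
# run on the Oozie server host as the oozie user (paths and URIs are illustrative)
sudo -u oozie /usr/lib/oozie/bin/oozie-setup.sh sharelib create -fs hdfs://namenode:8020
# make the running server pick up the new sharelib
sudo -u oozie oozie admin -oozie http://localhost:11000/oozie -sharelibupdate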

Related

oozie fails with Could not load db driver class: oracle.jdbc.OracleDriver

I am getting the below error while executing a sqoop export command (in a shell script) with Oozie:
"java.lang.RuntimeException: Could not load db driver class: oracle.jdbc.OracleDriver"
The sqoop export from the CLI (edge node) works fine.
I have added ojdbc6.jar to the below locations.
/opt/cloudera/parcels/CDH-5.7.1-1.cdh5.7.1.p0.11/lib/sqoop/lib/
(HDFS locations)
/user/oozie/share/lib/sqoop/ and
/user/oozie/share/lib/lib_20161215195933/sqoop
I have also set oozie.use.system.libpath=true in my Oozie job.properties file.
Please guide me if I am missing any setting.
Thanks & Regards,
Sonali
Make sure that you upload the file to the directory /user/oozie/share/lib/sqoop (it may look like /user/oozie/share/lib/lib_${timestamp}/sqoop for Cloudera and HDP).
Check that the ojdbc6.jar file is correct: verify that it contains OracleDriver.class and that the file size looks right. It could have been corrupted during download (see the sketch below).
Check the permissions on the ojdbc6.jar file (if needed, you can try giving it 755 permissions). Also check who owns the file; it should be oozie by default.
Update the Oozie sharelib by executing the command below (run it on the host where the Oozie server is located):
sudo -u oozie oozie admin -oozie http://<Oozie_Server_Host>:11000/oozie -sharelibupdate
Verify the sharelib for sqoop:
sudo -u oozie oozie admin -oozie http://<Oozie_Server_Host>:11000/oozie -shareliblist sqoop*
You can also restart the Oozie service; it should update the sharelib.
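As a concrete illustration of the checks above (a sketch only; the lib_ directory and host name are taken from the question and may differ on your cluster):
unzip -l ojdbc6.jar | grep OracleDriver.class        # confirm the driver class is inside the jar
hdfs dfs -put ojdbc6.jar /user/oozie/share/lib/lib_20161215195933/sqoop/
hdfs dfs -chown oozie:oozie /user/oozie/share/lib/lib_20161215195933/sqoop/ojdbc6.jar
hdfs dfs -chmod 755 /user/oozie/share/lib/lib_20161215195933/sqoop/ojdbc6.jar
sudo -u oozie oozie admin -oozie http://<Oozie_Server_Host>:11000/oozie -sharelibupdate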
Create a directory named lib next to your workflow.xml in HDFS and put jars in there. Oozie will automatically make those jars available to all actions in that workflow.
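For example (a sketch; the application directory is hypothetical and should match wherever your workflow.xml lives):
hdfs dfs -mkdir -p /user/<your_user>/apps/sqoop-wf/lib
hdfs dfs -put ojdbc6.jar /user/<your_user>/apps/sqoop-wf/lib/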
Cloudera users should check this article, especially the 'One Last Thing' paragraph.

Oozie sharelib does not exist

I am setting up Oozie for the first time. After all the setup was done, I ran the example workflows and they failed. The -errorlog shows the following message:
org.apache.oozie.action.ActionExecutorException:
File /user/userName/share/lib does not exist
I did run the below command to create the sharelib during setup, and I can see the sharelib with all the jars in the HDFS web console:
./oozie-setup.sh sharelib create -fs hdfs://localhost:9000
but when I verify the available sharelib using
oozie admin -shareliblist
it shows only an [Available ShareLib] message, with no libs actually listed.
Any idea what the issue could be?

Unable to deploy Spark jobs using Oozie

I need to keep a Spark job running 24/7, and for this I am using Oozie. To do this I have written workflow.xml and job.properties files containing the information needed to invoke it.
However, when I try to submit the Oozie job using this:
oozie job –config /home/oozie/tst/job.properties -run
I get the following error message, which is very clear:
java.io.IOException: configuration is not specified
at org.apache.oozie.cli.OozieCLI.getConfiguration(OozieCLI.java:816)
at org.apache.oozie.cli.OozieCLI.jobCommand(OozieCLI.java:1055)
at org.apache.oozie.cli.OozieCLI.processCommand(OozieCLI.java:686)
at org.apache.oozie.cli.OozieCLI.run(OozieCLI.java:639)
at org.apache.oozie.cli.OozieCLI.main(OozieCLI.java:225)
configuration is not specified
The strange thing here is that the configuration file (job.properties) does exist locally at the path specified. I have also PUT the directory containing both files and the .jar into HDFS.
Any idea why this is failing?
Is Oozie the best tool for the task I have?
The -config parameter takes a local path, not an HDFS one. Check that job.properties is present at /home/oozie/tst/job.properties.
Check that job.properties contains oozie.wf.application.path=PATH_TO_HDFS_PATH_WHERE_WORKFLOW.XML_IS_PRESENT (a minimal sketch is given below).
Also, I see that the dash (–) given in the config parameter is a different character from the dash (-) in the run parameter; retype it as a plain hyphen.
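A minimal job.properties sketch for reference (host names and the HDFS path are placeholders, not values from this cluster):
nameNode=hdfs://namenode-host:8020
jobTracker=resourcemanager-host:8032
oozie.use.system.libpath=true
# HDFS directory that contains workflow.xml
oozie.wf.application.path=${nameNode}/user/oozie/tst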
Specify the host in your command:
oozie job -oozie http://your_host:11000/oozie -config /home/oozie/tst/job.properties -run
11000 is the default port.
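Assuming standard Oozie CLI behavior (not anything specific to this setup), you can also export OOZIE_URL once so you do not have to pass -oozie on every command:
export OOZIE_URL=http://your_host:11000/oozie
oozie job -config /home/oozie/tst/job.properties -run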

Running Spark Jobs via Oozie

Is it possible to run Spark jobs, e.g. Spark SQL jobs, via Oozie?
In the past we have used Oozie with Hadoop. Since we are now using Spark SQL on top of YARN, I am looking for a way to use Oozie to schedule the jobs.
Thanks.
Yup, it's possible ... The procedure is the same: you have to provide Oozie a directory structure containing coordinator.xml, workflow.xml, and a lib directory with your jar files.
But remember that Oozie starts the job with a java -cp command, not with spark-submit, so if you have to run it with Oozie, here is a trick.
Run your jar with spark-submit in the background.
Look for that process in the process list. It will be running under a java -cp command, but with some additional jars that are added by spark-submit. Add those jars to the classpath, and that's it. Now you can run your Spark applications through Oozie.
1. nohup spark-submit --class package.to.MainClass /path/to/App.jar &
2. ps aux | grep '/path/to/App.jar'
EDITED: You can also use the latest Oozie, which has a Spark action as well.
To run Spark SQL with Oozie you need to use the Oozie Spark action.
You can locate oozie.gz in your distribution. In Cloudera the Oozie examples directory can usually be found at the path below:
]$ locate oozie.gz
/usr/share/doc/oozie-4.1.0+cdh5.7.0+267/oozie-examples.tar.gz
Spark SQL needs the hive-site.xml file for execution, which you need to provide in workflow.xml:
<spark-opts>--files /hive-site.xml</spark-opts>
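For context, a hedged sketch of what the surrounding Spark action could look like in workflow.xml; the schema version, master, class, and jar path are illustrative assumptions, not values from this question:
<action name="spark-sql-node">
    <spark xmlns="uri:oozie:spark-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <master>yarn-cluster</master>
        <name>SparkSQLExample</name>
        <class>com.example.SparkSQLMain</class>
        <jar>${nameNode}/user/oozie/apps/spark-sql/lib/app.jar</jar>
        <spark-opts>--files /hive-site.xml</spark-opts>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
</action>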

Oozie + Sqoop: JDBC Driver Jar Location

I have a 6-node Cloudera-based Hadoop cluster, and I'm trying to connect to an Oracle database from a Sqoop action in Oozie.
I have copied my ojdbc6.jar into the sqoop lib location (which for me happens to be at /opt/cloudera/parcels/CDH-4.2.0-1.cdh4.2.0.p0.10/lib/sqoop/lib/) on all the nodes, and have verified that I can run a simple 'sqoop eval' from all 6 nodes.
Now when I run the same command using Oozie's sqoop action, I get "Could not load db driver class: oracle.jdbc.OracleDriver".
I have read this article about using shared libs, and it makes sense to me when we're talking about my task/action/workflow-specific dependencies. But I see a JDBC driver installation as an extension to sqoop, and so I think it belongs in the sqoop installation lib.
Now the question is: while sqoop sees this ojdbc6 jar I have put into its lib folder, how come my Oozie workflow doesn't see it?
Is this expected, or am I missing something?
As an aside, what do you guys think is the appropriate location for a JDBC driver jar?
Thanks in advance!
The JDBC driver jar (and any jars it depends on) should go in your Oozie sharelib folder on HDFS. I'm running Hortonworks Data Platform 1.2 instead of Cloudera 4.2, so the details may vary, but my JDBC driver is located in /user/oozie/share/lib/sqoop. This should allow you to run Sqoop with the JDBC driver via Oozie.
It is not necessary to put the JDBC driver jar in the sqoop lib on the data nodes. In my setup I can't run a simple sqoop eval from the command line on my data nodes. I understand the logic for why you thought this would work. The reason the JDBC driver jar needs to be on HDFS is so that all the data nodes have access to it. Your solution should accomplish the same goal. I'm not familiar enough with the inner workings of Oozie to say why using the sharelib works but your solution does not.
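For instance, on a Hortonworks-style layout the upload is just (path as described above; adjust the user and directory for your cluster):
hdfs dfs -put ojdbc6.jar /user/oozie/share/lib/sqoop/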
In CDH5, you should put the jar into '/user/oozie/share/lib/lib_${timestamp}/sqoop', and after that you must update the sharelib or restart Oozie.
To update the sharelib:
oozie admin -oozie http://localhost:11000/oozie -sharelibupdate
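Putting it together (a sketch; the lib_<timestamp> directory must be the one that actually exists on your cluster):
hdfs dfs -ls /user/oozie/share/lib                  # find the current lib_<timestamp> directory
hdfs dfs -put ojdbc6.jar /user/oozie/share/lib/lib_<timestamp>/sqoop/
oozie admin -oozie http://localhost:11000/oozie -sharelibupdate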
If you are using CDH 5, the JDBC driver jar (and any jars it depends on) should go in the '/user/oozie/share/lib/lib_${timestamp}/sqoop' folder on HDFS.
I was facing the same issue; it was not able to find the MySQL jar. I am using Cloudera 4.4, and in this version even the oozie admin -oozie http://localhost:11000/oozie -sharelibupdate command will not work.
To resolve the issue I followed the steps below:
Create a user in Hue with HDFS access and give it admin privileges.
Using the Hue UI, upload the jar into the /user/oozie/share/lib/sqoop HDFS path,
or you can use the command below:
hadoop fs -put /var/lib/sqoop2/mysql-connector-java.jar /user/oozie/share/lib/sqoop
Once the jar is in place, run the Oozie command.

Resources