oozie: Could not perform authorization operation, Failed on local exception - hadoop

I was trying to run the oozie example from https://oozie.apache.org/docs/3.1.3-incubating/DG_Examples.html
Then I copied the job.properties from examples/apps/aggregator to my home directory on the local filesystem, and edited it as below:
nameNode=hdfs://localhost:9000
jobTracker=localhost:8032
queueName=default
examplesRoot=examples
oozie.coord.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/aggregator
start=2010-01-01T01:00Z
end=2010-01-01T03:00Z
I am sure the port is right, because hadoop fs -ls hdfs://localhost:9000 returns the files in HDFS.
Then I typed this in the terminal:
oozie job -oozie http://127.0.0.1:11000/oozie -config ~/job.properties -run
but it returned:
Error: E0501 : E0501: Could not perform authorization operation, Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "Master/x.x.x.x"; destination host is: "localhost":9000
Any help would be appreciated.

Related

error while running example of oozie job

I tried running my first Oozie job by following a blog post.
I used oozie-examples.tar.gz; after extracting it, I placed the examples in HDFS.
I tried running the map-reduce job in it but unfortunately got an error.
I ran the command below:
oozie job -oozie http://localhost:11000/oozie -config /examples/apps/map-reduce/job.properties -run
Got the error:
java.io.IOException: configuration is not specified
    at org.apache.oozie.cli.OozieCLI.getConfiguration(OozieCLI.java:787)
    at org.apache.oozie.cli.OozieCLI.jobCommand(OozieCLI.java:1026)
    at org.apache.oozie.cli.OozieCLI.processCommand(OozieCLI.java:662)
    at org.apache.oozie.cli.OozieCLI.run(OozieCLI.java:615)
    at org.apache.oozie.cli.OozieCLI.main(OozieCLI.java:218)
configuration is not specified
I don't know which configuration it is asking for, as I am using the Cloudera VM, which has all the configurations set by default.
oozie job -oozie http://localhost:11000/oozie -config /examples/apps/map-reduce/job.properties -run
The -config parameter takes a local path, not an HDFS path. The workflow.xml needs to be present in HDFS, and its path is defined in the job.properties file with the property:
oozie.wf.application.path=<path to the workflow.xml>
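For example, assuming the examples archive was also extracted to the local home directory (host names, ports, and paths here are illustrative), the command would reference the local copy of job.properties:
oozie job -oozie http://localhost:11000/oozie -config ~/examples/apps/map-reduce/job.properties -run
while job.properties points the application path at HDFS:
nameNode=hdfs://localhost:8020
jobTracker=localhost:8021
queueName=default
oozie.wf.application.path=${nameNode}/user/${user.name}/examples/apps/map-reduce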

hive-builtins-0.9.0.jar FileNotFoundException

I am a newbie and I am trying to run a Hive query:
hive> SELECT xpath('<a><b id="foo">b1</b><b id="bar">b2</b></a>','//@id') FROM src LIMIT 1;
When I execute the above command, I get the following error:
Job Submission failed with exception 'java.io.FileNotFoundException(File does not exist: hdfs://localhost:9100/usr/local/hive/lib/hive-builtins-0.9.0.jar)'
Execution failed with exit status: 2
Obtaining error information
Task failed!
Task ID: Stage-1
It is looking for hive-builtins-0.9.0.jar in HDFS, but this file is available under $HIVE_HOME/lib. Why should it be uploaded to HDFS?
I have the following settings in ~/.hiverc, which is applied at Hive startup:
set hive.cli.print.current.db=true;
set hive.exec.mode.local.auto=true;
If I add this Hadoop property to hive-site.xml, then it gives me the required output:
<property>
  <name>fs.defaultFS</name>
  <value>file:///</value>
</property>
but ideally I want to set it to
<value>hdfs://localhost</value>
as I have other Hadoop-specific Java programs that use HDFS. What is the mistake I am making here? Is there a configuration I need to set at startup?
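That is, something like this minimal sketch, where the port 9100 is assumed from the NameNode URI in the error above:
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9100</value>
</property>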
As requested, the $PATH information:
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/local/hadoop/bin:/usr/local/hadoop/sbin:/usr/local/hadoop/bin:/usr/local/hadoop/sbin:/usr/local/hadoop/bin:/usr/local/hadoop/sbin:/usr/local/hive/bin
Please help.
Many thanks

Regarding job's failure running in oozie

A job running in Oozie is getting the following error:
hduser@ubuntu:~/oozie/distro/target/oozie-3.3.2-distro/oozie-3.3.2$ bin/oozie job -oozie -config examples/apps/map-reduce/job.properties -run
Error: E0902 : E0902: Exception occured: [Call to 127.0.0.1:8020 failed on local exception: java.io.IOException: Broken pipe]
How can I solve this? Thanks.
It looks like Oozie has trouble connecting to your NameNode.
It's trying to connect to 127.0.0.1:8020; is this where the NameNode is running?
If yes, double-check that it is running.
If not, make sure job.properties has a line like
nameNode=hdfs://namenode_host:8020
which points to the correct NameNode location.
Reference ${nameNode} in the <name-node> section of your action.
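For reference, a minimal sketch of such an action in workflow.xml (the action name here is illustrative):
<action name="mr-node">
  <map-reduce>
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    ...
  </map-reduce>
</action>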

Running Hadoop file commands generate error

I have created a Hadoop pseudo-distributed cluster on VirtualBox Ubuntu 12.04.
Running jps command shows that DataNode and NameNode processes are up.
I am trying to execute the following DFS command
hadoop fs -put conf input
But the above command or any other DFS command fails with
Bad connection to FS. Command aborted. exception: No FileSystem for scheme: hsdf
Any suggestions to get the above working?
Please check the paths of the conf and input files you're trying to pass.
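Also note that the scheme in the error reads hsdf rather than hdfs, which suggests the filesystem URI was mistyped somewhere in the configuration or the command. One place to check is the default filesystem property in core-site.xml; a sketch, assuming the usual pseudo-distributed host and port:
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>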

Oozie job configuration app directory not found on HDFS

I installed a pseudo-distributed version of Cloudera on my Linux box, and ran some simple MapReduce examples with success. However, I'm trying to get Oozie to work, and am completely baffled by the errors I am receiving when attempting to execute a simple job workflow:
tim@phocion:~$ oozie version
Oozie client build version: 3.1.3-cdh4.0.1
I copied the pre-packaged examples to HDFS and executed the job, per the documentation:
tim@phocion:~$ oozie job -oozie http://phocion:11000/oozie -config /user/tim/examples/apps/map-reduce/job.properties -run
Error: E0504 : E0504: App directory [hdfs://phocion:8020/user/tim/examples/apps/map-reduce] does not exist
Check to see if the directory exists:
tim@phocion:~$ hdfs dfs -ls /user/tim/examples/apps/map-reduce
Found 3 items
-rwxr-xr-x 1 tim tim 995 2012-10-03 14:47 /user/tim/examples/apps/map-reduce/job.properties
drwxrwxr-x - tim tim 4096 2012-10-03 14:47 /user/tim/examples/apps/map-reduce/lib
-rwxr-xr-x 1 tim tim 2559 2012-10-03 14:47 /user/tim/examples/apps/map-reduce/workflow.xml
It does. Can I connect to phocion:8020?
tim@phocion:~$ telnet phocion 8020
Trying 127.0.1.1...
Connected to phocion.
Escape character is '^]'.
I can. So, basically, I'm at a total loss as to what this error is trying to tell me - the folder very much does exist. I'm assuming the error is too vague to fully communicate what the issue is, but I've found virtually nothing out there that could point me in the right direction.
I can also replicate this error with other 3rd party tutorials.
Spent much time poring through configuration files, to the point of not wanting to look at a computer ever again. Maybe I'm overthinking the issue here, but any help would be greatly appreciated.
EDIT: Adding the full job.properties (not too different from the default):
nameNode=hdfs://phocion:8020
jobTracker=phocion:8021
queueName=default
examplesRoot=examples
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/map-reduce
outputDir=map-reduce
MORE EDITS: I get the same exact error when the folder DOES NOT exist, and after I put it back into HDFS. On the last-ditch idea that it's a permissions issue, chmod 777 still gets the same error. Passing the full HDFS path on the command line doesn't fix the issue, and running it under the oozie and even root accounts doesn't work either:
tim@phocion:~$ oozie job -oozie http://phocion:11000/oozie -run -config /home/tim/examples/apps/map-reduce/job.properties -Doozie.wf.application.path=hdfs://phocion:8020/user/tim/examples/apps/map-reduce
Error: E0504 : E0504: App directory [hdfs://phocion:8020/user/tim/examples/apps/map-reduce] does not exist
tim@phocion:~$ hdfs dfs -put examples/ /user/tim/
12/10/04 13:26:43 INFO util.NativeCodeLoader: Loaded the native-hadoop library
tim@phocion:~$ oozie job -oozie http://phocion:11000/oozie -run -config /home/tim/examples/apps/map-reduce/job.properties -Doozie.wf.application.path=hdfs://phocion:8020/user/tim/examples/apps/map-reduce
Error: E0504 : E0504: App directory [hdfs://phocion:8020/user/tim/examples/apps/map-reduce] does not exist
tim@phocion:~$ hdfs dfs -chmod -R 777 /user/tim/examples/
12/10/04 13:28:16 INFO util.NativeCodeLoader: Loaded the native-hadoop library
tim@phocion:~$ oozie job -oozie http://phocion:11000/oozie -run -config /home/tim/examples/apps/map-reduce/job.properties -Doozie.wf.application.path=hdfs://phocion:8020/user/tim/examples/apps/map-reduce
Error: E0504 : E0504: App directory [hdfs://phocion:8020/user/tim/examples/apps/map-reduce] does not exist
tim@phocion:~$ sudo -u oozie oozie job -oozie http://phocion:11000/oozie -run -config /home/tim/examples/apps/map-reduce/job.properties -Doozie.wf.application.path=hdfs://phocion:8020/user/tim/examples/apps/map-reduce
[sudo] password for tim:
Error: E0504 : E0504: App directory [hdfs://phocion:8020/user/tim/examples/apps/map-reduce] does not exist
tim@phocion:~$ sudo -u root oozie job -oozie http://phocion:11000/oozie -run -config /home/tim/examples/apps/map-reduce/job.properties -Doozie.wf.application.path=hdfs://phocion:8020/user/tim/examples/apps/map-reduce
Error: E0504 : E0504: App directory [hdfs://phocion:8020/user/tim/examples/apps/map-reduce] does not exist
Should this command work in theory?
tim@phocion:~$ hdfs dfs -ls hdfs://phocion:8020/user/tim/examples/apps/map-reduce
ls: `hdfs://phocion:8020/user/tim/examples/apps/map-reduce': No such file or directory
This shows up in hadoop-hdfs logs after executing the oozie command:
2012-10-04 13:50:00,152 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 113297
2012-10-04 13:50:00,874 INFO org.apache.hadoop.hdfs.server.namenode.TransferFsImage: Opening connection to http://localhost.localdomain:50090/getimage?getimage=1&txid=113296&storageInfo=-40:2092007576:0:cluster8
2012-10-04 13:50:00,875 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.net.ConnectException: Connection refused
2012-10-04 13:50:00,876 WARN org.mortbay.log: /getimage: java.io.IOException: GetImage failed. java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
at java.net.Socket.connect(Socket.java:529)
at java.net.Socket.connect(Socket.java:478)
at sun.net.NetworkClient.doConnect(NetworkClient.java:163)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:395)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:530)
at sun.net.www.http.HttpClient.<init>(HttpClient.java:234)
at sun.net.www.http.HttpClient.New(HttpClient.java:307)
at sun.net.www.http.HttpClient.New(HttpClient.java:324)
at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:970)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:911)
at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:836)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1172)
In addition to HarshJ's comment, check your error message:
Error: E0504 : E0504: App directory [hdfs://phocion:8020/user/tim/examples/apps/demo] does not exist
And the hadoop fs -ls listing you provided:
/user/tim/examples/apps/map-reduce/
And play spot the difference:
/user/tim/examples/apps/demo
/user/tim/examples/apps/map-reduce/
Try configuring as follows:
oozie.wf.application.path=/user/tim/examples/apps/map-reduce
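Or equivalently, reusing the variables already defined in the job.properties above:
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/map-reduce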
I had the same issue and fixed it by exporting the correct Oozie URL.
To export it, use the command below:
export OOZIE_URL=http://someip:11000/oozie
To find this Oozie URL, use Hue to connect to your cluster and navigate to Workflows, where you can find a tab called Oozie. Inside it you should see Gauges, where a lot of properties are listed. Look for the property oozie.servers.
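Once OOZIE_URL is exported, the -oozie option can be omitted, since the Oozie CLI falls back to that environment variable:
export OOZIE_URL=http://someip:11000/oozie
oozie job -config examples/apps/map-reduce/job.properties -run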
What you need to do is -copyFromLocal the examples folder to the location specified in the job's config.
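For instance, to match the application path used in the job.properties shown earlier:
hdfs dfs -copyFromLocal examples /user/tim/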
