NiFi 1.10 Stateless - Caused by: java.nio.file.NoSuchFileException - apache-nifi

I run this command:
/bin/nifi.sh stateless RunFromRegistry Once --file ./test/stateless_test1.json
Log:
Note: Use of this command is considered experimental. The commands and approach used may change from time to time.
Java home (JAVA_HOME): /home/deltaman/software/jdk1.8.0_211
Java options (STATELESS_JAVA_OPTS): -Xms1024m -Xmx1024m
13:48:39.835 [main] INFO org.apache.nifi.StatelessNiFi - Unpacking 100 NARs
13:50:51.513 [main] INFO org.apache.nifi.StatelessNiFi - Finished unpacking 100 NARs in 131671 millis
Exception in thread "main" java.lang.reflect.InvocationTargetException
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.StatelessNiFi.main(StatelessNiFi.java:103)
... 5 more
Caused by: java.nio.file.NoSuchFileException: ./test/stateless_test1.json
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
at java.nio.file.Files.newByteChannel(Files.java:361)
at java.nio.file.Files.newByteChannel(Files.java:407)
at java.nio.file.Files.readAllBytes(Files.java:3152)
at org.apache.nifi.stateless.runtimes.Program.runLocal(Program.java:119)
at org.apache.nifi.stateless.runtimes.Program.launch(Program.java:67)
... 10 more
It seems the file does not exist, but I can find the file as follows:
$ cat ./test/stateless_test1.json
{
  "registryUrl": "http://10.148.123.12:9991",
  "bucketId": "ec1b291e-c3f1-437c-a4e4-c069bd2f6ed1",
  "flowId": "b1f73fe8-2874-47a5-970c-6b25eea19497",
  "parameters": {
    "text" : "xixixixi"
  }
}
I don't know what the problem is. Any suggestion is appreciated!

/bin/nifi.sh stateless RunFromRegistry Once --file ./test/stateless_test1.json
That is a relative path; you must use the full path, for example:
/home/NiFi/nifi-1.10.0/bin/nifi.sh stateless RunFromRegistry Once --file /home/NiFi/nifi-1.10.0/test/stateless_test1.json
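If you want to keep the relative layout, a minimal sketch that builds the absolute path at invocation time (assuming a Linux shell where readlink -f is available, and the install directory shown above):
NIFI_HOME=/home/NiFi/nifi-1.10.0
"$NIFI_HOME/bin/nifi.sh" stateless RunFromRegistry Once --file "$(readlink -f ./test/stateless_test1.json)"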

Related

How to run .exe file in specific location using gradle

I need to run a .exe file that is in a specific location.
This is the code I'm using:
tasks.register('buildComponents', Exec) {
    dependsOn createSetupIni
    doLast {
        exec {
            workingDir = file('tools/configBuild')
            executable = 'ConfigBuilder.exe'
            args = [ "${configLocation}/setup.ini", "${logFilePath}/configBuild" ]
        }
    }
}
But I'm getting the error below during execution.
Caused by: org.gradle.process.internal.ExecException: A problem occurred starting process 'command 'ConfigBuilder.exe''
at org.gradle.process.internal.DefaultExecHandle.execExceptionFor(DefaultExecHandle.java:232)
at org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:209)
at org.gradle.process.internal.DefaultExecHandle.failed(DefaultExecHandle.java:356)
at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:86)
at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: net.rubygrapefruit.platform.NativeException: Could not start 'ConfigBuilder.exe'
at net.rubygrapefruit.platform.internal.DefaultProcessLauncher.start(DefaultProcessLauncher.java:27)
at net.rubygrapefruit.platform.internal.WindowsProcessLauncher.start(WindowsProcessLauncher.java:22)
at net.rubygrapefruit.platform.internal.WrapperProcessLauncher.start(WrapperProcessLauncher.java:36)
at org.gradle.process.internal.ExecHandleRunner.startProcess(ExecHandleRunner.java:97)
at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:70)
... 4 more
Caused by: java.io.IOException: Cannot run program "ConfigBuilder.exe" (in directory "<full path>\tools\configBuild"): CreateProcess error=2, The system cannot find the file specified
at net.rubygrapefruit.platform.internal.DefaultProcessLauncher.start(DefaultProcessLauncher.java:25)
... 8 more
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
... 9 more
Any idea about this?
Solved it by removing the exec {} block.
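For reference, a minimal sketch of what the task might look like with the nested exec {} removed, configuring the Exec task directly (same paths and properties as in the question; treat this as an untested sketch, not a verified build script):
tasks.register('buildComponents', Exec) {
    dependsOn createSetupIni
    // Configure the Exec task itself rather than wrapping a nested exec {} call in doLast
    workingDir = file('tools/configBuild')
    executable = 'ConfigBuilder.exe'
    args = [ "${configLocation}/setup.ini", "${logFilePath}/configBuild" ]
}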

Copying data between 2 different hadoop clusters

I am trying to copy data from one HDFS directory to another using distcp:
Source hadoop version:
hadoop version
Hadoop 2.0.0-cdh4.3.1
Destination hadoop version:
hadoop version
Hadoop 2.0.0-cdh4.4.0
Command I am using is:
hadoop distcp hftp://foo.test.net:50070/new_data/raw/new_logs/utc_date=2014-09-01/utc_hour=22 hdfs://localhost.localdomain/user/cloudera/new_logs
Error message I am getting is:
java.io.IOException: Copied: 0 Skipped: 0 Failed: 13
at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
==
Logs from "Task Logs":
2014-09-02 13:50:30,947 INFO org.apache.hadoop.tools.DistCp: FAIL utc_hour=22/000012_0 : java.io.IOException: HTTP_OK expected, received 500
at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderUrlOpener.connect(HftpFileSystem.java:376)
at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:119)
at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:187)
at java.io.DataInputStream.read(DataInputStream.java:83)
at org.apache.hadoop.tools.DistCp$CopyFilesMapper.copy(DistCp.java:424)
at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:547)
at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:314)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
2014-09-02 13:50:31,118 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
2014-09-02 13:50:31,131 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:java.io.IOException: Copied: 0 Skipped: 0 Failed: 13
2014-09-02 13:50:31,131 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: Copied: 0 Skipped: 0 Failed: 13
at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
2014-09-02 13:50:31,145 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
=
Any help?
Thanks,
Rio

giraph/ hadoop reading manifest file

I am trying to run the RandomWalkWithRestart example https://github.com/apache/giraph/blob/release-1.0/giraph-examples/src/main/java/org/apache/giraph/examples/RandomWalkWithRestartVertex.java
My input data is:
12 34 56
34 78
56 34 78
78 34
and I am running
hadoop jar giraph-examples-1.1.0-for-hadoop-2.2.0-jar-with-dependencies.jar GiraphRunner -Dgiraph.zkList=<host>:port -libjars giraph-examples-1.1.0-for-hadoop-2.2.0-jar-with-dependencies.jar
org.apache.giraph.examples.RandomWalkWithRestartComputation
-mc org.apache.giraph.examples.RandomWalkVertexMasterCompute
-wc org.apache.giraph.examples.RandomWalkWorkerContext
-vof org.apache.giraph.examples.VertexWithDoubleValueDoubleEdgeTextOutputFormat
-vif org.apache.giraph.examples.LongDoubleDoubleTextInputFormat
-vip giraph_algorithms/personalized_pr/input/graph.txt
-op giraph_algorithms/personalized_pr/out1 -w 1
But I am getting this error :-/
Error: java.lang.IllegalStateException: run: Caught an unrecoverable exception For input string: "PK�uE META-INF/��PKPK�uEMETA-INF/MANIFEST.MF�M��LK-.�"
at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:101)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.NumberFormatException: For input string: "PK�uE META-INF/��PKPK�uEMETA-INF/MANIFEST.MF�M��LK-.�"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:441)
at java.lang.Long.parseLong(Long.java:483)
at org.apache.giraph.examples.RandomWalkWorkerContext.initializeSources(RandomWalkWorkerContext.java:131)
at org.apache.giraph.examples.RandomWalkWorkerContext.setStaticVars(RandomWalkWorkerContext.java:160)
at org.apache.giraph.examples.RandomWalkWorkerContext.preApplication(RandomWalkWorkerContext.java:146)
at org.apache.giraph.graph.GraphTaskManager.workerContextPreApp(GraphTaskManager.java:815)
at org.apache.giraph.graph.GraphTaskManager.prepareGraphStateAndWorkerContext(GraphTaskManager.java:451)
at org.apache.giraph.graph.GraphTaskManager.execute(GraphTaskManager.java:266)
at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:91)
... 7 more
Why is it reading the manifest file, when I specifically told it to read a file and not even a directory?
Because you passed the libjars argument as if it were the vertex class file. Like the other arguments, you need to write it as: -D libjars=your_jar.jar.
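Following that suggestion, a sketch of the adjusted invocation (everything else copied from the question, split across lines as in the original; <host>:port stays a placeholder and this is untested):
hadoop jar giraph-examples-1.1.0-for-hadoop-2.2.0-jar-with-dependencies.jar GiraphRunner -Dgiraph.zkList=<host>:port -D libjars=giraph-examples-1.1.0-for-hadoop-2.2.0-jar-with-dependencies.jar
org.apache.giraph.examples.RandomWalkWithRestartComputation
-mc org.apache.giraph.examples.RandomWalkVertexMasterCompute
-wc org.apache.giraph.examples.RandomWalkWorkerContext
-vof org.apache.giraph.examples.VertexWithDoubleValueDoubleEdgeTextOutputFormat
-vif org.apache.giraph.examples.LongDoubleDoubleTextInputFormat
-vip giraph_algorithms/personalized_pr/input/graph.txt
-op giraph_algorithms/personalized_pr/out1 -w 1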

Execute permissions do not stick on HDFS files

Calling chmod does not stick for the execute permission:
-bash-3.2$ hadoop dfs -ls /user/myuser/myapp/
Found 1 items
-rw-rw-rw- 3 myuser myuser 2056 2013-09-13 23:14 /user/myuser/myapp/paf-rules.properties
-bash-3.2$ hadoop dfs -chmod ugo+x /user/myuser/myapp/*
-bash-3.2$ hadoop dfs -ls /user/myuser/myapp/
Found 1 items
-rw-rw-rw- 3 myuser myuser 2056 2013-09-13 23:14 /user/myuser/myapp/paf-rules.properties
We get a permissions error in which HDFS complains that the Execute permission is missing:
Unable to open readFileFromHdfs based rules file /user/myuser/dqm/paf-rules.properties : org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=myuser, access=EXECUTE, inode="paf-rules.properties":myuser:myuser:rw-rw-rw-
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:95)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:960)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:524)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:768)
at com.apple.ist.gbi.edw.etl.fwrk.udf.dqm.UDFPafDqm.readFileFromHdfs(UDFPafDqm.java:81)
at com.apple.ist.gbi.edw.etl.fwrk.udf.dqm.UDFPafDqm.<init>(UDFPafDqm.java:110)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:113)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.initialize(GenericUDFBridge.java:145)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:98)
at org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:214)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:767)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:888)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:89)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:88)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:125)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:102)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:165)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:7755)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:2310)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:2112)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:6165)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:6136)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:6762)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:6702)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:6723)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:7531)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:244)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:431)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:336)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:909)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:557)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=myuser, access=EXECUTE, inode="paf-rules.properties":myuser:myuser:rw-rw-rw-
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:199)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:155)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:125)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5222)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkTraverse(FSNamesystem.java:5201)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:2027)
at org.apache.hadoop.hdfs.server.namenode.NameNode.getFileInfo(NameNode.java:825)
at sun.reflect.GeneratedMethodAccessor28.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1122)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
at org.apache.hadoop.ipc.Client.call(Client.java:1092)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
at $Proxy6.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
at $Proxy6.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:958)
... 42 more

Wso2 bam 2.3.0 ERROR ExecDriver "No such file or directory"

I've installed WSO2 BAM 2.3.0 and started Analytics.
I completed all the steps from "Introduction to BAM Analytics Framework".
My system is Windows 7 64-bit.
I've installed Cygwin with the base, net, and security packages and updated my PATH variable by appending ;C:\cygwin\bin
After executing in the console, the errors below show up. Note that the file exists:
TID: [0] [BAM] [2013-06-18 10:31:35,383] ERROR {org.apache.hadoop.hive.ql.exec.ExecDriver} - Job Submission failed with exception 'org.apache.hadoop.util.Shell$ExitCodeException(chmod: getting attributes of `C:\\wso2\\wso2bam\\tmp\\hadoop\\staging\\gbelyaev-911074626\\.staging': No such file or directory
)'
org.apache.hadoop.util.Shell$ExitCodeException: chmod: getting attributes of `C:\\wso2\\wso2bam\\tmp\\hadoop\\staging\\gbelyaev-911074626\\.staging': No such file or directory
at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
at org.apache.hadoop.util.Shell.run(Shell.java:182)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:553)
at org.apache.hadoop.fs.RawLocalFileSystem.execSetPermission(RawLocalFileSystem.java:545)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:531)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:324)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:183)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:798)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:792)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1123)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:792)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:766)
at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:458)
at org.apache.hadoop.hive.ql.exec.ExecDriver.main(ExecDriver.java:728)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
{org.apache.hadoop.hive.ql.exec.ExecDriver}
TID: [0] [BAM] [2013-06-18 10:31:37,903] ERROR {org.apache.hadoop.hive.ql.exec.Task} - Execution failed with exit status: 2 {org.apache.hadoop.hive.ql.exec.Task}
TID: [0] [BAM] [2013-06-18 10:31:37,907] ERROR {org.apache.hadoop.hive.ql.exec.Task} - Obtaining error information {org.apache.hadoop.hive.ql.exec.Task}
TID: [0] [BAM] [2013-06-18 10:31:37,915] ERROR {org.apache.hadoop.hive.ql.exec.Task} -
Task failed!
After rebooting Windows it worked fine, presumably because the updated PATH was not visible to the already-running server process until it was restarted.
