I have tried to create a database in Hive using Java, but I got this error while running the code:
Exception in thread "main" java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=hive, access=WRITE, inode="/user/hive/warehouse/sampledb.db":root:supergroup:drwxr-xr-x
I have given read and write privileges on the folder /user/hive/warehouse/, but I am still getting this error. Any remedies?
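Note that the inode in the error is owned by root:supergroup with mode drwxr-xr-x, so the hive user has no write bit on it even if the parent folder was opened up. A minimal sketch of how one might grant ownership recursively (the hive:hive owner/group and the hdfs superuser account are assumptions; adjust to your cluster):

# Hand the warehouse tree to the hive user; -R matters, since granting
# privileges only on the top-level folder leaves child inodes untouched.
sudo -u hdfs hdfs dfs -chown -R hive:hive /user/hive/warehouse
sudo -u hdfs hdfs dfs -chmod -R 775 /user/hive/warehouse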
Related
I am getting the below error while starting DSE:
ERROR [main] 2020-02-26 13:08:33,269 DseModule.java:97 - {}. Exiting...
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) An exception was caught and reported. Message: Unable to check disk space available to /u01/dse_ops/logs. Perhaps the Cassandra user does not have the necessary permissions
at com.datastax.bdp.DseModule.configure(Unknown Source)
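The message itself points at filesystem permissions on the log directory. A sketch of the usual first check, assuming DSE runs as the cassandra service account (the account name is an assumption; verify it with ps -ef | grep dse):

# Give the service user ownership of the directory DSE is probing.
sudo chown -R cassandra:cassandra /u01/dse_ops/logs
sudo chmod -R u+rwX /u01/dse_ops/logs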
While creating a table in Hive, I am getting the below error:
Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=admin, access=WRITE, inode="/user":root:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:242)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:169)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6621)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6603)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6555)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4350)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4320)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4293)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:869)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:323)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:608)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080))
You should read the log before posting here. It would also be helpful to include the Hive statement you tried.
org.apache.hadoop.security.AccessControlException Permission denied: user=admin, access=WRITE, inode="/user":root:supergroup:drwxr-xr-x
means that you ran the statement as a user (admin) who has no authorization to write to that HDFS directory. You should either run the statement as a different user or give user admin the rights to write there.
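A sketch of the second option, giving admin a writable location under /user (run as the HDFS superuser; whether Hive should be writing under /user at all depends on your warehouse configuration):

# Create a home directory for admin on HDFS and hand it over.
sudo -u hdfs hdfs dfs -mkdir -p /user/admin
sudo -u hdfs hdfs dfs -chown -R admin:admin /user/admin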
Why is H2 trying to access the file below?
org.h2.message.DbException: Log file error: "/private/var/empty/test.trace.db", cause: "java.io.FileNotFoundException: /private/var/empty/test.trace.db (Permission denied)" [90034-195]
at org.h2.message.DbException.get(DbException.java:168)
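H2 writes its trace file next to the database file, so the real question is why the database path resolves to /private/var/empty. On macOS that directory is the home of the daemon accounts, which suggests the process runs as such a user with a relative or home-relative JDBC URL like jdbc:h2:~/test (that URL is an assumption). A quick check and remedy sketch:

ls -ld /private/var/empty    # root-owned and unwritable, hence the error
# Remedy: point the JDBC URL at an absolute, writable location instead, e.g.
#   jdbc:h2:/Users/youruser/data/test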
Running the following commands is successful:
hadoop fs -ls /path/
hadoop fs -cat /path/.pig_schema
All the files in that directory have -rwxr-xr-x permissions.
However, when running the following in the Pig console:
A = LOAD '/path/' USING PigStorage();
B = LIMIT A 5;
DUMP B;
it encounters a permission error:
2015-08-27 08:47:59,734 [main] ERROR org.apache.pig.tools.grunt.Grunt - You don't have permission to perform the operation. Error from the server: Permission denied
2015-08-27 08:47:59,735 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2017: Internal error creating job configuration.
Any idea why?
EDIT 1: Added error log
================================================================================
Pig Stack Trace
---------------
ERROR 2017: Internal error creating job configuration.
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias B
at org.apache.pig.PigServer.openIterator(PigServer.java:857)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:746)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:196)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:171)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
at org.apache.pig.Main.run(Main.java:543)
at org.apache.pig.Main.main(Main.java:157)
Caused by: org.apache.pig.PigException: ERROR 1002: Unable to store alias B
at org.apache.pig.PigServer.storeEx(PigServer.java:956)
at org.apache.pig.PigServer.store(PigServer.java:919)
at org.apache.pig.PigServer.openIterator(PigServer.java:832)
... 7 more
Caused by: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException: ERROR 2017: Internal error creating job configuration.
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:874)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:297)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:177)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1285)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1270)
at org.apache.pig.PigServer.storeEx(PigServer.java:952)
... 9 more
Caused by: java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:1879)
at java.io.File.createTempFile(File.java:1923)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:538)
... 14 more
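The final "Caused by" is the telling part: java.io.File.createTempFile is failing on the local filesystem, not on HDFS, which is why hadoop fs -ls and -cat succeed while job setup does not. A sketch of what one might check, assuming the default java.io.tmpdir is the unwritable location (the paths here are assumptions):

# Is the local temp directory writable by the user running pig?
ls -ld /tmp
# If not, point the Pig client JVM at a writable directory; the stock
# pig launcher passes PIG_OPTS through to the JVM.
mkdir -p "$HOME/tmp"
export PIG_OPTS="-Djava.io.tmpdir=$HOME/tmp"
pig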
Can anyone suggest why the following error is occurring and how to resolve it?
Not just the command below; running any Hive-related command returns the same error.
hive> show databases;
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.thrift.transport.TTransportException java.net.SocketException: Connection reset by peer: socket write error)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
Check your hive-site.xml. It is possible that javax.jdo.option.ConnectionURL, the JDBC URL for the Hive metastore database, isn't right.
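A quick way to inspect it, along with what a typical value looks like (the conf path and the MySQL-backed example URL below are placeholders, not your actual settings):

# Show the configured metastore JDBC URL.
grep -A 2 'javax.jdo.option.ConnectionURL' /etc/hive/conf/hive-site.xml
# A typical MySQL-backed value:
#   <value>jdbc:mysql://metastore-host:3306/metastore</value>
# Since the error is a TTransportException (connection reset), also confirm
# the metastore service is running if you use a standalone one:
hive --service metastore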