H2 trying to access /private/var/empty directory

Why is H2 trying to access the file below?
org.h2.message.DbException: Log file error: "/private/var/empty/test.trace.db", cause: "java.io.FileNotFoundException: /private/var/empty/test.trace.db (Permission denied)" [90034-195]
at org.h2.message.DbException.get(DbException.java:168)
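The .trace.db suffix is H2's trace log, which H2 writes next to the database file, so the path in the error mirrors the database path resolved from the JDBC URL: a relative or home-based name (jdbc:h2:./test or jdbc:h2:~/test) resolves against the working directory or user.home, and /private/var/empty happens to be the home directory of several macOS daemon accounts. A minimal sketch of a workaround, assuming H2 is on the classpath; the absolute path below is a placeholder, and TRACE_LEVEL_FILE=0 disables the file trace entirely:

import java.sql.Connection;
import java.sql.DriverManager;

public class H2TraceExample {
    public static void main(String[] args) throws Exception {
        // Use an absolute database path so the database file and its
        // <name>.trace.db trace log land in a known, writable directory;
        // TRACE_LEVEL_FILE=0 additionally turns off the file trace.
        String url = "jdbc:h2:/Users/me/data/test;TRACE_LEVEL_FILE=0";
        try (Connection conn = DriverManager.getConnection(url, "sa", "")) {
            System.out.println("connected: " + conn.getMetaData().getURL());
        }
    }
}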

Related

DSE is not starting, stating it is unable to write to the commit log directory

I am getting the below error while starting DSE:
ERROR [main] 2020-02-26 13:08:33,269 DseModule.java:97 - {}. Exiting...
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) An exception was caught and reported. Message: Unable to check disk space available to /u01/dse_ops/logs. Perhaps the Cassandra user does not have the necessary permissions
at com.datastax.bdp.DseModule.configure(Unknown Source)
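The message points at plain filesystem permissions: the process (typically running as the cassandra service user) cannot read /u01/dse_ops/logs to measure free space. One way to confirm is a probe like the sketch below, run as that same user (e.g. via sudo -u cassandra); the class name is illustrative:

import java.io.File;

public class DirCheck {
    public static void main(String[] args) {
        // Mirrors the failing startup check: the process must be able to
        // read the log directory to query the available disk space.
        File dir = new File("/u01/dse_ops/logs");
        System.out.println("exists=" + dir.exists()
                + " readable=" + dir.canRead()
                + " writable=" + dir.canWrite()
                + " usableSpaceBytes=" + dir.getUsableSpace());
    }
}

If the probe reports the directory as unreadable or unwritable, chown or chmod it for the service user.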

Hadoop FS File system error - copyToLocal([class org.apache.hadoop.fs.Path, class org.apache.hadoop.fs.Path]) does not exist

Inside a PySpark session, I want to copy a file from S3 to a local directory on the Hadoop cluster, but doing so raises the following error. Please help.
file_system.copyToLocal(false, java_path_src, java_path_dst)
Parameters:
java_path_src - s3://sandbox/metadata/2018-06-07T183915/test.jsonl
java_path_dst - /home/hadoop/output/
Error:
py4j.protocol.Py4JError: An error occurred while calling o144.copyToLocal. Trace:
py4j.Py4JException: Method copyToLocal([class org.apache.hadoop.fs.Path, class org.apache.hadoop.fs.Path]) does not exist
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:326)
at py4j.Gateway.invoke(Gateway.java:272)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:748)
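py4j looks up JVM methods by exact name and argument types, and org.apache.hadoop.fs.FileSystem has no copyToLocal method; the API method is copyToLocalFile. In the PySpark session the call should therefore be file_system.copyToLocalFile(False, java_path_src, java_path_dst). A Java sketch of the equivalent call, using the paths from the question (the class name and FileSystem lookup are illustrative):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyToLocalExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path src = new Path("s3://sandbox/metadata/2018-06-07T183915/test.jsonl");
        Path dst = new Path("/home/hadoop/output/");
        // Obtain the FileSystem for the source scheme (s3://), not the default FS.
        FileSystem fs = FileSystem.get(new URI(src.toString()), conf);
        // copyToLocalFile(boolean delSrc, Path src, Path dst); delSrc=false
        // keeps the source object in S3 after the copy.
        fs.copyToLocalFile(false, src, dst);
    }
}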

Permission denied error while creating a database through Java in Hive

I have tried to create a database in Hive using Java, but I get this error when running the code:
Exception in thread "main" java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=hive, access=WRITE, inode="/user/hive/warehouse/sampledb.db":root:supergroup:drwxr-xr-x
I have granted read and write privileges on the folder /user/hive/warehouse/, but I still get this error. Any remedies?
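Note the inode in the error: /user/hive/warehouse/sampledb.db is owned by root:supergroup with mode drwxr-xr-x, so user hive has no WRITE access regardless of any Hive-level grants; the ownership or mode has to change in HDFS itself (e.g. hdfs dfs -chown). A sketch of the equivalent fix through the Hadoop FileSystem API, to be run with HDFS superuser credentials; the owner and group are taken from the error message:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class FixWarehouseOwner {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path warehouse = new Path("/user/hive/warehouse");
        // Non-recursive equivalents of chown/chmod on the warehouse root;
        // repeat for existing subdirectories if they are affected too.
        fs.setOwner(warehouse, "hive", "supergroup");
        fs.setPermission(warehouse, new FsPermission((short) 0775));
    }
}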

Pig permission denied while HDFS file is readable

Running the following commands succeeds:
hadoop fs -ls /path/
hadoop fs -cat /path/.pig_schema
All the files in that directory have -rwxr-xr-x permissions.
However, running the following in the Pig console:
A = LOAD '/path/' USING PigStorage();
B = LIMIT A 5;
DUMP B;
fails with a permission error:
2015-08-27 08:47:59,734 [main] ERROR org.apache.pig.tools.grunt.Grunt - You don't have permission to perform the operation. Error from the server: Permission denied
2015-08-27 08:47:59,735 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2017: Internal error creating job configuration.
Any idea why?
EDIT 1: Added error log
================================================================================
Pig Stack Trace
---------------
ERROR 2017: Internal error creating job configuration.
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias B
at org.apache.pig.PigServer.openIterator(PigServer.java:857)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:746)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:196)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:171)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
at org.apache.pig.Main.run(Main.java:543)
at org.apache.pig.Main.main(Main.java:157)
Caused by: org.apache.pig.PigException: ERROR 1002: Unable to store alias B
at org.apache.pig.PigServer.storeEx(PigServer.java:956)
at org.apache.pig.PigServer.store(PigServer.java:919)
at org.apache.pig.PigServer.openIterator(PigServer.java:832)
... 7 more
Caused by: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException: ERROR 2017: Internal error creating job configuration.
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:874)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:297)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:177)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1285)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1270)
at org.apache.pig.PigServer.storeEx(PigServer.java:952)
... 9 more
Caused by: java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:1879)
at java.io.File.createTempFile(File.java:1923)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:538)
... 14 more
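The deepest cause in the trace is java.io.UnixFileSystem.createFileExclusively, reached through File.createTempFile: while building the job configuration, Pig creates a temporary file on the local filesystem (java.io.tmpdir), so the readable HDFS files are not the problem; the local temp directory is not writable for this user. A quick probe, run as the same user that started the Pig console (the class name is illustrative):

import java.io.File;

public class TmpDirProbe {
    public static void main(String[] args) throws Exception {
        // If this fails with "Permission denied", fix the local temp
        // directory or point java.io.tmpdir somewhere writable;
        // HDFS permissions are irrelevant to this step.
        System.out.println("java.io.tmpdir = " + System.getProperty("java.io.tmpdir"));
        File probe = File.createTempFile("pig-probe", ".tmp");
        System.out.println("created " + probe);
        probe.delete();
    }
}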

Error with DataStax Enterprise with Hadoop and Kerberos enabled

I have configured DSE with Hadoop enabled and Kerberos authentication, but I see this ERROR in the log. I can execute dse hadoop fs commands and nodetool commands, but cannot run MapReduce jobs.
The following is the log:
ERROR [TASK-TRACKER-INIT] 2014-02-07 20:45:03,813 TaskTrackerRunner.java (line 128) Hadoop Task Tracker caused an exception in state STARTING:
java.io.IOException: Cannot run program "/usr/share/dse/hadoop/native/Linux-amd64-64/bin/task-controller" (in directory "."): error=13, Permission denied
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
at org.apache.hadoop.util.Shell.startProcess(Shell.java:199)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:225)
at org.apache.hadoop.util.Shell.run(Shell.java:182)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:401)
at org.apache.hadoop.mapred.LinuxTaskController.setup(LinuxTaskController.java:137)
at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1470)
at com.datastax.bdp.hadoop.mapred.TaskTrackerRunner.initService(TaskTrackerRunner.java:104)
at com.datastax.bdp.hadoop.mapred.TaskTrackerRunner.initService(TaskTrackerRunner.java:31)
at com.datastax.bdp.hadoop.mapred.ServiceRunner.run(ServiceRunner.java:121)
at java.lang.Thread.run(Thread.java:724)
Caused by: java.io.IOException: error=13, Permission denied
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:135)
at java.lang.ProcessImpl.start(ProcessImpl.java:130)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1022)
... 10 more
ERROR [Thrift:8] 2014-02-07 20:45:12,624 TNegotiatingServerTransport.java (line 293) An error occurred during transport negotiation
com.datastax.bdp.transport.common.TTransportNegotiationException: Improper authentication type requested. Requested auth: No authentication with service principal: FRAMED_TRANSPORT_FAKE_PRINCIPAL, Allowed auth: Kerberos
at com.datastax.bdp.transport.server.TNegotiatingServerTransport$Factory.getUnderlyingFactory(TNegotiatingServerTransport.java:485)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport.handleTransportNegotiation(TNegotiatingServerTransport.java:286)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport.open(TNegotiatingServerTransport.java:192)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport$Factory.getTransport(TNegotiatingServerTransport.java:517)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport$Factory.getTransport(TNegotiatingServerTransport.java:408)
at org.apache.cassandra.thrift.CustomTThreadPoolServer$WorkerProcess.run(CustomTThreadPoolServer.java:193)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
ERROR [Thrift:8] 2014-02-07 20:45:12,625 TNegotiatingServerTransport.java (line 524) Failed to open server transport.
com.datastax.bdp.transport.common.TTransportNegotiationException: An error occurred during transport negotiation
at com.datastax.bdp.transport.server.TNegotiatingServerTransport.handleTransportNegotiation(TNegotiatingServerTransport.java:294)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport.open(TNegotiatingServerTransport.java:192)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport$Factory.getTransport(TNegotiatingServerTransport.java:517)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport$Factory.getTransport(TNegotiatingServerTransport.java:408)
at org.apache.cassandra.thrift.CustomTThreadPoolServer$WorkerProcess.run(CustomTThreadPoolServer.java:193)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
Caused by: com.datastax.bdp.transport.common.TTransportNegotiationException: Improper authentication type requested. Requested auth: No authentication with service principal: FRAMED_TRANSPORT_FAKE_PRINCIPAL, Allowed auth: Kerberos
at com.datastax.bdp.transport.server.TNegotiatingServerTransport$Factory.getUnderlyingFactory(TNegotiatingServerTransport.java:485)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport.handleTransportNegotiation(TNegotiatingServerTransport.java:286)
... 7 more
ERROR [Thrift:8] 2014-02-07 20:45:12,626 CustomTThreadPoolServer.java (line 219) Error occurred during processing of message.
java.lang.RuntimeException: Failed to open server transport: unknown
at com.datastax.bdp.transport.server.TNegotiatingServerTransport$Factory.getTransport(TNegotiatingServerTransport.java:525)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport$Factory.getTransport(TNegotiatingServerTransport.java:408)
at org.apache.cassandra.thrift.CustomTThreadPoolServer$WorkerProcess.run(CustomTThreadPoolServer.java:193)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
Caused by: com.datastax.bdp.transport.common.TTransportNegotiationException: An error occurred during transport negotiation
at com.datastax.bdp.transport.server.TNegotiatingServerTransport.handleTransportNegotiation(TNegotiatingServerTransport.java:294)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport.open(TNegotiatingServerTransport.java:192)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport$Factory.getTransport(TNegotiatingServerTransport.java:517)
... 5 more
Caused by: com.datastax.bdp.transport.common.TTransportNegotiationException: Improper authentication type requested. Requested auth: No authentication with service principal: FRAMED_TRANSPORT_FAKE_PRINCIPAL, Allowed auth: Kerberos
at com.datastax.bdp.transport.server.TNegotiatingServerTransport$Factory.getUnderlyingFactory(TNegotiatingServerTransport.java:485)
at com.datastax.bdp.transport.server.TNegotiatingServerTransport.handleTransportNegotiation(TNegotiatingServerTransport.java:286)
... 7 more
WARN [TASK-TRACKER-INIT] 2014-02-07 20:45:13,828 MetricsSystemImpl.java (line 200) Source name ugi already exists!
This is the task-controller binary:
-rwsr-x--- 1 root cassandra 40111 Jan 9 18:14 /usr/share/dse/hadoop/native/Linux-amd64-64/bin/task-controller
I am using
DSE 3.2.3
Java 1.7.0_25
I have configured cassandra.yaml, dse.yaml, core-site.xml, mapred-site.xml, and /etc/default/dse properly.
You should not start the daemon from the root home directory. This sounds weird, but try starting the daemon from a directory other than root's home directory.
/usr/share/dse/hadoop/native/Linux-amd64-64/bin/task-controller" (in directory "."): error=13, Permission denied
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
The above clearly shows that the user does not have permission to run the task-controller. Change the permissions so that the user running DSE can execute it; use chmod to change them.
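For example (dse_user is a placeholder for whichever account runs DSE): the listing above shows mode 4750 with owner root and group cassandra, so either add that account to the cassandra group,
sudo usermod -aG cassandra dse_user
or widen the mode so that other users may execute the binary:
sudo chmod 4755 /usr/share/dse/hadoop/native/Linux-amd64-64/bin/task-controller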
