I have installed Elasticsearch on an Amazon Linux machine using the latest rpm package from their website. After that, I attached an EBS volume and created a directory on this volume, and I want this directory to be the data directory of Elasticsearch. So I first started the elasticsearch service with defaults, then created a new directory in the ec2-user home directory:
mkdir my_data
Then I changed the path.data in the /etc/elasticsearch/elasticsearch.yml file to point to this new directory
path.data: /home/ec2-user/my_data
Then I changed the ownership of this directory:
sudo chown -R elasticsearch:elasticsearch /home/ec2-user/my_data
So, currently the permissions look like this:
[ec2-user@ip-XXXXXX ~]$ ls -lrt
total 28632
drwxrwxr-x 2 elasticsearch elasticsearch 4096 Feb 4 06:18 my_data
However, when I try to start elasticsearch, I get the error:
Starting elasticsearch: Exception in thread "main" java.lang.IllegalStateException: Unable to access 'path.data' (/home/ec2-user/my_data)
Likely root cause: java.nio.file.AccessDeniedException: /home/ec2-user/my_data
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:84)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:383)
at java.nio.file.Files.createDirectory(Files.java:630)
at java.nio.file.Files.createAndCheckIsDirectory(Files.java:734)
at java.nio.file.Files.createDirectories(Files.java:720)
at org.elasticsearch.bootstrap.Security.ensureDirectoryExists(Security.java:337)
at org.elasticsearch.bootstrap.Security.addPath(Security.java:314)
at org.elasticsearch.bootstrap.Security.addFilePermissions(Security.java:256)
at org.elasticsearch.bootstrap.Security.createPermissions(Security.java:212)
at org.elasticsearch.bootstrap.Security.configure(Security.java:118)
at org.elasticsearch.bootstrap.Bootstrap.setupSecurity(Bootstrap.java:196)
at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:167)
at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:285)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:35)
Refer to the log for complete error details.
[FAILED]
I found it surprising, but in the latest version of Elasticsearch, if you create a data directory inside another user's home directory, ES is unable to access it, though logically that is perfect too. What I suggest is that you either mount an external hard disk for Elasticsearch or create a data directory inside /home/ parallel to ec2-user, so your directory would have the path /home/my-data, and it will work like a charm. :)
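A minimal sketch of that second suggestion, assuming the same paths as above:

sudo mkdir /home/my-data                                   # sibling of /home/ec2-user
sudo chown -R elasticsearch:elasticsearch /home/my-data    # let the service user own it

Then set path.data: /home/my-data in /etc/elasticsearch/elasticsearch.yml and restart the service.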
Thanks,
Bharvi
In case this helps anyone with the problem that I was seeing...
This seems to be an oddity with java.nio.file.Files.createDirectories. The doc says: "Unlike the createDirectory method, an exception is not thrown if the directory could not be created because it already exists." In your case the folder exists, so you should not get an exception. But the existence check done in UnixFileSystemProvider is via mkdir, which will throw an access-denied exception before it throws an already-exists exception. The access-denied exception you are seeing is therefore not that elasticsearch lacks access to /home/ec2-user/my_data, but rather that it lacks permission to make that directory. So the solution is to fix the permission problem that is preventing elasticsearch from creating the directory /home/ec2-user/my_data. For you this would mean making /home/ec2-user writable by elasticsearch, or creating a path like /home/ec2-user/my_data_holder/my_data and then making /home/ec2-user/my_data_holder writable by elasticsearch.
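For reference, a sketch of both fixes as shell commands (paths taken from the question; o+wx/o+x is the blunt approach, a per-user ACL would be tighter):

# Option 1: let elasticsearch (via "others") write into the home directory
sudo chmod o+wx /home/ec2-user

# Option 2: add an intermediate directory that elasticsearch owns
sudo mkdir -p /home/ec2-user/my_data_holder
sudo chown elasticsearch:elasticsearch /home/ec2-user/my_data_holder
sudo chmod o+x /home/ec2-user   # elasticsearch still needs to traverse the home directory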
Related
I am trying to mount an S3 bucket into an EC2 server.
To keep things simple, since the bucket has root permissions only, I am using the root user on the EC2 instance. Running
s3fs mybucket ./mybucket/
correctly mounts the bucket, so if I run ls ./mybucket I get all the directories.
The problem arises when I try to list one of the subdirectories with ls ./mybucket/subdirectory, and I get the following error:
ls: reading directory '.': Software caused connection abort
From that moment on, any request will yield the error
ls: cannot open directory '.': Transport endpoint is not connected
Am I doing something wrong? Is there a way to fix this?
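One way to gather more detail (assuming a reasonably recent s3fs-fuse) is to unmount and remount in the foreground with debug logging, then retry the failing ls:

fusermount -u ./mybucket   # clears the broken "Transport endpoint is not connected" state
s3fs mybucket ./mybucket/ -f -o dbglevel=info -o curldbg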
I'm trying to get Drill up and running on my machine. However, whenever I enter drill-embedded mode (bin/drill-embedded on Bash), I get this error:
Error: Failure in starting embedded Drillbit: java.lang.IllegalStateException: Local udf directory [/tmp/drill/udf/udf/local] must be writable for application user (state=,code=0)
If I try to run a query at this point, it'll give back:
No current connection
Any idea how to fix this? I've tried starting with a clean shell with no luck. Is it a permissions issue?
You have to give the directory /tmp/drill/udf/udf/local write access. Since it is a directory under /tmp, you might need root access to change the permissions, i.e. you will have to use sudo:
sudo chmod -R 777 /tmp/drill/udf/udf/local
Also make sure the user has at least read and execute (traverse) permission on the parent directories; otherwise you will get a permission-denied error again.
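As a quick check of the whole chain, namei from util-linux prints the owner and mode for every level of the path in one go:

namei -l /tmp/drill/udf/udf/local   # each component should grant the Drill user at least r-x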
I am trying to install Elasticsearch as a Windows service. I set the DATA_DIR and LOG_DIR environment variables to change the data and logs paths.
If LOG_DIR does not exist yet and is only one level deep, the directory is created (as expected).
The problem is when I specify LOG_DIR as a nested directory that does not exist yet; then it throws the error:
Unable to create logger at ''
For example:
LOG_DIR=D:/test/logs
If this location doesn’t exist, the error will occur.
Is there any way to tell ES to create the directory recursively?
Thank you!
The logs directory should be created automatically, but Elasticsearch will not create directories recursively; that has to be done by the user.
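A minimal workaround, then, is to pre-create the nested path yourself before starting the service; cmd's mkdir creates intermediate directories automatically:

rem run from a Windows command prompt, using the path from the question
mkdir D:\test\logs
set LOG_DIR=D:\test\logs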
> D:\>echo %HADOOP_HOME%
> D:\Apps\winutils\hadoop-2.7.1
Create tmp/hive folders on the same disk as HADOOP_HOME:
D:\>dir tmp\hive
Directory of D:\tmp\hive
06/13/2016 01:13 PM <DIR> .
06/13/2016 01:13 PM <DIR> ..
0 File(s) 0 bytes
2 Dir(s) 227,525,246,976 bytes free
Try to figure out what permissions are set:
D:\>winutils.exe ls \tmp\hive
FindFileOwnerAndPermission error (1789): The trust relationship between this workstation and the primary domain failed.
When I tried chmod on this folder, it seemed to work:
winutils.exe chmod 777 \tmp\hive
but ls shows the same exception.
Does anyone have an idea what is going on? Moreover, it worked for me a couple of hours ago, but now my Spark application fails with an exception:
java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
I am quite late here, but I am still posting this in case it helps someone in the future.
While setting the permissions, make sure you are using the correct path for winutils.exe (try to use the complete path). For me, winutils.exe was on the C drive:
C:\path\to\winutils.exe chmod -R 777 C:\tmp\hive
Then run ls to check the permissions; it should look like this image (setting and checking the permission): https://i.stack.imgur.com/vE9vl.png
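For completeness, the matching check with ls (same placeholder winutils.exe path as above; the listing for the folder should start with drwxrwxrwx):

C:\path\to\winutils.exe ls C:\tmp\hive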
If this is your corporate system, then you must be on the same network, using VPN, FortiClient, or any other tool your organisation has been using.
https://support.microsoft.com/en-us/kb/2771040
Looks like a domain access issue; please ensure you can access the domain and try again.
After ensuring domain access, the error below disappeared:
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
I'm late here too, and I just encountered this issue; writing this so it will help somebody.
If you are using your office laptop, make sure you are connected to the office network and retry. The domain shown under the Member of Domain setting points to your office network. That should solve the issue.
To check which domain the machine is joined to:
1. Log on to Windows 10 using a local Administrator account.
2. Hold the Windows logo key and press E to open File Explorer.
3. On the right side of File Explorer, right-click This PC, choose Properties, and click Advanced System Settings.
4. Choose the Computer Name tab and select Change to see the value configured.
I am a newbie here, so this might be wrong, but I think you need to add -R to the command, as below:
winutils chmod -R 777 \tmp\hive
Elasticsearch setup works fine with the default configuration.
But when I updated its path.data setting in the elasticsearch.yml file, it crashed with the error below:
[2015-11-19 12:39:56,194][ERROR][bootstrap ] Exception
java.lang.IllegalStateException: Unable to access 'path.data' (/home/hadoop/bigdata/data/elasticsearch)
at org.elasticsearch.bootstrap.Security.addPath(Security.java:197)
at org.elasticsearch.bootstrap.Security.createPermissions(Security.java:170)
at org.elasticsearch.bootstrap.Security.configure(Security.java:100)
at org.elasticsearch.bootstrap.Bootstrap.setupSecurity(Bootstrap.java:181)
at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:159)
at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:270)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:35)
Caused by: java.nio.file.AccessDeniedException: /home/hadoop/bigdata/data
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:84)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.checkAccess(UnixFileSystemProvider.java:308)
at java.nio.file.Files.createDirectories(Files.java:702)
at org.elasticsearch.bootstrap.Security.ensureDirectoryExists(Security.java:218)
at org.elasticsearch.bootstrap.Security.addPath(Security.java:195)
... 6 more
I have copied the elasticsearch directory from its /var/lib location with its mode preserved, but no success.
Can anybody please help me get past this error?
Thanks,
Sanjay Bhosale
This error occurs because permissions were not set for the user "elasticsearch" to access the folder.
Try making the "elasticsearch" user (the default user of Elasticsearch) the owner of the folder using the command below:
sudo chown elasticsearch: /home/hadoop/bigdata/data/elasticsearch
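Then restart the service and watch the log for the same exception (assuming an init-script install and the default cluster name):

sudo service elasticsearch restart
sudo tail -f /var/log/elasticsearch/elasticsearch.log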
In addition to ensuring that the user elasticsearch is the owner and group of the data folder, the user must also have execute (+x) permission to traverse each level of the path leading to the configured data path (in your case, /home/hadoop/bigdata/data/elasticsearch).
That is, if the parent directory of the Elasticsearch path.data is not owned by the user elasticsearch (as in your case, where the parent folder belongs to the user hadoop), then you should check each level of the parent path and ensure it is set with o+x permission (via chmod) so that the elasticsearch user (i.e., "others") can traverse it.
I learned this solution from another question: Elasticsearch cannot open log file: Permission denied.
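A sketch of that per-level fix for the path in this question (each component needs o+x so the elasticsearch user can traverse it; /home is usually world-traversable already):

sudo chmod o+x /home/hadoop /home/hadoop/bigdata /home/hadoop/bigdata/data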