Spring Boot Redis Cluster connection - spring-boot

I'm using Spring Boot and trying to connect to a Redis master and slave using RedisSentinelConfiguration. Every time I try to read from or write to the Redis server, it throws the exception below:
Caused by: io.lettuce.core.RedisCommandTimeoutException: Cannot obtain master using SENTINEL MASTER. Command timed out after 1 minute(s)
      at io.lettuce.core.internal.ExceptionFactory.createTimeoutException(ExceptionFactory.java:71)
      at io.lettuce.core.RedisClient.lambda$null$17(RedisClient.java:766)
      at reactor.core.publisher.Mono.lambda$onErrorMap$31(Mono.java:3730)
      at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onError(FluxOnErrorResume.java:94)
      at reactor.core.publisher.SerializedSubscriber.onError(SerializedSubscriber.java:124)
      at reactor.core.publisher.SerializedSubscriber.onError(SerializedSubscriber.java:124)
      at reactor.core.publisher.FluxTimeout$TimeoutMainSubscriber.onError(FluxTimeout.java:219)
      at reactor.core.publisher.FluxMap$MapSubscriber.onError(FluxMap.java:134)
      at reactor.core.publisher.MonoNext$NextSubscriber.onError(MonoNext.java:93)
      at reactor.core.publisher.MonoNext$NextSubscriber.onError(MonoNext.java:93)
      at io.lettuce.core.RedisPublisher$ImmediateSubscriber.onError(RedisPublisher.java:891)
      at io.lettuce.core.RedisPublisher$State.onError(RedisPublisher.java:712)
      at io.lettuce.core.RedisPublisher$RedisSubscription.onError(RedisPublisher.java:357)
      at io.lettuce.core.RedisPublisher$SubscriptionCommand.onError(RedisPublisher.java:797)
      at io.lettuce.core.RedisPublisher$SubscriptionCommand.doOnComplete(RedisPublisher.java:757)
      at io.lettuce.core.protocol.CommandWrapper.complete(CommandWrapper.java:65)
      at io.lettuce.core.protocol.CommandWrapper.complete(CommandWrapper.java:63)
      at io.lettuce.core.protocol.CommandHandler.complete(CommandHandler.java:747)
      at io.lettuce.core.protocol.CommandHandler.decode(CommandHandler.java:682)
      at io.lettuce.core.protocol.CommandHandler.channelRead(CommandHandler.java:599)
      at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
      at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
      at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
      at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
      at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
      at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
      at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
      at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
      at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
      at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
      at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
      at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
      at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
      at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
      at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
      at java.base/java.lang.Thread.run(Thread.java:833)
      Suppressed: io.lettuce.core.RedisCommandExecutionException: ERR unknown command `SENTINEL`, with args beginning with: `get-master-addr-by-name`, `master_node_name`,
            at io.lettuce.core.internal.ExceptionFactory.createExecutionException(ExceptionFactory.java:147)
            at io.lettuce.core.internal.ExceptionFactory.createExecutionException(ExceptionFactory.java:116)
            ... 22 common frames omitted
Any idea about this error?
Thanks in advance.
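For reference, here is roughly how a RedisSentinelConfiguration is typically wired up with Lettuce in Spring Data Redis; the sentinel host names, ports and the @Configuration class name below are placeholders, not my real values, and only the master name comes from the trace above.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisSentinelConfiguration;
import org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory;

@Configuration
public class RedisSentinelConfig {

    @Bean
    public LettuceConnectionFactory redisConnectionFactory() {
        // Placeholder master name and sentinel addresses: the master name must match
        // the one declared in sentinel.conf, and each host/port pair must point at a
        // Sentinel process (default 26379), not at the Redis data nodes (default 6379).
        RedisSentinelConfiguration sentinelConfig = new RedisSentinelConfiguration()
                .master("master_node_name")
                .sentinel("sentinel-host-1", 26379)
                .sentinel("sentinel-host-2", 26379);
        return new LettuceConnectionFactory(sentinelConfig);
    }
}

Note that the suppressed ERR unknown command `SENTINEL` in the trace is the reply a plain Redis data node gives when it is sent a Sentinel command, so it typically means the configured host/port pairs point at Redis servers rather than at Sentinel processes.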

Related

Invalid HTTP Host: Kafka ElasticSearch Sink Connector

I am trying to use Elasticsearch as a database for my application, with Kafka Connect in between. Kafka Connect, Elasticsearch (version 7) and my application are running in containers on the same network. When I access Elasticsearch from the Kafka Connect container, it works. But my connector keeps throwing the following error and I can't seem to figure out the exact issue:
Error:
container_standalone | java.lang.IllegalArgumentException: Invalid HTTP host: elasticsearch:9200/
container_standalone | at org.apache.http.HttpHost.create(HttpHost.java:123)
container_standalone | at io.confluent.connect.elasticsearch.jest.JestElasticsearchClient.lambda$getClientConfig$0(JestElasticsearchClient.java:201)
container_standalone | at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
container_standalone | at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
container_standalone | at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
container_standalone | at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
container_standalone | at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
container_standalone | at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
container_standalone | at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
container_standalone | at io.confluent.connect.elasticsearch.jest.JestElasticsearchClient.getClientConfig(JestElasticsearchClient.java:201)
container_standalone | at io.confluent.connect.elasticsearch.jest.JestElasticsearchClient.<init>(JestElasticsearchClient.java:149)
container_standalone | at io.confluent.connect.elasticsearch.jest.JestElasticsearchClient.<init>(JestElasticsearchClient.java:142)
container_standalone | at io.confluent.connect.elasticsearch.ElasticsearchSinkTask.start(ElasticsearchSinkTask.java:122)
container_standalone | at io.confluent.connect.elasticsearch.ElasticsearchSinkTask.start(ElasticsearchSinkTask.java:51)
container_standalone | at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:300)
container_standalone | at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:189)
container_standalone | at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
container_standalone | at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
container_standalone | at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
container_standalone | at java.util.concurrent.FutureTask.run(FutureTask.java:266)
container_standalone | at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
container_standalone | at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
container_standalone | at java.lang.Thread.run(Thread.java:748)
container_standalone | [2020-09-23 09:47:09,366] ERROR WorkerSinkTask{id=elasticsearch-sink-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:180)
Configuration File:
name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=vehicle
topic.index=test-vehicle
connection.url=http://elasticsearch:9200
connection.username=username
connection.password=password
type.name=log
key.ignore=true
schema.ignore=true
Elastic.yml file:
cluster.name: "docker-cluster"
network.host: 0.0.0.0
xpack.license.self_generated.type: trial
xpack.security.enabled: false
xpack.monitoring.collection.enabled: true
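For reference, the IllegalArgumentException above is thrown by org.apache.http.HttpHost.create (httpcore 4.x, which the stack trace shows on the connector's classpath). As far as I can tell from the httpcore source, everything after the last ':' in the host string must parse as a numeric port, so a value that ends up with a trailing '/' fails with exactly the "Invalid HTTP host" message shown. A minimal standalone sketch (the class name is mine):

import org.apache.http.HttpHost;

public class HostParseCheck {
    public static void main(String[] args) {
        // Parses fine: "9200" after the last ':' is a valid port number.
        System.out.println(HttpHost.create("http://elasticsearch:9200"));

        try {
            // Fails: "9200/" is not a parseable port number.
            HttpHost.create("elasticsearch:9200/");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // Invalid HTTP host: elasticsearch:9200/
        }
    }
}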

Unable to run elasticsearch using docker-compose file

I created a directory Product_dir and ran docker-compose from it:
mkdir Product_dir
cd Product_dir
docker-compose up
I am pointing the data directory to a path on my host machine: /var/elasticsearch/data/product1/
I gave full permissions on the directory: chmod -R 777 /var/elasticsearch/
and changed the owner to elasticsearch: chown -R elasticsearch:elasticsearch /var/elasticsearch/
docker-compose.yml file
version: "2"
services:
elasticsearch-5-6:
image: docker.elastic.co/elasticsearch/elasticsearch:5.6.3
container_name: elasticsearch-5-6
ports:
- "9201:9200"
volumes:
- /var/elasticsearch/data/product1/:/usr/share/elasticsearc/data/
#- /etc/elasticsearch/elasticsearch-5-6.yml:/usr/share/elasticsearch/config/elasticsearch.yml
#- /etc/elasticsearch/logging.yml:/usr/share/elasticsearch/config/logging.yml
#- /var/log/elasticsearch/:/usr/share/elasticsearch/logs/
environment:
- cluster.name=docker-cluster-elasticsearch-5-6
#- bootstrap.memory_lock=true
- "ES_JAVA_OPTS: -Xmx2048m -Xms2048m"
Log file:
[root@localhost docker-elasticsearch-5-6]# docker-compose up
Starting elasticsearch-5-6 ...
Starting elasticsearch-5-6 ... done
Attaching to elasticsearch-5-6
elasticsearch-5-6 | [2017-11-17T01:55:09,017][INFO ][o.e.n.Node ] [] initializing ...
elasticsearch-5-6 | [2017-11-17T01:55:09,035][WARN ][o.e.b.ElasticsearchUncaughtExceptionHandler] [] uncaught exception in thread [main]
elasticsearch-5-6 | org.elasticsearch.bootstrap.StartupException: java.lang.IllegalStateException: Failed to create node environment
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:136) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Elasticsearch.execute(Elasticsearch.java:123) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:70) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:134) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.cli.Command.main(Command.java:90) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:91) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:84) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | Caused by: java.lang.IllegalStateException: Failed to create node environment
elasticsearch-5-6 | at org.elasticsearch.node.Node.<init>(Node.java:268) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.node.Node.<init>(Node.java:245) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Bootstrap$5.<init>(Bootstrap.java:233) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:233) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:342) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:132) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | ... 6 more
elasticsearch-5-6 | Caused by: java.nio.file.AccessDeniedException: /usr/share/elasticsearch/data/nodes
elasticsearch-5-6 | at sun.nio.fs.UnixException.translateToIOException(UnixException.java:84) ~[?:?]
elasticsearch-5-6 | at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:?]
elasticsearch-5-6 | at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107) ~[?:?]
elasticsearch-5-6 | at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384) ~[?:?]
elasticsearch-5-6 | at java.nio.file.Files.createDirectory(Files.java:674) ~[?:1.8.0_141]
elasticsearch-5-6 | at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781) ~[?:1.8.0_141]
elasticsearch-5-6 | at java.nio.file.Files.createDirectories(Files.java:767) ~[?:1.8.0_141]
elasticsearch-5-6 | at org.elasticsearch.env.NodeEnvironment.<init>(NodeEnvironment.java:221) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.node.Node.<init>(Node.java:265) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.node.Node.<init>(Node.java:245) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Bootstrap$5.<init>(Bootstrap.java:233) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:233) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:342) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:132) ~[elasticsearch-5.6.3.jar:5.6.3]
elasticsearch-5-6 | ... 6 more
elasticsearch-5-6 exited with code 1

Should a Savitzky-Golay 2D/image smoothing kernel be the same when using cross terms or not?

With regard to the code below (the code is Matlab, but this is really an algorithm question):
Size - the desired image convolution kernel size
PolyDegree - degree of polynomial
crossterms - boolean ==> whether there should be cross terms
So, if, say, PolyDegree=2 and crossterms is false, the design matrix,
A=[1,X,X^2,Y,Y^2]
If the crossterms is true then
A=[1,Y,Y^2,X,XY,X^2]
Note that if it were a cubic, there would be a lot more cross terms (e.g. X^2Y, Y^2X). However, I've tried this for 7x7 and 5x5 filters, for quadratics and cubics, and for each combination the smoothing SG kernel is the same regardless of crossterms (i.e. whether it is given as true or false).
EDIT - Actually, for the same Size filter, it gives the same result regardless of degree. So, for example, a Size=7 filter with PolyDegree==2 and crossterms=0 yields the same SG filter (as shown at the bottom) as PolyDegree=3 with crossterms=1?!
Is that right or am I screwing up?
x = -(Size(2)-1)/2 : (Size(2)-1)/2;   % e.g. Size(2)==5 ==> x=-2:2
y = -(Size(1)-1)/2 : (Size(1)-1)/2;
[x,y] = meshgrid(x,y);
x = x(:);
y = y(:);
if crossterms
    A = [];
    for kx = 0:PolyDegree
        for ky = 0:(PolyDegree-kx)
            A = [A x.^kx .* y.^ky];
        end
    end
else
    A = ones(size(x));
    for k = 1:1:PolyDegree
        A = [A x.^k];
    end
    for k = 1:1:PolyDegree
        A = [A y.^k];
    end
end
C = inv(A'*A)*A';                     % == pinv(A)
h = reshape(C(:,1),Size(1),Size(2));  % h = first row should be SG smoothing kernel.
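Just to restate what the code above computes, in equation form (this is only the least-squares identity the code implements, not new math): with A the design matrix built from the chosen monomials and z the window's pixel values stacked into a column,

\hat{\beta} = (A^{\top} A)^{-1} A^{\top} z = A^{+} z, \qquad \hat{f}(0,0) = \hat{\beta}_0 = \sum_i h_i\, z_i ,

where h is the row of the pseudoinverse A^{+} associated with the constant (degree-zero) column, reshaped back to the window size. Since every non-constant monomial vanishes at x = y = 0, that constant coefficient is also the fitted value at the window centre, which is why h acts as the smoothing kernel.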
So, for example, regardless of crossterms, and even of whether I specified a degree-2 or degree-3 polynomial, a 7x7 Size bicubic (PolyDegree==3) yields:
h =
-0.0476 -0.0136 0.0068 0.0136 0.0068 -0.0136 -0.0476
-0.0136 0.0204 0.0408 0.0476 0.0408 0.0204 -0.0136
0.0068 0.0408 0.0612 0.0680 0.0612 0.0408 0.0068
0.0136 0.0476 0.0680 0.0748 0.0680 0.0476 0.0136
0.0068 0.0408 0.0612 0.0680 0.0612 0.0408 0.0068
-0.0136 0.0204 0.0408 0.0476 0.0408 0.0204 -0.0136
-0.0476 -0.0136 0.0068 0.0136 0.0068 -0.0136 -0.0476
From https://en.wikipedia.org/wiki/Savitzky%E2%80%93Golay_filter:
In general, polynomials of degree (0 and 1), (2 and 3), (4 and 5) etc. give the same coefficients for smoothing and even derivatives. Polynomials of degree (1 and 2), (3 and 4) etc. give the same coefficients for odd derivatives.
So degrees 2 and 3 were the same, 4 was different from those, but 4 was the same as 5 given the same crossterms setting. However, unlike degrees 2 and 3, degrees 4 and 5 were different when the crossterms setting was off vs. on.
I'm still not sure about the crossterms business being the same (i.e. generating the same filter whether crossterms is 1 or 0). Note:
The function sgsdf_2da called below corresponds to the above code (in my question). The function has the API sgsdf_2da(Size, PolyDegree, , crossterms). A scalar Size ==> a square Size x Size filter, and "crossterms", as per the above code, is true to include polynomial cross terms, false otherwise.
>> hi=sgsdf_2da(7,2,0,0)
hi =
-0.0476 -0.0136 0.0068 0.0136 0.0068 -0.0136 -0.0476
-0.0136 0.0204 0.0408 0.0476 0.0408 0.0204 -0.0136
0.0068 0.0408 0.0612 0.0680 0.0612 0.0408 0.0068
0.0136 0.0476 0.0680 0.0748 0.0680 0.0476 0.0136
0.0068 0.0408 0.0612 0.0680 0.0612 0.0408 0.0068
-0.0136 0.0204 0.0408 0.0476 0.0408 0.0204 -0.0136
-0.0476 -0.0136 0.0068 0.0136 0.0068 -0.0136 -0.0476
>> hi=sgsdf_2da(7,2,0,1)
hi =
-0.0476 -0.0136 0.0068 0.0136 0.0068 -0.0136 -0.0476
-0.0136 0.0204 0.0408 0.0476 0.0408 0.0204 -0.0136
0.0068 0.0408 0.0612 0.0680 0.0612 0.0408 0.0068
0.0136 0.0476 0.0680 0.0748 0.0680 0.0476 0.0136
0.0068 0.0408 0.0612 0.0680 0.0612 0.0408 0.0068
-0.0136 0.0204 0.0408 0.0476 0.0408 0.0204 -0.0136
-0.0476 -0.0136 0.0068 0.0136 0.0068 -0.0136 -0.0476
>> hi=sgsdf_2da(7,3,0,0)
hi =
-0.0476 -0.0136 0.0068 0.0136 0.0068 -0.0136 -0.0476
-0.0136 0.0204 0.0408 0.0476 0.0408 0.0204 -0.0136
0.0068 0.0408 0.0612 0.0680 0.0612 0.0408 0.0068
0.0136 0.0476 0.0680 0.0748 0.0680 0.0476 0.0136
0.0068 0.0408 0.0612 0.0680 0.0612 0.0408 0.0068
-0.0136 0.0204 0.0408 0.0476 0.0408 0.0204 -0.0136
-0.0476 -0.0136 0.0068 0.0136 0.0068 -0.0136 -0.0476
>> hi=sgsdf_2da(7,3,0,1)
hi =
-0.0476 -0.0136 0.0068 0.0136 0.0068 -0.0136 -0.0476
-0.0136 0.0204 0.0408 0.0476 0.0408 0.0204 -0.0136
0.0068 0.0408 0.0612 0.0680 0.0612 0.0408 0.0068
0.0136 0.0476 0.0680 0.0748 0.0680 0.0476 0.0136
0.0068 0.0408 0.0612 0.0680 0.0612 0.0408 0.0068
-0.0136 0.0204 0.0408 0.0476 0.0408 0.0204 -0.0136
-0.0476 -0.0136 0.0068 0.0136 0.0068 -0.0136 -0.0476
>> hi=sgsdf_2da(7,4,0,0)
hi =
-0.0142 -0.0359 0.0291 0.0637 0.0291 -0.0359 -0.0142
-0.0359 -0.0575 0.0074 0.0421 0.0074 -0.0575 -0.0359
0.0291 0.0074 0.0724 0.1070 0.0724 0.0074 0.0291
0.0637 0.0421 0.1070 0.1416 0.1070 0.0421 0.0637
0.0291 0.0074 0.0724 0.1070 0.0724 0.0074 0.0291
-0.0359 -0.0575 0.0074 0.0421 0.0074 -0.0575 -0.0359
-0.0142 -0.0359 0.0291 0.0637 0.0291 -0.0359 -0.0142
>> hi=sgsdf_2da(7,4,0,1)
hi =
0.0425 -0.0359 -0.0049 0.0183 -0.0049 -0.0359 0.0425
-0.0359 -0.0575 0.0074 0.0421 0.0074 -0.0575 -0.0359
-0.0049 0.0074 0.0928 0.1342 0.0928 0.0074 -0.0049
0.0183 0.0421 0.1342 0.1779 0.1342 0.0421 0.0183
-0.0049 0.0074 0.0928 0.1342 0.0928 0.0074 -0.0049
-0.0359 -0.0575 0.0074 0.0421 0.0074 -0.0575 -0.0359
0.0425 -0.0359 -0.0049 0.0183 -0.0049 -0.0359 0.0425
>> hi=sgsdf_2da(7,5,0,0)
hi =
-0.0142 -0.0359 0.0291 0.0637 0.0291 -0.0359 -0.0142
-0.0359 -0.0575 0.0074 0.0421 0.0074 -0.0575 -0.0359
0.0291 0.0074 0.0724 0.1070 0.0724 0.0074 0.0291
0.0637 0.0421 0.1070 0.1416 0.1070 0.0421 0.0637
0.0291 0.0074 0.0724 0.1070 0.0724 0.0074 0.0291
-0.0359 -0.0575 0.0074 0.0421 0.0074 -0.0575 -0.0359
-0.0142 -0.0359 0.0291 0.0637 0.0291 -0.0359 -0.0142
>> hi=sgsdf_2da(7,5,0,1)
hi =
0.0425 -0.0359 -0.0049 0.0183 -0.0049 -0.0359 0.0425
-0.0359 -0.0575 0.0074 0.0421 0.0074 -0.0575 -0.0359
-0.0049 0.0074 0.0928 0.1342 0.0928 0.0074 -0.0049
0.0183 0.0421 0.1342 0.1779 0.1342 0.0421 0.0183
-0.0049 0.0074 0.0928 0.1342 0.0928 0.0074 -0.0049
-0.0359 -0.0575 0.0074 0.0421 0.0074 -0.0575 -0.0359
0.0425 -0.0359 -0.0049 0.0183 -0.0049 -0.0359 0.0425

Play 2.4.0 https support - RSA no longer available?

Sorry for the horribly long stack trace, but I suspect it will be asked for eventually to get to the bottom of this.
I upgraded to Play 2.4.0 from a prior version which (still) works on the same server.
We know the .keystore file is valid: I copied it in case there was some file-locking issue with another process, and when that didn't work I shut down the running instance and used that copy, yet still no joy.
Update: if we RUN activator it works; however, if we START activator it does not. I will continue to investigate the relevance of that new nugget of knowledge.
Update (2016/01/01): I have installed the certificate on a dev workstation to troubleshoot this further, and sadly HTTPS refuses to run under OS X either, with yet another horribly long and virtually meaningless (as in no real error being pointed out) stack trace.
[error] - play.core.server.NettyServer$PlayPipelineFactory - cannot load SSL context
java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_31]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_31]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_31]
at java.lang.reflect.Constructor.newInstance(Constructor.java:408) ~[na:1.8.0_31]
at play.core.server.ssl.ServerSSLEngine$.createScalaSSLEngineProvider(ServerSSLEngine.scala:96) ~[com.typesafe.play.play-server_2.11-2.4.0.jar:2.4.0]
at play.core.server.ssl.ServerSSLEngine$.createSSLEngineProvider(ServerSSLEngine.scala:32) ~[com.typesafe.play.play-server_2.11-2.4.0.jar:2.4.0]
at play.core.server.NettyServer$PlayPipelineFactory.liftedTree1$1(NettyServer.scala:113) [com.typesafe.play.play-netty-server_2.11-2.4.0.jar:2.4.0]
at play.core.server.NettyServer$PlayPipelineFactory.sslEngineProvider$lzycompute(NettyServer.scala:112) [com.typesafe.play.play-netty-server_2.11-2.4.0.jar:2.4.0]
at play.core.server.NettyServer$PlayPipelineFactory.sslEngineProvider(NettyServer.scala:111) [com.typesafe.play.play-netty-server_2.11-2.4.0.jar:2.4.0]
at play.core.server.NettyServer$PlayPipelineFactory.getPipeline(NettyServer.scala:90) [com.typesafe.play.play-netty-server_2.11-2.4.0.jar:2.4.0]
at org.jboss.netty.channel.socket.nio.NioServerBoss.registerAcceptedChannel(NioServerBoss.java:134) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerBoss.process(NioServerBoss.java:104) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [io.netty.netty-3.10.3.Final.jar:na]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_31]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_31]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_31]
Caused by: java.lang.Exception: Error loading HTTPS keystore from /usr/share/play/.keystore
at play.core.server.ssl.DefaultSSLEngineProvider.createSSLContext(DefaultSSLEngineProvider.scala:47) ~[com.typesafe.play.play-server_2.11-2.4.0.jar:2.4.0]
at play.core.server.ssl.DefaultSSLEngineProvider.<init>(DefaultSSLEngineProvider.scala:21) ~[com.typesafe.play.play-server_2.11-2.4.0.jar:2.4.0]
... 19 common frames omitted
Caused by: java.security.NoSuchAlgorithmException: RSA KeyManagerFactory not available
at sun.security.jca.GetInstance.getInstance(GetInstance.java:159) ~[na:1.8.0_31]
at javax.net.ssl.KeyManagerFactory.getInstance(KeyManagerFactory.java:137) ~[na:1.8.0_31]
at play.core.server.ssl.DefaultSSLEngineProvider.createSSLContext(DefaultSSLEngineProvider.scala:42) ~[com.typesafe.play.play-server_2.11-2.4.0.jar:2.4.0]
... 20 common frames omitted
[error] - play.core.server.netty.PlayDefaultUpstreamHandler - Exception caught in Netty
java.lang.IllegalArgumentException: empty text
at org.jboss.netty.handler.codec.http.HttpVersion.<init>(HttpVersion.java:89) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpVersion.valueOf(HttpVersion.java:62) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpRequestDecoder.createMessage(HttpRequestDecoder.java:75) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:191) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:102) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:500) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [io.netty.netty-3.10.3.Final.jar:na]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_31]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_31]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_31]
[error] - play.core.server.netty.PlayDefaultUpstreamHandler - Exception caught in Netty
java.lang.IllegalArgumentException: empty text
at org.jboss.netty.handler.codec.http.HttpVersion.<init>(HttpVersion.java:89) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpVersion.valueOf(HttpVersion.java:62) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpRequestDecoder.createMessage(HttpRequestDecoder.java:75) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:191) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:102) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:500) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.cleanup(ReplayingDecoder.java:554) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.frame.FrameDecoder.channelDisconnected(FrameDecoder.java:365) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:102) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireChannelDisconnected(Channels.java:396) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.close(AbstractNioWorker.java:360) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.handleAcceptedSocket(NioServerSocketPipelineSink.java:81) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.eventSunk(NioServerSocketPipelineSink.java:36) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendDownstream(DefaultChannelPipeline.java:779) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.oneone.OneToOneEncoder.handleDownstream(OneToOneEncoder.java:54) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:591) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendDownstream(DefaultChannelPipeline.java:784) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelHandler.closeRequested(SimpleChannelHandler.java:334) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelHandler.handleDownstream(SimpleChannelHandler.java:260) [io.netty.netty-3.10.3.Final.jar:na]
at com.typesafe.netty.http.pipelining.HttpPipeliningHandler.handleDownstream(HttpPipeliningHandler.java:106) [com.typesafe.netty.netty-http-pipelining-1.1.4.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:591) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:582) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.close(Channels.java:812) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.AbstractChannel.close(AbstractChannel.java:206) [io.netty.netty-3.10.3.Final.jar:na]
at play.core.server.netty.PlayDefaultUpstreamHandler.exceptionCaught(PlayDefaultUpstreamHandler.scala:66) [com.typesafe.play.play-netty-server_2.11-2.4.0.jar:2.4.0]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:112) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelHandler.exceptionCaught(SimpleChannelHandler.java:156) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:130) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.exceptionCaught(SimpleChannelUpstreamHandler.java:153) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:112) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.frame.FrameDecoder.exceptionCaught(FrameDecoder.java:377) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:112) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireExceptionCaught(Channels.java:525) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.AbstractChannelSink.exceptionCaught(AbstractChannelSink.java:48) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.notifyHandlerException(DefaultChannelPipeline.java:658) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:566) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [io.netty.netty-3.10.3.Final.jar:na]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_31]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_31]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_31]
[error] - play.core.server.netty.PlayDefaultUpstreamHandler - Exception caught in Netty
java.lang.IllegalArgumentException: empty text
at org.jboss.netty.handler.codec.http.HttpVersion.<init>(HttpVersion.java:89) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpVersion.valueOf(HttpVersion.java:62) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpRequestDecoder.createMessage(HttpRequestDecoder.java:75) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:191) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:102) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:500) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [io.netty.netty-3.10.3.Final.jar:na]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_31]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_31]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_31]
[error] - play.core.server.netty.PlayDefaultUpstreamHandler - Exception caught in Netty
java.lang.IllegalArgumentException: empty text
at org.jboss.netty.handler.codec.http.HttpVersion.<init>(HttpVersion.java:89) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpVersion.valueOf(HttpVersion.java:62) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpRequestDecoder.createMessage(HttpRequestDecoder.java:75) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:191) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:102) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:500) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.cleanup(ReplayingDecoder.java:554) ~[io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.frame.FrameDecoder.channelDisconnected(FrameDecoder.java:365) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:102) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireChannelDisconnected(Channels.java:396) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.close(AbstractNioWorker.java:360) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.handleAcceptedSocket(NioServerSocketPipelineSink.java:81) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.eventSunk(NioServerSocketPipelineSink.java:36) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendDownstream(DefaultChannelPipeline.java:779) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.oneone.OneToOneEncoder.handleDownstream(OneToOneEncoder.java:54) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:591) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendDownstream(DefaultChannelPipeline.java:784) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelHandler.closeRequested(SimpleChannelHandler.java:334) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelHandler.handleDownstream(SimpleChannelHandler.java:260) [io.netty.netty-3.10.3.Final.jar:na]
at com.typesafe.netty.http.pipelining.HttpPipeliningHandler.handleDownstream(HttpPipeliningHandler.java:106) [com.typesafe.netty.netty-http-pipelining-1.1.4.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:591) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:582) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.close(Channels.java:812) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.AbstractChannel.close(AbstractChannel.java:206) [io.netty.netty-3.10.3.Final.jar:na]
at play.core.server.netty.PlayDefaultUpstreamHandler.exceptionCaught(PlayDefaultUpstreamHandler.scala:66) [com.typesafe.play.play-netty-server_2.11-2.4.0.jar:2.4.0]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:112) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelHandler.exceptionCaught(SimpleChannelHandler.java:156) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:130) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.exceptionCaught(SimpleChannelUpstreamHandler.java:153) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:112) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.handler.codec.frame.FrameDecoder.exceptionCaught(FrameDecoder.java:377) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:112) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireExceptionCaught(Channels.java:525) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.AbstractChannelSink.exceptionCaught(AbstractChannelSink.java:48) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.notifyHandlerException(DefaultChannelPipeline.java:658) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:566) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [io.netty.netty-3.10.3.Final.jar:na]
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [io.netty.netty-3.10.3.Final.jar:na]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_31]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_31]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_31]
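To isolate the "RSA KeyManagerFactory not available" cause shown above, here is a tiny standalone check (plain JDK, nothing Play-specific; the class name is mine) that prints the default KeyManagerFactory algorithm and the algorithms the running JVM actually registers. On a stock Oracle/OpenJDK the default is SunX509 (with PKIX also available), and asking for "RSA" fails with exactly the message in the trace; per the stack trace it is Play's DefaultSSLEngineProvider.createSSLContext that ends up making that request.

import java.security.NoSuchAlgorithmException;
import java.security.Security;
import javax.net.ssl.KeyManagerFactory;

public class KmfCheck {
    public static void main(String[] args) {
        // Default KeyManagerFactory algorithm for this JVM (SunX509 on Oracle/OpenJDK).
        System.out.println("default:   " + KeyManagerFactory.getDefaultAlgorithm());
        // All KeyManagerFactory algorithms registered with the installed JCA providers.
        System.out.println("available: " + Security.getAlgorithms("KeyManagerFactory"));

        try {
            KeyManagerFactory.getInstance("RSA");
        } catch (NoSuchAlgorithmException e) {
            // Prints "RSA KeyManagerFactory not available", matching the cause in the trace.
            System.out.println(e.getMessage());
        }
    }
}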

I am trying to run the ShowFileStatusTest given in the Hadoop: The Definitive Guide book, and I get the following error

Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 3.836 sec <<< FAILURE!
throwsFileNotFoundForNonExistentFile(org.anahata.play.hadoop.ShowFileStatusTest) Time elapsed: 3.667 sec <<< ERROR!
java.lang.NullPointerException
at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:422)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:280)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:124)
at org.anahata.play.hadoop.ShowFileStatusTest.setUp(ShowFileStatusTest.java:57)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:27)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:49)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:35)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:115)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:97)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.maven.surefire.booter.ProviderFactory$ClassLoaderProxy.invoke(ProviderFactory.java:103)
at $Proxy0.invoke(Unknown Source)
at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:150)
at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcess(SurefireStarter.java:91)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:69)
fileStatusForFile(org.anahata.play.hadoop.ShowFileStatusTest) Time elapsed: 0.072 sec <<< ERROR!
java.io.IOException: Cannot lock storage /tmp/dfs/name1. The directory is already locked.
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.lock(Storage.java:602)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:1219)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:1237)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1164)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:184)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:267)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:124)
at org.anahata.play.hadoop.ShowFileStatusTest.setUp(ShowFileStatusTest.java:57)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:27)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:49)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:35)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:115)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:97)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.maven.surefire.booter.ProviderFactory$ClassLoaderProxy.invoke(ProviderFactory.java:103)
at $Proxy0.invoke(Unknown Source)
at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:150)
at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcess(SurefireStarter.java:91)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:69)
fileStatusForDirectory(org.anahata.play.hadoop.ShowFileStatusTest) Time elapsed: 0.044 sec <<< ERROR!
java.io.IOException: Cannot lock storage /tmp/dfs/name1. The directory is already locked.
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.lock(Storage.java:602)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:1219)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:1237)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1164)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:184)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:267)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:124)
at org.anahata.play.hadoop.ShowFileStatusTest.setUp(ShowFileStatusTest.java:57)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:27)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:49)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:35)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:115)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:97)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.maven.surefire.booter.ProviderFactory$ClassLoaderProxy.invoke(ProviderFactory.java:103)
at $Proxy0.invoke(Unknown Source)
at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:150)
at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcess(SurefireStarter.java:91)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:69)
As discussed in our chat session, this is a problem in the org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(String[], Configuration, SecureResources) method, which asserts that the data node data directories must be rwxr-xr-x.
There is a warning in the logs saying that you currently have them as 'rwxrwxr-x'. You'll need to configure your session's umask (a Linux concept, not a Java 'thing') to be 0022, rather than the current value of 0002. Execute umask in the terminal to confirm this, then set it with umask 0022. Now your tests should succeed.
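To make the permission arithmetic concrete (a throwaway check, not part of the test project; the class name is mine): directories are created with mode 0777 masked by the process umask, so umask 0002 yields 0775 (rwxrwxr-x, group-writable) while umask 0022 yields the 0755 (rwxr-xr-x) the data node check expects.

public class UmaskMath {
    public static void main(String[] args) {
        int base = 0777; // default directory creation mode before the umask is applied
        // 0777 & ~0002 = 0775 -> rwxrwxr-x (group-writable, rejected by the datanode check)
        System.out.printf("umask 0002 -> %o%n", base & ~0002);
        // 0777 & ~0022 = 0755 -> rwxr-xr-x (what the datanode check expects)
        System.out.printf("umask 0022 -> %o%n", base & ~0022);
    }
}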
Depending on how you're executing your tests, you may need to add the umask setting to your bash profile and re-apply it:
http://www.cyberciti.biz/tips/understanding-linux-unix-umask-value-usage.html
