Unable to start elasticsearch 2.2.0

I installed elasticsearch 2.2.0 and it worked fine for a week, but now it doesn't start. I have set both JAVA_HOME and JRE_HOME, and I use Java 1.8 (i.e., jdk1.8.0_201 and jre1.8.0_202). When I try to start elasticsearch.bat, it terminates with the error message:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/cli/CommandLineParser
Likely root cause: java.lang.ClassNotFoundException: org.apache.commons.cli.CommandLineParser
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:241)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:35)
Refer to the log for complete error details.
But no logs have been generated.
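That class normally comes from the commons-cli jar that ships in the Elasticsearch lib folder, so one quick check (a sketch; the install path below is an assumption, adjust it to your Elasticsearch home) is to verify the jar is still present and not zero bytes:

rem hypothetical install path; adjust to your Elasticsearch home
dir "C:\elasticsearch-2.2.0\lib\commons-cli*.jar"

If the jar is missing or corrupted, re-extracting the Elasticsearch 2.2.0 distribution over the install (keeping your config and data directories) should restore it.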

Related

OServerCommandGetGephi ClassNotFoundException OrientDB 3.0.6

Environment:
Docker: version 1.12.6
Linux: CentOS 7
Java: Java 1.8.0_45
OrientDB: 3.0.6
I am trying to start an OrientDB instance from an image built using the Dockerfile at https://github.com/orientechnologies/orientdb-docker/blob/bc7e1c48bedbd259df885dd922a13f79dfef0107/3.0/x86_64/alpine/Dockerfile.
When I run the image, I get an exception in the middle of the startup process:
2018-09-04 07:28:30:865 INFO - shutdown storage: OSystem... [OrientDBDistributed]Error during server execution
java.lang.IllegalArgumentException: Cannot create custom command invoking the constructor: com.orientechnologies.orient.graph.server.command.OServerCommandGetGephi(com.orientechnologies.orient.server.config.OServerCommandConfiguration@501d7d86)
at com.orientechnologies.orient.server.network.OServerNetworkListener.createCommand(OServerNetworkListener.java:129)
at com.orientechnologies.orient.server.network.OServerNetworkListener.<init>(OServerNetworkListener.java:88)
at com.orientechnologies.orient.server.OServer.activate(OServer.java:453)
at com.orientechnologies.orient.server.OServerMain$1.run(OServerMain.java:48)
Caused by: java.lang.ClassNotFoundException: com.orientechnologies.orient.graph.server.command.OServerCommandGetGephi
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.orientechnologies.orient.server.network.OServerNetworkListener.createCommand(OServerNetworkListener.java:123)
However, I am able to download the same OrientDB bits and start the instance directly (without Docker) through the $ORIENT_HOME/bin/server.sh script.
Any help in resolving this issue? Please let me know.
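The ClassNotFoundException suggests the jar that provides OServerCommandGetGephi (the OrientDB graph server module) is not on the classpath inside the image, even though it is present in the plain download. One way to narrow this down (a sketch; the image tag and the /orientdb/lib path are assumptions based on the Dockerfile above) is to list the jars shipped in the container:

# image tag and lib path are assumptions based on the Dockerfile above
docker run --rm orientdb:3.0.6 ls /orientdb/lib | grep -i graph

If no graph jar shows up, the image build dropped it; alternatively, removing the OServerCommandGetGephi <command> entry from config/orientdb-server-config.xml avoids loading that class at startup.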

StorageBackend version is incompatible with current JanusGraph version

Unable to start gremlin-server due to a version mismatch. How can I fix this issue?
Here is the full stack trace:
Exception in thread "main" java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor.<init>(ServerGremlinExecutor.java:121)
at org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor.<init>(ServerGremlinExecutor.java:89)
at org.apache.tinkerpop.gremlin.server.GremlinServer.<init>(GremlinServer.java:110)
at org.apache.tinkerpop.gremlin.server.GremlinServer.main(GremlinServer.java:354)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor.<init>(ServerGremlinExecutor.java:110)
... 3 more
Caused by: org.janusgraph.core.JanusGraphException: StorageBackend version is incompatible with current JanusGraph version: storage [0.2.1] vs. runtime [0.2.0]
at org.janusgraph.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1427)
at org.janusgraph.core.JanusGraphFactory.lambda$open$0(JanusGraphFactory.java:152)
at org.janusgraph.graphdb.management.JanusGraphManager.openGraph(JanusGraphManager.java:210)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:151)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:101)
at org.janusgraph.graphdb.management.JanusGraphManager.lambda$new$0(JanusGraphManager.java:65)
at java.util.LinkedHashMap.forEach(LinkedHashMap.java:684)
at org.janusgraph.graphdb.management.JanusGraphManager.<init>(JanusGraphManager.java:64)
... 8 more
Exception in thread "gremlin-server-shutdown" java.lang.NullPointerException
at org.apache.tinkerpop.gremlin.server.GremlinServer.stop(GremlinServer.java:264)
at org.apache.tinkerpop.gremlin.server.GremlinServer.lambda$new$0(GremlinServer.java:91)
at java.lang.Thread.run(Thread.java:748)
I am working with janusgraph-0.2.0-hadoop2.zip downloaded from the JanusGraph website. I don't know why the error says the storage backend is at 0.2.1.
Downloading janusgraph-0.2.1-hadoop2.zip and starting Cassandra and gremlin-server with
./janusgraph-0.2.1-hadoop2/bin/cassandra
./janusgraph-0.2.1-hadoop2/bin/gremlin-server.sh
resolved the issue.
Note: I also modified the janusgraph-0.2.1-hadoop2/conf/gremlin-server/gremlin-server.yaml and gremlin-server-configuration.yaml files to use conf/janusgraph-cassandra.properties for the ConfigurationManagementGraph property.
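For reference, the graphs section of those yaml files ends up pointing ConfigurationManagementGraph at that properties file, roughly like this (a sketch of the relevant fragment only):

graphs: {
  # fragment only; the properties path follows the note above
  ConfigurationManagementGraph: conf/janusgraph-cassandra.properties
}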
You need to add the following property to your configuration file. It allows the upgrade so the database becomes compatible with the newer version:
graph.allow-upgrade=true
Name: graph.allow-upgrade
Description: Setting this to true will allow certain fixed values to be updated, such as storage-version. This should only be used for upgrading.
Datatype: Boolean
Default value: false
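Concretely, the property goes into the JanusGraph properties file the server opens, for example (a sketch; the backend and hostname values are assumptions and depend on your setup):

# conf/janusgraph-cassandra.properties (excerpt; backend values are illustrative)
gremlin.graph=org.janusgraph.core.JanusGraphFactory
storage.backend=cassandrathrift
storage.hostname=127.0.0.1
# permit fixed values such as storage-version to be updated during the upgrade
graph.allow-upgrade=true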

Problems when trying to create proxy in WSO2

I was practicing creating a proxy in WSO2, following the tutorial https://docs.wso2.com/display/ESB490/Sending+a+Simple+Message+Through+the+ESB,
and had some problems in step 2 of 'Building and deploying SimpleStockQuoteService'. When I try to run wso2server.bat, I receive the message below:
"Starting Sample Axis2 Server ..."
Using AXIS2_HOME: C:\Oxaguia spk\Trabalhos\Coach IT\Piramidal\Servers\wso2ei-6.1.1\samples\axis2Server
Using JAVA_HOME: C:\Program Files\Java\jdk1.8.0_121
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Server could not start due to class loading issue java.lang.NoSuchMethodException: samples.util.SampleAxis2Server.startServer([Ljava.lang.String;)
I saw on the internet that it might be because of my JDK version, so I tried 1.7.0_45 and 1.6.0_45. In that case I got this message:
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
AXIS2_HOME was also set to 'C:\Oxaguia spk\Trabalhos\Coach IT\Piramidal\Servers\wso2ei-6.1.1\samples\axis2Server'.
I really don't know what is happening. If someone could help me, I would be very grateful.
PS: After trying the tutorial of 584 to change the axis2server.bat on GitHub, I receive this message:
"Starting Sample Axis2 Server ..."
Using AXIS2_HOME: C:\Oxaguia spk\Trabalhos\Coach IT\Piramidal\Servers\wso2ei-6.1.1\samples\axis2Server
Using JAVA_HOME: C:\Program Files\Java\jdk1.8.0_144
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[main] INFO samples.util.SampleAxis2ServerManager - [SimpleAxisServer] Starting
[SimpleAxisServer] Using the Axis2 Repository : C:\Oxaguia spk\Trabalhos\Coach IT\Piramidal\Servers\wso2ei-6.1.1\samples\axis2Server\repository
[SimpleAxisServer] Using the Axis2 Configuration File : C:\Oxaguia spk\Trabalhos\Coach IT\Piramidal\Servers\wso2ei-6.1.1\samples\axis2Server\repository\conf\axis2.xml
[main] WARN org.apache.axiom.util.stax.dialect.StAXDialectDetector - Unable to determine dialect of the StAX implementation at jar:file:/C:/Oxaguia%20spk/Trabalhos/Coach%20IT/Piramidal/Servers/wso2ei-6.1.1/wso2/components/plugins/axiom_1.2.11.wso2v11.jar!/
[main] FATAL samples.util.SampleAxis2ServerManager - [SimpleAxisServer] Shutting down. Error starting SimpleAxisServer
org.apache.axis2.deployment.DeploymentException: javax/transaction/SystemException
at org.apache.axis2.deployment.AxisConfigBuilder.processTransportSenders(AxisConfigBuilder.java:704)
at org.apache.axis2.deployment.AxisConfigBuilder.populateConfig(AxisConfigBuilder.java:124)
at org.apache.axis2.deployment.DeploymentEngine.populateAxisConfiguration(DeploymentEngine.java:887)
at org.apache.axis2.deployment.FileSystemConfigurator.getAxisConfiguration(FileSystemConfigurator.java:116)
at org.apache.axis2.context.ConfigurationContextFactory.createConfigurationContext(ConfigurationContextFactory.java:64)
at org.apache.axis2.context.ConfigurationContextFactory.createConfigurationContextFromFileSystem(ConfigurationContextFactory.java:210)
at samples.util.SampleAxis2ServerManager.start(SampleAxis2ServerManager.java:93)
at samples.util.SampleAxis2Server.startServer(SampleAxis2Server.java:61)
at samples.util.SampleAxis2Server.main(SampleAxis2Server.java:40)
Caused by: java.lang.NoClassDefFoundError: javax/transaction/SystemException
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
at java.lang.Class.newInstance(Class.java:412)
at org.apache.axis2.deployment.AxisConfigBuilder.processTransportSenders(AxisConfigBuilder.java:688)
... 8 more
Caused by: java.lang.ClassNotFoundException: javax.transaction.SystemException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 13 more
Thank you everyone.
Enterprise Integrator 6.1.1 is supported on JDK 1.8.* (not supported on lower JDK versions).
The issue you observed is fixed in the upcoming version.
The 2nd error happens because you are using a higher JDK at compile time and a lower JDK at runtime.
Try both compiling and running on 1.7.
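One way to confirm this kind of mismatch (a sketch; the jar path is an assumption, point it at the jar containing the failing class) is to ask javap which class-file version the class was compiled for; major version 52 is Java 8, 51 is Java 7, 50 is Java 6:

rem jar path is an assumption; point -classpath at the jar with the failing class
javap -verbose -classpath path\to\ant-launcher.jar org.apache.tools.ant.launch.Launcher | findstr "major"

If the printed major version is higher than what your runtime JDK supports, you get exactly this UnsupportedClassVersionError.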

Spark without Hadoop: Failed to Launch

I'm running Spark 2.1.0, Hive 2.1.1 and Hadoop 2.7.3 on Ubuntu 16.04.
I downloaded the Spark project from GitHub and built the "without hadoop" version:
./dev/make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.7,parquet-provided"
When I run ./sbin/start-master.sh, I get the following exception:
Spark Command: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java -cp /home/server/spark/conf/:/home/server/spark/jars/*:/home/server/hadoop/etc/hadoop/:/home/server/hadoop/share/hadoop/common/lib/:/home/server/hadoop/share/hadoop/common/:/home/server/hadoop/share/hadoop/mapreduce/:/home/server/hadoop/share/hadoop/mapreduce/lib/:/home/server/hadoop/share/hadoop/yarn/:/home/server/hadoop/share/hadoop/yarn/lib/ -Xmx1g org.apache.spark.deploy.master.Master --host ThinkPad-W550s-Lab --port 7077 --webui-port 8080
========================================
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
I edited SPARK_DIST_CLASSPATH according to the post Where are hadoop jar files in hadoop 2?
export SPARK_DIST_CLASSPATH=~/hadoop/share/hadoop/common/lib:~/hadoop/share/hadoop/common:~/hadoop/share/hadoop/mapreduce:~/hadoop/share/hadoop/mapreduce/lib:~/hadoop/share/hadoop/yarn:~/hadoop/share/hadoop/yarn/lib
But I'm still getting the same error.
I can see the slf4j jar file is under ~/hadoop/share/hadoop/common/lib.
How could I fix this error?
Thank you!
"Hadoop free" builds need SPARK_DIST_CLASSPATH modified to include Hadoop's package jars.
The most convenient place to do this is by adding an entry in conf/spark-env.sh:
export SPARK_DIST_CLASSPATH=$(/path/to/hadoop/bin/hadoop classpath)
Note that hadoop classpath emits wildcard entries such as .../share/hadoop/common/lib/*; the bare directory entries in your export only add loose .class files to the classpath, not the jars inside those directories, which is why the slf4j jar under common/lib was never picked up.
Check this: https://spark.apache.org/docs/latest/hadoop-provided.html
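Applied to the layout in the question, conf/spark-env.sh would contain something like this (a sketch; /home/server/hadoop is taken from the Spark command shown above):

# conf/spark-env.sh; /home/server/hadoop is taken from the Spark command above
export SPARK_DIST_CLASSPATH=$(/home/server/hadoop/bin/hadoop classpath)

Running /home/server/hadoop/bin/hadoop classpath by itself first is a quick way to confirm it prints the expected wildcard entries before restarting the master.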

spark2-submit is throwing java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream [duplicate]

This question already has answers here:
Resolving dependency problems in Apache Spark
(7 answers)
Closed 5 years ago.
I am getting the following error when running spark2-submit after installing Spark 2.0.0.
Does anyone know why it is not able to find the Hadoop jar files? When I run echo $HADOOP_HOME, it shows the correct HADOOP_HOME path.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:118)
at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:118)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.deploy.SparkSubmitArguments.mergeDefaultSparkProperties(SparkSubmitArguments.scala:118)
at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:104)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:117)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Try adding the jars under HADOOP_HOME/share/hadoop to the CLASSPATH.
For example, I added CLASSPATH in /etc/profile; my Hadoop version is 2.7.2:
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.2.jar:$HADOOP_HOME/share/hadoop/common/hadoop-common-2.7.2.jar:$HADOOP_HOME/share/hadoop/common/lib/commons-cli-1.2.jar:$CLASSPATH
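A version-independent variant (a sketch; it assumes $HADOOP_HOME is set and uses Java's /* classpath wildcards, supported since Java 6) avoids hardcoding the 2.7.2 jar names:

# assumes $HADOOP_HOME is set; /* wildcards pull in every jar in the directory
export CLASSPATH=$CLASSPATH:$HADOOP_HOME/share/hadoop/common/*:$HADOOP_HOME/share/hadoop/common/lib/*:$HADOOP_HOME/share/hadoop/mapreduce/*

Alternatively, as in the previous answer, export SPARK_DIST_CLASSPATH=$(hadoop classpath) lets Hadoop compute the full list for you.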
