Unable to start gremlin-server due to version mismatch. How can I fix this issue?
Here is the full stack trace:
Exception in thread "main" java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor.<init>(ServerGremlinExecutor.java:121)
at org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor.<init>(ServerGremlinExecutor.java:89)
at org.apache.tinkerpop.gremlin.server.GremlinServer.<init>(GremlinServer.java:110)
at org.apache.tinkerpop.gremlin.server.GremlinServer.main(GremlinServer.java:354)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor.<init>(ServerGremlinExecutor.java:110)
... 3 more
Caused by: org.janusgraph.core.JanusGraphException: StorageBackend version is incompatible with current JanusGraph version: storage [0.2.1] vs. runtime [0.2.0]
at org.janusgraph.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1427)
at org.janusgraph.core.JanusGraphFactory.lambda$open$0(JanusGraphFactory.java:152)
at org.janusgraph.graphdb.management.JanusGraphManager.openGraph(JanusGraphManager.java:210)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:151)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:101)
at org.janusgraph.graphdb.management.JanusGraphManager.lambda$new$0(JanusGraphManager.java:65)
at java.util.LinkedHashMap.forEach(LinkedHashMap.java:684)
at org.janusgraph.graphdb.management.JanusGraphManager.<init>(JanusGraphManager.java:64)
... 8 more
Exception in thread "gremlin-server-shutdown" java.lang.NullPointerException
at org.apache.tinkerpop.gremlin.server.GremlinServer.stop(GremlinServer.java:264)
at org.apache.tinkerpop.gremlin.server.GremlinServer.lambda$new$0(GremlinServer.java:91)
at java.lang.Thread.run(Thread.java:748)
I am working with janusgraph-0.2.0-hadoop2.zip, downloaded from the JanusGraph website. I don't know why the error says the storage backend is at 0.2.1.
Downloading janusgraph-0.2.1-hadoop2.zip, then starting Cassandra and Gremlin Server with
./janusgraph-0.2.1-hadoop2/bin/cassandra
./janusgraph-0.2.1-hadoop2/bin/gremlin-server.sh
resolved the issue.
Note: I also modified the janusgraph-0.2.1-hadoop2/conf/gremlin-server/gremlin-server.yaml and gremlin-server-configuration.yaml files to use conf/janusgraph-cassandra.properties for the ConfigurationManagementGraph property.
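For reference, the relevant fragment of my YAML ended up looking roughly like this (a sketch only; the graphManager line and graph names are assumptions based on the standard JanusGraph setup, not copied from my exact files):

# conf/gremlin-server/gremlin-server.yaml (fragment, assumed layout)
graphManager: org.janusgraph.graphdb.management.JanusGraphManager
graphs: {
  graph: conf/janusgraph-cassandra.properties,
  ConfigurationManagementGraph: conf/janusgraph-cassandra.properties
}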
You need to add the following property to your configuration file. It allows the upgrade so that the database becomes compatible with the newer version.
graph.allow-upgrade=true
Name: graph.allow-upgrade
Description: Setting this to true will allow certain fixed values to be updated, such as storage-version. This should only be used for upgrading.
Datatype: Boolean
Default Value: false
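For context, here is how the property fits into the graph configuration (a minimal sketch; the backend settings below are assumptions, keep whatever your conf/janusgraph-cassandra.properties already contains):

# conf/janusgraph-cassandra.properties (fragment; backend values assumed)
gremlin.graph=org.janusgraph.core.JanusGraphFactory
storage.backend=cassandrathrift
storage.hostname=127.0.0.1
# lets JanusGraph update fixed values such as storage-version; remove once the upgrade is done
graph.allow-upgrade=true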
Related
I have installed Elasticsearch 2.2.0 and it worked fine for a week, but now it doesn't start. I have set both JAVA_HOME and JRE_HOME. I use Java 1.8 (i.e., jdk1.8.0_201 and jre1.8.0_202). When I try to start elasticsearch.bat, it terminates with the error message:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/cli/CommandLineParser
Likely root cause: java.lang.ClassNotFoundException: org.apache.commons.cli.CommandLineParser
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:241)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:35)
Refer to the log for complete error details.
But no logs have been generated.
I was practicing creating a proxy in WSO2, following this tutorial: https://docs.wso2.com/display/ESB490/Sending+a+Simple+Message+Through+the+ESB
I ran into problems at step 2 of 'Building and deploying SimpleStockQuoteService'. When I try to run wso2server.bat, I receive the message below:
"Starting Sample Axis2 Server ..."
Using AXIS2_HOME: C:\Oxaguia spk\Trabalhos\Coach IT\Piramidal\Servers\wso2ei-6.1.1\samples\axis2Server
Using JAVA_HOME: C:\Program Files\Java\jdk1.8.0_121
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Server could not start due to class loading issue java.lang.NoSuchMethodException: samples.util.SampleAxis2Server.startServer([Ljava.lang.String;)
I read on the internet that this might be caused by my JDK version, so I tried 1.7.0_45 and 1.6.0_45. In that case I got this message:
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
The AXIS2_HOME variable is also set to 'C:\Oxaguia spk\Trabalhos\Coach IT\Piramidal\Servers\wso2ei-6.1.1\samples\axis2Server'.
I really don't know what is happening. If someone could help me I would be quite grateful.
PS: After following the instructions in 584 on GitHub to change axis2server.bat, I receive this message:
"Starting Sample Axis2 Server ..."
Using AXIS2_HOME: C:\Oxaguia spk\Trabalhos\Coach IT\Piramidal\Servers\wso2ei-6.1.1\samples\axis2Server
Using JAVA_HOME: C:\Program Files\Java\jdk1.8.0_144
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[main] INFO samples.util.SampleAxis2ServerManager - [SimpleAxisServer] Starting
[SimpleAxisServer] Using the Axis2 Repository : C:\Oxaguia spk\Trabalhos\Coach IT\Piramidal\Servers\wso2ei-6.1.1\samples\axis2Server\repository
[SimpleAxisServer] Using the Axis2 Configuration File : C:\Oxaguia spk\Trabalhos\Coach IT\Piramidal\Servers\wso2ei-6.1.1\samples\axis2Server\repository\conf\axis2.xml
[main] WARN org.apache.axiom.util.stax.dialect.StAXDialectDetector - Unable to determine dialect of the StAX implementation at jar:file:/C:/Oxaguia%20spk/Trabalhos/Coach%20IT/Piramidal/Servers/wso2ei-6.1.1/wso2/components/plugins/axiom_1.2.11.wso2v11.jar!/
[main] FATAL samples.util.SampleAxis2ServerManager - [SimpleAxisServer] Shutting down. Error starting SimpleAxisServer
org.apache.axis2.deployment.DeploymentException: javax/transaction/SystemException
at org.apache.axis2.deployment.AxisConfigBuilder.processTransportSenders(AxisConfigBuilder.java:704)
at org.apache.axis2.deployment.AxisConfigBuilder.populateConfig(AxisConfigBuilder.java:124)
at org.apache.axis2.deployment.DeploymentEngine.populateAxisConfiguration(DeploymentEngine.java:887)
at org.apache.axis2.deployment.FileSystemConfigurator.getAxisConfiguration(FileSystemConfigurator.java:116)
at org.apache.axis2.context.ConfigurationContextFactory.createConfigurationContext(ConfigurationContextFactory.java:64)
at org.apache.axis2.context.ConfigurationContextFactory.createConfigurationContextFromFileSystem(ConfigurationContextFactory.java:210)
at samples.util.SampleAxis2ServerManager.start(SampleAxis2ServerManager.java:93)
at samples.util.SampleAxis2Server.startServer(SampleAxis2Server.java:61)
at samples.util.SampleAxis2Server.main(SampleAxis2Server.java:40)
Caused by: java.lang.NoClassDefFoundError: javax/transaction/SystemException
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
at java.lang.Class.newInstance(Class.java:412)
at org.apache.axis2.deployment.AxisConfigBuilder.processTransportSenders(AxisConfigBuilder.java:688)
... 8 more
Caused by: java.lang.ClassNotFoundException: javax.transaction.SystemException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 13 more
Thank you everyone.
Enterprise Integrator 6.1.1 is supported on JDK 1.8.* (it is not supported on lower JDK versions).
The issue you observed is fixed in the upcoming version.
The second error happens because the classes were compiled with a higher JDK than the one used at runtime.
Compile and run with the same JDK; for EI 6.1.1 that means 1.8.
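If you want to confirm which JDK a class file targets, javap prints its class-file version; the path below is just an example, point it at any class from the failing jar:

rem major version 52 = Java 8, 51 = Java 7, 50 = Java 6
javap -verbose samples\util\SampleAxis2Server.class | findstr "major"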
I am unable to start an embedded Drillbit on a Windows machine and am getting the following error. I have checked the jars in the 3rd-party folder, where jackson-databind-2.7.1.jar is present, yet it still fails to load the class. Can you help me here?
Error: Failure in starting embedded Drillbit: UNSUPPORTED_OPERATION ERROR: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry.
[Error Id: 4e654256-f63d-434f-8f41-981892a776b5 ] (state=,code=0)
java.sql.SQLException: Failure in starting embedded Drillbit: UNSUPPORTED_OPERATION ERROR: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry.
[Error Id: 4e654256-f63d-434f-8f41-981892a776b5 ]
at org.apache.drill.jdbc.impl.DrillConnectionImpl.<init>(DrillConnectionImpl.java:120)
at org.apache.drill.jdbc.impl.DrillJdbc41Factory.newDrillConnection(DrillJdbc41Factory.java:64)
at org.apache.drill.jdbc.impl.DrillFactory.newConnection(DrillFactory.java:69)
at net.hydromatic.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:126)
at org.apache.drill.jdbc.Driver.connect(Driver.java:72)
at sqlline.DatabaseConnection.connect(DatabaseConnection.java:167)
at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:213)
at sqlline.Commands.connect(Commands.java:1083)
at sqlline.Commands.connect(Commands.java:1015)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)
at sqlline.SqlLine.dispatch(SqlLine.java:742)
at sqlline.SqlLine.initArgs(SqlLine.java:528)
at sqlline.SqlLine.begin(SqlLine.java:596)
at sqlline.SqlLine.start(SqlLine.java:375)
at sqlline.SqlLine.main(SqlLine.java:268)
Caused by: org.apache.drill.common.exceptions.UserException: UNSUPPORTED_OPERATION ERROR: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry.
[Error Id: 4e654256-f63d-434f-8f41-981892a776b5 ]
at org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:543)
at org.apache.drill.common.config.DrillConfig.getInstance(DrillConfig.java:88)
at org.apache.drill.exec.server.DrillbitContext.<init>(DrillbitContext.java:85)
at org.apache.drill.exec.work.WorkManager.start(WorkManager.java:105)
at org.apache.drill.exec.server.Drillbit.run(Drillbit.java:110)
at org.apache.drill.jdbc.impl.DrillConnectionImpl.<init>(DrillConnectionImpl.java:118)
... 18 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.drill.common.config.DrillConfig.getInstance(DrillConfig.java:86)
... 22 more
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;
at org.apache.drill.exec.serialization.JacksonSerializer.<init>(JacksonSerializer.java:32)
at org.apache.drill.exec.store.sys.PersistentStoreConfig.newJacksonBuilder(PersistentStoreConfig.java:81)
at org.apache.drill.exec.store.StoragePluginRegistryImpl.<init>(StoragePluginRegistryImpl.java:90)
... 27 more
apache drill 1.6.0
"this isn't your grandfather's sql"
The issue is related to the HADOOP_HOME environment variable.
If it is set, embedded Drill does not start properly.
My HADOOP_HOME was set because I sometimes use Spark or Hadoop MapReduce on this machine.
So, with
set HADOOP_HOME=
and then
sqlline.bat -u "jdbc:drill:zk=local"
the initialization completes and the Drillbit starts.
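Put together, the console session looks like this (Windows syntax, run from the Drill bin directory; set only affects the current session, so your permanent HADOOP_HOME is untouched):

rem show whether HADOOP_HOME is set, clear it for this session, then start embedded Drill
echo %HADOOP_HOME%
set HADOOP_HOME=
sqlline.bat -u "jdbc:drill:zk=local"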
I am testing Tomcat 7, showing data from the example table "malaga_plagues". I have made some changes but, when I try to test again, I get the following error while creating the .war file.
[root@host-192-168-192-78 AMS_Widhoc]# ../../../apache-maven-3.3.3/bin/mvn package
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/maven/cli/MavenCli : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:643)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClassFromSelf(ClassRealm.java:401)
at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:42)
at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:271)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:254)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:239)
at org.codehaus.plexus.classworlds.launcher.Launcher.getMainClass(Launcher.java:144)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:266)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[root@host-192-168-192-78 AMS_Widhoc]#
One month ago this was working fine, but now that I have resumed the example it does not work. I reinstalled Orion CB in my VM, but I didn't change anything in Tomcat.
Could you help me? Thank you.
Maven 3.3.3 is only compatible with Java 7+, and it seems you are trying to run it with Java 6 (class-file version 51.0 corresponds to Java 7, which a Java 6 JVM cannot load). Please check your JAVA_HOME environment variable; it should point to a Java 7 JDK.
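A quick way to verify and fix this (the JDK path below is an example; use the location of your own Java 7 installation):

# show which Java is active, then point JAVA_HOME at a Java 7 JDK
java -version
export JAVA_HOME=/usr/java/jdk1.7.0_79
export PATH=$JAVA_HOME/bin:$PATH
../../../apache-maven-3.3.3/bin/mvn -version   # should now report Java 1.7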
I am trying to get a Spark/Shark cluster up but keep running into the same problem.
I have followed the instructions at https://github.com/amplab/shark/wiki/Running-Shark-on-a-Cluster and set up Hive as stated.
I think the Shark driver is picking up another version of the Hadoop jars, but I am unsure why.
Here are the details; any help would be great.
Spark/Shark 0.9.0
Apache Hadoop 2.3.0
Amplabs Hive 0.11
Scala 2.10.3
Java 7
I have everything installed, but I get some deprecation warnings and then an exception:
14/03/14 11:24:47 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
14/03/14 11:24:47 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
Exception:
Exception in thread "main" org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1072)
at shark.memstore2.TableRecovery$.reloadRdds(TableRecovery.scala:49)
at shark.SharkCliDriver.<init>(SharkCliDriver.scala:275)
at shark.SharkCliDriver$.main(SharkCliDriver.scala:162)
at shark.SharkCliDriver.main(SharkCliDriver.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1139)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2288)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2299)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1070)
... 4 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1137)
... 9 more
Caused by: java.lang.UnsupportedOperationException: Not implemented by the DistributedFileSystem FileSystem implementation
I had this same problem, and I think it's caused by incompatible versions of Hadoop/Hive and Spark/Shark.
You need to either:
1. Remove hadoop-core-1.0.x.jar from shark/lib_managed/jars/org.apache.hadoop/hadoop-core/ (see the sketch below), or
2. When building Shark, explicitly set SHARK_HADOOP_VERSION as follows:
cd shark
SHARK_HADOOP_VERSION=2.0.0-mr1-cdh4.5.0 ./sbt/sbt clean
SHARK_HADOOP_VERSION=2.0.0-mr1-cdh4.5.0 ./sbt/sbt package
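A minimal sketch of option 1 (the exact version in the jar filename may differ in your build):

# remove the bundled Hadoop 1.x core jar so it stops shadowing the cluster's Hadoop 2.3.0 classes
rm shark/lib_managed/jars/org.apache.hadoop/hadoop-core/hadoop-core-1.0.*.jar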
The second method also solved other issues for me. See this thread for more details: https://groups.google.com/forum/#!msg/shark-users/lTNPcxHJiOQ/EqzyByZrzQMJ