Where can I find the neo4j 2.2 JDBC driver binaries?

We are currently using the 2.0.0-M06 snapshot of the neo4j JDBC driver and are trying to move to the latest available version. We found version 2.1.4 in the Maven repository below:
https://m2.neo4j.org/content/repositories/releases/org/neo4j/neo4j-jdbc/
However, when we try to use it, we see the error below:
Caused by: java.lang.IllegalStateException: Error during parsing
at org.neo4j.jdbc.rest.StreamingParser$ParserState.nextToken(StreamingParser.java:71)
at org.neo4j.jdbc.rest.StreamingParser.skipTo(StreamingParser.java:313)
at org.neo4j.jdbc.rest.StreamingParser.nextResult(StreamingParser.java:130)
at org.neo4j.jdbc.rest.StreamingParser$2.hasNext(StreamingParser.java:265)
at org.neo4j.jdbc.rest.StreamingParser$2$1.endReached(StreamingParser.java:269)
at org.neo4j.jdbc.rest.StreamingParser$1.hasNext(StreamingParser.java:201)
at org.neo4j.jdbc.IteratorResultSet.hasNext(IteratorResultSet.java:98)
at org.neo4j.jdbc.IteratorResultSet.next(IteratorResultSet.java:63)
at com.mchange.v2.c3p0.impl.NewProxyResultSet.next(NewProxyResultSet.java:2859)
... 92 more
Caused by: java.io.IOException: Stream closed
at sun.nio.cs.StreamDecoder.ensureOpen(StreamDecoder.java:46)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:148)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at org.codehaus.jackson.impl.ReaderBasedParser.loadMore(ReaderBasedParser.java:117)
at org.codehaus.jackson.impl.ReaderBasedParser._skipWSOrEnd(ReaderBasedParser.java:1476)
at org.codehaus.jackson.impl.ReaderBasedParser.nextToken(ReaderBasedParser.java:368)
at org.neo4j.jdbc.rest.StreamingParser$ParserState.nextToken(StreamingParser.java:67)
... 100 more
We found a reference saying this is addressed in the 2.2 version of the driver and are therefore trying to download that. Can someone please point us in the right direction for getting the 2.2 binary of the neo4j-jdbc driver? Also, our database server currently runs neo4j 2.2.
Thx,
NN

I think version 2.2 has not been released yet.
You can try to build your own binaries from the source code: https://github.com/neo4j-contrib/neo4j-jdbc
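If you do build it yourself (cloning that repository and running mvn clean install is the usual Maven route), a minimal smoke test along the lines below can confirm the locally built jar works against your 2.2 server. The JDBC URL format and the Cypher query are assumptions based on the 2.x REST driver, so adjust host and port to your setup.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class Neo4jJdbcSmokeTest {
    public static void main(String[] args) throws Exception {
        // Assumed URL format for the 2.x REST-based driver; host/port are placeholders.
        try (Connection con = DriverManager.getConnection("jdbc:neo4j://localhost:7474/");
             Statement st = con.createStatement();
             // Any cheap Cypher query is enough to exercise the streaming parser.
             ResultSet rs = st.executeQuery("MATCH (n) RETURN count(n) AS c")) {
            while (rs.next()) {
                System.out.println("node count = " + rs.getLong("c"));
            }
        }
    }
}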

Related

Failing to connect to MQ with Java 11

After upgrading to JDK 11 from JDK 8 and to MQ 9.2.0.5 (from 9.2.0.4), I'm getting the error below when trying to open a JMS connection.
I'm running on WLS 14.
I also upgraded allclient.jar to 9.2.0.5.
I tried running with the previous MQ version (9.2.0.4), which worked fine with Java 8, and I get the same issue.
The same code works fine with MQ 9.2.0.4, JDK 8, and WLS 12.
I verified that the method exists in the jar and that no other versions of the allclient jar are present.
com.ibm.mq.jmqi.JmqiException: CC=2;RC=2195;AMQ9546: Error return code
received.
[1=java.lang.NoSuchMethodException[com.ibm.mq.jmqi.remote.api.RemoteFAP.<init>(com.ibm.mq.jmqi.JmqiEnvironment,int)],3=Class.getConstructor0]
at com.ibm.mq.jmqi.JmqiEnvironment.getInstance(JmqiEnvironment.java:857)
at com.ibm.mq.jmqi.JmqiEnvironment.getMQI(JmqiEnvironment.java:702)
at com.ibm.msg.client.wmq.factories.WMQConnectionFactory.createV7ProviderConnection(WMQConnectionFactory.java:8437)
at com.ibm.msg.client.wmq.factories.WMQConnectionFactory.createProviderConnection(WMQConnectionFactory.java:7815)
at com.ibm.msg.client.jms.admin.JmsConnectionFactoryImpl._createConnection(JmsConnectionFactoryImpl.java:322)
at com.ibm.msg.client.jms.admin.JmsConnectionFactoryImpl.createConnection(JmsConnectionFactoryImpl.java:242)
at com.ibm.mq.jms.MQConnectionFactory.createCommonConnection(MQConnectionFactory.java:6026)
at com.ibm.mq.jms.MQConnectionFactory.createConnection(MQConnectionFactory.java:6086)
at org.springframework.jms.connection.UserCredentialsConnectionFactoryAdapter.doCreateConnection(UserCredentialsConnectionFactoryAdapter.java:188)
at ...
Caused by: java.lang.NoSuchMethodException: com.ibm.mq.jmqi.remote.api.RemoteFAP.<init>(com.ibm.mq.jmqi.JmqiEnvironment, int)
at java.base/java.lang.Class.getConstructor0(Class.java:3349)
at java.base/java.lang.Class.getConstructor(Class.java:2151)
at com.ibm.mq.jmqi.JmqiEnvironment.getInstance(JmqiEnvironment.java:764)
Remove com.ibm.* packages from the prefer-application-packages section in weblogic.xml, if you have them there.
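For reference, the section in question looks roughly like the hypothetical fragment below; your actual package list will differ. With the com.ibm.* entry gone, the MQ classes resolve from one place instead of being split across class loaders.

<container-descriptor>
    <prefer-application-packages>
        <!-- remove this entry if it is present -->
        <package-name>com.ibm.*</package-name>
    </prefer-application-packages>
</container-descriptor>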
I just ran a couple of Java/JMS applications using OpenJDK 11 and they ran fine. I agree with Doug Grove that you probably have a mismatch of MQ JAR files.
Add the following line to your code and then update your question with the output:
System.out.println("java.class.path="+System.getProperty("java.class.path"));
If you want to get fancy then you can do:
if (null != System.getProperty("java.class.path"))
{
    // The classpath separator is ';' on Windows and ':' elsewhere,
    // so pick the right one before putting each entry on its own line.
    if (System.getProperty("os.name").startsWith("Windows"))
        System.out.println("java.class.path=\n" + (System.getProperty("java.class.path")).replace(';', '\n'));
    else
        System.out.println("java.class.path=\n" + (System.getProperty("java.class.path")).replace(':', '\n'));
}
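In the printed entries, duplicate com.ibm.mq.allclient jars or a mix of 9.2.0.4 and 9.2.0.5 versions would be exactly the kind of JAR mismatch suspected above.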

Trying to establish a JDBC connection to MS Access with "UCanAccess"

I want to establish a JDBC connection to MS Access with UCanAccess. The application that the connection should be built into runs on Java 7, and there is no chance to move to Java 8 right now. UCanAccess 5.0.0 and 5.0.1 need Java 8, so they do not work for me.
Here is the error message:
EXEC*sqlconnect [1]=net.ucanaccess.jdbc.UcanaccessDriver
[2]=DriverManager.getConnection("jdbc:ucanaccess://C:/temp/Wincan.mdb")
Exception in thread "AWT-EventQueue-0" java.lang.UnsupportedClassVersionError:
net/ucanaccess/jdbc/UcanaccessDriver : Unsupported major.minor version 52.0
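Major version 52 corresponds to Java 8 bytecode, 51 to Java 7, and 50 to Java 6. To verify what a class from the driver jar was actually compiled for, a small sketch like the one below reads the class-file header; the .class path is a placeholder for a file extracted from the jar.

import java.io.DataInputStream;
import java.io.FileInputStream;

public class ClassVersionCheck {
    public static void main(String[] args) throws Exception {
        // Path is a placeholder: extract UcanaccessDriver.class from the jar first.
        try (DataInputStream in = new DataInputStream(
                new FileInputStream("UcanaccessDriver.class"))) {
            in.readInt();                       // magic number 0xCAFEBABE
            int minor = in.readUnsignedShort(); // minor version
            int major = in.readUnsignedShort(); // 52 = Java 8, 51 = Java 7, 50 = Java 6
            System.out.println("major.minor = " + major + "." + minor);
        }
    }
}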
UCanAccess 4.0.4 doesn't work either, probably because it's compiled for Java 6. Here is the error message:
Error while connecting: sqlconnect:connect:No suitable driver found for
DriverManager.getConnection("jdbc:ucanaccess://C:/temp/Wincan.mdb")
This is the Java that is installed:
Java(TM) Platform SE 7 U45, Product version 7.0.450.18
Is there any chance to get a UCanAccess version that runs with the Java version mentioned above?

org.apache.kylin.job.exception.ExecuteException: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/serde2/typeinfo/TypeInfo

I found a similar error at https://issues.apache.org/jira/browse/KYLIN-2511
env:
hadoop-2.7.1
hbase-1.3.2
apache-hive-2.1.1-bin
apache-kylin-1.6.0-hbase1.x-bin
I've tried copying all the Hive libs to Kylin, but got another error:
org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoClassDefFoundError: org/apache/hadoop/hive/serde2/typeinfo/TypeInfo
The missing class should be in hive-exec-*.jar. Check and debug bin/find-hive-dependency.sh to see why it wasn't able to locate this jar on your server. You can manually add it to the hive_exec_path variable.
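To double-check at runtime which jar, if any, is actually serving the class, a small probe like the sketch below (a hypothetical helper, not part of Kylin) can be run with the same classpath as the failing job:

public class FindTypeInfoJar {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException if the class is missing from the classpath.
        Class<?> c = Class.forName("org.apache.hadoop.hive.serde2.typeinfo.TypeInfo");
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        System.out.println(src == null ? "loaded from bootstrap class loader" : src.getLocation());
    }
}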
BTW, Kylin 1.6 is quite old; try to upgrade to a 2.x version.
Why not just try the method mentioned in https://issues.apache.org/jira/browse/KYLIN-2511? You'd better prepare the environment according to the documentation for v1.6. It is also better to use the latest version of Kylin; it has more features and fixes some bugs.

HIVE_STATS_JDBC_TIMEOUT for Hive queries in Spark

I've just set up a new Hadoop 3.0 cluster with Hive 2.3.2 and Spark 2.3. When I try to run queries on Hive tables, I get the following error.
I know there were some bugs in Hive, and it seems they were fixed in 2.1.1, but I'm not sure what the situation is with the 2.3.2 version. Do you have any idea if this could be handled somehow?
Thanks
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_151)
Type in expressions to have them evaluated.
Type :help for more information.
scala> import spark.sql
import spark.sql
scala> sql("show databases")
java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
at org.apache.spark.sql.hive.HiveUtils$.formatTimeVarsForHiveClient(HiveUtils.scala:205)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:638)
... 49 elided
I am running Spark 2.3 with Hive 2.3.2 and encountered a similar issue.
The fix you mentioned is for Hive 2.1, as can be seen from the following Spark JIRA:
https://issues.apache.org/jira/browse/SPARK-13446
You can see from the latest comments that people are getting exactly the same error as yours.
Also, as this SO question's answer explains, the current Hive version supported by Spark is 2.1.
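One commonly suggested workaround, an assumption here rather than something from the answers above, is to pin Spark's metastore client to a Hive version it supports using the spark.sql.hive.metastore.* settings; a minimal sketch:

import org.apache.spark.sql.SparkSession;

public class MetastorePin {
    public static void main(String[] args) {
        // Pin the metastore client so Spark's built-in Hive client (compiled
        // against an older Hive) does not clash with the Hive 2.3 jars.
        SparkSession spark = SparkSession.builder()
                .appName("metastore-pin")
                .config("spark.sql.hive.metastore.version", "2.1.1")
                .config("spark.sql.hive.metastore.jars", "maven") // fetch matching client jars
                .enableHiveSupport()
                .getOrCreate();
        spark.sql("show databases").show();
    }
}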

net.sf.jasperreports.engine.JRException: No deserializer defined

I am trying to connect HBase to jasperreports-server-cp-6.0.1. I have Hadoop 2.5.2 and HBase 1.0.1 installed on my system.
I have installed the HBasePlugin-0.5.1.nbm plugin in iReport 5.6.0.
I have followed all the steps given in http://community.jaspersoft.com/wiki/hadoop-hbase
When I run the following query:
{ "tableName" : "blogposts", "deserializerClass" : "com.jaspersoft.hbase.deserialize.impl.ShellDeserializer" }
in iReport, I am getting the following error:
Message:
net.sf.jasperreports.engine.JRException: No deserializer defined
Level:
SEVERE
Stack Trace:
No deserializer defined
com.jaspersoft.hadoop.hbase.query.HBaseQueryWrapper.<init>(HBaseQueryWrapper.java:152)
com.jaspersoft.hadoop.hbase.HBaseFieldsProvider.getFields(HBaseFieldsProvider.java:50)
com.jaspersoft.ireport.hbase.designer.HBaseFieldsProvider.getFields(HBaseFieldsProvider.java:57)
com.jaspersoft.ireport.hbase.connection.HBaseConnection.readFields(HBaseConnection.java:185)
com.jaspersoft.ireport.designer.wizards.ConnectionSelectionWizardPanel.validate(ConnectionSelectionWizardPanel.java:146)
org.openide.WizardDescriptor$7.run(WizardDescriptor.java:1357)
org.openide.util.RequestProcessor$Task.run(RequestProcessor.java:572)
org.openide.util.RequestProcessor$Processor.run(RequestProcessor.java:997)
Could you please help me with this error? (I also tried iReport 4.0.2, but received the same error.)
Both iReport and the HBase connector are outdated.
Try using the Apache Phoenix JDBC driver, which is compatible with the latest release (6.2) of the Jaspersoft products:
http://community.jaspersoft.com/wiki/how-use-apache-phoenix-jdbc-driver-run-reports-hbase
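From the reporting side, a Phoenix connection is plain JDBC, so a minimal connectivity check might look like the sketch below; the ZooKeeper quorum and the query are placeholders for your HBase setup.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PhoenixSmokeTest {
    public static void main(String[] args) throws Exception {
        // URL format is jdbc:phoenix:<zookeeper quorum>:<port>; values are placeholders.
        try (Connection con = DriverManager.getConnection("jdbc:phoenix:localhost:2181");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 1")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}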
Thanks!
