I'm trying to use CallableStatements to get the value of IDENTITY() in HSQLDB from Java JDBC.
I can prepareCall fine. The issue is with registerOutParameter: I get "parameter index out of range" no matter what index I pass in.
I've tried SQL snippets like "{? = CALL IDENTITY()}" with no luck.
Any clues? Am I completely off track in how to invoke HSQLDB function routines from JDBC?
Instead of using IDENTITY(), use getGeneratedKeys() to retrieve any keys generated by the (insert) statement.
Note that you do need to use one of the Statement.execute... or Connection.prepare... methods that will enable this feature.
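A minimal sketch of that approach, assuming an open HSQLDB connection named conn and a placeholder table (the table and column names are not from the original question):

// Ask JDBC to return generated keys for this insert.
try (PreparedStatement ps = conn.prepareStatement(
        "INSERT INTO my_table (name) VALUES (?)",
        Statement.RETURN_GENERATED_KEYS)) {
    ps.setString(1, "example");
    ps.executeUpdate();
    try (ResultSet keys = ps.getGeneratedKeys()) {
        if (keys.next()) {
            long newId = keys.getLong(1);   // the identity value just generated
        }
    }
}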
Gah.
http://sourceforge.net/tracker/index.php?func=detail&aid=3530755&group_id=23316&atid=378134
Output parameters for function invocation are not supported. Use executeQuery and grab the ResultSet.
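For example (a sketch; conn is assumed to be an open HSQLDB connection, and this reads the identity generated by the last insert on that connection):

// Run IDENTITY() as a query instead of registering an out parameter.
try (Statement st = conn.createStatement();
     ResultSet rs = st.executeQuery("CALL IDENTITY()")) {
    if (rs.next()) {
        long lastId = rs.getLong(1);
    }
}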
I want to switch database based on a request header, which I have managed to do but in a rather clunky way.
I had to change my JDBC driver to the Neo4j native driver to get the "USE database" prefix to work.
I had to prefix my Cypher query with "USE database ".
What I really want is to do this via AOP, so that I can annotate the method with my custom Java annotation and have an aspect call "USE DATABASE" in isolation before going on to the join point and running the actual query.
When I try this, though, I get this error:
Query cannot conclude with USE GRAPH (must be RETURN or an update clause)
Is it possible?
You can change the database on the session object.
// Bind the session to the target database, then run the write in a transaction.
try (Session session = driver.session(SessionConfig.forDatabase("databaseName"))) {
    session.executeWrite(tx -> tx.run(stmt, params));
}
You shouldn't use JDBC for applications; it's intended more for tools.
https://neo4j.com/docs/java-manual/current/cypher-workflow/#java-database-selection
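For completeness, a minimal sketch of the surrounding setup (URI and credentials are placeholders, not from the question):

import org.neo4j.driver.AuthTokens;
import org.neo4j.driver.Driver;
import org.neo4j.driver.GraphDatabase;
import org.neo4j.driver.Session;
import org.neo4j.driver.SessionConfig;

public class DatabaseSelectionExample {
    public static void main(String[] args) {
        // One Driver per application; one Session per unit of work, with the
        // target database chosen on the SessionConfig instead of a "USE" prefix.
        try (Driver driver = GraphDatabase.driver("neo4j://localhost:7687",
                AuthTokens.basic("neo4j", "secret"))) {
            try (Session session = driver.session(SessionConfig.forDatabase("databaseName"))) {
                session.executeWrite(tx -> tx.run("RETURN 1").consume());
            }
        }
    }
}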
I have written a BeanShell sampler whose objective is to set multiple values for a single key. I tried lpush just for a start, then moved on to the zadd method, which I believe is for adding multiple values under a single key.
But in both cases I get an error in the log viewer saying that the respective method does not exist in the Jedis class. How do I go about resolving this error?
I have tried replacing the Jedis jar file (the current version I am using is 2.2.1) with the latest version in the lib folder in order to make sure the methods I need are available. But then starting JMeter from the command line throws java.lang.VerifyError.
Please let me know the solution for this problem.
Show your code so that we can help more precisely.
Another option is to use this if you only want to read the values:
http://jmeter-plugins.org/wiki/RedisDataSet/
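If you do need to write the values, something along these lines inside the sampler should work with the Jedis client (a sketch; host, port and key names are placeholders):

import redis.clients.jedis.Jedis;

// Two ways to keep several values under one key: a list via lpush,
// or a sorted set via zadd (each member carries a score).
Jedis jedis = new Jedis("localhost", 6379);
jedis.lpush("myListKey", "value1", "value2", "value3");
jedis.zadd("myScoredKey", 1.0, "value1");
jedis.zadd("myScoredKey", 2.0, "value2");
jedis.disconnect();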
I'm currently writing an aggregation query for MongoDB in my Spring project, in which I'm using the $project operator. Within this operator I would like to compare two fields and return the result as the projected "matches" key value. Here's the MongoDB shell equivalent (which works):
{ $project:
    { matches:
        { $eq: ["$lastDate", "$meta.date"] }
    }
}
I've read the Spring Data MongoDB documentation and found some useful info about ProjectionOperation's 'andExpression' method, which uses SpEL. The resulting Java code from my investigation was:
new ProjectionOperation().andExpression("lastDate == meta.date").as("matches")
Unfortunately, I'm receiving this exception:
java.lang.IllegalArgumentException: Unsupported Element:
org.springframework.data.mongodb.core.spel.OperatorNode@70c1152a Type: class org.springframework.data.mongodb.core.spel.OperatorNode You probably have a syntax error in your SpEL expression!
As far as I've checked, Spring Data MongoDB handles all arithmetic operators correctly but cannot handle the comparison ones. So I'd like to ask: is there any other way to create such a query with Spring Data MongoDB? Or am I missing something crucial about SpEL?
I resolved this issue by passing a JSON aggregate command (built with DBObjects in order to preserve the flexibility of the query) to MongoDB, i.e.:
MongoOperations#executeCommand(DBObject command)
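Roughly like this (a sketch; the collection name is a placeholder, mongoOperations is the injected MongoOperations bean, and the exact executeCommand signature depends on your Spring Data MongoDB version):

import java.util.Arrays;

import com.mongodb.BasicDBObject;
import com.mongodb.CommandResult;
import com.mongodb.DBObject;

// Build the $project stage by hand, then pass the whole aggregate
// command to MongoOperations#executeCommand.
DBObject eq = new BasicDBObject("$eq", Arrays.asList("$lastDate", "$meta.date"));
DBObject project = new BasicDBObject("$project", new BasicDBObject("matches", eq));

DBObject command = new BasicDBObject("aggregate", "myCollection")
        .append("pipeline", Arrays.asList(project));

CommandResult result = mongoOperations.executeCommand(command);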
Some of my DDL statements related to full text search (e.g. CREATE INDEX ... FOR TEXT) only run when executed from db2ts. I'd like to issue these statements directly from some Java code using JDBC, but an SQL syntax error is returned.
Is there a way to emit these commands from JDBC?
Actually, you have to call the admin procedure in SYSPROC to do the job. The documentation is not that clear, but I've been able to delete an index by using:
CALL SYSPROC.SYSTS_DROP('DB2ADMIN','TESTS_FTS_FT','en_US',?)
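From JDBC that call looks roughly like this (a sketch; I'm assuming the trailing ? is a VARCHAR message out-parameter and that conn is an open connection):

// Invoke the text-search admin procedure through a CallableStatement.
try (CallableStatement cs = conn.prepareCall(
        "CALL SYSPROC.SYSTS_DROP('DB2ADMIN', 'TESTS_FTS_FT', 'en_US', ?)")) {
    cs.registerOutParameter(1, java.sql.Types.VARCHAR);
    cs.execute();
    System.out.println(cs.getString(1));   // any message returned by the procedure
}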
Trying to configure Tigase to use HSQLDB (hsqldb-1.8.0.9-1jpp.2) instead of Derby (don't ask why, that's not the point), and everything works fine except for setting some properties at the end. In Derby I had
CREATE procedure TigAddUserPlainPw(userId varchar(2049), userPw varchar(255))
PARAMETER STYLE JAVA
LANGUAGE JAVA
MODIFIES SQL DATA
DYNAMIC RESULT SETS 1
EXTERNAL NAME 'tigase.db.derby.StoredProcedures.tigAddUserPlainPw';
and
call TigAddUserPlainPw('db-properties', NULL);
When I try to replicate this with HSQLDB using
CREATE ALIAS TigAddUserPlainPw
FOR "tigase.db.derby.StoredProcedures.tigAddUserPlainPw";
and
CALL TigAddUserPlainPw('db-properties', NULL);
I get this error message
[root@tikanga scripts]# ./hsqldb-db-create.sh /var/lib/tigase/db/tigase
SQL Error at '/etc/tigase/database/hsqldb-schema-4-props.sql' line 1:
"CALL TigAddUserPlainPw('db-properties', NULL)"
Wrong data type: [Ljava.sql.ResultSet; in statement [CALL TigAddUserPlainPw(]
Any idea, what I am doing wrong?
You cannot use the Java static methods as they are. The ResultSet[] parameters are not acceptable to HSQLDB 1.8.x.
It would be easier to convert to HSQLDB 2.0, as its stored procedure support has improved over version 1.8.
Your example shows we need to make some more improvements to HSQLDB to support these procedure declarations.
As far as I know, HSQLDB supports Java functions that return a ResultSet (Fred will correct me if I'm wrong): http://hsqldb.org/doc/2.0/guide/sqlroutines-chapt.html#N12835
You would need a new method:
public static ResultSet tigAddUserPlainPw(
String userId, String userPw) throws SQLException;
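For illustration, such a method could look something like this (just a sketch of the shape, using jdbc:default:connection and placeholder table/column names rather than Tigase's actual schema):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class HsqldbStoredProcedures {

    // Returns the ResultSet directly instead of filling a ResultSet[] OUT
    // parameter the way the Derby procedure does.
    public static ResultSet tigAddUserPlainPw(String userId, String userPw)
            throws SQLException {
        // Connection of the calling session inside the database engine.
        Connection conn = DriverManager.getConnection("jdbc:default:connection");

        PreparedStatement insert = conn.prepareStatement(
                "INSERT INTO tig_users (user_id, user_pw) VALUES (?, ?)");
        insert.setString(1, userId);
        insert.setString(2, userPw);
        insert.executeUpdate();

        PreparedStatement select = conn.prepareStatement(
                "SELECT user_id FROM tig_users WHERE user_id = ?");
        select.setString(1, userId);
        return select.executeQuery();
    }
}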