Ignite 2.10.0 org.h2.jdbc.JdbcSQLException: Function "LOCK_MODE" not found - h2

Recently I upgraded Ignite from 2.7 to 2.10.0. Since then, my integration tests annotated with @SpringBootTest have been failing.
I am using H2 version 1.4.197 in the pom as h2.version; upgrading to 1.4.200 didn't solve the problem.
Caused by: org.h2.jdbc.JdbcSQLException: Function "LOCK_MODE" not found; SQL statement:
CALL LOCK_MODE() [90022-197]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:357)
at org.h2.message.DbException.get(DbException.java:179)
at org.h2.message.DbException.get(DbException.java:155)
at org.h2.command.Parser.readJavaFunction(Parser.java:2699)
at org.h2.command.Parser.readFunction(Parser.java:2756)
at org.h2.command.Parser.readTerm(Parser.java:3102)
at org.h2.command.Parser.readFactor(Parser.java:2587)
at org.h2.command.Parser.readSum(Parser.java:2574)
at org.h2.command.Parser.readConcat(Parser.java:2544)
at org.h2.command.Parser.readCondition(Parser.java:2370)
at org.h2.command.Parser.readAnd(Parser.java:2342)
at org.h2.command.Parser.readExpression(Parser.java:2334)
at org.h2.command.Parser.parseCall(Parser.java:4854)
at org.h2.command.Parser.parsePrepared(Parser.java:382)
at org.h2.command.Parser.parse(Parser.java:335)
at org.h2.command.Parser.parse(Parser.java:307)
at org.h2.command.Parser.prepareCommand(Parser.java:278)
at org.h2.engine.Session.prepareLocal(Session.java:611)
at org.h2.engine.Session.prepareCommand(Session.java:549)
at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1251)
at org.h2.jdbc.JdbcConnection.getTransactionIsolation(JdbcConnection.java:815)
at com.zaxxer.hikari.pool.PoolBase.checkDefaultIsolation(PoolBase.java:479)
at com.zaxxer.hikari.pool.PoolBase.checkDriverSupport(PoolBase.java:442)
at com.zaxxer.hikari.pool.PoolBase.setupConnection(PoolBase.java:410)
at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:363)
at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:206)
at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:477)
at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:560)
... 151 more
Thank you for your help. I have been trying to solve this problem for a while, but with no luck.

The Function class in h2 core registers these functions in a static block, but somehow the function map is missing entries; I don't know why. At least in my case, things work once I add the following code (JDK 11):
// Imports needed (H2 1.4.x puts Function in org.h2.expression):
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;
import java.lang.invoke.VarHandle;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.HashMap;
import org.h2.expression.Function;

try {
    // Grab H2's private static FUNCTIONS registry.
    Field functionInfos = Function.class.getDeclaredField("FUNCTIONS");
    functionInfos.setAccessible(true);
    var lookup = MethodHandles.privateLookupIn(Field.class, MethodHandles.lookup());
    VarHandle modifiers = lookup.findVarHandle(Field.class, "modifiers", int.class);
    int mods = functionInfos.getModifiers();
    if (Modifier.isFinal(mods)) {
        modifiers.set(functionInfos, mods & ~Modifier.FINAL);
    }
    HashMap map = (HashMap) functionInfos.get(null);
    System.out.println(" =================== function size: " + map.size());
    if (map.size() < 189) {
        // Re-register LOCK_MODE if the static initializer somehow skipped it.
        var method = MethodHandles.privateLookupIn(Function.class, MethodHandles.lookup());
        MethodHandle addFunc = method.findStatic(Function.class, "addFunctionNotDeterministic",
                MethodType.methodType(void.class, String.class, int.class, int.class, int.class));
        addFunc.invoke("LOCK_MODE", 214, 0, org.h2.value.Value.INT);
    }
} catch (Throwable e) {
    e.printStackTrace();
}
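The [90022-197] suffix in the error is telling: H2 embeds its build number in error codes, so 1.4.197 classes are still the ones executing even after the pom was bumped, which usually suggests two H2 jars (for example one pulled in transitively by Ignite) are on the classpath. Before reaching for reflection, it may be worth printing where the classes actually come from; a small JDK-only sketch (the class names passed in main are just examples):

```java
import java.security.CodeSource;

public class WhichJar {
    /** Returns the jar/path a class was loaded from, or a note if it cannot be determined. */
    static String locationOf(String className) {
        try {
            CodeSource src = Class.forName(className).getProtectionDomain().getCodeSource();
            return src != null ? src.getLocation().toString() : "JDK/bootstrap class (no code source)";
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // With H2 on the classpath, pass "org.h2.expression.Function" here to see
        // which jar the 1.4.x Function class is really served from.
        System.out.println(locationOf("java.sql.Connection"));
        System.out.println(locationOf("org.h2.Driver"));
    }
}
```

If the printed location is a different jar than the 1.4.200 one in your pom, excluding or aligning the duplicate dependency is a less fragile fix than patching the function map.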

Related

Exception with Spring Data Cassandra

I am getting an exception while querying Cassandra using Spring Data Cassandra. Please help.
2021-06-01 12:09:48.594 INFO 9568 --- [nio-8080-exec-3] c.e.demo2.Controller.DemoController : Error : org.springframework.data.cassandra.CassandraUncategorizedException: Query; CQL [select * from summary_data where proj_id = ? and category = ? and name = ? and time >= ?]; Query timed out after PT10S; nested exception is com.datastax.oss.driver.api.core.DriverTimeoutException: Query timed out after PT10S
private final AsyncCassandraOperations asyncCassandraTemplate;

public List<Data1> getData(String convProjectId, List<String> stageNames,
                           String eventCategory, List<String> distinctDateHour) {
    final String cql = "select * from summary_data where proj_id = ? and category = ? and name = ? and time >= ?";
    List<Data1> bList = new ArrayList<>();
    ArrayList<ListenableFuture<List<Data1>>> bFutureList = new ArrayList<>();
    distinctDateHour.forEach(dateHr -> stageNames.forEach(stageName -> {
        ListenableFuture<List<Data1>> futureBData = asyncCassandraTemplate.getAsyncCqlOperations().query(cql,
            ps -> ps.bind().setString(0, convProjectId.toLowerCase()).setString(1, dateHr)
                .setString(2, eventCategory.toLowerCase()).setString(3, stageName),
            (row, rowNum) -> Data1.getData1(row));
        bFutureList.add(futureBData);
    }));
    bFutureList.forEach(future -> {
        try {
            bList.addAll(future.get());
        } catch (Exception exception) {
            // exceptions are swallowed here
        }
    });
    return bList;
}
Spring-Data-Cassandra : 3.0.9.RELEASE
The Java driver for Cassandra throws a DriverTimeoutException when it doesn't get a response back from the coordinator within the driver request timeout, which in your case is 10 seconds (PT10S).
You need to review the Cassandra logs to determine why the coordinator didn't respond to the driver's request. Either you were executing an expensive query or the cluster was overloaded at the time.
Note that you shouldn't confuse DriverTimeoutException with a ReadTimeoutException, which is the error thrown by the coordinator when it hasn't received responses from the replica(s) within the read request timeout. Cheers!
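If the query itself is sound and the cluster is merely slow, the driver-side timeout can be raised as a stopgap. Spring Data Cassandra 3.x sits on the DataStax 4.x driver, which reads its settings from an application.conf on the classpath; a minimal sketch (the 30-second value is just an example, and raising it does not fix an overloaded cluster):

```hocon
# src/main/resources/application.conf
datastax-java-driver {
  # How long the driver waits for the coordinator before DriverTimeoutException.
  basic.request.timeout = 30 seconds
}
```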

Error executing query in Java code to connect to Presto

We are trying to connect to Presto from Java code and execute some queries. The catalog we are using is MySQL.
Presto is installed on a Linux server; the Presto CLI works fine there, and Presto has been started on that machine.
MySQL is also installed on the Linux machine, and we can access it from Windows using DbVisualizer.
I created a MySQL connector catalog for Presto and can query MySQL data successfully through the Presto CLI: presto --server localhost:8080 --catalog mysql --schema tutorials.
Running the Java code on the Windows machine, we can reach MySQL directly and execute queries, but we cannot query data through Presto: it fails with Error Executing Query. In the example below I have used a jar from Trino.
package testdbPresto;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Properties;

public class PrestoJdbc {
    public static void main(String args[]) throws SQLException, ClassNotFoundException {
        try {
            // connect to the mysql tutorials database here
            Class.forName("com.facebook.presto.jdbc.PrestoDriver");
            String url = "jdbc:trino://35.173.241.37:8080/mysql/tutorials";
            Properties properties = new Properties();
            properties.setProperty("user", "root");
            properties.setProperty("password", "Redcar88!");
            properties.setProperty("SSL", "true");
            Connection connection = DriverManager.getConnection(url, properties);
            Statement statement = connection.createStatement();
            // select two columns from the mysql author table
            String sql = "select auth_id, auth_name from mysql.tutorials.author";
            ResultSet resultSet = statement.executeQuery(sql);
            // Extract data from the result set
            while (resultSet.next()) {
                // Retrieve by column name
                String name = resultSet.getString("auth_name");
                // Display values
                System.out.println("name : " + name);
            }
            // Clean-up environment
            resultSet.close();
            statement.close();
            connection.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Output:
java.sql.SQLException: Error executing query
at io.trino.jdbc.TrinoStatement.internalExecute(TrinoStatement.java:274)
at io.trino.jdbc.TrinoStatement.execute(TrinoStatement.java:227)
at io.trino.jdbc.TrinoStatement.executeQuery(TrinoStatement.java:76)
at testdbPresto.PrestoJdbc.main(PrestoJdbc.java:29)
Caused by: java.io.UncheckedIOException: javax.net.ssl.SSLException: Unsupported or unrecognized SSL message
at io.trino.jdbc.$internal.client.JsonResponse.execute(JsonResponse.java:154)
at io.trino.jdbc.$internal.client.StatementClientV1.<init>(StatementClientV1.java:110)
at io.trino.jdbc.$internal.client.StatementClientFactory.newStatementClient(StatementClientFactory.java:24)
at io.trino.jdbc.QueryExecutor.startQuery(QueryExecutor.java:46)
at io.trino.jdbc.TrinoConnection.startQuery(TrinoConnection.java:728)
at io.trino.jdbc.TrinoStatement.internalExecute(TrinoStatement.java:239)
... 3 more
Caused by: javax.net.ssl.SSLException: Unsupported or unrecognized SSL message
at sun.security.ssl.SSLSocketInputRecord.handleUnknownRecord(SSLSocketInputRecord.java:448)
at sun.security.ssl.SSLSocketInputRecord.decode(SSLSocketInputRecord.java:174)
at sun.security.ssl.SSLTransport.decode(SSLTransport.java:110)
at sun.security.ssl.SSLSocketImpl.decode(SSLSocketImpl.java:1279)
at sun.security.ssl.SSLSocketImpl.readHandshakeRecord(SSLSocketImpl.java:1188)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:401)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:373)
at io.trino.jdbc.$internal.okhttp3.internal.connection.RealConnection.connectTls(RealConnection.java:299)
at io.trino.jdbc.$internal.okhttp3.internal.connection.RealConnection.establishProtocol(RealConnection.java:268)
at io.trino.jdbc.$internal.okhttp3.internal.connection.RealConnection.connect(RealConnection.java:160)
at io.trino.jdbc.$internal.okhttp3.internal.connection.StreamAllocation.findConnection(StreamAllocation.java:256)
at io.trino.jdbc.$internal.okhttp3.internal.connection.StreamAllocation.findHealthyConnection(StreamAllocation.java:134)
at io.trino.jdbc.$internal.okhttp3.internal.connection.StreamAllocation.newStream(StreamAllocation.java:113)
at io.trino.jdbc.$internal.okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.java:42)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at io.trino.jdbc.$internal.okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.java:93)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at io.trino.jdbc.$internal.okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.java:93)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.java:125)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at io.trino.jdbc.$internal.client.OkHttpUtil.lambda$basicAuth$1(OkHttpUtil.java:85)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at io.trino.jdbc.$internal.client.OkHttpUtil.lambda$userAgent$0(OkHttpUtil.java:71)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at io.trino.jdbc.$internal.okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:200)
at io.trino.jdbc.$internal.okhttp3.RealCall.execute(RealCall.java:77)
at io.trino.jdbc.$internal.client.JsonResponse.execute(JsonResponse.java:131)
... 8 more
It is quite an old question, but it might still be relevant.
You are trying to connect to Trino with the Presto JDBC driver. PrestoSQL was rebranded as Trino, so in order to access Trino via JDBC you should use the Trino JDBC driver.
Add the Trino dependency to your classpath. If you use Maven, add this dependency to the pom:
<dependency>
    <groupId>io.trino</groupId>
    <artifactId>trino-jdbc</artifactId>
    <version>${trino-jdbc.version}</version>
</dependency>
Then use the following driver:
Class.forName("io.trino.jdbc.TrinoDriver");
Here is code that works with Trino.
import java.sql.DriverManager
import java.util.Properties
import kotlin.system.exitProcess

fun main() {
    val trinoUrl = "jdbc:trino://myDomain:443"
    val properties = Properties()
    properties.setProperty("user", "noUserS")
    // properties.setProperty("password", "noPass")
    properties.setProperty("SSL", "true")
    DriverManager.getConnection(trinoUrl, properties).use { trinoConn ->
        trinoConn.createStatement().use { statement ->
            statement.connection.catalog = "catalog1"
            statement.connection.schema = "default"
            println("Executing query...")
            statement.executeQuery("""
                select
                    restaurantId,
                    type,
                    time
                from table1
                where time > CURRENT_TIMESTAMP - INTERVAL '1' hour
                """.trimIndent()
            ).use { resultSet ->
                val list = mutableListOf<Map<String, String>>()
                while (resultSet.next()) {
                    val data = mapOf(
                        "restaurantId" to resultSet.getString("restaurantId"),
                        "type" to resultSet.getString("type"),
                        "time" to resultSet.getString("time")
                    )
                    list.add(data)
                }
                println("Records returned: ${list.size}")
                println(list)
            }
        }
    }
    exitProcess(0)
}
It is Kotlin, but it's easy to follow: the .use { .. } blocks are Kotlin's equivalent of Java's try-with-resources.
Hope this helps.
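Separately, the Caused by: javax.net.ssl.SSLException: Unsupported or unrecognized SSL message in the trace means the client spoke TLS to a port that answered in plaintext: with SSL=true the driver expects an HTTPS endpoint, while a default Presto/Trino install serves plain HTTP on 8080. The effect is easy to reproduce with the JDK alone (no Trino involved; the "server" below is a throwaway local socket):

```java
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import javax.net.ssl.SSLException;
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

public class TlsToPlainPort {
    /** Handshakes TLS against a plaintext server and reports what happens. */
    static String run() throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {
            Thread responder = new Thread(() -> {
                // Answer the first connection with plain HTTP, never TLS.
                try (Socket s = server.accept(); OutputStream out = s.getOutputStream()) {
                    out.write("HTTP/1.0 400 Bad Request\r\n\r\n".getBytes(StandardCharsets.US_ASCII));
                    out.flush();
                } catch (Exception ignored) {
                }
            });
            responder.start();
            // A TLS client handshaking against that plain port fails like the Trino trace.
            try (SSLSocket client = (SSLSocket) SSLSocketFactory.getDefault()
                    .createSocket("127.0.0.1", server.getLocalPort())) {
                client.startHandshake();
                return "handshake succeeded (unexpected)";
            } catch (SSLException e) {
                return "SSLException: " + e.getMessage();
            } finally {
                responder.join();
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run());
    }
}
```

So either drop SSL=true when talking to the plain-HTTP port, or point the URL at the coordinator's HTTPS port.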

Phoenix-hbase can't execute upsert query through jdbc

Firstly, I want to clarify that the CREATE query works fine.
Running query 1 followed by query 3 results in an error on query 3, but queries 2 and 3 together work fine. I can't find anything on the net.
class test {
    public static void main(String args[]) throws SQLException {
        Connection connection;
        try {
            Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
        connection = DriverManager.getConnection("jdbc:phoenix:localhost:2181/hbase-insecure");
        // (note: I have tried without /hbase-insecure; the result is the same)
        // query 1:
        connection.createStatement().executeUpdate("UPSERT INTO tableName VALUES('1','randomValue','randomValue',1234567890, 'randomValue', 'randomValue')");
        // query 2:
        connection.createStatement().executeUpdate("CREATE TABLE IF NOT EXISTS tableName (A VARCHAR(40), Z.B.type VARCHAR, Z.C VARCHAR, Z.D UNSIGNED_LONG, Z.E VARCHAR, X.F VARCHAR CONSTRAINT rowkey PRIMARY KEY (A))");
        // query 3:
        connection.commit();
    }
}
error: Exception in thread "streaming-job-executor-0" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.KeyValueUtil.length(Lorg/apache/hadoop/hbase/Cell;)I
    at org.apache.phoenix.util.PhoenixKeyValueUtil.calculateMutationDiskSize(PhoenixKeyValueUtil.java:182)
    at org.apache.phoenix.execute.MutationState.calculateMutationSize(MutationState.java:800)
    at org.apache.phoenix.execute.MutationState.send(MutationState.java:971)
    at org.apache.phoenix.execute.MutationState.send(MutationState.java:1344)
    at org.apache.phoenix.execute.MutationState.commit(MutationState.java:1167)
    at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:670)
    at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:666)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:666)
    at com.kratinmobile.uep.services.SparkStream.lambda$null$0(SparkStream.java:119)
    at java.lang.Iterable.forEach(Iterable.java:75)
    at com.kratinmobile.uep.services.SparkStream.lambda$startStreaming$10899135$1(SparkStream.java:102)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Looking at the stack trace, this most likely is a classpath or version mismatch.
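A NoSuchMethodError on org.apache.hadoop.hbase.KeyValueUtil typically means the hbase-client jar on the runtime classpath is a different version than the one your phoenix-client was compiled against (mvn dependency:tree can reveal which version wins). A sketch of pinning it in Maven, assuming a Maven build; ${hbase.version} is a placeholder for the HBase version your Phoenix artifact targets:

```xml
<dependencyManagement>
  <dependencies>
    <!-- Version is a placeholder: use the HBase version your Phoenix build was compiled against. -->
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-client</artifactId>
      <version>${hbase.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```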

Spocking the JDBC

I have some Groovy 2.4.x code that uses some JDBC:
class WidgetPersistor {
    @Inject // Gets injected correctly by Guice, don't worry about it!
    DataSource dataSource

    Fizz getFizzByWidgetName(String name) {
        Connection conn
        PreparedStatement ps
        ResultSet rs
        try {
            // JDBC code here
        } catch(SQLException sqlExc) {
            if(conn) {
                try {
                    // NOTE: At the end of the day, I just want to verify
                    // that, given the 'name' arg to this method, the rollback
                    // doesn't fire!
                    conn.rollback()
                } catch(SQLException rollBackExc) {
                    throw rollBackExc
                }
            }
            throw sqlExc
        } finally {
            if(conn) {
                try {
                    rs?.close()
                    ps?.close()
                    conn.close()
                } catch(SQLException closingExc) {
                    throw closingExc
                }
            }
        }
    }
}
I am trying to write a Spock test that will execute this getFizzByWidgetName method and verify that the conn.rollback() method never executed (meaning we never tried to rollback).
Here's my best attempt:
def "getFizzByWidgetName succeeds without rollback"() {
    given: "data client with db connections"
    // Don't worry about how I get this for my test, but it's a legit JDBC connection
    DataSource ds = provideDataSource()
    Connection mockConn = Mock(Connection)
    PreparedStatement mockPS = Mock(PreparedStatement)
    ResultSet mockRS = Mock(ResultSet)
    mockPS.executeQuery() >> mockRS
    mockConn.prepareStatement(_) >> mockPS
    ds.connection >> mockConn // ??? It's like I want the DataSource half-mocked...
    WidgetPersistor client = new WidgetPersistor(ds)

    when: "we try to query something"
    client.getFizzByWidgetName('fizzbuzz')

    then: "we don't get any errors"
    0 * mockConn.rollback()
}
Any ideas where I'm going awry?
If you are using a DataSource from a real database, and your code under test is written in Groovy (which looks like the case), you can use the metaclass to test this kind of thing:
DataSource ds = provideDataSource()
def connection = ds.connection
connection.metaClass.rollback = { throw new AssertionError("rollback called") }
ds.metaClass.connection = connection
But it's not really pretty. You should probably call your method without using mocks and test the state of the database (i.e., that data has been committed, not rolled back).
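If the half-mocked DataSource keeps fighting back, another option is to hand the persistor a stub DataSource whose Connection simply records rollback() calls. Here is the idea as a plain-Java sketch using JDK dynamic proxies (no mocking framework needed; the class and method names are illustrative):

```java
import java.lang.reflect.Proxy;
import java.sql.Connection;
import java.util.concurrent.atomic.AtomicInteger;
import javax.sql.DataSource;

public class RollbackSpy {
    /** Runs a happy-path "query" against stubbed JDBC objects and returns how often rollback() fired. */
    static int exercise() throws Exception {
        AtomicInteger rollbacks = new AtomicInteger();
        // A Connection stub that counts rollback() calls and no-ops everything else.
        Connection conn = (Connection) Proxy.newProxyInstance(
                Connection.class.getClassLoader(),
                new Class<?>[]{Connection.class},
                (proxy, method, args) -> {
                    if (method.getName().equals("rollback")) {
                        rollbacks.incrementAndGet();
                    }
                    // Sensible defaults for primitive return types, null otherwise.
                    Class<?> ret = method.getReturnType();
                    if (ret == boolean.class) return false;
                    if (ret == int.class) return 0;
                    return null;
                });
        // A DataSource stub that always hands back the spying connection.
        DataSource ds = (DataSource) Proxy.newProxyInstance(
                DataSource.class.getClassLoader(),
                new Class<?>[]{DataSource.class},
                (proxy, method, args) ->
                        "getConnection".equals(method.getName()) ? conn : null);

        // The code under test would live here: on the happy path, rollback is never reached.
        Connection c = ds.getConnection();
        c.close();
        return rollbacks.get();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("rollback calls: " + exercise());
    }
}
```

In Spock the same shape should be achievable by fully mocking the DataSource (Mock(DataSource) with ds.getConnection() >> mockConn), which avoids wanting it "half-mocked" in the first place.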

DataSource.groovy triggered multiple times in a Grails application (throws an exception when called the second time)

Workaround for creating the application:
I am creating a Grails application using a Postgres database.
The database needs to be created when the application is executed (i.e., from the project itself instead of creating the database manually).
For creating the database I am calling a Groovy service, CreateDatabaseService, in DataSource.groovy as follows:
import demo.grails.CreateDatabaseService
CreateDatabaseService.serviceMethod()
dataSource {
    pooled = true
    jmxExport = true
    driverClassName = "org.postgresql.Driver"
    dialect = "org.hibernate.dialect.PostgreSQLDialect"
    username = "postgres"
    password = "password"
}
hibernate {
    cache.use_second_level_cache = true
    cache.use_query_cache = false
    singleSession = true // configure OSIV singleSession mode
    flush.mode = 'manual'
}
// environment specific settings
environments {
    development {
        dataSource {
            dbCreate = "create-drop"
            url = "jdbc:postgresql://localhost:5432/SampleAppDb"
        }
    }
    test {
        dataSource {
            dbCreate = "update"
            url = "jdbc:postgresql://localhost:5432/SampleAppDb"
        }
    } .....
Full code for CreateDatabaseService :
package demo.grails

import java.sql.*

class CreateDatabaseService {
    public static flag = 0

    def static serviceMethod() {
        Connection conn = null
        Statement stmt = null
        try {
            if (flag == 0) {
                // STEP 2: Register JDBC driver
                Class.forName("org.postgresql.Driver")
                // STEP 3: Open a connection
                System.out.println("Connecting to database...")
                conn = DriverManager.getConnection("jdbc:postgresql://localhost:5432", "postgres", "password")
                // STEP 4: Execute a query
                System.out.println("Creating database...")
                stmt = conn.createStatement()
                String isDatabase = "select datname from pg_catalog.pg_database where lower(datname) = lower('SampleAppDb')"
                ResultSet rs = stmt.executeQuery(isDatabase)
                String idr
                while (rs.next()) {
                    // Retrieve by column name
                    idr = rs.getString("datname")
                }
                // Guard against idr being null when no row matched
                if (!"SampleAppDb".equalsIgnoreCase(idr)) {
                    String sql = "create database SampleAppDb"
                    stmt.executeUpdate(sql)
                    System.out.println("Database created successfully...")
                }
                flag = 1
            }
        } catch (Exception e) {
            e.printStackTrace()
        } finally {
            stmt?.close()
            conn?.close()
        }
    }
}
On running the Grails app, the following full stack trace shows that DataSource.groovy is called multiple times; the database does get created by the application, but when DataSource.groovy is evaluated the second time it complains that no suitable driver was found.
|Loading Grails 2.4.5
|Configuring classpath
.
|Environment set to development
.................................
|Packaging Grails application
................Connecting to database...
Creating database...
Database created successfully...
....................
|Running Grails application
Connecting to database...
Error | java.sql.SQLException: No suitable driver found for jdbc:postgresql://localhost:5432
Error |     at java.sql.DriverManager.getConnection(DriverManager.java:689)
Error |     at java.sql.DriverManager.getConnection(DriverManager.java:247)
Error |     at java_sql_DriverManager$getConnection.call(Unknown Source)
Error |     at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
Error |     at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
Error |     at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:124)
Error |     at demo.grails.CreateDatabaseService.serviceMethod(CreateDatabaseService.groovy:23)
Error |     at demo.grails.CreateDatabaseService$serviceMethod.call(Unknown Source)
Error | .......
Please suggest how to overcome this error.
BuildConfig.groovy
dependencies {
    // specify dependencies here under either 'build', 'compile', 'runtime', 'test' or 'provided' scopes e.g.
    // runtime 'mysql:mysql-connector-java:5.1.29'
    // runtime 'org.postgresql:postgresql:9.3-1101-jdbc41'
    runtime 'postgresql:postgresql:9.0-801.jdbc4'
    test "org.grails:grails-datastore-test-support:1.0.2-grails-2.4"
}
plugins {
    // plugins for the build system only
    build ":tomcat:7.0.55.2" // or ":tomcat:8.0.20"
    // plugins for the compile step
    compile ":scaffolding:2.1.2"
    compile ':cache:1.1.8'
    compile ":asset-pipeline:2.1.5"
    // plugins needed at runtime but not for compilation
    runtime ":hibernate4:4.3.8.1" // or ":hibernate:3.6.10.18"
    runtime ":database-migration:1.4.0"
    runtime ":jquery:1.11.1"
    // Uncomment these to enable additional asset-pipeline capabilities
    //compile ":sass-asset-pipeline:1.9.0"
    //compile ":less-asset-pipeline:1.10.0"
    //compile ":coffee-asset-pipeline:1.8.0"
    //compile ":handlebars-asset-pipeline:1.3.0.3"
}
}
Validate that you have the following configuration in your BuildConfig
dependencies {
    ...
    compile 'org.grails.plugins:hibernate:4.3.10.4'
    provided 'org.postgresql:postgresql:9.4-1203-jdbc4'
    ...
}
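The "No suitable driver" message comes from DriverManager itself whenever no registered driver accepts the URL, which is exactly what happens when DataSource.groovy is re-evaluated in a classloader that cannot see the Postgres jar. A JDK-only sketch of that failure mode (no database required; the URL is the one from the stack trace):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class NoSuitableDriver {
    /** Returns the SQLException message DriverManager raises when no registered driver accepts the URL. */
    static String probe(String url) {
        try {
            DriverManager.getConnection(url);
            return "connected (unexpected)";
        } catch (SQLException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        // No Postgres driver is registered in this JVM, so the lookup fails the same way
        // it does when DataSource.groovy runs in a classloader without the driver jar.
        System.out.println(probe("jdbc:postgresql://localhost:5432"));
    }
}
```

That is why putting the driver on the runtime classpath (the BuildConfig change above) matters: the driver class must be visible to whichever classloader is executing the getConnection call.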
