Flyway migrations with PostgreSQL and PostGIS extension - Spring

I have 2 schemas in my db:
CREATE SCHEMA my_schema;
CREATE SCHEMA my_second_schema;
So I created the extension
CREATE EXTENSION postgis
VERSION "2.1.4";
and used it successfully with both schemas.
But Flyway 3.0 works only with the first schema; on my_second_schema it throws an error:
org.flywaydb.core.internal.dbsupport.FlywaySqlScriptException: Error executing statement at line 803: CREATE TABLE places (
id bigint DEFAULT nextval('places_sequence'::regclass) NOT NULL,
geo_location geometry,
created_at timestamp without time zone,
updated_at timestamp without time zone,
version bigint,
state boolean
)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1554)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:228)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:292)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:975)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:752)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:482)
at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:125)
at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:60)
at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.delegateLoading(AbstractDelegatingSmartContextLoader.java:102)
at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.loadContext(AbstractDelegatingSmartContextLoader.java:248)
at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContextInternal(CacheAwareContextLoaderDelegate.java:64)
at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContext(CacheAwareContextLoaderDelegate.java:91)
... 23 more
Caused by: org.flywaydb.core.internal.dbsupport.FlywaySqlScriptException: Error executing statement at line 803: CREATE TABLE places (
id bigint DEFAULT nextval('places_sequence'::regclass) NOT NULL,
geo_location geometry,
created_at timestamp without time zone,
updated_at timestamp without time zone,
version bigint,
state boolean
)
at org.flywaydb.core.internal.dbsupport.SqlScript.execute(SqlScript.java:91)
at org.flywaydb.core.internal.resolver.sql.SqlMigrationExecutor.execute(SqlMigrationExecutor.java:73)
at org.flywaydb.core.internal.command.DbMigrate$5.doInTransaction(DbMigrate.java:287)
at org.flywaydb.core.internal.command.DbMigrate$5.doInTransaction(DbMigrate.java:285)
at org.flywaydb.core.internal.util.jdbc.TransactionTemplate.execute(TransactionTemplate.java:72)
at org.flywaydb.core.internal.command.DbMigrate.applyMigration(DbMigrate.java:285)
at org.flywaydb.core.internal.command.DbMigrate.access$800(DbMigrate.java:46)
at org.flywaydb.core.internal.command.DbMigrate$2.doInTransaction(DbMigrate.java:207)
at org.flywaydb.core.internal.command.DbMigrate$2.doInTransaction(DbMigrate.java:156)
at org.flywaydb.core.internal.util.jdbc.TransactionTemplate.execute(TransactionTemplate.java:72)
at org.flywaydb.core.internal.command.DbMigrate.migrate(DbMigrate.java:156)
at org.flywaydb.core.Flyway$1.execute(Flyway.java:864)
at org.flywaydb.core.Flyway$1.execute(Flyway.java:811)
at org.flywaydb.core.Flyway.execute(Flyway.java:1171)
at org.flywaydb.core.Flyway.migrate(Flyway.java:811)
at co.brandly.configuration.FlywayMigration.init(FlywayMigration.java:17)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1682)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1621)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1550)
... 40 more
Caused by: org.postgresql.util.PSQLException: ERROR: type "geometry" does not exist
Position: 276
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2102)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1835)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:257)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:500)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:374)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:366)
at org.flywaydb.core.internal.dbsupport.JdbcTemplate.executeStatement(JdbcTemplate.java:235)
at org.flywaydb.core.internal.dbsupport.SqlScript.execute(SqlScript.java:89)
... 62 more
So why is there an 'ERROR: type "geometry" does not exist'?
My Spring application context:
<bean id="flyway" class="org.flywaydb.core.Flyway">
<property name="dataSource" ref="dataSource"/>
<property name="schemas" value="my_second_schema, my_schema"/>
<property name="validateOnMigrate" value="false"/>
<property name="outOfOrder" value="true"/>
<property name="placeholderPrefix" value="$flyway{"/>
<property name="placeholderSuffix" value="}"/>
<property name="placeholders">
<map>
<entry key="schema" value="${flyway.placeholders.schema}"/>
<entry key="schema_analytics" value="${flyway.placeholders.schema_analytics}"/>
</map>
</property>
</bean>

The problem turned out to be with the user's privileges.
ALTER USER myuser WITH SUPERUSER;
This helped.
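If granting SUPERUSER is not an option, another possible cause worth checking (an assumption, not confirmed by the question) is that the geometry type lives in the schema where the extension was created (typically public), which may not be on the search_path used while migrating my_second_schema. Two sketches, assuming PostGIS was installed into public; mydb is a placeholder database name:

```sql
-- Option 1: schema-qualify the PostGIS type in the migration script
CREATE TABLE places (
    id bigint DEFAULT nextval('places_sequence'::regclass) NOT NULL,
    geo_location public.geometry
);

-- Option 2: keep the extension's schema visible on every new connection
ALTER DATABASE mydb SET search_path = my_schema, my_second_schema, public;
```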

Maybe your problem is that you didn't add all the extensions in your schema:
-- Enable PostGIS (includes raster)
CREATE EXTENSION postgis;
-- Enable Topology
CREATE EXTENSION postgis_topology;
-- fuzzy matching needed for Tiger
CREATE EXTENSION fuzzystrmatch;
-- Enable US Tiger Geocoder
CREATE EXTENSION postgis_tiger_geocoder;
I hope it helps you.

As of Flyway 3.1 there is a baseline target; executing it will treat your database as an existing database, with the PostGIS functionality already imported into your schema when the database was created from the postgis_template.
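For the Spring bean definition shown above, the baseline behaviour can be switched on via a bean property; a minimal sketch, assuming the same dataSource bean as in the question (property name as of Flyway 3.1):

```xml
<bean id="flyway" class="org.flywaydb.core.Flyway">
    <property name="dataSource" ref="dataSource"/>
    <!-- treat a pre-existing database (e.g. created from postgis_template) as already baselined -->
    <property name="baselineOnMigrate" value="true"/>
</bean>
```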


How to execute JSONPath or XPATH expression on the Message Context property in WSO2 ESB

I'm trying to set a JSON object to a property using the Property mediator and then access its values using JSONPath or XPath.
In this case, I first set the JSON object to an OM-typed property. There, using the $ctx:parent/child pattern, I could access the values. But I can't execute an XPath expression over it (e.g. $ctx:metadataOM/suppliers[0]).
I tried various scenarios and noted that this can be done by setting the JSON object as the payload and doing JSONPath / XPath operations on it.
Is there any way to do this?
Is there any way to access the Message Context properties from json-eval?
Is there any way to execute JSONPath / XPath on the get-property method or $ctx: expressions?
Note: I'm looking for an answer that doesn't use the Script mediator or the Class mediator.
Payload :
{
"metadata":{
"language":"en",
"customerCountry":"GB",
"client":"coolpal",
"suppliers" : ["supplier-a","supplier-b"],
"currency":"USD"
}
}
API.xml
<inSequence>
<property expression="//jsonObject/metadata" name="metadataOM" scope="default" type="OM"/>
<property expression="//jsonObject/metadata" name="metadataSTR" scope="default" type="STRING"/>
<log level="custom">
<property expression="$ctx:metadataOM/suppliers" name="metadataOM-suppliers"/>
<property expression="$ctx:metadataOM/suppliers[0]" name="metadataOM-suppliers"/>
<property expression="json-eval($.metadataOM.suppliers)" name="eval-metadata-suppliers"/>
<property expression="json-eval($.metadataSTR.suppliers)" name="eval-metadata-suppliers"/>
<property expression="json-eval($ctx:metadataOM)" name="eval-metadata-suppliers"/>
</log>
<respond/>
</inSequence>
WSO2ESB version : 5.0.0
With Martin's help (thanks Martin!) and some self-study, I found an answer.
One reason was that when the JSON is converted to XML, it looks like this:
<metadata>
<language>en</language>
<customerCountry>GB</customerCountry>
<client>coolpal</client>
<suppliers>supplier-a</suppliers>
<suppliers>supplier-b</suppliers>
<currency>USD</currency>
</metadata>
Then I stored and loaded the first supplier as shown below. The second reason was that I forgot suppliers are not zero-indexed. :)
<property expression="/" name="metadataOM" scope="default" type="OM"/>
<log level="custom">
<property expression="$ctx:metadataOM" name="metadataOM"/>
<property expression="$ctx:metadataOM//jsonObject/metadata/suppliers[1]" name="metadataOM-supplier-1"/>
</log>

Cannot find Region in cache GemFireCache

Caused by: org.springframework.beans.factory.BeanInitializationException: Cannot find region [record] in cache GemFireCache[id = 20255495;
isClosing = false; isShutDownAll = false;
closingGatewayHubsByShutdownAll = false; created = Mon Jan 23 11:45:10 EST 2017; server = false; copyOnRead = false; lockLease = 120; lockTimeout = 60]
at org.springframework.data.gemfire.RegionLookupFactoryBean.lookupFallback(RegionLookupFactoryBean.java:72)
at org.springframework.data.gemfire.RegionLookupFactoryBean.afterPropertiesSet(RegionLookupFactoryBean.java:59)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1541)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1479)
... 13 more
My XML File is :
<beans>
....
<context:component-scan base-package="spring.gemfire.repository.deptemp"/>
<gfe:client-cache id="gemfireCache" pool-name="gfPool"/>
<!--Region for being used by the Record Bean -->
<gfe:replicated-region id="record" cache-ref="gemfireCache"/>
<bean id="record" class="spring.gemfire.repository.deptemp.beans.Record"/>
<gfe:pool id="gfPool" max-connections="10" subscription-enabled="true" >
<gfe:locator host="localhost" port="10334"/>
</gfe:pool>
<gfe:lookup-region id="record" />
<gfe-data:repositories base-package="spring.gemfire.repository.deptemp.repos"/>
</beans>
Abhisekh-
Why do you have both this...
<gfe:replicated-region id="record" cache-ref="gemfireCache"/>
And this...
<gfe:lookup-region id="record" />
Also, you have defined this...
<bean id="record" class="spring.gemfire.repository.deptemp.beans.Record"/>
which (most definitely) overrode your REPLICATE Region bean definition (also with id="record"), based on the "order" of your bean definitions in the XML above.
While Spring first and foremost adheres to dependency order between bean definitions, it generally follows the declared order when no dependencies (explicit or implicit) exist.
Since <bean id="record" .../> comes after <gfe:replicated-region id="record" ../>, the <bean id="record" .../> overrides the <gfe:replicated-region id="record"/> bean definition.
Additionally, the <gfe:lookup-region> is not needed since you are not using GemFire/Geode's native configuration (e.g. cache.xml) or Cluster Configuration Service.
Furthermore, you are declaring a ClientCache, so you technically probably want a <gfe:client-region> to match the GemFire/Geode server Region, yes?!
While you can create REPLICATE Regions (and even PARTITION Regions) on a client, you typically do not do this since those Regions are NOT part of any distributed system, or cluster of GemFire "Server" nodes.
A client Region (which can be a PROXY, or even a CACHING_PROXY) will distribute data operations to the Server. Additionally, if you have data that only a particular client needs, then you ought to create local Regions, using <gfe:local-region>.
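Putting that advice together, a minimal client-side sketch (bean ids, package names, and the locator address are taken from the question; the PROXY shortcut is one reasonable choice):

```xml
<context:component-scan base-package="spring.gemfire.repository.deptemp"/>

<gfe:pool id="gfPool" max-connections="10" subscription-enabled="true">
    <gfe:locator host="localhost" port="10334"/>
</gfe:pool>

<gfe:client-cache id="gemfireCache" pool-name="gfPool"/>

<!-- a client PROXY Region forwarding all data operations to the server Region "record" -->
<gfe:client-region id="record" cache-ref="gemfireCache" shortcut="PROXY"/>

<gfe-data:repositories base-package="spring.gemfire.repository.deptemp.repos"/>
```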
I would definitely read this, first...
http://gemfire.docs.pivotal.io/geode/basic_config/book_intro.html
Followed by this next...
http://gemfire.docs.pivotal.io/geode/topologies_and_comm/book_intro.html
Then this...
http://gemfire.docs.pivotal.io/geode/developing/book_intro.html
And lastly...
http://docs.spring.io/spring-data-gemfire/docs/current/reference/html/#bootstrap
-John

Birt 4.6.0-20160607 throws ClassNotFoundException for OracleDriver

I have a Maven application in which I use BIRT 4.6. Below are my dependencies.
<dependency>
<groupId>org.eclipse.birt.ojdbc</groupId>
<artifactId>odajdbc</artifactId>
<version>4.6.0-201606072122</version>
</dependency>
<dependency>
<groupId>org.eclipse.birt.runtime</groupId>
<artifactId>org.eclipse.birt.runtime</artifactId>
<version>4.6.0-20160607</version>
<exclusions>
<exclusion>
<groupId>org.eclipse.birt.runtime</groupId>
<artifactId>org.apache.xerces</artifactId>
</exclusion>
<exclusion>
<artifactId>org.apache.poi</artifactId>
<groupId>org.eclipse.birt.runtime</groupId>
</exclusion>
</exclusions>
</dependency>
I am able to connect to the database and generate the reports. That is the good news.
Unfortunately, I noticed in my log file that an exception is thrown. The exception can be seen below:
2017-01-10 14:57:15,446 SEVERE [org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager] (default task-6) DriverClassLoader failed to load class: oracle.jdbc.driver.OracleDriver: java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver
at org.eclipse.birt.core.framework.URLClassLoader.findClass1(URLClassLoader.java:188)
at org.eclipse.birt.core.framework.URLClassLoader$1.run(URLClassLoader.java:156)
at org.eclipse.birt.core.framework.URLClassLoader$1.run(URLClassLoader.java:1)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.birt.core.framework.URLClassLoader.findClass(URLClassLoader.java:151)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.loadExtraDriver(JDBCDriverManager.java:1064)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.findDriver(JDBCDriverManager.java:859)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.loadAndRegisterDriver(JDBCDriverManager.java:986)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.loadAndRegisterDriver(JDBCDriverManager.java:958)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.doConnect(JDBCDriverManager.java:285)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.getConnection(JDBCDriverManager.java:236)
at org.eclipse.birt.report.data.oda.jdbc.Connection.connectByUrl(Connection.java:254)
at org.eclipse.birt.report.data.oda.jdbc.Connection.open(Connection.java:163)
at org.eclipse.datatools.connectivity.oda.consumer.helper.OdaConnection.open(OdaConnection.java:250)
at org.eclipse.birt.data.engine.odaconsumer.ConnectionManager.openConnection(ConnectionManager.java:165)
at org.eclipse.birt.data.engine.executor.DataSource.newConnection(DataSource.java:224)
at org.eclipse.birt.data.engine.executor.DataSource.open(DataSource.java:212)
at org.eclipse.birt.data.engine.impl.DataSourceRuntime.openOdiDataSource(DataSourceRuntime.java:217)
at org.eclipse.birt.data.engine.impl.QueryExecutor.openDataSource(QueryExecutor.java:437)
at org.eclipse.birt.data.engine.impl.QueryExecutor.prepareExecution(QueryExecutor.java:325)
at org.eclipse.birt.data.engine.impl.PreparedQuery.doPrepare(PreparedQuery.java:463)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.produceQueryResults(PreparedDataSourceQuery.java:190)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.execute(PreparedDataSourceQuery.java:178)
at org.eclipse.birt.data.engine.impl.PreparedOdaDSQuery.execute(PreparedOdaDSQuery.java:179)
at org.eclipse.birt.report.data.adapter.impl.DataRequestSessionImpl.execute(DataRequestSessionImpl.java:651)
at org.eclipse.birt.report.engine.data.dte.DteDataEngine.doExecuteQuery(DteDataEngine.java:152)
at org.eclipse.birt.report.engine.data.dte.AbstractDataEngine.execute(AbstractDataEngine.java:286)
at org.eclipse.birt.report.engine.executor.ExecutionContext.executeQuery(ExecutionContext.java:1947)
at org.eclipse.birt.report.engine.executor.QueryItemExecutor.executeQuery(QueryItemExecutor.java:80)
at org.eclipse.birt.report.engine.executor.TableItemExecutor.execute(TableItemExecutor.java:62)
at org.eclipse.birt.report.engine.internal.executor.dup.SuppressDuplicateItemExecutor.execute(SuppressDuplicateItemExecutor.java:43)
at org.eclipse.birt.report.engine.internal.executor.wrap.WrappedReportItemExecutor.execute(WrappedReportItemExecutor.java:46)
at org.eclipse.birt.report.engine.internal.executor.l18n.LocalizedReportItemExecutor.execute(LocalizedReportItemExecutor.java:34)
at org.eclipse.birt.report.engine.layout.html.HTMLBlockStackingLM.layoutNodes(HTMLBlockStackingLM.java:65)
at org.eclipse.birt.report.engine.layout.html.HTMLPageLM.layout(HTMLPageLM.java:92)
at org.eclipse.birt.report.engine.layout.html.HTMLReportLayoutEngine.layout(HTMLReportLayoutEngine.java:100)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.doRun(RunAndRenderTask.java:181)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.run(RunAndRenderTask.java:77)
For some reason the JDBCDriverManager struggles to find the correct driver, throws the exception, then finally finds the driver, connects to the database, and generates the report.
I debugged JDBCDriverManager; I hope the information below helps a bit.
1. The app goes through the doConnect() method of JDBCDriverManager. Inside it, getJndiDSConnection( driverClass, jndiNameUrl, connectionProperties ) returns null.
2. Then loadAndRegisterDriver( driverClass, driverClassPath ) is called with the arguments oracle.jdbc.driver.OracleDriver and null.
3. Inside loadAndRegisterDriver, findDriver( className, driverClassPath, refreshClassLoader ) is called with the arguments oracle.jdbc.driver.OracleDriver, null, false.
4. Next, driverClass = loadExtraDriver( className, true, refresh, driverClassPath ) is called with oracle.jdbc.driver.OracleDriver, true, false, null, which throws the ClassNotFoundException mentioned above.
5. Final step: still inside the findDriver method, driver = this.getDriverInstance( driverClass, refresh ) is called and finally returns oracle.jdbc.driver.OracleDriver.
After step 5 everything works fine. As I mentioned, the exception appears only once, and still the connection to the database is created and the reports are generated. After that, no matter how many times I create a report, the exception is never thrown again.
I would like to add some further info about the findDriver method. This method tries several ways to get the driver. First:
// Driver not in plugin class path; find it in drivers directory
driverClass = loadExtraDriver( className, true, refresh, driverClassPath );
which returns null, and then it tries to get the driver from the context class loader:
driverClass = Class.forName( className, true, Thread.currentThread( ).getContextClassLoader());
This time it finally manages to retrieve the driver.
What am I missing? It is clear that it cannot load the driver from the plugins, since I do not have any plugins directory. Is there a way to overcome this exception?
As Mark mentioned, there was no need to add org.eclipse.birt.ojdbc as a dependency. I stopped using org.eclipse.birt.report.data.oda.jdbc_4.6.0.v201606072122.jar and used my local ojdbc driver.
<dependency>
<groupId>com.oracle</groupId>
<artifactId>ojdbc6</artifactId>
<version>11.2.0.4.0</version>
</dependency>
The above fixes the exception we get on the first try to load the driver.
Adding ojdbc7.jar under the WEB-INF path of the BIRT Viewer folder (web/app server side) solved the problem for me:
[1] ../lib
[2] ../platform/plugins/org.eclipse.birt.report.data.oda.jdbc_<VERSION>/drivers
Logs
Before adding [2] above (was having only [1]):
20-Mar-2017 14:12:26.752 SEVERE [http-nio-8080-exec-4] org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.loadExtraDriver DriverClassLoader failed to load class: oracle.jdbc.driver.OracleDriver java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver
After adding [2] above (was having only [1]):
20-Mar-2017 14:49:42.196 INFO [http-nio-8080-exec-4] org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager$DriverClassLoader.refreshFileURL JDBCDriverManager: found JAR file ojdbc7.jar. URL=file:../WEB-INF/platform/plugins/org.eclipse.birt.report.data.oda.jdbc_4.6.0.v201606072122/drivers/ojdbc7.jar
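In other words, the driver jar ends up in both locations; sketched as shell commands, with <VERSION> left as a placeholder for whatever plugin version your BIRT install ships:

```shell
# from the deployed BIRT Viewer web app root
cp ojdbc7.jar WEB-INF/lib/
cp ojdbc7.jar "WEB-INF/platform/plugins/org.eclipse.birt.report.data.oda.jdbc_<VERSION>/drivers/"
```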

Spring Batch DB2 update DB2GSE.ST_POLYGON fails

I am trying to insert a polygon into a DB2 table hosted on z/OS.
This is my database item writer:
<bean id="databaseItemWriter"
class="org.springframework.batch.item.database.JdbcBatchItemWriter">
<property name="dataSource" ref="dataSource" />
<property name="sql">
<value>
<![CDATA[
INSERT INTO SAMPLE_GEOMETRIES
(GEO_NAME, GEOMETRY)
VALUES
( ?, DB2GSE.ST_POLYGON(?, 1))
]]>
</value>
</property>
<property name="itemPreparedStatementSetter">
<bean class="com.amex.elbs.DAO.GeometriesItemPreparedStatementSetter" />
</property>
</bean>
This is my custom prepared statement setter:
public class GeometriesItemPreparedStatementSetter implements ItemPreparedStatementSetter<Geometries> {
    @Override
    public void setValues(Geometries item, PreparedStatement ps) throws SQLException {
        ps.setString(1, item.Id);
        ps.setString(2, item.Polygon);
    }
}
This is my sample input file. It is pipe-delimited and contains the ID and the polygon coordinates.
pm251|'POLYGON((-159.335174733889 21.9483433404175,-159.327130348878 22.0446395507162,-159.295025589769 22.1248124949548,-159.343195828355 22.1970166285359,-159.391366885913 22.2291198667724,-159.576012589057 22.2131796383001,-159.712505933171 22.1490592515515,-159.800814224332 22.0366665967853,-159.736592652746 21.9644203111023,-159.640246973766 21.9483657695954,-159.576021285803 21.8841361312636,-159.439545188912 21.8680716835921,-159.335174733889 21.9483433404175))', 1
The statement below, when executed on z/OS, is successful.
INSERT INTO SAMPLE_GEOMETRIES
(GEO_NAME, GEOMETRY)
VALUES
( 'PM',
DB2GSE.ST_POLYGON('POLYGON((
-159.335174733889 21.9483433404175,
-159.327130348878 22.0446395507162,
-159.295025589769 22.1248124949548,
-159.343195828355 22.1970166285359,
-159.391366885913 22.2291198667724,
-159.576012589057 22.2131796383001,
-159.712505933171 22.1490592515515,
-159.800814224332 22.0366665967853,
-159.736592652746 21.9644203111023,
-159.640246973766 21.9483657695954,
-159.576021285803 21.8841361312636,
-159.439545188912 21.8680716835921,
-159.335174733889 21.9483433404175))',1))
---------+---------+---------+---------+---------
DSNE615I NUMBER OF ROWS AFFECTED IS 1
This is what I get when I execute it:
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-245, SQLSTATE=428F5, SQLERRMC=DB2GSE.ST_POLYGON, DRIVER=4.12.55
at com.ibm.db2.jcc.am.hd.a(hd.java:676)
at com.ibm.db2.jcc.am.hd.a(hd.java:60)
at com.ibm.db2.jcc.am.hd.a(hd.java:127)
at com.ibm.db2.jcc.am.mn.c(mn.java:2621)
at com.ibm.db2.jcc.am.mn.d(mn.java:2609)
at com.ibm.db2.jcc.am.mn.a(mn.java:2085)
at com.ibm.db2.jcc.am.nn.a(nn.java:7054)
at com.ibm.db2.jcc.am.mn.a(mn.java:2062)
at com.ibm.db2.jcc.t4.cb.g(cb.java:136)
at com.ibm.db2.jcc.t4.cb.a(cb.java:41)
at com.ibm.db2.jcc.t4.q.a(q.java:32)
at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
at com.ibm.db2.jcc.am.mn.ib(mn.java:2055)
at com.ibm.db2.jcc.am.nn.rc(nn.java:3219)
at com.ibm.db2.jcc.am.nn.s(nn.java:3370)
at com.ibm.db2.jcc.am.nn.l(nn.java:2499)
at com.ibm.db2.jcc.am.nn.addBatch(nn.java:2438)
at org.springframework.batch.item.database.JdbcBatchItemWriter$1.doInPreparedStatement(JdbcBatchItemWriter.java:190)
at org.springframework.batch.item.database.JdbcBatchItemWriter$1.doInPreparedStatement(JdbcBatchItemWriter.java:185)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:644)
... 28 more
The error message for SQLCODE -245 reads: "THE INVOCATION OF FUNCTION routine-name IS AMBIGUOUS".
Apparently, there are more than one version of DB2GSE.ST_POLYGON in the database, accepting different types of arguments. You are using an untyped parameter marker: DB2GSE.ST_POLYGON(?, 1), so DB2 is unable to determine which version of DB2GSE.ST_POLYGON you want.
Add an explicit cast to the function invocation using the appropriate data type, for example:
DB2GSE.ST_POLYGON(CAST( ? AS VARCHAR(1000)), 1)
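Applied to the item writer above, the statement would become something like this (the VARCHAR length is an assumption; size it to your longest polygon text):

```sql
INSERT INTO SAMPLE_GEOMETRIES
  (GEO_NAME, GEOMETRY)
VALUES
  ( ?, DB2GSE.ST_POLYGON(CAST(? AS VARCHAR(4000)), 1))
```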

Camel type converter fails: InvalidPayloadException: No body available of type

The application is based on OSGi.
I have a custom annotated converter:
package com.domain.bundle1.web.camel.converters;
import ...;
@Converter
public class FooTransferObjectConverter {
public FooTransferObjectConverter() {
}
@Converter
public static FooTransferObject toFooTransferObject(Foo foo, Exchange exchange) throws Exception {
// some magic
return fooTransferObject;
}
}
I also declared the package where it is placed in the TypeConverter file:
http://i.stack.imgur.com/U3QQH.png
which contains:
com.domain.bundle1.web.camel.converters
And the camel-context file contains the following:
<log loggingLevel="INFO" message="Converting to FooTransferObject" />
<convertBodyTo type="com.domain.bundle2.model.FooTransferObject" />
<log loggingLevel="INFO" message="Converted!" />
Before converting, the body of the message is a Foo object.
But when processing reaches the conversion, an exception is thrown:
Failed delivery for (MessageId: ID-EPUALVIW0567-55536-1401106375216-26-5 on ExchangeId: ID-EPUALVIW0567-55536-1401106375216-26-6).
Exhausted after delivery attempt: 1 caught: org.apache.camel.InvalidPayloadException: No body available of type: com.domain.bundle2.model.FooTransferObject but has value: Foo{97, Wall, null, null} of type: com.domain.bundle3.model.Foo on: Message: Foo{97, Wall, null, null}.
Caused by: Error during type conversion from type: com.domain.bundle3.model.Foo to the required type: com.domain.bundle2.model.FooTransferObject with value Foo{97, Wall, null, null} due 6 counts of IllegalAnnotationExceptions. Exchange[Message: Foo{97, Wall, null, null}]. Caused by: [org.apache.camel.TypeConversionException - Error during type conversion from type: Foo{97, Wall, null, null} to the required type: com.domain.bundle2.model.FooTransferObjec with value....
Then the exception was caught by a custom handler,
and then I found this:
Caused by: javax.xml.bind.MarshalException
- with linked exception:
[com.sun.istack.internal.SAXException2: A cycle is detected in the object graph. This will cause infinitely deep XML: freebaseball SpeedKick -> fr????f????tb??ll Sp????dK??ck -> free
football SpeedKick ]
at com.sun.xml.internal.bind.v2.runtime.MarshallerImpl.write(MarshallerImpl.java:311)[:1.7.0_40]
at com.sun.xml.internal.bind.v2.runtime.MarshallerImpl.marshal(MarshallerImpl.java:236)[:1.7.0_40]
at javax.xml.bind.helpers.AbstractMarshallerImpl.marshal(AbstractMarshallerImpl.java:95)
at org.apache.camel.converter.jaxb.FallbackTypeConverter.marshall(FallbackTypeConverter.java:238)
at org.apache.camel.converter.jaxb.FallbackTypeConverter.convertTo(FallbackTypeConverter.java:95)
... 163 more
Caused by: com.sun.istack.internal.SAXException2: A cycle is detected in the object graph. This will cause infinitely deep XML: freebaseball SpeedKick -> fr????f????tb??ll Sp????dK??c
k -> freebaseball SpeedKick
at com.sun.xml.internal.bind.v2.runtime.XMLSerializer.reportError(XMLSerializer.java:237)[:1.7.0_40]
What do you think the problem is? How can I see the loaded converters in the TypeConverterRegistry?
I have already solved my problem. The FallbackTypeConverter kicked in because Camel didn't load my custom regular type converter.
I checked the map of converters in the TypeConverterRegistry in debug mode and didn't find my FooTransferObjectConverter.
The problem was in the TypeConverter file. I just added the converter class name to the path, and after that it was loaded into the registry.
com.domain.bundle1.web.camel.converters.FooTransferObjectConverter
The Camel version in the application is 2.11.1. The Camel docs say:
In Camel 2.8 we improved the type converter loader to support
specifying the FQN class name of the converter classes. This has the
advantage of avoiding having to scan packages for @Converter classes.
Instead it loads the @Converter class directly. This is a highly
recommended approach to use going forward.
But I tried running the application from chapter 3 of the 'Camel in Action' book with a custom converter, and its TypeConverter file contained only the package path.
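For reference, the two supported styles of the META-INF/services/org/apache/camel/TypeConverter file are the package-scan style used in the book's example and the FQN style recommended from Camel 2.8 on:

```
# package-scan style: Camel scans the package for @Converter classes
com.domain.bundle1.web.camel.converters

# FQN style: Camel loads the listed @Converter class directly
com.domain.bundle1.web.camel.converters.FooTransferObjectConverter
```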
