In my Oracle database I have a column IS_ID of type NUMBER which I use for storing boolean values (0,1). The corresponding entity class declares the field:
@Column(name = "IS_ID")
private Integer isId;
Fetching the entity class results in an exception being thrown:
org.apache.openjpa.persistence.PersistenceException: java.lang.Boolean cannot be cast to java.lang.Integer
While debugging the code I found that the value is actually fetched from the database as a Boolean. Why is this happening? My container is ServiceMix 5.3.0 with OpenJPA 2.3.0.
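Not an explanation of the root cause, but if the column really does hold only 0/1, one hedged workaround sketch is to accept the boolean value the provider materializes instead of forcing an Integer:

// Workaround sketch only (assumes the flag is genuinely boolean-valued):
// declare the field as Boolean so no Boolean -> Integer cast is needed.
@Column(name = "IS_ID")
private Boolean isId;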
Related
I have a jsonb column in an entity annotated as shown in the sample code. Everything works fine without the @Audited annotation. Adding the @Audited annotation creates the org_master_aud table with the custom_fields column of type uuid instead of jsonb, and the insert fails.
@Entity
@TypeDef(name = "jsonb", typeClass = JsonBinaryType.class)
@Audited
public class OrgMaster {

    @Type(type = "jsonb")
    @Column(columnDefinition = "jsonb", name = "custom_fields", nullable = false)
    private JsonNode customFields;
}
org.springframework.orm.jpa.JpaSystemException: Unable to perform beforeTransactionCompletion callback: org.hibernate.exception.DataException: could not execute statement; nested exception is org.hibernate.HibernateException: Unable to perform beforeTransactionCompletion callback: org.hibernate.exception.DataException: could not execute statement
at org.springframework.orm.jpa.vendor.HibernateJpaDialect.convertHibernateAccessException(HibernateJpaDialect.java:353)
...
at org.hibernate.internal.SessionImpl.doFlush(SessionImpl.java:1352)
... 94 more
Caused by: org.postgresql.util.PSQLException: ERROR: invalid input syntax for type uuid: "{}"
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2578)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2313)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:331)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:448)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:369)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:159)
at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:125)
at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeUpdate(ProxyPreparedStatement.java:61)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeUpdate(HikariProxyPreparedStatement.java)
at org.hibernate.engine.jdbc.internal.ResultSetReturnImpl.executeUpdate(ResultSetReturnImpl.java:197)
... 105 more
Below are snapshots of the custom_fields column in the main and audit tables: the audit table column is uuid whereas the main table's is jsonb. Both are auto-generated.
This issue was resolved after updating the Hibernate version from 5.4.12.Final to 5.4.14.Final.
This was a bug introduced in versions newer than 5.4.10 and fixed in 5.4.14. Here is the link to the Jira issue:
https://hibernate.atlassian.net/browse/HHH-13886
I am trying to read data from an XMLTYPE column using Spring JDBC.
Declaration
@Autowired
OracleXmlHandler sqlXmlHandler;
Reading the object as a String:
sqlXmlHandler.getXmlAsString(resultSet, "xml_column")
I get the following exception:
Caused by: java.sql.SQLException: Inconsistent java and sql object types: SYS.XMLTYPE
at oracle.sql.OPAQUE.toClass(OPAQUE.java:395) ~[ojdbc6-11.2.0.4.jar:11.2.0.4.0]
at oracle.sql.OPAQUE.toJdbc(OPAQUE.java:333) ~[ojdbc6-11.2.0.4.jar:11.2.0.4.0]
at oracle.jdbc.driver.NamedTypeAccessor.getObject(NamedTypeAccessor.java:193) ~[ojdbc6-11.2.0.4.jar:11.2.0.4.0]
at oracle.jdbc.driver.NamedTypeAccessor.getObject(NamedTypeAccessor.java:123) ~[ojdbc6-11.2.0.4.jar:11.2.0.4.0]
at oracle.jdbc.driver.OracleResultSetImpl.getObject(OracleResultSetImpl.java:1108) ~[ojdbc6-11.2.0.4.jar:11.2.0.4.0]
at oracle.jdbc.driver.OracleResultSet.getObject(OracleResultSet.java:462) ~[ojdbc6-11.2.0.4.jar:11.2.0.4.0]
at org.jboss.jca.adapters.jdbc.WrappedResultSet.getObject(WrappedResultSet.java:1199) ~[?:?]
pom.xml has the required entries for xdb6, xmlparserv2, spring-data-oracle, and spring-oxm.
I tried other alternatives without Spring JDBC as well.
Any insights would be really helpful.
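For comparison, here is a minimal sketch that bypasses the handler and reads the column through the standard JDBC 4 SQLXML API; whether this works depends on the driver and XDB jars on the classpath, and the column name xml_column is simply taken from the snippet above:

// Hedged alternative sketch: read the XMLTYPE value via java.sql.SQLXML.
// Requires a driver/classpath combination that exposes XMLTYPE as SQLXML.
java.sql.SQLXML xml = resultSet.getSQLXML("xml_column");
String xmlAsString = (xml != null) ? xml.getString() : null;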
I'm writing a custom Hive SerDe to parse logs (the goal is to parse the user agent into a complex structure in a Hive table, but that doesn't appear in the code yet).
However, a ClassCastException appears when I try to put data into columns whose type is not STRING.
My version of Hive is 0.9.0.
Here is my custom SerDe:
@Override
public void initialize(Configuration conf, Properties tbl) throws SerDeException {
    String colNamesStr = tbl.getProperty(serdeConstants.LIST_COLUMNS);
    colNames = Arrays.asList(colNamesStr.split(","));
    String colTypesStr = tbl.getProperty(serdeConstants.LIST_COLUMN_TYPES);
    List<TypeInfo> colTypes = TypeInfoUtils.getTypeInfosFromTypeString(colTypesStr);
    rowTypeInfo = (StructTypeInfo) TypeInfoFactory.getStructTypeInfo(colNames, colTypes);
    rowOI = TypeInfoUtils.getStandardJavaObjectInspectorFromTypeInfo(rowTypeInfo);
}

@Override
public Object deserialize(Writable blob) throws SerDeException {
    row.clear();
    String[] line = blob.toString().split("\t");
    row.add(line[0]);
    row.add(Long.parseLong(line[1]));
    row.add(line[2]);
    return row;
}
Here is the table creation:
CREATE EXTERNAL TABLE logs (
token STRING,
tmstmp BIGINT,
user_agent STRING )
ROW FORMAT SERDE 'com.hive.serde.LogsSerDe'
LOCATION '/user/Input/logs';
And here is the error:
java.io.IOException: java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Long
at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:173)
at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1382)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:270)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:563)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Long
at org.apache.hadoop.hive.serde2.objectinspector.primitive.JavaLongObjectInspector.get(JavaLongObjectInspector.java:39)
at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitiveUTF8(LazyUtils.java:203)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serialize(LazySimpleSerDe.java:483)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serializeField(LazySimpleSerDe.java:436)
at org.apache.hadoop.hive.serde2.DelimitedJSONSerDe.serializeField(DelimitedJSONSerDe.java:69)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serialize(LazySimpleSerDe.java:420)
at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:163)
... 11 more
It seems like all the values returned by the deserialize function are treated as strings.
Thank you in advance for your help.
The tmstmp column in your DDL is a BIGINT. You're returning a Long whereas Hive is expecting a LongWritable. Try:
row.add(new LongWritable(Long.valueOf(line[1])));
Similarly, you may have to convert your Strings to Text using: new Text(javaStringObject);
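Putting both suggestions together, a sketch of how the question's deserialize method might look (assuming the rest of the SerDe stays as posted; if the standard Java object inspector is kept, it may also need to be swapped for the writable variant):

// Sketch based on the suggestion above: return Hadoop writables
// (Text / LongWritable) instead of raw Java String / Long values.
@Override
public Object deserialize(Writable blob) throws SerDeException {
    row.clear();
    String[] line = blob.toString().split("\t");
    row.add(new Text(line[0]));                          // token -> Text
    row.add(new LongWritable(Long.parseLong(line[1])));  // tmstmp -> LongWritable
    row.add(new Text(line[2]));                          // user_agent -> Text
    return row;
}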
I have a web application that uses Apache DBCP and Spring JDBC to perform database operations on an Oracle database. I need to write a performance logger that logs the individual times of each database operation. I tried writing an around advice on all 'execute' methods of org.springframework.jdbc.core.JdbcTemplate, but it results in an error when Spring is initialized. The logger class and the exception stack trace are as follows:
I also tried to use CGLIB proxies (by enabling proxy-target-class), but it errors out on DAO classes that extend Spring's StoredProcedure class and use constructor injection.
@Aspect
public class Logger {

    // LOGGER is the class's logger field (its declaration was omitted from the original snippet)
    @Around("this(org.springframework.jdbc.core.JdbcTemplate) && execution(* execute(*))")
    public Object invoke(ProceedingJoinPoint pjp) throws Throwable {
        long time = System.currentTimeMillis();
        Object result = pjp.proceed();
        LOGGER.debug("time consumed = " + (System.currentTimeMillis() - time));
        return result;
    }
}
Exception stacktrace:
SEVERE: Exception sending context initialized event to listener instance of class org.springframework.web.context.ContextLoaderListener
org.springframework.beans.factory.UnsatisfiedDependencyException:
Error creating bean with name 'myDao' defined in class path resource [spring/my-dao.xml]: Unsatisfied dependency expressed through constructor argument with index 0 of type [org.springframework.jdbc.core.JdbcTemplate]:
Could not convert constructor argument value of type [$Proxy7] to required type [org.springframework.jdbc.core.JdbcTemplate]:
Failed to convert value of type '$Proxy7 implementing org.springframework.jdbc.core.JdbcOperations,org.springframework.beans.factory.InitializingBean,org.springframework.aop.SpringProxy,org.springframework.aop.framework.Advised'
to required type 'org.springframework.jdbc.core.JdbcTemplate';
nested exception is
java.lang.IllegalStateException: Cannot convert value of type [$Proxy7 implementing org.springframework.jdbc.core.JdbcOperations,org.springframework.beans.factory.InitializingBean,org.springframework.aop.SpringProxy,org.springframework.aop.framework.Advised]
to required type [org.springframework.jdbc.core.JdbcTemplate]:
no matching editors or conversion strategy found
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:702)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:196)
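The stack trace itself shows the mismatch: the JDK proxy created for the advised template only implements JdbcOperations (plus a few framework interfaces), while the DAO constructor demands the concrete JdbcTemplate class. One direction to explore, purely as a hedged sketch (MyDao is illustrative, and this does not by itself address the StoredProcedure subclasses mentioned above), is to make the DAO depend on the interface so the proxy can be injected:

// Illustrative sketch only: depend on the JdbcOperations interface rather than the
// concrete JdbcTemplate class, so the JDK dynamic proxy of the template fits the constructor.
public class MyDao {

    private final org.springframework.jdbc.core.JdbcOperations jdbc;

    public MyDao(org.springframework.jdbc.core.JdbcOperations jdbc) {
        this.jdbc = jdbc;
    }
}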
Recently we upgraded from Hibernate 3.5 to 4.1.7 as well as Spring from 3.0.5 to 3.1.3. Hibernate is configured via JPA in Spring, so no changes were made there.
After the upgrade, most of the application works fine, but one function that uses a stored procedure is broken with the following exception:
java.lang.ClassCastException: $Proxy188 cannot be cast to oracle.jdbc.OracleConnection
at oracle.sql.TypeDescriptor.setPhysicalConnectionOf(TypeDescriptor.java:829)
at oracle.sql.TypeDescriptor.<init>(TypeDescriptor.java:583)
at oracle.sql.ArrayDescriptor.<init>(ArrayDescriptor.java:224)
at org.springframework.data.jdbc.support.oracle.SqlArrayValue.createTypeValue(SqlArrayValue.java:71)
at org.springframework.jdbc.core.support.AbstractSqlTypeValue.setTypeValue(AbstractSqlTypeValue.java:58)
at org.springframework.jdbc.core.StatementCreatorUtils.setValue(StatementCreatorUtils.java:281)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValueInternal(StatementCreatorUtils.java:217)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValue(StatementCreatorUtils.java:128)
at org.springframework.jdbc.core.CallableStatementCreatorFactory$CallableStatementCreatorImpl.createCallableStatement(CallableStatementCreatorFactory.java:212)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:1008)
at org.springframework.jdbc.core.JdbcTemplate.call(JdbcTemplate.java:1064)
at org.springframework.jdbc.object.StoredProcedure.execute(StoredProcedure.java:144)
In debug mode, I found that the AbstractSqlTypeValue.setTypeValue() method has the following implementation:
public final void setTypeValue(PreparedStatement ps, int paramIndex, int sqlType, String typeName)
        throws SQLException {
    Object value = createTypeValue(ps.getConnection(), sqlType, typeName);
    if (sqlType == TYPE_UNKNOWN) {
        ps.setObject(paramIndex, value);
    }
    else {
        ps.setObject(paramIndex, value, sqlType);
    }
}
The ps.getConnection() method here actually returns a new Hibernate 4 LogicalConnectionImpl which wraps the real OracleConnection, and that is why it throws the ClassCastException in the Oracle driver.
The reason it calls into SqlArrayValue is that the stored procedure takes a list of longs as an input parameter. When the input parameter is defined we use OracleTypes.ARRAY, and while binding the values we create a new SqlArrayValue object to wrap the Long[]. I tried to use the generic Types.ARRAY and the Long[] directly, but that didn't work either and failed with the following exception:
Caused by: java.sql.SQLException: Fail to convert to internal representation: [Ljava.lang.Long;@337f5afe
at oracle.sql.ARRAY.toARRAY(ARRAY.java:187)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectCritical(OraclePreparedStatement.java:8782)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:8278)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:8877)
at oracle.jdbc.driver.OracleCallableStatement.setObject(OracleCallableStatement.java:4992)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.setObject(OraclePreparedStatementWrapper.java:240)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at oracle.ucp.jdbc.proxy.StatementProxyFactory.invoke(StatementProxyFactory.java:230)
at oracle.ucp.jdbc.proxy.PreparedStatementProxyFactory.invoke(PreparedStatementProxyFactory.java:124)
at oracle.ucp.jdbc.proxy.CallableStatementProxyFactory.invoke(CallableStatementProxyFactory.java:101)
at $Proxy214.setObject(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.hibernate.engine.jdbc.internal.proxy.AbstractStatementProxyHandler.continueInvocation(AbstractStatementProxyHandler.java:122)
I don't understand why the JdbcTemplate somehow uses the Hibernate connection instead of the native OracleConnection; maybe there is some configuration somewhere that can fix it?
Found the root cause. The class that extends StoredProcedure didn't define the jdbcTemplate property, so the default one was used, which has no nativeJdbcExtractor defined. Adding the jdbcTemplate dependency, pointing it at the template configured with org.springframework.jdbc.support.nativejdbc.CommonsDbcpNativeJdbcExtractor as its nativeJdbcExtractor, resolved the issue. I guess Hibernate 3.5 with Spring 3.0 didn't have this issue because back then the returned JDBC connection was already the OracleConnection.
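For illustration, a minimal sketch of the wiring described above, written as plain Java (the original setup presumably used XML bean definitions; MyStoredProcedure and the procedure name are illustrative placeholders):

import javax.sql.DataSource;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.object.StoredProcedure;
import org.springframework.jdbc.support.nativejdbc.CommonsDbcpNativeJdbcExtractor;

// Sketch of the fix described above: give the StoredProcedure subclass a JdbcTemplate
// whose nativeJdbcExtractor can unwrap the pooled/proxied connection to the native OracleConnection.
public class MyStoredProcedure extends StoredProcedure {

    public MyStoredProcedure(DataSource dataSource) {
        JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
        jdbcTemplate.setNativeJdbcExtractor(new CommonsDbcpNativeJdbcExtractor());
        setJdbcTemplate(jdbcTemplate);
        setSql("my_stored_procedure"); // illustrative procedure name
    }
}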