How to Select a BLOB column from database using iBatis - oracle

One of our table's columns is of the BLOB datatype (Oracle 10g). We have a simple select query executed via iBatis to select the BLOB column and display it using Struts2 & JSP.
The result tag in the iBatis XML file has the jdbctype set for the BLOB column:
<result property="uploadContent" column="uploadcontent" jdbctype="Blob"/>
Should we specify a typeHandler class for the BLOB column? Currently we are getting an error stating a column type mismatch.
Note: this column is selected and mapped into a Java bean that has an attribute of type java.sql.Blob.

I think you cannot use a native jdbcType for LOB types in Oracle with iBatis. The solution is to create a custom typeHandler to handle the LOB and then map it like this:
<result property="aClassStringProperty" column="aClobColumn" typeHandler="com.path.to.my.ClobTypeHandler"/>
More information on typeHandlerCallback here.
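For illustration, here is a minimal sketch of such a handler for a BLOB column using the iBATIS 2.x TypeHandlerCallback extension point, assuming the mapped bean property is a byte[] (the class name is my own choice, not something from the original question):

import java.sql.SQLException;

import com.ibatis.sqlmap.client.extensions.ParameterSetter;
import com.ibatis.sqlmap.client.extensions.ResultGetter;
import com.ibatis.sqlmap.client.extensions.TypeHandlerCallback;

public class BlobTypeHandlerCallback implements TypeHandlerCallback {

    // Write the byte[] property as the BLOB parameter
    public void setParameter(ParameterSetter setter, Object parameter) throws SQLException {
        setter.setBytes((byte[]) parameter);
    }

    // Read the BLOB column back into a byte[]
    public Object getResult(ResultGetter getter) throws SQLException {
        return getter.getBytes();
    }

    // Not meaningful for binary data; just return the raw string
    public Object valueOf(String s) {
        return s;
    }
}

The handler is then referenced from the result (or parameter) mapping via the typeHandler attribute, as in the snippet above.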

It is not necessary to create a typeHandler. For Oracle, the jdbcType is BLOB:
<result property="bytes" column="COLUMNBLOB" jdbcType="BLOB" />
This assumes "bytes" is a byte[] property.
The important thing: in the SQL statement, you must set the jdbcType this way:
INSERT INTO X (COLUMNBLOB) VALUES #bytes:BLOB#
I noticed that this jdbcType is different for PostgreSQL. You must set:
<result property="bytes" column="COLUMNBLOB" jdbcType="BINARY" />

I found somebody who deals with this here.
For a CLOB :
<result property="uploadContent" column="obfile" jdbctype="String" />
For a BLOB :
<result property="uploadContent" column="obfile" jdbctype="byte[]" />
I am still looking for a way to make it work with C#!

I didn't have problems with INSERTs; my problems were when I did a SELECT of the BLOB type. I am using Oracle 9i and this is how I did it:
Add the Oracle JDBC driver to your project; you will need the MyBatis dependencies too. If you are using Maven:
<dependency>
    <groupId>com.oracle</groupId>
    <artifactId>ojdbc14</artifactId>
    <version>10.2.0.3.0</version>
</dependency>
<dependency>
    <groupId>org.mybatis</groupId>
    <artifactId>mybatis-spring</artifactId>
    <version>1.2.1</version>
</dependency>
<dependency>
    <groupId>org.mybatis</groupId>
    <artifactId>mybatis</artifactId>
    <version>3.2.3</version>
</dependency>
Add a custom BaseTypeHandler for reading byte[] from the Oracle BLOB class:
import java.io.OutputStream;
import java.sql.Blob;
import java.sql.CallableStatement;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import oracle.sql.BLOB;

import org.apache.ibatis.type.BaseTypeHandler;
import org.apache.ibatis.type.JdbcType;
import org.apache.ibatis.type.MappedTypes;

@MappedTypes(byte[].class)
public class OracleBlobTypeHandler extends BaseTypeHandler<byte[]> {

    @Override
    public void setNonNullParameter(PreparedStatement preparedStatement, int i, byte[] bytes, JdbcType jdbcType) throws SQLException {
        // see setBlobAsBytes method from https://jira.spring.io/secure/attachment/11851/OracleLobHandler.java
        try {
            if (bytes != null) {
                // prepareLob
                BLOB blob = BLOB.createTemporary(preparedStatement.getConnection(), true, BLOB.DURATION_SESSION);
                // callback.populateLob
                OutputStream os = blob.getBinaryOutputStream();
                try {
                    os.write(bytes);
                } catch (Exception e) {
                    throw new SQLException(e);
                } finally {
                    try {
                        os.close();
                    } catch (Exception e) {
                        e.printStackTrace(); // ignore
                    }
                }
                preparedStatement.setBlob(i, blob);
            } else {
                preparedStatement.setBlob(i, (Blob) null);
            }
        } catch (Exception e) {
            throw new SQLException(e);
        }
    }

    /** see getBlobAsBytes method from https://jira.spring.io/secure/attachment/11851/OracleLobHandler.java */
    private byte[] getBlobAsBytes(BLOB blob) throws SQLException {
        // initializeResourcesBeforeRead
        if (!blob.isTemporary()) {
            blob.open(BLOB.MODE_READONLY);
        }
        // read
        byte[] bytes = blob.getBytes(1L, (int) blob.length());
        // releaseResourcesAfterRead
        if (blob.isTemporary()) {
            blob.freeTemporary();
        } else if (blob.isOpen()) {
            blob.close();
        }
        return bytes;
    }

    @Override
    public byte[] getNullableResult(ResultSet resultSet, String columnName) throws SQLException {
        try {
            // use the Oracle-specific oracle.sql.BLOB
            BLOB blob = (BLOB) resultSet.getBlob(columnName);
            return getBlobAsBytes(blob);
        } catch (Exception e) {
            throw new SQLException(e);
        }
    }

    @Override
    public byte[] getNullableResult(ResultSet resultSet, int i) throws SQLException {
        try {
            // use the Oracle-specific oracle.sql.BLOB
            BLOB blob = (BLOB) resultSet.getBlob(i);
            return getBlobAsBytes(blob);
        } catch (Exception e) {
            throw new SQLException(e);
        }
    }

    @Override
    public byte[] getNullableResult(CallableStatement callableStatement, int i) throws SQLException {
        try {
            // use the Oracle-specific oracle.sql.BLOB
            BLOB blob = (BLOB) callableStatement.getBlob(i);
            return getBlobAsBytes(blob);
        } catch (Exception e) {
            throw new SQLException(e);
        }
    }
}
Add the type handlers package to the MyBatis configuration. As you can see, I am using mybatis-spring:
<bean id="sqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
    <property name="dataSource" ref="dataSource" />
    <property name="typeHandlersPackage" value="package.where.customhandler.is" />
</bean>
And then you can read byte[] from Oracle BLOBs with MyBatis:
public class Bean {
    private byte[] file;
    // getters and setters
}

public interface Dao {
    @Select("select file from some_table where id=#{id}")
    Bean getBean(@Param("id") String id);
}
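A minimal usage sketch under the same setup (the id value is illustrative, sqlSessionFactory stands for the factory configured above, and a getter for the file field is assumed on Bean):

// Sketch: obtain the mapper from a SqlSession and read the BLOB content as byte[]
try (SqlSession session = sqlSessionFactory.openSession()) {
    Dao dao = session.getMapper(Dao.class);
    Bean bean = dao.getBean("42");      // "42" is an illustrative id
    byte[] content = bean.getFile();    // assumes a getter for the "file" field
}

In a Spring application you would normally inject the mapper instead of opening the session by hand.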
I hope this will help.
This is an adaptation of this excellent answer: https://stackoverflow.com/a/27522590/2692914.

Related

Writing protobuf object in parquet using apache beam

I fetch protobuf data from Google Pub/Sub and deserialize it into a Message object, so I get a PCollection<Message>. Here is sample code:
public class ProcessPubsubMessage extends DoFn<PubsubMessage, Message> {
    @ProcessElement
    public void processElement(@Element PubsubMessage element, OutputReceiver<Message> receiver) {
        byte[] payload = element.getPayload();
        try {
            Message message = Message.parseFrom(payload);
            receiver.output(message);
        } catch (InvalidProtocolBufferException e) {
            LOG.error("Got exception while parsing message from pubsub. Exception =>" + e.getMessage());
        }
    }
}
PCollection<Message> event = psMessage.apply("Parsing data from pubsub message",
        ParDo.of(new ProcessPubsubMessage()));
I want to apply a transformation on the PCollection<Message> event to write it in Parquet format. I know Apache Beam provides ParquetIO, but it works with PCollection<GenericRecord>, so a conversion from Message to GenericRecord may solve the problem (yet I don't know how to do that). Is there an easy way to write in Parquet format?
It can be solved by using the following library:
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-protobuf</artifactId>
    <version>1.7.7</version>
</dependency>
private GenericRecord getGenericRecord(Event event) throws IOException {
    ProtobufDatumWriter<Event> datumWriter = new ProtobufDatumWriter<Event>(Event.class);
    ByteArrayOutputStream os = new ByteArrayOutputStream();
    Encoder e = EncoderFactory.get().binaryEncoder(os, null);
    datumWriter.write(event, e);
    e.flush();
    ProtobufDatumReader<Event> datumReader = new ProtobufDatumReader<Event>(Event.class);
    GenericDatumReader<GenericRecord> genericDatumReader = new GenericDatumReader<GenericRecord>(datumReader.getSchema());
    GenericRecord record = genericDatumReader.read(null, DecoderFactory.get().binaryDecoder(new ByteArrayInputStream(os.toByteArray()), null));
    return record;
}
For details: https://gist.github.com/alexvictoor/1d3937f502c60318071f
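Once you have GenericRecords, the rest of the pipeline could look roughly like the sketch below (this part is my assumption, not from the original answer: it uses ParquetIO.sink from beam-sdks-java-io-parquet, takes the Avro schema from ProtobufDatumReader, assumes getGenericRecord is static so the DoFn can call it, and writes to a hypothetical output directory):

// Sketch only: convert each element to a GenericRecord and write Parquet files
Schema schema = new ProtobufDatumReader<Event>(Event.class).getSchema();

PCollection<GenericRecord> records = events
        .apply("ToGenericRecord", ParDo.of(new DoFn<Event, GenericRecord>() {
            @ProcessElement
            public void processElement(@Element Event element, OutputReceiver<GenericRecord> out) throws IOException {
                out.output(getGenericRecord(element));
            }
        }))
        .setCoder(AvroCoder.of(GenericRecord.class, schema));

records.apply("WriteParquet", FileIO.<GenericRecord>write()
        .via(ParquetIO.sink(schema))
        .to("/path/to/output")          // hypothetical output directory
        .withSuffix(".parquet"));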

Instrumentation: Casting org.apache.tomcat.dbcp.dbcp.PoolingDataSource$PoolGuardConnectionWrapper to oracle.jdbc.OracleConnection

I'm trying to instrument my JDBC connections. I know there are several similar questions about this topic.
I tried everything but couldn't find the proper way to solve my issue so far.
I also tried the answers to these questions, with no result:
Apache Commons DBCP connection object problem, Thread: ClassCastException in org.apache.tomcat.dbcp.dbcp.PoolingDataSource$PoolGuardConnectionWrapper
I'm working with Tomcat 7 and Java 7. Here's where I define the Oracle connection pool in my context.xml:
<Resource name="jdbc/myDS"
          type="javax.sql.DataSource"
          auth="Container"
          maxActive="350"
          maxIdle="50"
          minIdle="10"
          maxWait="10000"
          username="user_own"
          password="mypassw"
          accessToUnderlyingConnectionAllowed="true"
          driverClassName="oracle.jdbc.driver.OracleDriver"
          url="jdbc:oracle:thin:@192.168.110.173:1521/orcl" />
My instrumentation code:
private static void initInstrumentation(Connection con, final String usuario, final String modulo, final String accion) throws Exception {
    if (Utils.getParameter("instrumentation.active").equals("1")) {
        try {
            OracleConnection oracleConnection = null;
            //This is where I try to get the oracle connection, but no succeed
            if (con != null) {
                if (con instanceof OracleConnection) { //NEVER COME IN HERE
                    oracleConnection = (OracleConnection) con;
                } else if (con.isWrapperFor(OracleConnection.class)) { //NEVER COME IN HERE
                    oracleConnection = con.unwrap(OracleConnection.class);
                } else {
                    //NO ORACLECONNECTION NO isWrapperFor -> ALWAYS ENDS HERE!!!
                    //oracleConnection = (OracleConnection) ((DelegatingConnection) con).getDelegate();
                    oracleConnection = (OracleConnection) new DelegatingConnection(con).getInnermostDelegate();
                }
            }
            if (oracleConnection != null) {
                String[] metrics = new String[OracleConnection.END_TO_END_STATE_INDEX_MAX];
                metrics[OracleConnection.END_TO_END_MODULE_INDEX] = modulo;
                metrics[OracleConnection.END_TO_END_ACTION_INDEX] = "Inicio: " + accion;
                metrics[OracleConnection.END_TO_END_CLIENTID_INDEX] = usuario;
                oracleConnection.setEndToEndMetrics(metrics, (short) 0);
            }
        } catch (Exception e) {
            throw new Exception("Error initInstrumentation " + e);
        }
    }
}
My open connection method:
private static Connection genericOpenConnection() throws Exception {
    Connection con = null;
    try {
        DataSource dataSource = (DataSource) new InitialContext().lookup(Utils.cStrPlx(Utils.getParameter("dataSourceJndiName")));
        con = dataSource.getConnection();
        con.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED);
    } catch (Exception e) {
        Error.escribeLog("Error : " + e.getMessage());
        throw new SQLException(e.getMessage());
    }
    return con;
}
So after calling openConnection and initInstrumentation I get the following exception when trying to cast the connection to an Oracle connection. Any ideas on how to do this? What am I getting wrong?
Thanks in advance.
java.lang.ClassCastException: org.apache.tomcat.dbcp.dbcp.PoolingDataSource$PoolGuardConnectionWrapper cannot be cast to oracle.jdbc.OracleConnection
I found my problem. I hope this can help anyone with the same issue.
The problem seems to be related to a conflict between ojdbc driver libraries.
I have one driver in my Tomcat installation, and another one declared in pom.xml via Maven.
<!-- Driver oracle -->
<dependency>
    <groupId>com.plexus</groupId>
    <artifactId>ojdbc6</artifactId>
    <version>11.2.0</version>
    <scope>provided</scope>
</dependency>
Declaring this driver as provided fixed my problem, and the connection is now being retrieved as described below:
if (con.isWrapperFor(OracleConnection.class)) {
    oracleConnection = con.unwrap(OracleConnection.class);
}

set batch size in spring JDBC batch update

How do I set batch size in spring JDBC batch update to improve performance?
Listed below is my code snippet.
public void insertListOfPojos(final List<Student> myPojoList) {
    String sql = "INSERT INTO " + "Student " + "(age,name) " + "VALUES "
            + "(?,?)";
    try {
        jdbcTemplateObject.batchUpdate(sql,
                new BatchPreparedStatementSetter() {
                    @Override
                    public void setValues(PreparedStatement ps, int i)
                            throws SQLException {
                        Student myPojo = myPojoList.get(i);
                        ps.setString(2, myPojo.getName());
                        ps.setInt(1, myPojo.getAge());
                    }

                    @Override
                    public int getBatchSize() {
                        return myPojoList.size();
                    }
                });
    } catch (Exception e) {
        System.out.println("Exception");
    }
}
I read that with Hibernate you can provide your batch size in the configuration XML.
For example,
<property name="hibernate.jdbc.batch_size" value="100"/>.
Is there something similar in Spring's jdbc?
There is no JDBC option that looks like Hibernate's; I think you have to look at the specific RDBMS vendor's driver options when preparing the connection string.
Regarding your code, you have to use
BatchPreparedStatementSetter.getBatchSize()
or
JdbcTemplate.batchUpdate(String sql, final Collection<T> batchArgs, final int batchSize, final ParameterizedPreparedStatementSetter<T> pss)
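For example, a rough sketch of that second overload, which splits the inserts into fixed-size JDBC batches (the batch size of 100 mirrors the Hibernate setting and is only an illustrative value):

int batchSize = 100; // illustrative, plays the role of hibernate.jdbc.batch_size
String sql = "INSERT INTO Student (age, name) VALUES (?, ?)";

jdbcTemplateObject.batchUpdate(sql, myPojoList, batchSize,
        new ParameterizedPreparedStatementSetter<Student>() {
            @Override
            public void setValues(PreparedStatement ps, Student student) throws SQLException {
                ps.setInt(1, student.getAge());
                ps.setString(2, student.getName());
            }
        });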
If you use JDBC directly, you decide yourself how many statements are used in one commit, while with one of the provided JDBCWriters you control the batch* size via the configured commit-rate.
*AFAIK the current Spring version uses the prepared statement batch methods under the hood; see https://github.com/SpringSource/spring-framework/blob/master/spring-jdbc/src/main/java/org/springframework/jdbc/core/JdbcTemplate.java#L549

MyBatis select statement returns null values

I'm trying to run a simple MyBatis example, selecting all rows from the "trains" table.
The problem is that the query runs and returns a list with the correct number of elements, but they are all populated with null values.
The same query run directly with a JDBC PreparedStatement works fine.
Perhaps it's a configuration problem, but I cannot figure out what I'm doing wrong.
Here is the code. Thanks in advance.
Train.java
package org.example.mybatis.domain;

import java.io.Serializable;

public class Train implements Serializable {

    private int id;
    private String type;

    // getters and setters
}
TrainMapper.java
package org.example.mybatis.persistence;

import java.util.List;

import org.example.mybatis.domain.Train;

public interface TrainMapper {
    List<Train> getAllTrains();
}
TrainSelector.java
package org.example.mybatis.test;

import java.io.IOException;
import java.io.InputStream;
import java.util.List;

import org.apache.ibatis.io.Resources;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;
import org.apache.ibatis.session.SqlSessionFactoryBuilder;
import org.example.mybatis.domain.Train;
import org.example.mybatis.persistence.TrainMapper;

public class TrainSelector implements TrainMapper {

    private static String resource = "mybatis-config.xml";
    private static SqlSessionFactory factory = null;

    private SqlSessionFactory getSqlSessionFactory() {
        if (factory == null) {
            try {
                InputStream inputStream = Resources.getResourceAsStream(resource);
                factory = new SqlSessionFactoryBuilder().build(inputStream);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
        return factory;
    }

    @Override
    public List<Train> getAllTrains() {
        List<Train> trains = null;
        SqlSession session = getSqlSessionFactory().openSession();
        try {
            TrainMapper mapper = session.getMapper(TrainMapper.class);
            trains = mapper.getAllTrains();
        } finally {
            session.close();
        }
        return trains;
    }

    public static void main(String[] args) {
        List<Train> trains = null;
        TrainSelector trainSelector = new TrainSelector();
        trains = trainSelector.getAllTrains();
        System.out.println(trains);
    }
}
mybatis-config.xml
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE configuration
PUBLIC "-//mybatis.org//DTD Config 3.0//EN"
"http://mybatis.org/dtd/mybatis-3-config.dtd">
<configuration>
<properties resource="database.properties" />
<typeAliases>
<typeAlias alias="Train" type="org.example.mybatis.domain.Train" />
<!--package name="org.example.mybatis.domain" />-->
</typeAliases>
<environments default="development">
<environment id="development">
<transactionManager type="JDBC" />
<dataSource type="POOLED">
<property name="driver" value="${database.driver}" />
<property name="url" value="${database.url}" />
<property name="username" value="${database.username}" />
<property name="password" value="${database.password}" />
</dataSource>
</environment>
</environments>
<mappers>
<mapper resource="org/example/mybatis/persistence/TrainMapper.xml" />
</mappers>
</configuration>
TrainMapper.xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN"
"http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<mapper namespace="org.example.mybatis.persistence.TrainMapper">
<cache />
<select id="getAllTrains" parameterType="list" resultType="Train">
SELECT *
FROM trains
</select>
</mapper>
JdbcStatementExample.java
package org.example.mybatis.test;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class JdbcStatementExample {

    private static void selectAllTrains() throws SQLException {
        String sql = "SELECT * FROM trains";
        Connection conn = null;
        PreparedStatement ps = null;
        ResultSet rs = null;
        String url = "jdbc:mysql://localhost/testing";
        String user = "test";
        String password = "test";
        try {
            conn = DriverManager.getConnection(url, user, password);
            ps = conn.prepareStatement(sql);
            rs = ps.executeQuery();
            while (rs.next()) {
                String id = rs.getString("train_id");
                String type = rs.getString("train_type");
                System.out.println("id: " + id);
                System.out.println("type: " + type);
            }
        } catch (SQLException e) {
            throw new RuntimeException(e);
        } finally {
            if (ps != null) {
                ps.close();
            }
            if (conn != null) {
                conn.close();
            }
        }
    }

    public static void main(String[] args) {
        try {
            selectAllTrains();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
The names of the columns in the result set are different from the names of the properties in the Train object. You need an explicit result map to let MyBatis know which column is to be mapped to which property:
<resultMap id="trainMap" type="Train">
    <id property="id" column="train_id" javaType="java.lang.Integer" jdbcType="INTEGER"/>
    <result property="type" column="train_type" javaType="java.lang.String" jdbcType="VARCHAR"/>
</resultMap>
Then change your select element into:
<select id="getAllTrains" parameterType="list" resultType="trainMap">
    SELECT * FROM trains
</select>
Another option is to use column aliases.
The column names will be your database's, and the aliases will be set to match your Train object properties:
<select id="getAllTrains" parameterType="list" resultType="trainMap">
SELECT
train_id as id,
train_type as type
FROM trains
</select>
I had the same problem, but only for fields with multiple words. Of course, my naming convention in SQL was user_id and in Java it was userId. This piece of config inside my mybatis-config.xml file saved the day:
<settings>
    <setting name="mapUnderscoreToCamelCase" value="true"/>
</settings>
or for properties file:
mybatis.configuration.map-underscore-to-camel-case=true
credit: https://chois9105.github.io/spring/2017/12/31/configuring-mybatis-underscore-to-camel-case.html
Results can be mapped as described by Seeta or in the official docs here:
https://mybatis.org/mybatis-3/sqlmap-xml.html
In MyBatis 3.x the example above doesn't work, as you need to set resultMap rather than resultType. And you must not set both at the same time! A working example looks like:
<select id="getAllTrains" parameterType="list" resultMap="trainMap">
    SELECT * FROM trains
</select>
If you are using Spring Boot, you can set the map-underscore-to-camel-case property to true as shown below. Most of the time we use underscores (user_id) when creating the table attributes, but in Java we use camelCase (userId) for the variables; MyBatis doesn't know about that, so when it tries to do the mapping, it fails.
mybatis.configuration.map-underscore-to-camel-case=true

Spring translate java.sql.SQLException to DataAccessException

Hello.
Since it seems that I cannot use the Spring DataAccessException translation mechanism in my DAO, I would like to know if it is possible to translate the
Internal Exception: java.sql.SQLException: [BEA][Oracle JDBC Driver][Oracle]ORA-00001: unique constraint (JSP_OWN.IDX_MC_CC_RAPPORTI_02) violated
to the DataAccessException hierarchy manually.
Kind regards
Massimo
If you have a JdbcTemplate, you can do:
catch (SQLException e) {
    throw jdbcTemplate.getExceptionTranslator().translate("my task", null, e);
}
If you do not have a JdbcTemplate, just look at the source code of the JdbcTemplate.getExceptionTranslator() method:
public synchronized SQLExceptionTranslator getExceptionTranslator() {
    if (this.exceptionTranslator == null) {
        DataSource dataSource = getDataSource();
        if (dataSource != null) {
            this.exceptionTranslator = new SQLErrorCodeSQLExceptionTranslator(dataSource);
        }
        else {
            this.exceptionTranslator = new SQLStateSQLExceptionTranslator();
        }
    }
    return this.exceptionTranslator;
}
And mimic its behaviour :-)
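For example, a minimal sketch of doing the translation manually with SQLErrorCodeSQLExceptionTranslator (the task name and SQL text are illustrative; with the Oracle error codes, an ORA-00001 unique constraint violation typically comes back as a DuplicateKeyException):

SQLExceptionTranslator translator = new SQLErrorCodeSQLExceptionTranslator(dataSource);
try {
    // ... plain JDBC work that may violate the unique constraint ...
} catch (SQLException e) {
    // translate(...) returns a subclass of DataAccessException,
    // e.g. DuplicateKeyException for ORA-00001
    throw translator.translate("insert rapporto", "INSERT INTO ...", e);
}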
