mariadb jdbc driver blob update not supported - jdbc

After I replaced the MySQL JDBC driver 5.1 with the MariaDB JDBC driver 1.1.5 and tested the existing code base against MySQL Server 5.0 and MariaDB Server 5.2, everything works fine except a JDBC call that updates a BLOB field in a table.
The BLOB field contains an XML configuration file. The code reads the file out, parses it as XML, and inserts some values.
It then converts the result to a ByteArrayInputStream object and calls
statement.updateBinaryStream(columnLabel, theByteArrayInputStream, itsLength)
but an exception is thrown:
Perhaps you have some incorrect SQL syntax?
java.sql.SQLFeatureNotSupportedException: Updates are not supported
    at org.mariadb.jdbc.internal.SQLExceptionMapper.getFeatureNotSupportedException(SQLExceptionMapper.java:165)
    at org.mariadb.jdbc.MySQLResultSet.updateBinaryStream(MySQLResultSet.java:1642)
    at org.apache.commons.dbcp.DelegatingResultSet.updateBinaryStream(DelegatingResultSet.java:511)
I tried the updateBlob method; the same exception was thrown.
The code works fine with the MySQL JDBC driver 5.1.
Any suggestions on how to work around this?

See the ticket updating blob with updateBinaryStream, whose comments state that this isn't supported.
A workaround is to use two SQL statements: one to select the data and another to update it. Something like this:
final Statement select = connection.createStatement();
try {
    final PreparedStatement update = connection.prepareStatement(
            "UPDATE table SET blobColumn=? WHERE idColumn=?" );
    try {
        final ResultSet selectSet = select.executeQuery(
                "SELECT idColumn,blobColumn FROM table" );
        try {
            // read each row, transform the stream, and write the result back
            while( selectSet.next() ) {
                final int id = selectSet.getInt( "idColumn" );
                final InputStream stream =
                        workWithStreamAndReturnANew( selectSet.getBinaryStream( "blobColumn" ) );
                update.setBinaryStream( 1, stream );
                update.setInt( 2, id );
                update.execute();
            }
        }
        finally {
            selectSet.close();
        }
    }
    finally {
        update.close();
    }
}
finally {
    select.close();
}
But be aware that you need some way to uniquely identify a table entry; in this example the column idColumn is used for that purpose. Furthermore, if you stored an empty stream in the database, you might get an SQLException.
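For reference, on Java 7+ the same workaround reads more compactly with try-with-resources; this is just a sketch under the same assumptions (the hypothetical table, idColumn/blobColumn names, and workWithStreamAndReturnANew helper from above):
try (Statement select = connection.createStatement();
     PreparedStatement update = connection.prepareStatement(
             "UPDATE table SET blobColumn=? WHERE idColumn=?");
     ResultSet selectSet = select.executeQuery(
             "SELECT idColumn,blobColumn FROM table")) {
    while (selectSet.next()) {
        int id = selectSet.getInt("idColumn");
        InputStream stream = workWithStreamAndReturnANew(
                selectSet.getBinaryStream("blobColumn"));
        update.setBinaryStream(1, stream);
        update.setInt(2, id);
        update.execute();  // statements and result set close automatically
    }
}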

A simpler workaround is to use binary literals (like X'2a4b54') and concatenation (UPDATE table SET blobcol = blobcol || X'2a4b54'), like this (note that MariaDB treats || as string concatenation only when the PIPES_AS_CONCAT SQL mode is enabled; otherwise use CONCAT()):
// Assumes: `stream` is the InputStream with the new BLOB content, `iTotalLength` is
// its total length, and sTable/sBlobColumn name the target table and BLOB column.
int iBUFSIZ = 4096;
byte[] buf = new byte[iBUFSIZ];
int iLength = 0;
int iUpdated = 1;
for (int iRead = stream.read(buf, 0, iBUFSIZ);
     (iUpdated == 1) && (iRead != -1) && (iLength < iTotalLength);
     iRead = stream.read(buf, 0, iBUFSIZ))
{
    String sValue = "X'" + toHex(buf, 0, iRead) + "'";
    if (iLength > 0)  // after the first chunk, append to what is already stored
        sValue = sBlobColumn + " || " + sValue;
    String sSql = "UPDATE " + sTable + " SET " + sBlobColumn + "= " + sValue;
    Statement stmt = connection.createStatement();
    iUpdated = stmt.executeUpdate(sSql);
    stmt.close();
    iLength += iRead;  // advance the count of bytes written so far
}
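The loop relies on a toHex helper that the answer doesn't show; a minimal sketch of what it has to do (hex-encode iLength bytes of buf starting at iOffset):
static String toHex(byte[] buf, int iOffset, int iLength) {
    // e.g. {0x2a, 0x4b, 0x54} -> "2a4b54", suitable for an X'...' literal
    StringBuilder sb = new StringBuilder(iLength * 2);
    for (int i = iOffset; i < iOffset + iLength; i++)
        sb.append(String.format("%02x", buf[i] & 0xFF));
    return sb.toString();
}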

Related

How to read data from an Excel file saved in a blob in Azure with the EPPlus library

I'm trying to read my Excel files saved in my Azure storage container like this:
string connectionString = Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING");
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("concursos");
foreach (BlobItem blobItem in containerClient.GetBlobs())
{
    BlobClient blobClient = containerClient.GetBlobClient(blobItem.Name);
    ExcelPackage.LicenseContext = LicenseContext.NonCommercial;
    using (var stream = blobClient.OpenRead(new BlobOpenReadOptions(true)))
    using (ExcelPackage package = new ExcelPackage(stream))
    {
        ExcelWorksheet worksheet = package.Workbook.Worksheets.FirstOrDefault();
        int colCount = worksheet.Dimension.End.Column;
        int rowCount = worksheet.Dimension.End.Row;
        for (int row = 1; row <= rowCount; row++)
        {
            for (int col = 1; col <= colCount; col++)
            {
                Console.WriteLine(" Row:" + row + " column:" + col + " Value:" + worksheet.Cells[row, col].Value.ToString().Trim());
            }
        }
    }
}
But the line
ExcelWorksheet worksheet = package.Workbook.Worksheets.FirstOrDefault();
throws an error: System.NullReferenceException: 'Object reference not set to an instance of an object.' (worksheet was null).
When debugging, both my stream and my package look fine.
The Excel files in the blobs are .xls files, like this one.
Any ideas, please?
Thanks
Please check whether the worksheet is empty. This error occurs if there is an empty sheet with empty columns and rows.
I tried to reproduce the same.
Initially I read an Excel sheet with EPPlus where the starting columns and rows were filled, not empty, and could execute and read it successfully using the same code as yours.
Then I made column 1 empty, stored the file in a blob, tried to read it, and got the null reference exception.
The Dimension object of the ExcelWorksheet will be null if the worksheet was just initialized and is empty.
AFAIK, the only way to avoid the exception is to check whether the file is empty, or to add content to it before accessing it, so that empty columns do not throw:
worksheet.Cells[1, 1].Value = "Some text value";
In the same way, try adding a worksheet, to avoid the exception in case there are no sheets in the blob at all:
ExcelWorksheet worksheet = new ExcelPackage().Workbook.Worksheets.Add("Sheet1");
This code will not throw an exception, since the Dimension object is initialized by adding content to the worksheet. If the loaded ExcelWorksheet already contains data, you will not face this issue:
ExcelWorksheet worksheet = package.Workbook.Worksheets.First();
//or ExcelWorksheet worksheet = package.Workbook.Worksheets[0];
// Add the line below to create a sheet, if no sheets are present and null is returned
//ExcelWorksheet worksheet = new ExcelPackage().Workbook.Worksheets.Add("Sheet1");
// Add the line below to fill a cell, if the sheet is empty and Dimension is null
worksheet.Cells[1, 1].Value = " This is the end of worksheet";
int colCount = worksheet.Dimension.End.Column;
int rowCount = worksheet.Dimension.End.Row;
for (int row = 1; row <= rowCount; row++)
{
    for (int col = 1; col <= colCount; col++)
    {
        Console.WriteLine(" Row:" + row + " column:" + col + " Value:" + worksheet.Cells[row, col].Value.ToString().Trim());
    }
}
You can alternatively check whether a cell's value is null:
if (worksheet.Cells[row, col].Value != null)
{
    //proceed with code
}
The problem was the file extension of the Excel files in the blobs.
It only works with .xlsx, not with .xls.
Thanks

Why does this ADO.NET query return no results?

I have the following code that executes a SQL statement and looks for a result.
var sql = #"select BOQ_IMPORT_ID "
+ "from ITIS_PRJ.PRJ_BOQ_IMPORT_HEADER "
+ "where PROJECT_ID = :Projectid "
+ "order by CREATED_ON desc "
+ "fetch first 1 row only";
using (var conn = new OracleConnection(ApplicationSettings.ConnectionString))
using (var cmd = new OracleCommand(sql, conn))
{
    conn.Open();
    cmd.Parameters.Add(LocalCreateParameterRaw("ProjectId", projectId));
    var reader = cmd.ExecuteReader();
    if (reader.Read())
    {
        byte[] buffer = new byte[16];
        reader.GetBytes(0, 0, buffer, 0, 16);
        var boqId = new Guid(buffer);
        return boqId;
    }
    return null;
}
Where LocalCreateParameterRaw is declared as:
public static OracleParameter LocalCreateParameterRaw(string name, object value)
{
    OracleParameter oracleParameter = new OracleParameter();
    oracleParameter.ParameterName = name;
    oracleParameter.OracleDbType = OracleDbType.Raw;
    oracleParameter.Size = 16;
    oracleParameter.Value = value;
    return oracleParameter;
}
The underlying type of projectId is Guid.
The if (reader.Read()) always evaluates to false, despite there being exactly one row in the table; the query should return exactly one row.
Using GI Oracle Profiler I can capture the SQL sent to the DB, but only once did the profiler provide a value for the :ProjectId parameter, and it was in lower case. As captured, it returned no results, but as soon as I applied UPPER to that value, I got a result.
It looks like I somehow have to get my parameter into upper case for the query to work, but I have no idea how. Yet if I do a ToString().ToUpper() on the projectId GUID, I get a parameter binding error.
VERY IMPORTANT:
I have tried removing the where clause altogether and no longer adding a parameter, so all rows in the table should be returned, yet there are still no results.
I don't know how, but making the SQL string a verbatim string (prefixed with @) causes the query to work. So, it doesn't work with:
var sql = @"SELECT BOQ_IMPORT_ID "
    + "FROM ITIS_PRJ.PRJ_BOQ_IMPORT_HEADER "
    + "WHERE PROJECT_ID = :projectid "
    + "ORDER BY CREATED_ON DESC "
    + "FETCH FIRST ROW ONLY";
Yet the same command string in SQL Developer executes and returns results. When I make my SQL string verbatim, as below, I get results:
var sql = @"select BOQ_IMPORT_ID
    from ITIS_PRJ.PRJ_BOQ_IMPORT_HEADER
    where PROJECT_ID = :ProjectId
    order by CREATED_ON desc
    fetch first 1 row only";
Using a more general approach, try the following:
var sql = "SELECT BOQ_IMPORT_ID "
    + "FROM ITIS_PRJ.PRJ_BOQ_IMPORT_HEADER "
    + "WHERE PROJECT_ID = :projectid "
    + "ORDER BY CREATED_ON DESC "
    + "FETCH FIRST ROW ONLY";
using (DbConnection conn = new OracleConnection(ApplicationSettings.ConnectionString))
using (DbCommand cmd = conn.CreateCommand()) {
    DbParameter parameter = cmd.CreateParameter();
    parameter.ParameterName = "projectid";
    parameter.Value = projectId.ToString("N").ToUpper(); //<-- NOTE FORMAT USED
    cmd.Parameters.Add(parameter);
    cmd.CommandType = CommandType.Text;
    cmd.CommandText = sql;
    conn.Open();
    var reader = cmd.ExecuteReader();
    if (reader.Read()) {
        var boqId = new Guid((byte[])reader[0]);
        return boqId;
    }
    return null;
}
It looks like I somehow have to get my parameter into uppercase for the query to work, but I have no idea how. Yet if I do a ToString().ToUpper() on the projectId GUID, I get a parameter binding error.
Reference: Guid.ToString Method
The N specifier formats it as 32 digits: 00000000000000000000000000000000
When no format is provided, the default is D, which is 32 digits separated by hyphens:
00000000-0000-0000-0000-000000000000
That would explain your binding error.

.NETCore Oracle ManagedDataAccess Client: cannot read BLOBs (TTC error)

I have a .NET Core app, and we are trying to use the Oracle Managed Data Access client (currently available only as a beta version).
However, when I read a BLOB from the database I get a 'TTC Error'. Does anyone have any ideas on how to proceed?
using (OracleConnection conn = new OracleConnection("Data Source=db;User ID=userid;Password=pass;Pooling=False;"))
{
    conn.Open();
    var sql = "SELECT id, blobdata FROM templ";
    OracleCommand cmd = new OracleCommand(sql, conn);
    cmd.CommandType = CommandType.Text;
    OracleDataReader reader = cmd.ExecuteReader();
    using (reader)
    {
        while (reader.Read()) //TTC Error
        {
        }
    }
}
"TTC Errror" is main Exception message.
This is not ideal but I ran into this issue today and ended up using the DBMS_LOB.SUBSTR function to read it out in chucks of 2000 (2000 is the largest that RAW can be).
Below is a query that returns a row where each row is 2K chucks of the file at a specific offset.
WITH INFO AS
(
    SELECT
        dbms_lob.getlength(A.FILE_CONTENT) AS FILE_CONTENT_LENGTH,
        MOD(dbms_lob.getlength(A.FILE_CONTENT), 2000) AS MOD,
        CASE
            WHEN MOD(dbms_lob.getlength(A.FILE_CONTENT), 2000) > 0 THEN TRUNC((dbms_lob.getlength(A.FILE_CONTENT) / 2000) + 1)
            ELSE TRUNC(dbms_lob.getlength(A.FILE_CONTENT) / 2000)
        END ITERATION_COUNT,
        A.FILE_CONTENT,
        A.FILE_ID
    FROM TABLE_OF_FILES A
    WHERE A.FILE_ID = 345321561
),
OFFSETS AS
(
    SELECT
        (2000 * (ROWNUM - 1)) + 1 AS OFFSET,
        I.MOD,
        I.FILE_CONTENT_LENGTH,
        I.FILE_CONTENT,
        I.FILE_ID,
        I.ITERATION_COUNT
    FROM INFO I
    CONNECT BY LEVEL <= I.ITERATION_COUNT
),
RESULT AS
(
    SELECT
        DBMS_LOB.SUBSTR(O.FILE_CONTENT, 2000, O.OFFSET) AS CONTENT,
        O.OFFSET,
        O.MOD,
        O.FILE_CONTENT_LENGTH,
        O.FILE_ID,
        O.ITERATION_COUNT
    FROM OFFSETS O
)
SELECT * FROM RESULT R ORDER BY R.OFFSET ASC;
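Chunked DBMS_LOB.SUBSTR reads like this are driver-agnostic; for illustration, a minimal JDBC sketch that reassembles the chunk rows returned by the query above (assuming the query text is stored in sqlChunks and conn is an open java.sql.Connection):
// Concatenate the 2000-byte RAW chunks, ordered by OFFSET, back into one array.
ByteArrayOutputStream out = new ByteArrayOutputStream();
try (Statement stmt = conn.createStatement();
     ResultSet rs = stmt.executeQuery(sqlChunks)) {
    while (rs.next()) {
        byte[] chunk = rs.getBytes("CONTENT"); // DBMS_LOB.SUBSTR on a BLOB returns RAW
        out.write(chunk, 0, chunk.length);
    }
}
byte[] fileContent = out.toByteArray();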

Storing .NET double value in Oracle DB

I'm using ODP.NET to access an Oracle DB from C#/.NET.
Please see the following code:
OracleConnection con = new OracleConnection();
con.ConnectionString = "User Id=user;Password=pass;Data Source=localhost/orcl";
con.Open();
/* create table */
DbCommand command = con.CreateCommand();
command.CommandType = CommandType.Text;
try
{
    command.CommandText = "DROP TABLE TEST";
    command.ExecuteNonQuery();
}
catch
{
}
//command.CommandText = "CREATE TABLE TEST (VALUE BINARY_DOUBLE)";
command.CommandText = "CREATE TABLE TEST (VALUE FLOAT(126))";
command.ExecuteNonQuery();
/* now insert something */
double val = 0.8414709848078965;
command.CommandText = "INSERT INTO TEST VALUES (" + val.ToString(System.Globalization.CultureInfo.InvariantCulture) + ")";
command.ExecuteNonQuery();
/* and now read inserted value */
command.CommandText = "SELECT * FROM TEST";
DbDataReader reader = command.ExecuteReader();
reader.Read();
double res = (double)(decimal)reader[0];
Console.WriteLine("Inserted " + val + " selected " + res);
The output from this is always:
Inserted 0,841470984807897 selected 0,841470984807897
But looking at the variable values under the debugger:
val == 0.8414709848078965
res == 0,841470984807897
Why is res rounded?
I looked into the DB, and the rounded value is what is actually stored.
On the other hand, when I use Oracle SQL Developer to modify this value, I am able to store 0.8414709848078965 in the database.
I tried the types NUMBER, FLOAT(126), BINARY_DOUBLE... always the same result.
Why is there a problem when using ODP.NET?
OK, I have found that it works if the parameter type is OracleDbType.BinaryDouble. But that makes my code dependent on ODP.NET; I wanted to use the ADO.NET types (DbType) to keep my code provider-independent.
Oracle actually has higher precision for its numbers than .NET!
I tried this in straight Oracle and it works fine; I recommend switching to a parameter, e.g.:
-- CREATE TABLE TEST (VALUE NUMBER(38,38)); (initial test)
INSERT INTO TEST VALUES (0.8414709848078965);
SELECT * FROM TEST;

VALUE
----------------------
0.8414709848078965
(recommendation)
OracleParameter param = cmd.CreateParameter();
param.ParameterName = "NUMBERVALUE";
param.Direction = ParameterDirection.Input;
param.OracleDbType = OracleDbType.Decimal;
param.Value = "0.8414709848078965";
cmd.Parameters.Add(param);

Execute sql statement via JDBC with CLOB binding

I have the following query (column log is of type CLOB):
UPDATE table SET log=? where id=?
The query above works fine when using the setAsciiStream method to put a value longer than 4000 characters into the log column.
But instead of replacing the value, I want to append it, hence my query looks like this:
UPDATE table SET log=log||?||chr(10) where id=?
The above query DOES NOT work any more and I get the following error:
java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
It looks to me like you have to use a PL/SQL block to do what you want. The following works for me, assuming there's an entry with id 1:
import oracle.jdbc.OracleDriver;
import java.sql.*;
import java.io.ByteArrayInputStream;

public class JDBCTest {
    // How much test data to generate.
    public static final int SIZE = 8192;

    public static void main(String[] args) throws Exception {
        // Generate some test data.
        byte[] data = new byte[SIZE];
        for (int i = 0; i < SIZE; ++i) {
            data[i] = (byte) (64 + (i % 32));
        }
        ByteArrayInputStream stream = new ByteArrayInputStream(data);
        DriverManager.registerDriver(new OracleDriver());
        Connection c = DriverManager.getConnection(
                "jdbc:oracle:thin:@some_database", "user", "password");
        String sql =
                "DECLARE\n" +
                "  l_line CLOB;\n" +
                "BEGIN\n" +
                "  l_line := ?;\n" +
                "  UPDATE table SET log = log || l_line || CHR(10) WHERE id = ?;\n" +
                "END;\n";
        PreparedStatement stmt = c.prepareStatement(sql);
        stmt.setAsciiStream(1, stream, SIZE);
        stmt.setInt(2, 1);
        stmt.execute();
        stmt.close();
        c.commit();
        c.close();
    }
}
BLOBs are not mutable from SQL (well, besides setting them to NULL), so to append you would have to download the BLOB first, concatenate locally, and upload the result again.
The usual solution is to write several records to the database with a common key and a sequence column that tells the DB how to order the rows.
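A minimal JDBC sketch of that read-concatenate-write approach, reusing the connection c and the hypothetical table/log/id columns from the example above (java.io.StringReader is the only extra import):
// 1. Download the current CLOB value.
String oldLog = "";
PreparedStatement read = c.prepareStatement("SELECT log FROM table WHERE id = ?");
read.setInt(1, 1);
ResultSet rs = read.executeQuery();
if (rs.next()) {
    Clob clob = rs.getClob(1);
    oldLog = clob.getSubString(1, (int) clob.length()); // positions are 1-based
}
rs.close();
read.close();

// 2. Concatenate locally and upload the result.
String combined = oldLog + "another log line" + "\n";
PreparedStatement write = c.prepareStatement("UPDATE table SET log = ? WHERE id = ?");
write.setCharacterStream(1, new StringReader(combined), combined.length());
write.setInt(2, 1);
write.executeUpdate();
write.close();
c.commit();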
