JDBC, Oracle SQL, adding and displaying picture on label - oracle

I have a picture 'osta.jpg' on my desktop. I want to add it to my Oracle database, then load it from the database in a Java (JDBC) application and show it on a Swing label, e.g. JLabel lblFilm = new JLabel(new ImageIcon("picture_from_SQL")).
I tried adding the image to the database:
ALTER TABLE film ADD bfile_loc bfile, bfile_type varchar2(4);
UPDATE film SET bfile_type = 'JPEG';
UPDATE film SET bfile_loc = bfilename('GIF_FILES','C:\Users\Maciej\Desktop\osta.jpg') WHERE kategoria IN 'thriller';
However, I don't know how to load it in the Java application.

First things first, in order to create a valid BFILE you need to specify an Oracle directory and then the location of the file within that directory. I don't know what your GIF_FILES directory is, but if it were created using
CREATE DIRECTORY GIF_FILES AS 'C:\Users\Maciej\Desktop';
then you would use the following to set the BFILE:
UPDATE film SET bfile_loc = bfilename('GIF_FILES', 'osta.jpg') WHERE kategoria IN 'thriller';
Secondly, in order to read the BFILE out of Oracle using JDBC, you need some Oracle-specific classes. Cast the ResultSet to oracle.jdbc.OracleResultSet so that you can use its getBFILE() method to get the BFILE locator from the ResultSet. Then open the BFILE, get an InputStream from the BFILE's binaryStreamValue() method, and read the data out of that.
Here's some example code (error handling could be improved):
import java.io.*;
import java.sql.*;

import oracle.jdbc.OracleResultSet;
import oracle.sql.BFILE;

// ...

// 'connection' here is your Oracle database connection.
Statement stmt = connection.createStatement();
OracleResultSet rSet = (OracleResultSet) stmt.executeQuery(
        "SELECT bfile_loc FROM film");
if (rSet.next()) {
    BFILE bfile = rSet.getBFILE(1);
    System.out.println("Length: " + bfile.length());
    bfile.open();
    InputStream is = bfile.binaryStreamValue();
    // Read data from the input stream...
    is.close();
    bfile.close();
}
In the code above, I also fetch the length of the file. That isn't essential, but the call fails if the file cannot be found, so it doubles as a useful sanity check.
Thirdly, you need to convert the InputStream to an ImageIcon. The ImageIcon class has a constructor that takes a byte array, and you can use answers to this question to convert the InputStream into a byte array, as sketched below.
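For example, a minimal sketch of that last step, continuing from the code above ('is' is the InputStream obtained from the BFILE; the label variable name is just for illustration):
import javax.swing.ImageIcon;
import javax.swing.JLabel;
// ...
// Read the whole InputStream into a byte array.
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
byte[] chunk = new byte[8192];
int n;
while ((n = is.read(chunk)) != -1) {
    buffer.write(chunk, 0, n);
}
byte[] imageBytes = buffer.toByteArray();
// Build the ImageIcon from the bytes and put it on a label.
JLabel lblFilm = new JLabel(new ImageIcon(imageBytes));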

Related

mirth connect use of executeUpdateAndGetGeneratedKeys with Oracle

I am using Mirth Connect 3.5.0.8232. I have created a persisted connection to an Oracle database and using it throughout my source and destination connectors. One of the methods Mirth provides for talking with the database is executeUpdateAndGetGeneratedKeys. It would be quite useful for insert statements that would return the primary keys for the inserted rows.
My question is - how do you specify WHICH columns to return? Running the provided function works, but returns ROWID in the CachedRowSet, which is not what I want.
As far as I understood, which columns to return depends on the type of the database, and every database behaves differently. I am interested in Oracle specifically.
Thank you.
The executeUpdateAndGetGeneratedKeys method uses the Statement.RETURN_GENERATED_KEYS flag to signal to the driver that auto-generated keys should be returned. However, from the Oracle docs:
If key columns are not explicitly indicated, then Oracle JDBC drivers cannot identify which columns need to be retrieved. When a column name or column index array is used, Oracle JDBC drivers can identify which columns contain auto-generated keys that you want to retrieve. However, when the Statement.RETURN_GENERATED_KEYS integer flag is used, Oracle JDBC drivers cannot identify these columns. When the integer flag is used to indicate that auto-generated keys are to be returned, the ROWID pseudo column is returned as key. The ROWID can be then fetched from the ResultSet object and can be used to retrieve other columns.
So instead, try using their suggestion of passing in a column name array to prepareStatement:
var dbConn;
try {
    dbConn = DatabaseConnectionFactory.createDatabaseConnection(
        'oracle.jdbc.driver.OracleDriver',
        'jdbc:oracle:thin:@localhost:1521:DBNAME',
        'user', 'pass');

    // Create a Java String array directly
    var keyColumns = java.lang.reflect.Array.newInstance(java.lang.String, 1);
    keyColumns[0] = 'id';

    var ps = dbConn.getConnection().prepareStatement(
        'INSERT INTO tablename (columnname) VALUES (?)', keyColumns);
    try {
        // Set variables here
        ps.setObject(1, 'test');
        ps.executeUpdate();
        var result = ps.getGeneratedKeys();
        result.next();
        var generatedKey = result.getObject(1);
        logger.info(generatedKey);
    } finally {
        ps.close();
    }
} finally {
    if (dbConn) {
        dbConn.close();
    }
}
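For reference, outside of Mirth the same column-name-array technique looks like this in plain JDBC (the table, column, and key names are placeholders, as above; a numeric id column is assumed):
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
// ...
// 'connection' is an open java.sql.Connection to Oracle.
PreparedStatement ps = connection.prepareStatement(
        "INSERT INTO tablename (columnname) VALUES (?)",
        new String[] { "id" });  // tell the driver which column to return
ps.setString(1, "test");
ps.executeUpdate();
ResultSet keys = ps.getGeneratedKeys();
if (keys.next()) {
    System.out.println("Generated key: " + keys.getLong(1));
}
keys.close();
ps.close();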

Can I control how Oracle maps the integer types in ADO.NET?

I've got a legacy database that was created with the database type INTEGER for many (1,000+) Oracle columns. A database with the same structure exists for MS SQL. As I was told, the original definition was created using a tool that generated the DBMS-specific scripts for MS SQL and Oracle from a logical model.
Using C++ and MFC, the columns were mapped nicely to the integer type for both DBMSs.
I am porting this application to .NET and C#. The same C# codebase is used to access both MS SQL and Oracle. We use the same DataSets and logic, and we need the same types (Int32 for both).
The ODP.NET driver from Oracle maps them to Decimal. This is logical as Oracle created the integer columns as NUMBER(37) automatically. The columns in MS SQL map to int32.
Can I somehow control how to map the types in the ODP.NET driver? I would like to say something like "map NUMBER(37) to int32". The columns will never hold values bigger than the limits of an int32. We know this because it is being used in the MS SQL version.
Alternatively, can I modify all columns from NUMBER(37) to NUMBER(8) or SIMPLE_INTEGER so that they map to the right type for us? Many of these columns are used as primary keys (think autoincrement).
Regarding type mapping, I hope this is what you need:
http://docs.oracle.com/cd/E51173_01/win.122/e17732/entityDataTypeMapping.htm#ODPNT8300
Regarding the type change: if the table is empty, you may use the following script (just replace [YOUR_TABLE_NAME] with the table name in upper case):
DECLARE
    v_table_name CONSTANT VARCHAR2(30) := '[YOUR_TABLE_NAME]';
BEGIN
    -- NUMBER(37) columns carry their precision in data_precision
    -- (data_length is always 22 for NUMBER columns)
    FOR col IN (SELECT column_name
                  FROM user_tab_columns
                 WHERE table_name = v_table_name
                   AND data_type = 'NUMBER'
                   AND data_precision = 37)
    LOOP
        EXECUTE IMMEDIATE 'ALTER TABLE '||v_table_name||' MODIFY '||col.column_name||' NUMBER(8)';
    END LOOP;
END;
/
If some of these columns are not empty, then you can't decrease the precision for them in place.
If you don't have too much data, you may move it to a temporary table:
create table temp_table as select * from [YOUR_TABLE_NAME];
then truncate the original table:
truncate table [YOUR_TABLE_NAME];
then run the script above, and then move the data back:
insert /*+ append */ into [YOUR_TABLE_NAME] select * from temp_table;
commit;
If the amount of data is substantial, it is better to move it only once. In that case it is faster to create a new table with the correct datatypes and all the indexes, constraints and so on, move the data, and then rename both tables so the new table ends up with the proper name.
Unfortunately, the mapping of numeric types between .NET and Oracle is hardcoded in the OracleDataReader class.
In general I usually prefer to setup appropriate data types in the database, so if possible I would change the column datatypes because they better represent the actual values and their constraints.
Another option is to wrap the tables in views that cast the columns to NUMBER(8), but this will negatively impact execution plans because it prohibits index lookups.
Then you also have some application-level implementation options:
Implement your own data reader or a subset of the ADO.NET classes (inheriting from DbProviderFactory, DbConnection, DbCommand, DbDataReader, etc. and wrapping the Oracle classes), depending on how complex your implementation needs to be. Oracle.DataAccess, Devart and all other providers do exactly the same, because it gives total control over everything, including any magic with the data types. If the datatype conversion is the only thing you want to achieve, most of the implementation would just be calling the wrapped class methods/properties.
If you have access to the OracleDataReader after the command is executed and before you start to read it, you can do a simple hack and set the resulting numeric type using reflection (the following implementation is just a simplified demonstration). However, this will not work with ExecuteScalar, as that method never exposes the underlying data reader.
var connection = new OracleConnection("DATA SOURCE=HQ_PDB_TCP;PASSWORD=oracle;USER ID=HUSQVIK");
connection.Open();

var command = connection.CreateCommand();
command.CommandText = "SELECT 1 FROM DUAL";

var reader = command.ExecuteDatabaseReader();
reader.Read();

Console.WriteLine(reader[0].GetType().FullName);
Console.WriteLine(reader.GetFieldType(0).FullName);
using System;
using System.Collections;
using System.Reflection;
using Oracle.DataAccess.Client;

public static class DataReaderExtensions
{
    private static readonly FieldInfo NumericAccessorField = typeof(OracleDataReader).GetField("m_dotNetNumericAccessor", BindingFlags.NonPublic | BindingFlags.Instance);
    private static readonly object Int32DotNetNumericAccessor = Enum.Parse(typeof(OracleDataReader).Assembly.GetType("Oracle.DataAccess.Client.DotNetNumericAccessor"), "GetInt32");
    private static readonly FieldInfo MetadataField = typeof(OracleDataReader).GetField("m_metaData", BindingFlags.NonPublic | BindingFlags.Instance);
    private static readonly FieldInfo FieldTypesField = typeof(OracleDataReader).Assembly.GetType("Oracle.DataAccess.Client.MetaData").GetField("m_fieldTypes", BindingFlags.NonPublic | BindingFlags.Instance);

    public static OracleDataReader ExecuteDatabaseReader(this OracleCommand command)
    {
        var reader = command.ExecuteReader();

        // Swap the numeric accessor for column 0 so its value is materialized as Int32 instead of Decimal.
        var columnNumericAccessors = (IList)NumericAccessorField.GetValue(reader);
        columnNumericAccessors[0] = Int32DotNetNumericAccessor;

        // Patch the cached metadata so GetFieldType(0) also reports Int32.
        var metadata = MetadataField.GetValue(reader);
        var fieldTypes = (Type[])FieldTypesField.GetValue(metadata);
        fieldTypes[0] = typeof(Int32);

        return reader;
    }
}
I implemented an extension method for command execution that returns the reader, where I can set up the desired column numeric types. Without setting the numeric accessor (it's just the internal enum Oracle.DataAccess.Client.DotNetNumericAccessor) you get System.Decimal; with the accessor set you get Int32. Using this you can get Int16, Int32, Int64, Float or Double.
The columnNumericAccessors index is a column index, and it is applied only to numeric types; if the column is a DATE or VARCHAR, the numeric accessor is simply ignored. If your implementation doesn't expose the provider-specific type, make the extension method on IDbCommand or DbCommand and then safe-cast the DbDataReader to OracleDataReader.
EDIT: Added the hack for the GetFieldType method. Note that the static mapping hashtable might get updated, which could have unwanted effects, so you need to test this properly. The fieldTypes array holds the types returned for all columns of the data reader.

Insert xml file into XMLTable using JDBC

I have created an XMLType table in Oracle. I am trying to insert an XML file into the table using JDBC, and it throws:
ORA-00932: inconsistent datatypes: expected - got BINARY
The code is:
OraclePreparedStatement statement =
    (OraclePreparedStatement) getConnection().prepareStatement(
        "insert into person values (?)");
FileInputStream fileinp = new FileInputStream(file);
statement.setBinaryStream(1, fileinp, fileLength);
statement.executeUpdate();
You're trying to insert binary data directly into your XMLType column, and there is no implicit casting for that. Assuming your file is actually text, you can treat it as a CLOB rather than a BLOB:
OraclePreparedStatement statement =
    (OraclePreparedStatement) getConnection().prepareStatement(
        "insert into person values (xmltype(?))");
FileInputStream fileinp = new FileInputStream(file);
InputStreamReader filerdr = new InputStreamReader(fileinp);
statement.setCharacterStream(1, filerdr, fileLength);
statement.executeUpdate();
Note that the statement now uses xmltype(?) (it works without that, since there is implicit casting from CLOB, but I think it's better to be explicit anyway), and I'm using an InputStreamReader to pass the text in. You can, and probably should, use a buffered reader:
FileInputStream fileinp = new FileInputStream(file);
InputStreamReader filerdr = new InputStreamReader(fileinp);
BufferedReader filebuf = new BufferedReader(filerdr);
statement.setCharacterStream(1, filebuf, fileLength);
statement.executeUpdate();
Tested with an XMLType column in a normal table, and with an XMLType table.
Passing a text file with setBinaryStream() confuses XMLType; with the same valid file, using setBinaryStream() gives this error:
ORA-31011: XML parsing failed
ORA-19202: Error occurred in XML processing
LPX-00210: expected '<' instead of '3'
There's probably a way around that, but I'm assuming your file is just text and won't be a problem as a CLOB.
As I see in the official documentation here, you can insert an XMLType in Java in one of two ways:
By CLOB or string binding
By setObject() or setOPAQUE()
I would read the file into a string and then use it in the insert query, as sketched below.
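A minimal sketch of that string-binding approach, assuming the same person table, an open 'connection', and the xmltype(?) wrapper from the earlier answer:
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.sql.PreparedStatement;
// ...
// Read the whole XML file into a String, then bind it as a plain string.
String xml = new String(Files.readAllBytes(file.toPath()), StandardCharsets.UTF_8);
PreparedStatement ps = connection.prepareStatement(
        "insert into person values (xmltype(?))");
ps.setString(1, xml);
ps.executeUpdate();
ps.close();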

how to export and import BLOB data type in oracle

How can I export and import the BLOB data type in Oracle using any tool? I want to ship that as part of a release.
Answering since this has a decent view count even though it's a 5-year-old question.
Since this question was asked, there's a new tool named SQLcl (http://www.oracle.com/technetwork/developer-tools/sqlcl/overview/index.html).
We factored the scripting engine out of SQL Developer into the command line. SQL Developer and SQLcl are based on Java, which allows use of the Nashorn/JavaScript engine for client scripting. Here's a short example that is a select of three columns: ID is just the table's PK, NAME is the name of the file to create, and CONTENT is the BLOB to extract from the db.
The script command triggers this scripting. I placed the code below into a file named blob2file.sql.
All this adds up to zero PL/SQL and zero directories; instead, it's just a SQL script with some JavaScript mixed in.
script
    // issue the sql
    // bind if needed, but not in this case
    var binds = {};
    var ret = util.executeReturnList('select id, name, content from images', binds);

    // loop the results
    for (i = 0; i < ret.length; i++) {
        // debug messages
        ctx.write(ret[i].ID + "\t" + ret[i].NAME + "\n");

        // get the blob stream
        var blobStream = ret[i].CONTENT.getBinaryStream(1);

        // get the path/file handle to write to
        // (replace as needed to write the file to another location)
        var path = java.nio.file.FileSystems.getDefault().getPath(ret[i].NAME);

        // dump the blob stream to the file
        java.nio.file.Files.copy(blobStream, path);
    }
/
The result is my table emptied into files (I only had 1 row). Just run it as any plain SQL script:
SQL> @blob2file.sql
1 eclipse.png
blob2file.sql eclipse.png
SQL>
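The question also asked about import. For the reverse direction, here is a minimal plain-JDBC sketch that loads a file back into the same images (id, name, content) table; the file name and key value are just for illustration:
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
// ...
// 'connection' is an open java.sql.Connection to Oracle.
PreparedStatement ps = connection.prepareStatement(
        "insert into images (id, name, content) values (?, ?, ?)");
FileInputStream in = new FileInputStream("eclipse.png");
ps.setInt(1, 1);
ps.setString(2, "eclipse.png");
ps.setBinaryStream(3, in);  // stream the file into the BLOB column
ps.executeUpdate();
in.close();
ps.close();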

How to use variable mapping while using Oracle OLE DB provider in SSIS?

How to use variable mapping while using Oracle OLE DB provider? I have done the following:
Execute SQL Task: Full result set to hold results of the query.
Foreach ADO Enumerator: ADO object source above variable (Object data type).
Variable Mapping: 1 field.
The variable is set up to Evaluate as an Expression (True).
Data Flow: SQL Command from variable, as SELECT columnName FROM table where columnName = ?
Basically, what I am trying to do is use the results of a query against a SQL Server table (i.e. account numbers) and pull the records from Oracle that match the results of that SQL query.
It feels like you're mixing things. The parameterization ? is a placeholder for a variable that, in an OLE DB Source component, you'd map by clicking on the Parameters button.
However, since you're using SQL Command from Variable, you can't use the parameterization option, probably because the risk of a user changing the shape of the result set, via Expressions, is too high.
So, pick one: either "SQL Command" with proper parameterization, or "SQL Command from Variable" where you add in your parameters in terrible string-building fashion, as in Dynamically assign value to variable in SSIS. SQL Server 2005/2008/2008R2 people, be aware that you are limited to 4k characters in a string variable that uses Expressions.
Based on the comment of "Basically what I am trying to do is use the results of a query from a SQL Server table, (ie ..account numbers) and pull records from Oracle reference the results from the SQL query"
There are two ways of going about this. With what you've currently developed, my above answer still stands. You are shredding the account numbers and using those as the filter in your query to Oracle. This will issue a query to Oracle for each account number you have. That may or may not be desirable.
The upside to this approach is that it will allow you to retrieve multiple rows. Assuming you are pulling Sales Order type of information, one account number likely has many sales order rows.
However, if you are working with something that has a zero to one mapping with the account numbers, like account level data, then you can simplify the approach you are taking. Move your SQL Server query to an OLE DB Source component within your data flow.
Then, what you are looking for is the Lookup Component. That allows you to enrich an existing row of data with additional data. Here you will specify a query like "SELECT AllTheColumnsICareAbout, AccountNumber FROM schema.Table". Then you will map the AccountNumber from the OLE DB Source to the one in the Lookup Component and then click the checkmark next to all the columns you want to augment the existing row with.
I believe what you are asking is how to use SSIS to push data to the Oracle OLE DB provider.
I will assume that Oracle is the destination. Using data destinations with variable columns is not supported out of the box. You should be able to use the SSIS API or other means; I take a simpler approach.
I recently set up a package to get all tables from a database and create dynamic CSV output, one file for each table. You could do something similar.
Switch out the StreamWriter part with a section that 1) creates the table in the destination and 2) inserts the records into Oracle. I am not sure whether you will need to do single inserts against Oracle. Another project of mine works in reverse, dynamic CSV into SQL. Since I work with SQL Server, I load a DataTable and use the SqlBulkCopy class for bulk loading, which provides excellent performance.
public void Main()
{
    string datetime = DateTime.Now.ToString("yyyyMMddHHmmss");
    try
    {
        string TableName = Dts.Variables["User::CurrentTable"].Value.ToString();
        string FileDelimiter = ",";
        string TextQualifier = "\"";
        string FileExtension = ".csv";

        // Use the ADO.NET connection from the SSIS package to get data from the table
        SqlConnection myADONETConnection = new SqlConnection();
        myADONETConnection = (SqlConnection)(Dts.Connections["connection manager name"].AcquireConnection(Dts.Transaction) as SqlConnection);

        // Read data from the table or view into a data table
        string query = "Select * From [" + TableName + "]";
        SqlCommand cmd = new SqlCommand(query, myADONETConnection);
        //myADONETConnection.Open();
        DataTable d_table = new DataTable();
        d_table.Load(cmd.ExecuteReader());
        //myADONETConnection.Close();

        string FileFullPath = Dts.Variables["$Project::ExcelToCsvFolder"].Value.ToString() + "\\Output\\" + TableName + FileExtension;
        StreamWriter sw = new StreamWriter(FileFullPath, false);

        // Write the header row to the file
        int ColumnCount = d_table.Columns.Count;
        for (int ic = 0; ic < ColumnCount; ic++)
        {
            sw.Write(TextQualifier + d_table.Columns[ic] + TextQualifier);
            if (ic < ColumnCount - 1)
            {
                sw.Write(FileDelimiter);
            }
        }
        sw.Write(sw.NewLine);

        // Write all rows to the file
        foreach (DataRow dr in d_table.Rows)
        {
            for (int ir = 0; ir < ColumnCount; ir++)
            {
                if (!Convert.IsDBNull(dr[ir]))
                {
                    sw.Write(TextQualifier + dr[ir].ToString() + TextQualifier);
                }
                if (ir < ColumnCount - 1)
                {
                    sw.Write(FileDelimiter);
                }
            }
            sw.Write(sw.NewLine);
        }
        sw.Close();

        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception exception)
    {
        // Create a log file for errors
        //using (StreamWriter sw = File.CreateText(Dts.Variables["User::LogFolder"].Value.ToString() + "\\" +
        //    "ErrorLog_" + datetime + ".log"))
        //{
        //    sw.WriteLine(exception.ToString());
        //}
        Dts.TaskResult = (int)ScriptResults.Failure;
        throw;
    }
}
