How to get schema name from DataSource or Connection object - jdbc

I want to get the schema name from my DataSource or Connection object so that it can be used in my SQL queries dynamically. I'm using DB2 and there is no implementation of connection.getSchema() in the DB2 driver.
I'm using a DataSource to get the connection. Since connection.getSchema() is not working, I tried another approach as given below:
connection.getMetaData().getURL()
But this returns the connection URL without the schema information, like below:
jdbc:db2://servername:1446/DBName
But I have given the schema information in the URL while creating the datasource in the embeddable container:
jdbc:db2://servername:1446/DBName:currentSchema=mySchema
I need to get the schema name to use it in a query. Does anybody know how to get the schema name?

Try the SQL statement
values current schema
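For example, a minimal JDBC sketch that runs this statement and reads the schema name back (assumes connection is a java.sql.Connection obtained from your DataSource):
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Sketch: read the current schema with plain JDBC; works on a Connection
// handed out by a DataSource even when getSchema() is not implemented.
public static String currentSchema(Connection connection) throws SQLException {
    try (Statement stmt = connection.createStatement();
         ResultSet rs = stmt.executeQuery("values current schema")) {
        return rs.next() ? rs.getString(1).trim() : null;
    }
}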

The Db2BaseDataSource has a property currentSchema, along with a getter and a setter.
There's also a property called user.
setter:
db2ds.setCurrentSchema("fred");
getter:
String x = db2ds.getCurrentSchema() ;
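If your code only sees a plain javax.sql.DataSource, a hedged sketch of narrowing it to the DB2 type first (assumes the IBM JCC driver's com.ibm.db2.jcc package; adjust to your driver and pooling setup):
import javax.sql.DataSource;

// Sketch: reach the currentSchema property on a generic DataSource by
// narrowing it to the DB2 implementation. Only works if it really is one.
static void applyAndReadSchema(DataSource ds) {
    if (ds instanceof com.ibm.db2.jcc.DB2BaseDataSource) {
        com.ibm.db2.jcc.DB2BaseDataSource db2ds = (com.ibm.db2.jcc.DB2BaseDataSource) ds;
        db2ds.setCurrentSchema("fred");      // setter
        String x = db2ds.getCurrentSchema(); // getter, returns "fred" here
    }
}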

Related

dbunit schema issues with h2 and oracle, schema is always PUBLIC

From a previous export of an Oracle DB, I have both XML and DTD files with lots of data, but dbunit seems to ignore the DTD files when I try to import the data.
flatXMLBuilder.build(xmlFile);
I always get this error message when trying to reference an object of a table:
ERROR Table 'TABLE' not found in tableMap=org.dbunit.dataset.OrderedTableNameMap
and the warning:
WARN session=global o.d.database.DatabaseConnection - The given schema 'SA' does not exist
I am not sure if including the DTD files would fix the schema problem (the schema is always set to PUBLIC),
but I would be happy if I could at least test it out.
The files are always in the same directory:
T1.xml
T1.dtd
Also, the core problem seems to be the schema, which is always set to PUBLIC on my target H2 in-memory DB.
I tried some ways of setting the schema, like "set schema" or "init=create schema" in the connection URL, but the current schema was always PUBLIC.
So when I was logged in as the SA user, for example, I couldn't find any tables, because there are no tables in the SA schema, only in PUBLIC.
I also tried the setSchema(String) method on the H2 connection, but it doesn't work (I get an uncatchable error on the call).
Update:
Currently I use a FileInputStream to read the DTD file and add it to the builder:
builder.setMetaDataSetFromDtd(dtdStream);
But it didn't help with the problem.
If you want to create a schema x, and also set the default schema to x, in the database URL, then you need to use:
jdbc:h2:~/data/databaseName;init=create schema if not exists x\;set schema x
Please note the escaped semicolon. In Java, you need to escape the backslash as well:
String url = "jdbc:h2:~/data/databaseName;init=create schema if not exists x\\;set schema x";
This URL is a bit long. You might want to move all the statements to an init script, and just run that:
jdbc:h2:~/data/databaseName;init=runscript from 'init.sql'
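A hedged sketch of wiring this up from Java (the script path, schema name and credentials are placeholders):
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

// Sketch: open an H2 connection whose init script creates and selects the schema.
// init.sql would hold the two statements previously inlined in the URL:
//   create schema if not exists x;
//   set schema x;
public static Connection openWithSchema() throws SQLException {
    String url = "jdbc:h2:~/data/databaseName;init=runscript from 'init.sql'";
    return DriverManager.getConnection(url, "sa", "");
}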

Database specific queries in a Spring Hibernate application

In a DAO class implementation, I want to use a different SQL query depending on the underlying database. Since my SQL query is complex (it selects from a database view, uses the "UNION" keyword, and uses database-specific functions), I cannot use JPQL (or HQL). I did some searching on Stack Overflow and the threads suggest that a good way would be to find out the dialect used in the application. Can anyone provide a code example?
EDIT: My apologies, I did not explain my question well enough. In my DAO class implementation, I want to determine the database (say MySQL or Oracle) on which my application is running and then execute the appropriate query. I need something like jdbcTemplate.findOutTheDialect().
JPA has native queries for that.
You can find an example here.
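For instance, a minimal sketch of a native query through a JPA EntityManager (the entity class, view names, and method are made up for illustration):
import java.util.List;
import javax.persistence.EntityManager;

// Sketch: a native (database-specific) SQL query via JPA. The SQL string can
// freely use UNION, views and vendor functions. CountryRow is a hypothetical
// mapped entity; adjust the SQL and mapping to your schema.
public List<CountryRow> loadCountries(EntityManager em) {
    @SuppressWarnings("unchecked")
    List<CountryRow> rows = em.createNativeQuery(
            "SELECT * FROM COUNTRY_VIEW UNION SELECT * FROM LEGACY_COUNTRY_VIEW",
            CountryRow.class)
            .getResultList();
    return rows;
}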
You can use Spring's JdbcTemplate to connect to and query your database.
For example:
String query = "SELECT COUNTRY_NAME FROM COUNTRY WHERE MCC_COUNTRY NOT IN ('Reserved') ORDER BY COUNTRY_NAME";
jdbcTemplate.query(query, new ObjectRowMapper());
Where "ObjectRowMapper" will be a class to map return resultset to list of Objects.

JDBC: How to get current connection?

I am trying to create a custom MyBatis File type handler for Postgres BLOBs.
Here is the method I need to implement to fulfill the interface:
@Override
public File getNullableResult(ResultSet rs, String columnName) throws SQLException {
    // 1. get the current connection
    // 2. get the PostgreSQL LargeObjectManager from the current connection
    // 3. get the oid from the ResultSet, so the large object can be found
    // 4. read the large object and rewrite it into a file
    // 5. return the file
}
However, I do not know how to get the current connection in this situation. Is there a way to get the connection from the ResultSet?
Thanks in advance,
UPDATE:
PostgreSQL implements blobs (large objects, not bytea) in a special (maybe nonstandard) way. It stores all blobs in the pg_largeobject table and uses an oid as a "pointer" so you can reference the blob from your real table.
The Postgres JDBC driver has a separate API to handle blobs. More details in the following link:
http://jdbc.postgresql.org/documentation/91/binary-data.html
You can use:
Connection connection = rs.getStatement().getConnection();
But also check this method to get the BLOB data directly from the ResultSet:
rs.getBinaryStream("myBlobColumn");
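Putting the pieces from the question together, a hedged sketch of the type handler method; the large-object calls follow the PostgreSQL JDBC driver's org.postgresql.largeobject API (per the documentation linked above), and the temp-file naming is arbitrary:
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import org.postgresql.PGConnection;
import org.postgresql.largeobject.LargeObject;
import org.postgresql.largeobject.LargeObjectManager;

@Override
public File getNullableResult(ResultSet rs, String columnName) throws SQLException {
    // 1. get the current connection from the ResultSet
    Connection connection = rs.getStatement().getConnection();
    // 2. get the PostgreSQL LargeObjectManager from that connection
    LargeObjectManager lobManager =
            connection.unwrap(PGConnection.class).getLargeObjectAPI();
    // 3. read the oid that points at the large object
    long oid = rs.getLong(columnName);
    if (rs.wasNull()) {
        return null;
    }
    // 4. stream the large object into a temporary file
    LargeObject lob = lobManager.open(oid, LargeObjectManager.READ);
    try (InputStream in = lob.getInputStream()) {
        File file = File.createTempFile("blob-", ".bin");
        Files.copy(in, file.toPath(), StandardCopyOption.REPLACE_EXISTING);
        // 5. return the file
        return file;
    } catch (IOException e) {
        throw new SQLException("Could not copy large object to a file", e);
    } finally {
        lob.close();
    }
}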

How do I map two entities in ColdFusion ORM that are in different schemas?

I have two tables that exist in the same Oracle database system but different schemas, which I've mapped like this:
ABC.Store:
component schema="ABC" table="Stores"
{
property name="Id" fieldtype="id" generator="sequence" sequence="store_id_seq";
property name="Products" fieldtype="one-to-many" cfc="Product";
}
DEF.Product:
component schema="DEF" table="Products"
{
property name="Id" fieldtype="id" generator="sequence" sequence="product_id_seq";
}
I set my application's default datasource as this.datasource = "ABC" in application.cfc.
The problem I'm running into occurs whenever I try to save a Product. ColdFusion spits out an error that says the sequence cannot be found for the Id property on Product. This is because the product_id_seq sequence is in the DEF schema, but ColdFusion is trying to find it in the ABC schema, even though I set the schema on the Product as DEF.
If I set the datasource attribute on Product to DEF, I then get an error that says the Products property on Store is unmapped. This is because, as the ColdFusion documentation states:
"Since a Hibernate configuration uses a single data source, all related CFCs (using ORM relationships) must have the same data source."
My question then is, how do I map the two tables in two different schemas, using a sequence as an ID generator?
I've been able to get it to work if I specify the schema for the sequence:
property name="Id" fieldtype="id" generator="sequence" sequence="def.product_id_seq";
But this is hard-coded and I'd like it to be dynamic and pull the schema name from a configuration bean.
The only way I've been able to get this to work seamlessly was to:
Create a single user in the database, in this case MySQL, that has access to the desired schemas.
Set up and configure a single datasource in CFIDE that utilizes the newly created user for authentication.
Set the datasource attribute in all desired persistent objects to the newly created datasource.
Set the schema attribute in all desired persistent objects to reference the correct schema, or database. (the two are synonymous in ColdFusion ORM)
Note: Be sure to use full component path when referencing CFCs in your COM.

Pentaho reporting connectionFactory has new method added but no description

I am in the process of upgrading Pentaho Reporting from 3.6.1 to 3.8.0 in my web application. When I updated all the necessary jar files, I got a compilation error in one of my classes, which implements ConnectionProvider. Following is my class:
public class DataSourceConnectionProvider implements ConnectionProvider
{
....
}
The error says that my class should implement the getConnectionHash() method as it is defined in the ConnectionProvider interface, but it was not there in the 3.6.1 version. So I am a bit confused about why they have added it and how to implement it in my class.
This method returns an object that is comparable and hashable and is used during the caching of datasources. It allows us to build some sort of key to detect changes in the connection definition while many reports run within the same JVM.
The cache implementation itself does not know any of the details of the various datasources, and the "ConnectionHash" allows us to keep result-sets separate.
My basic implementation of it simply returns an ArrayList with all relevant connection properties added to it.
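A hedged sketch of that idea, assuming the interface declares the method as returning Object as described above (the properties added to the list are invented; use whatever actually defines your connection):
import java.util.ArrayList;
import java.util.List;

// Sketch of getConnectionHash(): return something with stable equals()/hashCode()
// that captures the connection definition. jndiName and username are illustrative
// fields of this hypothetical provider.
public Object getConnectionHash() {
    List<Object> hash = new ArrayList<Object>();
    hash.add(getClass().getName()); // distinguish this provider type
    hash.add(jndiName);             // e.g. the JNDI name of the underlying DataSource
    hash.add(username);             // plus any other property that changes the connection
    return hash;
}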
A simple example of how and where it is needed:
Imagine you have a JDBC datasource that connects to a database where several schemas with the same table structures exist, for example in a multi-tenant environment where each tenant has its own schema.
With a query like "SELECT * FROM CUSTOMERS WHERE COUNTRY = ${country-parameter}", the datasource will return different datasets based on which tenant performs the query. The sum of "connection-hash", "query-name" and "parameters used in the query" now forms a unique identifier that we can use to store and later look up the result set from the cache.
