Spring Data JDBC Identifier Processing

Use case: connecting to both a MySQL and an Oracle database.
Issue: if I annotate either one of the data sources as @Primary, Spring Data JDBC always uses the primary database's IdentifierProcessing and forms the query based on that.
MySQL
@Bean
@Primary
@Qualifier("mySqlJdbcConverter")
public JdbcConverter mySqlJdbcConverter(JdbcMappingContext mappingContext, @Lazy RelationResolver relationResolver,
        @Qualifier("mysqlJdbcOperationsReference") NamedParameterJdbcOperations mysqlJdbcOperationsReference) {
    DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(
            mysqlJdbcOperationsReference.getJdbcOperations());
    return new BasicJdbcConverter(mappingContext, relationResolver, mySqlJdbcCustomConversions(), jdbcTypeFactory,
            IdentifierProcessing.create(new Quoting("`"), LetterCasing.UPPER_CASE));
}

@Bean
@Primary
@Qualifier("mySqlJdbcDialect")
public Dialect mySqlJdbcDialect(final JdbcConverter jdbcConverter) {
    return MySqlDialect.INSTANCE;
}
Oracle
@Bean
@Qualifier("oracleJdbcConverter")
public JdbcConverter oracleJdbcConverter(JdbcMappingContext mappingContext, @Lazy RelationResolver relationResolver,
        @Qualifier("oracleJdbcOperationsReference") NamedParameterJdbcOperations oracleJdbcOperationsReference) {
    DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(
            oracleJdbcOperationsReference.getJdbcOperations());
    return new BasicJdbcConverter(mappingContext, relationResolver, oracleJdbcCustomConversions(), jdbcTypeFactory,
            IdentifierProcessing.create(new Quoting("\""), LetterCasing.UPPER_CASE));
}

@Bean
@Qualifier("oracleJdbcDialect")
@RequestScope
public Dialect oracleJdbcDialect(final JdbcMappingContext jdbcMappingContext) {
    return OracleDialect.INSTANCE;
}
With this setup, the generated query always carries the backquote character: even when connecting to the Oracle database, identifiers are quoted with backquotes.
Query:
SELECT `service`.`SERVICE_ID` AS `SERVICE_ID`, `service`.`SERVICE_NAME` AS `SERVICE_NAME` FROM `service`
May I know why it is happening?

The Dialect is not picked up as a bean from the ApplicationContext. If you want to use your own Dialect, you need to do the following:
1. Implement your own Dialect.
2. Implement a JdbcDialectProvider returning that Dialect.
3. Register the provider by putting a file spring.factories in the META-INF folder of your classpath and adding the line org.springframework.data.jdbc.repository.config.DialectResolver$JdbcDialectProvider=<fully qualified name of your JdbcDialectProvider>
See https://spring.io/blog/2020/05/20/migrating-to-spring-data-jdbc-2-0#dialects
But you really shouldn't have to do that, since dialects for Oracle and MySQL are already provided out of the box.
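For illustration, the provider from step 2 might look like the following sketch. The package and class names are hypothetical, and this assumes the DialectResolver.JdbcDialectProvider SPI shape from Spring Data JDBC 2.x; check the version you actually use.

```java
package com.example.dialect; // hypothetical package

import java.util.Optional;

import org.springframework.data.jdbc.repository.config.DialectResolver;
import org.springframework.data.relational.core.dialect.Dialect;
import org.springframework.data.relational.core.dialect.OracleDialect;
import org.springframework.jdbc.core.JdbcOperations;

// Sketch of a custom provider: decide which Dialect to use, optionally by
// inspecting the connection's metadata through the given JdbcOperations.
public class MyDialectProvider implements DialectResolver.JdbcDialectProvider {

    @Override
    public Optional<Dialect> getDialect(JdbcOperations operations) {
        return Optional.of(OracleDialect.INSTANCE);
    }
}
```

Step 3 is then a single line in META-INF/spring.factories:
org.springframework.data.jdbc.repository.config.DialectResolver$JdbcDialectProvider=com.example.dialect.MyDialectProvider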

Related

Spring Boot 2 + JdbcTemplate - is there a way to provide SQL query for each database supported?

I'm working on a Spring Boot 2.4.2 based project, using spring-boot-starter-jdbc and com.oracle.database.jdbc for the Oracle JDBC driver.
As I use JdbcTemplate to interact with the DB, everything seems clean and easy. But I may need to support multiple database types in the future: Oracle, SQL Server, MySQL, DB2, etc.
I did quite a bit of Googling but did not find any option for this.
As mentioned above, I am using Spring JDBC (not Spring Data JDBC or Spring Data JPA). How do I provide the SQL queries specific to each supported database in code or configuration?
Please let me know your thoughts. Thanks.
I'm not familiar with Spring JDBC, but you could use Spring's dependency injection mechanism to create a profile for each database.
First, an interface:
public interface DbQueries {
    String createQueryForSelectingUsers();
}
Then implement the interface for each supported database:
@Profile("mysql")
@Component
public class MySqlDbQueries implements DbQueries {

    @Override
    public String createQueryForSelectingUsers() {
        return "SELECT * FROM USER";
    }
}
Second example:
@Profile("oracle")
@Component
public class OracleDbQueries implements DbQueries {

    @Override
    public String createQueryForSelectingUsers() {
        return "SELECT * FROM USER";
    }
}
Afterwards, use it where you need it:
public class MyRepository {

    private DbQueries dbQueries;

    // The DbQueries implementation will be injected based on your active profile
    public MyRepository(DbQueries dbQueries) {
        this.dbQueries = dbQueries;
    }

    public void printAllUsers() {
        String query = dbQueries.createQueryForSelectingUsers();
        // stuff to execute the query
    }
}
Remember to start your app with a profile, e.g. --spring.profiles.active=mysql, or add the active profile to application.properties:
spring.profiles.active=mysql

how to use EnumCodec in r2dbc-postgresql

I am trying to use EnumCodec from the latest version of r2dbc-postgresql (0.8.4), unsuccessfully, and I wondered if you could help me.
I am also using spring-data-r2dbc version 1.1.1.
I took the exact example from GitHub: I created an enum type "my_enum" in my Postgres database, and a table "sample_table" which contains 'name' (text) and 'value' (my_enum).
Then I did as in the example:
SQL:
CREATE TYPE my_enum AS ENUM ('FIRST', 'SECOND');
Java Model:
enum MyEnumType {
    FIRST, SECOND;
}
Codec Registration:
PostgresqlConnectionConfiguration.builder()
        .codecRegistrar(EnumCodec.builder().withEnum("my_enum", MyEnumType.class).build());
I use DatabaseClient in order to communicate with the DB.
I tried to insert using two methods:
databaseClient.insert().into(SampleTable.class)
        .using(sampleTable).fetch().rowsUpdated();
or:
databaseClient.insert().into("sample_table")
        .value("name", sampleTable.getName())
        .value("value", sampleTable.getValue())
        .then();
where SampleTable is:
@Data
@AllArgsConstructor
@NoArgsConstructor
@Builder
@Table("sample_table")
@JsonIgnoreProperties(ignoreUnknown = true)
@JsonInclude(JsonInclude.Include.NON_NULL)
public class SampleTable implements Serializable {

    private String name;

    @Column("value")
    @JsonProperty("value")
    private MyEnumType value;
}
But I get the same error using both:
column "value" is of type my_enum but expression is of type character varying
Can you please help me understand what I did wrong, or refer me to some working example?
I appreciate your help!
Spring Data considers enum values as values to be converted to String by default. You need to register a Converter that retains the type by writing the enum type as-is:
@WritingConverter
class MyEnumTypeConverter implements Converter<MyEnumType, MyEnumType> {

    @Override
    public MyEnumType convert(MyEnumType source) {
        return source;
    }
}
Next, you need to register the converter. If you're using Spring Data R2DBC's AbstractR2dbcConfiguration, then override getCustomConverters():
class MyConfiguration extends AbstractR2dbcConfiguration {

    @Override
    protected List<Object> getCustomConverters() {
        return Collections.singletonList(new MyEnumTypeConverter());
    }

    // …
}
Alternatively, if you configure DatabaseClient standalone, then you need a bit more code:
PostgresqlConnectionConfiguration configuration = PostgresqlConnectionConfiguration.builder()
        .codecRegistrar(EnumCodec.builder().withEnum("my_enum", MyEnumType.class).build())
        .host(…)
        .username(…)
        .password(…)
        .database(…)
        .build();

R2dbcDialect dialect = PostgresDialect.INSTANCE;

DefaultReactiveDataAccessStrategy strategy = new DefaultReactiveDataAccessStrategy(dialect,
        Collections.singletonList(new MyEnumTypeConverter()));

DatabaseClient databaseClient = DatabaseClient.builder()
        .connectionFactory(new PostgresqlConnectionFactory(configuration))
        .dataAccessStrategy(strategy)
        .build();
However, there are two bugs in the R2DBC driver that prevent Spring Data from working as expected:
Row.decode(…) fails for enum type with IllegalArgumentException: 72093 is not a valid object id (#301)
EnumCodec decoding fails if the requested value type is Object (#302)
As a temporary workaround, you can duplicate EnumCodec in your codebase and apply the fix from #302 until a new release of R2DBC Postgres is available.
I have tried to use the pg enum type and a Java Enum class in my sample projects.
If you are using the DatabaseClient API (in Spring 5.3 core, not Spring Data R2dbc), registering an EnumCodec in the PostgresqlConnectionFactory is enough.
Check my example.
If you create a pg enum type as a column type in the table schema and register an EnumCodec in the PostgresqlConnectionFactory builder, you also need to write a custom @ReadingConverter to read the custom enum.
Check my example here.
If you use a text-based type (varchar) in the table schema with a Java Enum, no extra conversion effort is needed; check my example here.
The Spring Data R2dbc docs say that if you use the driver's built-in mechanism to handle enums, you have to register an EnumWriteSupport. But in my experience, when using Spring Data R2dbc the write side is handled automatically, while a reading converter is required to read the enum from Postgres.
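A minimal sketch of such a reading converter, reusing the MyEnumType from the question (the class name is hypothetical, and this assumes the value arrives from the driver as the enum's String label):

```java
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;

// Hypothetical sketch: map the label the driver returns to the Java enum.
@ReadingConverter
class MyEnumTypeReadingConverter implements Converter<String, MyEnumType> {

    @Override
    public MyEnumType convert(String source) {
        return MyEnumType.valueOf(source);
    }
}
```

Register it the same way as the writing converter, e.g. via getCustomConverters() in your AbstractR2dbcConfiguration.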

Table name configured with external properties file

I'm building a Spring Boot application that accesses a database and extracts data from it. Everything is working fine, but I want to configure the table names from an external .properties file, like:
@Entity
@Table(name = "${fleet.table.name}")
public class Fleet {
    ...
}
I tried to find something but I didn't.
You can access external properties with the @Value("...") annotation.
So my question is: is there any way I can configure the table names? Or can I change/intercept the query that is sent by Hibernate?
Solution:
OK, Hibernate 5 works with the PhysicalNamingStrategy, so I created my own PhysicalNamingStrategy.
@Configuration
public class TableNameConfig {

    @Value("${fleet.table.name}")
    private String fleetTableName;

    @Value("${visits.table.name}")
    private String visitsTableName;

    @Value("${route.table.name}")
    private String routeTableName;

    @Bean
    public PhysicalNamingStrategyStandardImpl physicalNamingStrategyStandard() {
        return new PhysicalNamingImpl();
    }

    class PhysicalNamingImpl extends PhysicalNamingStrategyStandardImpl {

        @Override
        public Identifier toPhysicalTableName(Identifier name, JdbcEnvironment context) {
            switch (name.getText()) {
                case "Fleet":
                    return new Identifier(fleetTableName, name.isQuoted());
                case "Visits":
                    return new Identifier(visitsTableName, name.isQuoted());
                case "Result":
                    return new Identifier(routeTableName, name.isQuoted());
                default:
                    return super.toPhysicalTableName(name, context);
            }
        }
    }
}
Also, this Stack Overflow question about NamingStrategy gave me the idea.
Table names really come from Hibernate itself via its strategy interfaces. Boot configures this as SpringNamingStrategy, and there were some changes in Boot 2.x in how things can be customised. It's worth reading gh-1525, where these changes were made. Configure Hibernate Naming Strategy has some more info.
There were some ideas to add custom properties to configure SpringNamingStrategy, but we went with allowing easier customisation of the whole strategy bean, as that allows users to do whatever they need to do.
AFAIK there's no direct way to do the config you asked about, but I'd assume that if you create your own strategy you can then auto-wire your own properties into it. As those customised strategy interfaces see the entity name, you could reserve a keyspace in Boot's configuration properties for this and match entity names:
mytables.naming.fleet.name=foobar
mytables.naming.othertable.name=xxx
Your configuration properties would take mytables, and within that naming would be a Map. Then in your custom strategy you would simply check the mapping table to see whether a custom name is defined.
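The lookup described above can be sketched in plain Java (the class and method names are illustrative, not from Boot itself): the strategy would hold a Map bound from the mytables.naming.* properties and fall back to the entity name when no override exists.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: resolve a physical table name from configured overrides,
// falling back to the entity name when none is defined.
class TableNameOverrides {

    private final Map<String, String> naming = new HashMap<>();

    TableNameOverrides(Map<String, String> naming) {
        this.naming.putAll(naming);
    }

    String resolve(String entityName) {
        // Keys mirror the property keyspace, e.g. "fleet" -> "foobar".
        return naming.getOrDefault(entityName.toLowerCase(), entityName);
    }
}
```

With the properties above, resolve("Fleet") would return "foobar", while an entity without an override keeps its default name.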
Spring Boot solution: create the class below.
@Configuration
public class CustomPhysicalNamingStrategy extends SpringPhysicalNamingStrategy {

    @Value("${table.name}")
    private String tableName;

    @Override
    public Identifier toPhysicalTableName(final Identifier identifier, final JdbcEnvironment jdbcEnv) {
        return Identifier.toIdentifier(tableName);
    }
}
Add the properties below to application.properties:
spring.jpa.properties.hibernate.physical_naming_strategy=<package.name>.CustomPhysicalNamingStrategy
table.name=product

Multiple datasources migrations using Flyway in a Spring Boot application

We use Flyway for DB migration in our Spring Boot based app, and now we have a requirement to introduce multi-tenancy support using a multiple-datasources strategy. As part of that, we also need to support migration of multiple data sources. All data sources should maintain the same structure, so the same migration scripts should be used for migrating all data sources. Also, migrations should occur on application startup (as opposed to build time, whereas it seems that the Maven plugin can be configured to migrate multiple data sources). What is the best approach to achieve this? The app already has data source beans defined, but Flyway executes the migration only for the primary data source.
To make @Roger Thomas's answer more the Spring Boot way:
The easiest solution is to annotate your primary datasource with @Primary (which you already did) and just let Boot migrate your primary datasource the 'normal' way.
For the other datasources, migrate those sources by hand:
@Configuration
public class FlywaySlaveInitializer {

    @Autowired private DataSource dataSource2;
    @Autowired private DataSource dataSource3;
    // other datasources

    @PostConstruct
    public void migrateFlyway() {
        Flyway flyway = new Flyway();
        // if the default config is not sufficient, call setters here

        // source 2
        flyway.setDataSource(dataSource2);
        flyway.setLocations("db/migration_source_2");
        flyway.migrate();

        // source 3
        flyway.setDataSource(dataSource3);
        flyway.setLocations("db/migration_source_3");
        flyway.migrate();
    }
}
Flyway supports migrations coded in Java, so you can start Flyway during your application startup.
https://flywaydb.org/documentation/migration/java
I am not sure how you would configure Flyway to target a number of data sources via its config files. My own development is based around using Java to call Flyway once per data source I need to work against. Spring Boot supports the autowiring of beans marked as @FlywayDataSource, but I have not looked into how this could be used.
For an in-Java solution, the code can be as simple as:
Flyway flyway = new Flyway();
// Set the data source
flyway.setDataSource(dataSource);
// Where to search for classes to be executed or SQL scripts to be found
flyway.setLocations("net.somewhere.flyway");
flyway.setTarget(MigrationVersion.LATEST);
flyway.migrate();
Having your same problem, I looked into the spring-boot-autoconfigure artifact for v2.2.4, in the org.springframework.boot.autoconfigure.flyway package, and I found an annotation: FlywayDataSource.
Annotating ANY datasource you want Flyway to use should do the trick.
Something like this:
@FlywayDataSource
@Bean(name = "someDatasource")
public DataSource someDatasource(...) {
    <build and return your datasource>
}
Found an easy solution for that: I added the migration step during the creation of my EntityManagerFactory.
@Qualifier(EMF2)
@Bean(name = EMF2)
public LocalContainerEntityManagerFactoryBean entityManagerFactory2(
        final EntityManagerFactoryBuilder builder
) {
    final DataSource dataSource = dataSource2();
    Flyway.configure()
            .dataSource(dataSource)
            .locations("db/migration/ds2")
            .load()
            .migrate();
    return builder
            .dataSource(dataSource)
            .packages(Role.class)
            .properties(jpaProperties2().getProperties())
            .persistenceUnit("domain2")
            .build();
}
I disabled spring.flyway.enabled for that.
SQL files live in resources/db/migration/ds1/... and resources/db/migration/ds2/...
This worked for me.
import javax.annotation.PostConstruct;
import org.flywaydb.core.Flyway;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FlywaySlaveInitializer {

    @Value("${firstDatasource.db.url}")
    String firstDatasourceUrl;

    @Value("${firstDatasource.db.user}")
    String firstDatasourceUser;

    @Value("${firstDatasource.db.password}")
    String firstDatasourcePassword;

    @Value("${secondDatasource.db.url}")
    String secondDatasourceUrl;

    @Value("${secondDatasource.db.user}")
    String secondDatasourceUser;

    @Value("${secondDatasource.db.password}")
    String secondDatasourcePassword;

    @PostConstruct
    public void migrateFlyway() {
        Flyway flywayIntegration = Flyway.configure()
                .dataSource(firstDatasourceUrl, firstDatasourceUser, firstDatasourcePassword)
                .locations("filesystem:./src/main/resources/migration.first")
                .load();

        Flyway flywayPhenom = Flyway.configure()
                .dataSource(secondDatasourceUrl, secondDatasourceUser, secondDatasourcePassword)
                .locations("filesystem:./src/main/resources/migration.second")
                .load();

        flywayIntegration.migrate();
        flywayPhenom.migrate();
    }
}
And in my application.yml, this property:
spring:
  flyway:
    enabled: false

How to specify DataSource in JdbcTemplate?

In my application.properties I have set:
datasource.test.driverClass=org.postgresql.Driver
datasource.test.url=jdbc:postgresql://localhost:5433/test
datasource.test.username=admin
datasource.test.password=admin
logging.level.com.eternity = DEBUG
In my controller, I am trying to execute an SQL query from a string like this:
String selectQueryPartOne = "SELECT name, (" + StringUtils.join(sumString, " + ") + ") AS 'Price' FROM house WHERE NOT (" + StringUtils.join(sumString, " IS NULL OR ") + " IS NULL);";
JdbcTemplate statement = new JdbcTemplate();
statement.queryForList(selectQueryPartOne);
This should work fine; however, I am receiving the following error:
java.lang.IllegalArgumentException: No DataSource specified
I've discovered that I need to call setDataSource on my statement object first. However, I have no idea where I can get this DataSource object. Could you help?
When you create the JdbcTemplate instance yourself, you are working outside of Spring dependency injection and therefore will not have the DataSource injected. You need to use the Spring-provided instance via autowiring, something like:
@Controller
public class MyController {

    @Autowired private JdbcTemplate jdbcTemplate;

    @RequestMapping("/")
    public String myAction() {
        // do stuff with the jdbc template
        return "someView"; // placeholder view name
    }
}
Also, the Spring and Spring Boot documentation are great resources for further study on working with Spring.
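Alternatively, if you really do want to construct the template yourself, you can build a DataSource from the same properties and pass it to the constructor. A minimal sketch, assuming spring-jdbc's DriverManagerDataSource and the property values shown in the question:

```java
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

// Sketch: build the DataSource by hand and hand it to the template,
// instead of relying on Spring to inject it.
JdbcTemplate buildTemplate() {
    DriverManagerDataSource dataSource = new DriverManagerDataSource();
    dataSource.setDriverClassName("org.postgresql.Driver");
    dataSource.setUrl("jdbc:postgresql://localhost:5433/test");
    dataSource.setUsername("admin");
    dataSource.setPassword("admin");
    return new JdbcTemplate(dataSource);
}
```

Note that DriverManagerDataSource opens a new connection per request and is meant for testing; a pooled DataSource is preferable in production.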