Migration to Spring Boot is causing problems with Spring Batch

I'm trying to convert an older application that so far only uses Spring to Spring Boot. I managed to get the application to run again. But I'm having problems with Spring Batch.
The application uses Spring + Spring Batch + Spring Security + Hibernate + Wicket + ojdbc and runs on Tomcat with an Oracle 19c database.
I'm fighting these Spring Batch problems:
Massive performance drop: a batch step that previously took 2 minutes now runs for an hour. This looks to me like it's caused by a slow database connection; Spring Batch spends a lot of time creating the objects in the ItemReader.
The Spring Batch job fails at the end with an "ObjectOptimisticLockingFailureException: Batch update returned unexpected row count from update". There is a versioned table that contains the status of the calculations. This status is updated multiple times during the batch job. The status update works in the beginning, but near the end the EntityManager returns a result with an outdated version, and updating it causes the exception.
Do you have any ideas or hints about what could cause these problems? The functionality was not changed; this should only be a migration to Spring Boot.
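For context, the failure mode described above is typical of a JPA entity with a @Version column. A minimal sketch of the kind of versioned entity involved (the class and field names are hypothetical, not taken from the application):

    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.persistence.Version;

    @Entity
    public class CalculationStatus {

        @Id
        private Long id;

        // Optimistic-lock column: Hibernate appends "AND version = ?" to every
        // UPDATE. If the in-memory version is stale, the UPDATE matches zero rows
        // and Spring translates that into ObjectOptimisticLockingFailureException.
        @Version
        private Long version;

        private String status;

        // getters and setters omitted
    }

If a second transaction (or a second EntityManager holding a stale first-level cache) updates the same row mid-job, the version check fails exactly as described.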

Related

Spring Batch with unknown datasource

I have a working Spring Boot application that embeds a Spring Batch job. The job is not run on a schedule; instead we kick it off with an endpoint. It is working as it should. The basics of the batch are:
Kick the endpoint to start the job
Reader reads from input file
Processor reads from an Oracle database using a JPA repository and a simple Spring datasource config
Writer writes to output file
However there are new requirements:
The schema of the repository database is, from here on, unknown at application startup. The tables are the same; it is just an unknown schema. This is out of our control, and you might think it is stupid, but there are reasons for it and it can't be changed. With the current functionality this means we need to reconfigure the datasource when we know the new schema name and restart the application. This is a job that we will run a number of times when migrating from one system to another, so it has a limited lifecycle and we just need a "quick fix" to be able to use it without rewriting the whole app. So what I would like to do is:
Send the schema name as a query param to the application, put it in the job parameters, and then get a new datasource when the processor reads from the repository. Would this be doable at all using Spring Batch? Any help appreciated!
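One possible approach, sketched under the assumption that the schema name maps to connection settings: make the schema a job parameter and build a step-scoped DataSource from it. All names here (the config class, the bean, the "schema" parameter, the URL) are illustrative, not taken from the question:

    import javax.sql.DataSource;
    import org.springframework.batch.core.configuration.annotation.StepScope;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jdbc.datasource.DriverManagerDataSource;

    @Configuration
    public class SchemaScopedDataSourceConfig {

        // Step-scoped, so a fresh DataSource is created for each job run and
        // bound to the "schema" job parameter sent in via the endpoint.
        @Bean
        @StepScope
        public DataSource repositoryDataSource(
                @Value("#{jobParameters['schema']}") String schema) {
            DriverManagerDataSource ds = new DriverManagerDataSource();
            ds.setDriverClassName("oracle.jdbc.OracleDriver");
            ds.setUrl("jdbc:oracle:thin:@//dbhost:1521/SERVICE"); // placeholder URL
            ds.setUsername(schema); // assumes the schema is also the Oracle user
            ds.setPassword("changeme"); // placeholder
            return ds;
        }
    }

Note that rewiring an existing JPA repository onto such a datasource is harder; if only the schema differs on a shared connection, an alternative is to run ALTER SESSION SET CURRENT_SCHEMA = ... from a step listener before the processor reads.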

Does Spring Boot support fast fail on Liquibase

From the Spring DB initialization docs, Spring Boot DB initialization using Spring JDBC supports a fast-fail feature: if there are any issues in the DB init or migration scripts, Spring Boot context initialization fails, and as a result the Spring Boot server won't start.
Can we have this functionality when we use an advanced DB migration tool like Liquibase?
The Spring docs don't say anything about this in the Liquibase section. Does this feature only work with Spring JDBC initialization?
Whenever there is an issue in your Liquibase script, the application won't start, because the datasource is not properly initialised.
You can specify the desired behaviour using 'Preconditions'.
By default it will fail fast, but you can override that (example using SQL syntax):
-- preconditions onFail:WARN onError:WARN
Available parameters:
HALT: Immediately halt the execution of the entire changelog. [DEFAULT]
CONTINUE: Skip over the changeset. Execution of the changeset will be attempted again on the next update. Continue with the changelog.
MARK_RAN: Skip over the changeset, but mark it as executed. Continue with the changelog.
WARN: Output a warning and continue executing the changeset / changelog as normal.
see https://docs.liquibase.com/concepts/advanced/preconditions.html
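Putting it together, a minimal formatted-SQL changelog sketch (the author, table, and check are placeholders, not from the question); with onFail/onError at the HALT default, a failing precondition stops the update and the Spring Boot context fails to start:

    --liquibase formatted sql

    --changeset someauthor:1
    --preconditions onFail:HALT onError:HALT
    --precondition-sql-check expectedResult:0 SELECT COUNT(*) FROM my_table
    CREATE TABLE my_table (id INTEGER);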

Unwanted JOB_EXECUTION_PARAMS in Spring Batch

I am facing a strange situation. I am developing a Spring Boot app that uses Spring Batch to handle some jobs. For some reason, an additional row is being inserted into BATCH_JOB_EXECUTION_PARAMS in my DB, but this only happens if the Spring Boot application runs with any arguments (args).
Any ideas why this happens?
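A likely explanation (an assumption, since the poster's configuration isn't shown): Spring Boot's batch auto-configuration runs jobs at startup through an application runner that forwards command-line arguments to the job as JobParameters, and those get persisted in BATCH_JOB_EXECUTION_PARAMS. A sketch of taking control of the launch so only chosen parameters are recorded, after disabling the auto-run with spring.batch.job.enabled=false:

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.JobParameters;
    import org.springframework.batch.core.JobParametersBuilder;
    import org.springframework.batch.core.launch.JobLauncher;
    import org.springframework.stereotype.Component;

    @Component
    public class ManualJobRunner {

        private final JobLauncher jobLauncher;
        private final Job job; // whatever job bean the app defines

        public ManualJobRunner(JobLauncher jobLauncher, Job job) {
            this.jobLauncher = jobLauncher;
            this.job = job;
        }

        public void run() throws Exception {
            // Only the parameters built here end up in BATCH_JOB_EXECUTION_PARAMS;
            // command-line args are no longer forwarded to the job.
            JobParameters params = new JobParametersBuilder()
                    .addLong("run.id", System.currentTimeMillis())
                    .toJobParameters();
            jobLauncher.run(job, params);
        }
    }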

Spring DataSourceTransactionManager makes saves fail

I am working on a Spring Boot (2.0.2.RELEASE) application. Recently I enabled transaction management and created a DataSourceTransactionManager in the app configuration. We are using Spring Data JPA with Hibernate.
I noticed some time later that many (but not all) saves were not actually happening, even though all log entries stated everything was fine. The only affected saves were those made by a Quartz job; some of them were successful, but most were not, and there were no errors in the logs.
I have since removed the transaction manager, and now everything works as expected.
I now assume that DataSourceTransactionManager isn't meant to be used with JPA or Hibernate, but I still don't know what to use instead or why this error happened.
Any help is appreciated.
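For reference, with Spring Data JPA the transaction manager should be a JpaTransactionManager, which Spring Boot auto-configures when JPA is on the classpath. Declaring a plain DataSourceTransactionManager commits at the JDBC level and bypasses the EntityManager, so Hibernate's pending changes may never be flushed, matching the silent lost saves described. A minimal sketch of declaring it explicitly:

    import javax.persistence.EntityManagerFactory; // jakarta.persistence on Boot 3.x

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.orm.jpa.JpaTransactionManager;
    import org.springframework.transaction.PlatformTransactionManager;

    @Configuration
    public class TransactionConfig {

        // Commits through the EntityManager, so entity changes are flushed
        // before the underlying JDBC transaction commits.
        @Bean
        public PlatformTransactionManager transactionManager(EntityManagerFactory emf) {
            return new JpaTransactionManager(emf);
        }
    }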

Java Spring Boot Batch - Need some advice on design

I am a newbie in Java and trying to implement a Spring Boot batch application.
My requirement is to check for some data in the database (one part) and delete it if found (another part).
I am planning to implement a Spring Boot batch job for this.
I will have one job with 2 steps. Can step 2 be executed only if step 1 finds some data? Can I achieve this in Spring Batch? Or what is the best way to achieve this, keeping in mind that I have to schedule it to run weekly?
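Conditional flows like this are expressible in Spring Batch with exit statuses. A minimal sketch in the Spring Batch 5 builder style (the job, step, and status names are illustrative; a StepExecutionListener on step 1 would set the custom exit status after checking the data):

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.job.builder.JobBuilder;
    import org.springframework.batch.core.repository.JobRepository;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class CleanupJobConfig {

        @Bean
        public Job cleanupJob(JobRepository jobRepository, Step checkStep, Step deleteStep) {
            return new JobBuilder("cleanupJob", jobRepository)
                    .start(checkStep)
                        .on("DATA_FOUND").to(deleteStep) // custom exit status set by a listener
                    .from(checkStep)
                        .on("*").end()                   // otherwise finish the job
                    .end()
                    .build();
        }
    }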
With just a scheduled job to find and delete records from the DB, I don't suggest using Spring Batch. Spring has a good way of doing this without Batch, using scheduled tasks. You can see an example here. Use Spring Batch only if you need to run batch jobs that can't be handled with normal operations.
If you need a complex scheduler, you can use the Quartz scheduler with Spring.
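For the scheduled-task route suggested above, a minimal sketch (the repository type and query method are hypothetical):

    import java.util.List;

    import org.springframework.scheduling.annotation.Scheduled;
    import org.springframework.stereotype.Component;

    @Component
    public class WeeklyCleanupTask {

        private final StaleRecordRepository repository; // hypothetical Spring Data repository

        public WeeklyCleanupTask(StaleRecordRepository repository) {
            this.repository = repository;
        }

        // Cron: every Sunday at 02:00. Scheduling must be enabled with
        // @EnableScheduling on a configuration class.
        @Scheduled(cron = "0 0 2 * * SUN")
        public void findAndDelete() {
            List<StaleRecord> found = repository.findRecordsToDelete(); // hypothetical query
            repository.deleteAll(found);
        }
    }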
