I have a Spring Batch Maven project deployed on a Unix server.
I know there are already a couple of questions on this topic, and I have tried all of those solutions. I have tried adding the date, and even the time in milliseconds, as a job parameter to keep each run unique. I am testing something and have to trigger the job manually many times a day. I created a folder on the Unix server, compiled my Spring Maven project into a jar file, and moved it to the server. But whenever I run the Spring Batch job there, giving a time in application.properties, it throws a JobInstanceAlreadyCompleteException.
One more surprising thing: this issue does not happen locally, only on the server.
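For reference, this is roughly how the job gets its timestamp parameter (a sketch with illustrative names, not my exact code):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

// job and jobLauncher are injected Spring beans
JobParameters params = new JobParametersBuilder()
        .addLong("run.time", System.currentTimeMillis()) // intended to make each JobInstance unique
        .toJobParameters();
jobLauncher.run(job, params);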
Related
In our Linux environment, applications are restarted periodically (the reasons aren't important here). It would be convenient for us to deploy new versions of an application by copying the application's Spring Boot jar on top of the existing (old) jar, thereby overwriting it, and then simply waiting for the application (that is, the JVM running it) to restart.
However, this does not seem to work. We get different kinds of errors: sometimes the app just hangs, sometimes we get a ClassNotFoundException. It's as if Spring Boot (or something inside Spring Boot) reopens the jar and expects it to be the same one it was when the application was originally started.
We had a look through Spring's common application properties but didn't see anything appropriate. Is there a way to make this work? When we were using WAR files, we configured the servlet container to unpack the WAR file and run from the unpacked version. Can we do something similar with Spring Boot?
First of all, the errors you are experiencing can come from multiple sources. Usually, replacing a file out from under a running process is not a big problem, because the whole file is loaded into memory before execution. Java is a little different: the actual process that is running is the JVM, and it loads the jar file from disk lazily. Classes are loaded only on demand, so if a class has not been loaded before, the JVM will try to load it at that point and will most likely fail if the jar file has changed in the meantime. In the case of Spring Boot there are also other resources (such as HTML files) inside the jar that are loaded dynamically.
You mentioned you are using a Linux environment. If you can replace your startup command with a script, you can copy the jar and start it from the copied location:
#!/bin/bash
JAR_NAME="spring-boot.jar"
NEW_JAR_NAME=".$JAR_NAME"        # use an appropriate (e.g. hidden) name here

cp "$JAR_NAME" "$NEW_JAR_NAME"   # run from a private copy so the original can be replaced
java -jar "$NEW_JAR_NAME"
rm "$NEW_JAR_NAME"               # clean up the copy once the JVM exits
Now every time you start the application, a copy is made and the application is started from that copy. You can replace the original jar at any time, and on the next restart the new version will be loaded.
You could also use rsync instead of cp to avoid copying the same jar twice when the application is restarted multiple times without the jar having changed.
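For example, replacing the cp line with a checksum-based rsync (a sketch, using the same variable names as above):

rsync --checksum "$JAR_NAME" "$NEW_JAR_NAME"   # copies only when the content actually differs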
It would be convenient for us to deploy new versions of an application by copying the Spring Boot jar on top of the existing (old) jar and then simply wait for the application to restart
Why would you do such a thing to yourself? You are trying to solve a use case that goes against best practices; it sounds like asking for trouble just to avoid an app restart. When you do a deployment, you need to make sure the deployment actually went through; otherwise, how will you troubleshoot when something goes wrong in your application? You will have one more variable on your hands: uncertainty about which version of the code is currently running.
If you are having downtime while deploying (I am assuming that's why you want to limit the restarts), why not bring up another instance with the newer version of the code and, once it is healthy, shut down the old one?
I am new to Spring Boot, coming from the PHP world. In PHP development, it is simple to change a file, upload it, and run.
But with Spring Boot, my development relies on a remote Ubuntu server: every time I change a *.java file, I have to build the fat jar, upload the jar, kill the current Java process on the Ubuntu box, and run java -jar my.jar again. Much of that time is spent on the upload, because the jar is about 60 MB.
Is there any way I can work like in PHP, just uploading the changed file, so that Spring Boot compiles the class and runs it?
Would switching the build to a *.war help speed up deployment?
There are a few options to mitigate the round trip of building the jar file and uploading it.
Hot swap: for minor changes, you can hot-swap them automatically when you have a remote debugger attached. I use IntelliJ as my IDE, which provides this out of the box after a file is recompiled; see this link for how to enable it.
Reloading tool: use a tool designed to reload Java classes, such as JRebel, which extends the classloader and updates a class when a change is detected. However, such tools are often only available in a paid version.
Spring Boot devtools: this tool also monitors changes and restarts the application with them (so there is no need to rebuild the jar file). It can also be used with a remote application; see this link for more info, and the sketch below.
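To give a concrete idea, the remote devtools setup looks roughly like this (the secret value and server URL are placeholders, and I'm assuming the standard property names):

# on the server, in application.properties (with spring-boot-devtools on the classpath)
spring.devtools.remote.secret=changeme

# on your local machine, run the devtools remote client against the deployed app;
# it pushes recompiled classes to the server, which then restarts itself
java -cp <your-project-classpath> org.springframework.boot.devtools.RemoteSpringApplication http://my-ubuntu-server:8080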
Using a war file is a different concept, since a war file is executed inside an application container (e.g. a WildFly server). You can dynamically upload a war file to a running application server, which will redeploy just that war. I'm not sure this leads to faster deployment, but it is a different approach to how the application is run.
I'm facing a very weird issue while integrating Flyway DB migration with a Spring Boot application.
When I run the application from the executable WAR on the command line, it creates a new DB at application start-up.
Now, if I switch to running the application from the IDE (i.e. from STS), it fires all the scripts from my db/migration folder again. I can see the installed_on column time change every time I switch between these two run modes. I have tried enabling the baselineOnMigrate property, but it had no effect.
Do you think it is something related to Spring Boot's embedded Tomcat? Both runs create their own embedded Tomcat instance.
Please find my Spring Boot application.properties below:
mssql.dbname=issueDB
mssql.password=password
mssql.dbserver=localhost
mssql.port=1501
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource.url=jdbc:sqlserver://${mssql.dbserver}:${mssql.port};databaseName=${mssql.dbname}
spring.datasource.username=user
spring.datasource.password=${mssql.password}
spring.flyway.baselineOnMigrate=true
spring.flyway.locations=classpath:db/migration/testissue
spring.flyway.out-of-order=true
spring.flyway.baseline-version=1.3
spring.flyway.placeholder-prefix=$
spring.flyway.placeholder-suffix=$
spring.flyway.mixed=true
spring.flyway.cleanOnValidationError=true
I suppose it could be caused by this property: spring.flyway.cleanOnValidationError=true. According to the docs:
Whether to automatically call clean or not when a validation error occurs.
This is exclusively intended as a convenience for development. Even though we strongly recommend not to change migration scripts once they have been checked into SCM and run, this provides a way of dealing with this case in a smooth manner. The database will be wiped clean automatically, ensuring that the next migration will bring you back to the state checked into SCM.
It may be that, by running your application in different ways against the same database, you trigger validation problems, and Flyway then just cleans your database and re-applies the current state of the scripts.
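If that turns out to be the cause, a first step would be to turn the flag off (a sketch; keep the rest of your configuration as-is):

spring.flyway.cleanOnValidationError=false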
We have Spring 4 and Spring Batch 3, and our app consumes CSV files as input. Currently we kick off the jobs manually from the command line, using CommandLineJobRunner with parameters, including the name of the file to process.
I want to kick off a job to process the file asynchronously as soon as the input file arrives in a monitored directory. How can we do that?
You may use java.nio.file.WatchService to monitor the directory for a file.
Once a file appears, you may start the actual processing (or kick off a job to process it asynchronously), for example along these lines:
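A minimal sketch (the importJob and jobLauncher beans, the directory path, and the parameter names are illustrative, not from your setup):

import java.nio.file.*;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

public class InputDirectoryWatcher {

    private final JobLauncher jobLauncher;
    private final Job importJob;

    public InputDirectoryWatcher(JobLauncher jobLauncher, Job importJob) {
        this.jobLauncher = jobLauncher;
        this.importJob = importJob;
    }

    public void watch(Path dir) throws Exception {
        WatchService watcher = FileSystems.getDefault().newWatchService();
        dir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);
        while (true) {
            WatchKey key = watcher.take(); // blocks until an event arrives
            for (WatchEvent<?> event : key.pollEvents()) {
                Path file = dir.resolve((Path) event.context());
                JobParameters params = new JobParametersBuilder()
                        .addString("input.file", file.toAbsolutePath().toString())
                        .addLong("run.time", System.currentTimeMillis()) // keep each JobInstance unique
                        .toJobParameters();
                jobLauncher.run(importJob, params);
            }
            key.reset(); // re-arm the key for further events
        }
    }
}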
You may also use FileReadingMessageSource.WatchServiceDirectoryScanner from Spring Integration (https://docs.spring.io/spring-integration/reference/html/files.html#watch-service-directory-scanner).
Comparing the release notes of Spring Batch (https://github.com/spring-projects/spring-batch/releases) with those of Spring Integration (https://github.com/spring-projects/spring-integration/releases), it looks like Spring Integration is released more often. It also has more features and integration points.
In this case, though, it looks like overkill to bring in Spring Integration if you just need to watch a directory for a file.
I would recommend using the powerful combination of Spring Batch with Spring Integration. For example, you can use a file inbound channel adapter from Spring Integration to monitor a directory and start a Spring Batch job as soon as the input file arrives.
There is a code example for this typical use case in the Spring Batch reference documentation here: https://docs.spring.io/spring-batch/4.0.x/reference/html/spring-batch-integration.html#launching-batch-jobs-through-messages
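The gist of that example, from memory (so treat this as a sketch rather than a verbatim copy), is a transformer that turns the incoming file message into a JobLaunchRequest:

import java.io.File;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.integration.launch.JobLaunchRequest;
import org.springframework.integration.annotation.Transformer;
import org.springframework.messaging.Message;

public class FileMessageToJobRequest {

    private Job job;
    private String fileParameterName;

    public void setJob(Job job) { this.job = job; }
    public void setFileParameterName(String fileParameterName) { this.fileParameterName = fileParameterName; }

    @Transformer
    public JobLaunchRequest toRequest(Message<File> message) {
        // pass the incoming file's path to the job as a parameter
        JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
        jobParametersBuilder.addString(fileParameterName, message.getPayload().getAbsolutePath());
        return new JobLaunchRequest(job, jobParametersBuilder.toJobParameters());
    }
}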
I hope this helps.
Just looking for some information on whether others have solved this pattern. I want to use Spring Integration and Spring Batch together. Both are Spring Boot applications, and ideally I'd like to keep them, and their respective configuration, separated, so that each is its own executable jar. I'm having problems executing them in their own process space; unless someone can convince me otherwise, I believe I want each to run as its own Spring Boot app and initialize itself with its own profiles and properties. What I'm having trouble with is the invocation of the job in my Spring Batch project from my Spring Integration project. At first I couldn't get the properties loaded from the batch project, so I realized I need to pass spring.profiles.active as a job parameter, and that seemed to solve that. But other things in the Spring Boot batch application aren't loading correctly, like the schema-platform.sql file; the database isn't getting initialized, etc.
On this initial launch of the job, I might want the response to go back to Spring Integration for some messaging on job status. There might also be times when I want to run a job without Spring Integration kicking it off, but still take advantage of sending statuses back to the Spring Integration project, provided it's listening on a channel or something.
I've reviewed quite a few Spring samples and have yet to find my exact scenario; most have the two dependencies in the same project. So maybe I'm attempting something that's not possible, but I suspect I'm just missing a little something in the Spring configuration.
My questions/issues are:
I don't want the Spring Integration project to know anything about the Spring Batch configuration other than the job it's kicking off. I haven't found a good way to reference the Job bean without loading my entire batch configuration.
Should I keep these two projects separated, or would it be better to combine them, since I have two-way communication between the two?
How should the job be launched from the integration project? We're using the spring-batch-integration project with JobLaunchRequest and JobLauncher. This seems to run it in the same process as the Spring Integration project, and I'm missing a lot of my Spring Boot batch project's initialization.
Should I be using a CommandLineRunner instead to force it into another process?
Is SpringApplication.run(BatchConfiguration.class) the answer?
I'm looking for some general project configuration setup to meet these requirements.
Spring Cloud Data Flow, in combination with Spring Cloud Task, does exactly what you're asking. It launches Spring Cloud Task applications (which can contain batch jobs) as new processes on the platform of your choice. I'd encourage you to check out the project here: http://cloud.spring.io/spring-cloud-dataflow/