I have a Spring Boot application in which I generate a migration file dynamically and want Flyway to read that file during the same run.
I am able to generate the file, but Flyway cannot find it in the same run. If I restart the app, the file shows up and is read on the next run.
I want to generate the file and have Flyway read it in the same run.
spring.devtools.restart.additional-paths=
I tried this, but it restarts the application and writes into the same file again, so records end up duplicated; it also does not refresh the resources directory, so the file still cannot be read.
I have tried many other approaches, but nothing worked for me. Is there any solution?
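One approach that might fit this scenario is to write the generated migration to a directory on disk rather than into the packaged resources, point Flyway at that directory with a filesystem: location, and trigger the migration programmatically in the same run. A minimal sketch, assuming a hypothetical generated-migrations directory and that Flyway's Java API is invoked right after the file is written (not the poster's actual setup):

import java.nio.file.Files;
import java.nio.file.Path;
import javax.sql.DataSource;
import org.flywaydb.core.Flyway;

public class GenerateThenMigrate {

    public void generateAndMigrate(DataSource dataSource) throws Exception {
        // 1. Write the generated migration to a directory outside the packaged classpath.
        Path dir = Path.of("generated-migrations");           // hypothetical location
        Files.createDirectories(dir);
        Files.writeString(dir.resolve("V2__generated.sql"),   // placeholder script
                "INSERT INTO example(id) VALUES (1);");

        // 2. Point Flyway at that directory and migrate in the same request,
        //    so no restart or classpath refresh is needed.
        Flyway flyway = Flyway.configure()
                .dataSource(dataSource)
                .locations("filesystem:generated-migrations")
                .load();
        flyway.migrate();
    }
}

The key point is that a filesystem: location is read directly from disk at migrate() time, so it does not depend on the application's resources directory being refreshed.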
Related
In our Linux environment applications are restarted periodically (the reasons aren't important here). It would be convenient for us to deploy new versions of an application by copying the application's Spring Boot jar on top of the existing (old) jar, thereby overwriting it, and then simply waiting for the application to restart (that is, for the JVM running the application to restart).
However, this seems to not work. We get different kinds of errors - sometimes the app just hangs, sometimes we get a ClassNotFoundException. It's as if Spring Boot (or something inside Spring Boot) reopens the jar and expects it to be the same one it was when the application was originally started.
We had a look through Spring's common application properties, but didn't see anything appropriate. Is there a way to make this work? When we were using WAR files we configured the servlet container to unpack the WAR file and run from the unpacked version. Can we do something similar with Spring Boot?
First of all, the errors you are experiencing can come from multiple sources. Replacing an executable out from under a running process is usually not a big problem, because the whole file is loaded into memory before execution. Java is a little different: the process that is actually running is the JVM, and it keeps loading from the jar file on disk. The JVM loads classes only on demand, so a class that has not been loaded yet will be loaded later and will most likely fail to load if the jar file has changed in the meantime. In the case of Spring Boot there are also other resources (such as HTML files) inside the jar that are loaded dynamically.
You mentioned you are using a Linux environment. If you can replace your startup command with a script, you can copy the jar and start it from the copied location:
#!/bin/bash
JAR_NAME="spring-boot.jar"
NEW_JAR_NAME=".$JAR_NAME"        # use an appropriate (hidden) name here
cp "$JAR_NAME" "$NEW_JAR_NAME"   # run from a private copy of the jar
java -jar "$NEW_JAR_NAME"        # the original jar can now be replaced safely
rm "$NEW_JAR_NAME"               # clean up the copy once the JVM exits
Now, every time you start the application, a copy is made and the JVM runs from that copy. You can replace the original jar at any time, and on the next restart the new version will be loaded.
You could also use rsync instead of cp to avoid copying the same jar twice when the application is restarted multiple times without the jar changing.
It would be convenient for us to deploy new versions of an application by copying the Spring Boot jar on top of the existing (old) jar and then simply wait for the application to restart
Why would you do that to yourself? You are trying to solve a use case that goes against best practices; it sounds like asking for trouble just to avoid an app restart. When you do a deployment, you need to make sure the deployment actually went through; otherwise, how will you troubleshoot if something goes wrong in your application? You will have one more variable on your hands when troubleshooting, i.e. uncertainty about which version of the code is currently running.
If you are having downtime while deploying (I am assuming that's why you want to limit the restarts), why not bring up another instance with the newer version of the code and, once it is healthy, shut down the old one?
I have a working Spring Boot application which embeds a Spring Batch job. The job is not run on a schedule; instead we kick it off via an endpoint. It is working as it should. The basics of the batch are:
Kick the endpoint to start the job
Reader reads from an input file
Processor reads from an Oracle database using a JPA repository and a simple Spring datasource config
Writer writes to an output file
However, there are new requirements:
The schema of the repository database is from here on unknown at application startup. The tables are the same; it is just an unknown schema. This is out of our control, and while it may seem odd, there are reasons for it and it cannot be changed. With the current functionality this means we have to reconfigure the datasource once we know the new schema name and then restart the application. This job will only be run a number of times while migrating from one system to another, so it has a limited lifecycle and we just need a "quick fix" to be able to use it without rewriting the whole app. So what I would like to do is:
Send the schema name as a query parameter to the application, put it into the job parameters, and then obtain a new datasource when the processor reads from the repository. Would this be doable at all using Spring Batch? Any help appreciated!
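For what it is worth, one way to defer the schema decision until the job runs is to make the reading bean step-scoped, so it can see the job parameters, and to build the datasource (or at least qualify the queries) from the schema name at that point. A rough sketch only, with placeholder connection details and table/column names, assuming the schema arrives as a job parameter named schema and that plain JDBC access is acceptable for this one-off migration job:

import javax.sql.DataSource;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;

@Configuration
public class SchemaAwareProcessorConfig {

    @Bean
    @StepScope  // created when the step runs, so jobParameters are available
    public ItemProcessor<String, String> processor(
            @Value("#{jobParameters['schema']}") String schema) {

        // Build a datasource per job run; URL and credentials are placeholders.
        DataSource ds = DataSourceBuilder.create()
                .url("jdbc:oracle:thin:@//dbhost:1521/service")
                .username("user")
                .password("secret")
                .build();
        JdbcTemplate jdbc = new JdbcTemplate(ds);

        // Qualify the table with the schema passed in as a job parameter.
        return item -> jdbc.queryForObject(
                "SELECT some_column FROM " + schema + ".some_table WHERE id = ?",
                String.class, item);
    }
}

Since the tables are identical across schemas, an alternative is to keep a single datasource and only prefix the schema in the queries; either way the schema name stays a job parameter rather than startup configuration.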
I'm fairly new to Spring Boot and MongoDB. Currently I have a project that can send data back and forth to a server running locally on my computer, but I want to change this so that it edits and retrieves data from an externally running database. In other tutorials I have followed there was an application.properties file where I could edit the connection details, but I can't find one in any of the subfolders (I pulled the code from a tutorial), and I can't find anything that says it is specifically connecting to the local instance.
Would it be okay to just create the application.properties file in the right subfolder and enter the external database's details there? Or am I going to have to follow a different method or tutorial to connect to the external database another way?
I have a feeling that to answer it you will need to see/understand more of the code, but I'm not sure how to summarise anything else or what would actually be relevant. Thank you.
Spring Boot has several default locations where it searches for properties.
One of those places is, for example, src/main/resources/application.properties; you can simply create the file there.
An overview of other possible places for Spring Boot properties can be found here:
https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-external-config.html
On my German blog I wrote an article about how to use Spring Data to access MongoDB; there I also used an application.properties file:
https://agile-coding.blogspot.com/2020/10/keine-ahnung-von-mongodb-dann-nimm.html
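For an external MongoDB the contents of that file are typically just the connection details. A minimal sketch with placeholder host, database, and credentials:

# src/main/resources/application.properties -- values below are placeholders
spring.data.mongodb.uri=mongodb://appuser:secret@db.example.com:27017/mydatabase

If no such properties are present, Spring Boot's Mongo auto-configuration falls back to localhost:27017, which is why the tutorial project connects to the local instance without any visible configuration.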
I am working on building a Spring Boot app with my workmates. We all have different preferred logging levels for the app, and we keep overwriting each other's logging changes in application.properties. Is there a way to move all of the logging.level.* entries out of application.properties and into a separate logging.properties file or something? That way we could add that file to .gitignore and stop tracking it, so we can each keep our own logging settings. We are using Java annotations, not XML, by the way.
I've tried adding @PropertySource("classpath:logging.properties") to the application class, but I read somewhere that logging gets set up early in the initialization process, so this won't work. I tried it anyway and it doesn't work (so that's confirmed, I guess).
I can't believe there isn't more info on this out there. I'd imagine the members of a dev team each want their own custom logging levels and don't want to keep stepping on each other's toes/commits.
You can just override it with a command-line property. In Spring Boot's property-source order, command-line arguments and -D system properties take precedence over application.properties, so they override whatever is checked in.
https://docs.spring.io/spring-boot/docs/current/reference/html/spring-boot-features.html#boot-features-external-config
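For example, a developer who wants more verbose logging for one package could start the app like this (the package name is just a placeholder):

java -jar app.jar --logging.level.com.example.myapp=DEBUG

or, equivalently, as a JVM system property:

java -Dlogging.level.com.example.myapp=DEBUG -jar app.jar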
I would recommend having a logback.xml file inside resources/ with defaults that work for everyone (or rather, for the application in question), and then passing --logging.config=/path/to/your/very/custom/logback.xml whenever you start the application locally.
That should work for everyone.
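A minimal resources/logback.xml along those lines might look like the following (the logger name is a placeholder); it reuses Spring Boot's default console setup and sets a team-wide level that each developer can then override by pointing logging.config at their own file:

<configuration>
    <!-- Reuse Spring Boot's default console appender and log pattern -->
    <include resource="org/springframework/boot/logging/logback/base.xml"/>
    <!-- Team-wide default level; override locally via logging.config -->
    <logger name="com.example.myapp" level="INFO"/>
</configuration>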
I have a Spring Batch Maven project deployed on a Unix server.
I know that there are a couple of questions on this topic already, and I have tried all of those solutions. I have tried adding the date, and even the time in milliseconds, as a job parameter to keep it unique. I am testing something and have to trigger the job manually many times a day. I created a folder on the Unix server, compiled my Spring Batch Maven project into a jar file, and moved it to the server. But whenever I run the Spring Batch job there, giving a time in application.properties, it gives me a JobInstanceAlreadyCompleteException.
One more surprising thing is that this issue does not happen locally, only on the server.
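For reference, the unique-parameter approach usually looks something like the sketch below (a simplified illustration rather than the actual project code; names are placeholders). If the trigger path on the server takes its parameter from a fixed value in application.properties instead of generating a fresh one per launch, every run gets identical parameters and JobInstanceAlreadyCompleteException is the expected outcome:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

public class JobTrigger {

    private final JobLauncher jobLauncher;
    private final Job job;

    public JobTrigger(JobLauncher jobLauncher, Job job) {
        this.jobLauncher = jobLauncher;
        this.job = job;
    }

    public void run() throws Exception {
        // A parameter that changes on every launch makes each run a new JobInstance.
        JobParameters params = new JobParametersBuilder()
                .addLong("run.timestamp", System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(job, params);
    }
}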