Spring Boot app passing JVM args automatically?

Using Eclipse STS4. Spring Boot 2.4.0 app; this one happens to be WebFlux / Netty. Is there a way to specify JVM args so that they apply across various run scenarios?
debugging / running through STS4
running through mvn command line
running standalone (e.g. as an executable jar)
I know I can pass cmd args to all these different ways, but I'm looking for something I can put in my application.properties or similar. I'm just trying to avoid having to document it, educate people on how to run it properly, etc.
I've tried various things I found through Google like inlineConfScript, etc. but none of them seem to work.

Have you tried setting JAVA_OPTS in your .conf? See Customizing the start script
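To sketch the .conf approach the answer points at (the file name and options here are examples, not from the question): when the jar is built as a fully executable jar, the embedded launch script sources a sibling .conf file, so JVM args can live next to the artifact.

```shell
# myapp.conf — sourced automatically by the embedded launch script
# when it sits next to myapp.jar (jar built with <executable>true</executable>)
JAVA_OPTS="-Xmx512m -Dmy.custom.flag=true"
```

For the `mvn` command-line scenario, the equivalent would be the spring-boot-maven-plugin's `jvmArguments` configuration in the pom (or `-Dspring-boot.run.jvmArguments=...` on the command line):

```xml
<!-- pom.xml: same JVM args applied when running `mvn spring-boot:run` -->
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <configuration>
    <jvmArguments>-Xmx512m -Dmy.custom.flag=true</jvmArguments>
  </configuration>
</plugin>
```

Neither covers the STS4 launch, which still needs VM arguments in its run configuration.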

Related

Launching spring-boot with some properties set

This is probably pretty basic but...
I have had this working in the past, but have completely forgotten how now.
In my code I have several lines like this:
data.setAuthURL(System.getProperty("catalystTokenServerUrl"));
On the Jboss server we set properties in the console.
How do I set these in Eclipse so when I launch my service from the boot dashboard, they are available?
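In Eclipse/STS these go under Run Configurations → Arguments → VM arguments, as `-DcatalystTokenServerUrl=...`. A defensive pattern (a sketch, not from the question) is to fall back to an environment variable and then a default, so the service still starts when the `-D` flag is forgotten:

```java
public class PropertyLookup {
    // Reads a JVM system property, falling back to an environment variable
    // and then to a supplied default value.
    static String lookup(String key, String fallback) {
        String value = System.getProperty(key);
        if (value == null) {
            value = System.getenv(key);
        }
        return value != null ? value : fallback;
    }

    public static void main(String[] args) {
        // Simulates -DcatalystTokenServerUrl=... being passed on launch
        System.setProperty("catalystTokenServerUrl", "http://localhost:8080/token");
        System.out.println(lookup("catalystTokenServerUrl", "http://fallback"));
    }
}
```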

How to show all available routes in Spring Boot with the Command Line?

How can I print all my routes out in a Spring Boot application with the command line? I know this question has been asked before, but the answers either assume one is using Intellij or wants to see the printed routes in the local server logs.
For some background information, I use vim and build my application locally with the command line Gradle commands. I am not using an IDE such as Intellij.
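One command-line-only route (assuming spring-boot-starter-actuator is on the classpath and the mappings endpoint is exposed via `management.endpoints.web.exposure.include=mappings`) is to query the actuator while the app runs locally; the `jq` path below matches a servlet app and may differ for WebFlux:

```shell
./gradlew bootRun &   # start the app locally in the background
curl -s http://localhost:8080/actuator/mappings | jq \
  '.contexts[].mappings.dispatcherServlets.dispatcherServlet[]?.predicate'
```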

Why my tests succeed without a running mongodb instance?

My app uses mongodb and I have a bunch of tests that save, query and update data.
I run my tests without a mongodb instance and they all pass! Why?
Is this a known feature? What is it called?
Can someone please point me to the bit of documentation that confirms this?
It seems like Spring Boot will use an embedded (in-memory) version of Mongo. I'm not sure if this is enabled by default, or if you need to add an extra dependency.
See this page: https://www.baeldung.com/spring-boot-embedded-mongodb
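For what it's worth, in the setup that page describes it is not on by default: the tests pick up an embedded Mongo only when the flapdoodle dependency is present, at which point Spring Boot auto-configures it. A Maven sketch:

```xml
<!-- Test-scoped embedded MongoDB; when present, Spring Boot
     auto-configures an in-memory mongod for tests. -->
<dependency>
    <groupId>de.flapdoodle.embed</groupId>
    <artifactId>de.flapdoodle.embed.mongo</artifactId>
    <scope>test</scope>
</dependency>
```

So if your tests pass with no running mongod, check your build file for this (or a similar) dependency.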

Possible to configure Jaeger via application.properties?

According to https://quarkus.io/guides/opentracing-guide all Jaeger configuration is via JVM args (-DJAEGER_ENDPOINT...), but I'd like to use either application.properties or microprofile-config.properties to configure tracing. I've tried the following, but the only config that seems to be picked up by Quarkus is the service-name; all other properties are ignored.
quarkus.jaeger.service-name=my-service <-this one is working
quarkus.jaeger.endpoint=http://localhost:14268/api/traces <- seems to be ignored
quarkus.jaeger.reporter-log-spans=true
quarkus.jaeger.sampler.type=const
quarkus.jaeger.sampler.parameter=1
So, the question is whether it is possible to configure this via config files, or whether that is not currently supported?
While doing mvnDebug quarkus:dev (without jvm.args) and placing a breakpoint here, I see that all your params are being passed except quarkus.jaeger.sampler.parameter, which is wrong.
It should be quarkus.jaeger.sampler.param
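With that key corrected, the application.properties from the question would read:

```properties
quarkus.jaeger.service-name=my-service
quarkus.jaeger.endpoint=http://localhost:14268/api/traces
quarkus.jaeger.reporter-log-spans=true
quarkus.jaeger.sampler.type=const
quarkus.jaeger.sampler.param=1
```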

Spring Integration Invoking Spring Batch

Just looking for some information if others have solved this pattern. I want to use Spring Integration and Spring Batch together. Both of these are Spring Boot applications, and ideally I'd like to keep them and their respective configuration separated, so they are each their own executable jar. I'm having problems executing them in their own process space, and I believe I want, unless someone can convince me otherwise, each to run as its own Spring Boot app and initialize itself with its own profiles and properties.

What I'm having trouble with is the invocation of the job in my Spring Batch project from my Spring Integration project. At first I couldn't get the properties loaded from the batch project, so I realized I needed to pass spring.profiles.active as a Job Parameter, and that seemed to solve that. But there are other things in the Spring Boot Batch application that aren't loading correctly, like the schema-platform.sql file, and the database isn't getting initialized, etc.
On this initial launch of the job I might want the response to go back to Spring Integration for some messaging on Job Status. There might be times when I want to run a job without Spring Integration kicking off the job, but still take advantage of sending statuses back to the Spring Integration project providing its listening on a channel or something.
I've reviewed quite a few Spring samples and have yet to find my exact scenario, most are with the two dependencies in the same project, so maybe I'm doing something that's not possible, but I'm sure I'm just missing a little something in the Spring configuration.
My questions/issues are:
I don't want the Spring Integration project to know anything about the Spring Batch configuration other than the job it's kicking off. I haven't found a good way to reference the Job bean without my entire batch configuration loading.
Should I keep these two projects separated, or would it be better to combine them since I have two-way communication between the two?
How should the Job be launched from the integration project? We're using the spring-batch-integration project with JobLaunchRequest and JobLauncher. This seems to run it in the same process as the Spring Integration project, and I'm missing a lot of my Spring Boot Batch project's initialization.
Should I be using a CommandLineRunner instead to force it into another process?
Is SpringApplication.run(BatchConfiguration.class) the answer?
Looking for some general project configuration setup to meet these requirements.
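One way to force the job into its own process, so the batch app boots with its own profiles and properties instead of running inside the integration app's context, is simply to shell out to the batch jar and wait for its exit code. This is a sketch; the jar name and profile are hypothetical:

```java
import java.io.IOException;

public class BatchJobLauncher {
    // Launches a command as a separate OS process, streams its output to
    // our console, and returns its exit code (0 = success for a batch jar).
    static int launch(String... command) throws IOException, InterruptedException {
        Process process = new ProcessBuilder(command)
                .inheritIO()          // forward the child's stdout/stderr
                .start();
        return process.waitFor();     // block until the child JVM exits
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical batch jar and profile — the child JVM does its own
        // Spring Boot initialization (schema SQL, datasource, etc.).
        int exit = launch("java", "-jar", "batch-app.jar",
                "--spring.profiles.active=batch");
        System.out.println("Batch process exited with " + exit);
    }
}
```

The exit code gives a coarse job status back to the integration side; richer status reporting would still need a channel such as the one the question describes.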
Spring Cloud Data Flow in combination with Spring Cloud Task does exactly what you're asking. It launches Spring Cloud Task applications (which can contain batch jobs) as new processes on the platform you choose. I'd encourage you to check out that project here: http://cloud.spring.io/spring-cloud-dataflow/
