How to run multiple instances of a Spring Boot microservice at the same time? - spring-boot

I have a Spring Boot microservice that takes about half an hour to generate data for one user. I need to run the same microservice multiple times at once to generate data for multiple users. How can I create or run multiple instances of the same microservice?

I'm not 100% sure that this is the best way, but you can use Gradle bootRun with different ports to run multiple instances of the same service, like this:
./gradlew bootRun --args='--server.port=8888'
and if you are using Maven, you can do it like this:
mvn spring-boot:run -Dspring-boot.run.arguments=--server.port=8085
There are also profiles that can be set when running, like this:
./gradlew bootRun --args='--spring.profiles.active=dev'
so you would need to create a few profiles and then run each one separately.
I'm pretty sure that there is a smarter way, though.
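For example, a sketch (the jar name is a placeholder for whatever your build produces): build the executable jar once, then start it several times on different ports, which also avoids several Gradle builds contending for the same project directory.
./gradlew bootJar
# myservice-0.0.1-SNAPSHOT.jar is a placeholder jar name
java -jar build/libs/myservice-0.0.1-SNAPSHOT.jar --server.port=8081 &
java -jar build/libs/myservice-0.0.1-SNAPSHOT.jar --server.port=8082 &
wait   # keep the shell open until both instances exit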

Related

How to run multiple instances of a Spring Boot Application via a script

I need to create 20 instances of a Spring Boot application. Is there any way to automate it so that the command below can be run multiple times, incrementing the port number each time:
mvn spring-boot:run -Dspring-boot.run.arguments=--server.port=8001
Furthermore, once the instances are all started up, is there a way to send an identical POST request with a JSON body to each instance at an endpoint /someEndpoint on the Spring Boot application?
Try writing a script (a .sh file) that runs the command in a loop for the desired number of instances. Alternatively, use --server.port=0, which makes each instance select a random free port.
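A minimal sketch of such a script (the jar name and JSON body are placeholders, and the fixed sleep is a crude stand-in for polling a health endpoint):
#!/bin/sh
# Build once, then start 20 instances on ports 8001..8020 in the background.
mvn package
for i in $(seq 1 20); do
  java -jar target/myapp.jar --server.port=$((8000 + i)) &
done

sleep 60   # crude: give the instances time to boot

# Send the same POST request with a JSON body to every instance.
for i in $(seq 1 20); do
  curl -X POST "http://localhost:$((8000 + i))/someEndpoint" \
       -H "Content-Type: application/json" \
       -d '{"key": "value"}'
done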

Best way to start multiple dependent spring boot microservices locally?

Currently my team maintains many Spring Boot microservices. When running them locally, our workflow is to open a new IntelliJ IDEA window and press the "run" button for each microservice. This does the same thing as typing gradle bootRun. At a minimum, each service depends on a config server (from which they get their config settings) and a Eureka server. Their dependencies are specified in a bootstrap.yml file. I was wondering if there is a way to just launch one microservice (or some script or run configuration) that would programmatically know which dependencies to start along with the service I am testing? It seems cumbersome to start them the way we do now.
If you're using Docker, then you could use Docker Compose to launch services in a specific order using the depends_on option. Take a look here and see if that will solve your problem.
https://docs.docker.com/compose/startup-order/
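As a rough sketch (the service and image names are hypothetical), a compose file for the setup described in the question could be generated and run like this:
# Write a docker-compose.yml: config server first, then Eureka, then the service.
cat > docker-compose.yml <<'EOF'
version: "3"
services:
  config-server:
    image: my-config-server:latest
  eureka-server:
    image: my-eureka-server:latest
    depends_on:
      - config-server
  user-service:
    image: my-user-service:latest
    depends_on:
      - config-server
      - eureka-server
EOF
docker-compose up
Keep in mind that depends_on only controls start order, not readiness; the linked page covers how to wait until a dependency is actually accepting connections.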

Multiple Profiles For Spring Integration Tests?

I need different profiles for a few things. First, there is the issue of my databases. When I run local tests, I expect to use one datasource. When I run an acceptance profile (for a CI acceptance build), I expect to use a different datasource. Finally, when I run acceptance outside of tests, I expect to use a third datasource. How I imagined this would work is:
/src/main/resources/application.properties
/src/main/resources/application-acceptance.properties
/src/test/resources/application-test.properties
/src/test/resources/application-acceptance-test.properties
However, when I run mvn clean install -Dspring.profiles.active=acceptance, it does not pick up application-acceptance-test.properties.
Finally, I would like to be able to run mvn install that runs the tests but not the integration tests. For this, I imagine I would add -Dspring.profiles.active=nointegration and then simply add @ActiveProfiles("!nointegration") on the integration tests.
I've had no luck with either of these. Is it even possible to get profiles on test runs?
If it helps, I am using Spring Boot 1.3.0.RELEASE.
EDIT:
On my integration tests I have @ActiveProfiles("test"). Is there any way to generate the profile here based on the spring.profiles.active system property?
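One possible angle on the EDIT, sketched under the assumption that Spring's ActiveProfilesResolver SPI suits your setup (the class name below is made up): implement a resolver that reads the system property and reference it from @ActiveProfiles instead of a hard-coded profile.
import org.springframework.test.context.ActiveProfilesResolver;

// Resolves test profiles from -Dspring.profiles.active, falling back to "test".
public class SystemPropertyProfilesResolver implements ActiveProfilesResolver {

    @Override
    public String[] resolve(Class<?> testClass) {
        String profiles = System.getProperty("spring.profiles.active", "test");
        return profiles.split("\\s*,\\s*");
    }
}
The test class would then declare @ActiveProfiles(resolver = SystemPropertyProfilesResolver.class); note that the profiles attribute must be left empty when a resolver is given, and the property has to reach the forked Surefire/Failsafe JVM.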

Spring Integration Invoking Spring Batch

Just looking for some information on whether others have solved this pattern. I want to use Spring Integration and Spring Batch together. Both are Spring Boot applications, and ideally I'd like to keep them and their respective configuration separated, so each is its own executable jar. I'm having problems executing them in their own process space; I believe I want, unless someone can convince me otherwise, each to run as its own Spring Boot app and initialize itself with its own profiles and properties. What I'm having trouble with is the invocation of the job in my Spring Batch project from my Spring Integration project. At first I couldn't get the properties loaded from the batch project, so I realized I need to pass spring.profiles.active as a job parameter, and that seemed to solve that. But there are other things in the Spring Boot batch application that aren't loading correctly, like the schema-platform.sql file; the database isn't getting initialized, etc.
On this initial launch of the job, I might want the response to go back to Spring Integration for some messaging on job status. There might be times when I want to run a job without Spring Integration kicking it off, but still take advantage of sending statuses back to the Spring Integration project, provided it's listening on a channel or something.
I've reviewed quite a few Spring samples and have yet to find my exact scenario; most have the two dependencies in the same project. So maybe I'm doing something that's not possible, but I'm sure I'm just missing a little something in the Spring configuration.
My questions/issues are:
I don't want the Spring Integration project to know anything about the Spring Batch configuration other than the job it's kicking off. I haven't found a good way to reference the Job bean without my entire batch configuration loading.
Should I keep these two projects separated, or would it be better to combine them since I have two-way communication between them?
How should the Job be launched from the integration project? We're using the spring-batch-integration project with JobLaunchRequest and JobLauncher. This seems to run it in the same process as the Spring Integration project, and I'm missing a lot of my Spring Boot batch project's initialization.
Should I be using a CommandLineRunner instead to force it into another process?
Is SpringApplication.run(BatchConfiguration.class) the answer?
Looking for some general project configuration setup to meet these requirements.
Spring Cloud Data Flow in combination with Spring Cloud Task does exactly what you're asking. It launches Spring Cloud Task applications (which can contain batch jobs) as new processes on the platform you choose. I'd encourage you to check out that project here: http://cloud.spring.io/spring-cloud-dataflow/
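As a point of reference for the "own process space" requirement, a Spring Boot batch jar can also be launched by hand as a separate JVM, passing the profile and job parameters on the command line (the jar name and parameter below are placeholders):
# Launch the batch application in its own process; Boot applies the profile,
# and the batch runner turns arguments like run.date into job parameters.
java -jar batch-app.jar --spring.profiles.active=batch run.date=2016-06-01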

What is the "maven way" for this ant development workflow?

How can Maven be configured to support this type of workflow:
One Time Setup
Invoke Maven to do one-time setup of a developer's machine, such as:
Create a custom version of Tomcat configured for this application
Create a local PostgreSQL database on the developer's machine
Load sample data into the database
Run a JUnit test to configure other resources needed to run the application
Integration Tests
Invoke Maven to run integration tests, which should do the following:
Create an integration test db
Set up the db
Run command-line integration tests against the db
Run a test version of Tomcat with the application in it
Run command-line JUnit tests that exercise the RESTful services exposed by the application
Release Build
Invoke Maven to do a release build of the system:
Do all the steps for an integration test
Generate resources and configurations that are used on the production server
Deposit the end result in a git repo, commit, and push the changes to production
Test Build
Invoke Maven to do a test build of the system:
Do all the steps of a release build, but configure the test release package with test server configuration
The main thing I am struggling with is that Maven has a single build lifecycle with a well-defined sequence of phases; I am not sure if the workflow I want to build is a good fit for Maven.
Can Maven be configured for this type of workflow? If yes, what are the key features of Maven that allow for the different configurations of the four main ways I want to use it?
Update: What I mean by this workflow is that I want to be able to do something like
mvn setup
mvn integration
mvn prod-release
mvn test-release
I know the above examples look like Ant; I am a long-time Ant user and a total noob with Maven.
You could set up Maven to do all that...
You probably would use (shock horror) profiles to achieve some of this...
BUT you don't want to do that.
You are following Ant-style thinking... if you like that style of thinking, then use Ant or Gradle and be happy.
If you want to follow the Maven way, then you will solve the problem differently.
Coming from the Maven way, here are my thoughts:
Why do you need one-time setup? I usually have a run profile that dynamically provisions the correct application server and starts it with the app deployed, tearing everything down afterwards when I hit ^C. Typically this involves starting up a database server or two... hence things I have developed like the cassandra-maven-plugin. That way, when I am working on a different project (which could be in 10 minutes' time), I don't have to worry about background database servers eating up all my laptop's RAM.
Integration tests are actually trivial when you have the above working... in fact I created the Maven Failsafe Plugin to make it easy to have plugin execution tied to the appropriate phases for integration testing. The Maven convention is to have a profile called run-its for running integration tests.
Release builds being different from test builds... ugh! You should be building environment-agnostic artifacts. Have them pick up their configuration from the environment they are deployed in. That removes the worry that something has changed between the "test" build and the "production" build. If you really need to bundle the config, then I would usually resort to a separate module that takes the agnostic artifact and rebundles it with the required configuration. That way it is easy to prove that you have a reproducible transformation and that nothing has changed in between what went to QA vs what is going to Ops.
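A minimal sketch of what "pick up configuration from the environment" can mean in practice, assuming a hypothetical app.jar that reads its JDBC URL from an environment variable:
# The same artifact is deployed everywhere; only the environment differs.
JDBC_URL="jdbc:postgresql://qa-db:5432/app" java -jar app.jar     # QA
JDBC_URL="jdbc:postgresql://prod-db:5432/app" java -jar app.jar   # production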
I always make the release builds include the integration testing.
So typically I have my projects such that
$ mvn -Prun
will fire up the application starting from zero. Hitting ^C will tear everything back down again, and mvn clean (or, in extreme situations where I have a more complex setup process and need some caching, mvn post-clean, think "really clean") will remove anything that the run profile put into play.
To run the integration tests I typically do
$ mvn -Prun-its verify
To make a release I typically do
$ mvn release:prepare release:perform -B
That is (in my view) the ideal way of handling the above steps you need.
HTH.
BTW, I have not had to use PostgreSQL specifically (typically my integration tests and run profile can get away with a pure Java database such as Derby or HSQLDB, and because the artifacts are environment agnostic it is easy to have the integration-test/dev flyweight app server inject the correct JDBC URL), so you may hit some issues with regard to PostgreSQL.
