How to push and run a spring-batch application on Bluemix? - spring-boot

I have created a spring batch application in spring boot to perform a daily activity for processing data in batch. I am also able to build and create an uber-jar that has all the dependencies in it. How do I push this jar file as a batch application (not as a web application), and how can I start the application from the command line, if possible?

For Cloud Foundry applications without a web interface, you can disable the route in the manifest:

```yaml
---
...
no-route: true
```

This disables the web interface - https://docs.cloudfoundry.org/devguide/deploy-apps/manifest.html#no-route
You may also want to set the health check to process:

```yaml
---
...
health-check-type: process
```

This monitors the exit status of your application - specifically, your Java process. If it stops (e.g. when the batch job finishes), Cloud Foundry will try to restart it - https://docs.cloudfoundry.org/devguide/deploy-apps/healthchecks.html. This assumes you want your application to run continuously.
You will probably want to check the Java buildpack for methods of running your jar - https://github.com/cloudfoundry/java-buildpack/blob/master/README.md.
I think you will want to deploy using the Java Main method, which runs Java applications with a main() method, provided they are packaged as self-executable JARs. Cloud Foundry will automatically run the main() method.
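Putting the two settings together, a minimal manifest.yml for a self-executable batch jar might look like the sketch below. The application name, memory size, and jar path are placeholders, not taken from the question:

```yaml
---
applications:
- name: my-batch-app              # placeholder name
  memory: 1G                      # placeholder sizing
  path: target/my-batch-app.jar   # the uber-jar built from the project
  no-route: true                  # batch app, no web interface
  health-check-type: process      # monitor the JVM process, not an HTTP endpoint
```

With this in place, a plain `cf push` from the directory containing the manifest should stage and start the jar via the Java buildpack's main() support.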

Related

Best way to start multiple dependent spring boot microservices locally?

Currently my team maintains many spring boot microservices. When running them locally, our workflow is to open a new IntelliJ IDEA window and press the "run" button for each microservice. This does the same thing as typing gradle bootRun. At a minimum, each service depends on a config server (from which they get their config settings) and a Eureka server. Their dependencies are specified in a bootstrap.yml file. I was wondering if there is a way to just launch one microservice (or some script or run configuration) that would programmatically know which dependencies to start along with the service I am testing? It seems cumbersome to start them the way we do now.
If you're using Docker, then you could use Docker Compose to launch services in a specific order using the depends_on option. Take a look here and see if that solves your problem.
https://docs.docker.com/compose/startup-order/
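For example, a docker-compose.yml along these lines starts the config server first, then Eureka, then the service under test. All service and image names here are hypothetical:

```yaml
version: "3.8"
services:
  config-server:
    image: example/config-server   # hypothetical image
    ports:
      - "8888:8888"
  eureka-server:
    image: example/eureka-server   # hypothetical image
    ports:
      - "8761:8761"
    depends_on:
      - config-server
  my-service:
    image: example/my-service      # the microservice being tested
    depends_on:
      - config-server
      - eureka-server
```

Note that depends_on only controls start order, not readiness; the page linked above shows how to actually wait for a dependency to be up (e.g. with wrapper scripts or healthchecks).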

Start a spring boot application inside another java application?

I have two java applications:
a server starting with spring boot
a client using it (through REST api)
For now, I start both applications in different processes.
How could I start the server from the client to obtain a "standalone" application? Using ProcessBuilder to call java.exe is a solution, but it has drawbacks: it is OS-dependent and cannot guarantee the server process will be shut down / killed when the client exits.
From an architectural point of view, leaving them separate is the best option: you have a separate server and client, which is how they will behave in a production environment.
If you only need this during the development phase, and your reason for running both together is to save time, you can build both in containers using Docker: create two applications, build them from two different folders, and then start both together.
I found a solution at https://www.toptal.com/spring-boot/spring-boot-application-programmatic-launch . Igor Delac
opens the jar file containing the server to find the class ...loader.archive.JarFileArchive (and some others), then
instantiates it and uses it to start the application from the jar itself.
The jar file is neither extracted nor modified; only a few classes are read.
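The ProcessBuilder route the asker dismisses can also be made less painful: deriving the java binary from the java.home system property avoids hard-coding java.exe, and a shutdown hook stops the server when the client exits normally. A minimal sketch, with a hypothetical jar name (note a shutdown hook still won't run if the client is force-killed):

```java
import java.nio.file.Paths;
import java.util.List;

public class ServerLauncher {

    // Build the launch command from the JVM that is running the client,
    // so it works on any OS without hard-coding "java.exe".
    static List<String> command(String jarPath) {
        String javaBin = Paths.get(System.getProperty("java.home"), "bin", "java").toString();
        return List.of(javaBin, "-jar", jarPath);
    }

    public static void main(String[] args) throws Exception {
        Process server = new ProcessBuilder(command("server.jar")) // hypothetical jar name
                .inheritIO()
                .start();
        // Stop the server when the client JVM shuts down normally.
        Runtime.getRuntime().addShutdownHook(new Thread(server::destroy));
        server.waitFor();
    }
}
```

This does not give the single-JVM "standalone" feel of the JarFileArchive approach above, but it keeps the two applications as separate processes, which matches the production setup.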

Calling self's API on application Restart(Grails 3.3.9) (Tomcat Restart)

While building a Grails 3.3.9 application, I've encountered a scenario where I need to call the application's own API on restart (for a specific purpose). As far as I know, we can't make API calls via BootStrap.groovy, so an alternative I found is the Application.main method, from which I can make my own API calls.
But this only works in Grails run mode. I need it to work in Grails war mode as well, as I have to deploy the war in the production environment once it is verified in QC.
I could set up a cron job to do this, but I am looking for a Grails built-in function.
Other information:
Grails: 3.3.9
Gradle: 4.10.3
Tomcat: 8.5.50

Run MapReduce Jar in Spring Cloud Data Flow

I need to run a MapReduce spring boot application in Spring Cloud Data Flow. Usually, applications registered in SCDF are executed using the "java -jar jar-name" command. But my program is a MapReduce job and has to be executed using "hadoop jar jar-name". How do I achieve this? What would be a better approach to running a MapReduce application in SCDF? Is it possible to directly register MapReduce apps?
I'm using the local Data Flow server to register the application.
In SCDF, the format of the command used to run a JAR file is managed by a deployer. For example, there is a local deployer, a Cloud Foundry deployer, etc. There was also a Hadoop/YARN deployer, but I believe it was discontinued.
Given that the deployer itself is an SPI, you can easily implement your own, or even fork/extend the local deployer and modify only what's needed.

Is it suitable to use Spring Cloud DataFlow to orchestrate long-running external batch jobs inside infinitely running apps?

We have Spring Batch applications with triggers defined in each app.
Each batch application runs tens of similar jobs with different parameters and is able to do that within 1400 MiB per app.
We use Spring Batch Admin, which was deprecated years ago, to launch individual jobs and get a brief overview of what is going on in them. The migration guide recommends replacing Spring Batch Admin with Spring Cloud DataFlow.
The Spring Cloud DataFlow docs describe grabbing a jar from a Maven repo and running it with some parameters. I don't like the idea of waiting 20 s for the application to download and 2 min for it to launch, plus all the security/certificate/firewall issues (how can I download a proprietary jar across intranets?).
I'd like to register existing applications in Spring Cloud DataFlow via IP/port, pass job definitions to the Spring Batch applications, and monitor executions (including the ability to stop a job). Is Spring Cloud DataFlow usable for that?
A few things to unpack here. Here's an attempt at it.
The Spring Cloud DataFlow docs describe grabbing a jar from a Maven repo and running it with some parameters. I don't like the idea of waiting 20 s for the application to download and 2 min for it to launch, plus all the security/certificate/firewall issues
Yes, there's an app resolution process. However, once downloaded, the app is reused from the Maven cache.
As for the 2-minute bootstrap window, that comes down to Boot, the number of configuration objects, and of course your business logic; perhaps in your case it all adds up to 2 minutes.
how can I download proprietary jar across intranets?
There's an option to resolve artifacts from a Maven artifact repository hosted behind the firewall, through proxies - we have users on this model for proprietary JARs.
Each Batch application runs tens of similar jobs with different parameters and is able to do that with 1400 MiB per app.
You may want to consider the Composed Task feature. It not only provides the ability to launch child Tasks as directed acyclic graphs, but also allows transitions based on exit codes at each node, to further split and branch into launching more Tasks. All of this, of course, is automatically recorded at each execution level for further tracking and monitoring from the SCDF Dashboard.
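As an illustration, registering two task apps and composing them into a sequence in the SCDF shell looks roughly like this (the app names and Maven coordinates are placeholders):

```
dataflow:> app register --name job-a --type task --uri maven://com.example:job-a:1.0.0
dataflow:> app register --name job-b --type task --uri maven://com.example:job-b:1.0.0
dataflow:> task create daily-run --definition "job-a && job-b"
dataflow:> task launch daily-run
```

Here `&&` means job-b launches only after job-a completes successfully; exit-code transitions and branching use additional DSL syntax described in the Composed Task documentation.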
I'd like to register existing applications in Spring Cloud DataFlow via IP/port and pass job definitions to Spring Batch applications and monitor executions (including ability to stop job).
As long as the batch jobs are wrapped into Spring Cloud Task apps, yes, you'd be able to register them in SCDF and use them in the DSL, or drag & drop them onto the visual canvas to create coherent data pipelines. We have a few "batch-job as task" samples here and here.
