My question, put plainly: Spring Batch Admin offers a clear concept and practical functionality for managing jobs. In Spring Batch Admin you can launch a job, stop one job or all jobs, restart, abandon, see the status of each job, and check whether it succeeded or failed. If I deploy Spring Cloud Task as a WAR and Spring Boot Admin as a WAR on the same server, how can I manage my jobs?
To explain my context better: I have developed a few Spring Batch jobs over the last 6 months. All of them were designed to run periodically, in other words, to be scheduled. I hadn't planned to deploy them to a web server, but at the moment of publishing to production I was informed by my company that any solution must run inside our mainframe WebSphere.
That wasn't an issue at all, since I realized I could use Spring Batch Admin to start/stop all the Spring Batch jobs deployed in the same EAR. Unfortunately, WebSphere ND 8.5 doesn't work with Spring Batch Admin (you may have heard someone say they managed to get Spring Batch Admin up and running on WebSphere 8.5 ND, but I got the final position from IBM that neither JSR-352 nor Spring Batch Admin is safe to use there).
One week ago, I first came across Spring Boot Admin thanks to a comment on my question about how to register org.springframework.integration.monitor.IntegrationMBeanExporter.
In that comment I read "...consider to get rid of batch-admin. This project is obsolete already in favor of cloud.spring.io/spring-cloud-task...", but it doesn't seem to me to really provide the same functionality Spring Batch Admin does. It seems a bit more generic, aimed at application instances rather than job executions.
To make my question even clearer, let's say I have this project deployed as a WAR https://github.com/spring-cloud/spring-cloud-task/blob/master/spring-cloud-task-samples/batch-job/src/main/java/io/spring/configuration/JobConfiguration.java and the two jobs are both scheduled to run every 5 minutes. Additionally, in that project I added the configuration necessary to make it visible to Spring Boot Admin (e.g. spring-boot-admin-starter-client). On the same server I have this Spring Boot Admin instance deployed as a WAR too https://github.com/codecentric/spring-boot-admin/tree/master/spring-boot-admin-samples/spring-boot-admin-sample-war. Now, my question, the same as in the title but a bit more concrete: will I be able to stop one job from Spring Boot Admin and let the other keep running? Can I launch one and leave the other stopped?
If you, reader, have an overview of Spring Batch Admin, you will probably understand my intention quickly. I read in the Spring Boot Admin manual
http://codecentric.github.io/spring-boot-admin/1.4.1/#jmx-bean-management
3.2. JMX-bean management: "...To interact with JMX-beans in the admin UI you have to include Jolokia in your application". Is there some trick via Jolokia to manage each job individually?
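For illustration, the kind of thing I imagine (purely an assumption on my part, names made up): exporting Spring Batch's JobOperator as an MBean so that Jolokia, and through it the Spring Boot Admin UI, can reach its operations:

```java
import java.util.Collections;

import org.springframework.batch.core.launch.JobOperator;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jmx.export.MBeanExporter;

@Configuration
public class JobOperatorJmxConfiguration {

    // Exposes Spring Batch's JobOperator over JMX. With Jolokia on the
    // classpath, Spring Boot Admin's JMX-bean management page could then
    // invoke operations such as stop(executionId) or
    // startNextInstance(jobName) for an individual job.
    @Bean
    public MBeanExporter jobOperatorMBeanExporter(JobOperator jobOperator) {
        MBeanExporter exporter = new MBeanExporter();
        exporter.setBeans(Collections.singletonMap(
                "org.springframework.batch:type=JobOperator", jobOperator));
        return exporter;
    }
}
```

If something along these lines works, stop and start would show up as invokable JMX operations in the admin UI, which is roughly the per-job control I'm after.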
Related
We have Spring Batch applications with triggers defined in each app.
Each batch application runs tens of similar jobs with different parameters and can do so within 1400 MiB per app.
We use Spring Batch Admin, which was deprecated years ago, to launch individual jobs and to get a brief overview of what is going on in them. The migration guide recommends replacing Spring Batch Admin with Spring Cloud Data Flow.
The Spring Cloud Data Flow docs talk about grabbing a jar from a Maven repo and running it with some parameters. I don't like the idea of waiting 20 s for the application to download, 2 min for it to launch, and all the security/certificate/firewall issues (how can I download a proprietary jar across intranets?).
I'd like to register existing applications in Spring Cloud Data Flow via IP/port, pass job definitions to the Spring Batch applications, and monitor executions (including the ability to stop a job). Is Spring Cloud Data Flow usable for that?
A few things to unpack here. Here's an attempt at it.
The Spring Cloud Data Flow docs talk about grabbing a jar from a Maven repo and running it with some parameters. I don't like the idea of waiting 20 s for the application to download, 2 min for it to launch, and all the security/certificate/firewall issues
Yes, there's an app resolution process. However, once downloaded, the app is reused from the local Maven cache.
As for the 2-minute bootstrap window, that is down to Boot, the number of configuration objects, and of course your business logic; in your case it may well add up to 2 minutes.
how can I download a proprietary jar across intranets?
There's an option to resolve artifacts from a Maven repository hosted behind the firewall through proxies; we have users on this model for proprietary JARs.
Each batch application runs tens of similar jobs with different parameters and can do so within 1400 MiB per app.
You may want to consider the Composed Task feature. It not only provides the ability to launch child tasks as directed acyclic graphs, it also allows transitions based on exit codes at each node, to further split and branch into more task launches. All of this is automatically recorded at each execution level for further tracking and monitoring from the SCDF dashboard.
I'd like to register existing applications in Spring Cloud Data Flow via IP/port, pass job definitions to the Spring Batch applications, and monitor executions (including the ability to stop a job).
As long as the batch jobs are wrapped into Spring Cloud Task apps, yes, you'd be able to register them in SCDF and use them in the DSL or drag & drop them onto the visual canvas to create coherent data pipelines. We have a few "batch-job as task" samples here and here.
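As a minimal, illustrative sketch (not the actual sample code; the job and step names below are made up), wrapping a batch job as a task app is mostly a matter of adding @EnableTask to a Boot application:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

// A batch job wrapped as a Spring Cloud Task app. @EnableTask records each
// run in the task repository, which is what SCDF uses for tracking.
@EnableTask
@EnableBatchProcessing
@SpringBootApplication
public class BatchTaskApplication {

    @Bean
    public Job sampleJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        return jobs.get("sampleJob")
                .start(steps.get("sampleStep")
                        .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
                        .build())
                .build();
    }

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}
```

Once packaged, an app like this can be registered in SCDF and launched, with its job executions visible in the dashboard.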
Our applications are built on Spring Boot; each app is packaged as a war file and run with java -jar xx.war -Dspring.profile=xxx. Generally the latest war package is served by a static web server like nginx.
Now we want to know if we can add auto-update for the application.
I have googled, and people suggested using an application server that supports hot deployment, but we use Spring Boot as shown above.
I have thought of starting a new thread once my application starts, then checking for updates and downloading the latest package. But I would have to terminate the current application to start the new one, since they use the same port, and if I close the current app, the update thread is terminated too.
So how do you handle this problem?
In my opinion that should be managed by some higher-order, DevOps-level orchestration system, not by the app or its container. The decision to replace an app should be made at the DevOps level, not the app level.
One major advantage of Spring Boot is the inversion of the traditional application-in-web-container model: the web container is usually (and as best practice with Spring Boot) built into the app itself. The app is therefore fully self-contained and, crucially, immutable. It should not be the role of the app/web container to replace part of, or all of, itself.
Of course you can do whatever you like, but you may find the solution isn't easy, because doing it this way goes against convention.
Just looking for some information on whether others have solved this pattern. I want to use Spring Integration and Spring Batch together. Both are Spring Boot applications, and ideally I'd like to keep them and their respective configuration separated, so each is its own executable jar. I'm having problems executing them in their own process space, and I believe I want, unless someone can convince me otherwise, each to run as its own Spring Boot app and initialize itself with its own profiles and properties. What I'm having trouble with is invoking the job in my Spring Batch project from my Spring Integration project. At first I couldn't get the properties loaded from the batch project, so I realized I needed to pass spring.profiles.active as a job parameter, and that seemed to solve that. But there are other things in the Spring Boot batch application that aren't loading correctly, like the schema-platform.sql file; the database isn't getting initialized, etc.
On the initial launch of the job I might want the response to go back to Spring Integration for some messaging on job status. There might be times when I want to run a job without Spring Integration kicking it off, but still take advantage of sending statuses back to the Spring Integration project, provided it's listening on a channel or something.
I've reviewed quite a few Spring samples and have yet to find my exact scenario; most have the two dependencies in the same project, so maybe I'm attempting something that's not possible, but I'm sure I'm just missing a little something in the Spring configuration.
My questions/issues are:
I don't want the Spring Integration project to know anything about the Spring Batch configuration other than the job it's kicking off. I haven't found a good way to reference the Job bean without loading my entire batch configuration.
Should I keep these two projects separated, or would it be better to combine them, since I have two-way communication between them?
How should the job be launched from the integration project? We're using the spring-batch-integration project with JobLaunchRequest and JobLauncher. This seems to run it in the same process as the Spring Integration project, and I'm missing a lot of my Spring Boot batch project's initialization.
Should I be using a CommandLineRunner instead, to force it into another process?
Is SpringApplication.run(BatchConfiguration.class) the answer?
I'm looking for some general project configuration and setup that meets these requirements.
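For reference, the JobLaunchRequest/JobLauncher approach mentioned above looks roughly like this (a sketch, not our exact code; channel names and the "input" parameter are placeholders):

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.integration.launch.JobLaunchRequest;
import org.springframework.batch.integration.launch.JobLaunchingGateway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
public class JobLaunchIntegrationConfig {

    @Bean
    public JobLaunchingGateway jobLaunchingGateway(JobLauncher jobLauncher) {
        return new JobLaunchingGateway(jobLauncher);
    }

    @Bean
    public IntegrationFlow jobLaunchFlow(JobLaunchingGateway gateway, Job job) {
        return IntegrationFlows.from("jobRequests")
                // Build a JobLaunchRequest; the timestamp keeps parameters
                // unique so repeated launches create new JobInstances.
                .transform(String.class, payload -> new JobLaunchRequest(job,
                        new JobParametersBuilder()
                                .addString("input", payload)
                                .addLong("time", System.currentTimeMillis())
                                .toJobParameters()))
                .handle(gateway) // replies with the JobExecution for status messaging
                .channel("jobStatuses")
                .get();
    }
}
```

As noted above, the gateway runs the job inside the same JVM as the integration flow, which is exactly the in-process behavior I'd like to avoid.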
Spring Cloud Data Flow in combination with Spring Cloud Task does exactly what you're asking. It launches Spring Cloud Task applications (which can contain batch jobs) as new processes on the platform of your choice. I'd encourage you to check out that project here: http://cloud.spring.io/spring-cloud-dataflow/
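To give a feel for what launching looks like: once a task app is registered and a task defined, SCDF exposes launches over its REST API (POST /tasks/executions). A hedged, plain-JDK sketch, with the server address and task name purely illustrative:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class ScdfTaskLauncher {

    // Builds the SCDF task-launch URL for a named task definition.
    static String launchUrl(String server, String taskName) {
        return server + "/tasks/executions?name=" + taskName;
    }

    // Fires the launch request; SCDF replies 201 Created on success.
    static int launch(String server, String taskName) throws IOException {
        HttpURLConnection conn = (HttpURLConnection)
                new URL(launchUrl(server, taskName)).openConnection();
        conn.setRequestMethod("POST");
        return conn.getResponseCode();
    }
}
```

In practice you'd more likely use the SCDF shell or dashboard, but the REST surface is what both of those sit on.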
Is there any batch job manager console/GUI from which I can see the status of batch jobs in TomEE? I mean something like the GUI in Spring Batch Admin.
The OP is requesting information about TomEE, not Spring Framework ;)
The implementation TomEE uses is called BatchEE, which is an Apache project. It would seem a GUI is available in the incubator: http://batchee.incubator.apache.org/gui.html
You might try deploying that and reporting back on whether it works with TomEE.
Out of the box, Spring Batch does not provide any GUI.
Spring Batch Admin can be used as a GUI, but it needs to be connected to your Spring Batch project.
For job status you can always check the Spring Batch metadata tables.
I am new to Spring Integration and I am considering using it to poll a directory for new files and process them.
My question is: is Spring Integration some sort of daemon one can launch and use to poll a directory?
If this is possible, can someone please direct me to the relevant section of the official documentation on how to launch Spring Integration?
All you need is to have a main method (or a WAR file if you want to deploy to Tomcat or another servlet container) that creates a Spring ApplicationContext (e.g. new ClassPathXmlApplicationContext("file-poller.xml"))
The poller can run with a cron, fixed-rate, or fixed-delay trigger.
JMX operations can be exposed on Spring Integration's File adapter (or any adapter) by simply adding a single config element (e.g. <mbean-export>).
Bottom line: you REALLY do not need an ESB if you simply want a File poller to run continuously. You can have a single small config file and one line of code in a main method.
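That one line of code in a main method, sketched out (assuming file-poller.xml holds the inbound-channel-adapter and poller definitions):

```java
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class FilePollerMain {

    public static void main(String[] args) throws Exception {
        // Bootstraps the Spring context; the poller defined in
        // file-poller.xml starts running immediately.
        ConfigurableApplicationContext ctx =
                new ClassPathXmlApplicationContext("file-poller.xml");
        System.in.read(); // keep the JVM alive while the poller runs
        ctx.close();
    }
}
```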
Visit the samples for more info: https://github.com/springsource/spring-integration-samples (look under basic/file specifically)
Hope that helps,
Mark
Spring Integration is part of the framework; it's not a program or a daemon.
What you can do is configure Spring Integration to poll a directory, launch a JVM with Spring on board, and the poller will do what you want.
You can start with this blog post.
More samples
Relevant section of documentation