We have a scheduling service running on WebLogic Server, implemented with the Quartz API and custom database tables, to create, delete, suspend, and resume jobs and various pollers (database, JMS), all driven from an AngularJS UI. We have now decided to go open source and are trying to make use of the capabilities of Spring Integration.
With Spring Integration, how can one dynamically create, suspend, and resume database and JMS pollers? Or can we integrate Quartz with Spring Integration and use the capabilities of both?
Thanks
I have an application which listens to an ActiveMQ queue and starts a Spring Batch job when it receives a message.
I'd like to use Spring Cloud Data Flow to provide a UI, but I can't find information on how to configure it.
Since it uses Spring Boot, I should be able to replicate how my application currently works (use a REST API to make it listen to ActiveMQ and start the job when a message is received), but I can't find anything on how to make it start the batch job in Cloud Data Flow.
You have a few options here.
Option 1: Launch your application as-is and manually send a message to launch the task.
Any arbitrary Spring Boot application can be launched from Dataflow (simply register it as type = "App").
Taken from https://github.com/spring-cloud/spring-cloud-dataflow/blob/main/spring-cloud-dataflow-docs/src/main/asciidoc/streams.adoc#register-a-stream-application:
Registering an application by using --type app is the same as registering a source, processor or sink. Applications of the type app can be used only in the Stream Application DSL (which uses double pipes || instead of single pipes | in the DSL) and instructs Data Flow not to configure the Spring Cloud Stream binding properties of the application. The application that is registered using --type app does not have to be a Spring Cloud Stream application. It can be any Spring Boot application. See the Stream Application DSL introduction for more about using this application type.
You would have to trigger the task launch from your code. You can use the Data Flow REST client to do this. You can get an idea of how to do that by looking at https://github.com/spring-cloud/spring-cloud-dataflow/tree/main/spring-cloud-dataflow-tasklauncher/spring-cloud-dataflow-tasklauncher-sink.
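As a rough illustration (not taken from the project above), launching an already-registered task through the Data Flow REST client could look like the sketch below. The server URL and task name are placeholders, and the exact signature and return type of launch(...) vary between SCDF releases, so treat this as a shape rather than a definitive call:

```java
import java.net.URI;
import java.util.Collections;

import org.springframework.cloud.dataflow.rest.client.DataFlowTemplate;

public class TaskLaunchExample {

    public static void main(String[] args) {
        // Placeholder URL of the running Data Flow server
        DataFlowTemplate dataFlow = new DataFlowTemplate(URI.create("http://localhost:9393"));

        // "my-batch-task" is an illustrative name of a task already registered in SCDF.
        // The return value (execution id vs. a response resource, depending on the SCDF
        // version) is ignored here.
        dataFlow.taskOperations().launch(
                "my-batch-task",
                Collections.emptyMap(),   // deployment properties
                Collections.emptyList()); // command-line arguments
    }
}
```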
Option 2: Use pre-built stream applications to model the same flow as your application.
The app you describe can be logically modeled as a Spring Cloud Stream application.
There is a JMS source (provides messages to signal the need to kick off the task/batch job)
There is a TaskLauncher sink (receives messages and kicks off the task/batch job)
This app can actually be constructed with little effort by using the pre-packaged applications to model this flow.
JMS Source
Dataflow Tasklauncher Sink
If you need to register these applications in the UI, they can be found at:
maven://org.springframework.cloud.stream.app:jms-source-kafka:3.1.1
maven://org.springframework.cloud:spring-cloud-dataflow-tasklauncher-sink-kafka:2.9.2
Stream definition:
jms-source | dataflow-tasklauncher-sink
The READMEs for the above source and sink give details about the configuration options.
Option 3: Custom Spring Cloud Stream app with function composition
The previous option would create two separate apps. However, if you want to keep the logic in a single app, you can look into creating a custom Spring Cloud Stream app that uses function composition and leverages the pre-built, reusable Java functions that the apps in option 2 are built upon.
JMS Supplier
TaskLauncherFunction
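As a rough sketch of what the single-app approach could look like, the Boot application below composes a supplier with a function. It assumes spring-cloud-stream is on the classpath; the bean names and payloads here are purely illustrative stand-ins for the pre-built JMS supplier and task-launcher function linked above:

```java
import java.util.function.Function;
import java.util.function.Supplier;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class JmsTaskLaunchingApp {

    public static void main(String[] args) {
        SpringApplication.run(JmsTaskLaunchingApp.class, args);
    }

    // Stand-in for the pre-built JMS supplier: emits a signal representing an incoming message.
    @Bean
    public Supplier<String> jobSignal() {
        return () -> "new-order-received"; // illustrative payload
    }

    // Stand-in for the task-launching function: maps the signal to a launch-request payload.
    @Bean
    public Function<String, String> toLaunchRequest() {
        return signal -> "{\"name\":\"my-batch-task\"}"; // illustrative launch request
    }

    // The composition itself is wired through configuration, e.g.:
    // spring.cloud.function.definition=jobSignal|toLaunchRequest
}
```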
I am working on a project where we are planning to use WLP (WebSphere Liberty) instead of traditional WAS.
The code uses the WAS Scheduler for scheduling activities.
Does Liberty have the same level of scheduler support/features as traditional WAS?
How can I migrate the scheduler tasks from WebSphere to Liberty?
Code using the Scheduler in traditional WebSphere Application Server should not be migrated to EE Concurrency Utilities unless you are certain that you do not need the transactional/persistent quality of service that the Scheduler provides (Scheduler tasks run in a transaction, can roll back and be retried, and can also persist across a server restart). To obtain a similar quality of service in Liberty, you should migrate your Scheduler tasks to Persistent EJB Timers. Note that while failover support across multiple servers is not present in Persistent EJB Timers in Liberty at the time of writing, it is currently being worked on.
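For example, a minimal persistent EJB timer on Liberty might look like the sketch below (class and method names are illustrative; Liberty stores persistent timers in the database configured for its persistent executor). Like Scheduler tasks, the timeout runs transactionally and the timer survives a server restart:

```java
import javax.ejb.Schedule;   // jakarta.ejb on newer EE levels
import javax.ejb.Stateless;

@Stateless
public class NightlyReportBean {

    // persistent = true (the default) stores the timer in the configured database store,
    // so it survives server restarts; the timeout runs in a transaction and is retried
    // if that transaction rolls back, similar to a traditional WAS Scheduler task.
    @Schedule(hour = "2", minute = "0", persistent = true)
    public void generateReport() {
        // migrated task logic goes here
    }
}
```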
I have two Spring Boot applications: one handles creating and scheduling jobs, and the other configures the Quartz Scheduler, which prepares the job parameters using a shared database and launches the Spring Batch job.
I need to update the running Quartz Scheduler when the user updates or adds a job from the UI. Also, if the server restarts, I need to restart the Scheduler and the jobs.
How should I update my Quartz Scheduler object when a new job is added or updated by the user? My Quartz Scheduler will always be running. Can I use RestTemplate so that my UI application notifies my scheduler application about the jobs?
You will need to store the jobs to schedule in the database (the shared database) from the UI application, then create a job and build a trigger for it.
It's hard to say without any code given, but there is a guide that does something like what you want:
https://www.callicoder.com/spring-boot-quartz-scheduler-email-scheduling-example/
It shows a dynamic scheduler/trigger.
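The core of that approach, dynamically registering a job and trigger against a running Scheduler backed by the shared JDBC job store, looks roughly like this (the job class, group names, and identifiers are illustrative):

```java
import org.quartz.CronScheduleBuilder;
import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.JobKey;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.TriggerKey;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class DynamicSchedulingService {

    @Autowired
    private Scheduler scheduler; // auto-configured by spring-boot-starter-quartz

    /** Registers a new job when the user creates one from the UI. */
    public void scheduleJob(String jobName, String cronExpression) throws SchedulerException {
        JobDetail job = JobBuilder.newJob(BatchLaunchingJob.class)
                .withIdentity(jobName, "ui-jobs")
                .storeDurably()
                .build();

        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity(jobName + "-trigger", "ui-jobs")
                .withSchedule(CronScheduleBuilder.cronSchedule(cronExpression))
                .build();

        scheduler.scheduleJob(job, trigger);
    }

    /** Replaces the trigger when the user updates an existing job's schedule. */
    public void rescheduleJob(String jobName, String cronExpression) throws SchedulerException {
        Trigger newTrigger = TriggerBuilder.newTrigger()
                .withIdentity(jobName + "-trigger", "ui-jobs")
                .forJob(JobKey.jobKey(jobName, "ui-jobs"))
                .withSchedule(CronScheduleBuilder.cronSchedule(cronExpression))
                .build();

        scheduler.rescheduleJob(TriggerKey.triggerKey(jobName + "-trigger", "ui-jobs"), newTrigger);
    }

    /** Illustrative Quartz job that would launch the Spring Batch job. */
    public static class BatchLaunchingJob implements Job {
        @Override
        public void execute(JobExecutionContext context) {
            // launch the batch job here
        }
    }
}
```

One common approach is to let both applications point at the same clustered JDBC job store, so jobs and triggers stored by the UI application are picked up by the scheduler application without any notification; a REST call (for example via RestTemplate) into endpoints that invoke methods like these is an alternative.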
I have created a project with two REST APIs that launch different jobs. My project is connected to a MySQL database. I would like to monitor both jobs in Spring Cloud Data Flow. Please help me out with how we need to configure SCDF with MySQL so that both jobs are monitored. Additionally, I would like to know whether, if we launch the job by firing the API, our SCDF will monitor those job instances. If not, please let me know how we can do that.
Thanks in Advance
Please take a moment to read the Spring Batch Admin to SCDF migration guide. It is a requirement that the jobs are wrapped with the Spring Cloud Task programming model.
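In its simplest form, wrapping an existing Spring Batch job as a Task is a matter of adding the Spring Cloud Task starter and annotating the Boot application. A minimal sketch (the Job/Step definitions themselves are omitted):

```java
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

@EnableTask
@EnableBatchProcessing
@SpringBootApplication
public class BatchTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }

    // Existing Job/Step beans stay as they are; @EnableTask records each run in the
    // task tables of the configured datasource, which should be the same datasource
    // SCDF points at so the Dashboard can show the executions.
}
```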
Once you have the batch jobs wrapped as Tasks, you can register them in SCDF to build Task/Batch pipelines using SCDF's DSL or the GUI.
As for the datasource, all you have to make sure is that the same datasource is shared between SCDF and the batch jobs. With this, SCDF's Dashboard will automatically list the jobs and their execution details.
Here are a few examples for your reference.
Additionally, I would like to know whether, if we launch the job by firing the API, our SCDF will monitor those job instances.
Assuming you're referring to SCDF's task launch API (e.g., via a scheduled trigger or by other means): yes, if triggered, the job executions will be captured in the database, as long as SCDF and the batch jobs share a common datasource, as explained previously.
We are planning to retire the existing legacy Java batch applications and recreate them with the latest available batch framework.
Given that we have a large number of batch jobs to be modernised, we are looking for a framework or architecture that would allow us to
Develop a batch solution that allows us to dynamically deploy a new batch job as and when it is created, without disturbing the existing deployed applications. Does Spring Cloud Task provide this feature? Note: we are looking only to deploy the apps to our local server; this has nothing to do with the cloud.
If Spring Batch/Boot can provide the features we typically expect from a batch application, what is the special value add of going for Spring Cloud Task? I wasn't able to completely understand this from the Spring documentation available online.
From the Spring Cloud Task documentation, I understood that it allows an application to have many tasks within it. What should I do if each task has its own library dependencies, which might conflict with the dependencies of other tasks? In that case, should each of these tasks be moved to a new application, or is there a workaround?
To answer your questions:
Does Spring Cloud Task handle orchestration - No. Spring Cloud Task does not handle orchestration of tasks or jobs. The component in this ecosystem that handles the deployment/orchestration of tasks or jobs is really Spring Cloud Data Flow (which is why I asked if you use any type of cloud platform including YARN, Cloud Foundry, Kubernetes, or Mesos...the environments supported by Spring Cloud Data Flow).
What added value does Spring Cloud Task provide over Spring Boot/Spring Batch - Spring Cloud Task is designed to provide a few things:
Similar abilities to Spring Batch with regard to state management without needing to create a batch job. When running a Boot application in a cloud environment, there is no standard way of getting the results from environment to environment (YARN handles job results differently from tasks on Cloud Foundry, which is different from jobs on Kubernetes, etc.). Spring Batch provides this, but then all short-lived processes need the overhead of the Batch API, so Spring Cloud Task provides a lighter touch for those use cases (see the sketch after this list).
Automatically adds informational listeners. With Spring XD, when you ran a job in an XD container, the XD container automatically added a number of informational listeners that broadcast events that you could listen for. Spring Cloud Task brings the same functionality without the need for the XD container.
Integration with Spring Cloud Stream. Spring Cloud Task provides the ability to launch tasks from messages received from Spring Cloud Stream. Also, the informational messages previously mentioned (both Batch events as well as Task events) are sent via Spring Cloud Stream channels.
The DeployerPartitionHandler. When working in a cloud environment, this PartitionHandler implementation allows you to launch workers for a partitioned batch job as tasks. This allows for the dynamic scaling of partitioned batch jobs, instead of the traditional option of pre-deploying workers that listen for work, which wastes resources in a modern cloud environment.
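To make the first point above concrete, here is a minimal sketch of a short-lived process as a Task with no Batch API involved; the start time, end time, exit code, and any exception are recorded in the task repository automatically (class name is illustrative):

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@EnableTask
@SpringBootApplication
public class SimpleTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(SimpleTaskApplication.class, args);
    }

    // The whole execution of this Boot application is the task; no Job/Step
    // definitions are needed, yet the run is persisted in the task repository.
    @Bean
    public CommandLineRunner work() {
        return args -> System.out.println("doing the short-lived work");
    }
}
```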
How does the packaging of multiple tasks work with dependencies - In short, this is not recommended. The idea of a Spring Cloud Task is that the execution of the Spring Boot application is the Task. While you could package up multiple tasks and, using different methods, have them execute based on different stimuli, that goes against the 12-factor application concepts, which are essential for correct use of Spring Cloud Task.
My two cents
For the best option for a modern batch platform, you really need to look into some form of platform first, and that begins at the Cloud Foundry/Kubernetes/Mesos/YARN layer. Without that, you end up building a large part of the infrastructure yourself. That is why Spring XD evolved into Spring Cloud Data Flow. The added complexity that lived in the containers of Spring XD is removed by requiring a modern platform to run on (since they all handle those guarantees themselves). Without that piece, you're going to spend a lot of time managing the deployment and orchestration of applications that most modern platforms handle for you.
From there, the choice becomes pretty easy IMHO with Spring Cloud Task for simple tasks, Spring Batch for batch jobs, and Spring Cloud Data Flow for orchestration.