I have two jobs defined in two different XML files, say Job A and Job B.
I need to call Job B on successful completion of Job A.
What is the best approach for doing this?
I am pretty new to Spring Batch, so I am looking for the best way to handle it.
You can create a super job and run Job A and Job B as steps inside it, specifying that Job B should execute only on successful completion of Job A.
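As a minimal sketch, assuming the XML-defined jobs are available as beans named jobA and jobB, the super job can be built with Spring Batch's JobStep (Java config shown; step and job names are placeholders):

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class SuperJobConfig {

    @Bean
    public Job superJob(JobBuilderFactory jobs, StepBuilderFactory steps, JobLauncher jobLauncher,
                        @Qualifier("jobA") Job jobA, @Qualifier("jobB") Job jobB) {

        // Wrap each existing job in a JobStep
        Step stepA = steps.get("stepA").job(jobA).launcher(jobLauncher).build();
        Step stepB = steps.get("stepB").job(jobB).launcher(jobLauncher).build();

        // stepB (Job B) runs only if stepA (Job A) completed successfully
        return jobs.get("superJob")
                .start(stepA)
                .next(stepB)
                .build();
    }
}
```

The same thing can be expressed in XML by nesting a job reference inside a step, if you prefer to stay with the XML configuration.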
There are too many jobs on Azkaban, so I have to test new jobs one by one manually.
Assume I upload some new jobs. Is it possible to write a script (in Python or any other language) that fetches the dependencies between these jobs and then runs them on Azkaban in parallel?
For instance, there are three jobs a, b, and c, and b depends on a. They should be scheduled like this:
Start job a and job c.
When job a finishes, start job b.
I did not find any helpful info or API on the official Azkaban website (maybe I missed something useful).
Any help is appreciated.
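A rough sketch of the scheduling logic described above: the dependency-aware launch (start a and c immediately, start b after a finishes) can be expressed with CompletableFuture, with the actual Azkaban calls hidden behind a hypothetical AzkabanClient wrapper around Azkaban's HTTP API (the concrete endpoints would have to come from the Azkaban documentation):

```java
import java.util.concurrent.CompletableFuture;

public class DependencyRunner {

    // Hypothetical wrapper around Azkaban's HTTP API; runFlow is assumed to block until the flow finishes.
    interface AzkabanClient {
        void runFlow(String flowName);
    }

    public static void main(String[] args) {
        AzkabanClient client = flowName -> {
            // Placeholder: call Azkaban's execute-flow endpoint here and poll until completion.
            System.out.println("Running " + flowName);
        };

        // a and c have no dependencies, so they start immediately and in parallel.
        CompletableFuture<Void> a = CompletableFuture.runAsync(() -> client.runFlow("a"));
        CompletableFuture<Void> c = CompletableFuture.runAsync(() -> client.runFlow("c"));

        // b depends on a, so it starts only after a completes.
        CompletableFuture<Void> b = a.thenRunAsync(() -> client.runFlow("b"));

        // Wait for everything to finish.
        CompletableFuture.allOf(b, c).join();
    }
}
```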
What is the best way to pass parameters between Spring XD jobs within a composed job? Can I get the parent's (composed job's) execution context to set job parameters and make them available to the next jobs?
In short...don't. There is no common component between the jobs within a composed job. That feature is really intended for job orchestration. You're better off handling any shared state between the jobs on your own.
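One hedged sketch of "handling shared state on your own": the first job writes its result to a shared table keyed by a value both jobs receive as a job parameter, and the next job reads it back. The table and column names below are made up for illustration:

```java
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

// Persists and reads state shared between separately executed jobs.
public class SharedJobState {

    private final JdbcTemplate jdbcTemplate;

    public SharedJobState(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    // Called by the last step of the first job.
    // correlationId is a value both jobs receive as a job parameter (e.g. a file name or run id).
    public void save(String correlationId, String key, String value) {
        jdbcTemplate.update(
                "INSERT INTO shared_job_state (correlation_id, state_key, state_value) VALUES (?, ?, ?)",
                correlationId, key, value);
    }

    // Called by the first step of the next job, using the same correlationId job parameter.
    public String load(String correlationId, String key) {
        return jdbcTemplate.queryForObject(
                "SELECT state_value FROM shared_job_state WHERE correlation_id = ? AND state_key = ?",
                String.class, correlationId, key);
    }
}
```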
We have the below requirement:
In Spring XD, we have a job; let's assume the job name is MyJob.
It is invoked by another process using the Spring XD REST service; let's call that process OutsideProcess (a non-Spring-XD process).
OutsideProcess invokes MyJob whenever a file is added to a location (let's call it FILES_LOC) that OutsideProcess is listening to.
In this scenario, let's assume that MyJob takes 5 minutes to complete.
At 10:00 AM, a file is copied to FILES_LOC, so OutsideProcess triggers MyJob immediately (it completes at approximately 10:05 AM).
At 10:01 AM, another file is copied to FILES_LOC, so OutsideProcess triggers another instance of MyJob at 10:01 AM. However, the second instance is queued and starts executing only after the first instance finishes (at approximately 10:05 AM).
If we invoke different jobs at the same time they run concurrently, but multiple instances of the same job do not run concurrently.
Please let me know how I can run multiple instances of the same job concurrently.
Thanks in advance.
The only thing I can think of is dynamic deployment of the job and triggering it right away. You can use the Spring XD REST template to create the job definition on the fly and launch it after sleeping a few seconds. And make sure you undeploy/destroy the job when it completes successfully.
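A rough sketch of that idea with the Spring XD REST client (SpringXDTemplate); the method names should be checked against the spring-xd-rest-client version you are using, and the definition string and parameters are only examples:

```java
import java.net.URI;
import org.springframework.xd.rest.client.impl.SpringXDTemplate;

public class DynamicJobLauncher {

    public static void main(String[] args) throws Exception {
        SpringXDTemplate xd = new SpringXDTemplate(new URI("http://xd-admin-host:9393"));

        // Create a uniquely named copy of the job definition on the fly and deploy it.
        String jobName = "MyJob-" + System.currentTimeMillis();
        xd.jobOperations().createJob(jobName, "myjobmodule", true);

        // Give the deployment a moment to complete before launching.
        Thread.sleep(5000);
        xd.jobOperations().launchJob(jobName, "{\"input\":\"/path/to/file\"}");

        // After verifying the job execution has completed successfully, clean up the definition:
        // xd.jobOperations().destroyJob(jobName);
    }
}
```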
Another solution could be to create a few module instances of your job with different names and use them as your slave processes. You can query the status of these job module instances and launch the one that has finished, or queue the one that was least recently launched.
Remember you can run jobs with partition support if applicable. This way you will finish your job faster and be able to run more jobs.
I have multiple Spring Batch jobs to be executed sequentially.
I need to pass the results of job1 to job2, where they will be processed, then pass the data of job2 to job3, and so on; the last job (job5) may use the results of job1 and write the output.
Job1 reads from the DB and stores the results in a HashMap.
Job2 reads from a file and uses the job1 HashMap for processing the results.
So can anyone please suggest the best solution for this? I am able to pass data between steps using the ExecutionContext and the ExecutionContextPromotionListener, but I am not sure how to do the same between multiple jobs.
Instead of having 5 jobs for your batch processing, you should have 5 steps in the same job. That is the best way to achieve what you are trying to do.
The Spring Batch framework keeps the state of every step execution, so if one of your steps fails you can relaunch the job and it will only process the remaining steps. Of course, there are customisation options to control how and when a step is considered failed or restartable.
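A minimal sketch of that approach: a single job with sequential steps, using Spring Batch's ExecutionContextPromotionListener so that data written to a step's ExecutionContext is promoted to the job's ExecutionContext and becomes visible to later steps (step names and keys are placeholders):

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.listener.ExecutionContextPromotionListener;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class SequentialStepsJobConfig {

    // Promotes the "dbResults" key from the step ExecutionContext to the job ExecutionContext.
    @Bean
    public ExecutionContextPromotionListener promotionListener() {
        ExecutionContextPromotionListener listener = new ExecutionContextPromotionListener();
        listener.setKeys(new String[] {"dbResults"});
        return listener;
    }

    @Bean
    public Job sequentialJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        // Step 1: read from the DB and put the results into the step ExecutionContext.
        Step readFromDb = steps.get("readFromDb")
                .tasklet((contribution, chunkContext) -> {
                    chunkContext.getStepContext().getStepExecution()
                            .getExecutionContext().put("dbResults", "...results read from the DB...");
                    return RepeatStatus.FINISHED;
                })
                .listener(promotionListener())
                .build();

        // Step 2: read the file and use the promoted "dbResults" from the job ExecutionContext.
        Step processFile = steps.get("processFile")
                .tasklet((contribution, chunkContext) -> {
                    Object dbResults = chunkContext.getStepContext().getStepExecution()
                            .getJobExecution().getExecutionContext().get("dbResults");
                    // ... process the file using dbResults ...
                    return RepeatStatus.FINISHED;
                })
                .build();

        return jobs.get("sequentialJob")
                .start(readFromDb)
                .next(processFile)
                .build();
    }
}
```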
Spring Batch does not provide out-of-the-box support for this across separate jobs. It seems all you need is to configure several steps that execute sequentially within one job; you can also split the work across the steps using readers, processors, and writers.
How do I run four or five jobs using the same Quartz cron trigger? Extending the code must be easy, as we will keep adding jobs as we go.
Are there any implementation details for this particular scenario?
Please help.
I think you have 2 choices here:
1 - create a new trigger based on the same cron expression each time you add a new job. This can be achieved easily with a wrapper bean (e.g. "MyCronJobScheduler") that holds the cron expression as an instance field and uses it to create a new trigger + job each time you call that bean's MyCronJobScheduler.addJob() method (see the sketch below)...
2 - use a parent/child pattern, where your cron trigger schedules a parent job whose only purpose is to start the child jobs each time it is executed (since you can fire jobs from another job, or from a trigger/job listener).
Hope that helps.
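A sketch of the first choice with the Quartz 2.x builder API (the MyCronJobScheduler name comes from the answer above; everything else is illustrative):

```java
import static org.quartz.CronScheduleBuilder.cronSchedule;
import static org.quartz.JobBuilder.newJob;
import static org.quartz.TriggerBuilder.newTrigger;

import org.quartz.Job;
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;

public class MyCronJobScheduler {

    private final Scheduler scheduler;
    private final String cronExpression; // the single cron expression shared by all jobs

    public MyCronJobScheduler(Scheduler scheduler, String cronExpression) {
        this.scheduler = scheduler;
        this.cronExpression = cronExpression;
    }

    // Each call registers another job with its own trigger built from the same cron expression.
    public void addJob(String name, Class<? extends Job> jobClass) throws SchedulerException {
        JobDetail job = newJob(jobClass)
                .withIdentity(name, "cron-jobs")
                .build();

        Trigger trigger = newTrigger()
                .withIdentity(name + "-trigger", "cron-jobs")
                .withSchedule(cronSchedule(cronExpression))
                .build();

        scheduler.scheduleJob(job, trigger);
    }
}
```

Adding a new job is then a single call, e.g. myCronJobScheduler.addJob("reportJob", ReportJob.class).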