I wrote a background job that accepts some parameters, and I scheduled it to run periodically. I now want to schedule it with a different parameter set, but the parse.com console says I have no background jobs.
I've worked around the problem by adding the same job multiple times with slightly different names, but this solution is far from ideal. There should be a way to schedule a job with multiple parameter sets and different schedules.
Is there a way to schedule the same job multiple times?
Related
I'm a new SLURM user and I'm trying to figure out the best way to submit a job that requires running the same command 400,000 times with different input files (approximately 200MB of memory per CPU and about 4 minutes per instance; each instance runs independently).
I read through the documentation, and so far it seems that arrays are the way to go.
I can use up to 3 nodes on my HPC with 20 cores each, which means that I could run 60 instances of my command at the same time. However, the per-user limit is 10 running jobs at a time, with at most 20 jobs in the queue.
So far, everything I've tried runs each instance of the command as a separate job, thus limiting it to 10 instances in parallel.
How can I fully utilize all available cores in light of the job limits?
Thanks in advance for your help!
You can have a look at tools like GREASY that will allow you to run a single Slurm job and spawn multiple subtasks.
The documentation explains how to install and use it, and can be found here.
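As a rough illustration only (the exact greasy invocation and any module setup vary between installations, so treat the lines below as an assumption and check the documentation linked above), GREASY takes a plain task file with one independent command per line:

    ./my_command inputs/file_000001.dat
    ./my_command inputs/file_000002.dat
    ./my_command inputs/file_000003.dat

and you run it from inside a single Slurm allocation covering all the cores you are allowed to use:

    #!/bin/bash
    #SBATCH --nodes=3
    #SBATCH --ntasks=60
    greasy tasks.txt

GREASY then keeps the allocated cores busy, starting the next task from the list as soon as one finishes.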
You don't even need a job array to achieve this. First submit a single job via the sbatch job_script command; inside the job script you can customise how the work is launched. Using srun with & (backgrounding) inside a for loop lets you run as many instances in parallel as your allocation allows, as in the sketch below.
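A minimal sketch of such a job script (the command name, input-file pattern, memory and time values are placeholders; adjust them to your workload):

    #!/bin/bash
    #SBATCH --nodes=3
    #SBATCH --ntasks=60           # 3 nodes x 20 cores
    #SBATCH --cpus-per-task=1
    #SBATCH --mem-per-cpu=250M    # a bit above the ~200MB each instance needs
    #SBATCH --time=48:00:00       # adjust to the size of your input batch

    # Launch one single-core job step per input file, at most 60 at a time.
    for f in inputs/*.dat; do
        srun --nodes=1 --ntasks=1 --exclusive ./my_command "$f" &
        while [ "$(jobs -rp | wc -l)" -ge 60 ]; do
            sleep 5               # wait for a free core before launching more
        done
    done
    wait                          # don't exit until every step has finished

Because this is one job in the queue, the 10-running-jobs limit applies once, while all 60 cores of the allocation stay busy.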
I'm looking for a way to monitor several long-running jobs in a session from another, parallel job. However, there is no way to pass the current session's jobs (Get-Job) as a parameter into another job unless I assign them to a variable and process them individually in a pipeline, which is time-consuming.
I might need to end up doing something like this, even though it keeps the session busy: https://gallery.technet.microsoft.com/scriptcenter/Monitor-and-display-808ce573 . The downside of this approach is that I cannot stop the jobs or interact with them in any way until all of them are completed or failed, which is why I was trying to find a solution in a parallel monitoring job.
This is something that I've been thinking about for a while. How does Laravel's task scheduler handle multiple tasks scheduled at the same time?
Let's say I had 4 different commands, each set to execute at 1:15 AM:
$schedule->command('emails:send')->daily()->at('1:15');
$schedule->command('cache:maintenance')->daily()->at('1:15');
$schedule->command('users:remove-deleted')->daily()->at('1:15');
$schedule->command('users:notification-reminders')->daily()->at('1:15');
Also, for argument's sake, let's say each command took 2-5 minutes to complete. The scheduler's cron entry (php artisan schedule:run) fires every minute, so what would happen at 1:16 AM if the first command hasn't completed yet? Does Laravel place the remaining commands into a queue automatically, or would I have to explicitly create a queue worker for each command?
If you have a queue set up with multiple workers, the queued commands can run simultaneously, and most production servers can easily handle several tasks at once. Otherwise, they will be added to one queue stack and run in the order specified.
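If the concern is simply that one slow command at 1:15 would hold up the rest, here is a minimal sketch (assuming a Laravel version where runInBackground() is available, roughly 5.4+, and reusing the command names from the question):

    // app/Console/Kernel.php
    namespace App\Console;

    use Illuminate\Console\Scheduling\Schedule;
    use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

    class Kernel extends ConsoleKernel
    {
        protected function schedule(Schedule $schedule)
        {
            // runInBackground() starts each command in its own process, so the
            // four 1:15 entries no longer run one after the other;
            // withoutOverlapping() keeps a command from starting while a
            // previous run of the same command is still in progress.
            $schedule->command('emails:send')->daily()->at('1:15')
                     ->runInBackground()->withoutOverlapping();
            $schedule->command('cache:maintenance')->daily()->at('1:15')
                     ->runInBackground()->withoutOverlapping();
            $schedule->command('users:remove-deleted')->daily()->at('1:15')
                     ->runInBackground()->withoutOverlapping();
            $schedule->command('users:notification-reminders')->daily()->at('1:15')
                     ->runInBackground()->withoutOverlapping();
        }
    }

Without runInBackground(), the scheduler executes the 1:15 entries sequentially in the order they are defined, which matches the "run in the order specified" behaviour described above.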
I have a problem with the Parse job scheduler. I have made a job that accepts some parameters and I want to run different instances of this job. Is there a way to do this? When I click on the Schedule a Job button it doesn't let me schedule the job a second time; it only lets me create the first job. Also, in the new Parse dashboard the Schedule a Job button isn't even there.
As a free user you can only schedule a single job. If you want to schedule more than one job you must pay for it. Take a look here for more information.
In a Sinatra app I need to run on a daily basis a job in the background (I will probably use sidekiq for this) for each User of the app.
I'd like to distribute them evenly during the day according to the number of users. So, for instance if there are 12 users the job has to be executed once every two hour and if there are 240 users the job has to be executed every 6 minutes.
I understand there are some gems that allow you to schedule background jobs (Rufus scheduler, Whenever ...), but I'm not sure they allow changing the interval at which a job is executed according to dynamic values such as the number of objects in a collection.
Any idea how I can achieve that?
Using whenever, you could get started like this. In your config/schedule.rb, divide the day (1440 minutes) by the current user count:
every (1440 / User.all.count).to_i.minutes do
  # add your background command task here
end
Also, don't forget to update the whenever store, which is what actually rewrites the crontab. In your user model, after a user is added successfully, run:
system 'bundle exec whenever --update-crontab store'
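For example, if the User model supports ActiveRecord-style callbacks (an assumption about the app; the callback and method names below are illustrative), the crontab refresh could be hooked in like this:

    # app/models/user.rb
    class User < ActiveRecord::Base
      after_create :refresh_crontab

      private

      # Re-generates the cron entry so the schedule reflects the new user count.
      def refresh_crontab
        system 'bundle exec whenever --update-crontab store'
      end
    end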