Resque - Can it be used to execute Rake tasks? - ruby

I'm studying the possibility of using Resque to store rake tasks as jobs and execute them later across multiple workers. Is this possible? I've already read the documentation for Redis + Resque and tried to find examples of someone doing this, without success.
The rake tasks that I would like to execute as jobs would fire Cucumber + Selenium to perform some web automation tests, which take some time to complete.
Thanks in Advance!
Rodrigo.

As Amar suggested, using invoke worked just fine.
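For anyone looking for the details, here is a minimal sketch of that approach, assuming the Resque worker is started from the application root so the Rakefile can be found; the :rake_tasks queue and the web_tests:run task name are placeholders for your own.

require 'resque'
require 'rake'

# A Resque job that runs an arbitrary rake task by name.
class RakeTaskJob
  @queue = :rake_tasks

  def self.perform(task_name)
    load_tasks_once
    task = Rake::Task[task_name]
    task.reenable # so a long-lived worker can run the same task more than once
    task.invoke
  end

  def self.load_tasks_once
    return if @tasks_loaded
    Rake.application.init           # in a Rails app, Rails.application.load_tasks also works
    Rake.application.load_rakefile  # loads the Rakefile found in the current directory
    @tasks_loaded = true
  end
end

# Somewhere in the application, enqueue the cucumber/selenium task for any free worker:
Resque.enqueue(RakeTaskJob, 'web_tests:run')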

Related

Hangfire is running jobs sequentially

I am using Hangfire hosted by IIS with an app pool set to "AlwaysRunning", and the Autofac extension for DI. Currently, background jobs run with Hangfire are executing sequentially. Both jobs are similar in nature and involve file I/O. The first job executes and starts generating the requisite file. The second job starts executing, then stops until the first job is complete, at which point it resumes. I am not sure if this is an issue related to DI and lifetime scope; I tend to think not, as I create everything with instance-per-dependency scope. I am using OWIN to bootstrap Hangfire, I am not passing any BackgroundJobServer options, and I am not applying any hints via attributes. What would be causing the jobs to execute sequentially? I am using the default configuration for workers. I am sending a POST request to Web API and adding jobs to the queue with the following: BackgroundJob.Enqueue<ExecutionWrapperContext>(c => c.ExecuteJob(job.SearchId, $"{request.User} : {request.SearchName}"));
Thanks In Advance
I was looking for that exact behavior recently, and I managed to get it by using this attribute:
[DisableConcurrentExecution(<timeout>)]
Could it be that you have this attribute applied, either on the job or globally?
Is this what you were looking for?
var hangfireJobId = BackgroundJob.Enqueue<ExecutionWrapperContext>(x => x.ExecuteJob1(arguments));
hangfireJobId = BackgroundJob.ContinueWith<ExecutionWrapperContext>(hangfireJobId, x => x.ExecuteJob2(arguments));
This will basically execute the first part, and when that is finished it will start the second part.

Run Rake task in parallel using different parameters

I have a scenario where data has to be loaded from different input files, so my current approach is to execute the loader script using Selenium Grid on 10 different systems. Each system will have its own input files, and other information for the grid, such as PORT and IP_ADDRESS, will also be passed in the rake task itself. This information will be saved in an Excel file, and code has to be written to build n rake tasks with different environment variables and then execute them all together.
I'm unable to come up with a way in which all the tasks will be created automatically and executed as well.
I know it has to be done using the 'parallel_test' gem or Rake's multitask feature, but I don't know exactly how this can be achieved. Any other approach is also welcome.
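In case it helps, here is a minimal sketch of the Rake multitask approach, assuming the per-system settings have already been read out of the Excel file into an array; the SYSTEMS entries, the INPUT_FILE key, and the load_data loader task name are placeholders for your own setup.

# Rakefile sketch: one task per system, all run in parallel via multitask.
SYSTEMS = [
  { name: 'system1', ip: '192.168.0.11', port: '5551', input: 'data/system1.xlsx' },
  { name: 'system2', ip: '192.168.0.12', port: '5552', input: 'data/system2.xlsx' }
  # ... one entry per grid node, as read from the Excel file
]

SYSTEMS.each do |sys|
  desc "Run the loader against #{sys[:name]}"
  task "load_#{sys[:name]}" do
    # Pass the per-system settings through the environment and shell out,
    # so every run is an isolated process with its own ENV.
    sh({ 'IP_ADDRESS' => sys[:ip], 'PORT' => sys[:port], 'INPUT_FILE' => sys[:input] },
       'rake load_data')
  end
end

desc 'Run every loader in parallel: rake all_loaders'
multitask all_loaders: SYSTEMS.map { |sys| "load_#{sys[:name]}" }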

Start wercker job hourly

I've just started using wercker and I'd like a job to run regularly (e.g. daily or hourly). I realize this may be an anti-pattern, but is it possible? My intent is not to keep the container running indefinitely, just to have my workflow executed at a particular interval.
You can use a call to the Wercker API to trigger a build for any project that is already set up in Wercker.
So maybe set up a cron job somewhere that uses curl to make the right API call?
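For illustration only, here is a rough sketch of that idea as a small Ruby script scheduled from cron; the endpoint path, request body, and token handling are placeholders rather than the actual Wercker API route, so check the Wercker API documentation for the real build-trigger call.

#!/usr/bin/env ruby
# trigger_wercker.rb - hypothetical trigger script; replace the URL and body
# with the real Wercker API build-trigger endpoint for your application.
require 'net/http'
require 'json'
require 'uri'

uri = URI('https://app.wercker.com/api/...')   # placeholder endpoint
request = Net::HTTP::Post.new(uri)
request['Content-Type']  = 'application/json'
request['Authorization'] = "Bearer #{ENV.fetch('WERCKER_TOKEN')}"   # placeholder auth scheme
request.body = { pipelineId: ENV.fetch('WERCKER_PIPELINE_ID') }.to_json  # placeholder payload

response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
  http.request(request)
end
puts "#{response.code} #{response.body}"

# Hourly crontab entry, for example:
#   0 * * * * WERCKER_TOKEN=... WERCKER_PIPELINE_ID=... /usr/local/bin/trigger_wercker.rb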

Multiple Jenkins jobs on multiple platforms

I am trying to build multiple Jenkins jobs, e.g. job1 and job2, where job2 is downstream of job1, and each one needs to run on multiple platforms, e.g. Windows, Mac, and Unix.
I need job2 on Mac to start once job1 on Mac has finished, and the same for the other platforms, but I cannot find a simple way to do this simple thing!
I tried the Matrix configuration, parameterized trigger, extended trigger, and NodeLabel, but none did the right job.
This task looks simple, but I could not achieve it! Any help is really appreciated.
Have you tried the Build Pipeline Plugin or the Pipeline Plugin to address this? You could also consider creating these pipelines dynamically based on the job dependencies and then running them.
Seems like a fit to me.

Is it possible to list running jobs with DRMAA?

I was wondering whether it is possible to list all running jobs in the resource manager using the DRMAA library, not just the ones started via DRMAA itself.
That is, getting data similar to what is output by the squeue command for the SLURM resource manager.
As far as I know, yes, it is, but only for DRMAAv2, which implements listing and job persistence:
https://github.com/troeger/drmaav2-mock/blob/master/drmaa2-list.c
The python-drmaa module does not implement DRMAAv2 yet, but we might start working on it soon:
https://github.com/drmaa-python
If you want to jump in, you're very welcome! ;)
