I'm setting up a job that runs at midnight, while we're not at work, so that the job is done by the next morning. Unfortunately, the job is not working.
Use one of the included scripts to run jobs from the system scheduler.
Windows: use kitchen.bat and run it from Task Scheduler
Linux: use kitchen.sh from cron
Here is the syntax to use:
https://help.pentaho.com/Documentation/8.0/Products/Data_Integration/Command_Line_Tools
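For example, on Linux you could schedule the job in cron with an entry like the following (a minimal sketch; the paths, schedule, and log file are placeholders, and the exact Kitchen flags are described at the link above):

# Run the Kettle job every night at midnight and append the output to a log file.
0 0 * * * /opt/pentaho/data-integration/kitchen.sh -file=/path/to/nightly_job.kjb -level=Basic >> /var/log/nightly_job.log 2>&1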
I have to run multiple Spark jobs one by one in a sequence, so I am writing a shell script. One way I can do this is to check for a success file in the output folder for the job status, but I want to know whether there is any other way to check the status of a spark-submit job using a Unix script on the machine where I am running my jobs.
You can use the command
yarn application -status <APPLICATION ID>
where <APPLICATION ID> is your application ID, and check for a line like:
State : RUNNING
This will give you the status of your application.
To see the list of applications run via YARN, you can use the command
yarn application --list
You can also add -appTypes to limit the listing based on the application type.
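As a minimal sketch, a shell script could poll that status until the application reaches a terminal state (the application ID and sleep interval are placeholders; the grep patterns assume the "State :" and "Final-State :" lines in the yarn application -status output):

#!/bin/bash
APP_ID="application_1234567890123_0001"   # placeholder YARN application ID

# Poll until the application leaves the ACCEPTED/RUNNING states.
while true; do
    STATE=$(yarn application -status "$APP_ID" 2>/dev/null | grep "State :" | grep -v "Final-State" | awk '{print $3}')
    [ "$STATE" != "ACCEPTED" ] && [ "$STATE" != "RUNNING" ] && break
    sleep 30
done

# Inspect the final state to decide success or failure.
FINAL_STATE=$(yarn application -status "$APP_ID" 2>/dev/null | grep "Final-State :" | awk '{print $3}')
if [ "$FINAL_STATE" = "SUCCEEDED" ]; then
    exit 0
else
    echo "Application $APP_ID ended with final state: $FINAL_STATE"
    exit 1
fi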
Scenarios
All batch applications (Spring Batch based) have to be deployed to JBoss EAP.
All batch jobs have to be launched & monitored using the existing enterprise workload/scheduling system, e.g. ASG-Zena, via shell scripts.
All batch jobs will have HTTP endpoints to start a job, get the state of a job, and stop a job. The shell scripts will make use of these endpoints to control the batch jobs.
All batch jobs will be launched asynchronously.
The shell script will return an exit code to indicate the execution result of the batch job, so the enterprise scheduling system can track the success or failure of the batch jobs.
[Enterprise Workload/Scheduling][Shell Scripts] <--> [HTTP][[Batch Applications] JBoss EAP]
Questions
As the batch jobs are launched asynchronously via HTTP endpoints, how can the shell script get the execution result of a batch job?
Your shell script will need to poll for the results. The script kicks off the job, then polls for the result.
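A minimal sketch of such a polling script, assuming hypothetical start and status endpoints that return a job execution ID and a plain-text state (the URLs, job-name parameter, and state names below are placeholders, not the application's actual API):

#!/bin/bash
# Placeholder base URL and job name; replace with the batch application's real endpoints.
BASE_URL="http://batch-host:8080/batch-app"
JOB_NAME="$1"

# Kick off the job asynchronously; assume the endpoint returns a job execution ID.
JOB_ID=$(curl -s -X POST "$BASE_URL/jobs/$JOB_NAME/start")

# Poll the (assumed) status endpoint until the job reaches a terminal state,
# then return an exit code the enterprise scheduler can act on.
while true; do
    STATE=$(curl -s "$BASE_URL/jobs/executions/$JOB_ID/status")
    case "$STATE" in
        COMPLETED)                 exit 0 ;;   # success
        FAILED|STOPPED|ABANDONED)  exit 1 ;;   # failure
        *)                         sleep 30 ;; # still running; poll again
    esac
done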
Is there any? I could only find how to run tasks in the Windows Task Scheduler. There is no utility to run it as a Job, i.e. via CreateJobObject() / AssignProcessToJobObject().
I need my application killed if it consumes more than 1.5 GB of RAM; a Job would be perfect for that...
I made a jar which analyzes system logs. To run this jar on the Hadoop server, I can use the command line, e.g. "bin/hadoop jar log.jar".
My problem is that I want to run this jar in the background as a service on the Ubuntu master machine.
Can anyone help me make the Hadoop jar run like a background service on the Ubuntu machine, running every hour?
You have a few options, here's two:
Configure a crontab job to run your job every hour, something like the following (you'll need to fully qualify the path to hadoop and the jar itself; see the installation sketch after these options):
0 * * * * /usr/lib/hadoop/bin/hadoop jar /path/to/jar/log.jar
Run an Oozie server and configure a coordinator to submit the job on an hourly basis. More effort than the above suggestion, but worth a look.
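For the crontab route, a minimal installation sketch (placeholder paths; redirecting output to a log file makes failures easier to diagnose):

# Open the crontab of the user that should own the job:
crontab -e

# Add an entry that runs at the top of every hour and appends output to a log:
0 * * * * /usr/lib/hadoop/bin/hadoop jar /path/to/jar/log.jar >> /var/log/log-analysis.log 2>&1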
I know how to run a bash script on Jenkins. However, if I use qsub to submit the bash script to an OGE system, how does Jenkins know whether my job has terminated or not?
You can use "-sync y" on your qsub to cause the qsub to wait until the job(s) are finished.
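With that flag, qsub blocks until the job completes and exits with the job's exit status, so a Jenkins shell build step can simply propagate it (a minimal sketch; the job script name is a placeholder):

#!/bin/bash
# -sync y makes qsub wait for the job and exit with the job's exit status.
qsub -sync y my_grid_job.sh   # my_grid_job.sh is a placeholder job script
STATUS=$?

# Propagate the status so Jenkins marks the build as passed or failed.
exit $STATUS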
Jenkins allows you to submit the results of a build using a web-based API. I currently use this to monitor a job remotely for a grid at my organization. If you are able to perform a web POST to the Jenkins server, you can use the script below to accomplish this.
#!/bin/bash
# Values to report back to Jenkins (placeholders).
MESSAGE="Some message about job success"
RUNTIME="Some calculation to estimate runtime"
USERNAME="userNameForJenkinsLogin"
PASSWORD="passwordForJenkinsLogin"
JENKINS_HOST="URLToJenkins"
TEST_NAME="Name of Test"

# POST the result to Jenkins; <result>0</result> indicates success.
curl -i -X POST -d "<run><log>$MESSAGE</log><result>0</result><duration>$RUNTIME</duration></run>" http://$USERNAME:$PASSWORD@$JENKINS_HOST/jenkins/job/$TEST_NAME/postBuildResult
The Jenkins SGE Cloud plugin submits builds to the Sun Grid Engine (SGE) batch scheduler. Both the open source version of SGE and the commercial Univa Grid Engine (UGE) are supported.
This plugin adds a new type of build step, Run job on SGE, that submits batch jobs to SGE. The build step monitors the job status and periodically appends the progress to the build's Console Output. Should the build fail, the errors and the exit status of the job also appear. If the job is terminated in Jenkins, it is also terminated in SGE.
Builds are submitted to SGE by a new type of cloud, SGE Cloud. The cloud is given a label like any other slave. When a job with a matching label is run, SGE Cloud submits the build to SGE.