How to lock a folder while a bash script is running inside it - bash

I have a task flow which can be divided into step_1 and step_2, both running in bash.
step_1 needs to run a bash script in folder_a.
When I submit my jobs to multiple hosts, multiple threads can trigger the bash script in folder_a at the same time, and this breaks the function.
I want my jobs to run as follows:
every job's step_1 has to run one at a time, otherwise it breaks the function in folder_a.
every job's step_2 can run in parallel for speed.
How can I lock the folder while the bash script in folder_a is running, so that other threads refuse to run it?
Maybe I only need to lock the top-level step_1 bash script inside folder_a, instead of the whole folder.

If you are working under a Unix-like system (GNU/Linux, BSD...) you probably have flock, and you can wrap the critical jobs in flock calls, for example as make rules (here $| expands to the order-only prerequisite folder_a/.lock and $@ to the target being built):
step1_job: | folder_a/.lock
    flock $| command [arguments]

folder_a/.lock:
    touch $@
See man flock for the details.
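In plain bash, without make, the same locking can be sketched like this; folder_a/step1.sh and step2.sh are hypothetical names standing in for your actual step scripts:
#!/bin/bash
# step_1: serialized across all jobs via an exclusive lock on folder_a/.lock
(
    flock -x 9               # block until the lock is free
    folder_a/step1.sh        # critical section: the step_1 work
) 9>folder_a/.lock

# step_2: outside the lock, so different jobs can run it in parallel
./step2.sh
If "refuse to run" rather than "wait" is the behaviour you want, flock -n 9 || exit 1 makes a job give up immediately when another job already holds the lock.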

Related

Is there a way to trigger 10 scripts at any given time in Linux shell scripting?

I have a requirement where I need to trigger 10 shell scripts at a time, and I may have 200+ shell scripts to execute.
For example, if I trigger 10 jobs and two of them complete, I need to trigger another 2 jobs so that the number of currently executing jobs stays at 10.
I need your help and suggestions to meet this requirement.
Yes with GNU Parallel like this:
parallel -j 10 < ListOfJobs.txt
Or, if your jobs are called job_1.sh to job_200.sh:
parallel -j 10 job_{}.sh ::: {1..200}
Or, if your jobs have discontiguous, random names but are all shell scripts with a .sh suffix in one directory:
parallel -j 10 ::: *.sh
There is a very good overview here. There are lots of questions and answers on Stack Overflow here.
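For the first form, ListOfJobs.txt is simply a text file with one complete command per line, for example (hypothetical contents):
./job_1.sh --input data1.csv
./job_2.sh --input data2.csv
bash cleanup.sh /tmp/scratch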
Simply run them as background jobs:
for i in {1..10}; { ./script.sh & }
Adding more jobs if less than 10 are running:
while true; do
    pids=($(jobs -pr))
    ((${#pids[@]}<10)) && ./script.sh &
done &> /dev/null
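A variant of the same idea that avoids busy-waiting, assuming bash 4.3+ for wait -n; script.sh and the total of 200 jobs are just placeholders:
#!/bin/bash
# keep at most 10 instances of script.sh running at once
for i in {1..200}; do
    ./script.sh "$i" &
    # once 10 jobs are running, wait for any one of them to finish
    while (( $(jobs -pr | wc -l) >= 10 )); do
        wait -n
    done
done
wait    # wait for the remaining jobs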
There are different ways to handle this:
Launch them together as background tasks (1)
Launch them in parallel (1)
Use the crontab (2)
Use at (3)
Explanations:
(1) You can launch the processes exactly when you like (by launching a command, clicking a button, or whatever event you choose)
(2) The processes will be launched at the same time, every (working) day, periodically.
(3) You choose a time when the processes will be launched together once.
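As a rough sketch of (2) and (3), with /home/user/run_batch.sh as a hypothetical script that starts the 10 jobs:
# (2) crontab entry: launch the batch at 02:00 every working day (Mon-Fri)
0 2 * * 1-5 /home/user/run_batch.sh

# (3) at: run the batch once, tonight at 23:00
echo "/home/user/run_batch.sh" | at 23:00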
I have used the following to trigger 10 jobs at a time.
max_jobs_trigger=10
while mapfile -t -n ${max_jobs_trigger} ary && ((${#ary[@]})); do
    jobs_to_trigger=$(printf '%s\n' "${ary[@]}")
    # Trigger script in background
done
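Filled out a little, and assuming the script paths are listed one per line in jobs.txt (a hypothetical file name), the loop might look like:
#!/bin/bash
max_jobs_trigger=10
while mapfile -t -n ${max_jobs_trigger} ary && ((${#ary[@]})); do
    # launch this batch of (up to) 10 scripts in the background
    for job in "${ary[@]}"; do
        "$job" &
    done
    wait    # let the whole batch finish before reading the next 10 lines
done < jobs.txt
The wait gives batch-of-10 semantics: the next 10 lines are only read after the current batch completes.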

jobs command result is empty when process is run through script

I need to run rsync in the background through a shell script, but once it has started, I need to monitor the status of those jobs from the shell.
The jobs command returns nothing when run in the shell after the script exits, while ps -ef | grep rsync shows that rsync is still running.
I can check the status from inside the script, but I need to run the script multiple times so it uses a different ip.txt file to push, so I can't keep the script running just to check job status.
Here is the script:
for i in `cat $ip.txt`; do
    rsync -avzh $directory/ user@"$i":/cygdrive/c/test/$directory 2>&1 > /dev/null &
done
jobs; #shows the jobs status while in the shell script.
exit 1
Output of jobs command is empty after the shell script exits:
root@host001:~# jobs
root@host001:~#
What could be the reason, and how can I get the status of the jobs while rsync is running in the background? I can't find anything online about this.
Since your shell (the one from which you execute jobs) did not start rsync, it doesn't know anything about it. There are different approaches to fixing that, but they boil down to starting the background processes from your shell. For example, you can run the script you have with the source bash builtin instead of executing it in a separate process. Of course, you'd have to remove the exit 1 at the end, because otherwise it exits your shell.
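A minimal sketch of that approach, assuming the script above is saved as push.sh (a hypothetical name) and the trailing exit 1 has been removed:
source ./push.sh    # run it in the current shell, not in a child process
jobs                # now lists the background rsync jobs
wait                # optionally, block until they all finish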

Running shell script commands sequentially in Jenkins

In Jenkins, I have created a job which runs many shell script commands:
command1
command2
...etc
command1 is an ssh command which calls a shell script file on another server machine. I have to wait until it is finished, and only after that should command2 run.
So, how can I make sure that the script file on the other machine, started by command1, has already finished its job before the Jenkins job starts the next command (command2)?
Or, alternatively, how can I make sure that command2 won't be started until the shell script on the other machine (started by command1) has finished?
You can check out "How to send many commands to shell and wait for the command behind ends" in order to chain commands and wait for their completion.
When you execute a command through an ssh session, you might have to wrap that command in a script able to loop/wait for the command completion.
See an example in "How can I make ssh wait until the command exits?".
Or (a simpler wrapper): How do I know when a command run over ssh has finished?
#!/bin/bash
"$@"
echo "==== Command Output Finished ===="
Look for the string ==== Command Output Finished ==== in your I/O routines to determine where the boundary between command outputs is.
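For example, if that wrapper is saved on the remote machine as wrapper.sh (a hypothetical name and path), the Jenkins build step could be:
ssh user@server ./wrapper.sh /path/to/remote_script.sh
command2    # only reached after the remote script has finished and printed the marker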
Or you can try to isolate those commands, each in its own Jenkins shell build step.
(Not a different job, just a different build step within the same job)

How to create a job in shell script?

I know that executing a command with & appended to the end creates a job and makes the command execute in the background.
Now I want to create a job in a bash shell. I tried
#!/bin/bash
my-job &
# some other tasks
Then I executed jobs, but I got no output. However, ps aux does show my-job is running in the background.
I want to create a job inside a script, because in some cases I want to bring the job into foreground.
jobs are usually an interactive shell concept, as there is usually a controlling terminal involved.
A shell script is executed in a non-interactive, non-login shell session, hence there is no job control by default.
You can force job control inside a script, by setting:
set -m
inside the script.
From help set:
-m Job control is enabled.
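A minimal sketch, where my-job stands in for your real command:
#!/bin/bash
set -m        # enable job control in this non-interactive shell

my-job &      # start it as a background job
jobs          # now reports my-job

# ... some other tasks ...

fg %1         # bring my-job back to the foreground when needed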

Call and start in shell

If I want to call another batch script from within a batch script I could use
CALL File.bat
to pause the execution of the current batch file and wait for the CALLed script to complete.
I can use
START File.bat
if I want them to run simultaneously.
How do I achieve this behavior in a shell script?
If you want to wait:
#!/bin/bash
# do some stuff
/path/to/other/script
# do other stuff
To run it simultaneously (i.e. "in the background"):
#!/bin/bash
# do some stuff
/path/to/other/script &
# do other stuff, then optionally:
wait
# this will wait for all background jobs to finish
There are other ways, and there are things to consider about input and output redirection for the background process if you want to provide specific input and/or capture its output or errors, but those are the basics.
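For example, a common pattern for the background case, with script.log as an arbitrary example name:
#!/bin/bash
# do some stuff
/path/to/other/script </dev/null >script.log 2>&1 &
# the background script can no longer stall reading our stdin,
# and its output and errors are captured in script.log
wait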
By “shell”, I assume you mean *NIX sh.
To execute another script and wait for it to complete, do
sh file.sh
To start it in background, do
(sh file.sh) &
For bash (and other Bourne shell-compatible shells):
you don't need CALL; invoking another script or program executes it synchronously: someprog.sh
you append & to a command to run it asynchronously; note that the program will halt if it attempts to read from stdin while it's in the background: someprog.sh &
