This question already has answers here:
executing shell command in background from script [duplicate]
(4 answers)
Closed 6 years ago.
I want to repeatedly run multiple commands at a time interval using a script.
I tried this
----------------test_script---------------
while true; do
    ls -l >> output.txt
    sleep 3
done

while true; do
    cat file.txt
    sleep 5
done
I want to run both while loops at the same time. When I run the above script, only the first loop runs, and the output of ls -l is redirected to the file. How can I execute both while loops simultaneously from the script?
One way to do this is to run one of the loops in the background and the other in the foreground, like below.
#!/bin/bash
while true; do
    ls -l >> output.txt
    sleep 3
done &   # runs the first while loop in the background and moves on to the next while loop

while true; do
    cat file.txt
    sleep 5
done
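If you would rather keep the script itself in control until you stop it, another sketch (not from the original answer) is to put both loops in the background and block on them with wait:

```shell
#!/bin/bash
# Sketch: run both loops as background jobs, then block on both.
while true; do
    ls -l >> output.txt
    sleep 3
done &

while true; do
    cat file.txt
    sleep 5
done &

wait   # blocks until both background loops exit (e.g. when the script is killed)
```

With wait at the end, Ctrl+C delivered to the script's process group stops both loops together instead of leaving one orphaned.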
This question already has answers here:
History command works in a terminal, but doesn't when written as a bash script
(3 answers)
Closed 2 years ago.
Suppose we have env.sh file that contains:
echo $(history | tail -n2 | head -n1) | sed 's/[0-9]* //' #looking for the last typed command
When executing this script with bash env.sh, the output is empty, but when we execute it with ./env.sh, we get the last typed command. I just want to know the difference between them.
Note that if we add #!/bin/bash at the beginning of the script, ./env.sh no longer outputs anything either.
History is disabled by Bash in non-interactive shells by default. If you want to enable it, you can do so like this:
#!/bin/bash
echo $HISTFILE           # empty in a non-interactive shell
HISTFILE=~/.bash_history # set it again
set -o history           # re-enable history recording
history                  # the command works now
This is done to avoid cluttering the history with commands run by shell scripts.
Adding a shebang (meaning the file is to be interpreted by the program named in the shebang) means that running it via ./env.sh invokes the script with the /bin/bash binary, i.e. a fresh non-interactive bash, thus again printing no history.
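A minimal demonstration of the answer's point (the path /tmp/demo_history is a hypothetical scratch file, used here so the demo does not touch your real ~/.bash_history):

```shell
#!/bin/bash
# Sketch: history is off in non-interactive shells; re-enable it.
HISTFILE=/tmp/demo_history   # scratch file, not the real ~/.bash_history
set -o history               # turn in-memory history recording back on
echo "first"  > /dev/null
echo "second" > /dev/null
history                      # now lists the commands above
```

Without the set -o history line, the final history call prints nothing at all.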
This question already has an answer here:
How to execute 4 shell scripts in parallel, I can't use GNU parallel?
(1 answer)
Closed 2 years ago.
I want to run multiple bash scripts in parallel.
example of my script running : ./test1.sh $1 and ./test2.sh $1
I tried this: parallel ::: "~/path/test1.sh $1" "~/path/test2.sh $1"
It's not working properly; any idea how to fix this?
You could use xargs:
echo "$HOME/path/test1.sh $1 $HOME/path/test2.sh $1" | xargs -P0 -n2 /bin/bash
($HOME is used instead of ~ here because tilde expansion does not happen inside double quotes, so bash would otherwise be handed the literal path.)
-P0 says "run as many invocations in parallel as possible"
-n2 passes two arguments to each /bin/bash invocation, in this case the script and the parameter
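If neither GNU parallel nor xargs is an option, plain background jobs do the same thing. This is a sketch using the question's own script paths:

```shell
#!/bin/bash
# Sketch: start both scripts in the background with the same argument,
# then wait for each one and report its exit status.
~/path/test1.sh "$1" & pid1=$!
~/path/test2.sh "$1" & pid2=$!

wait "$pid1"; status1=$?
wait "$pid2"; status2=$?
echo "test1 exited with $status1, test2 exited with $status2"
```

Waiting on each PID individually (rather than a bare wait) is what lets you see which of the two scripts failed.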
This question already has answers here:
How do I suspend and resume a sequence of commands in Bash?
(1 answer)
Run one command after another, even if I suspend the first one (Ctrl-z)
(2 answers)
Closed 5 years ago.
After running the following command...
$ for i in {1..10}; do sleep 3; echo $i; done
...if I wait a few seconds and hit Ctrl+Z, then I get the following:
1
2
^Z
[1]+ Stopped sleep 3
Now if I use fg to resume the process, it resumes the sleep 3 part of the loop, but does not finish the loop:
$ fg
sleep 3
$
Is there a way to stop the process such that the loop can be continued later?
As others mentioned, you need to start a new subshell by wrapping the loop in parentheses: (for i in {1..10}; do sleep 3; echo $i; done)
You can then suspend it with Ctrl+Z. If you run the jobs command, you should see the suspended job; resume it with the fg or bg commands.
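Putting that together with the question's own loop, a sketch of the session:

```shell
# Wrap the loop in parentheses so it runs in a single subshell process;
# Ctrl+Z then suspends that whole process, not just the current sleep.
(for i in {1..10}; do sleep 3; echo $i; done)
# After pressing Ctrl+Z:
#   jobs   # shows the subshell as a single Stopped job
#   fg     # resumes the loop where it left off
```

Because the subshell is one process, the shell's job control suspends and resumes the entire loop as a unit instead of just the sleep that happened to be in the foreground.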
This question already has answers here:
Getting ssh to execute a command in the background on target machine
(19 answers)
Closed 6 years ago.
I'm trying to run a bash script on a remote machine, and I'd like to return immediately after running the script in the background of the remote machine. For instance:
$ cat foo.txt
sleep 2000 &
then when I tried to do:
$ ssh x.x.x.x 'bash -s' < foo.txt
the command never returns. Is there a way to make it return while sleep runs in the background on the remote machine?
Maybe put this in foo.txt:
sleep 2000 >&- 2>&- <&- &
>&- means close stdout.
2>&- means close stderr.
<&- means close stdin.
& means run in the background.
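An alternative sketch that avoids the close-descriptor syntax is to redirect the streams to /dev/null and detach with nohup (nohup also keeps sleep alive if the remote shell sends SIGHUP when it exits):

```shell
# foo.txt -- script fed to the remote shell via: ssh x.x.x.x 'bash -s' < foo.txt
nohup sleep 2000 >/dev/null 2>&1 </dev/null &
```

Either way the point is the same: once the background job holds no descriptor tied to the ssh connection, the ssh command can return immediately.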
This question already has answers here:
Using Bash to display a progress indicator [duplicate]
(12 answers)
Closed 7 years ago.
I have a bash script that ends as follows:
trap "exit" INT
for ((i=0; i < $srccount; i++)); do
    echo -e "\"${src[$i]}\" will be synchronized to \"${dest[$i]}\""
    echo -e $'Press any key to continue or Ctrl+C to exit...\n'
    read -rs -n1
    #show_progress_bar()
    rsync ${opt1} ${opt2} ${opt3} ${src[$i]} ${dest[$i]}
done
I need a command or a function such as show_progress_bar() that prints a . (dot) to stdout every second while the rsync command is running (or a rotating character that cycles through / - \ | while rsync runs).
Is it possible? Do I need to write such a function myself, or are there scripts available for this purpose?
It's not pretty, but it works:
~$ while true; do echo -n .; sleep 1; done & sleep 3; kill %-; wait; echo;
[1] 26255
...[1]+ Terminated while true; do
echo -n .; sleep 1;
done
(exchange the "sleep 3" for your actual work)
It works like this:
The while loop runs as a background job.
Meanwhile, your work ("sleep 3" in my example) runs in the foreground.
When the work is done, "kill %-" kills the echo loop.
Then we wait for the job to terminate, and echo a newline, just in case.
Like I said, it's not pretty. And there's probably a much better way to do it. :)
EDIT: For example, like the answer here: Using BASH to display a progress (working) indicator
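For the rotating / - \ | variant, here is a sketch along the same lines, with sleep 3 again standing in for the real rsync command:

```shell
#!/bin/bash
# Sketch: spin / - \ | while a background job is still alive.
sleep 3 &                 # stand-in for: rsync ... &
work_pid=$!

spin='/-\|'
i=0
while kill -0 "$work_pid" 2>/dev/null; do   # job still running?
    printf '\r%s' "${spin:i++%4:1}"         # overwrite the spinner in place
    sleep 0.2
done
wait "$work_pid"                            # collect the job's exit status
printf '\rdone\n'
```

Polling with kill -0 avoids the kill-and-wait cleanup of the dot version: the spinner loop simply falls through once the background job exits.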