Running Multiple Bash Scripts in Parallel [duplicate] - bash

This question already has an answer here:
How to execute 4 shell scripts in parallel, I can't use GNU parallel?
(1 answer)
Closed 2 years ago.
I want to run multiple bash scripts in parallel.
Example of how my scripts are run: ./test1.sh $1 and ./test2.sh $1
I tried this: parallel ::: "~/path/test1.sh $1" "~/path/test2.sh $1"
It is not working properly. Any idea how to fix this?

You could use xargs (write $HOME rather than ~, since tilde is not expanded inside double quotes):
echo "$HOME/path/test1.sh $1 $HOME/path/test2.sh $1" | xargs -P0 -n2 /bin/bash
-P0 says "run all in parallel"
-n2 passes two arguments to /bin/bash, in this case the script and the parameter
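If you prefer to avoid xargs or GNU parallel entirely, here is a minimal sketch using plain background jobs (assuming both scripts live under $HOME/path and take the same argument):
#!/usr/bin/env bash
# Start both scripts as background jobs, then wait until both have finished.
"$HOME/path/test1.sh" "$1" &
"$HOME/path/test2.sh" "$1" &
wait   # returns once every background job has exited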

Related

Difference between bash script.sh and ./script.sh [duplicate]

This question already has answers here:
History command works in a terminal, but doesn't when written as a bash script
(3 answers)
Closed 2 years ago.
Suppose we have env.sh file that contains:
echo $(history | tail -n2 | head -n1) | sed 's/[0-9]* //' #looking for the last typed command
When executing this script with bash env.sh, the output is empty, but when we execute it with ./env.sh, we get the last typed command.
I just want to know the difference between them.
Notice that if we add #!/bin/bash at the beginning of the script, ./env.sh will no longer output anything.
History is disabled by bash in non-interactive shells by default. If you want to enable it, however, you can do so like this:
#!/bin/bash
echo $HISTFILE # will be empty in a non-interactive shell
HISTFILE=~/.bash_history # set it again
set -o history
# the command will work now
history
This is done to avoid cluttering the history with commands run from shell scripts.
Adding a hashbang (which tells the kernel to interpret the file with the program named on that line) means that running it via ./env.sh starts a fresh, non-interactive /bin/bash to run the script, so again no history is printed.
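As a quick way to confirm which mode a script is running in, here is a small sketch (not from the original answer) that inspects bash's option flags:
#!/bin/bash
# $- lists the shell's option flags; it contains "i" only in an
# interactive shell, which is where history is enabled by default.
case $- in
  *i*) echo "interactive shell: history is recorded" ;;
  *)   echo "non-interactive shell: history is off unless enabled" ;;
esac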

How to embed and run a multi-line perl script stored in a bash variable in a bash script (without immediately running perl) [duplicate]

This question already has answers here:
Run Perl Script From Unix Shell Script
(2 answers)
Closed 3 years ago.
How do I replace [RUN_ABOVE_PERL_SORTING_SCRIPT_HERE] with something that runs this perl script stored in a bash variable?
#!/usr/bin/env bash
# The perl script to sort getfacl output:
# https://github.com/philips/acl/blob/master/test/sort-getfacl-output
find /etc -name .git -prune -o -print | xargs getfacl -peL | [RUN_ABOVE_PERL_SORTING_SCRIPT_HERE] > /etc/.facl.nogit.txt
Notes:
I do not want to employ 2 files (a bash script and a perl script) to solve this problem; I want the functionality to be stored all in one bash script file.
I do not want to immediately run the perl script when storing the perl-script variable, because I want to run it later in the getfacl(1) bash pipeline shown above.
There are many similar Stack Overflow questions and answers, but none that I can find (with clean-reading code, anyway) that solve both the a) multi-line and b) delayed-execution parts of this embedded-perl-script problem.
And to clarify: this problem is not specifically about getfacl(1), which is simply a catalyst to explore how to embed perl scripts, and possibly other scripting languages like python, into bash variables for delayed execution in a bash script.
Employ the bash read command, which reads the perl script into a variable that's executed later in the bash script.
#!/usr/bin/env bash
# sort getfacl output: the following code is copied from:
# https://github.com/philips/acl/blob/master/test/sort-getfacl-output
read -r -d '' SCRIPT <<'EOS'
#!/usr/bin/env perl -w
undef $/;
print join("\n\n", sort split(/\n\n/, <>)), "\n\n";
EOS
find /etc -name .git -prune -o -print | xargs getfacl -peL | perl -e "$SCRIPT" > /etc/.facl.nogit.txt
This is covered by Run Perl Script From Unix Shell Script.
As they apply here:
You can pass the code to Perl using -e/-E.
perl -e"$script"
or
perl -e"$( curl "$url" )"
You can pass the code via STDIN.
printf %s "$script" | perl
or
curl "$url" | perl
(This won't work for you, because the pipeline already needs STDIN for the getfacl data.)
You can create a virtual file (a combined sketch follows after this list).
perl <( printf %s "$script" )
or
perl <( curl "$url" )
You can take advantage of perl's -x option.
(Not applicable if you want to download the script dynamically.)
All of the above assume the following command has already been executed:
url='https://github.com/philips/acl/blob/master/test/sort-getfacl-output'
Some of the above assume the following command has already been executed:
script="$( curl "$url" )

Bash Script SSH Commands on Remote Server Not Executing as Expected [duplicate]

This question already has answers here:
Shell script: Run function from script over ssh
(3 answers)
Nested grep with SSH
(1 answer)
Closed 4 years ago.
So I've got a bash script in which I want to SSH onto one of my remote servers and run some commands. This is my code:
MYFUNCTION="
function my_function
{
VAR=$(readlink -f current | sed 's/[^0-9]*//g')
}
my_function
"
ssh -l ${USERNAME} ${HOSTNAME} "${MYFUNCTION}"
The problem is that the VAR variable is not being populated with the command output as it should. I've run the exact same command myself, and I get the desired output, but when doing it through SSH in the bash script, it doesn't work as expected. What am I doing wrong here?
You are putting the code in double quotes, so the variables and commands are being executed on your local machine. Do echo "$MYFUNCTION" and you'll probably be surprised.
Try using a quoted here document:
# Note the single quotes in the next line
ssh -l "$USERNAME" "$HOSTNAME" <<'END_CODE'
function my_function
{
cd www
VAR=$(readlink -f current | sed 's/[^0-9]*//g')
VAR2=$(find . -maxdepth 1 ! -newer "$VAR" ! -name "$VAR"| sort | sed '$!d')
}
my_function
END_CODE
Note also all the quoted variables.
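If you also need to hand a local value to the remote commands, one pattern (a sketch, assuming the value contains no whitespace and bash exists on the remote host) is to pass it as a positional parameter to bash -s while keeping the here document quoted:
#!/usr/bin/env bash
# The quoted here document still expands nothing locally; the local
# value arrives on the remote side as $1 via bash -s.
LOCAL_DIR=www
ssh -l "$USERNAME" "$HOSTNAME" bash -s "$LOCAL_DIR" <<'END_CODE'
cd "$1" || exit 1
VAR=$(readlink -f current | sed 's/[^0-9]*//g')
echo "VAR is $VAR"
END_CODE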

Read command shows error as illegal option [duplicate]

This question already has answers here:
read: Illegal option -d
(2 answers)
Closed 5 years ago.
The following is my code
read file
count=0
while read -n1 c
do
case $c in
.
.
.
.
esac
done < $file
echo "$count"
When I run this code, it shows the error as
read: Illegal option -n
I've just started learning shell programming, so please help me fix this code.
-n is not an option for read in standard Unix sh and (some of) its variants.
read -n runs well on bash, zsh and ksh93, so you may want to select one of them instead of sh or dash (Debian sh), probably by adding a shebang line:
#! /bin/bash
Or run explicitly with bash:
bash foo.sh
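If you do need to stay with plain sh, here is a portable sketch (not part of the linked answers) that reads one character per iteration using fold instead of read -n1, assuming the file name is already in $file; note that fold does not emit the newline characters themselves:
#!/bin/sh
# fold -w1 prints one character per line, so a plain POSIX read can
# consume the input one character at a time. The counter and the final
# echo stay inside the same subshell as the loop.
fold -w1 < "$file" | {
  count=0
  while IFS= read -r c
  do
    count=$((count + 1))   # put your case "$c" in ... esac logic here
  done
  echo "$count"
}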

Repeatedly running multiple commands at an interval using a script [duplicate]

This question already has answers here:
executing shell command in background from script [duplicate]
(4 answers)
Closed 6 years ago.
I want to repeatedly run multiple commands at a time interval using a script.
I tried this
----------------test_script---------------
while true;do
ls -l >>output.txt
sleep 3
done
while true;do
cat file.txt
sleep 5
done
I want to run both while loops at the same time. When I run the above script, only the first while loop runs and the output of ls -l is redirected to the file. How can I execute both while loops simultaneously from the script?
One way to do this is to run one of the loops in the background and the other in the foreground, like below.
#!/bin/bash
while true;do
ls -l >>output.txt
sleep 3
done & # run the first while loop in the background and continue to the second loop
while true;do
cat file.txt
sleep 5
done
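If you would rather have the script stay in the foreground and manage both loops, here is a sketch that backgrounds both and waits for them (the EXIT trap is an extra assumption so that terminating the script also stops the loops):
#!/bin/bash
# Run both loops as background jobs; wait keeps the parent alive,
# and the EXIT trap kills the jobs when the script itself is terminated.
trap 'kill $(jobs -p) 2>/dev/null' EXIT
while true;do
ls -l >>output.txt
sleep 3
done &
while true;do
cat file.txt
sleep 5
done &
wait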
