Bash executes commands out of sequence sometimes [duplicate] - bash

This question already has answers here:
Why didn't the shell command execute in order as I expect?
(4 answers)
bash script order of execution
(2 answers)
Closed 3 years ago.
I am running multiple commands in bash in parallel and need the output to be delimited so the receiving script can separate the values.
I attempted this in a few ways, but it seems that every echo executes instantly, while the command that follows it finishes later.
So I am trying to find a way to separate each command's output with a separator that precedes it.
I actually use a curl request that may take 50-200 ms to respond, but for simplicity I will give an example with the time command.
Here is a rough example:
echo ">" && time &
echo ">" && time &
echo ">" && time &
wait
This produces >>> time time time
I am looking for a way to make it produce >time>time>time
I had some success calling other bash scripts with a trailing echo command instead of running the actual commands, and that works most of the time, but inevitably things get mixed up because of timing.
I will post updates as I work on it. Thank you for the help.

Try this:
echo ">$(time)" &
$(time) &">
echo ">$(time)" &
echo ">$(time)" &
wait
That tells the shell that echo needs the output of the time command before it can print anything, so each separator stays attached to its command's output.
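For the curl case the question mentions, the same pattern should apply. A minimal sketch, with hypothetical URLs standing in for the real endpoints:

# Command substitution finishes before echo runs, so each ">" is printed
# together with the response it belongs to, in a single echo.
echo ">$(curl -s https://example.com/a)" &
echo ">$(curl -s https://example.com/b)" &
echo ">$(curl -s https://example.com/c)" &
wait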


Shell program behaves differently when read from a file vs. pipe [duplicate]

This question already has answers here:
While loop stops reading after the first line in Bash
(5 answers)
Closed 2 years ago.
I made a bash script for my personal use which sets up Selenium WebDriver with appropriate options. Here is its raw link: https://del.dog/raw/edivamubos
If I execute this script using curl after writing it to a file first, like:
curl https://del.dog/raw/edivamubos -o test.sh && \
chmod u+x test.sh && \
bash test.sh
the script works perfectly, as it's intended to.
But usually I like to execute scripts directly using curl, so when I do:
curl https://del.dog/raw/edivamubos | bash
the script behaves very weirdly: it keeps repeating lines 22, 23 and 29 in an infinite loop. I couldn't believe it at first, so I tested this 3-4 times and can confirm it.
Now:
What is the reason for the same script acting differently in both cases?
How do I fix it (i.e. make it work correctly even when executing it directly, without writing it to a file)?
Edit -
If someone wants, they can quickly test this in Google Colab (in case someone intends to test it but doesn't want to install any packages locally). I mention this because you won't be able to reproduce it properly in any bash IDE.
When you pipe the script to bash, this command (line 24):
read -p "Enter your input : " input
reads the next line (i.e. line 25, case $input in) because bash's stdin is connected to curl's stdout, and read reads from the same descriptor as bash.
To avoid that, the developer can change the script so that all input is read from /dev/tty (i.e. the controlling terminal). E.g.:
read -p 'prompt' input </dev/tty
Or the user can use one of the commands below, so that the script is not delivered on bash's stdin and read therefore gets its input from the terminal rather than from the script text.
bash -c "$(curl link)"
bash <(curl link)
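To see the fix in isolation, here is a minimal sketch (a hypothetical stand-in for the real script, not the OP's code):

# demo.sh stands in for the real script; read pulls from /dev/tty, so the
# prompt still works even when the script itself arrives on bash's stdin.
cat > demo.sh <<'EOF'
read -p 'Enter your input : ' input </dev/tty
case $input in
  y) echo "you chose y" ;;
  *) echo "you typed: $input" ;;
esac
EOF
cat demo.sh | bash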

Difference between " ; " and " && " in bash [duplicate]

This question already has answers here:
What is the difference between double-ampersand (&&) and semicolon (;) in Linux Bash?
(4 answers)
Closed 2 years ago.
I've been doing lots of Linux-based stuff with my time, and I know that ; is used to separate commands and that && runs a command after the previous one is done.
But if anyone more knowledgeable than me can explain the difference between the two, that would be nice.
Here's a simple example:
whoami ; hostname
whoami && hostname
; will execute the second command whether or not the first returns without error.
&& is the bash logical AND operator, and will execute the second command only if the first returns successfully, without error.
The success of a command is determined by its exit status (0 means success; any other value is treated as failure).
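A quick illustration (false always exits with a nonzero status, true with zero):

false ; echo "runs anyway"        # ; ignores the exit status, so this prints
false && echo "only on success"   # prints nothing, because false failed
true && echo "only on success"    # prints, because true succeeded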

parallel execution in shell scripting hangs

My requirement is to run multiple shell scripts at a time.
After searching on Google, I concluded that I can use "&" at the end of the command when triggering the run, like:
sh file.sh &
The thing is, I have a for loop which generates the values and passes them as runtime parameters to the shell script.
Sample code:
declare -a arr=("1" "2")
for ((i=0;i<${#arr[@]};++i));
do
sh fileto_run.sh ${arr[i]} &
done
This successfully triggers fileto_run.sh in parallel, but then it just hangs there. Imagine I have an echo statement in the script; the following is how the output looks when it hangs:
-bash-x.x$ 1
2
Until I use Ctrl+C, the execution won't exit.
I thought of using a break statement but that breaks the loop.
Am I doing anything wrong?
Please do correct me.
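For reference, the usual shape of such a loop is sketched below; it assumes the OP's fileto_run.sh and that the intent is to wait for all background jobs before returning to the prompt:

declare -a arr=("1" "2")
for ((i = 0; i < ${#arr[@]}; ++i)); do
  # & sends each run to the background so they execute in parallel
  sh fileto_run.sh "${arr[i]}" &
done
# wait blocks until every background job has finished, so the loop only
# returns once all of their output has been printed
wait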

Storing execution time of a command in a variable

I am trying to write a task-runner for the command line. No rationale. Just wanted to do it. Basically it just runs a command, stores the output in a file (instead of stdout), meanwhile prints a progress indicator of sorts on stdout, and when it's all done, prints Completed ($TIME_HERE).
Here's the code:
#!/bin/bash
task() {
  TIMEFORMAT="%E"
  COMMAND=$1
  printf "\033[0;33m${2:-$COMMAND}\033[0m\n"
  while true
  do
    for i in 1 2 3 4 5
    do
      printf '.'
      sleep 0.5
    done
    printf "\b\b\b\b\b \b\b\b\b\b"
    sleep 0.5
  done &
  WHILE=$!
  EXECTIME=$({ TIMEFORMAT='%E';time $COMMAND >log; } 2>&1)
  kill -9 $WHILE
  echo $EXECTIME
  #printf "\rCompleted (${EXECTIME}s)\n"
}
There are some unnecessarily fancy bits in there I admit. But I went through tons of StackOverflow questions to do different kinds of fancy stuff just to try it out. If it were to be applied anywhere, a lot of fat could be cut off. But it's not.
It is to be called like:
task "ping google.com -c 4" "Pinging google.com 4 times"
What it'll do is print Pinging google.com 4 times in yellow color, then on the next line, print a period. Then print another period every .5 seconds. After five periods, start from the beginning of the same line and repeat this until the command is complete. Then it's supposed to print Complete ($TIME_HERE) with (obviously) the time it took to execute the command in place of $TIME_HERE. (I've commented that part out, the current version would just print the time).
The Issue
The issue is that instead of the execution time, something very weird gets printed. It's probably something stupid I'm doing, but I don't know where the problem originates. Here's the output.
$ sh taskrunner.sh
Pinging google.com 4 times
..0.00user 0.00system 0:03.51elapsed 0%CPU (0avgtext+0avgdata 996maxresident)k 0inputs+16outputs (0major+338minor)pagefaults 0swaps
Running COMMAND='ping google.com -c 4';EXECTIME=$({ TIMEFORMAT='%E';time $COMMAND >log; } 2>&1);echo $EXECTIME in a terminal works as expected, i.e. prints out the time (3.559s in my case).
I have checked and /bin/sh is a symlink to dash. (However that shouldn't be a problem because my script runs in /bin/bash as per the shebang on the top.)
I'm looking to learn while solving this issue so a solution with explanation will be cool. T. Hanks. :)
When you invoke a script with:
sh scriptname
the script is passed to sh (dash in your case), which will ignore the shebang line. (In a shell script, a shebang is a comment, since it starts with a #. That's not a coincidence.)
Shebang lines are only interpreted for commands started as commands, since they are interpreted by the system's command launcher, not by the shell.
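For example, with the file from the question's output:

chmod +x taskrunner.sh
./taskrunner.sh     # launched as a command: the kernel honors the shebang, so bash runs it
sh taskrunner.sh    # handed explicitly to sh (dash here), so the shebang is just a comment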
By the way, your invocation of time does not correctly separate the output of the time builtin from any output the timed command might send to stderr. I think you'd be better with:
EXECTIME=$({ TIMEFORMAT=%E; time $COMMAND >log.out 2>log.err; } 2>&1)
but that isn't sufficient. You will continue to run into the standard problem with trying to put commands into string variables, which is that it only works with very simple commands (a sketch using an array instead follows the list below). See the Bash FAQ, or look at some of these answers:
How to escape a variable in bash when passing to command line argument
bash quotes in variable treated different when expanded to command
Preserve argument splitting when storing command with whitespaces in variable
find command fusses on -exec arg
Using an environment variable to pass arguments to a command
(Or probably hundreds of other similar answers.)
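As a minimal sketch of that alternative (not taken from the linked answers; the ping command is just the OP's example), storing the command in an array preserves argument boundaries:

#!/bin/bash
# Store the command as an array so arguments containing spaces survive.
cmd=(ping google.com -c 4)
# Capture only the time builtin's output; the command's own streams go to files.
EXECTIME=$({ TIMEFORMAT=%E; time "${cmd[@]}" >log.out 2>log.err; } 2>&1)
printf 'Completed (%ss)\n' "$EXECTIME"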

Ruby shell script realtime output

script.sh
echo First!
sleep 5
echo Second!
sleep 5
echo Third!
another_script.rb
%x[./script.sh]
I want another_script.rb to print the output of script.sh as it happens. That means printing "First!", waiting five seconds, printing "Second!", waiting 5 seconds, and so on.
I've read through the different ways to run an external script in Ruby, but none seem to do this. How can I fulfill my requirements?
You can always execute this in Ruby:
system("sh", "script.sh")
Note it's important to specify how to execute this unless you have a proper #!/bin/sh header as well as the execute bit enabled.
