Bash script: update output once a command has ended

I have tried different ways but none of them have worked so far.
echo "Starting"
checklocation(){
    if (command blabla)
    then locationOne=$"Found"
    else locationOne=$"Not found"
    fi
}
checklocation &
echo "Let's check: " $locationOne
echo "Ending"
As my command takes a long time to produce its result, I'd like to print all the output immediately and show the value of $locationOne once the result is ready. The code above prints all the output at once, but $locationOne never appears. I tried printf and \r too, without luck. Any suggestions?
To clarify, I would like the variable's value to be filled in on the "Let's check:" line once the command completes.

echo "Starting"
checklocation(){
    if (command blabla)
    then
        locationOne="Found"
    else
        locationOne="Not found"
    fi
}
echo "Calling function"
checklocation
echo "Let's check: " $locationOne
echo "Ending"
Try the above corrections:
Remove the "$" when assigning the locationOne variable.
Also remove the "&" when calling the function; it sends the function to a background subshell, so the assignment never reaches the calling shell.
Good luck!!
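A minimal sketch of why the & matters (an assignment made inside a backgrounded function happens in a subshell and is lost):

```shell
#!/bin/bash
# Demonstration: an assignment made inside a backgrounded function runs
# in a subshell, so it never reaches the parent shell.
setvar() { result="set"; }

result=""
setvar &              # background: runs in a subshell
wait
bg_result="$result"   # still empty in the parent

setvar                # foreground: runs in the current shell
fg_result="$result"   # now "set"

echo "background: '$bg_result'  foreground: '$fg_result'"
```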

You want to go back and amend screen output later.
This is very difficult in the general case, but you can simplify it dramatically by making sure that the output you write doesn't scroll the screen in such a way that it's hard to predict where the amendment will have to be made.
This example does it by clearing the screen first, so that any extra output is unlikely to scroll. It can then update by coordinates:
#!/bin/bash
check() {
    sleep 3
    tput sc       # Save cursor
    tput cup 1 14 # Set y,x coordinates
    printf '%s' "Found"
    tput rc       # Restore cursor
}
check &
clear # Clear screen so we know where we are
echo "Starting"
echo "Let's check: "
echo "Ending"
wait
This shows:
Starting
Let's check:
Ending
for three seconds, then it updates it to:
Starting
Let's check: Found
Ending
Alternative approaches include:
Keeping the data you want on screen in a string, and just clearing+writing whenever you want to update the full screen
Tracking the number of lines you write, then using the "cursor up" ANSI command (tput cuu) to move up to where you believe the line to amend will be.
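A minimal sketch of that second alternative, assuming the line to amend is a known number of lines above the cursor (TERM is defaulted so tput works when run non-interactively):

```shell
#!/bin/bash
# Sketch of the "cursor up" alternative: count how many lines were
# printed after the line to amend, then move back up with tput cuu.
export TERM="${TERM:-xterm}"

echo "Starting"
echo "Let's check: "
echo "Ending"
# "Let's check:" is now 2 lines above the cursor.
up=$(tput cuu 2)      # cursor up 2 lines
right=$(tput cuf 13)  # cursor right, past "Let's check: "
down=$(tput cud 2)    # cursor back down afterwards
printf '%s%s%s%s\r\n' "$up" "$right" "Found" "$down"
```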

You can use wait to wait until a child process's status changes (check man wait).
# For demo purposes, I have manually added a sleep of 10 seconds.
long_running_command()
{
    sleep 10
    echo "Hey, I am long running command....uh"
}
long_running_command & # <- using & to send this function to the background
echo "I am normal command"
wait # wait until the child terminates
The above script will result in the following output:
I am normal command
Hey, I am long running command....uh
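wait also accepts a PID; $! holds the PID of the most recently backgrounded job, and wait then returns that child's exit status. A small sketch:

```shell
#!/bin/bash
# wait "$pid" blocks until that specific child exits, and wait's own
# exit status is the child's exit status.
( sleep 0.2; exit 3 ) &
pid=$!
wait "$pid"
status=$?
echo "child $pid exited with status $status"
```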

Related

How to cleanly print a message in a bash shell after a background process finishes?

I want to start a lengthy process in the background in my bash shell, get notified when that process is finished, and then be returned to the command line in elegant fashion. Here's what I have so far:
echo $(lengthy_process >/dev/null 2>&1 ; printf "consummatum est.\r" ) &
This almost works. The message "consummatum est" eventually shows, but it leaves my command prompt in an ugly/indeterminate state with the text interjected into what I happen to be typing.
Is there a way to get the background process to print to terminal without interrupting what I'm doing and without requiring a carriage return to get the command prompt into a fresh state?
A more modern take, with notify-send:
( lengthy_process &>/dev/null; notify-send "done" ) &
otherwise you're asking for the interruption. You may want to display exit status as well.
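A sketch of reporting the exit status as well; echo and a short function stand in for notify-send and the lengthy process here:

```shell
#!/bin/bash
# Capture the lengthy process's exit status inside the subshell and
# include it in the completion message (echo stands in for notify-send).
lengthy_process() { sleep 0.1; return 2; }   # stand-in for the real job

msg_file=$(mktemp)
( lengthy_process &>/dev/null; echo "done (exit $?)" >"$msg_file" ) &
wait
msg=$(cat "$msg_file")
rm -f "$msg_file"
echo "$msg"
```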
You can create a script like this (show-msg) that saves the cursor position, prints the message and restores the cursor position after.
#! /bin/bash
tput sc; tput cup 0 0
printf '%s
================
PROCESS FINISHED
================
%s\n' "$(tput setab 13)" "$(tput sgr0)"
tput rc
and then
( lengthy_process &>/dev/null; ./show-msg ) &
which will show the message without interfering with your typing.
I realized the command prompt in the main shell was still active when the final echo prints its message, but it was printing over the prompt and entered text, so you couldn't tell what was happening. (Hitting enter would therefore execute whatever had been written at the command prompt at the moment the sub process completed).
After trying a LOT of different approaches, I eventually settled on one that suits my purposes. IMO, the key thing to do to avoid a confusing mess is to first issue a CTRL-C command from the subshell(s) to the main shell (in order to cancel anything that may have been written at the moment the lengthy process completes), and THEN print the notification to terminal.
Here's what it looks like (with sleep 3 instead of lengthy_process):
TOPSHELLPID=$$; ( (TEMP=$(sleep 3; echo -e "kill -INT $TOPSHELLPID; echo '\n\n===============\nConsummatum Est\n===============\n'; kill -INT $TOPSHELLPID" ); bash -c "$TEMP") & )

How to display a loading status in command prompt while our code is executing in bash?

I am trying to run some code on Linux which might take around 5 minutes. While the code is executing, I need to display a loading status in the command window. Help me write this.
I'd say you should add some sort of loading animation to make it clearer that 'stuff' is happening, or at least should be. There are of course many ways to do this, but this one is the most aesthetically pleasing to me while staying simple.
printf "Loading, please wait a moment.\n\n"
states="/-\|"
while [ -z "${VARIABLENAME+x}" ]; do
    for (( i=0; i<${#states}; i++ )); do
        sleep 0.75
        echo -en "${states:$i:1}" "\r"
    done
done
Run that in the background of your work, set VARIABLENAME to stop the loop, and use clear to remove the last spinner frame and the printf message.
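A usage sketch where the real work is backgrounded and kill -0 is used to poll whether it is still running, so the spinner stops by itself (sleep stands in for the 5-minute job; timings shortened for illustration):

```shell
#!/bin/bash
# Spin while the background job is alive; kill -0 merely tests for the
# process's existence without sending a signal.
long_task() { sleep 1; }   # stand-in for the real 5-minute job

long_task &
pid=$!

states='/-\|'
while kill -0 "$pid" 2>/dev/null; do
    for (( i=0; i<${#states}; i++ )); do
        printf '%s\r' "${states:$i:1}"
        sleep 0.1
    done
done
wait "$pid"
status=$?
printf 'Done.      \n'
```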

Stop command after a given time and return its result in Bash

I need to execute several calls to a C++ program that records frames from a videogame. I have about 1800 test games, and some of them work and some of them don't.
When they don't work, the console returns a Segmentation fault error, but when they do work, the program opens a window and plays the game, and at the same time it records every frame.
The problem is that when it does work, this process does not end until you close the game window.
I need to make a Bash script that will test every game I have and write the names of the ones that work in a text file and the names of the ones that don't work in another file.
For the moment I have tried with this, using the timeout command:
count=0
# Run for every file in the ROMs folder
for filename in ../ROMs/*.bin; do
    # Increase the counter
    (( count++ ))
    # Run the command with a timeout to prevent it from being infinite
    timeout 5 ./doc/examples/videoRecordingExample "$filename"
    # Check if execution succeeds/fails and print in a text file
    if [ $? == 0 ]; then
        echo "Game $count named $filename" >> successGames.txt
    else
        echo "Game $count named $filename" >> failedGames.txt
    fi
done
But it doesn't seem to be working, because it writes all the names on the same file. I believe this is because the condition inside the if refers to the timeout and not the execution of the C++ program itself.
Then I tried without the timeout, and every time a game worked I manually closed the window, and the result was as expected. I tried this with only 10 games, but to test all 1800 I would need it to be completely automatic.
So, is there any way of making this process automatic? Like some command to stop the execution and at the same time know if it was successful or not?
instead of
timeout 5 ./doc/examples/videoRecordingExample "$filename"
you could try this:
./doc/examples/videoRecordingExample "$filename" && sleep 5 && pkill videoRecordingExample
Swap the arguments in the timeout code. It should be:
timeout 5 "$filename" ./doc/examples/videoRecordingExample
Reason: the syntax for timeout is:
timeout [OPTION] DURATION COMMAND [ARG]...
So the COMMAND should be just after the DURATION. In the code above the presumably non-executable file videoRecordingExample would be the COMMAND, which probably returns an error every time.
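For what it's worth, GNU timeout exits with status 124 when it kills the command, so "ran until the timeout" can be told apart from a crash. A sketch with stand-in commands in place of the game binary:

```shell
#!/bin/bash
# GNU timeout returns 124 when it had to kill the command; any other
# non-zero status comes from the command itself (139 for SIGSEGV).
classify() {
    timeout 1 "$@"
    case $? in
        0|124) echo "success" ;;  # finished normally, or ran until killed
        *)     echo "failure" ;;  # crashed or otherwise failed
    esac
}

ok=$(classify sleep 5)             # killed at 1s -> exit 124 -> success
bad=$(classify bash -c 'exit 7')   # fails immediately -> failure
echo "$ok $bad"
```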

Re-start shell script without creating a new process in Linux

I have a shell script which I execute and then, at the end, I get the option to press ENTER and run it again. The problem is that each time I press ENTER a new process is created, and after 20 or 30 rounds I end up with 30 PIDs, which will eventually mess up my Linux box. So my question is: how can I make the script always run in the same process, instead of creating a new one each time I press ENTER?
Code:
#!/bin/bash
echo "Doing my stuff here!"
# Show message
read -sp "Press ENTER to re-start"
# Clear screen
reset
# Re-execute the script
./run_this.sh
exec $SHELL
You would need to exec the script itself, like so
#!/bin/bash
echo "Doing my stuff here!"
# Show message
read -sp "Press ENTER to re-start"
# Clear screen
reset
# Re-execute the script
exec bash ./run_this.sh
exec does not work directly with the shell script here, so you exec bash instead, with your script as an argument.
That said, an in-script loop is a better way to go.
while :; do
    echo "Doing my stuff here!"
    # Show message
    read -sp "Press ENTER to re-start"
    # Clear screen
    reset
done
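Either way, the point of exec is that it replaces the current process instead of forking a new one. A quick check that the PID is preserved across an exec:

```shell
#!/bin/bash
# The outer bash prints its own PID, then execs a new bash that prints
# its PID: both lines show the same number, because exec does not fork.
out=$(bash -c 'echo $$; exec bash -c "echo \$\$"')
first=$(echo "$out" | head -n 1)
second=$(echo "$out" | tail -n 1)
echo "before exec: $first, after exec: $second"
```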

How to get a stdout message once a background process finishes?

I realize that there are several other questions on SE about notifications upon completion of background tasks, and how to queue up jobs to start after others end, and questions like these, but I am looking for a simpler answer to a simpler question.
I want to start a very simple background job, and get a simple stdout text notification of its completion.
For example:
cp My_Huge_File.txt New_directory &
...and when it's done, my bash shell would display a message. This message could just be the completed job's PID, but if I could program unique messages per background process, that would be cool too, so I could have numerous background jobs running without confusion.
Thanks for any suggestions!
EDIT: user000001's answer separates commands with ;. I separated commands with && in my original example. The only difference I notice is that you don't have to surround your base command with braces if you use &&. Semicolons are a bit more flexible, so I've updated my examples.
The first thing that comes to mind is
{ sleep 2; echo "Sleep done"; } &
You can also suppress the accompanying stderr output from the above line:
{ { sleep 2; echo "Sleep done"; } & } 2>/dev/null
If you want to save your program output (stdout) to a log file for later viewing, you can use:
{ { sleep 2; echo "Sleep done"; } & } 2>/dev/null 1>myfile.log
Here's even a generic form you might use (You can even make an alias so that you can run it at any time without having to type so much!):
# dont hesitate to add semicolons for multiple commands
CMD="cp My_Huge_File.txt New_directory"
{ eval $CMD & } 2>/dev/null 1>myfile.log
You might also pipe stdout into another process using | in case you wish to process output in real time with other scripts or software. tee is also a helpful tool in case you wish to use multiple pipes. For reference, there are more examples of I/O redirection here.
You could use command grouping:
{ slow_program; echo ok; } &
or the wait command
slow_program &
wait
echo ok
The most reliable way is to simply have the output from the background process go to a temporary file and then consume the temporary file.
When you have a background process running, it can be difficult to capture its output in a useful way, because multiple jobs will overwrite each other.
For example, if you have two processes which each print out a string with a number "this is my string1" "this is my string2" then it is possible for you to end up with output that looks like this:
"this is mthis is my string2y string1"
instead of:
this is my string1
this is my string2
By using temporary files you guarantee that the output will be correct.
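A minimal sketch of the temp-file approach (the sleeps and strings are just for illustration):

```shell
#!/bin/bash
# Each background job writes to its own temporary file, so concurrent
# output cannot interleave mid-line; the files are then consumed in a
# fixed order after wait.
t1=$(mktemp)
t2=$(mktemp)

( sleep 0.2; echo "this is my string1" ) >"$t1" &
( sleep 0.1; echo "this is my string2" ) >"$t2" &
wait   # block until both background jobs finish

combined=$(cat "$t1" "$t2")   # deterministic order, no interleaving
rm -f "$t1" "$t2"
echo "$combined"
```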
As I mentioned in my comment above, bash already does this kind of notification by default, as far as I know. Here's an example I just made:
$ sleep 5 &
[1] 25301
$ sleep 10 &
[2] 25305
$ sleep 3 &
[3] 25309
$ jobs
[1] Done sleep 5
[2]- Running sleep 10 &
[3]+ Running sleep 3 &
$ :
[3]+ Done sleep 3
$ :
[2]+ Done sleep 10
$