Print command result while executing from script - bash

I am trying to run, from a shell script, a C++ program that prints some output (using std::cout), and I would like to see that output in the console while the program is running.
I tried things like this:
RES=`./program`
RES=$(./program)
But all I can do is display the result at the end, with echo $RES.
How can I display the output in the console at run time, AND capture it in the variable RES?

TTY=$(tty);
SAVED_OUTPUT=$(echo "my dummy c++ program" | tee ${TTY});
echo ${SAVED_OUTPUT};
prints
my dummy c++ program
my dummy c++ program
First we save off the name of the current terminal (because tty doesn't work in a pipeline).
TTY=$(tty)
Then we "T" the output (a letter T looks like one stream in at the bottom, 2 out at the top, and comes from the same "plumbing" metaphor as "pipe"), which copies it to the filename given; in this case the "file" is really a special device representing our terminal.
echo "my dummy c++ program" | tee ${TTY}

You can try it this way:
RES=( $(./program) )
echo "${RES[@]}"
Note that this captures the output as an array of words, not as a single string.

You can use a temporary file:
./program | tee temp
RES=$(< temp)
rm temp
You can generate a temporary file with a unique name by using mktemp.
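For instance, a sketch of the same answer with mktemp (so the name temp cannot collide with an existing file):
tmpfile=$(mktemp)
./program | tee "$tmpfile"   # displayed live on the console
RES=$(< "$tmpfile")          # and captured afterwards
rm -f "$tmpfile"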

res=$(sed -n 'p;' <<< $(printf '%s\n' '*' $'hello\t\tworld'; sleep 5; echo "post-streaming content"))
echo $res
# output:
*
hello world
post-streaming content

[sky@kvm35066 tmp]$ cat test.sh
#!/bin/bash
echo $BASH_VERSION
res=$(printf '%s\n' '*' $'hello\t\tworld'; sleep 5; echo "post-streaming content")
echo "$res"
[sky@kvm35066 tmp]$ bash test.sh
4.1.2(1)-release
*
hello world
post-streaming content
I think the result is correct, and it is what you want.

Related

Force Bash-Script to wait for a Perl-Script that awaits input

I have a Bash script that sequentially runs some Perl scripts which are read from a file. These scripts require a press of Enter to continue.
Strangely, when I run the script it never waits for the input but just continues. I assume something in the Bash script is interpreted as an Enter or some other key press and makes the Perl script continue.
I'm sure there is a solution out there, but I don't really know what to look for.
My Bash script has this while loop, which iterates through the list of Perl scripts (listed in seqfile):
while read zeile; do
    if [[ ${datei:0:1} != 'p' ]]; then
        datei=${zeile:8}
    else
        datei=$zeile
    fi
    case ${zeile: -3} in
        ".pl")
            perl $datei # Here it just goes on...
            #echo "Test 1"
            #echo "Test 2"
            ;;
        ".pm")
            echo $datei "is a Perl Module"
            ;;
        *)
            echo "Something else"
            ;;
    esac
done <<< $seqfile
Notice the two commented lines with echo "Test 1/2". I added them to see how they would be displayed.
They are printed one under the other, as if Enter had been pressed:
Test 1
Test 2
The output of the Perl scripts is correct; I just have to figure out a way to force the input to be read from the user and not from the script.
Have the perl script redirect input from /dev/tty.
Proof of concept:
while read line ; do
    export line
    perl -e 'print "Enter $ENV{line}: ";$y=<STDIN>;print "$ENV{line} is $y\n"' </dev/tty
done <<EOF
foo
bar
EOF
Program output (user input follows each prompt):
Enter foo: 123
foo is 123
Enter bar: 456
bar is 456
@mob's answer is interesting, but I'd like to propose an alternative solution for your use case that will also work if your overall bash script is run with a specific input redirection (i.e. not /dev/tty).
Minimal working example:
script.perl
#!/usr/bin/env perl
use strict;
use warnings;
{
local( $| ) = ( 1 );
print "Press ENTER to continue: ";
my $resp = <STDIN>;
}
print "OK\n";
script.bash
#!/bin/bash
exec 3>&0 # backup STDIN to fd 3
while read line; do
echo "$line"
perl "script.perl" <&3 # redirect fd 3 to perl's input
done <<EOF
First
Second
EOF
exec 3>&- # close fd 3
So this will work both with ./script.bash in a terminal and with, for example, yes | ./script.bash.
For more info on redirections, see e.g. this article or this cheat sheet.
Hoping this helps

How can I run an if/else statement from the output of a screen?

For example, when the terminal screen outputs a certain message, how do you make the script do something?
To BOTH write to the terminal AND use the output, use tee, then read the output into a variable with while read var:
commands | tee /dev/tty | while read var ; do
    if [ "$var" == "this" ] ; then
        echo $var is this
    else
        echo $var is that
    fi
done
assuming you are on a Unix-like system...
Be aware that the pipe creates a subshell!
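Because of that subshell, any variable set inside the loop vanishes when the pipeline ends. A minimal sketch of the usual bash workaround, process substitution (assuming bash rather than plain sh; commands stands in for your real pipeline):
count=0
while read var ; do
    count=$((count + 1))      # runs in the current shell, not a subshell
done < <(commands | tee /dev/tty)
echo "matched $count lines"   # count is still visible here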

IPC in bash (using named pipes, not using expect)

I am writing a bash script that should interact (interactively) with an existing (perl) program. Unfortunately I cannot touch the existing perl program nor can I use expect.
Currently the script works along the lines of this stackoverflow answer Is it possible to make a bash shell script interact with another command line program?
The problem is (read: seems to be) that the perl program does not always send a <newline> before asking for input. This means that bash's while ... read on the named pipe does not "get" (read: display) the perl program's output because it keeps waiting for more. At least that is how I understand it.
So basically the perl program is waiting for input but the user does not know because nothing is on the screen.
So what I do in the bash script is roughly this:
#!/bin/bash
mkfifo $readpipe
mkfifo $writepipe
[call perl program] < $writepipe &> $readpipe &
exec {FDW}>$writepipe
exec {FDR}<$readpipe
...
while IFS= read -r L
do
    echo "$L"
done < $readpipe
That works, unless the perl program is doing something like
print "\n";
print "Choose action:\n";
print "[A]: Action A [B]: Action B\n";
print " [C]: cancel\n";
print " ? ";
print "[C] ";
local $SIG{INT} = 'IGNORE';
$userin = <STDIN> || ''; chomp $userin;
print "\n";
Then the bash script only "sees"
Choose action:
[A]: Action A [B]: Action B
[C]: cancel
but not the
? [C]
This is not the most problematic case, but the one that is easiest to describe.
Is there a way to make sure the ? [C] is printed as well (I played around with cat <$readpipe & but that did not really work)?
Or is there a better approach all together (given the limitation that I cannot modify the perl program nor can I use expect)?
Use read -N1.
Let's try with the following example: interact with a program that sends a prompt (not ended by a newline); our system must send some command and receive the echo of the command sent. That is, the total output of the child process is:
$ cat example
prompt> command1
prompt> command2
The script could be:
#!/bin/bash
#
cat example | while IFS=$'\0' read -N1 c; do
    case "$c" in
        ">")
            echo "received prompt: $buf"
            # here, send some command
            buf=""
            ;;
        *)
            if [ "$c" == $'\n' ]; then
                echo "received command: $buf"
                # here, process the command echo
                buf=""
            else
                buf="$buf$c"
            fi
            ;;
    esac
done
which produces the following output:
received prompt: prompt
received command: command1
received prompt: prompt
received command: command2
This second example is closer to the original question:
$ cat example
Choose action:
[A]: Action A [B]: Action B
[C]: cancel
? [C]
The script is now:
#!/bin/bash
#
while IFS=$'\0' read -N1 c; do
    case "$c" in
        '?')
            echo "*** received prompt after: $buf$c ***"
            echo '*** send C as option ***'
            buf=""
            ;;
        *)
            buf="$buf$c"
            ;;
    esac
done < example
echo "*** final buffer is: $buf ***"
and the result is:
*** received prompt after:
Choose action:[A]: Action A [B]: Action B
[C]: cancel
? ***
*** send C as option ***
*** final buffer is: [C]
***

use of ssh variable in the shell script

I want to use variables from an ssh session in my shell script.
Suppose I have some variable a whose value I got inside the ssh session, and now I want to use that variable outside the ssh session, in the shell itself. How can I do this?
ssh my_pc2 <<EOF
<.. do some operations ..>
a=$(ls -lrt | wc -l)
echo \$a
EOF
echo $a
In the above example, the first echo inside the ssh session prints 10, but the second echo $a prints nothing.
I would refine the last answer by defining some special syntax for passing the required settings back, e.g. "#SET var=value"
We could put the commands (that we want to run within the ssh session) in a cmdFile file like this:
a=`id`
b=`pwd`
echo "#SET a='$a'"
echo "#SET b='$b'"
And the main script would look like this:
#!/bin/bash
# SSH, run the remote commands, and filter anything they passed back to us
ssh user@host <cmdFile | grep "^#SET " | sed 's/#SET //' >vars.$$
# Source the variable settings that were passed back
. vars.$$
rm -f vars.$$
# Now we have the variables set
echo "a = $a"
echo "b = $b"
If you're doing this for lots of variables, you can add a function to cmdFile, to simplify/encapsulate your special syntax for passing data back:
passvar()
{
    var=$1
    val=$2
    val=${val:-${!var}}
    echo "#SET ${var}='${val}'"
}
a=`id`
passvar a
b=`pwd`
passvar b
You might need to play with quotes when the values include whitespace.
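One way to make the quoting robust is printf's %q format, which bash supports natively; a sketch of the same passvar idea, hypothetically hardened (assuming bash on both ends):
passvar()
{
    var=$1
    val=${2-${!var}}
    # %q quotes the value so it survives sourcing, whitespace and all
    printf '#SET %s=%q\n' "$var" "$val"
}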
A script like this could be used to store all the output from SSH into a variable:
#!/bin/bash
VAR=$(ssh user@host << _EOF
id
_EOF
)
echo "VAR=$VAR"
it produces the output:
VAR=uid=1000(user) gid=1000(user) groups=1000(user),4(adm),10(wheel)
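One caveat with this pattern: anything unescaped inside an unquoted heredoc ($a, $(...), backticks) is expanded locally before ssh ever runs; escape it with a backslash, or quote the delimiter (<< '_EOF'), if you want it evaluated on the remote side.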

bash/shell: how to access the memory/buffer to use what I have echoed before

I have 2 scripts.
One of the lines in the first script is:
"...
./second_script >> $outputfile
..."
The second script has a lot of calculations and variables. Now, at some point, I need to use everything I have echoed to the output file:
".....
echo $var1
echo $var2
.....
echo $var3
echo What I have echoed | script3
..."
"What I have echoed" is $var1, $var2, $var3.
How can I do it?
It's a big script, so I cannot do something like this for each line:
echo $var
echo $var >> tmp
I also cannot do this, because I have around 2000 $var (each $var isn't really a variable; it's more like a "grep......"):
echo $var1 $var2 | script3
I need somehow get an access to what in memory/buffer to what I echoed.
Try something like this:
{
    echo $var1
    echo $var2
    echo $var3
    ...
} | script3
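The braces group all of the echo commands into a single compound command, so their combined standard output flows through one pipe into script3, with no temporary file needed.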
Add this to the beginning of your script:
exec > >( tee tmp )
Everything you write to standard output will also be added to the file "tmp".
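A minimal sketch of the effect (tmp is just an illustrative filename):
#!/bin/bash
exec > >( tee tmp )   # from here on, stdout is duplicated into tmp
echo "first line"     # shows on the terminal AND lands in tmp
echo "second line"
Note that tee runs asynchronously in a process substitution, so the contents of tmp can lag slightly behind the script itself.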
Using /bin/sh, you'll need to simulate the process. No guarantees that this is correct:
# Create a named pipe to act as a buffer, and set up a background job
# that continuously duplicates whatever is written to it to both
# a regular file and standard output
mkfifo buffer
( tail -f buffer | tee tmp ) &
# Now, redirect standard output to the named pipe
exec > buffer
