Say I'm running some command foo, which prompts the user for various things. I want to provide values for the first few prompts, but enter the rest manually (i.e. on stdin).
How can I do this? I've tried
echo -e "foo\nbar\nbaz" | foo
This accepts all the inputs, but then gets an EOF from the input stream. I've also tried
foo <(echo -e "foo\nbar\nbaz" & cat /dev/stdin)
which didn't work either.
The main problem here is most likely that foo is not designed to take a filename as an argument. (Keep in mind that <(...) doesn't pass ...'s output on standard input; rather, it expands to a special filename that can be read to obtain ...'s output.) To fix this, you can add another <:
foo < <(echo -e "foo\nbar\nbaz" ; cat /dev/stdin)
or use a pipeline:
{ echo -e "foo\nbar\nbaz" ; cat /dev/stdin ; } | foo
(Note that I changed the & to ;, by the way. The former would work, but is a bit strange, given that you intend for echo to handle the first several inputs.)
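A minimal sketch of the pipeline version, with a hypothetical consume_three function standing in for foo (the real command isn't shown in the question), and the "manual" part simulated by a here-string:

```shell
#!/bin/bash
# Hypothetical stand-in for "foo": reads three lines of input from stdin.
consume_three() {
    read -r a; read -r b; read -r c
    echo "got: $a $b $c"
}

# Pre-answer the first two prompts with echo; pass the rest of our own
# stdin through. The "manual" third answer is simulated by a here-string.
{ echo -e "first\nsecond"; cat /dev/stdin; } <<< "third" | consume_three
# prints: got: first second third
```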
Ask the user for what you want up front, then relay the answers to your command:
echo "Question 1: "; read ans1;
echo "Question 2: "; read ans2;
./foo bar bar "$ans1" baz "$ans2"
Maybe something like that? It's simple and efficient :)
Why does cat exit a shell script, but only when it's fed by a pipe?
Case in point, take this shell script called "foobar.sh":
#! /bin/sh
echo $#
echo $1
cat $1
sed -e 's|foo|bar|g' $1
And a text file called "foo.txt" which contains only one line:
foo
Now if I type ./foobar.sh foo.txt on the command line, then I'll get this expected output:
1
foo.txt
foo
bar
However if I type cat foo.txt | ./foobar.sh then surprisingly I only get this output:
0
foo
I don't understand. If the number of arguments reported by $# is zero, then how can cat $1 still return foo? And, that being the case, why doesn't sed -e 's|foo|bar|g' $1 return anything since clearly $1 is foo?
This seems an awful lot like a bug, but I'm assuming it's magic instead. Please explain!
UPDATE
Based on the given answer, the following script gives the expected output, assuming a one-line foo.txt:
#! /bin/sh
if [ $# -gt 0 ]
then
yay=$(cat "$1")
else
read yay
fi
echo $yay | cat
echo $yay | sed -e 's|foo|bar|g'
No, $1 is not "foo". $1 is empty, i.e. undefined/nothing.
Unlike in a programming language, variables in the shell are quite dumbly and literally replaced, and the resulting command text is executed (well, sorta kinda). In this case, "cat $1" becomes just "cat ", which will take input from stdin. That's terribly convenient here, since you've kindly provided "foo" on stdin via your pipe!
See what's happening?
sed likewise will read from stdin, but it is already at end of stream, so it exits.
When you don't give an argument to cat, it reads from stdin. When $1 isn't given, cat $1 is the same as a plain cat, which reads the text you piped in (cat foo.txt).
Then the sed command runs and, like cat, reads from stdin because it has no filename argument. But cat has already consumed all of stdin; there's nothing left to read, so sed exits without printing anything.
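The effect is easy to reproduce with a small stand-in (a sketch, not the original foobar.sh):

```shell
#!/bin/sh
# With no argument, $1 expands to nothing, so "cat $1" becomes a bare
# "cat" that reads whatever is on stdin.
demo() {
    echo "args: $#"
    cat $1    # deliberately unquoted, like the original script
}

echo foo | demo
# prints: args: 0
#         foo
```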
I isolated a problem in my script to this small example. That's what I get:
$ cmd="test \"foo bar baz\""
$ for i in $cmd; do echo $i; done
test
"foo
bar
baz"
And that's what I expected:
$ cmd="test \"foo bar baz\""
$ for i in $cmd; do echo $i; done
test
"foo bar baz"
How can I change my code to get the expected result?
UPDATE Maybe my first example was not good enough. I looked at the answer of Rob Davis, but I couldn't apply the solution to my script. I tried to simplify my script to describe my problem better. This is the script:
#!/bin/bash
function foo {
echo $1
echo $2
}
bar="b c"
baz="a \"$bar\""
foo $baz
This is the expected output compared to the output of the script:
expected    script
a           a
"b c"       "b
First, you're asking the double-quotes around foo bar baz to do two things simultaneously, and they can't. You want them to group the three words together, and you want them to appear as literals. So you'll need to introduce another pair.
Second, parsing happens when you set cmd, and cmd is set to a single string. You want to work with it as individual elements, so one solution is to use an array variable. sh has an array called @, but since you're using bash you can just set your cmd variable to be an array.
Also, to preserve spacing within an element, it's a good idea to put double quotes around $i. You'd see why if you put more than one space between foo and bar.
$ cmd=(test "\"foo bar baz\"")
$ for i in "${cmd[@]}"; do echo "$i"; done
test
"foo bar baz"
See this question for more details on the special "$@" or "${cmd[@]}" parsing feature of sh and bash, respectively.
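A quick sketch of the difference the quotes make:

```shell
#!/bin/bash
# Two positional parameters, one of which contains a space.
set -- "foo bar" baz

unquoted=$(for i in $@;   do echo "[$i]"; done)  # re-split into three words
quoted=$(for i in "$@";   do echo "[$i]"; done)  # original elements preserved

echo "$unquoted"   # [foo] [bar] [baz]
echo "$quoted"     # [foo bar] [baz]
```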
Update
Applying this idea to the update in your question, try setting baz and calling foo like this:
$ baz=(a "\"$bar\"")
$ foo "${baz[@]}"
Why quote it in the first place?
for i in test "foo bar baz"; do echo $i; done
I'm currently using the following to capture everything that goes to the terminal and throw it into a log file
exec 4<&1 5<&2 1>&2>&>(tee -a $LOG_FILE)
however, I don't want color escape codes/clutter going into the log file, so I have something like this that sorta works:
exec 4<&1 5<&2 1>&2>&>(
while read -u 0; do
#to terminal
echo "$REPLY"
#to log file (color removed)
echo "$REPLY" | sed -r 's/\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|K]//g' >> $LOG_FILE
done
unset REPLY #tidy
)
except read waits for a newline, which isn't ideal for some portions of the script (e.g. echo -n "..." or printf without \n).
Follow-up to Jonathan Leffler's answer:
Given the example script test.sh:
#!/bin/bash
LOG_FILE="./test.log"
echo -n >$LOG_FILE
exec 4<&1 5<&2 1>&2>&>(tee -a >(sed -r 's/\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|K]//g' > $LOG_FILE))
##### ##### #####
# Main
echo "starting execution"
printf "\n\n"
echo "color test:"
echo -e "\033[0;31mhello \033[0;32mworld\033[0m!"
printf "\n\n"
echo -e "\033[0;36mEnvironment:\033[0m\n foo: cat\n bar: dog\n your wife: hot\n fix: A/C"
echo -n "Before we get started. Is the above information correct? "
read YES
echo -e "\n[READ] $YES" >> $LOG_FILE
YES=$(echo "$YES" | sed 's/^\s*//;s/\s*$//')
test ! "$(echo "$YES" | grep -iE '^y(es)?$')" && echo -e "\nExiting... :(" && exit
printf "\n\n"
#...some hundreds of lines of code later...
echo "Done!"
##### ##### #####
# End
exec 1<&4 4>&- 2<&5 5>&-
echo "Log File: $LOG_FILE"
The output to the terminal is as expected and there is no color escape codes/clutter in the log file as desired. However upon examining test.log, I do not see the [READ] ... (see line 21 of test.sh).
The log file [of my actual bash script] contains the line Log File: ... at the end of it even after closing the 4 and 5 fds. I was able to resolve the issue by putting a sleep 1 before the second exec - I assume there's a race condition or fd shenanigans to blame for it. Unfortunately for you guys, I am not able to reproduce this issue with test.sh but I'd be interested in any speculation anyone may have.
Consider using the pee program discussed in Is it possible to distribute stdin over parallel processes. It would allow you to send the log data through your sed script, while continuing to send the colours to the actual output.
One major advantage of this is that it would remove the 'execute sed once per line of log output'; that is really diabolical for performance (in terms of number of processes executed, if nothing else).
I know it's not a perfect solution, but cat -v will convert non-visible characters like \x1B into a visible form like ^[[1;34m. The output will be messy, but at least it will be ASCII text.
I used to do stuff like this by setting TERM=dumb before running my command. That pretty much removes any control characters except for tab, CR, and LF. I have no idea if this works for your situation, but it's worth a try. The problem is that you won't see color encodings on your terminal either, since it's a dumb terminal.
You can also try either vis or cat (especially the -v parameter) and see if these do something for you. You'd simply put them in your pipeline like this:
exec 4<&1 5<&2 1>&2>&>(tee -a | cat -v >> $LOG_FILE)
By the way, almost all terminal programs have an option to capture the input, and most clean it up for you. What platform are you on, and what type of terminal program are you using?
You could attempt to use the -n option of read. It reads n characters instead of waiting for a newline. You could set it to one. This increases the number of iterations the loop runs, but it does not wait for newlines.
From the man:
-n NCHARS read returns after reading NCHARS characters rather than waiting for a complete line of input.
Note: I have not tested this
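A rough sketch of the idea (hedged: like the answer above, this is illustrative, not a drop-in fix for the exec line in the question; it just shows that read -n 1 consumes input without waiting for complete lines):

```shell
#!/bin/bash
# Copy stdin to stdout one character at a time using read -n 1.
copy_chars() {
    local out="" ch
    while IFS= read -r -n 1 ch; do
        # read -n 1 returns an empty ch when it consumes the newline delimiter
        if [ -z "$ch" ]; then out+=$'\n'; else out+="$ch"; fi
    done
    printf '%s' "$out"
}

printf 'partial line, no newline: abc' | copy_chars
```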
You can use ANSIFilter to strip or transform console output with ANSI escape sequences.
See http://www.andre-simon.de/zip/download.html#ansifilter
Might not screen -L or the script command be viable options instead of this exec loop?
I am hoping to do something like:
echo 1 2 | read foo bar
To set two new variables, foo and bar, to the values 1 and 2 respectively. (In reality, "echo 1 2" will be an awk / cut invocation for an external data source.)
I am finding that foo and bar do not exist after this line which makes me wonder if it is a scoping issue? I have only used read in while loops and in those cases the loop body was able to access the variables.
Pipes execute in a sub-shell. As such, the variables foo and bar are created, 1 and 2 are stored in them, then the subshell exits and you return to the parent shell in which these variables do not exist.
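A minimal demonstration:

```shell
#!/bin/bash
foo="" bar=""
echo 1 2 | read foo bar        # read runs in a subshell...
echo "foo='$foo' bar='$bar'"   # ...so both are still empty here
# prints: foo='' bar=''
```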
One way to read into variables as you appear to want is with a "here string"
read foo bar <<<"1 2"
Which will do what you expected the pipe version to do.
This is non-portable, however, and some shells will not support it. You can use the "here document" form instead, which is broadly supported.
$ read foo bar <<EOF
> 1 2
> EOF
Note that EOF here can be any unique string. A here document stores all lines up to (but not including) a line consisting solely of EOF, or whatever marker you chose. In this case the behavior is identical to the previous example (but it is harder to copy and paste and longer to type).
What's going on here?
Both the "here document" and the "here string" are ways to represent text passed to standard input without having to enter it interactively. It is functionally equivalent to just saying read foo bar, hitting enter, then manually writing 1 2 and hitting enter again.
Instead of pipe, you can do something like this -
[jaypal:~/Temp] exec 3< <(echo "Jaypal Singh")
[jaypal:~/Temp] while read word1 word2 ; do echo "$word1 $word2"; done <&3
Jaypal Singh
[jaypal:~/Temp] exec 3< <(echo "Jaypal Singh")
[jaypal:~/Temp] while read word1 word2 ; do echo "$word1"; done <&3
Jaypal
Another easy solution - for some cases it might be useful:
echo 1 2 | { read foo bar; echo $foo $bar; }
Of course, as in the original question, instead of the echo commands there may be more complex processing.
Thank you, bash, for running a subshell when piping; now we cannot read multiple variables at the same time anymore!
grep -w regexp file | read var1 var2 var3
There is no drop-in replacement for this ksh functionality. The workaround
read <<< "$(command)"
is incompatible with the Bourne and Korn shells.
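For completeness: in bash 4.2 and later, the lastpipe option restores the ksh behaviour by running the last command of a pipeline in the current shell. It only takes effect when job control is off, i.e. in scripts, not interactive shells:

```shell
#!/bin/bash
shopt -s lastpipe               # bash >= 4.2; no effect in interactive shells
echo one two three | read var1 var2 var3
echo "$var1 $var2 $var3"        # prints: one two three
```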
I can redirect the output and then cat the file and grep/awk the variable, but I would like to use this file for multiple variables.
So If it was one variable say STATUS then i could do some thing like
echo "STATUS $STATUS" >> variable.file
#later perhaps in a remote shell where varible.file was copied
NEW_VAR=`cat variable.file | awk '{print $2}'`
I guess some inline editing with sed would help. The smaller the code the better.
One common way of storing variables in a file is to just store NAME=value lines in the file, and then just source that in to the shell you want to pick up the variables.
echo 'STATUS="'"$STATUS"'"' >> variable.file
# later
. variable.file
In Bash, you can also use source instead of ., though this may not be portable to other shells. Note carefully the exact sequence of quotes necessary to get the correct double quotes printed out in the file.
If you want to put multiple variables at once into the file, you could do the following. Apologies for the quoting contortions that this takes to do properly and portably; if you restrict yourself to Bash, you can use $"" to make the quoting a little simpler:
for var in STATUS FOO BAR
do
echo "$var="'"'"$(eval echo '$'"$var")"'"'
done >> variable.file
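If you can assume bash throughout, a sketch using printf '%q' and indirect expansion (${!var}) avoids the quoting contortions; note this is an alternative to the loop above, not the same code:

```shell
#!/bin/bash
# Sample values containing quotes and spaces.
STATUS='"quoted" and spaced'
FOO="it's fine"

for var in STATUS FOO; do
    printf '%s=%q\n' "$var" "${!var}"   # %q emits re-sourceable shell quoting
done > variable.file

# round trip: clear the variables, then source them back
unset STATUS FOO
. ./variable.file
echo "$STATUS"   # prints: "quoted" and spaced
```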
The declare builtin is useful here
for var in STATUS FOO BAR; do
declare -p $var | cut -d ' ' -f 3- >> filename
done
As Brian says, later you can source filename
declare is great because it handles quoting for you:
$ FOO='"I'"'"'m here," she said.'
$ declare -p FOO
declare -- FOO="\"I'm here,\" she said."
$ declare -p FOO | cut -d " " -f 3-
FOO="\"I'm here,\" she said."