I'm writing a shell script that should be somewhat secure, i.e., does not pass secure data through parameters of commands and preferably does not use temporary files. How can I pass a variable to the standard input of a command?
Or, if it's not possible, how can I correctly use temporary files for such a task?
Passing a value to standard input in Bash is as simple as:
your-command <<< "$your_variable"
Always make sure you put quotes around variable expressions!
Be aware that this will probably only work in bash; it will not work in plain sh.
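For example, a minimal sketch, where my_cmd stands in for whatever command should read the secret on its standard input:
secret='s3cr3t value'
my_cmd <<< "$secret"                # bash/zsh/ksh93 here string
printf '%s\n' "$secret" | my_cmd    # portable fallback for plain sh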
Simple, but error-prone: using echo
Something as simple as this will do the trick:
echo "$blah" | my_cmd
Do note that this may not work correctly if $blah is -n, -e, -E, etc., or if it contains backslashes (bash's echo preserves backslashes literally in the absence of -e by default, but will treat them as escape sequences and replace them with the corresponding characters, even without -e, if the optional XSI extensions are enabled).
More sophisticated approach: using printf
printf '%s\n' "$blah" | my_cmd
This does not have the disadvantages listed above: all possible C strings (strings not containing NULs) are printed unchanged.
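To see the difference, here is a quick sketch with cat standing in for my_cmd; a value of -n is swallowed by bash's echo but passed through untouched by printf:
blah='-n'
echo "$blah" | cat            # prints nothing: echo consumes -n as an option
printf '%s\n' "$blah" | cat   # prints -n, as intended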
(cat <<END
$passwd
END
) | command
The cat is not really needed, but it helps to structure the code better and allows you to use more commands in parentheses as input to your command.
Note that the echo "$var" | command operations mean that standard input is limited to the line(s) echoed. If you also want the terminal to be connected, then you'll need to be fancier:
{ echo "$var"; cat - ; } | command
( echo "$var"; cat - ) | command
This means that the first line(s) will be the contents of $var but the rest will come from cat reading its standard input. If the command does not do anything too fancy (try to turn on command line editing, or run like vim does) then it will be fine. Otherwise, you need to get really fancy - I think expect or one of its derivatives is likely to be appropriate.
The two notations are practically identical - but the second semicolon (the one after cat -) is necessary with the braces, whereas it is not with the parentheses.
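As a self-contained sketch of the brace form, with printf standing in for whatever would otherwise arrive on the terminal:
var='header line'
printf '%s\n' one two | { echo "$var"; cat - ; } | wc -l   # prints 3: the echoed line plus the two piped lines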
This robust and portable way has already appeared in comments. It should be a standalone answer.
printf '%s' "$var" | my_cmd
or
printf '%s\n' "$var" | my_cmd
Notes:
It's better than echo; the reasons are explained in Why is printf better than echo?
printf "$var" is wrong. The first argument is the format string, in which sequences like %s or \n are interpreted. To pass the variable through unmodified, it must not be interpreted as the format.
Usually variables don't contain trailing newlines. The former command (with %s) passes the variable exactly as it is. However, tools that work with text may ignore or complain about an incomplete last line (see Why should text files end with a newline?). So you may want the latter command (with %s\n), which appends a newline character to the content of the variable.
Non-obvious facts:
Here string in Bash (<<<"$var" my_cmd) does append a newline.
Any method that appends a newline results in non-empty stdin of my_cmd, even if the variable is empty or undefined.
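A quick way to check both facts is to count the bytes that actually reach the command:
var=''
wc -c <<< "$var"              # 1: the here string appended a newline even though $var is empty
printf '%s' "$var" | wc -c    # 0: genuinely empty standard input
printf '%s\n' "$var" | wc -c  # 1: %s\n appends the newline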
I liked Martin's answer, but it has some problems depending on what is in the variable. This
your-command <<< """$your_variable"""
is better if your variable contains " or !.
As per Martin's answer, there is a Bash feature called Here Strings (which itself is a variant of the more widely supported Here Documents feature):
3.6.7 Here Strings
A variant of here documents, the format is:
<<< word
The word is expanded and supplied to the command on its standard
input.
Note that Here Strings would appear to be Bash-only, so, for improved portability, you'd probably be better off with the original Here Documents feature, as per PoltoS's answer:
( cat <<EOF
$variable
EOF
) | cmd
Or, a simpler variant of the above:
(cmd <<EOF
$variable
EOF
)
You can omit ( and ), unless you want to have this redirected further into other commands.
Try this:
echo "$variable" | command
If you came here from a duplicate, you are probably a beginner who tried to do something like
"$variable" >file
or
"$variable" | wc -l
where you obviously meant something like
echo "$variable" >file
echo "$variable" | wc -l
(Real beginners also forget the quotes; usually use quotes unless you have a specific reason to omit them, at least until you understand quoting.)
Bash seems to behave unpredictably with regard to temporary, per-command variable assignment, specifically with IFS.
I often assign IFS to a temporary value in conjunction with the read command. I would like to use the same mechanic to tailor output, but currently resort to a function or subshell to contain the variable assignment.
$ while IFS=, read -a A; do
> echo "${A[@]:1:2}" # control (undesirable)
> done <<< alpha,bravo,charlie
bravo charlie
$ while IFS=, read -a A; do
> IFS=, echo "${A[*]:1:2}" # desired solution (failure)
> done <<< alpha,bravo,charlie
bravo charlie
$ perlJoin(){ local IFS="$1"; shift; echo "$*"; }
$ while IFS=, read -a A; do
> perlJoin , "${A[@]:1:2}" # function with local variable (success)
> done <<< alpha,bravo,charlie
bravo,charlie
$ while IFS=, read -a A; do
> (IFS=,; echo "${A[*]:1:2}") # assignment within subshell (success)
> done <<< alpha,bravo,charlie
bravo,charlie
If the second assignment in the following block does not affect the environment of the command, and it does not generate an error, then what is it for?
$ foo=bar
$ foo=qux echo $foo
bar
This is a common bash gotcha -- and https://www.shellcheck.net/ catches it:
foo=qux echo $foo
^-- SC2097: This assignment is only seen by the forked process.
^-- SC2098: This expansion will not see the mentioned assignment.
The issue is that the first foo=bar is setting a bash variable, not an environment variable. Then, the inline foo=qux syntax is used to set an environment variable for echo -- however echo never actually looks at that variable. Instead $foo gets recognized as a bash variable and replaced with bar.
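To actually see the environment variable that the inline assignment creates, defer the expansion to a child process; a quick sketch (printenv may not be installed everywhere, so the sh -c form is the more portable check):
foo=bar
foo=qux sh -c 'echo "$foo"'   # qux: the child shell expands $foo and sees the assignment
foo=qux printenv foo          # qux: echo never looks at its environment, but printenv does
echo "$foo"                   # bar: the current shell's variable is unchanged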
So back to your main question, you were basically there with your final attempt using the subshell -- except that you don't actually need the subshell:
while IFS=, read -a A; do
  IFS=,; echo "${A[*]:1:2}"
done <<< alpha,bravo,charlie
outputs:
bravo,charlie
For completeness, here's a final example that reads in multiple lines and uses a different output separator to demonstrate that the different IFS assignments aren't stomping on each other:
while IFS=, read -a A; do
  IFS=:; echo "${A[*]:1:2}"
done < <(echo -e 'alpha,bravo,charlie\nfoo,bar,baz')
outputs:
bravo:charlie
bar:baz
The answer is a bit simpler than the other answers are presenting:
$ foo=bar
$ foo=qux echo $foo
bar
We see "bar" because the shell expands $foo before setting foo=qux
Simple Command Expansion -- there's a lot to get through here, so bear with me...
When a simple command is executed, the shell performs the following expansions, assignments, and redirections, from left to right.
The words that the parser has marked as variable assignments (those preceding the command name) and redirections are saved for later processing.
The words that are not variable assignments or redirections are expanded (see Shell Expansions). If any words remain after expansion, the first word is taken to be the name of the command and the remaining words are the arguments.
Redirections are performed as described above (see Redirections).
The text after the ‘=’ in each variable assignment undergoes tilde expansion, parameter expansion, command substitution, arithmetic expansion, and quote removal before being assigned to the variable.
If no command name results, the variable assignments affect the current shell environment. Otherwise, the variables are added to the environment of the executed command and do not affect the current shell environment. If any of the assignments attempts to assign a value to a readonly variable, an error occurs, and the command exits with a non-zero status.
If no command name results, redirections are performed, but do not affect the current shell environment. A redirection error causes the command to exit with a non-zero status.
If there is a command name left after expansion, execution proceeds as described below. Otherwise, the command exits. If one of the expansions contained a command substitution, the exit status of the command is the exit status of the last command substitution performed. If there were no command substitutions, the command exits with a status of zero.
So:
the shell sees foo=qux and saves that for later
the shell sees $foo and expands it to "bar"
then we now have: foo=qux echo bar
Once you really understand the order that bash does things, a lot of the mystery goes away.
Short answer: the effects of changing IFS are complex and hard to understand, and best avoided except for a few well-defined idioms (IFS=, read ... is one of the idioms I consider ok).
Long answer: There are a couple of things you need to keep in mind in order to understand the results you're seeing from changes to IFS:
Using IFS=something as a prefix to a command changes IFS only for that one command's execution. In particular, it does not affect how the shell parses the arguments to be passed to that command; that's controlled by the shell's value of IFS, not the one used for the command's execution.
Some commands pay attention to the value of IFS they're executed with (e.g. read), but others don't (e.g. echo).
Given the above, IFS=, read -a A does what you'd expect, it splits its input on ",":
$ IFS=, read -a A <<<"alpha,bravo,charlie"
$ declare -p A
declare -a A='([0]="alpha" [1]="bravo" [2]="charlie")'
But echo pays no attention; it always puts spaces between the arguments it's passed, so using IFS=something as a prefix to it has no effect at all:
$ echo alpha bravo
alpha bravo
$ IFS=, echo alpha bravo
alpha bravo
So when you use IFS=, echo "${A[*]:1:2}", it's equivalent to just echo "${A[*]:1:2}", and since the shell's definition of IFS starts with space, it puts the elements of A together with spaces between them. So it's equivalent to running IFS=, echo "bravo charlie".
On the other hand, IFS=,; echo "${A[*]:1:2}" changes the shell's definition of IFS, so it does affect how the shell puts the elements together, so it comes out equivalent to echo "bravo,charlie". Unfortunately, it also affects everything else from that point on, so you either have to isolate it to a subshell or set it back to normal afterward.
Just for completeness, here are a couple of other versions that don't work:
$ IFS=,; echo "${A[@]:1:2}"
bravo charlie
In this case, the [@] tells the shell to treat each element of the array as a separate argument, so it's left to echo to merge them, and echo ignores IFS and always uses spaces.
So how about this:
$ IFS=,; echo ${A[*]:1:2}
bravo charlie
In this case, the [*] tells the shell to mash all elements together with the first character of IFS between them, giving bravo,charlie. But it's not in double-quotes, so the shell immediately re-splits it on ",", splitting it back into separate arguments again (and then echo joins them with spaces as always).
If you want to change the shell's definition of IFS without having to isolate it to a subshell, there are a few options to change it and set it back afterward. In bash, you can set it back to normal like this:
$ IFS=,
$ while read -a A; do # Note: IFS change not needed here; it's already changed
> echo "${A[*]:1:2}"
> done <<<alpha,bravo,charlie
bravo,charlie
$ IFS=$' \t\n'
But the $'...' syntax isn't available in all shells; if you need portability it's best to use literal characters:
IFS='
' # You can't see it, but there's a literal space and tab after the first '
Some people prefer to use unset IFS, which just forces the shell to its default behavior, which is pretty much the same as with IFS defined in the normal way.
...but if IFS had been changed in some larger context, and you don't want to mess that up, you need to save it and then set it back. If it's been changed normally, this'll work:
saveIFS=$IFS
...
IFS=$saveIFS
...but if someone thought it was a good idea to use unset IFS, this will define it as blank, giving weird results. So you can use this approach or the unset approach, but not both. If you want to make this robust against the unset conflict, you can use something like this in bash:
saveIFS=${IFS:-$' \t\n'}
...or for portability, leave off the $' ' and use literal space+tab+newline:
saveIFS=${IFS:-
} # Again, there's an invisible space and tab at the end of the first line
All in all, it's a lot of mess full of traps for the unwary. I recommend avoiding it whenever possible.
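If all you need is to join array elements with a separator, the function idiom from the question (a local IFS inside a function) is probably the least error-prone option; a sketch, with join_by as an illustrative name:
join_by() { local IFS=$1; shift; printf '%s\n' "$*"; }
A=(alpha bravo charlie)
join_by , "${A[@]:1:2}"       # bravo,charlie -- and the caller's IFS is untouched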
I "inherited" some code in a project, and in one of the Bash scripts, they use echo on a backticks-row, like:
#!/bin/bash
echo `/path/command argument`
What's the difference between that and just running the command itself?
#!/bin/bash
/path/command argument
Both send the command's output to the script's stdout.
So what's the difference? Why use echo?
To combine it with a > seems even worse:
#!/bin/bash
echo `/path/command argument > /var/log/exemple.log`
It doesn't look particularly useful in its current form but one side-effect of using echo with (unquoted) backticks is that whitespace between words is lost. This is because of word splitting - every word in the output of the command is treated as a separate argument to echo, which just outputs them separated by a single space. For example:
$ echo "a   b  c"
a   b  c
$ echo `echo "a   b  c"`
a b c
Note that this applies to all types of whitespace such as tab characters and newlines.
I'm not sure why you'd want to do this deliberately! Personally, I'd be tempted to just run the command normally.
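For what it's worth, if you did need the echo, double-quoting the command substitution would avoid the word splitting and keep the whitespace (sketched here with a nested echo standing in for /path/command):
$ echo "`echo 'a   b  c'`"
a   b  c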
Whenever I'm trying to insert output with a line break into a variable, for instance like this:
Hello
World
Once I do an echo command on the variable, I get this:
"Hello World"
Why does this happen, and how can I keep the line break?
In bash, the easiest way to express a literal line break is with $'' syntax:
var=$'Hello\nWorld'
echo "$var"
Note that the quotes around $var are mandatory during expansion if you want to preserve linebreaks or other whitespace! If you only run
echo $var
...then even though a linebreak is stored in your variable, you will see
Hello World
on a single line, as opposed to
Hello
World
on two lines.
This happens because when you don't use quotes, the shell splits the expanded words on whitespace -- including newlines -- and passes each item created by that split as a separate argument. Thus,
echo "$var"
will pass a single string with the entire expansion of $var, whereas
echo $var
will run the equivalent of:
echo "Hello" "World"
...passing each word in the text as a separate argument to echo (whereafter the echo command re-joins its arguments by spaces, resulting in the behavior described).
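To see that splitting concretely, you can wrap each argument in brackets (printf repeats its format once per argument):
var=$'Hello\nWorld'
printf '[%s]' $var; echo      # [Hello][World] -- two separate arguments
printf '[%s]' "$var"; echo    # [Hello<newline>World] -- one argument, newline preserved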
Line breaks can be embedded in a string in any POSIX compatible shell:
$ str="Hello
> World"
$ echo "$str"
Hello
World
If you hit enter before closing the quotation marks, the shell knows you have not yet finished the quoted material, and prints the secondary prompt (>) and waits for you to finish the command.
How do I echo one or more tab characters using a bash script?
When I run this code
res=' 'x # res = "\t\tx"
echo '['$res']' # expect [\t\tx]
I get this
[ x] # that is [<space>x]
echo -e ' \t '
will echo 'space tab space newline' (-e means 'enable interpretation of backslash escapes'):
$ echo -e ' \t ' | hexdump -C
00000000 20 09 20 0a | . .|
Use printf, not echo.
There are multiple different versions of the echo command. There's /bin/echo (which may or may not be the GNU Coreutils version, depending on the system), and the echo command is built into most shells. Different versions have different ways (or no way) to specify or disable escapes for control characters.
printf, on the other hand, has much less variation. It can exist as a command, typically /bin/printf, and it's built into some shells (bash and zsh have it, tcsh and ksh don't), but the various versions are much more similar to each other than the different versions of echo are. And you don't have to remember command-line options (with a few exceptions; GNU Coreutils printf accepts --version and --help, and the built-in bash printf accepts -v var to store the output in a variable).
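As a small illustration of that bash-only -v option (the second line is plain printf and works anywhere):
printf -v row '%s\t%s\t%s' alpha bravo charlie   # store the formatted result in $row instead of printing it
printf '%s\n' "$row"                             # alpha<TAB>bravo<TAB>charlie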
For your example:
res=' 'x # res = "\t\tx"
printf '%s\n' "[$res]"
And now it's time for me to admit that echo will work just as well for the example you're asking about; you just need to put double quotes around the argument:
echo "[$res]"
as kmkaplan wrote (two and a half years ago, I just noticed!). The problem with your original commands:
res=' 'x # res = "\t\tx"
echo '['$res']' # expect [\t\tx]
isn't with echo; it's that the shell replaced the tab with a space before echo ever saw it.
echo is fine for simple output, like echo hello world, but you should use printf whenever you want to do something more complex. You can get echo to work, but the resulting code is likely to fail when you run it with a different echo implementation or a different shell.
You can also try:
echo Hello$'\t'world.
Put your string between double quotes:
echo "[$res]"
You need to use the -e flag for echo; then you can do:
echo -e "\t\t x"
From the bash man page:
Words of the form $'string' are treated specially. The word expands to string, with backslash-escaped characters replaced as specified by the ANSI C standard.
So you can do this:
echo $'hello\tworld'
Use the verbatim keystroke, ^V (CTRL+V, C-v, whatever).
When you type ^V into the terminal (or in most Unix editors), the following character is taken verbatim. You can use this to type a literal tab character inside a string you are echoing.
Something like the following works:
echo "^V<tab>" # CTRL+V, TAB
Bash docs (q.v., "quoted-insert")
quoted-insert (C-q, C-v)
Add the next character that you type to the line verbatim. This is how to insert key sequences like C-q, for example.
side note: according to this, ALT+TAB should do the same thing, but we've all bound that sequence to window switching so we can't use it
tab-insert (M-TAB)
Insert a tab character.
--
Note: you can use this strategy with all sorts of unusual characters. Like a carriage return:
echo "^V^M" # CTRL+V, CTRL+M
This is because carriage return is ASCII 13, and M is the 13th letter of the alphabet, so when you type ^M, you get the 13th ASCII character. You can see it in action by typing ls^M at an empty prompt: it inserts a carriage return, causing the prompt to act just as if you had hit Return. Where these characters would normally be interpreted, the verbatim keystroke gives you the literal character instead.
Using echo to print values of variables is a common Bash pitfall.
Reference link:
http://mywiki.wooledge.org/BashPitfalls#echo_.24foo
If you want echo "a\tb" to print a real tab in a script, use echo's -e option inside the script (as in the code below) and run the script with:
# sh myscript.sh
Alternatively, you can give myscript.sh execute permission and then run the script directly:
# chmod +x myscript.sh
# ./myscript.sh
res="\t\tx"
echo -e "[${res}]"