Output redirection using a bash variable

I would like to do something along these lines, but most combinations I have tried have failed:
export no_error="2 > /dev/null"
./some_command $no_error
To run that command and redirect the output using a variable instead of typing the command. How would I go about doing this?

The shell doesn't re-parse the contents of your no_error variable for redirections when you expand it like that; the expanded text just gets passed to ./some_command as command-line arguments. You can get the behaviour you want by using eval. From the bash manual:
eval [arguments]
The arguments are concatenated together into a single command, which is then read and executed, and its exit status returned as the exit status of eval. If there are no arguments or only empty arguments, the return status is zero.
Here's an example for your case:
export no_error="2>/dev/null"
eval ./some_command $no_error
Note that you can't have a space between the 2 and the >. I'm guessing that's just a typo in your question, though.
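For example, a minimal sketch of how the variable can be toggled (assuming ./some_command is your actual program):
export no_error="2>/dev/null"
eval ./some_command $no_error    # stderr is discarded
no_error=""
eval ./some_command $no_error    # stderr is shown as usual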

Related

Will variable be assigned to empty if the command for command substitution failed in shell scripting?

a=whatever
f(){ return 1; }
echo $a # this gives whatever
a=$(f)
echo $a # now a is empty
Is this because command returned non zero, so command substitution failed, thus a is set to empty? Is this a well defined behavior? Can't find it anywhere. Could you point me to the doc describing it?
I tried to grep something from a file and assign the result to a variable. So I did a=$(grep pattern file_not_exist). Then I saw grep complain that the file does not exist. I wonder why this complaint message is printed out rather than being assigned to the variable? Is it because of stdout and stderr?
No, it's not because the command failed that a is empty.
The command didn't produce any output data, so there is no data to be captured in a.
Revise the function to read:
f(){ echo Hello; return 1; }
Now run the command substitution. You'll find that a contains Hello. If you check $? immediately after the assignment, it contains 1, the status returned from the function. Exit statuses and command outputs are separate.
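A short sketch of that point, capturing the output and checking the status separately:
f(){ echo Hello; return 1; }
a=$(f)            # a gets the function's stdout
status=$?         # $? holds the function's return status
echo "a=$a"       # prints: a=Hello
echo "status=$status"   # prints: status=1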
The documentation is in the GNU Bash manual for Command Substitution.

Echo-ing an environment variable returns string literal rather than environment variable value

I have two bash scripts. The first listens to a pipe "myfifo" for input and executes the input as a command:
fifo_name="myfifo"
[ -p $fifo_name ] || mkfifo $fifo_name;
while true
do
if read line; then
$line
fi
done <"$fifo_name"
The second passes a command 'echo $SET_VAR' to the "myfifo" pipe:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
As you can see, I want to pass 'echo $SET_VAR' through the pipe. In the listener process, I've set a SET_VAR environment variable. I expect the output of the command 'echo $SET_VAR' to be 'var_value', which is the value of the environment variable SET_VAR.
Running the first (the listener) script in one bash process and then passing a command via the second in another process gives the following result:
$SET_VAR
I expected "var_value" to be printed. Instead, the string literal $SET_VAR is printed. Why is this the case?
Before I get to the problem you're reporting, I have to point out that your loop won't work. The while true part (without a break somewhere in the loop) will run forever. It'll read the first line from the file, loop, try to read a second line (which fails), loop again, try to read a third line (also fails), loop again, try to read a fourth line, etc... You want the loop to exit as soon as the read command fails, so use this:
while read line
do
# something I'll get to
done <"$fifo_name"
The other problem you're having is that the shell expands variables (i.e. replaces $var with the value of the variable var) partway through the process of parsing a command line, and when it's done that it doesn't go back and re-do the earlier parsing steps. In particular, if the variable's value included something like $SET_VAR it doesn't go back and expand that, since it's just finished the bit where it expands variables. In fact, the only thing it does with the expanded value is split it into "words" (based on whitespace), and expand any filename wildcards it finds -- no variable expansions happen, no quote or escape interpretation, etc.
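A minimal sketch of that behaviour (assuming SET_VAR is set to var_value, as in your listener):
SET_VAR=var_value
line='echo $SET_VAR'
$line    # prints the literal text: $SET_VAR -- the expanded value is not scanned again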
One possible solution is to tell the shell to run the parsing process twice, with the eval command:
while read line
do
eval "$line"
done <"$fifo_name"
(Note that I used double-quotes around "$line" -- this prevents the word splitting and wildcard expansion I mentioned from happening before eval goes through the normal parsing process. If you think of your original code as leaving the command in $line half-parsed, then without the double-quotes it would get one-and-a-half-parsed, which is weird. The double-quotes suppress that extra half-parsing stage, so the contents of the variable get parsed exactly once.)
However, this solution comes with a big warning, because eval has a well-deserved reputation as a bug magnet. eval makes it easy to do complex things without quite understanding what's going on, which means you tend to get scripts that work great in testing, then fail incomprehensibly later. And in my experience, when eval looks like the best solution, it probably means you're trying to solve the wrong problem.
So, what're you actually trying to do? If you're just trying to execute the lines coming from the fifo as shell commands, then you can use bash "$fifo_name" to run them in a subshell, or source "$fifo_name" to run them in the current shell.
BTW, the script that feeds the fifo:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
is also a disaster waiting to happen. Putting commands in variables doesn't work very well in the shell (I second chepner's recommendation of BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!), and putting a command to print another command in a variable is just begging for trouble.
bash, by its nature, reads commands from stdin. You can simply run:
bash < myfifo
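Putting the pieces together, a hedged sketch of the simplified listener (assuming the commands arriving on the fifo should see SET_VAR):
fifo_name="myfifo"
[ -p "$fifo_name" ] || mkfifo "$fifo_name"
export SET_VAR=var_value    # export it so the child bash inherits it
bash < "$fifo_name"         # executes whatever lines arrive on the fifo
From the other process, send the command in single quotes so $SET_VAR is expanded by the listener rather than the sender:
echo 'echo $SET_VAR' > myfifo    # the listener prints: var_value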

How do I redirect output when the command to execute is stored in a variable in a bash script?

Consider the following script:
#!/bin/bash
CMD="echo hello world > /tmp/hello.out"
${CMD}
The output for this is:
hello world > /tmp/hello.out
How can I modify CMD so that the output gets redirected to hello.out?
For my use case, it is not feasible to either do this:
${CMD} > /tmp/hello.out
or to add this at the top of the script:
exec > /tmp/hello.out
No, there is no way to make a redirection happen from a variable.
Why?
The first thing the shell does with a command line is:
Each line that the shell reads from the standard input or a script is called a pipeline; it contains one or more commands separated by zero or more pipe characters (|). For each pipeline it reads, the shell breaks it up into commands, sets up the I/O for the pipeline, then does the following for each command (Figure 7-1):
From: Learning the bash Shell: Unix Shell Programming, Chapter 7 (Figure 7-1).
That means that even before starting with the first word of a command line, the redirections are set up.
The "Parameter Expansion" happens quite a lot latter (in step 6 of the Figure).
There is no way to set up redirections after a variable is expanded.
Unless ...
The "command line is reprocessed" using eval.
eval "$CMD"
But this comes with a lot of danger.
The command line is changed by the first processing in the 12 steps detailed in the book (quotes are removed, variables expanded, words split, etc.).
It is usually quite difficult to estimate all the changes and consequences before the line is actually processed.
And then, it is processed again.
You can use eval to instruct the shell to reinterpret the variable content as a shell command:
eval $CMD
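For example, a minimal sketch of the script rewritten with eval (double-quoting the variable so its contents are re-parsed exactly once):
#!/bin/bash
CMD="echo hello world > /tmp/hello.out"
eval "$CMD"
cat /tmp/hello.out    # hello world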

Bash command line arguments, replacing defaults for variables

I have a script which has several input files, generally these are defaults stored in a standard place and called by the script.
However, sometimes it is necessary to run it with changed inputs.
In the script I currently have, say, three variables: $A, $B, and $C. Now I want to run it with a non-default $B, and tomorrow I may want to run it with a non-default $A and $B.
I have had a look around at how to parse command line arguments:
How do I parse command line arguments in Bash?
How do I deal with having some of them set by command-line arguments, but only some of the time?
I don't have enough reputation points to answer my own question. However, I have a solution:
Override a variable in a Bash script from the command line
#!/bin/bash
# Default inputs
a=input1
b=input2
c=input3
# Override any of them with -a, -b or -c on the command line
while getopts "a:b:c:" flag
do
case $flag in
a) a=$OPTARG;;
b) b=$OPTARG;;
c) c=$OPTARG;;
esac
done
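For example, assuming the script above is saved as defaults.sh (an illustrative name) and made executable, non-default inputs could be passed like this:
./defaults.sh -b other_input    # only b is overridden; a and c keep their defaults
./defaults.sh -a foo -c baz     # override a and c, keep the default b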
You can do it the following way. See Shell Parameter Expansion on the Bash man page.
#! /bin/bash
value=${1:-the default value}
echo value=$value
On the command line:
$ ./myscript.sh
value=the default value
$ ./myscript.sh foobar
value=foobar
Instead of using command line arguments to overwrite default values, you can also set the variables outside of the script. For example, the following script can be invoked with foo=54 /tmp/foobar or bar=/var/tmp /tmp/foobar:
#! /bin/bash
: ${foo:=42}      # if foo is unset or empty, give it the default 42
: ${bar:=/tmp}    # if bar is unset or empty, give it the default /tmp
echo "foo=$foo bar=$bar"

Get name of last run program in Bash

I have a bash script where I trap errors using the trap command, and I would like to be able to print the name of the last command (the one that failed):
#!/bin/bash
function error
{
# echo program name
}
trap error ERR
# Some commands ...
/bin/false foo # For testing
I'm not sure what to put in the error function. I tried echo $_ but that only works if the command has no arguments. I also tried with !! but that gives me "!!: command not found". At an interactive prompt (also bash) I get:
$ /bin/false foo
$ !!
/bin/false foo
which seems to be pretty much what I want. Why the difference?
What is the easiest way to get the name of the previous command inside a script?
Try echo $BASH_COMMAND in your trap function.
From man bash:
BASH_COMMAND
The command currently being executed or about to be executed,
unless the shell is executing a command as the result of a trap,
in which case it is the command executing at the time of the
trap.
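So, a minimal sketch of your script with the trap filled in:
#!/bin/bash
function error
{
echo "failed command: $BASH_COMMAND" >&2
}
trap error ERR
/bin/false foo    # the trap prints: failed command: /bin/false foo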
You need to enable command history in your script with
set -o history
To quote the bash man page:
When the -o history option to the set builtin is enabled, the shell provides access to the command history, the list of commands previously typed. The value of the HISTSIZE variable is used as the number of commands to save in a history list. The text of the last HISTSIZE commands (default 500) is saved. The shell stores each command in the history list prior to parameter and variable expansion (see EXPANSION above) but after history expansion is performed, subject to the values of the shell variables HISTIGNORE and HISTCONTROL.
In general, read the HISTORY and HISTORY EXPANSION sections of the bash man page.
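A hedged sketch combining the two answers; it assumes that, with history enabled, fc -ln -1 inside the trap shows the failing script line (worth verifying against your bash version):
#!/bin/bash
set -o history    # keep a history list even in a non-interactive script
function error
{
echo "BASH_COMMAND: $BASH_COMMAND" >&2
echo "last history entry:" >&2
fc -ln -1 >&2     # print the most recent history entry
}
trap error ERR
/bin/false foo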
