I am trying to write a bash function that takes command as the first variable and output file as the second variable:
my_func() {
    $1 ${SOME_OTHER_PARAMS} |
        tee $2
}
when I run my_func "python print_hello_to_a_file ~/out.txt" "second_file.txt", it seems to output "hello" to "~/out.txt", not "out.txt" in my home directory, meaning that "~" was not expanded correctly.
I am wondering if it is possible to correctly expand "~" inside this function?
Possible? Yes, but probably not a good idea.
The basic problem: When parsing the command line, Tilde Expansion happens before Parameter Expansion. This means you can't put a tilde inside a variable and have it be replaced by a path to your home directory in the simplest case.
Minimal demo:
[user@host]$ myvar="~"
[user@host]$ echo $myvar
~
[user@host]$ echo ~
/home/user
One possible solution is to use eval to force a second round of parsing before executing the command.
[user@host]$ eval echo $myvar
/home/user
But eval is VERY DANGEROUS and you should not use it without exhausting all other possibilities. Forcing a second parsing of the command line can result in unexpected, confusing, and potentially even unsafe results if you are not extremely familiar with the parsing rules and take sufficient steps to sanitize your inputs before running them through eval.
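To see why, here is a minimal sketch of an injection; the payload is a harmless echo, but it could be any command:

[user@host]$ myvar='~; echo INJECTED'
[user@host]$ eval echo $myvar
/home/user
INJECTED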
The more standard solution is to build up your command inside a bash array.
my_func() {
    tee_output="${1}"
    shift
    # expand the inputs all together
    # SOME_OTHER_PARAMS should be an array as well
    "${@}" "${SOME_OTHER_PARAMS[@]}" | tee "${tee_output}"
}
# build the command up as an array with each word as its own element
# Tilde expansion will occur here and the result will be stored for later
my_command=( "python" "print_hello_to_a_file" ~/"out.txt" )
# expand the array and pass to my_func
# Important that the tee_location goes first here, as we
# shift it off to capture the remaining arguments as a complete command
my_func "${tee_loc}" "${my_command[#]}"
But my_func still only supports simple commands with this approach - no loops or if/case statements, no file redirections, etc. This might be okay if your goal is just to decorate a variety of commands with extra parameters and tee their output somewhere.
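For a self-contained illustration, here is a hedged sketch that substitutes printf for the question's hypothetical python script:

SOME_OTHER_PARAMS=()                     # extra args would go here, one per element
my_command=( "printf" "%s\n" "hello" )   # each word is its own array element
my_func ~/"out.txt" "${my_command[@]}"   # tilde expands here, before my_func runs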
The command:
( echo 1 )
works fine when I input it in the command line but if I store it as a variable and call it, it gives the error:
(echo: command not found
Code:
input="( echo 1 )"
$input
Why doesn't it evaluate the parentheses the same way and put it into a subshell when I call it this way?
This is discussed in full detail in BashFAQ #50.
An unquoted expansion goes through only two stages of shell parsing: field-splitting and glob expansion. Thus, ( echo 1 ) is first split into fields: (, echo, 1, and ); each is expanded as a glob (moot, as none of them contain glob characters); and then they're run as a command: ( is invoked, with the first argument echo, the second argument 1, and the third argument ).
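You can watch this happen directly (a minimal sketch):

input="( echo 1 )"
$input           # field-splits to four words: (  echo  1  )
# bash: (: command not found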
The Right Way to store code is in a function:
# best-practices approach
input() ( echo 1; )
input
...or, if you want to make it more explicit to a human reader that you really want a subshell and weren't using parens rather than braces by error or habit:
# same, but more explicit about intent
input() { (echo 1); }
input
...if not possible, one can use eval (but be wary of the caveats given in BashFAQ #48):
# avoid this approach if at all possible
input="( echo 1 )"
eval "$input"
If the real reason you're building a command in a string is to parameterize its contents, use an array instead:
input_args=( 1 ) # define an array
input() ( echo "${input_args[@]}" ) # use that array in a function (if needed)
# add things according to conditional logic as appropriate
if (( 2 > 1 )); then
input_args+=( "possible argument here" )
fi
# call the function, or just use the array directly, such as: (echo "${input_args[@]}")
input
The parentheses are shell syntax.
To reflect syntax which is stored in a variable back into the shell for processing, you must use the eval command.
Merely interpolating the value of a variable into a command line doesn't cause the shell to evaluate syntax inside that value. If it did, that would amount to an implicit eval, which would cause all sorts of problems.
For instance, consider this:
arg="(parenthesized)"
somecommand $arg
We just want somecommand to be invoked with the character string (parenthesized) as its argument; we don't want to run a subshell. That sort of implicit eval would turn harmless data into live code, creating a security hole, not to mention a coding nightmare to try to avoid it.
The rules for how $arg is treated are independent of position; the expansion happens the same way even if $arg is the first element in the command line:
$arg foo
Parameter substitution turns $arg into the text (parenthesized), and then that is tried as a command, not as shell syntax.
To execute the piece of shell script stored in input, use:
eval "$input"
You could also do it like this:
input="echo 1" # no parentheses
/bin/sh -c "$input" # run in subshell explicitly
Of course, the subshell approach feeds the code fragment to the same shell which is executing the surrounding script, whereas here we chose /bin/sh, which could be different from the invoking shell, or behave differently even if it is a symlink to the same shell.
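If you want the fragment interpreted by the same shell dialect as the surrounding script, but still in an explicit child process, a minimal variant:

input="( echo 1 )"
bash -c "$input"   # explicit child bash; prints 1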
I used this command in my Bash Shell:
printf $VAR1 >> `printf $VAR2`
and it normally worked. But when I write this into the script file and run it in Shell, it does not work. File "script.sh" contains this:
#!/bin/bash
printf $VAR1 >> `printf $VAR2`
and the output in Shell is:
script.sh: line 2: `printf $VAR2`: ambiguous redirect
I don't know how this is possible, because the command is absolutely the same. And of course, I run the script on the same system and in the same shell window.
Thank you for your help.
There are 3 points worth addressing here:
Shell variables vs. environment variables:
Scripts (unless invoked with . / source) run in a child process that only sees the parent [shell]'s environment variables, not its regular shell variables.
This is what likely happened in the OP's case: $VAR1 and $VAR2 existed as regular shell variables, but not environment variables, so script script.sh didn't see them.
Therefore, for a child process to see a parent shell's shell variables, the parent must export them first, as a result of which they (also) become environment variables: export VAR1=... VAR2=...
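A quick sketch that makes the difference visible:

VAR1=hello                 # regular shell variable: the child cannot see it
bash -c 'echo "[$VAR1]"'   # prints []
export VAR1                # promote it to an environment variable
bash -c 'echo "[$VAR1]"'   # prints [hello]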
Bash's error messages relating to output redirection (>, >>):
If the filename argument to an output redirection is an unquoted command substitution (`...`, or its modern equivalent, $(...)), i.e., the output from a command, Bash reports the error ambiguous redirect in the following cases:
The command output has embedded whitespace, i.e., contains more than one word.
The command output is empty, which is what likely happened in the OP's case.
As an aside: In this case, the error message's wording is unfortunate, because there's nothing ambiguous about a missing filename - it simply cannot work, because files need names.
It is generally advisable to double-quote command substitutions (e.g., >> "$(...)") and also variable references (e.g., "$VAR2"): this will allow filenames with embedded whitespace to work, and, should the output be unexpectedly empty, you'll get the (slightly) more meaningful error message No such file or directory.
Not double-quoting a variable reference or command substitution subjects its value to so-called shell expansions: further, often unintended, interpretation by the shell.
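Both error modes are easy to reproduce (a minimal sketch):

VAR2=""                      # empty, as in the OP's likely scenario
echo hi >> `printf $VAR2`    # bash: ambiguous redirect
echo hi >> "`printf $VAR2`"  # bash: : No such file or directory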
The wisdom of using a command substitution to generate a filename:
Leaving aside that printf $VAR2 is a fragile way to print the value of variable $VAR2 (the robust form again involves double-quoting: printf "$VAR2", or, even more robustly, printf %s "$VAR2", which rules out inadvertent interpretation of escape sequences in the variable's value), there is no good reason to employ a command substitution to begin with if all that's needed is a variable's value:
>> "$VAR2" is enough to robustly specify the value of variable $VAR2 as the target filename.
I tried this on my Mac (10.11.1) in a terminal window and it worked fine.
Are you sure your default shell is bash?
echo $SHELL
Did you use export to set your shell vars?
$ export VAR1="UselessData"
$ export VAR2="FileHoldingUselessData"
$ ./script.sh
$ cat FileHoldingUselessData
UselessData$
However, I think echo does a better job here, since with an unquoted printf the output terminates at the first space. So:
$ cat script.sh
#!/bin/bash
echo $VAR1 >> `printf $VAR2`
$ ./script.sh
$ cat FileHoldingUselessData
Some Useless Data
Which leads me to believe you might want to just use echo instead of printf altogether.
#!/bin/bash
echo $VAR1 >> `echo $VAR2`
I have a BASH script called script.sh that takes 3 arguments and runs an executable file with them. The first two are just numbers, but the last is an argument giving the input file. I would like the script to run the executable both with the input file passed as an argument and with "<" feeding the file to stdin, i.e.
bash script.sh 5 1 input.txt
calls the BASH script, and the contents of script.sh are as follows:
#!/bin/bash
command1="./l${1}t${2} $3"
command2="./l${1}t${2} < $3"
echo + ${command1}
${command1}
echo + ${command2}
${command2}
When I echo command1 I get
./l5t1 input.txt
which is exactly what I want and it runs just fine.
When I echo command2 I get
./l5t1 < input.txt
which is again what I want. The problem is the actual command the script runs is
./l5t1 '<' input.txt
which of course causes a segmentation fault in my program.
I'd like to know if there is a way I can run command 2 so that it runs the string exactly as it is printed in the echo output. Honestly, I have no idea why the single quotes are even inserted around the < character.
If you want to store commands it's better to use functions than variables. As you've found out, redirections don't work when stored in variables (nor do |, ;, or &).
command1() {
    "./l${1}t${2}" "$3"
}

command2() {
    "./l${1}t${2}" < "$3"
}
command1 "$#"
command2 "$#"
Here I've defined two functions, which are called with the arguments from the array $@. "$@" forwards the script's arguments to the functions.
Notice also that I've put quotes around "./l${1}t${2}" and "$3". Using double quotes allows these parameters to contain spaces. Liberal quoting is a good defensive scripting technique.
(I strongly recommend not doing eval "$command2". Using eval is a really dangerous habit to get into.)
I need to write a bash function which, given a string that represents a command line, returns just the first token in the command line (i.e. the program being called, which may have spaces in its name), dropping any arguments. I want to do this without using sed or awk or anything but bash builtins and variable manipulation.
e.g.:
drop_args "ls" # prints "ls"
drop_args "ls -al" # prints "ls"
drop_args "spaces\ in\ name --bad-idea" # prints "spaces\ in\ name"
What I've tried is:
drop_args () { echo $1; }
and then I call drop_args ls -al, i.e. without quoting the string, and that works nicely for all cases that I can see except for drop_args spaces\ in\ name.
I'm not too terribly concerned if I can't do this in a way that correctly handles the spaces case with the restrictions I have stipulated, but I need to at least reliably detect that situation and display an appropriate error message, I guess.
This is related to my earlier question about dereferencing shell aliases; it's sort of a subproblem of what I am ultimately trying to accomplish there.
drop_args()
{
    read cmd args <<< "$1"
    echo "$cmd"
}
This loses the backslashes, but they were never really meant to be "there" in the first place.
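That backslash processing is also what makes the spaces case work: without -r, read treats a backslash-escaped space as a literal space rather than a field separator. For example:

drop_args "spaces\ in\ name --bad-idea"   # prints: spaces in name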
Let's say I have a function abc() that will handle the logic related to analyzing the arguments passed to my script.
How can I pass all arguments my Bash script has received to abc()? The number of arguments is variable, so I can't just hard-code the arguments passed like this:
abc $1 $2 $3 $4
Better yet, is there any way for my function to have access to the script arguments' variables?
The $@ variable expands to all command-line parameters separated by spaces. Here is an example.
abc "$@"
When using $@, you should (almost) always put it in double-quotes to avoid misparsing of arguments containing spaces or wildcards (see below). This works for multiple arguments. It is also portable to all POSIX-compliant shells.
It is also worth noting that $0 (generally the script's name or path) is not in $@.
The Bash Reference Manual Special Parameters Section says that $@ expands to the positional parameters starting from one. When the expansion occurs within double quotes, each parameter expands to a separate word. That is, "$@" is equivalent to "$1" "$2" "$3"....
Passing some arguments:
If you want to pass all but the first arguments, you can first use shift to "consume" the first argument and then pass "$@" to pass the remaining arguments to another command. In Bash (and zsh and ksh, but not in plain POSIX shells like dash), you can do this without messing with the argument list using a variant of array slicing: "${@:3}" will get you the arguments starting with "$3". "${@:3:4}" will get you up to four arguments starting at "$3" (i.e. "$3" "$4" "$5" "$6"), if that many arguments were passed.
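A quick demonstration, using set -- to fake the positional parameters:

set -- a b c d e    # simulate five positional parameters
echo "${@:3}"       # c d e
echo "${@:3:2}"     # c d
shift               # consume $1
echo "$@"           # b c d e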
Things you probably don't want to do:
"$*" gives all of the arguments stuck together into a single string (separated by spaces, or whatever the first character of $IFS is). This looses the distinction between spaces within arguments and the spaces between arguments, so is generally a bad idea. Although it might be ok for printing the arguments, e.g. echo "$*", provided you don't care about preserving the space within/between distinction.
Assigning the arguments to a regular variable (as in args="$@") mashes all the arguments together like "$*" does. If you want to store the arguments in a variable, use an array with args=("$@") (the parentheses make it an array), and then reference them as e.g. "${args[0]}" etc. Note that in Bash and ksh, array indexes start at 0, so $1 will be in args[0], etc. zsh, on the other hand, starts array indexes at 1, so $1 will be in args[1]. And more basic shells like dash don't have arrays at all.
Leaving off the double-quotes, with either $@ or $*, will try to split each argument up into separate words (based on whitespace or whatever's in $IFS), and also try to expand anything that looks like a filename wildcard into a list of matching filenames. This can have really weird effects, and should almost always be avoided. (Except in zsh, where this expansion doesn't take place by default.)
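A minimal sketch of what goes wrong:

set -- "two words" "*"
printf '<%s>\n' "$@"   # <two words> then <*> - arguments preserved
printf '<%s>\n' $@     # <two>, <words>, then one line per file matched by *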
I needed a variation on this, which I expect will be useful to others:
function diffs() {
diff "${#:3}" <(sort "$1") <(sort "$2")
}
The "${#:3}" part means all the members of the array starting at 3. So this function implements a sorted diff by passing the first two arguments to diff through sort and then passing all other arguments to diff, so you can call it similarly to diff:
diffs file1 file2 [other diff args, e.g. -y]
Use the $@ variable, which expands to all command-line parameters separated by spaces.
abc "$#"
Here's a simple script:
#!/bin/bash
args=("$#")
echo Number of arguments: $#
echo 1st argument: ${args[0]}
echo 2nd argument: ${args[1]}
$# is the number of arguments received by the script. I find it easier to access them using an array: the args=("$@") line puts all the arguments in the args array. To access them use ${args[index]}.
It's worth mentioning that you can specify argument ranges with this syntax.
function example() {
    echo "line1 ${@:1:1}"; # first argument
    echo "line2 ${@:2:1}"; # second argument
    echo "line3 ${@:3}";   # third argument onwards
}
I hadn't seen it mentioned.
abc "$#" is generally the correct answer.
But I was trying to pass a parameter through to an su command, and no amount of quoting could stop the error su: unrecognized option '--myoption'. What actually worked for me was passing all the arguments as a single string:
abc "$*"
My exact case (I'm sure someone else needs this) was in my .bashrc
# run all aws commands as Jenkins user
aws ()
{
    sudo su jenkins -c "aws $*"
}
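Note that this works because su -c takes a single command string, which the child shell re-parses; arguments containing spaces or quotes will be mangled by that re-parsing. A hedged variant that re-quotes each argument using bash's printf %q:

# run all aws commands as Jenkins user, preserving argument boundaries
aws ()
{
    sudo su jenkins -c "aws $(printf '%q ' "$@")"
}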
abc "$#"
$# represents all the parameters given to your bash script.