I'm just trying to understand what is happening here, so that I understand how to parse strings in shell scripts better.
I know that usually, when you try to pass a string of arguments separated by spaces directly to a command, they will be treated as a single string argument and therefore not recognized:
>check="FileA.txt FileB.txt"
>ls $check
ls: cannot access FileA.txt FileB.txt: No such file or directory
However, in this script two option arguments are each passed as space-separated strings. In this case, both strings are recognized as lists of arguments that can be passed to different commands:
testscript.sh
while getopts o:p: arguments
do
case $arguments in
o) olist="$OPTARG";;
p) plist=$OPTARG;;
esac
done
echo "olist"
ls -l $olist
echo "plist"
ls -l $plist
the output is then as follows:
>testscript.sh -o "fileA.txt fileB.txt" -p "file1.txt file2.txt"
olist
fileA.txt
fileB.txt
plist
file1.txt
file2.txt
What is different here? Why are the space separated strings suddenly recognized as lists?
Your script does not start with a #!-line and therefore does not specify an interpreter. In that case a default is used, which is /bin/sh and not your login shell or the shell you start the script from (unless that is /bin/sh, of course). Chances are good that /bin/sh is not zsh, as most distributions and Unices use sh, bash, dash or ksh as the default shell. All of these handle parameter expansion such that strings are treated as lists if the parameter was not quoted with double-quotes.
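To see the difference directly, here is a minimal sketch (it assumes zsh is installed and on $PATH):
$ sh -c 'v="a b"; printf "[%s]\n" $v'
[a]
[b]
$ zsh -c 'v="a b"; printf "[%s]\n" $v'
[a b]
sh word-splits the unquoted $v into two arguments; zsh passes it through as one.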
If you want to use zsh as interpreter for your scripts, you have to specify it in the first line of the script:
#!/usr/bin/zsh
Modify the path to wherever your zsh resides.
You can also use env as a wrapper:
#!/usr/bin/env zsh
This makes you more independent of the actual location of zsh, it just has to be in $PATH.
As a matter of fact (using bash)...
sh$ check="FileA.txt FileB.txt"
sh$ ls $check
ls: cannot access FileA.txt: No such file or directory
ls: cannot access FileB.txt: No such file or directory
When you write $check without quotes, the variable is substituted by its content. Inside it, spaces (or, to be precise, occurrences of the characters in IFS) are treated as field separators. Just as you were expecting at first.
The only way I know to reproduce your behavior is to set IFS to something other than its default value:
sh$ export IFS="-"
sh$ check="FileA.txt FileB.txt"
sh$ ls $check
ls: cannot access FileA.txt FileB.txt: No such file or directory
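Conversely, once IFS is "-", a value containing hyphens does get split (file names here are made up):
sh$ IFS="-"
sh$ check="FileA.txt-FileB.txt"
sh$ ls $check
ls: cannot access FileA.txt: No such file or directory
ls: cannot access FileB.txt: No such file or directory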
ls *.txt shows all files whose name ends with .txt
However, if I do the following in a zsh shell (on macOS 10.15 Catalina in my case):
a=*.txt
b='*.txt'
c="*.txt"
# trying no quotes, single quotes, double quotes
# although doesn't make any difference here
ls $a
ls $b
ls $c
ls "$a"
ls "$b"
ls "$c"
I'm getting
ls: *.txt: No such file or directory
in all cases. How do I include wildcards in a variable and then have commands like ls actually process it as wildcards, rather than literal characters?
You should use ~ (tilde) between the $ and the variable name to perform globbing in zsh. That is:
ls $~a
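For instance (file names hypothetical):
% a='*.txt'
% ls $~a
file1.txt  file2.txt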
Note that unsetopt nomatch only stops zsh from reporting an error when a pattern matches nothing; it does not make zsh expand wildcards stored in variables. For that, enable the glob_subst option:
setopt glob_subst
With this option set, the result of an unquoted expansion like $a is treated as a pattern. If you want to make the change permanent, put the command above into your .zshrc file.
I'm sure this has been asked before but I can't find anything. We have inscrutable login names on a shared machine and want to use shell variables to substitute the hard-to-remember login names for people's real names.
For example, let's say Omar's login name is xyz123. I can do this:
$ omar=xyz123
$ echo ~$omar
and output looks fine:
~xyz123
but if I type this:
$ ls ~$omar
there is an error:
ls: cannot access ~xyz123: No such file or directory
I think it's because tilde expansion happens before variable expansion but can't figure out how to get around this.
Perhaps this answer is related although I'm not sure:
How to manually expand a special variable (ex: ~ tilde) in bash
bash expands the tilde before the variable. See https://www.gnu.org/software/bash/manual/bash.html#Shell-Expansions
The shell checks whether the literal characters $omar form a login name. As they do not, the tilde is not expanded. The shell eventually sees $omar as a variable and substitutes it, then hands the expanded word ~xyz123 to echo, which just prints it.
Similarly, it hands the word ~xyz123 to ls. Since ls does not do its own tilde expansion, it is looking for a file in your current directory named ~xyz123 with a literal tilde. Since such a file does not exist you get that error.
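A quick illustration, using root because that login name is known to exist (its home directory varies by system; /root is typical on Linux):
$ omar=xyz123
$ echo ~root
/root
$ echo ~$omar
~xyz123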
If you want ls ~$var to list files, you need eval ls ~$var. Or, since eval is considered very unsafe to use casually, you could do this instead:
ls "$(getent passwd "$omar" | cut -d: -f6)"
I would go with checking if "$omar" is a valid user with id and then using eval to force double expansion. So protect against evil eval and then do it.
if ! id "$omar" >/dev/null 2>&1; then
echo "Error: user with the name $omar does not exist!" >&2
exit 1
fi
eval echo "~$omar"
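For example, with root as a login name that certainly exists (home directory again system-dependent):
$ omar=root
$ eval echo "~$omar"
/root
The first expansion produces the words echo and ~root; eval re-parses them as a command, so the tilde is expanded on the second pass.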
I used this command in my Bash Shell:
printf $VAR1 >> `printf $VAR2`
and it normally worked. But when I put this into a script file and run it in the shell, it does not work. The file "script.sh" contains this:
#!/bin/bash
printf $VAR1 >> `printf $VAR2`
and the output in the shell is:
script.sh: line 2: `printf $VAR2`: ambiguous redirect
I don't know how this is possible, because the command is exactly the same. And of course, I ran the script on the same system and in the same shell window.
Thank you for your help.
There are 3 points worth addressing here:
Shell variables vs. environment variables:
Scripts (unless invoked with . / source) run in a child process that only sees the parent [shell]'s environment variables, not its regular shell variables.
This is what likely happened in the OP's case: $VAR1 and $VAR2 existed as regular shell variables, but not environment variables, so script script.sh didn't see them.
Therefore, for a child process to see a parent shell's shell variables, the parent must export them first, as a result of which they (also) become environment variables: export VAR1=... VAR2=...
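A minimal demonstration of the difference (variable name arbitrary):
$ VAR1=hello
$ bash -c 'echo "[$VAR1]"'
[]
$ export VAR1
$ bash -c 'echo "[$VAR1]"'
[hello]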
Bash's error messages relating to output redirection (>, >>):
If the filename argument of an output redirection is an unquoted command substitution (`...`, or its modern equivalent, $(...)), i.e., the output from a command, Bash reports the error ambiguous redirect in the following cases:
The command output has embedded whitespace, i.e., contains more than one word.
The command output is empty, which is what likely happened in the OP's case.
As an aside: In this case, the error message's wording is unfortunate, because there's nothing ambiguous about a missing filename - it simply cannot work, because files need names.
It is generally advisable to double-quote command substitutions (e.g., >> "$(...)") and also variable references (e.g., "$VAR2"): this will allow you to return filenames with embedded whitespace, and, should the output be unexpectedly empty, you'll get the (slightly) more meaningful error message No such file or directory.
Not double-quoting a variable reference or command substitution subjects its value to so-called shell expansions: further, often unintended, interpretation by the shell.
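A quick interactive reproduction of both cases (VAR2 deliberately unset; exact message wording may vary by Bash version):
$ unset VAR2
$ echo hi >> $(printf '%s' "$VAR2")
bash: $(printf '%s' "$VAR2"): ambiguous redirect
$ echo hi >> "$(printf '%s' "$VAR2")"
bash: : No such file or directory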
The wisdom of using a command substitution to generate a filename:
Leaving aside that printf $VAR2 is a fragile way to print the value of variable $VAR2 in general (the robust form again involves double-quoting: printf "$VAR2", or, even more robustly, to rule out inadvertent interpretation of escape sequences in the variable value, printf %s "$VAR2"), there is no good reason to employ a command substitution to begin with if all that's needed is a variable's value:
>> "$VAR2" is enough to robustly specify the value of variable $VAR2 as the target filename.
I tried this on my Mac (10.11.1) in a terminal window and it worked fine.
Are you sure your default shell is bash?
echo $SHELL
Did you use export to set your shell vars?
$ export VAR1="UselessData"
$ export VAR2="FileHoldingUselessData"
$ ./script.sh
$ cat FileHoldingUselessData
UselessData$
However.... I think echo does a better job here, since with an unquoted printf the output stops at the first space (word splitting makes the first word the format string), so....
$ cat script.sh
#!/bin/bash
echo $VAR1 >> `printf $VAR2`
$ export VAR1="Some Useless Data"
$ ./script.sh
$ cat FileHoldingUselessData
Some Useless Data
Which leads me to believe you might want to just use echo instead of printf altogether:
#!/bin/bash
echo $VAR1 >> `echo $VAR2`
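To see why the unquoted printf drops everything after the first space (value made up; in bash the first word becomes the format string and the rest become spare arguments):
$ VAR1="Some Useless Data"
$ printf $VAR1; echo
Some
$ echo $VAR1
Some Useless Data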
Everybody says eval is evil, and you should use $() as a replacement. But I've run into a situation where the unquoting isn't handled the same inside $().
Background is that I've been burned too often by file paths with spaces in them, and so like to quote all such paths. More paranoia about wanting to know where all my executables are coming from. Even more paranoid, not trusting myself, and so like being able to display the created commands I'm about to run.
Below I try variations on using eval vs. $(), and on whether the command name is quoted (since it could contain spaces):
BIN_LS="/bin/ls"
thefile="arf"
thecmd="\"${BIN_LS}\" -ld -- \"${thefile}\""
echo -e "\n Running command '${thecmd}'"
$($thecmd)
Running command '"/bin/ls" -ld -- "arf"'
./foo.sh: line 8: "/bin/ls": No such file or directory
echo -e "\n Eval'ing command '${thecmd}'"
eval $thecmd
Eval'ing command '"/bin/ls" -ld -- "arf"'
/bin/ls: cannot access arf: No such file or directory
thecmd="${BIN_LS} -ld -- \"${thefile}\""
echo -e "\n Running command '${thecmd}'"
$($thecmd)
Running command '/bin/ls -ld -- "arf"'
/bin/ls: cannot access "arf": No such file or directory
echo -e "\n Eval'ing command '${thecmd}'"
eval $thecmd
Eval'ing command '/bin/ls -ld -- "arf"'
/bin/ls: cannot access arf: No such file or directory
$("/bin/ls" -ld -- "${thefile}")
/bin/ls: cannot access arf: No such file or directory
So... this is confusing. A quoted command path is valid everywhere except inside a $() construct? A shorter, more direct example:
$ c="\"/bin/ls\" arf"
$ $($c)
-bash: "/bin/ls": No such file or directory
$ eval $c
/bin/ls: cannot access arf: No such file or directory
$ $("/bin/ls" arf)
/bin/ls: cannot access arf: No such file or directory
$ "/bin/ls" arf
/bin/ls: cannot access arf: No such file or directory
How does one explain the simple $($c) case?
The use of " to quote words is part of your interaction with Bash. When you type
$ "/bin/ls" arf
at the prompt, or in a script, you're telling Bash that the command consists of the words /bin/ls and arf, and the double-quotes are really emphasizing that /bin/ls is a single word.
When you type
$ eval '"/bin/ls" arf'
you're telling Bash that the command consists of the words eval and "/bin/ls" arf. Since the purpose of eval is to pretend that its argument is an actual human-input command, this is equivalent to running
$ "/bin/ls" arf
and the " gets processed just like at the prompt.
Note that this pretense is specific to eval; Bash doesn't usually go out of its way to pretend that something was an actual human-typed command.
When you type
$ c='"/bin/ls" arf'
$ $c
the $c gets substituted, and then undergoes word splitting (see §3.5.7 "Word Splitting" in the Bash Reference Manual), so the words of the command are "/bin/ls" (note the double-quotes!) and arf. Needless to say, this doesn't work. (It's also not very safe, since in addition to word-splitting, $c also undergoes filename-expansion and whatnot. Generally your parameter-expansions should always be in double-quotes, and if they can't be, then you should rewrite your code so they can be. Unquoted parameter-expansions are asking for trouble.)
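You can make the post-splitting words visible with something like this (the <...> brackets are just for display):
$ c='"/bin/ls" arf'
$ printf '<%s>\n' $c
<"/bin/ls">
<arf>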
When you type
$ c='"/bin/ls" arf'
$ $($c)
this is the same as before, except that now you're also trying to use the output of the nonworking command as a new command. Needless to say, that doesn't cause the nonworking command to suddenly work.
As Ignacio Vazquez-Abrams says in his answer, the right solution is to use an array, and handle the quoting properly:
$ c=("/bin/ls" arf)
$ "${c[#]}"
which sets c to an array with two elements, /bin/ls and arf, and uses those two elements as the words of a command.
With the fact that it doesn't make sense in the first place. Use an array instead.
$ c=("/bin/ls" arf)
$ "${c[#]}"
/bin/ls: cannot access arf: No such file or directory
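The array form also preserves elements containing spaces, which is the usual motivation for it (example words made up; /bin/echo exists on most systems):
$ c=("/bin/echo" "two words")
$ "${c[@]}"
two words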
From the man page for bash, regarding eval:
eval [arg ...]:
The args are read and concatenated together into a single command.
This command is then read and executed by the shell, and its exit
status is returned as the value of eval.
When c is defined as "\"/bin/ls\" arf", the outer quotes will cause the entire thing to be processed as the first argument to eval, which is expected to be a command or program. You need to pass your eval arguments in such a way that the target command and its arguments are listed separately.
The $(...) construct behaves differently from eval because it is not a command that takes arguments: it runs the enclosed text as a command and substitutes the command's output in place. Quote characters that survive variable expansion are never re-read as quoting operators.
A note on your original premise: The main reason that people say that eval is evil was because it is commonly used by scripts to execute a user-provided string as a shell command. While handy at times, this is a major security problem (there's typically no practical way to safety-check the string before executing it). The security problem doesn't apply if you are using eval on hard-coded strings inside your script, as you are doing. However, it's typically easier and cleaner to use $(...) or `...` inside of scripts for command substitution, leaving no real use case left for eval.
Using set -vx helps us understand how bash processes the command string.
As the trace output shows, "command" works because the quotes are stripped during processing. However, when $c (quoted twice) is used, only the outer single quotes are removed. eval can process the string as its argument, so the quotes are removed step by step.
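A rough reconstruction of such a trace (exact xtrace formatting varies across bash versions):
$ set -x
$ c='"/bin/ls" arf'
+ c='"/bin/ls" arf'
$ $c
+ '"/bin/ls"' arf
bash: "/bin/ls": No such file or directory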
It is probably just related to how bash semantically processes the string and quotes.
Bash does have many weird behaviours around quote processing:
Bash inserting quotes into string before execution
How do you stop bash from stripping quotes when running a variable as a command?
Bash stripping quotes - how to preserve quotes
I have a file called inp.txt which lists 3 directory names
#!/bin/sh
while read dirname
do
echo $dirname
"ls -l" $dirname
done < inp.txt
When I run the above, I get this error:
line 5: ls -l: command not found
If I do just "ls" instead of "ls -l", it works fine. What am I missing here?
Get rid of the quotes.
while read dirname
do
echo "$dirname"
ls -l "$dirname"
done < inp.txt
When you have quotes you're saying, "treat this as a single word." The shell looks for an executable named ls -l rather than passing the argument -l to the command ls.
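You can reproduce the difference directly at the prompt (using /tmp as an arbitrary directory):
$ "ls -l" /tmp
bash: ls -l: command not found
$ ls -l /tmp
(a normal long listing of /tmp follows)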
Nitpicker's note: If you want to use quotes properly, add them around "$dirname".
Otherwise, if you had a directory named "Music Files" with a space in the name, without quotes your script would treat that as two directory names and print something like:
ls: Music: No such file or directory
ls: Files: No such file or directory
The shell interprets spaces as argument separators. By putting quotes around something, you force the shell to see it as one argument. When executing, the shell interprets the first argument as the command to execute. In this case you're telling it to execute the command named "ls -l" with no arguments, instead of "ls" with the argument "-l".
You should remove the quotes around ls -l.