Setup:
File a contains:
22
File b contains:
12
I have shell script 1.sh:
#!/bin/sh
a=$(< a)
b=$(< b)
echo $(($a*$b)) > c
The script should read the values from files a and b, multiply them, and save the result to file c.
However, after setting permissions ($ chmod a+rx 1.sh) and running it ($ ./1.sh), it fails with an error:
./1.sh: 5: ./1.sh: arithmetic expression: expecting primary: "*"
This error occurs because the variables $a and $b don't get their values from files a and b.
If I echo $a and echo $b, nothing is printed.
If I define the values directly in the script (a=22 and b=12), it works.
I also tried other ways of getting contents of files like a=$(< 'a'), a=$(< "a"), a=$(< "~/a"), and even a=$(< cat a). None of those worked.
Plot Twist:
However, if I change shebang line to #!/bin/bash so that Bash shell is used - it works.
Question:
How to properly get data from file in sh?
Ignore everything in files a and b except the digits:
#!/bin/sh
a=$(tr -cd 0-9 < a)
b=$(tr -cd 0-9 < b)
echo $(($a*$b))
See: man tr
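A quick demo (my own, not part of the original answer) of why this helps: even if the files have Windows line endings (CR+LF), tr strips the carriage return along with anything else that isn't a digit:
printf '22\r\n' > a
printf '12\r\n' > b
a=$(tr -cd 0-9 < a)   # a is now exactly "22"
b=$(tr -cd 0-9 < b)   # b is now exactly "12"
echo $(($a*$b))       # prints 264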
If you're looking for "true" Bourne-Shell compatibility, as opposed to Bash's emulation, then you have to go old school:
#!/bin/sh
a=`cat a`
b=`cat b`
expr $a \* $b > c
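Note that the asterisk is escaped (\*) because an unquoted * would undergo filename expansion before expr ever sees it. For the record, a run of this version (my own demo):
$ chmod +x 1.sh
$ ./1.sh
$ cat c
264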
I tried your original example under #!/bin/sh on both macOS and Linux (FC26), and it behaved properly, assuming a and b had UNIX line-endings. If that can't be guaranteed, and you need to run under #!/bin/sh (as emulated by bash), then something like this will work:
#!/bin/sh
a=$(<a)
b=$(<b)
echo $(( ${a%%[^0-9]*} * ${b%%[^0-9]*} )) > c
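To illustrate what the parameter expansion does (a sketch of mine): ${a%%[^0-9]*} removes the longest suffix that starts with a non-digit, so a stray carriage return, and anything after it, is stripped:
a=$(printf '22\r')              # simulate a value with a trailing CR
echo $(( ${a%%[^0-9]*} * 2 ))   # prints 44; the CR and everything after it is removed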
There are many ways. One obvious way is to read each file in a subshell using command substitution:
A=$(cat fileA.txt) # 22
B=$(cat fileB.txt) # 12
echo $((A*B))
# <do it in your head!>
If you run into problems with multiple lines, you need to look into the shell variable $IFS (Internal Field Separator). IFS defaults to IFS=$' \t\n' (space, tab, newline), so if you need to reliably handle line endings from both Windows (CRLF) and Linux (LF) files, you may need to modify it.
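For example, one way to make read tolerate Windows line endings is to add the carriage return to IFS so it is stripped as a delimiter (a sketch of mine; the $'...' quoting is Bash/ksh, not plain POSIX sh):
IFS=$' \t\r\n' read -r A < fileA.txt
IFS=$' \t\r\n' read -r B < fileB.txt
echo $((A*B))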
ADDENDUM:
Process Substitution
Bash, Zsh, and AT&T ksh{88,93} (but not pdksh/mksh) support process substitution. Process substitution isn't specified by POSIX. You may use NamedPipes to accomplish the same things. Coprocesses can also do everything process substitutions can, and are slightly more portable (though the syntax for using them is not).
This also means that most Android systems do not support process substitution, since their shells are most often based on mksh.
From man bash:
Process Substitution
Process substitution allows a process's input or output to be referred to using a filename. It takes the form of <(list) or >(list). The process list is run asynchronously, and its input or output appears as a filename. This filename is passed as an argument to the current command as the result of the expansion. If the >(list) form is used, writing to the file will provide input for list. If the <(list) form is used, the file passed as an argument should be read to obtain the output of list. Process substitution is supported on systems that support named pipes (FIFOs) or the /dev/fd method of naming open files. When available, process substitution is performed simultaneously with parameter and variable expansion, command substitution, and arithmetic expansion.
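To make this concrete, here is a small sketch (my own; the file names are hypothetical) of process substitution and a roughly equivalent formulation with named pipes:
# Bash/Zsh/ksh93: compare the sorted contents of two files
diff <(sort fileA.txt) <(sort fileB.txt)
# POSIX-portable equivalent using named pipes (FIFOs)
mkfifo p1 p2
sort fileA.txt > p1 &
sort fileB.txt > p2 &
diff p1 p2
rm -f p1 p2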
Related
I have this bash script that starts with
for d in /data/mydata/*; do
echo $d
filepath=$(echo $d | tr "/" "\n")
pathArr=($filepath) # fails here
echo ${pathArr[-1]}
It runs fine when I just call it on the command line:
./run_preprocess.sh
but when I run it using screen
screen -dmSL run_preproc ./run_preprocess.sh
it fails on that pathArr line
./run_preproc.sh: 7: ./run_preproc.sh: Syntax error: "(" unexpected (expecting "done")
is there something I need to do to protect the script code?
Based on the error, looks like you're running your script with POSIX sh, not bash. Arrays are undefined in POSIX sh.
To fix this, add a proper shebang to your script (e.g. #!/usr/bin/env bash), or run the script directly with the Bash interpreter (e.g. bash script.sh).
In addition (unrelated to the problem at hand), your script (or the snippet posted) has several potential issues:
variables should be quoted to prevent globbing and word splitting (e.g. if d, one of your files, contained a *, echo $d would expand it into a list of all matching files)
splitting into an array with ($var) is done on any IFS character, not just newlines; IFS includes space, tab and newline by default. Use of read -a or mapfile is recommended over ($var), as sketched below.
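For example, a safer way to split a path on / only (a sketch of mine; the path is hypothetical):
d=/data/mydata/somefile.txt
IFS='/' read -r -a pathArr <<< "$d"   # split on "/" only, even if names contain spaces
echo "${pathArr[-1]}"                 # prints: somefile.txt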
Finally, if all you're trying is get the last component in path (filename), you should consider using basename(1):
$ basename /path/to/file
file
or substring removal syntax of Bash parameter expansion:
$ path=/path/to/file
$ echo "${path##*/}"
file
The POSIX shell standard at
http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html#tag_18_07_04
says in section 2.6.3:
Command substitution allows the output of a command to be substituted in place of the command name itself
This would seem to imply that it is only guaranteed to work if you substitute for the whole command name; if you substitute for a part of it, or for something else, then it may or may not work.
Experimenting:
$ echo ;
$ $(echo echo) ;
$
So far so good...
$ e$(echo cho) ;
$ echo $(echo ';')
;
$ echo $(echo foobar)
foobar
The first and third example above seem to "work" but the second "does not work". Is this all simply undocumented and random behavior, as the standard seems to imply, and in reality for some other POSIX shell, none of these three are guaranteed to "work"?
(By "work", I mean "produce the same result as if the results of the substitution were typed in the command itself on the terminal")
The reason why this:
$ echo $(echo ';')
Does not output the same as this:
$ echo ;
Is the same reason why this:
$ ;
bash: syntax error near unexpected token `;'
Does not output the same as this:
$ `echo ';'`
;: command not found
In the latter case, the output of the command substitution (the would-be token ;) is interpreted as a command name rather than as a control operator, because the result of an expansion is passed along as data and is never re-parsed as a shell token.
As I interpret it, this isn't against the standard.
EDIT: Answering the question
You stated:
This would seem to imply that it is only guaranteed to work, if you
substitute for the whole command name; if you substitute for a part of
it, or something else, then it may or may not work.
And the POSIX standard states:
The shell shall expand the command substitution by executing command
in a subshell environment (see Shell Execution Environment) and
replacing the command substitution (the text of command plus the
enclosing "$()" or backquotes) with the standard output of the
command, removing sequences of one or more <newline> characters at the
end of the substitution.
The standard seems clear regarding what happens. This is not a question of what's present in a command containing a command substitution, only of the command that's inside the enclosing $() or ``.
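A small experiment of my own illustrates the distinction: the output of a command substitution undergoes field splitting, but the resulting fields are never re-parsed as shell operators:
$ out=$(echo 'a; b')
$ printf '<%s> ' $out; echo
<a;> <b>
The ; comes through as ordinary data inside a word; it never acts as a command separator.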
I used this command in my Bash Shell:
printf $VAR1 >> `printf $VAR2`
and it normally worked. But when I write this into the script file and run it in Shell, it does not work. File "script.sh" contains this:
#!/bin/bash
printf $VAR1 >> `printf $VAR2`
and the output in Shell is:
script.sh: line 2: `printf $VAR2`: ambiguous redirect
I don't know how this is possible, because the command is exactly the same. And of course, I run the script on the same system and in the same shell window.
Thank you for your help.
There are 3 points worth addressing here:
Shell variables vs. environment variables:
Scripts (unless invoked with . / source) run in a child process that only sees the parent [shell]'s environment variables, not its regular shell variables.
This is what likely happened in the OP's case: $VAR1 and $VAR2 existed as regular shell variables, but not environment variables, so script script.sh didn't see them.
Therefore, for a child process to see a parent shell's shell variables, the parent must export them first, as a result of which they (also) become environment variables: export VAR1=... VAR2=...
Bash's error messages relating to output redirection (>, >>):
If the filename argument of an output redirection is an unquoted command substitution (`...`, or its modern equivalent, $(...)), i.e., the output of a command, Bash reports the error ambiguous redirect in the following cases:
The command output has embedded whitespace, i.e., contains more than one word.
The command output is empty, which is what likely happened in the OP's case.
As an aside: In this case, the error message's wording is unfortunate, because there's nothing ambiguous about a missing filename - it simply cannot work, because files need names.
It is generally advisable to double-quote command substitutions (e.g., >> "$(...)") and also variable references (e.g., "$VAR2"): this allows filenames with embedded whitespace to pass through intact, and, should the output be unexpectedly empty, you'll get the (slightly) more meaningful error message No such file or directory.
Not double-quoting a variable reference or command substitution subjects its value to so-called shell expansions: further, often unintended, interpretation by the shell.
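A quick demo (my own) of both failure modes with an empty substitution:
$ echo hi >> $(true)
bash: $(true): ambiguous redirect
$ echo hi >> "$(true)"
bash: : No such file or directory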
The wisdom of using a command substitution to generate a filename:
Leaving aside that printf $VAR2 is a fragile way to print the value of variable $VAR2 in general (the robust form again involves double-quoting: printf "$VAR2", or, even more robustly, to rule out inadvertent interpretation of escape sequences in the variable value, printf %s "$VAR2"), there is no good reason to employ a command substitution to begin with if all that's needed is a variable's value:
>> "$VAR2" is enough to robustly specify the value of variable $VAR2 as the target filename.
I tried this on my Mac (10.11.1) in a terminal window and it worked fine.
Are you sure your default shell is bash?
echo $SHELL
Did you use export to set your shell vars?
$ export VAR1="UselessData"
$ export VAR2="FileHoldingUselessData"
$ ./script.sh
$ cat FileHoldingUselessData
UselessData$
However, I think echo does a better job here: with an unquoted printf $VAR1, the output terminates at the first space (only the first word becomes the format string). So:
$ cat script.sh
#!/bin/bash
echo $VAR1 >> `printf $VAR2`
$ ./script.sh
$ cat FileHoldingUselessData
Some Useless Data
Which leads me to believe you might want to just use echo instead of printf altogether:
#!/bin/bash
echo $VAR1 >> `echo $VAR2`
On any Unix shell, both the forms <input.txt cat and cat <input.txt seem to work exactly the same. Is there any difference between them?
No, there is no difference between them. See the POSIX shell command language specification, which makes no distinction between a redirection before or following a simple command. (For compound commands, the specification only requires that the shell support redirections at the end).
Redirections at arbitrary points within a simple command are not something the POSIX sh specification requires support for; however, in shells such as bash where they are allowed, these too are syntactically equivalent.
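For instance (my own demo):
$ printf 'hello\n' > input.txt
$ cat < input.txt
hello
$ < input.txt cat
hello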
There's no difference if somecommand is a simple command.
If it's a loop, the form with the redirection preceding the command is a syntax error:
For example, this:
while read line ; do echo "$line" ; done < /etc/motd
will print the contents of /etc/motd (handy to know if your system is so badly hosted that you can't execute the cat command), but this:
< /etc/motd while read line ; do echo "$line" ; done
gives a syntax error:
bash: syntax error near unexpected token `do'
This is for bash; other shells might behave differently.
Everybody says eval is evil, and you should use $() as a replacement. But I've run into a situation where the unquoting isn't handled the same inside $().
Background: I've been burned too often by file paths with spaces in them, so I like to quote all such paths. I'm also paranoid about knowing where all my executables come from. Even more paranoid, I don't trust myself, so I like being able to display the constructed commands I'm about to run.
Below I try variations on using eval vs. $(), and on whether the command name is quoted (because it could contain spaces):
BIN_LS="/bin/ls"
thefile="arf"
thecmd="\"${BIN_LS}\" -ld -- \"${thefile}\""
echo -e "\n Running command '${thecmd}'"
$($thecmd)
Running command '"/bin/ls" -ld -- "arf"'
./foo.sh: line 8: "/bin/ls": No such file or directory
echo -e "\n Eval'ing command '${thecmd}'"
eval $thecmd
Eval'ing command '"/bin/ls" -ld -- "arf"'
/bin/ls: cannot access arf: No such file or directory
thecmd="${BIN_LS} -ld -- \"${thefile}\""
echo -e "\n Running command '${thecmd}'"
$($thecmd)
Running command '/bin/ls -ld -- "arf"'
/bin/ls: cannot access "arf": No such file or directory
echo -e "\n Eval'ing command '${thecmd}'"
eval $thecmd
Eval'ing command '/bin/ls -ld -- "arf"'
/bin/ls: cannot access arf: No such file or directory
$("/bin/ls" -ld -- "${thefile}")
/bin/ls: cannot access arf: No such file or directory
So... this is confusing. A quoted command path is valid everywhere except inside a $() construct? A shorter, more direct example:
$ c="\"/bin/ls\" arf"
$ $($c)
-bash: "/bin/ls": No such file or directory
$ eval $c
/bin/ls: cannot access arf: No such file or directory
$ $("/bin/ls" arf)
/bin/ls: cannot access arf: No such file or directory
$ "/bin/ls" arf
/bin/ls: cannot access arf: No such file or directory
How does one explain the simple $($c) case?
The use of " to quote words is part of your interaction with Bash. When you type
$ "/bin/ls" arf
at the prompt, or in a script, you're telling Bash that the command consists of the words /bin/ls and arf, and the double-quotes are really emphasizing that /bin/ls is a single word.
When you type
$ eval '"/bin/ls" arf'
you're telling Bash that the command consists of the words eval and "/bin/ls" arf. Since the purpose of eval is to pretend that its argument is an actual human-input command, this is equivalent to running
$ "/bin/ls" arf
and the " gets processed just like at the prompt.
Note that this pretense is specific to eval; Bash doesn't usually go out of its way to pretend that something was an actual human-typed command.
When you type
$ c='"/bin/ls" arf'
$ $c
the $c gets substituted, and then undergoes word splitting (see §3.5.7 "Word Splitting" in the Bash Reference Manual), so the words of the command are "/bin/ls" (note the double-quotes!) and arf. Needless to say, this doesn't work. (It's also not very safe, since in addition to word-splitting, $c also undergoes filename-expansion and whatnot. Generally your parameter-expansions should always be in double-quotes, and if they can't be, then you should rewrite your code so they can be. Unquoted parameter-expansions are asking for trouble.)
When you type
$ c='"/bin/ls" arf'
$ $($c)
this is the same as before, except that now you're also trying to use the output of the nonworking command as a new command. Needless to say, that doesn't cause the nonworking command to suddenly work.
As Ignacio Vazquez-Abrams says in his answer, the right solution is to use an array, and handle the quoting properly:
$ c=("/bin/ls" arf)
$ "${c[#]}"
which sets c to an array with two elements, /bin/ls and arf, and uses those two elements as the word of a command.
With the fact that it doesn't make sense in the first place. Use an array instead.
$ c=("/bin/ls" arf)
$ "${c[#]}"
/bin/ls: cannot access arf: No such file or directory
From the man page for bash, regarding eval:
eval [arg ...]:
The args are read and concatenated together into a single command.
This command is then read and executed by the shell, and its exit
status is returned as the value of eval.
When c is defined as "\"/bin/ls\" arf", the outer quotes will cause the entire thing to be processed as the first argument to eval, which is expected to be a command or program. You need to pass your eval arguments in such a way that the target command and its arguments are listed separately.
The $(...) construct behaves differently than eval because it is not a command that takes arguments. It can process the entire command at once instead of processing arguments one at a time.
A note on your original premise: The main reason that people say that eval is evil was because it is commonly used by scripts to execute a user-provided string as a shell command. While handy at times, this is a major security problem (there's typically no practical way to safety-check the string before executing it). The security problem doesn't apply if you are using eval on hard-coded strings inside your script, as you are doing. However, it's typically easier and cleaner to use $(...) or `...` inside of scripts for command substitution, leaving no real use case left for eval.
Using set -vx helps us understand how bash processes the command string. Running "command" directly works because the quotes are stripped while the command line is parsed. However, when $c (quoted twice) is used, only the outer single quotes are removed; eval can then process the resulting string as its argument, so the remaining quotes are removed step by step.
It probably comes down to how bash semantically processes the string and its quotes.
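A trace along these lines (reconstructed by hand, so the exact formatting may differ from real output):
$ c='"/bin/ls" arf'
$ set -x
$ $c
+ '"/bin/ls"' arf
bash: "/bin/ls": No such file or directory
$ eval $c
+ eval '"/bin/ls"' arf
++ /bin/ls arf
/bin/ls: cannot access arf: No such file or directory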
Bash does have many weird behaviours about quotes processing:
Bash inserting quotes into string before execution
How do you stop bash from stripping quotes when running a variable as a command?
Bash stripping quotes - how to preserve quotes