I have this bash script that starts with
for d in /data/mydata/*; do
echo $d
filepath=$(echo $d | tr "/" "\n")
pathArr=($filepath) # fails here
echo ${pathArr[-1]}
It runs fine when I just call it on the command line
./run_preprocess.sh
but when I run it using screen
screen -dmSL run_preproc ./run_preprocess.sh
it fails on that pathArr line
./run_preproc.sh: 7: ./run_preproc.sh: Syntax error: "(" unexpected (expecting "done")
is there something I need to do to protect the script code?
Based on the error, it looks like your script is being run with POSIX sh, not Bash. Arrays are not defined in POSIX sh.
To fix this, add a proper shebang to your script (e.g. #!/usr/bin/env bash), or run the script through the Bash interpreter explicitly (e.g. bash script.sh).
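For example, a minimal sketch (the screen flags are copied from your invocation above):
#!/usr/bin/env bash    # make this the very first line of run_preprocess.sh
or, without editing the script, name the interpreter explicitly:
screen -dmSL run_preproc bash ./run_preprocess.sh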
In addition (unrelated to the problem at hand), your script (or the snippet posted) has several potential issues:
variables should be quoted to prevent globbing and word splitting (e.g. if d, one of your paths, contained a literal *, echo $d would expand it and print a list of matching files)
splitting into an array with ($var) happens on any IFS character, not just newlines; IFS contains space, tab and newline by default. Using read -a or mapfile is recommended over ($var), for example:
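A minimal sketch of that approach, assuming d holds one path from your loop (negative array indices need Bash 4.3+):
IFS='/' read -r -a pathArr <<< "$d"   # split on "/" only, not on spaces or tabs
echo "${pathArr[-1]}"                 # last path component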
Finally, if all you're trying to do is get the last component of the path (the filename), consider using basename(1):
$ basename /path/to/file
file
or substring removal syntax of Bash parameter expansion:
$ path=/path/to/file
$ echo "${path##*/}"
file
Setup:
File a contains:
22
File b contains:
12
I have shell script 1.sh:
#!/bin/sh
a=$(< a)
b=$(< b)
echo $(($a*$b)) > c
The script should get the values from files a and b, multiply them, and save the result to file c.
However, after setting permissions ($ chmod a+rx 1.sh) and running it ($ ./1.sh), it returns an error:
./1.sh: 5: ./1.sh: arithmetic expression: expecting primary: "*"
This error occurs because the variables $a and $b don't get their values from files a and b.
If I echo $a and echo $b, they print nothing;
If I define a=22 and b=12 values in the script it works;
I also tried other ways of getting contents of files like a=$(< 'a'), a=$(< "a"), a=$(< "~/a"), and even a=$(< cat a). None of those worked.
Plot Twist:
However, if I change shebang line to #!/bin/bash so that Bash shell is used - it works.
Question:
How to properly get data from file in sh?
Ignore everything in files a and b except digits:
#!/bin/sh
a=$(tr -cd 0-9 < a)
b=$(tr -cd 0-9 < b)
echo $(($a*$b))
See: man tr
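For example, a hypothetical session (assuming 1.sh now contains the snippet above; the \r simulates a Windows line ending, and 22 * 12 = 264):
$ printf '22\r\n' > a
$ printf '12\n' > b
$ ./1.sh
264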
If you're looking for "true" Bourne-Shell compatibility, as opposed to Bash's emulation, then you have to go old school:
#!/bin/sh
a=`cat a`
b=`cat b`
expr $a \* $b > c
I tried your original example under #!/bin/sh on both macOS and Linux (FC26), and it behaved properly, assuming a and b had UNIX line-endings. If that can't be guaranteed, and you need to run under #!/bin/sh (as emulated by bash), then something like this will work:
#!/bin/sh
a=$(<a)
b=$(<b)
echo $(( ${a%%[^0-9]*} * ${b%%[^0-9]*} )) > c
There are many ways. One obvious way is to pipe in a sub-process by Command Substitution:
A=$(cat fileA.txt) # 22
B=$(cat fileB.txt) # 12
echo $((A*B))
# <do it in your head!>
If there are any other problems with multiple lines, you need to look into how to use the Bash variable $IFS (Internal Field Separator). IFS is usually defined as IFS=$' \t\n', so if you need to read line endings from both Windows and Linux files reliably, you may need to modify it.
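A simpler hedged sketch of that idea (same placeholder file names as above): delete any carriage returns before doing the arithmetic, instead of touching IFS:
A=$(tr -d '\r' < fileA.txt)   # strip a possible Windows CR
B=$(tr -d '\r' < fileB.txt)
echo $((A*B))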
ADDENDUM:
Process Substitution
Bash, Zsh, and AT&T ksh{88,93} (but not pdksh/mksh) support process
substitution. Process substitution isn't specified by POSIX. You may
use NamedPipes to accomplish the same things. Coprocesses can also do
everything process substitutions can, and are slightly more portable
(though the syntax for using them is not).
This also means that most Android systems do not support process substitution, since their shells are most often based on mksh.
From man bash:
Process Substitution
Process substitution allows a process's input or output to be referred to using a filename. It takes the form of <(list) or >(list). The process list is run asynchronously, and its input or output appears as a filename. This filename is passed as an argument to the current command as the result of the expansion. If the >(list) form is used, writing to the file will provide input for list. If the <(list) form is used, the file passed as an argument should be read to obtain the output of list. Process substitution is supported on systems that support named pipes (FIFOs) or the /dev/fd method of naming open files.
When available, process substitution is performed simultaneously with parameter and variable expansion, command substitution, and arithmetic expansion.
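A small illustration of the feature (my own example, with placeholder file names): compare the sorted contents of two files without creating temporary files; each <(...) expands to a filename (a FIFO or /dev/fd entry) that diff reads from:
diff <(sort listA.txt) <(sort listB.txt)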
The POSIX shell standard at
http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html#tag_18_07_04
says in section 2.6.3:
Command substitution allows the output of a command to be substituted in place of the command name itself
This would seem to imply that it is only guaranteed to work, if you substitute for the whole command name; if you substitute for a part of it, or something else, then it may or may not work.
Experimenting:
$ echo ;
$ $(echo echo) ;
$
So far so good...
$ e$(echo cho) ;
$ echo $(echo ';')
;
$ echo $(echo foobar)
foobar
The first and third example above seem to "work" but the second "does not work". Is this all simply undocumented and random behavior, as the standard seems to imply, and in reality for some other POSIX shell, none of these three are guaranteed to "work"?
(By "work", I mean "produce the same result as if the results of the substitution were typed in the command itself on the terminal")
The reason why this:
$ echo $(echo ';')
Does not output the same as this:
$ echo ;
Is the same reason why this:
$ ;
bash: syntax error near unexpected token `;'
Does not output the same as this:
$ `echo ';'`
;: command not found
In the latter case, the output of the command substitution (the would-be token ;) is interpreted as a command name, because it is substituted back into the command line as plain text, not as a token with special meaning to the command interpreter.
In my interpretation, this isn't against the standard.
EDIT: Answering the question
You stated:
This would seem to imply that it is only guaranteed to work, if you
substitute for the whole command name; if you substitute for a part of
it, or something else, then it may or may not work.
And the POSIX standard states:
The shell shall expand the command substitution by executing command
in a subshell environment (see Shell Execution Environment) and
replacing the command substitution (the text of command plus the
enclosing "$()" or backquotes) with the standard output of the
command, removing sequences of one or more <newline> characters at the
end of the substitution.
The standard seems clear about what happens. It is not a question of what else is present in the command containing the substitution; only the command inside the enclosing $() or backquotes matters.
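For illustration (my own example, not from the standard): the substituted text undergoes field splitting, but it is never re-scanned for operators such as ;, so the semicolon stays a literal argument word:
$ echo $(echo 'a ; b')
a ; b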
I get a set of strings as input in the terminal. I need to replace the ".awk" substring with ".sh" in each string using the shell, and then output the modified string.
I wrote such script for doing this:
#!/bin/bash
while read line
do
result=${line/.awk/.sh}
echo $result
done
But it gives me an error: script-ch.sh: 6: script-ch.sh: Bad substitution.
How should I change this simple script to fix error?
UPD: ".awk" may appear in the middle of the string. For example: "file.awk.old".
If you are using Bash, then there is nothing wrong with your substitution. There is no reason to spawn an additional subshell and use a separate utility when Bash substring replacement was tailor-made for that job:
$ fn="myfile.awk.old"; echo "$fn --> ${fn/.awk/.sh}"
myfile.awk.old --> myfile.sh.old
Note: if you are substituting .sh for .awk, then the . is unnecessary. A simple ${fn/awk/sh} will suffice.
I suspect you have some stray DOS character in your original script.
Not sure why it works for you and not for me... it might be the input you're giving it. It could have a space in it.
This should work:
#!/bin/bash
while read line
do
result=$(echo "$line" | sed 's/\.awk/.sh/')
echo "$result"
done
If you run chmod +x script.sh and then run it with ./script.sh, or if you run it with bash script.sh, it should work fine.
Running it with sh script.sh will not work because the shebang line will be ignored and the script will be interpreted by sh (dash on many systems), which does not support that string-substitution syntax.
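If the script really must run under plain #!/bin/sh, a POSIX-compatible sketch (my own suggestion, using prefix/suffix removal, which dash does support):
#!/bin/sh
while IFS= read -r line; do
    case $line in
        *.awk*) line="${line%%.awk*}.sh${line#*.awk}" ;;   # replace the first ".awk"
    esac
    printf '%s\n' "$line"
done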
I cannot believe I've spent 1.5 hours on something as trivial as this. I'm writing a very simple shell script which greps a file, stores the output in a variable, and echoes the variable to STDOUT.
I have checked the grep command with the regex on the command line, and it works fine. But for some reason, the grep command doesn't work inside the shell script.
Here is the shell script I wrote up:
#!/bin/bash
tt=grep 'test' $1
echo $tt
I ran this with the following command: ./myScript.sh testingFile. It just prints an empty line.
I have already used chmod and made the script executable.
I have checked that the PATH variable has /bin in it.
Verified that echo $SHELL gives /bin/bash
In my desperation, I have tried all combinations of:
tt=grep 'test' "$1"
echo ${tt}
Not using the command line argument at all, and hardcoding the name of the file tt=grep 'test' testingFile
I found this: grep fails inside bash script but works on command line, and even used dos2unix to remove any possible carriage returns.
Also, when I try to use any of the grep options, like: tt=grep -oE 'test' testingFile, I get an error saying: ./out.sh: line 3: -oE: command not found.
This is crazy.
You need to use command substitution:
#!/usr/bin/env bash
test=$(grep 'foo' "$1")
echo "$test"
Command substitution allows the output of a command to replace the command itself. Command substitution occurs when a command is enclosed like this:
$(command)
or like this using backticks:
`command`
Bash performs the expansion by executing COMMAND and replacing the command substitution with the standard output of the command, with any trailing newlines deleted. Embedded newlines are not deleted, but they may be removed during word splitting.
The $() version is usually preferred because it allows nesting:
$(command $(command))
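For example (my own illustration, not from the manual): the inner substitution runs first and its output feeds the outer command:
bash_dir=$(dirname "$(command -v bash)")   # e.g. /bin or /usr/bin
echo "$bash_dir"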
For more information read the command substitution section in man bash.
Everybody says eval is evil, and you should use $() as a replacement. But I've run into a situation where the unquoting isn't handled the same inside $().
Background is that I've been burned too often by file paths with spaces in them, and so like to quote all such paths. More paranoia about wanting to know where all my executables are coming from. Even more paranoid, not trusting myself, and so like being able to display the created commands I'm about to run.
Below I try variations on using eval vs. $(), and on whether the command name is quoted (since it could contain spaces).
BIN_LS="/bin/ls"
thefile="arf"
thecmd="\"${BIN_LS}\" -ld -- \"${thefile}\""
echo -e "\n Running command '${thecmd}'"
$($thecmd)
Running command '"/bin/ls" -ld -- "arf"'
./foo.sh: line 8: "/bin/ls": No such file or directory
echo -e "\n Eval'ing command '${thecmd}'"
eval $thecmd
Eval'ing command '"/bin/ls" -ld -- "arf"'
/bin/ls: cannot access arf: No such file or directory
thecmd="${BIN_LS} -ld -- \"${thefile}\""
echo -e "\n Running command '${thecmd}'"
$($thecmd)
Running command '/bin/ls -ld -- "arf"'
/bin/ls: cannot access "arf": No such file or directory
echo -e "\n Eval'ing command '${thecmd}'"
eval $thecmd
Eval'ing command '/bin/ls -ld -- "arf"'
/bin/ls: cannot access arf: No such file or directory
$("/bin/ls" -ld -- "${thefile}")
/bin/ls: cannot access arf: No such file or directory
So... this is confusing. A quoted command path is valid everywhere except inside a $() construct? A shorter, more direct example:
$ c="\"/bin/ls\" arf"
$ $($c)
-bash: "/bin/ls": No such file or directory
$ eval $c
/bin/ls: cannot access arf: No such file or directory
$ $("/bin/ls" arf)
/bin/ls: cannot access arf: No such file or directory
$ "/bin/ls" arf
/bin/ls: cannot access arf: No such file or directory
How does one explain the simple $($c) case?
The use of " to quote words is part of your interaction with Bash. When you type
$ "/bin/ls" arf
at the prompt, or in a script, you're telling Bash that the command consists of the words /bin/ls and arf, and the double-quotes are really emphasizing that /bin/ls is a single word.
When you type
$ eval '"/bin/ls" arf'
you're telling Bash that the command consists of the words eval and "/bin/ls" arf. Since the purpose of eval is to pretend that its argument is an actual human-input command, this is equivalent to running
$ "/bin/ls" arf
and the " gets processed just like at the prompt.
Note that this pretense is specific to eval; Bash doesn't usually go out of its way to pretend that something was an actual human-typed command.
When you type
$ c='"/bin/ls" arf'
$ $c
the $c gets substituted, and then undergoes word splitting (see §3.5.7 "Word Splitting" in the Bash Reference Manual), so the words of the command are "/bin/ls" (note the double-quotes!) and arf. Needless to say, this doesn't work. (It's also not very safe, since in addition to word-splitting, $c also undergoes filename-expansion and whatnot. Generally your parameter-expansions should always be in double-quotes, and if they can't be, then you should rewrite your code so they can be. Unquoted parameter-expansions are asking for trouble.)
When you type
$ c='"/bin/ls" arf'
$ $($c)
this is the same as before, except that now you're also trying to use the output of the nonworking command as a new command. Needless to say, that doesn't cause the nonworking command to suddenly work.
As Ignacio Vazquez-Abrams says in his answer, the right solution is to use an array, and handle the quoting properly:
$ c=("/bin/ls" arf)
$ "${c[#]}"
which sets c to an array with two elements, /bin/ls and arf, and uses those two elements as the word of a command.
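If the reason for building the command as a quoted string was to be able to display it before running it (as in the original script), the array form still allows that. A minimal sketch, reusing the thefile variable from the question:
cmd=(/bin/ls -ld -- "$thefile")
printf 'Running:'; printf ' %q' "${cmd[@]}"; printf '\n'   # print each word, safely quoted
"${cmd[@]}"                                                # run it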
With the fact that it doesn't make sense in the first place. Use an array instead.
$ c=("/bin/ls" arf)
$ "${c[#]}"
/bin/ls: cannot access arf: No such file or directory
From the man page for bash, regarding eval:
eval [arg ...]:
The args are read and concatenated together into a single command.
This command is then read and executed by the shell, and its exit
status is returned as the value of eval.
When c is defined as "\"/bin/ls\" arf", the outer quotes will cause the entire thing to be processed as the first argument to eval, which is expected to be a command or program. You need to pass your eval arguments in such a way that the target command and its arguments are listed separately.
The $(...) construct behaves differently than eval because it is not a command that takes arguments. It can process the entire command at once instead of processing arguments one at a time.
A note on your original premise: The main reason that people say that eval is evil was because it is commonly used by scripts to execute a user-provided string as a shell command. While handy at times, this is a major security problem (there's typically no practical way to safety-check the string before executing it). The security problem doesn't apply if you are using eval on hard-coded strings inside your script, as you are doing. However, it's typically easier and cleaner to use $(...) or `...` inside of scripts for command substitution, leaving no real use case left for eval.
Using set -vx helps us understand how bash processes the command string.
As the trace output shows, "command" works because the quotes are stripped during processing. However, when $c (quoted twice) is used, only the outer single quotes are removed. eval can process the string as its argument, and the outer quotes are removed step by step.
It is probably just related to how bash semantically processes the string and quotes.
Bash does have many weird behaviours around quote processing:
Bash inserting quotes into string before execution
How do you stop bash from stripping quotes when running a variable as a command?
Bash stripping quotes - how to preserve quotes