Got a little bash script like so:
#!/bin/bash
TIME_CMD='/usr/bin/time -f "%E execution time"'
${TIME_CMD} ls
Only problem: doesn't work:
/usr/bin/time: cannot run execution: No such file or directory
Command exited with non-zero status 127
"0:00.00
What am I doing wrong?
Try making it...
#!/bin/bash
TIME_CMD='/usr/bin/time -f "%E execution time"'
eval "$TIME_CMD ls"
This will utilize bash to re-parse the command string after it has been constructed, so that the quoted argument will be recognized properly.
Storing commands in variables is generally a bad idea (see BashFAQ #050 for details). The reason it's not working as you expect is that quoting inside variable values is ignored (unless you run it through something like eval, which then tends to lead to other parsing oddities).
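You can see what the shell actually passes by printing each word of the unquoted expansion (the embedded double quotes are just ordinary characters at this point):
TIME_CMD='/usr/bin/time -f "%E execution time"'
printf '<%s>\n' $TIME_CMD   # one line per word, exactly as /usr/bin/time would receive them
This prints </usr/bin/time>, <-f>, <"%E>, <execution> and <time">, so time takes "%E as its format string and then tries to run a command named execution, which is exactly the error shown above.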
In your case, I see three fairly straightforward ways to do it. First, you can use an alias instead of a variable:
shopt -s expand_aliases   # aliases are not expanded in scripts unless this option is set
alias TIME_CMD='/usr/bin/time -f "%E execution time"'
TIME_CMD ls
Second, you can use a function:
TIME_CMD() { /usr/bin/time -f "%E execution time" "$@"; }
TIME_CMD ls
Third, you can use an array rather than a simple variable:
TIME_CMD=(/usr/bin/time -f "%E execution time")
"${TIME_CMD[#]}" ls
Note that with an array, you need to expand it with the "${array[#]}" idiom to preserve word breaks properly.
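As a quick check, printing each array element shows that the format string stays together as a single word:
TIME_CMD=(/usr/bin/time -f "%E execution time")
printf '<%s>\n' "${TIME_CMD[@]}"   # prints </usr/bin/time>, <-f>, <%E execution time>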
I wonder why it doesn't work.
Please advise me.
1. working
$ nu=`awk '/^Mem/ {printf($2*0.7);}' <(free -m)`
$ echo $nu
1291.5
2. not working
$ cat test.sh
#!/bin/bash
nu=`awk '/^Mem/ {printf($2*0.7);}' <(free -m)`
echo $nu
$ sh test.sh
test.sh: command substitution: line 2: syntax error near unexpected token `('
test.sh: command substitution: line 2: `awk '/^Mem/ {printf($2*0.7);}' <(free -m)'
Could you please try the following.
nu=$(free -m | awk '/^Mem/ {print $2*0.7}')
echo "$nu"
Things taken care of here:
The backtick form of command substitution is deprecated, so use $(...) to capture the command's output in the variable.
Run free first and pipe its standard output into awk with | (the more natural way to feed one command's output to awk in this scenario), then save awk's output in the variable named nu.
Finally, print the variable nu with echo.
Since <(...) process substitution is supported by bash but not by sh, this solution avoids process substitution altogether (as mentioned above).
The <( ) construct ("process substitution") is not available in all shells, or even in bash when it's invoked with the name "sh". When you run the script with sh test.sh, that overrides the shebang (which specifies bash), so that feature is not available. You need to either run the script explicitly with bash, or (better) just run it as ./test.sh and let the shebang line do its job.
The point of adding a shebang to a script is to define the interpreter directive used when the file is executed directly (which requires execute permission).
Then, you should invoke it by, for example
$ ./test.sh
once you have set the permission
$ chmod +x test.sh
#!/bin/bash
# 1st part
ret=$(ps aux | grep -v grep) # thats OK
echo $ret
# 2nd part
cmd="ps aux | grep -v grep" # a problem with the pipe |
ret=$($cmd)
echo $ret
How can I use a command string as in the 2nd part? I think the pipe is the problem. I tried to escape it, but it did not help; I just get a syntax error from ps.
Thanks!
You need eval:
ret=$(eval "$cmd")
Using eval is not recommended here. It can lead to unexpected results, especially when variables can be read from untrusted sources (see BashFAQ/048 - Eval command and security issues).
You can solve this in a simple way by defining and calling a function, as below:
ps_cmd() {
    ps aux | grep -v grep
}
and use it in the script as
output="$(ps_cmd)"
echo "$output"
Also, it is worth reading about why storing commands in a variable is not a good idea and has a lot of potential pitfalls: BashFAQ/050 - I'm trying to put a command in a variable, but the complex cases always fail!
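If the command genuinely has to be assembled at run time, one pattern BashFAQ/050 discusses is to keep the words of a simple command in an array; note that the pipe itself still has to appear literally where the command is run, since it cannot be stored inside the array (a minimal sketch using the same ps/grep pipeline):
cmd=(ps aux)                      # one word per array element, no quoting surprises
ret=$("${cmd[@]}" | grep -v grep)
echo "$ret"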
I want to make a bash script which invokes some simple commands in the following way
./myscript magicword
would invoke
#!/bin/bash
cat * | grep "$@"
However, I want to add one more level of complexity, for example toggling case sensitivity. Thus I want to include the parsing of arguments such as
./myscript --case-insensitive magicword
which should be interpreted by the script as grep -i "$@".
I have seen many tutorials about bash scripting with arguments and case structures, but those mostly deal with executing entirely different commands per case, or with assigning variables (not with actually using them as command options).
I didn't find a way to write something like grep $1 "$@", where $1 ought to be null (empty) when no option is given, or anything similar.
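One way to get that effect (a minimal sketch, not the only approach) is to collect the grep options in a bash array while parsing the script's own flags; an empty array simply expands to nothing:
#!/bin/bash
# myscript: optional --case-insensitive flag, remaining arguments go to grep
grep_opts=()
if [ "$1" = "--case-insensitive" ]; then
    grep_opts+=(-i)
    shift
fi
cat * | grep "${grep_opts[@]}" -- "$@"
With this, ./myscript magicword runs plain grep, while ./myscript --case-insensitive magicword adds -i.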
I want to inject a transparent wrapping command around each shell command in a makefile, something like the time shell command. (However, not the time command; this is a completely different command.)
Is there a way to specify some sort of wrapper or decorator for each shell command that gmake will issue?
Kind of. You can tell make to use a different shell.
SHELL = myshell
where myshell is a wrapper like
#!/bin/sh
# make invokes this wrapper as: myshell -c 'command', so "$@" is -c plus the command string
time /bin/sh "$@"
However, the usual way to do that is to prefix a variable to all command calls. While I can't see any show-stopper for the SHELL approach, the prefix approach has the advantage that it's more flexible (you can specify different prefixes for different commands, and override prefix values on the command line), and could be visibly faster.
# Set Q=@ to not display command names
TIME = time
foo:
	$(Q)$(TIME) foo_compiler
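Overriding the variables on the command line then works as usual; for example, the following run suppresses the echo of each command while still timing it:
make Q=@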
And here's a complete, working example of a shell wrapper:
#!/bin/bash
RESULTZ=/home/rbroger1/repos/knl/results
if [ "$1" == "-c" ] ; then
shift
fi
strace -f -o `mktemp $RESULTZ/result_XXXXXXX` -e trace=open,stat64,execve,exit_group,chdir /bin/sh -c "$@" | awk '{if (match($0, "Process PID=[0-9]+ runs in (64|32) bit") == 0) {print $0}}'
# EOF
I don't think there is a way to do what you want within GNU Make itself.
I have done things like modify the PATH env variable in the Makefile so that a directory containing my script, symlinked under the names of all the bins I wanted wrapped, was searched before the actual bins. The script would then look at how it was called and exec the actual bin under the wrapping command,
i.e. exec time "$0" "$@"
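A minimal sketch of that kind of shim (the directory layout and tool names are only placeholders): every symlink in the wrapper directory points at one script, which drops its own directory from the front of PATH and re-runs the real tool under time:
#!/bin/sh
# wrap.sh: symlinked as gcc, ld, etc.; $0 tells us which tool was requested
tool=$(basename "$0")
PATH=${PATH#*:}          # drop the wrapper directory, assumed to be first in PATH
exec time "$tool" "$@"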
These days I usually just update the targets in the Makefile itself. Keeping all your modifications to one file is usually better IMO than managing a directory of links.
Update
I defer to Gilles's answer. It's a better answer than mine.
The program that GNU make(1) uses to run commands is specified by the SHELL make variable. It will run each command as
$SHELL -c <command>
You cannot get make to omit the -c, since that is required for most shells. -c is passed as the first argument ($1), and <command> is passed as a single string in the second argument ($2).
You can write your own shell wrapper that prepends the command that you want, taking into account the -c:
#!/bin/sh
eval time "$2"
That will cause time to be run in front of each command. You need eval since $2 will often not be a single command and can contain all sorts of shell metacharacters that need to be expanded or processed.
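To hook that up, save the wrapper as, say, timewrap.sh (the name here is only an illustration), make it executable with chmod +x timewrap.sh, and point make at it in the Makefile:
SHELL = ./timewrap.sh
Every recipe line is then run as ./timewrap.sh -c <command>, i.e. under time.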