How to extend command stored in sh variable with parameters? - bash

I tried to google the problem and played with different approaches, but I failed to actually execute the command :/ I want to construct a command through a set of conditional checks. Here is what I want to achieve:
if type command &>/dev/null; then
run_script=`command`
else
run_script=`something command`
fi
# while ... do
$params=`-a -b -c` # calculated
$anotherparam = "./test.file" # calculated
# run $run_script+$params+$anotherparam ???
# like we run "command -a -b -c ./test.file" command
# done
Note: it's just an example
How can I do this type of combination? I can't use arrays because I need it to be compatible with sh.

Correct POSIX sh way of dynamically constructing a command call with parameters:
#!/bin/sh
run_script='command'
if command -v "$run_script" >/dev/null 2>&1; then
set -- "$run_script"
else
set -- "something" "$run_script"
fi
# while ... do
# $params=`-a -b -c` # calculated
set -- "$#" -a -b -c # calculated
# $anotherparam = "./test.file" # calculated
set -- "$#" "./test.file" # calculated
# run $run_script+$params+$anotherparam ???
"$#" # run command with its parameters
# like we run "command -a -b -c ./test.file" command
# done
Explanations:
if command -v "$run_script" >/dev/null 2>&1: Tests if the "$run_script" command exists.
set -- "$run_script": Sets the arguments array to the "$run_script" command.
set -- "something" "$run_script": Sets the arguments array to the something command with first argument "$run_script".
set -- "$#" -a -b -c: Sets the arguments array with "$#" current content of the arguments array, and -a, -b and -c as additional arguments.
Stand-alone "$#": Calls the command and arguments contained in the arguments array.

You could use:
if type -p command; then
run_script="command"
else
run_script="something command"
fi
params="-a -b -c"
anotherparam="./test.file"
"$run_script" $params $anotherparam
Changes to OP code (illustrated below):
use type -p cmd instead of type cmd &>/dev/null
use quotes instead of backticks for run_script="command"
remove spaces around the equal sign =
remove $ from left side of the equal sign
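To see why the last two changes matter, here is roughly what sh does with each form (exact error messages vary by shell):
params = "-a -b -c"   # runs a command named "params" with arguments = and "-a -b -c"
$params="-a -b -c"    # expands $params first, so no assignment ever happens
params="-a -b -c"     # correct: no spaces around =, no $ on the left side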

This will raise strong opinions.
If you are completely sure that you will only receive trusted, non-malicious input, eval is part of the standard shell and a perfect tool for this task.
See: https://unix.stackexchange.com/questions/278427/why-and-when-should-eval-use-be-avoided-in-shell-scripts
The arguments are concatenated together into a single command, which is then read and executed, and its exit status returned as the exit status of eval. If there are no arguments or only empty arguments, the return status is zero.
if type command &>/dev/null; then
run_script="command"
else
run_script="something command"
fi
params="-a -b -c" # calculated
anotherparam="./test.file" # calculated
eval "$run_script $params $anotherparam"

Related

OSX Command line: echo command before running it? [duplicate]

In a shell script, how do I echo all shell commands called and expand any variable names?
For example, given the following line:
ls $DIRNAME
I would like the script to run the command and display the following
ls /full/path/to/some/dir
The purpose is to save a log of all shell commands called and their arguments. Is there perhaps a better way of generating such a log?
set -x or set -o xtrace expands variables and prints a little + sign before the line.
set -v or set -o verbose does not expand the variables before printing.
Use set +x and set +v to turn off the above settings.
On the first line of the script, one can put #!/bin/sh -x (or -v) to have the same effect as set -x (or -v) later in the script.
The above also works with /bin/sh.
See the bash-hackers' wiki on set attributes, and on debugging.
$ cat shl
#!/bin/bash
DIR=/tmp/so
ls $DIR
$ bash -x shl
+ DIR=/tmp/so
+ ls /tmp/so
$
set -x will give you what you want.
Here is an example shell script to demonstrate:
#!/bin/bash
set -x #echo on
ls $PWD
This expands all variables and prints the full command before the command's output.
Output:
+ ls /home/user/
file1.txt file2.txt
I use a function to echo and run the command:
#!/bin/bash
# Function to display commands
exe() { echo "\$ $#" ; "$#" ; }
exe echo hello world
Which outputs
$ echo hello world
hello world
For more complicated commands involving pipes, etc., you can use eval:
#!/bin/bash
# Function to display commands
exe() { echo "\$ ${#/eval/}" ; "$#" ; }
exe eval "echo 'Hello, World!' | cut -d ' ' -f1"
Which outputs
$ echo 'Hello, World!' | cut -d ' ' -f1
Hello
You can also toggle this for select lines in your script by wrapping them in set -x and set +x, for example,
#!/bin/bash
...
if [[ ! -e $OUT_FILE ]];
then
echo "grabbing $URL"
set -x
curl --fail --noproxy $SERV -s -S $URL -o $OUT_FILE
set +x
fi
shuckc's answer for echoing select lines has a few downsides: you end up with the following set +x command being echoed as well, and you lose the ability to test the exit code with $? since it gets overwritten by the set +x.
Another option is to run the command in a subshell:
echo "getting URL..."
( set -x ; curl -s --fail $URL -o $OUTFILE )
if [ $? -ne 0 ] ; then
echo "curl failed"
exit 1
fi
which will give you output like:
getting URL...
+ curl -s --fail http://example.com/missing -o /tmp/example
curl failed
This does incur the overhead of creating a new subshell for the command, though.
According to TLDP's Bash Guide for Beginners: Chapter 2. Writing and debugging scripts:
2.3.1. Debugging on the entire script
$ bash -x script1.sh
...
There is now a full-fledged debugger for Bash, available at SourceForge. These debugging features are available in most modern versions of Bash, starting from 3.x.
2.3.2. Debugging on part(s) of the script
set -x # Activate debugging from here
w
set +x # Stop debugging from here
...
Table 2-1. Overview of set debugging options
Short | Long notation | Result
-------+---------------+--------------------------------------------------------------
set -f | set -o noglob | Disable file name generation using metacharacters (globbing).
set -v | set -o verbose| Prints shell input lines as they are read.
set -x | set -o xtrace | Print command traces before executing command.
...
Alternatively, these modes can be specified in the script itself, by
adding the desired options to the first line shell declaration.
Options can be combined, as is usually the case with UNIX commands:
#!/bin/bash -xv
Another option is to put "-x" at the top of your script instead of on the command line:
$ cat ./server
#!/bin/bash -x
ssh user@server
$ ./server
+ ssh user@server
user@server's password: ^C
$
You can execute a Bash script in debug mode with the -x option.
This will echo all the commands.
bash -x example_script.sh
# Console output
+ cd /home/user
+ mv text.txt mytext.txt
You can also save the -x option in the script. Just specify the -x option in the shebang.
######## example_script.sh ###################
#!/bin/bash -x
cd /home/user
mv text.txt mytext.txt
##############################################
./example_script.sh
# Console output
+ cd /home/user
+ mv text.txt mytext.txt
Type "bash -x" on the command line before the name of the Bash script. For instance, to execute foo.sh, type:
bash -x foo.sh
Combining all the answers, I found this to be the best and simplest:
#!/bin/bash
# https://stackoverflow.com/a/64644990/8608146
exe(){
set -x
"$#"
{ set +x; } 2>/dev/null
}
# example
exe go generate ./...
The { set +x; } 2>/dev/null trick is from https://stackoverflow.com/a/19226038/8608146.
If the exit status of the command is needed, use
{ STATUS=$?; set +x; } 2>/dev/null
And use $STATUS later, e.g. exit $STATUS at the end.
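Putting those pieces together, a sketch that both hides the set +x line and reports the command's exit status:
#!/bin/bash
exe(){
set -x
"$@"
{ STATUS=$?; set +x; } 2>/dev/null
return $STATUS
}
exe false
echo "exit status: $?"   # prints: exit status: 1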
A slightly more useful version:
#!/bin/bash
# https://stackoverflow.com/a/64644990/8608146
_exe(){
[ "$1" == on ] && { set -x; return; } 2>/dev/null
[ "$1" == off ] && { set +x; return; } 2>/dev/null
echo + "$@"
"$@"
}
exe(){
{ _exe "$#"; } 2>/dev/null
}
# examples
exe on # turn on same as set -x
echo This command prints with +
echo This too prints with +
exe off # same as set +x
echo This does not
# can also be used for individual commands
exe echo what up!
For zsh, use
setopt VERBOSE
And for debugging,
setopt XTRACE
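For example, a minimal zsh sketch:
#!/bin/zsh
setopt XTRACE     # like set -x in bash
echo hello        # printed with a leading + trace prefix
unsetopt XTRACE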
To allow compound commands to be echoed, I use eval plus Soth's exe function to echo and run the command. This is useful for piped commands that would otherwise show nothing, or only the initial part of the piped command.
Without eval:
exe() { echo "\$ $@" ; "$@" ; }
exe ls -F | grep *.txt
Outputs:
$
file.txt
With eval:
exe() { echo "\$ $@" ; "$@" ; }
exe eval 'ls -F | grep *.txt'
Which outputs
$ eval ls -F | grep *.txt
file.txt
For csh and tcsh, you can set verbose or set echo (or you can even set both, but it may result in some duplication most of the time).
The verbose option prints pretty much the exact shell expression that you type.
The echo option is more indicative of what will be executed through spawning.
http://www.tcsh.org/tcsh.html/Special_shell_variables.html#verbose
http://www.tcsh.org/tcsh.html/Special_shell_variables.html#echo
Special shell variables
verbose
If set, causes the words of each command to be printed, after history substitution (if any). Set by the -v command line option.
echo
If set, each command with its arguments is echoed just before it is executed. For non-builtin commands all expansions occur before echoing. Builtin commands are echoed before command and filename substitution, because these substitutions are then done selectively. Set by the -x command line option.
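A minimal tcsh sketch using both variables:
#!/bin/tcsh -f
set verbose   # print input lines as they are read
set echo      # echo each command just before it is executed
ls $HOME
unset echo
unset verbose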
$ cat exampleScript.sh
#!/bin/bash
name="karthik";
echo $name;
bash -x exampleScript.sh
Output is as follows:
+ name=karthik
+ echo karthik
karthik

Pass all args to a command called in a new shell using bash -c

I've simplified my example to the following:
file1.sh:
#!/bin/bash
bash -c "./file2.sh $#"
file2.sh:
#!/bin/bash
echo "first $1"
echo "second $2"
I expect that if I call ./file1.sh a b to get:
first a
second b
but instead I get:
first a
second
In other words, my later arguments after the first one are not getting passed through to the command that I'm executing inside a new bash shell. I've tried many variations of removing and moving around the quotation marks in the file1.sh file, but haven't got this to work.
Why is this happening, and how do I get the behavior I want?
(UPDATE - I realize it seems pointless that I'm calling bash -c in this example, my actual file1.sh is a proxy script for a command that gets called locally to run in a docker container so it's actually docker exec -i mycontainer bash -c '')
Change file1.sh to this with different quoting:
#!/bin/bash
bash -c './file2.sh "$@"' - "$@"
The hyphen populates $0, and "$@" is passed in to populate all the other positional parameters of the bash -c command line.
You can also make it:
bash -c './file2.sh "$@"' "$0" "$@"
However there is no real need to use bash -c here and you can just use:
./file2.sh "$#"

Bash: redirect to screen or /dev/null depending on flag

I'm trying to come up with a script that takes a silent flag in bash so that all output is directed to /dev/null if the flag is present and to the screen if it is not.
An MWE of my script would be:
#!/bin/bash
# Check if silent flag is on.
if [ $2 = "-s" ]; then
echo "Silent mode."
# Non-working line.
out_var = "to screen"
else
echo $1
# Non-working line.
out_var = "/dev/null"
fi
command1 > out_var
command2 > out_var
echo "End."
I call the script with two variables, the first one is irrelevant and the second one ($2) is the actual silent flag (-s):
./myscript.sh first_variable -s
Obviously the out_var lines don't work, but they give an idea of what I want: a way to direct the output of command1 and command2 to either the screen or to /dev/null depending on -s being present or not.
How could I do this?
You can use the naked exec command to redirect the current program without starting a new one.
Hence, a -s flag could be processed with something like:
if [[ "$1" == "-s" ]] ; then
exec >/dev/null 2>&1
fi
The following complete script shows how to do it:
#!/bin/bash
echo XYZZY
if [[ "$1" == "-s" ]] ; then
exec >/dev/null 2>&1
fi
echo PLUGH
If you run it with -s, you get XYZZY but no PLUGH output (well, technically, you do get PLUGH output but it's sent to the /dev/null bit bucket).
If you run it without -s, you get both lines.
The before and after echo statements show that exec is acting as described, simply changing redirection for the current program rather than attempting to re-execute it.
As an aside, I've assumed you meant "to screen" to be "to the current standard output", which may or may not be the actual terminal device (for example if it's already been redirected to somewhere else). If you do want the actual terminal device, it can still be done (using /dev/tty for example) but that would be an unusual requirement.
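If only part of the script should be silenced, a sketch that saves the current standard output on a spare file descriptor and restores it afterwards:
exec 3>&1         # save the current stdout on fd 3
exec >/dev/null   # silence stdout from here on
echo "hidden"
exec 1>&3 3>&-    # restore stdout and close the spare descriptor
echo "visible again"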
There are lots of things that could be wrong with your script; I won't attempt to guess since you didn't post any actual output or errors.
However, there are a couple of things that can help:
You need to figure out where your output is really going. Standard output and standard error are two different things, and redirecting one doesn't necessarily redirect the other.
In Bash, you can send output to /dev/stdout or /dev/stderr, so you might want to try something like:
# Send standard output to the tty/pty, or wherever stdout is currently going.
cmd > /dev/stdout
# Do the same thing, but with standard error instead.
cmd > /dev/stderr
Redirect standard error to standard output, and then send standard output to /dev/null. Order matters here.
cmd 2>&1 > /dev/null
There may be other problems with your script, too, but for issues with Bash shell redirections the GNU Bash manual is the canonical source of information. Hope it helps!
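A short illustration of that ordering rule, using cmd as a placeholder:
cmd > /dev/null 2>&1   # stdout goes to /dev/null first, then stderr follows it: both silenced
cmd 2>&1 > /dev/null   # stderr joins the original stdout first, so errors still reach the screen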
If you don't want to redirect all output from your script, you can use eval. For example:
$ fd=1
$ eval "echo hi >$a" >/dev/null
$ fd=2
$ eval "echo hi >$a" >/dev/null
hi
Make sure you use double quotes so that the variable is replaced before eval evaluates it.
In your case, you just needed to change out_var = "to screen" to out_var="/dev/tty" (no spaces around the =), and use it like this: command1 > $out_var (note the $ you were missing).
I implemented it like this
# Set debug flag as desired
DEBUG=1
# DEBUG=0
if [ "$DEBUG" -eq "1" ]; then
OUT='/dev/tty'
else
OUT='/dev/null'
fi
# actual script use commands like this
command > $OUT 2>&1
# or like this if you need
command 2> $OUT
Of course you can also set the debug mode from a cli option, see How do I parse command line arguments in Bash?
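For instance, a sketch that derives OUT from a -s flag via getopts (the flag name and defaults are illustrative):
#!/bin/bash
OUT='/dev/tty'
while getopts s opt; do
case $opt in
s) OUT='/dev/null' ;;   # silent mode requested
esac
done
shift $((OPTIND - 1))
command1 > "$OUT" 2>&1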
And you can have multiple debug or verbose levels like this
# Set VERBOSE level as desired
# VERBOSE=0
VERBOSE=1
# VERBOSE=2
VERBOSE1='/dev/null'
VERBOSE2='/dev/null'
if [ "$VERBOSE" -gte 1 ]; then
VERBOSE1='/dev/tty'
fi
if [ "$VERBOSE" -gte 2 ]; then
VERBOSE2='/dev/tty'
fi
# actual script use commands like this
command > $VERBOSE1 2>&1
# or like this if you need
command 2> $VERBOSE2

how to silently disable xtrace in a shell script?

I'm writing a shell script that loops over some values and runs a long command line for each value. I'd like to print out these commands along the way, just like make does when running a makefile. I know I could just "echo" all the commands before running them, but it feels inelegant. So I'm looking at set -x and similar mechanisms instead:
#!/bin/sh
for value in a long list of values
do
set -x
touch $value # imagine a complicated invocation here
set +x
done
My problem is: at each iteration, not only is the interesting line printed out, but the set +x line is printed as well. Is it somehow possible to prevent that? If not, what workaround do you recommend?
PS: the MWE above uses sh, but I also have bash and zsh installed in case that helps.
Sandbox it in a subshell:
(set -x; do_thing_you_want_traced)
Of course, changes to variables or the environment made in that subshell will be lost.
If you REALLY care about this, you could also use a DEBUG trap (using set -T to cause it to be inherited by functions) to implement your own set -x equivalent.
For instance, if using bash:
trap_fn() {
[[ $DEBUG && $BASH_COMMAND != "unset DEBUG" ]] && \
printf "[%s:%s] %s\n" "$BASH_SOURCE" "$LINENO" "$BASH_COMMAND"
return 0 # do not block execution in extdebug mode
}
trap trap_fn DEBUG
DEBUG=1
# ...do something you want traced...
unset DEBUG
That said, emitting BASH_COMMAND (as a DEBUG trap can do) is not fully equivalent to set -x; for instance, it does not show post-expansion values.
You want to try using a single-line xtrace:
function xtrace() {
# Print the line as if xtrace was turned on, using perl to filter out
# the extra colon character and the following "set +x" line.
(
set -x
# Colon is a no-op in bash, so nothing will execute.
: "$#"
set +x
) 2>&1 | perl -ne 's/^[+] :/+/ and print' 1>&2
# Execute the original line unmolested
"$#"
}
The original command executes in the same shell under an identity transformation. Just prior to running it, you get a non-recursive xtrace of the arguments. This lets you xtrace the commands you care about without spamming stderr with duplicate copies of every echo command.
# Example
for value in $long_list; do
computed_value=$(echo "$value" | sed 's/.../...')
xtrace some_command -x -y -z $value $computed_value ...
done
The following command disables the xtrace option:
$ set +o xtrace
I thought of
set -x >/dev/null 2>&1; echo 1; echo 2; set +x >/dev/null 2>&1
but got
+ echo 1
1
+ echo 2
2
+ 1> /dev/null 2>& 1
I'm surprised by these results. But
set -x ; echo 1; echo 2; set +x
+ echo 1
1
+ echo 2
2
looks to meet your requirement.
I saw similar results when I put each statement on its own line (except for the set +x).
IHTH.

What is the purpose of 'set -- $args' after getopt?

The usual example for using getopt in bash is as follows:
args=`getopt abo: $*`
errcode=$?
set -- $args
What does that last line achieve?
Essentially, it is there to break a single argument holding multiple flags into multiple arguments, each with a single flag:
Whether you call your script as
script -ab
or as
script -a -b
after the set -- $args, $1 will be -a and $2 will be -b. It makes processing easier.
BTW, getopts is much better (sketched below).
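For comparison, a minimal getopts sketch for the same abo: option spec; it needs no separate set -- step and handles clustered flags like -ab natively:
#!/bin/bash
while getopts abo: opt; do
case $opt in
a) echo "got -a" ;;
b) echo "got -b" ;;
o) echo "got -o with argument $OPTARG" ;;
esac
done
shift $((OPTIND - 1))
echo "remaining arguments: $*"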
set updates the positional parameters of the script.
#! /bin/bash
echo "$*"
set -- $1 baz
echo "$*"
If this script is invoked with /path/to/script foo bar, the output is:
foo bar
foo baz
