I'm running a bash script using
wget -O - https://myserver/install/Setup.sh | bash
How can I pass a parameter to the above script? Something like:
wget -O - https://myserver/install/Setup.sh parameter1 | bash
You can also run your script with:
wget -qO - 'https://myserver/install/Setup.sh' | bash -s parameter1
See: man bash OPTIONS -s
-s If the -s option is present, or if no arguments remain
after option processing, then commands are read from the
standard input. This option allows the positional
parameters to be set when invoking an interactive shell or
when reading input through a pipe.
Or, alternatively, use the -c option:
bash -c "$(wget -qO - 'https://myserver/install/Setup.sh')" '' parameter1
The '' sets the parameter $0 to the empty string. In a normal file-based script invocation, $0 contains the name of the script itself.
See: man bash OPTIONS -c
-c If the -c option is present, then commands are read from
the first non-option argument command_string. If there are
arguments after the command_string, the first argument is
assigned to $0 and any remaining arguments are assigned to
the positional parameters. The assignment to $0 sets the
name of the shell, which is used in warning and error
messages.
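For a quick sanity check of what the script actually receives, here is a hypothetical stand-in for Setup.sh (the body below is invented for illustration, not the real installer):
#!/usr/bin/env bash
# hypothetical stand-in for Setup.sh: just report how we were invoked
echo "\$0 is: $0"
echo "\$1 is: ${1:-<not set>}"
echo "argument count: $#"
Piped through bash -s parameter1 it reports bash for $0 and parameter1 for $1; with the bash -c "$( ... )" '' parameter1 form, $0 is the empty string instead.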
The standard format for the bash (or sh or similar) command is bash scriptfilename arg1 arg2 .... If you leave off the first argument (the name or path of the script to run), it reads the script from stdin. Unfortunately, there's no way to leave off the first argument but still pass the others. Fortunately, you can pass /dev/stdin as the first argument and get the same effect (at least on most unix systems):
wget -O - https://myserver/install/Setup.sh | bash /dev/stdin parameter1
If you're on a system that doesn't have /dev/stdin, you might have to look around for an alternative way to specify stdin explicitly (/dev/fd/0 or something like that).
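A quick way to convince yourself this works without touching the real server is to pipe a throwaway one-liner in place of Setup.sh:
echo 'echo "got parameter: $1"' | bash /dev/stdin parameter1
# prints: got parameter: parameter1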
Edit: Léa Gris's suggestion of bash -s arg1 arg2 ... is probably a better way to do this.
Related
Take the following example:
ls -l | grep -i readme | ./myscript.sh
What I am trying to do is get ls -l | grep -i readme as a string variable in myscript.sh. So essentially I am trying to get the whole command before the last pipe to use inside myscript.sh.
Is this possible?
No, it's not possible.
At the OS level, pipelines are implemented with the pipe(), dup2(), fork() and execve() syscalls. None of that provides a way to tell a program what the commands connected to its stdin are. Indeed, there's no guarantee that any string representing a pipeline of programs feeding stdin exists at all, even if your stdin really is a pipe connected to another program's stdout; the pipeline could have been set up by programs calling those syscalls directly rather than by a shell parsing a command line.
The best available workaround is to invert your process flow.
It's not what you asked for, but it's what you can get.
#!/usr/bin/env bash
printf -v cmd_str '%q ' "$@"   # generate a shell-quoted string representing our arguments
while IFS= read -r line; do
    printf 'Output from %s: %s\n' "$cmd_str" "$line"
done < <("$@")                 # actually run those arguments as a command, and read from its output
...and then have your script start the things it reads input from, rather than receiving them on stdin.
...thereafter, ./yourscript ls -l, or ./yourscript sh -c 'ls -l | grep -i readme'. (Of course, never use this except as an example; see ParsingLs).
It can't be done generally, but using the history command in bash it can maybe sort of be done, provided certain conditions are met:
history has to be turned on.
Only one shell has been running and accepting new commands (or, failing that, running myscript.sh) since myscript.sh started.
Since command lines with leading spaces are, by default, not saved to the history, the invoking command for myscript.sh must have no leading spaces; or that default must be changed -- see Get bash history to remember only the commands run with space prefixed.
The invoking command needs to end with a &, because without it the new command line wouldn't be added to the history until after myscript.sh was completed.
The script needs to be a bash script (it won't work with /bin/dash), and the calling shell needs a little prep work. Sometime before the script is run, first do:
shopt -s histappend
PROMPT_COMMAND="history -a; history -n"
...this makes the bash history heritable. (Code swiped from unutbu's answer to a related question.)
Then myscript.sh might go:
#!/bin/bash
history -w
printf 'calling command was: %s\n' \
    "$(grep "$0" ~/.bash_history | tail -1)"
Test run:
echo googa | ./myscript.sh &
Output (minus the job-control cruft associated with "&"):
calling command was: echo googa | ./myscript.sh &
The cruft can be halved by changing "&" to "& fg", but the resulting output won't include the "fg" suffix.
I think you should pass it as one string parameter, like this:
./myscript.sh "$(ls -l | grep -i readme)"
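If you go that route, a minimal sketch of what myscript.sh might look like (the variable name and output format here are just illustrative):
#!/usr/bin/env bash
# hypothetical myscript.sh: the output of the earlier pipeline arrives as a single argument
listing=$1
printf 'received %s line(s):\n' "$(printf '%s\n' "$listing" | wc -l)"
printf '%s\n' "$listing"
Note that the entire ls -l | grep output is one argument, so it must stay quoted as "$1" inside the script.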
I think it is possible; have a look at this example (it captures whatever comes through the pipe, i.e. the output of the earlier commands):
#!/bin/bash
result=""
while IFS= read -r line; do
    result+="${line}"
done
echo "$result"
Now run this script using a pipe, for example:
ls -l /etc | ./script.sh
I hope that will be helpful for you :)
I'm trying to understand the -c option for bash better. The man page says:
-c: If the -c option is present, then commands are read from the first non-option argument command_string. If there are arguments after the command_string, they are assigned to the positional parameters, starting with $0.
I'm having trouble understanding what this means.
If I do the following command with and without bash -c, I get the same result (example from http://www.tldp.org/LDP/abs/html/abs-guide.html):
$ set w x y z; IFS=":-;"; echo "$*"
w:x:y:z
$ bash -c 'set w x y z; IFS=":-;"; echo "$*"'
w:x:y:z
bash -c isn't as interesting when you're already running bash. Consider, on the other hand, the case when you want to run bash code from a Python script:
#!/usr/bin/env python
import subprocess
fileOne = 'hello'
fileTwo = 'world'
p = subprocess.Popen(['bash', '-c', 'diff <(sort "$1") <(sort "$2")',
                      '_',      # this is $0 inside the bash script above
                      fileOne,  # this is $1
                      fileTwo,  # and this is $2
                      ])
p.communicate()  # wait for that bash interpreter; diff's output goes straight to our stdout/stderr
Here, because we're using bash-only syntax (<(...)), this couldn't be run by anything that uses POSIX sh by default, which is what subprocess.Popen(..., shell=True) does; using bash -c thus provides access to capabilities that wouldn't otherwise be available without setting up FIFOs yourself.
Incidentally, this isn't the only way to do it: one could also use bash -s and pass code in on stdin. Below, that's being done not from Python but from POSIX sh (/bin/sh, which likewise is not guaranteed to have <(...) available):
#!/bin/sh
# ...this is POSIX sh code, not bash code; you can't use <() here
# ...so, if we want to do that, one way is as follows:
fileOne=hello
fileTwo=world
bash -s "$fileOne" "$fileTwo" <<'EOF'
# the inside of this heredoc is bash code, not POSIX sh code
diff <(sort "$1") <(sort "$2")
EOF
The -c option is most useful when bash is launched by another program, especially when the code to be executed may include redirections, pipelines, shell built-ins, shell variable assignments, and/or non-trivial lists. On systems where /bin/sh is bash, it is also what backs the C library's system() function, which runs /bin/sh -c <command>.
Equivalent behavior is much trickier to implement on top of fork / exec without using -c, though not altogether impossible.
How do you execute bash code from outside a bash shell?
The answer is the -c option, which makes bash execute whatever has been passed as the argument to -c.
So, yes, that is the purpose of this option: to execute arbitrary bash code, supplied as a string rather than as a script file.
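As a quick illustration of how arguments after the command string land in $0 and the positional parameters (the names here are arbitrary):
bash -c 'echo "\$0=$0 \$1=$1 \$2=$2"' zero one two
# prints: $0=zero $1=one $2=two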
I am looking for the quoting/splitting rules for a command passed to script -c command. The man page just says:
-c, --command command: Run the command rather than an interactive shell.
but I want to make sure "command" is properly escaped.
The COMMAND argument is just a regular string that is processed by the shell as if it were the contents of a script file. We may think of -c COMMAND as being functionally equivalent to:
printf '%s' COMMAND > /tmp/command_to_execute.sh
sh /tmp/command_to_execute.sh
The form -c COMMAND is however superior to the version relying on an auxiliary file, because it avoids the race conditions related to using an auxiliary file.
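To see the correspondence, and the one visible difference (what ends up in $0), compare the two forms directly:
sh -c 'echo "\$0=$0 \$1=$1"' myname hello
# $0=myname $1=hello
printf '%s' 'echo "\$0=$0 \$1=$1"' > /tmp/command_to_execute.sh
sh /tmp/command_to_execute.sh hello
# $0=/tmp/command_to_execute.sh $1=hello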
In the typical usage of the -c COMMAND option we pass COMMAND as a single-quoted string, as in this pseudo-code example:
sh -c '
do_some_complicated_tests "$1" "$2";
if something; then
proceed_this_way "$1" "$2";
else
proceed_that_way "$1" "$2";
fi' ARGV0 ARGV1 ARGV2
If the COMMAND must itself contain single-quoted strings, we can rely on printf to build the COMMAND string, but this can be tedious. The technique is illustrated
by the overcomplicated grep-like COMMAND defined here:
% AWKSCRIPT='$0 ~ expr {print($0)}'
% COMMAND=$(printf 'awk -v expr="$1" \047%s\047' "$AWKSCRIPT")
% sh -c "$COMMAND" print_matching 'tuning' < /usr/share/games/fortune/freebsd-tips
"man tuning" gives some tips how to tune performance of your FreeBSD system.
Recall that \047 is the octal representation of the ASCII code for the single-quote character.
As a side note, these constructions are quite common in Makefiles, where they can replace shell functions.
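To illustrate the pattern outside of any particular Makefile, a minimal sketch in plain shell (the CHECK snippet and the file names are invented for the example):
CHECK='test -e "$1" || { echo "missing: $1" >&2; exit 1; }'
sh -c "$CHECK" check_file input.txt
sh -c "$CHECK" check_file config.yml
Each invocation gets its own $1, so the stored command string behaves much like a small function.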
The simple script below does not work when, rather than passing a single file name, I want to pass multiple files via expansion characters like *:
#!/bin/bash
fgrep -c '$$$$' $1
If I give the command script.sh file.in the script works. If I give the command script.sh *.in it doesn't.
Use "$#" to pass multiple file names to fgrep. $1 only passes the very first file name.
fgrep -c '$$$$' "$#"
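Putting it together, the corrected script and a sample invocation (the .in files are just whatever your glob matches):
#!/bin/bash
# count occurrences of the literal string $$$$ in every file named on the command line
fgrep -c '$$$$' "$@"
./script.sh *.in    # the shell expands *.in, and every resulting name reaches fgrep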
I want to inject a transparent wrapper command around each shell command in a makefile. Something like the time shell command. (However, not the time command; this is a completely different command.)
Is there a way to specify some sort of wrapper or decorator for each shell command that gmake will issue?
Kind of. You can tell make to use a different shell.
SHELL = myshell
where myshell is a wrapper like
#!/bin/sh
time /bin/sh "$@"
However, the usual way to do that is to prefix a variable to all command calls. While I can't see any show-stopper for the SHELL approach, the prefix approach has the advantage that it's more flexible (you can specify different prefixes for different commands, and override prefix values on the command line), and could be visibly faster.
# Set Q=@ to not echo the command names
TIME = time
foo:
	$(Q)$(TIME) foo_compiler
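Because TIME is an ordinary make variable, the wrapper can be swapped or disabled per invocation from the command line, for example (assuming /usr/bin/time is GNU time):
make                           # runs: time foo_compiler
make TIME=                     # runs: foo_compiler, no wrapper
make TIME='/usr/bin/time -v'   # alternative wrapper with verbose resource stats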
And here's a complete, working example of a shell wrapper:
#!/bin/bash
RESULTZ=/home/rbroger1/repos/knl/results
if [ "$1" == "-c" ] ; then
shift
fi
strace -f -o "$(mktemp "$RESULTZ/result_XXXXXXX")" -e trace=open,stat64,execve,exit_group,chdir /bin/sh -c "$@" | awk '!/Process PID=[0-9]+ runs in (64|32) bit/'
# EOF
I don't think there is a way to do what you want within GNU Make itself.
I have done things like modify the PATH environment variable in the Makefile so that a directory containing my script, symlinked under the names of all the bins I wanted wrapped, was found before the real bins. The script would then look at how it was called and exec the actual bin with the wrapping command prepended, e.g.
exec time /path/to/real/bins/"$(basename "$0")" "$@"
(where /path/to/real/bins is wherever the genuine binaries live).
These days I usually just update the targets in the Makefile itself. Keeping all your modifications to one file is usually better IMO than managing a directory of links.
Update
I defer to Gilles' answer. It's a better answer than mine.
The program that GNU make(1) uses to run commands is specified by the SHELL make variable. It will run each command as
$SHELL -c <command>
You cannot get make to not put the -c in, since that is required for most shells. -c is passed as the first argument ($1) and <command> is passed as a single argument string as the second argument ($2).
You can write your own shell wrapper that prepends the command that you want, taking into account the -c:
#!/bin/sh
eval time "$2"
That will cause time to be run in front of each command. You need eval since $2 will often not be a single command and can contain all sorts of shell metacharacters that need to be expanded or processed.
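Putting the pieces together: save the two-line wrapper above as, say, ./timewrap.sh, make it executable, and point make at it (the file name and location are assumptions, and the recipe line must start with a tab):
SHELL = ./timewrap.sh
# every recipe line <cmd> is now run as: ./timewrap.sh -c '<cmd>', i.e. time <cmd>

foo:
	foo_compiler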