I have this function in my ~/.zshrc
async () {
if ! [[ $# -gt 0 ]];then
echo "Not enough arguments!"
fi
local busus="$IFS"
export IFS=" "
echo "$* &"
command "$* &"
export IFS="$busus"
}
and an alias
alias word='async "libreoffice --writer"'
The echo "$* &" line is used only for debugging.
When I run word, libreoffice --writer & is shown on the screen (no extra spaces or newlines), but nothing happens.
I also tried executing command libreoffice --writer & and it worked perfectly.
(My current shell is zsh)
What is wrong?
Thanks
Usually (especially in bash), the problem is that people aren't using enough double-quotes; in this case, it's the opposite: you're using too many double-quotes. The basic problem is that the command name and each of the arguments to it must be a separate "word" (in shell syntax), but double-quoting something will (usually) make the shell treat it as all one word. Here's a quick demo:
% echo foo
foo
% "echo foo"
zsh: command not found: echo foo
Here, the double-quotes make the shell treat " foo" as part of the command name, rather than as a delimiter and an argument after the command name. Similarly, when you use "$* &", the double-quotes tell the shell to treat the entire thing (including even the ampersand) as a single long word (and pass it as an argument to command). (BTW, the command isn't needed, but isn't causing any harm either.)
The standard way to do this is to use "$@" instead -- here the $@ acts specially within double-quotes, making each argument into a separate word. In zsh, you could omit the double-quotes, but that can cause trouble in other shells so I recommend using them anyway.
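Here's a quick demo of the difference (the args helper is just an illustrative throwaway function, not part of your setup):
% args () { printf '%s\n' "got $# argument(s):" "$@"; }
% args "libreoffice --writer"
got 1 argument(s):
libreoffice --writer
% args libreoffice --writer
got 2 argument(s):
libreoffice
--writer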
Also, don't mess with IFS. You don't need to, and it opens a can of worms that's best left closed. And if there are no arguments, you should return immediately, rather than continuing and trying to run an empty command.
But there's another problem: in the alias, you double-quote "libreoffice --writer", which is going to have pretty much the same effect again. So remove those double-quotes. But keep the single-quotes around the alias, so it'll be defined as a single alias.
So here's my proposed correction:
async () {
    if ! [[ $# -gt 0 ]]; then
        echo "Not enough arguments!"
        return 1 # Do not continue if there's no command to run!
    fi
    echo "$* &" # Here quoting is appropriate, so it's a single argument to echo
    "$@" &      # "$@" expands to each argument as a separate word; & runs it in the background
}
alias word='async libreoffice --writer'
Using "$#" directly is more reliable:
async () { [ "$#" -gt 0 ] && "$#" & }
alias word='async libreoffice --writer'
Basically, having the PS1 prompt set like this
(Assume all variables and functions are escaped properly)
PROMPT_COMMAND='\
ret=$?;\
p_colorSetup;\
PS1="[\[${P_TIME}\]\A\[${P_RES}\]]$(p_multiPlex)[\[${P_NAME}\]\u\[${P_RES}\]#\[${P_HOST}\]\h\[${P_RES}\]]: \[${P_DIR}\]\w\[${P_RES}\]\n$(p_returnColor ${ret}) \\$ "'
vs doing this
PROMPT_COMMAND='\
ret=$?;\
p_colorSetup;\
printf "%b" "[${P_TIME}$(date +%H:%M)${P_RES}]$(p_multiPlex)[${P_NAME}$(id -un)${P_RES}#${P_HOST}$(hostname -s)${P_RES}]: ${P_DIR}$(pwd | sed -e "s#${HOME}#~#" )${P_RES} \n";\
PS1="$(p_returnColor ${ret}) \\$ "'
I'm curious whether there are any benefits/drawbacks to doing it one way vs the other. As long as I escape everything correctly, I don't seem to run into any issues with wrapping when typing or going through my history.
Is there any reason in this case to use one vs the other?
I would define a function to call from PROMPT_COMMAND, which makes it simpler to quote and split the code across multiple lines. This makes it easier to read, debug, and modify.
make_prompt () {
    ret=$?
    p_colorSetup
    PS1="[\[${P_TIME}\]\A\[${P_RES}\]]"
    PS1+=$(p_multiPlex)
    PS1+="[\[${P_NAME}\]\u\[${P_RES}\]#\[${P_HOST}\]\h\[${P_RES}\]]: "
    PS1+="\[${P_DIR}\]\w\[${P_RES}\]\n"
    PS1+=$(p_returnColor "${ret}")
    PS1+=' \$ '
}
PROMPT_COMMAND=make_prompt
You have to make sure that Bash can calculate the length of the prompt, i.e. the number of printable characters. If you print part of the prompt yourself, Bash does not know about it, and if you edit multi-line commands, Bash will mess up the prompt.
In most cases it is better to use Bash's parameter expansion instead of sed.
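For example, the $(pwd | sed -e "s#${HOME}#~#") from the question can be replaced with plain expansions; a small sketch (p_dir is just an illustrative name):
p_dir=$PWD
if [[ $p_dir == "$HOME" || $p_dir == "$HOME"/* ]]; then
    p_dir="~${p_dir#"$HOME"}"   # shorten a leading $HOME to ~ without forking a subshell or sed
fi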
I think it is better to use PROMPT_COMMAND just to define some variables that are then referenced in PS1. For readability it might be better to write a function that builds PS1.
I do some fancy prompt coloring and length clipping:
export PS1_MAXDIRLEN=25
I create a function for PROMPT_COMMAND, which sets just some variables using Bash's parameter expansion.
PROMPT_COMMAND ()
{
    # Exit status
    EXIT_STATUS=$?
    PS1EXIT=${EXIT_STATUS##0}                             # empty when the last command returned 0
    # Working directory
    PS1CLIP=${PWD: $((-PS1_MAXDIRLEN))}                   # empty when $PWD is short enough
    local p=${PS1CLIP:+${PWD: $((-(PS1_MAXDIRLEN-1)))}}   # if clipping, leave room for the '<' marker
    PS1CLIP=${PS1CLIP:+<}                                 # '<' marks a clipped directory
    PS1DIR=${p:-$PWD}
}
export PROMPT_COMMAND=PROMPT_COMMAND
And then I use the variables in my function, which builds PS1.
ps1 ()
{
local bold="\[\e[1m\]"
local black="\[\e[30m\]"
local red="\[\e[31m\]"
local green="\[\e[32m\]"
local yellow="\[\e[33m\]"
local blue="\[\e[34m\]"
local magenta="\[\e[35m\]"
local cyan="\[\e[36m\]"
local reset="\[\e[m\]"
# terminal title
if [ "$TERM" = xterm ]; then
echo -n '\[\e]0;\h${PS1EXIT:+ [$PS1EXIT]}\a\]'
fi
# visible prompt
echo -n $bold
# exit code
echo -n '${PS1EXIT:+'$black'['$red'$PS1EXIT'$black'] }'
# user # host
echo -n $red'\u'$black'#'$magenta'\h'$black':'
# directory
echo -n $red'$PS1CLIP'$blue'${PS1DIR////'$black'/'$blue'}'
# command number
#echo -n $black':'$yellow'\!'
# prompt char
echo -n $black'\$'
# reset colors
echo $reset' '
}
export PS1=$(ps1)
unset ps1
My bash script writes another bash script using printf.
printf "#!/bin/bash
HOME=${server}
file=gromacs*
file_name=\$(basename "\${file}")
date=\$(date +"\%m_\%d_\%Y")
for sim in \${HOME}/* ; do
if [[ -d \$sim ]]; then
simulation=$(basename "\$sim")
pushd \${sim}
cp \$file \${server}/\${results}/\${file_name}.\${simulation}.\${date}
echo "\${file_name}\ from\ \${simulation}\ has\ been\ collected!"
popd
fi
done" > ${output}/collecter.sh
Here there is a problem with the escaping of the elements within the date variable
date=\$(date +"\%m_\%d_\%Y")
where the part below did not work properly:
"\%m_\%d_\%Y"
It results in an incomplete new bash script being produced by printf.
How should it be fixed?
Thanks!
Use a quoted heredoc.
{
## print the header, and substitute our own value for HOME
printf '#!/bin/bash\nHOME=%q\n' "$server"
## EVERYTHING BELOW HERE UNTIL THE EOF IS LITERAL
cat <<'EOF'
file=( gromacs* )
(( ${#file[@]} == 1 )) && [[ -e $file ]] || {
echo "ERROR: Exactly one file starting with 'gromacs' should exist" >&2
exit 1
}
file_name=$(basename "${file}")
date=$(date +"%m_%d_%Y")
for sim in "$HOME"/* ; do
if [[ -d $sim ]]; then
simulation=$(basename "$sim")
(cd "${sim}" && exec cp "$file" "${server}/${results}/${file_name}.${simulation}.${date}")
echo "${file_name} from ${simulation} has been collected!"
fi
done
EOF
} >"${output}/collecter.sh"
Note:
Inside a quoted heredoc (cat <<'EOF'), no substitutions are performed, so no escaping is needed. We're thus able to write our code exactly as we want it to exist in the generated file.
When generating code, use printf %q to escape values in such a way that they evaluate back to their original values. Otherwise, a variable containing $(rm -rf ~) could cause that command to be run; even substituting the value inside literal single quotes isn't safe, since a value such as $(rm -rf ~)'$(rm -rf ~)' carries its own quotes and would escape them. (A short demo of %q follows these notes.)
Glob expansions return a list of results; the proper data type in which to store their results is an array, not a string. Thus, file=( gromacs* ) makes the storage of the result in an array explicit, and the following code checks for both the case where we have more than one result, and the case where our result is the original glob expression (meaning no matches existed).
All expansions need to be quoted to prevent string-splitting. This means "$HOME"/*, not $HOME/* -- otherwise you'll have problems whenever a user has a home directory containing whitespace (and yes, this does happen -- consider Windows-derived platforms where you have /Users/Firstname Lastname, or sites where you've mounted a volume for home directories off same).
pushd and popd are an interactive extension, as opposed to a tool intended for writing scripts. Since spawning an external program (such as cp) involves a fork() operation, and any directory change inside a subshell terminates when that subshell does, you can avoid any need for them by spawning a subshell, cd'ing within that subshell, and then using the exec builtin to replace the subshell's PID with that of cp, thus preventing the fork that would otherwise have taken place to allow cp to be started in a separate process from the shell acting as its parent.
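A quick illustration of the printf %q point above (the value of server here is made up purely for demonstration):
server='/srv/data; $(rm -rf ~)'              # a deliberately hostile example value
printf '#!/bin/bash\nHOME=%q\n' "$server"
The second line of output is something like HOME=/srv/data\;\ \$\(rm\ -rf\ \~\), i.e. a string that evaluates back to the original value instead of executing anything.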
You have to escape the % characters in the printf format string by doubling them (%%), e.g.:
date=\$(date +"%%m_%%d_%%Y")
This should print date=$(date +"%m_%d_%Y"). But you should avoid using printf here, because you aren't using its formatting capabilities. Instead you could cat a heredoc to the file:
cat > ${output}/collecter.sh <<'END'
HOME=${server}
...
done
END
This would allow you to avoid many escapes, and make the code more readable.
Try this
date=\$(date +\"%%m_%%d_%%Y\")"
I'm trying to pass an argument to a shell script via exec, within another shell script. However, I get an error that the script does not exist in the path - but that is not the case.
$ ./run_script.sh
$ blob has just been executed.
$ ./run_script.sh: line 8: /home/s37syed/blob.sh test: No such file or directory
For some reason it's treating the entire execution as one whole absolute path to a script - it isn't reading the string as an argument for blob.sh.
Here is the script that is being executed.
#!/bin/bash
#run_script.sh
blobPID="$(pgrep "blob.sh")"
if [[ -z "$blobPID" ]]
then
    echo "blob has just been executed."
    #execs as absolute path - carg not read at all
    ( exec "/home/s37syed/blob.sh test" )
    #this works fine, as expected
    #( exec "/home/s37syed/blob.sh" )
else
    echo "blob is currently running with pid $blobPID"
    ps $blobPID
fi
And the script being invoked by run_script.sh, not doing much, just emulating a long process/task:
#!/bin/bash
#blob.sh
i=0
carg="$1"
if [[ -z "$carg" ]]
then
    echo "nothing entered"
else
    echo "command line arg entered: $carg"
fi
while [ $i -lt 100000 ];
do
    echo "blob is currently running" >> test.txt
    let i=i+1
done
Here is the version of Bash I'm using:
$ bash --version
GNU bash, version 4.2.37(1)-release (x86_64-pc-linux-gnu)
Any advice/comments/help on why this is happening would be much appreciated!
Thanks in advance,
s37syed
Replace
exec "/home/s37syed/blob.sh test"
(which tries to execute a command named "/home/s37syed/blob.sh test" with no arguments)
by
exec /home/s37syed/blob.sh test
(which executes "/home/s37syed/blob.sh" with a single argument "test").
Aside from the quoting problem Cyrus pointed out, I'm pretty sure you don't want to use exec. What exec does is replace the current shell with the command being executed (rather than running the command as a subprocess, as it would without exec). Putting parentheses around it makes it execute that section in a subshell, thus effectively cancelling out the effect of exec.
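A tiny illustration of what exec does (a throwaway one-liner, nothing to do with blob.sh itself):
bash -c 'echo before; exec echo replaced; echo never printed'
This prints
before
replaced
and nothing else, because exec replaced the shell with the echo command before the final echo could run.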
As chepner said, you might be thinking of the eval command, which performs an extra parsing pass before executing the command. But eval is a huge bug magnet. It's incredibly easy to use eval in unsafe ways (see BashFAQ #48). If you need to construct a command, see BashFAQ #50 for better ways to do it.
I have a python script. Normally I would run this like this:
./make_graph data_directory "wonderful graph title"
I have to run this script through the scheduler. I am using -v to pass the arguments for the python script through qsub.
qsub make_graph.pbs -v ARGS="data_directory \"wonderful graph title\""
I have tried many combinations of ', ", \" escaping and I just can't get it right. The quoting around 'wonderful graph title' is always either lost or mangled.
Here is an excerpt from the pbs script
if [ -z "${ARGS+xxx}" ]; then
echo "NO ARGS SPECIFIED!"
exit 1
fi
CMD="/path/make_graph $ARGS"
echo "CMD: $CMD"
echo "Job started on `hostname` at `date`"
${CMD}
What is the proper way to pass a string parameter that contains spaces through qsub as an environment variable? Is there a better way to do this? Maybe this is a more general bash problem.
Update: This answer is based on SGE qsub rather than TORQUE qsub, so the CLI is somewhat different. In particular, TORQUE qsub doesn't seem to support direct argument passing, so the second approach doesn't work.
This is mainly a problem of proper quoting and has little to do with grid engine submission itself. If you just want to fix your current script, you should use eval "${CMD}" rather than ${CMD}. Here's a detailed analysis of what happens when you do ${CMD} alone (in the analysis we assume there's nothing funny in path):
Your qsub command line is processed and quotes removed, so the ARGS environment variable passed is data_directory "wonderful graph title".
You did CMD="/path/make_graph $ARGS", so the value of CMD is /path/make_graph data_directory "wonderful graph title" (I'm presenting the string literal without quoting, that is, the value literally contains the quote characters).
You did ${CMD}. Bash performs a parameter expansion on this, which amounts to:
Expanding ${CMD} to its value /path/make_graph data_directory "wonderful graph title";
Since ${CMD} is not quoted, perform word splitting, so in the end the command line has five words: /path/make_graph, data_directory, "wonderful, graph, title". The last four are treated as arguments to your make_graph, which is certainly not what you want.
On the other hand, if you use eval "${CMD}", then it is as if you typed /path/make_graph data_directory "wonderful graph title" into an interactive shell, which is the desired behavior.
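A quick way to see this difference outside the grid engine (printargs is a hypothetical helper that just prints each argument it receives on its own line):
printargs () { printf '<%s>\n' "$@"; }
CMD='printargs data_directory "wonderful graph title"'
$CMD          # word splitting only: prints <data_directory> <"wonderful> <graph> <title">
eval "$CMD"   # re-parses the quotes: prints <data_directory> <wonderful graph title>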
You should read more about eval, parameter expansion, etc. in the Bash Reference Manual.
The corrected script:
#!/usr/bin/env bash
[[ -z ${ARGS+xxx} ]] && { echo "NO ARGS SPECIFIED!" >&2; exit 1; }
CMD="/path/make_graph ${ARGS}"
echo "CMD: ${CMD}"
echo "Job started on $(hostname) at $(date)" # backticks are deprecated
eval "${CMD}"
By the way, to test this, you don't need to submit it to the grid engine; just do
ARGS="data_directory \"wonderful graph title\"" bash make_graph.pbs
Okay, I just pointed out what's wrong and patched it. But is it really the "proper way" to pass arguments to grid engine jobs? No, I don't think so. Arguments are arguments, and should not be confused with environment variables. qsub allows you to pass arguments directly (qsub synopsis: qsub [ options ] [ command | -- [ command_args ]]), so why encode them in an env var and end up worrying about quoting?
Here's a better way to write your submission script:
#!/usr/bin/env bash
[[ $# == 0 ]] && { echo "NO ARGS SPECIFIED!" >&2; exit 1; }
CMD="/path/make_graph $#"
echo "CMD: ${CMD}"
echo "Job started on $(hostname) at $(date)" # backticks are deprecated
/path/make_graph "$#"
Here "$#" is equivalent to "$1" "$2" ... — faithfully passing all arguments as is (see relevant section in the Bash Reference Manual).
One thing unfortunate about this, though, is that although the command executed is correct, the one printed may not be properly quoted. For instance, if you do
qsub make_graph.pbs data_directory "wonderful graph title"
then what gets executed is make_graph.pbs data_directory "wonderful graph title", but the printed CMD is make_graph.pbs data_directory wonderful graph title. And there's no easy way to fix this, as far as I know, since quotes are always removed from arguments no matter how word splitting is done. If the command printed is really important to you, there are two solutions:
Use a dedicated "shell escaper" (pretty easy to write one for yourself) to quote the arguments before printing (a minimal sketch follows this list);
Use another scripting language where shell quoting is readily available, e.g., Python (shlex.quote) or Ruby (Shellwords.shellescape).
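For the first option, a minimal escaper can be built on printf %q itself; in the submission script, something like this (quoted_cmd is just an illustrative name):
quoted_cmd=$(printf '%q ' /path/make_graph "$@")
echo "CMD: ${quoted_cmd}"
For the example above, that would print something like CMD: /path/make_graph data_directory wonderful\ graph\ title.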
This web page:
http://tldp.org/LDP/abs/html/abs-guide.html
mentions the usage of the if [ ... ]; then convention, which needs a space after the semicolon:
;
Command separator [semicolon]. Permits putting two or more commands on the same line.
echo hello; echo there
if [ -x "$filename" ]; then # Note the space after the semicolon.
#+ ^^
echo "File $filename exists."; cp $filename $filename.bak
else # ^^
echo "File $filename not found."; touch $filename
fi; echo "File test complete."
Note that the ";" sometimes needs to be escaped.
Does anyone know where this is coming from and whether it is needed at all by certain shells?
This has become the style in the last few years:
if [ -x "$filename" ]; then
    echo "hi"
fi
However, back when dinosaurs like Burroughs and Sperry Rand ruled the earth, I learned to write if statements like this:
if [ -x "$filename" ]
then
    echo "hi"
fi
Then, you don't even need a semicolon.
The new style with then on the same line as the if started in order to emulate the way C and other programming languages did their if statements:
if (! strcmp("foo", "bar")) {
    printf("Strings equal\n");
}
These programming languages put the opening curly brace on the same line as the if.
The semicolon ; is an operator in the shell (not a keyword like the braces { } or the bang !), so it doesn't need to be delimited with whitespace to be recognized in any POSIX-compliant shell.
However, doing so improves readability (for my taste).
The semicolon needs to be escaped if you mean the literal character ";", not the operator.
The space after the semicolon is not required by the syntax for any shell I know of, but it's good style and makes the code easier to read.
I suppose the "sometimes needs to be escaped" wording refers to cases like echo foo\;bar, where you don't want the semicolon to be interpreted as a separator by the shell.
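For example, in an interactive bash session:
$ echo foo;bar             # unescaped: ';' separates two commands
foo
bash: bar: command not found
$ echo foo\;bar            # escaped: the semicolon is just another character
foo;bar
A common real-world case is find's -exec, where the terminating semicolon must be written as \; so that find, not the shell, receives it:
$ find . -name '*.txt' -exec grep -l TODO {} \;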
I do not believe that the space should be necessary there. There's nothing about requiring spaces in the POSIX sh spec.
Empirically, the following works fine in both bash 4.1.5(1) and dash:
$ if true;then echo hi;else echo bye;fi
hi
$
I've never come across a shell that required a space in that context.
Just to make sure, I've asked on c.u.s.; you can read the replies here.