How can I pass an unquoted string with spaces as a quoted argument? - bash

Better explained with an example. I am writing a simple wrapper (a function in my .bashrc) around the mail command.
Here is my current function which doesn't work correctly:
function email_me() { echo "$@" | mail -s "\"$@\"" myaddress@email.com; }
Here is my desired usage - this would send an email with both the subject and body set to testing 1 2 3. Note I specifically do not want to have to put quotes in manually.
~$ email_me testing 1 2 3
Thus I want the string replacement to occur like this:
echo "testing 1 2 3" | mail -s "testing 1 2 3" myaddress@email.com
However, no matter what I try, it's as though the -s argument doesn't have quotes around it, and an email with the subject "testing" is sent to the following recipients: 1, 2, 3, and myaddress@email.com
How can I make the -s argument consider "testing 1 2 3" to be a single string?

I would suggest using
function email_me() { printf %s\\n "$*" | mail -s "$*" myaddress#email.com; }
"$*" is indeed the special variable containing all arguments together in one string
using printf instead of echo saves you from surprises with -n, -e, and whatever else your implementation of echo supports.
Still, there will be situations where you'll have to quote the arguments to email_me to avoid globbing and preserve whitespace:
email_me 2 * 2 = 4
[sends you all file names in current directory]
email_me a b
[sends "a b" with only one space]
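The joining behavior of "$*" is easy to see with a small stand-in for the function (a sketch; show_subject is a made-up name, and printf is used so nothing is mistaken for an echo option):

```shell
#!/bin/bash
# Stand-in for email_me that only prints what the mail subject would be
show_subject() { printf '%s\n' "$*"; }

show_subject testing 1 2 3   # -> testing 1 2 3  (words re-joined with single spaces)
show_subject "a   b"         # -> a   b          (quoting preserves the inner spaces)
```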

Parameter expansion with replacement, avoid additional variable

I'm trying to join input $* which is one parameter consisting of all the parameters added together.
This works.
#!/bin/bash
foo() {
params="${*}"
echo "${params//[[:space:]]/-}"
}
foo 1 2 3 4
1-2-3-4
However, is it possible to skip the assignment of variable?
"${"${*}"//[[:space:]]/-}"
I'm getting bad substitution error.
I can also do
: "${*}"
echo "${_//[[:space:]]/-}"
But it feels hacky.
One option could be to set bash's internal field separator, IFS, to - locally and just echo "$*":
foo() {
local IFS=$'-'
echo "$*"
}
To answer your question, you can do global pattern substitutions on the positional parameters like this:
${*//pat/sub}
${@//pat/sub}
And also arrays like this:
${arr[*]//pat/sub}
${arr[@]//pat/sub}
This won’t join the parameters, but substitute inside them.
Setting IFS to dash adds a dash in between each parameter for echo "$*", or p=$*, but won’t replace anything inside a parameter.
Eg:
$ set -- aa bb 'cc cc'
$ IFS=-
$ echo "$*"
aa-bb-cc cc
To remove all whitespace, including inside a parameter, you can combine them:
IFS=-
echo "${*//[[:space:]]/-}"
Or just assign to a name first, like you were doing:
no_spaces=$*
echo "${no_spaces//[[:space:]]/-}"
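Both behaviors can be checked side by side (a sketch; joined and flattened are made-up names):

```shell
#!/bin/bash
# Join parameters with dashes, leaving whitespace inside a parameter alone
joined() { local IFS=-; printf '%s\n' "$*"; }
# Join with dashes AND replace whitespace inside each parameter
flattened() { local IFS=-; printf '%s\n' "${*//[[:space:]]/-}"; }

joined aa bb 'cc cc'      # -> aa-bb-cc cc   (space inside 'cc cc' survives)
flattened aa bb 'cc cc'   # -> aa-bb-cc-cc   (whitespace inside parameters replaced too)
```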

bash how to input * from a file

The problem:
I have a simple compiler that compiles simple RPN expression. The compiler is invoked like this:
./compiler 1 2 3 + "*"
This works fine.
Now, let's say I've put
1 2 3 + "*"
into a file called input. Then when I invoke the compiler like this:
./compiler $(cat input)
My compiler will complain: unknown symbol: "*"
If I remove the double quote around *, the * gets expanded to file names. I've also tried '' and ``, no good.
So, how can I input a normal * from a file?
zoo.sh with content
#!/bin/bash
set -f
echo $@
m.txt with content:
1 2 3 + *
In a shell do:
set -f
./zoo.sh 1 2 4 + *
1 2 4 + *
./zoo.sh $(cat m.txt)
1 2 3 + *
The shell is doing the expansion for you. This happens before the command runs. If you want it to stop you need to explicitly tell it. Read about it here:
http://www.gnu.org/software/bash/manual/bash.html#The-Set-Builtin
The script above also sets set -f, to keep the echo inside the script from expanding the * and to make it behave the way I imagine your compiler works. Your compiler itself probably does not need to do this.
Remember to do a set +f afterwards to restore filename expansion.
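The effect of set -f can be sketched without the compiler at all (self-contained; the input file is created inline):

```shell
#!/bin/bash
printf '%s\n' '1 2 3 + *' > m.txt   # sample input file

echo $(cat m.txt)   # without set -f, the * expands to file names here
set -f              # disable filename expansion
echo $(cat m.txt)   # -> 1 2 3 + *  (the * stays literal)
set +f              # restore filename expansion
rm -f m.txt
```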
The only way to remove quoting characters from a variable that contains them:
a='1 2 3 + "*"'
is with shell "quote removal". That is done every time a string is parsed by the shell (if it is not the result of some other expansion). So, this will not remove the quotes, even if un-quoted:
$ echo $a
1 2 3 + "*"
Even repeated, quotes will not be removed:
$ echo $(echo $( echo $a ) )
1 2 3 + "*"
But quotes will be removed if we re-parse the string with eval:
$ eval echo $a
1 2 3 + *
Or the string can be sent to an external program (not a builtin) that does its own quote removal. That is what happens with xargs, which parses quotes in its input:
$ xargs <<<$a
1 2 3 + *
Now, for your specific case, either do:
$ eval echo $(<file.csv)
or
$ xargs <file.csv
In this case, xargs is executing a default echo, so it is basically doing the same as above.
Please be aware that either command works with this specific code, but both are a source of serious security risks. Remember: eval is evil.
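The difference between plain expansion and xargs is easy to reproduce (a sketch; the file name quoted.txt is made up):

```shell
#!/bin/bash
printf '%s\n' '1 2 3 + "*"' > quoted.txt

set -f                   # keep * literal for the demo
echo $(cat quoted.txt)   # -> 1 2 3 + "*"   quotes are NOT removed by expansion
set +f

xargs < quoted.txt       # -> 1 2 3 + *     xargs strips the quotes itself
rm -f quoted.txt
```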

How to parse a string into variables?

I know how to parse a string into variables in the manner of this SO question, e.g.
ABCDE-123456
becomes:
var1=ABCDE
var2=123456
via, say, cut. I can do that in one script, no problem.
But I have a few dozen scripts which parse strings/arguments all in the same fashion (same arguments & variables, i.e. same parsing strategy).
And sometimes I need to make a change or add a variable to the parsing mechanism.
Of course, I could go through every one of my dozens of scripts and change the parsing manually (even if just copy & paste), but that would be tedious and more error-prone to bugs/mistakes.
Is there a modular way to do parse strings/arguments as such?
I thought of writing a script which parses the string/args into variables and then exports, but the export command does not work from child to parent (only vice versa).
Something like this might work:
parse_it () {
SEP=${SEP--}
string=$1
names=${@:2}
IFS="$SEP" read $names <<< "$string"
}
$ parse_it ABCDE-123456 var1 var2
$ echo "$var1"
ABCDE
$ echo "$var2"
123456
$ SEP=: parse_it "foo:bar:baz" id1 id2 id3
$ echo $id2
bar
The first argument is the string to parse, the remaining arguments are names of variables that get passed to read as the variables to set. (Not quoting $names here is intentional, as we will let the shell split the string into multiple words, one per variable. Valid variable names consist of only _, letters, and numbers, so there are no worries about undesired word splitting or pathname generation by not quoting $names). The function assumes the string uses a single separator of "-", which can be overridden via the environment.
For more complex parsing, you may want to use a custom regular expression (bash 4 or later required for the -g flag to declare):
parse_it () {
reg_ex=$1
string=$2
shift 2
[[ $string =~ $reg_ex ]] || return
i=1
for name; do
declare -g "$name=${BASH_REMATCH[i++]}"
done
}
$ parse_it '(.*)-(.*):(.*)' "abc-123:xyz" id1 id2 id3
$ echo "$id2"
123
I think what you really want is to write your function in one script and include it in all of your other scripts.
You can include other shell scripts by the source or . command.
For example, you can define your parse function in parseString.sh
function parseString {
...
}
And then in any of your other script, do
source parseString.sh
# now we can call parseString function
parseString abcde-12345

sed backreferences and command interpolation

I am having an interesting issue using only sed to substitute short month strings (ex "Oct") with the corresponding number value (ex "10") given a string such as the following:
Oct 14 09:23:35 some other input
To be replaced directly via sed with:
14-10-2013 09:23:35 some other input
None of the following is actually relevant to solving the trivial problem of month string -> number conversion; I'm more interested in understanding some weird behavior I encountered while trying to solve this problem entirely with sed.
Without any attempt of this string substitution (the echo statement is in lieu of the actual input in my script):
...
MMM_DD_HH_mm_SS="([A-Za-z]{3}) ([0-9]{2}) (.+:[0-9]{2})"
echo "Oct 14 09:23:35 some other input" | sed -r "s/$MMM_DD_HH_mm_SS (.+)/\2-\1-\3 \4/"
Then the question is how to transform the backreference \1 into a number. Of course one thinks of using command interpolation with the backreference as an argument:
...
TestFunc()
{
echo "received input $1$1"
}
...
echo "Oct 14 09:23:35 some other input" | sed -r "s/$MMM_DD_HH_mm_SS (.+)/\2-$(TestFunc \\1)-\3 \4/"
Where TestFunc would be a variation of the date command (as proposed by Jotne below) with the echo'd date-time group as an input. Here TestFunc is just an echo because I'm much more interested in the behavior of what the function believes to be the value of $1.
In this case the sed with TestFunc produces the output:
14-received input OctOct-09:23:35 some other input
Which suggests that sed actually is inserting backreference \1 into the command substitution $(...) for handling by TestFunc (which appears to receive \1 as the local variable $1).
However, all attempts to do anything more with the local $1 fail. For example:
TestFunc()
{
echo "processed: $1$1" > tmp.txt # Echo 1
if [ "$1" == "Oct" ]; then
echo "processed: 10"
else
echo "processed: $1$1" # Echo 2
fi
}
Returns:
14-processed: OctOct-09:23:35 some other input
$1 has been substituted into Echo 2, yet tmp.txt contains the value processed: \1\1; as if the backreference is not being inserted into the command substitution. Even weirder, the if condition fails with $1 != "Oct", yet it falls through to an echo statement which indicates $1 = "Oct".
My question is why is the backreference insertion working in the case of Echo 2 but not Echo 1? I suspect that the backreference insertion isn't working at all (given the failure of the if statement in TestFunc) but rather something subtle is going on that makes the substitution appear to work correctly in the case of Echo 2; what is that subtlety?
Solution
On further reflection I believe I understand what is going on:
\\1 is passed to the command substitution subroutine / child function as the literal \1. This is why equality test within the child function is failing.
however the echo function is correctly handling the string \\1 as $1. So echo "aa$1aa" returns the result of the command substitution to sed as aa\1aa. Other functions such as rev also "see" $1 as \1.
sed then interpolates \1 in aa\1aa as Oct or whatever the backreference is, to return aaOctaa to the user.
Since command substitution within regexes clearly works, it would be really cool if sed replaced the value of \\1 (or \1, whatever) with the backreference before executing the command substitution $(...); this would significantly increase sed's power...
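The ordering described above can be sketched in two steps: the shell finishes the command substitution before sed ever starts, so the function only ever sees the literal \1, and sed later expands whatever text came back (f here is a made-up example function):

```shell
#!/bin/bash
f() { echo "x${1}y"; }

# Step 1: the shell runs f first; f receives the two characters \1
printf '%s\n' "$(f \\1)"                # -> x\1y

# Step 2: sed receives the replacement text x\1y and expands \1 itself
echo abc | sed -r "s/(a)bc/$(f \\1)/"   # -> xay
```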
This might work for you (GNU sed):
sed -r 's/$/\nJan01...Oct10Nov11Dec12/;s/(...) (..) (..:..:.. .*)\n.*\1(..).*/\2-\4-2013 \3/;s/\n.*//' file
Append a lookup table to the end of the line, use the backreference to match against it, and make sure to remove the lookup table in all cases.
Here's an example of passing a backreference to a function:
f(){ echo "x$1y$1z"; }
echo a b c | sed -r 's/(.) (.) (.)/'"$(f \\2)"'/'
returns:
xbybz
HTH
Use the correct tool:
date -d "Oct 14 09:23:35" +"%d-%m-%Y %H:%M:%S"
14-10-2013 09:23:35
Date does read your input and convert it to any format you like

bash: calling a scripts with double-quote argument

I have a bash script which accepts an argument enclosed in double quotes, which creates a shape-file of a map within the given boundaries, e.g.
$ export_map "0 0 100 100"
Within the script, there are two select statements:
select ENCODING in UTF8 WIN1252 WIN1255 ISO-8859-8;
...
select NAV_SELECT in Included Excluded;
Naturally, these two statements require a number to be entered as input. This can be bypassed by piping the numbers, followed by a newline, to the script.
In order to save time, I would like to have a script that would create 8 maps - for each combination of ENCODING (4 options) and NAV_SELECT (2 options).
I have written another bash script, create_map, to serve as a wrapper:
#!/bin/bash
for nav in 1 2 3 4;
do
for enc in 1 2;
do
printf "$nav\n$enc\n" | /bin/bash -c "./export_map.sh \"0 0 100 100\""
done
done
**This works (thanks, Brian!), but I can't find a way to have the numeric argument "0 0 100 100" passed in from outside the outer script.**
Basically, I'm looking for way to accept an argument within double quotes to a wrapper bash script, and pass it - with the double quotes - to an inner script.
CLARIFICATIONS:
export_map is the main script, being called from create_map 8 times.
Any ideas?
Thanks,
Adam
If I understand your problem correctly (which I'm not sure about; see my comment), you should probably add another \n to your printf; printf does not add a trailing newline by default the way that echo does. This will ensure that the second value will be read properly by the select command which I'm assuming appears in export_map.sh.
printf "$nav\n$enc\n" | /bin/bash -c "./export_map.sh \"100 200 300 400\""
Also, I don't think that you need to add the /bin/bash -c and quote marks. The following should be sufficient, unless I'm missing something:
printf "$nav\n$enc\n" | ./export_map.sh "100 200 300 400"
edit Thanks for the clarification. In order to pass an argument from your wrapper script into the inner script, keeping it as a single argument, you can pass in "$1", where the quotes indicate that you want to keep this grouped as one argument, and $1 is the first parameter to your wrapper script. If you want to pass all parameters from your outer script into your inner script, each being kept as a single parameter, you can use "$@" instead.
#!/bin/bash
for nav in 1 2 3 4;
do
for enc in 1 2;
do
printf "$nav\n$enc\n" | ./export_map.sh "$1"
done
done
Here's a quick example of how "$@" works. First, inner.bash:
#!/bin/bash
for str in "$@"
do
echo "$str"
done
outer.bash:
#!/bin/bash
./inner.bash "$@"
And invoking it:
$ ./outer.bash "foo bar" baz "quux zot"
foo bar
baz
quux zot
