I've just written a bash script that takes some info from the mysql database and reads it line by line, extracting tab-separated columns into separate variables, something like this:
oldifs=$IFS
result=result.txt
$mysql -e "SELECT id,foo,bar,baz FROM $db.$table" -u $user --password=$pass -h $server > $result
cat $result | grep -e ^[0-9].*$ | while IFS=$'\t' read id foo bar baz
do
# some code
done
IFS=$oldifs
Now, while this works OK and I'm satisfied with the result (especially since I'm going to move the query to another script and let cron regenerate the result.txt file contents once a week or so, since I'm dealing with a table that changes maybe once or twice a year), I'm curious about the possibility of putting the query's result in a variable instead of a file.
I have noticed that in order to echo out backslash-escaped characters, I need to tell the command explicitly to interpret such characters as special chars:
echo -e "some\tstring\n"
But, being the bash noob that I am, I have no idea how to place the backslash-escaped characters (the tabs and newlines from the query) inside a variable and just work with it the same way I'm working with the external file (just swapping the cat for echo -e). I tried this:
result=`$mysql -e "SELECT id,foo,bar,baz FROM $db.$table" -u $user --password=$pass -h $server`
but the backslash escaped characters are converted into spaces this way :(. How can I make it work?
To get the output of a command, use $(...). To avoid word splitting and other bash processing you will need to quote. Single quotes ('$(...)') will not work, as that quoting is too strong and suppresses the substitution entirely.
Note that once the output is in your variable, you will probably need to (double) quote it wherever you use it if you need to preserve anything that's in $IFS.
$ listing="$(ls -l)"
$ echo "$listing"
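Applied to your case, a rough sketch (reusing the variable names from your question; printf avoids echo's escape-sequence quirks):
result=$($mysql -e "SELECT id,foo,bar,baz FROM $db.$table" -u $user --password=$pass -h $server)
# the double quotes around $result preserve the tabs and newlines
printf '%s\n' "$result" | grep -e '^[0-9]' | while IFS=$'\t' read -r id foo bar baz
do
    # some code
    :
done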
Could you try putting double quotes around $result, i.e. echo -e "$result"?
% awk '/^[0-9]/ { print $2, $3, $4, $5 }' <<SQL | set -- -
> $("${mysql}" -e "SELECT id,foo,bar,baz FROM $db.$table" -u $user --password=$pass -h $server)
> SQL
% printf '%s\t' "${@}"
<id> <foo> <bar> <baz>
You might get some use out of this. The heredoc should obviate any escaping issues, awk splits on whitespace (including tabs) by default, and set turns the words into the shell's positional parameters. printf isn't necessary, but it's better than echo, especially when working with escape characters.
You could also use read as you did above, but to better handle backslashes, use the -r argument if you go that route. The above method would work best as a function, and you could then iterate over your variables with shift and similar, as in the sketch below.
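For instance, a hedged sketch of that function idea (result.txt and the four-column layout come from the question; the function name is made up, and the word splitting only behaves if every field is non-empty and contains no whitespace):
process_rows() {
    while [ "$#" -ge 4 ]; do
        id=$1 foo=$2 bar=$3 baz=$4
        shift 4
        # work with the four fields of the current row here
        printf '%s\t%s\t%s\t%s\n' "$id" "$foo" "$bar" "$baz"
    done
}
process_rows $(grep '^[0-9]' result.txt)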
-Mike
I'm writing a shell script that should be somewhat secure, i.e., does not pass secure data through parameters of commands and preferably does not use temporary files. How can I pass a variable to the standard input of a command?
Or, if it's not possible, how can I correctly use temporary files for such a task?
Passing a value to standard input in Bash is as simple as:
your-command <<< "$your_variable"
Always make sure you put quotes around variable expressions!
Be cautious: this will probably work only in bash and will not work in sh.
Simple, but error-prone: using echo
Something as simple as this will do the trick:
echo "$blah" | my_cmd
Do note that this may not work correctly if $blah contains -n, -e, -E etc; or if it contains backslashes (bash's copy of echo preserves literal backslashes in absence of -e by default, but will treat them as escape sequences and replace them with corresponding characters even without -e if optional XSI extensions are enabled).
More sophisticated approach: using printf
printf '%s\n' "$blah" | my_cmd
This does not have the disadvantages listed above: all possible C strings (strings not containing NULs) are printed unchanged.
(cat <<END
$passwd
END
) | command
The cat is not really needed, but it helps to structure the code better and allows you to use more commands in parentheses as input to your command.
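For example, a sketch of that "more commands" idea (some_other_command is just a placeholder here):
( cat <<END
$passwd
END
  some_other_command
) | command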
Note that the echo "$var" | command operations mean that standard input is limited to the line(s) echoed. If you also want the terminal to be connected, then you'll need to be fancier:
{ echo "$var"; cat - ; } | command
( echo "$var"; cat - ) | command
This means that the first line(s) will be the contents of $var but the rest will come from cat reading its standard input. If the command does not do anything too fancy (try to turn on command line editing, or run like vim does) then it will be fine. Otherwise, you need to get really fancy - I think expect or one of its derivatives is likely to be appropriate.
The command line notations are practically identical - but the second semi-colon is necessary with the braces whereas it is not with parentheses.
This robust and portable way has already appeared in comments. It should be a standalone answer.
printf '%s' "$var" | my_cmd
or
printf '%s\n' "$var" | my_cmd
Notes:
It's better than echo, reasons are here: Why is printf better than echo?
printf "$var" is wrong. The first argument is the format string, in which sequences like %s or \n are interpreted. To pass the variable through correctly, it must not be interpreted as the format.
Usually variables don't contain trailing newlines. The former command (with %s) passes the variable as it is. However, tools that work with text may ignore or complain about an incomplete line (see Why should text files end with a newline?). So you may want the latter command (with %s\n), which appends a newline character to the content of the variable. Non-obvious facts:
Here string in Bash (<<<"$var" my_cmd) does append a newline.
Any method that appends a newline results in non-empty stdin of my_cmd, even if the variable is empty or undefined.
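A quick way to see both facts for yourself (just a sketch; wc -c counts the bytes that arrive on stdin):
var=abc
printf '%s' "$var" | wc -c     # 3 - no trailing newline
printf '%s\n' "$var" | wc -c   # 4 - newline appended
wc -c <<<"$var"                # 4 - the here string appends a newline too
unset var
wc -c <<<"$var"                # 1 - even an unset variable produces a newline on stdin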
I liked Martin's answer, but it has some problems depending on what is in the variable. This
your-command <<< """$your_variable"""
is better if your variable contains " or !.
As per Martin's answer, there is a Bash feature called Here Strings (which itself is a variant of the more widely supported Here Documents feature):
3.6.7 Here Strings
A variant of here documents, the format is:
<<< word
The word is expanded and supplied to the command on its standard
input.
Note that Here Strings would appear to be Bash-only, so, for improved portability, you'd probably be better off with the original Here Documents feature, as per PoltoS's answer:
( cat <<EOF
$variable
EOF
) | cmd
Or, a simpler variant of the above:
(cmd <<EOF
$variable
EOF
)
You can omit ( and ), unless you want to have this redirected further into other commands.
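That is, when nothing needs to be piped onward, the bare here-document form is enough:
cmd <<EOF
$variable
EOF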
Try this:
echo "$variable" | command
If you came here from a duplicate, you are probably a beginner who tried to do something like
"$variable" >file
or
"$variable" | wc -l
where you obviously meant something like
echo "$variable" >file
echo "$variable" | wc -l
(Real beginners also forget the quotes; usually use quotes unless you have a specific reason to omit them, at least until you understand quoting.)
I am trying to use sed output as input for a variable. The user will choose from a list of files that have numbers before each one to identify individual files. Then they choose a number corresponding to a name, and I need to get the name of that file. My code is:
for entry in *; do
((i++))
echo "$i) $entry: "
done
echo "What file # do you want to choose?:"
read filenum
fileName=$(./myscript.sh | sed -n "${filenum}p")
echo $fileName ###this is to see if anything goes into fileName. nothing is ever output
echo What do you want to do with $fileName?
Ideally I would use $() instead of backticks, but I can't seem to figure out how. I've looked at the links below, but can't get those ideas to work. I believe a problem may be that I am trying to include the filenum variable inside my sed.
https://www.linuxquestions.org/questions/linux-newbie-8/storing-output-of-sed-in-a-variable-in-shell-script-499997/
Store output of sed into a variable
Don't put backticks around $filenum. That will try to execute the contents of $filenum as a command. Put variables inside double quotes.
And if you do want to nest a backtick expression inside another set of backticks, you have to escape them. That's where $() becomes useful -- they nest without any hassle.
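A small illustration of that nesting point (the commands here are arbitrary):
# with backticks, the inner pair must be escaped (and quoting gets awkward):
dir=`basename \`pwd\``
# with $() the same thing nests cleanly and quotes normally:
dir=$(basename "$(pwd)")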
When you use sed -n, you need to use the p command to print the lines that you want to show in the output.
fileName=$(sed -n "${filenum}p" myscript.sh)
This will put the contents of line $filenum of myscript.sh in the variable.
If you actually wanted to execute myscript.sh and print the selected line of its output, you need to pipe to sed:
fileName=$(./myscript.sh | sed -n "${filenum}p")
I want to prepend a string to all the files in a directory. What I want to do is something like:
echo string{$(ls some_dir)}
This won't work because ls separates words with spaces, and brace expansion requires commas. So I thought I'd use tr to replace the spaces with commas, like:
echo string{$(ls some_dir) | tr ' ' ','}
But that doesn't work either because the pipe takes precedence.
What's the correct way to do this? I know I could probably use a sane language like Python, but it's frustrating that Bash can't even do something as simple as that.
If you really want to interpolate the contents of a directory (which is what $(ls some_dir) would give you) then you can do
printf 'string%s ' some_dir/*
IRL, you probably want it to end with a newline.
{ printf 'string%s ' some_dir/*; echo; }
You can generalize this to the output of any glob or brace expansion:
printf 'foo%d\n' {11..22}
Edit
Based on your comment, you want to eliminate the "some_dir/" part; you can't do that with printf alone. You can either cd to the directory so the globs expand as desired, or use parameter expansion to clean up the leading directory name:
( cd some_dir && printf 'string%s ' *; echo )
or
{ cd some_dir && printf 'string%s ' * && cd - >/dev/null; echo; }
or
names=( some_dir/* ) names=( "${names[@]#some_dir/}" )
{ printf 'string%s ' "${names[@]}"; echo; }
One way to do it, which will deal gracefully with whitespace in filenames:
files=("$dir"/*); files=("${files[@]/#"$dir"\//"$prefix"}")
That will store the prefixed strings in the array $files; you could iterate over them using an array expansion:
for file in "${files[@]}"; do
# Something with file
done
or print them out using printf:
printf "%s\n" "${files[@]}"
The advantage of using the array expansion is that it does not involve word splitting, so even if the elements have whitespace in them, the array expansion yields each element as a single word, as shown in the sketch below.
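For example, assuming $dir is some_dir containing the two files 'a file.txt' and 'b.txt', and $prefix is string (example values only), a sketch of how it behaves:
dir=some_dir prefix=string
files=("$dir"/*); files=("${files[@]/#"$dir"\//"$prefix"}")
printf '%s\n' "${files[@]}"
# stringa file.txt
# stringb.txt
for file in "${files[@]}"; do
    echo "one element: $file"   # the space in 'a file.txt' does not cause splitting
done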
Of course, bash can do it.
Let's go step by step.
1. fix an issue in your second example
This is your second example
echo string{$(ls some_dir) | tr ' ' ','}
You put the pipe outside the command substitution, which is wrong.
I believe you want to pipe the stream from ls output to tr input, so it's obvious that the pipe is supposed to be put inside the command substitution, like this
echo string{$(ls some_dir | tr ' ' ',')}
2. output of ls is separated by newlines rather than spaces
so here we go
echo string{$(ls some_dir | tr '\n' ',')}
3. brace expansion is performed prior to command substitution
In other words, after the command substitution has expanded to f1,f2,f3,d1,, brace expansion will not be performed again.
So, no doubt, the command will print string{f1,f2,f3,d1,}.
The solution is letting bash evaluate it again.
eval echo string{$(ls some_dir | tr '\n' ',')}
OK, up to now the result looks very good (try it yourself and you'll see); it is very close to what you were looking for, except for one tiny spot.
You may have already noticed the comma at the end of the output demonstrated above. The comma results in an unnecessary string appearing at the end of the final output.
So let's make it done.
4. remove the ending comma
eval echo string{$(echo -n "$(ls some_dir)" | tr '\n' ',')}
OK, this is it.
Oh... BTW, this is just a specific solution for your specific question. You may develop new variants of your question, and this specific solution may not fit them. If so, I suggest you run man bash and read it from head to toe very, very carefully; then you will become unstoppable.
echo $fbname | awk -F'[__]' '{print $2 $A_name = $2}'
echo $A_name
I am trying to extract a name from within the fbname variable, for example:
newlist_sammy_card.csv. I am trying to get the name sammy, which is between the two underscores, and assign it to a variable I can use for the rest of the script.
The first line prints out sammy, which is what I need, but the second line does not.
Can anyone show me where I am not assigning the variable correctly?
There is a fundamental flaw in your understanding and reasoning. If you invoke awk in your script, it is spawned as a program in its own right. Therefore the variables that exist in your current shell are not available to awk, and vice versa. As such, you cannot 'define' variables in awk that are then visible to your shell. What you should do is 'capture' the output of awk using the $() notation and assign it to a shell variable. Consider this example:
var=$(awk '{print "test"}')
echo $var
This will output
test
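Applying that same capture idea directly to your string (a sketch; -F_ makes awk split on underscores):
fbname="newlist_sammy_card.csv"
A_name=$(echo "$fbname" | awk -F_ '{print $2}')
echo "$A_name"    # prints: sammy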
Now in your case, we are actually facing an xy-problem. You want to extract sammy from the string newlist_sammy_card.csv and use that as a variable. One possible solution in pure bash is the following:
name="newlist_sammy_card.csv"
temp=${name#*_}
var=${temp%%_*}
echo $var
This will output
sammy
There are a LOT of ways to do what you're asking for; e.g., here are a couple more in addition to the other ideas you've received so far:
$ fbname='newlist_sammy_card.csv'
$ A_name=$(echo "$fbname" | cut -d_ -f2)
$ echo "$A_name"
sammy
or:
$ IFS=_
$ set -- $fbname
$ A_name="$2"
$ echo "$A_name"
sammy
but I wonder if you're approaching your problem completely the wrong way. We can't tell without more info on what you're trying to do though.
You can simply use bash:
str1="newlist_sammy_card.csv"
# remove 'newlist_' from the front of the string
str2=${str1#*_}
# remove '_card.csv' from the back of the string:
str2=${str2%%_*}
echo "$str2" # Output: sammy
Unfortunately it can't be done in a single expansion in bash. However, it should still perform a lot better than launching any kind of external program.
Pankrates's answer explains the problem with the OP's approach well and offers a pure shell solution using shell parameter expansion.
Here's another pure shell solution, using a single command based on the read builtin:
Using bash, with a here-string:
IFS=_ read -r _ A_name _ <<<"$fbname"
POSIX-compliant equivalent, using a here-doc:
IFS=_ read -r _ A_name _ <<EOF
$fbname
EOF
If $fbname contains 'newlist_sammy_card.csv', $A_name will contain 'sammy' afterward.
IFS=_ tells read to split the input into tokens by _ instances.
Note that by directly prepending IFS=... to read, the effect of setting $IFS is localized to the read command - no need to restore the original $IFS later.
read -r _ A_name _ ... reads input unmodified (-r - no interpretation of backslash escape sequences)
Note that _ on either side of A_name is the - customary - name of a dummy variable that is used for input that is of no interest, but is needed to positionally extract the token of interest.
It is a mere coincidence in this case that the name of this dummy variable is the same as the $IFS character.
In this case, $_ receives the 1st field (the text before the first _ in the input) and is then overwritten with any fields remaining after the 2nd one, while the 2nd field itself is read into the variable of interest, $A_name.
I am trying to write a bash script that I will use to replace my egrep command. I want to be able to take the exact same input that is given to my script and feed it to egrep.
i.e.
#!/bin/bash
PARAMS=$@
`egrep "$PARAMS"`
But I have noticed that if I echo what I am executing, the quotes have been removed, as follows:
./customEgrep -nr "grep my ish" *
returns
egrep -nr grep my ish (file list from the expanded *)
Is there a way that I can take the input literally so I can use it directly with egrep?
You want this:
egrep "$@"
The quotes you type are not passed to the script; they're used to determine word boundaries. Using "$@" preserves those word boundaries, so egrep will get the same arguments as it would if you ran it directly. But you still won't see quotation marks if you echo the arguments.
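So the whole wrapper can be as small as this sketch (customEgrep is the name from your question):
#!/bin/bash
# forward every argument to egrep with the original word boundaries intact
egrep "$@"
Invoked as ./customEgrep -nr "grep my ish" *, egrep then receives -nr, the single pattern grep my ish, and the expanded file list, i.e. the same arguments it would get if you ran it directly.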
" is a special char. you need to use escape character in order to retrieve "
use
./customEgrep -nr "\"grep my ish\"" *
If you don't need to do any parameter expansion in the argument, you can use
single quotes to avoid the need to escape the double quotes:
./customerEgrep -nr '"grep my ish"' *
$@ is special when quoted. Try:
value=$( egrep "$@" )
It's not clear to me why you are using backticks and ignoring the result, so I've used the $() syntax and assigned the value.
If for some reason you want to save the parameters to use later, you can also do things like:
for i; do args="$args '$i'"; done # Save the arguments
eval grep $args # Pass the arguments to grep without resetting $1,$2,...
eval set $args # Restore the arguments
grep "$@" # Use the restored arguments
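In bash specifically, a hedged alternative that avoids eval is to save the parameters in an array instead:
args=("$@")             # save the arguments
grep "${args[@]}"       # use them later without resetting $1,$2,...
set -- "${args[@]}"     # or restore them into the positional parameters
grep "$@"               # use the restored arguments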