I want to prepend a string to all the files in a directory. What I want to do is something like:
echo string{$(ls some_dir)}
This won't work because ls separates words with spaces, and brace expansion requires commas. So I thought I'd use tr to replace the spaces with commas, like:
echo string{$(ls some_dir) | tr ' ' ','}
But that doesn't work either because the pipe takes precedence.
What's the correct way to do this? I know I could probably use a sane language like Python, but it's frustrating that Bash can't even do something as simple as that.
If you really want to interpolate the contents of a directory (which is what $(ls some_dir) would give you) then you can do
printf 'string%s ' some_dir/*
IRL, you probably want it to end with a newline.
{ printf 'string%s ' some_dir/*; echo; }
You can generalize this to the output of any glob or brace expansion:
printf 'foo%d\n' {11..22}
Edit
Based on your comment, you want to eliminate the "some_dir/" part; you can't do that with printf alone. You can either cd to the directory so the globs expand as desired, or use parameter expansion to strip the leading directory name:
( cd some_dir && printf 'string%s ' *; echo )
or
{ cd some_dir && printf 'string%s ' * && cd - >/dev/null; echo; }
or
names=( some_dir/* ) names=( "${names[@]#some_dir/}" )
{ printf 'string%s ' "${names[@]}"; echo; }
One way to do it, which will deal gracefully with whitespace in filenames:
files=("$dir"/*); files=("${files[#]/#"$dir"\//"$prefix"}")
That will store the prefixed strings in the array $files; you could iterate over them using an array expansion:
for file in "${files[@]}"; do
# Something with file
done
or print them out using printf:
printf "%s\n" "${files[#]}"
The advantage of using the array expansion is that it does not involve word-splitting, so even if the elements have whitespace in them, the array expansion will contain each element as a single word.
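For instance, compare the quoted and unquoted expansions when one of the resulting strings contains a space (the element shown in the comments is hypothetical):
printf '<%s>\n' "${files[@]}"   # quoted: each element is one argument, e.g. <stringmy file.txt>
printf '<%s>\n' ${files[@]}     # unquoted: that element is word-split into <stringmy> and <file.txt>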
Of course, bash can do it.
Let's go step by step.
1. Fix an issue in your second example
This is your second example
echo string{$(ls some_dir) | tr ' ' ','}
You put the pipe outside the command substitution, which is wrong.
I believe you want to pipe the output of ls into tr, so the pipe needs to go inside the command substitution, like this:
echo string{$(ls some_dir | tr ' ' ',')}
2. The output of ls is separated by newlines rather than spaces
So here we go:
echo string{$(ls some_dir | tr '\n' ',')}
3. Brace expansion is performed before command substitution
In other words, after the command substitution expands to f1,f2,f3,d1, (note the trailing comma), brace expansion will not be performed again.
So, no doubt, the command will print string{f1,f2,f3,d1,}.
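You can see the ordering for yourself with a throwaway variable (list is just an illustrative name; the comma only appears after expansion, which is too late):
list=a,b
echo string{$list}    # prints string{a,b}: brace expansion already ran before $list produced the comma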
The solution is to let bash evaluate it again:
eval echo string{$(ls some_dir | tr '\n' ',')}
OK, up to now the result looks very good (try it yourself and you'll see); it is very close to what you were looking for, except for one small detail.
You may have already noticed the comma at the end of the output demonstrated above. That comma results in an unwanted extra string at the end of the final output.
So let's finish the job.
4. Remove the trailing comma
eval echo string{$(echo -n "$(ls some_dir)" | tr '\n' ',')}
OK, this is it.
Oh... by the way, this is just a specific solution for your specific question. You may develop new variants of your question, and this specific solution may not fit them. If so, I suggest you run man bash and read it from head to toe very carefully; then you will become unstoppable.
Related
I have a problem with the following for loop:
X="*back* OLD"
for P in $X
do
echo "-$P"
done
I need it to output just:
-*back*
-OLD
However, it lists all files in the current directory matching the *back* pattern. For example it gives the following:
-backup.bkp
-backup_new.bkp
-backup_X
-OLD
How to force it to output the exact pattern?
Use an array, as unquoted parameter expansions are still subject to globbing.
X=( "*back*" OLD )
for P in "${X[#]}"; do
printf '%s\n' "$P"
done
(Use printf, as echo could try to interpret an argument as an option, for example if you had -n in the value of X.)
Use set -o noglob before your loop and set +o noglob after to disable and enable globbing.
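Applied to the example in the question, that could look like this sketch:
set -o noglob
X="*back* OLD"
for P in $X
do
    printf '%s\n' "-$P"
done
set +o noglob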
To prevent filename expansion you could read in the string as a Here String.
To iterate over the items, you could turn them into lines using parameter expansion and read them linewise using read. In order to be able to put a - sign as the first character, use printf instead of echo.
X="*back* OLD"
while read -r x
do printf -- '-%s\n' "$x"
done <<< "${X/ /$'\n'}"
Another way could be to use tr to transform the string into lines, then use paste with the - sign as delimiter and "nothing" from /dev/null as first column.
X="*back* OLD"
tr ' ' '\n' <<< "$X" | paste -d- /dev/null -
Both should output:
-*back*
-OLD
I'm writing a shell script that should be somewhat secure, i.e., does not pass secure data through parameters of commands and preferably does not use temporary files. How can I pass a variable to the standard input of a command?
Or, if it's not possible, how can I correctly use temporary files for such a task?
Passing a value to standard input in Bash is as simple as:
your-command <<< "$your_variable"
Always make sure you put quotes around variable expressions!
Be aware that this will probably work only in bash and will not work in sh.
Simple, but error-prone: using echo
Something as simple as this will do the trick:
echo "$blah" | my_cmd
Do note that this may not work correctly if $blah contains -n, -e, -E etc; or if it contains backslashes (bash's copy of echo preserves literal backslashes in absence of -e by default, but will treat them as escape sequences and replace them with corresponding characters even without -e if optional XSI extensions are enabled).
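For example, you can see the problem by counting what actually reaches standard input (wc -c is used here only as a stand-in for my_cmd):
blah='-n'
echo "$blah" | wc -c     # 0: bash's echo treats -n as an option, so nothing is written at all
blah='a\nb'
echo "$blah" | wc -c     # 5 normally, 4 if xpg_echo/posix escape handling is enabled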
More sophisticated approach: using printf
printf '%s\n' "$blah" | my_cmd
This does not have the disadvantages listed above: all possible C strings (strings not containing NULs) are printed unchanged.
(cat <<END
$passwd
END
) | command
The cat is not really needed, but it helps to structure the code better and allows you to use more commands in parentheses as input to your command.
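For example, everything inside the parentheses writes to the same pipe, so you can surround the variable with extra lines (a sketch; the extra echo lines are purely illustrative):
( echo "header line"
cat <<END
$passwd
END
echo "trailer line"
) | command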
Note that the echo "$var" | command operations mean that standard input is limited to the line(s) echoed. If you also want the terminal to be connected, then you'll need to be fancier:
{ echo "$var"; cat - ; } | command
( echo "$var"; cat - ) | command
This means that the first line(s) will be the contents of $var but the rest will come from cat reading its standard input. If the command does not do anything too fancy (try to turn on command line editing, or run like vim does) then it will be fine. Otherwise, you need to get really fancy - I think expect or one of its derivatives is likely to be appropriate.
The command line notations are practically identical - but the second semi-colon is necessary with the braces whereas it is not with parentheses.
This robust and portable way has already appeared in comments. It should be a standalone answer.
printf '%s' "$var" | my_cmd
or
printf '%s\n' "$var" | my_cmd
Notes:
It's better than echo; the reasons are here: Why is printf better than echo?
printf "$var" is wrong. The first argument is format where various sequences like %s or \n are interpreted. To pass the variable right, it must not be interpreted as format.
Usually variables don't contain trailing newlines. The former command (with %s) passes the variable as it is. However, tools that work with text may ignore or complain about an incomplete line (see Why should text files end with a newline?). So you may want the latter command (with %s\n), which appends a newline character to the content of the variable. Non-obvious facts:
Here string in Bash (<<<"$var" my_cmd) does append a newline.
Any method that appends a newline results in non-empty stdin of my_cmd, even if the variable is empty or undefined.
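A quick way to verify both facts, using wc -c to count the bytes that reach standard input:
unset var
printf '%s' "$var" | wc -c   # 0: nothing is written, so the command would see empty stdin
wc -c <<< "$var"             # 1: the here string still appends a newline, so stdin is non-empty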
I liked Martin's answer, but it has some problems depending on what is in the variable. This
your-command <<< """$your_variable"""
is better if your variable contains " or !.
As per Martin's answer, there is a Bash feature called Here Strings (which itself is a variant of the more widely supported Here Documents feature):
3.6.7 Here Strings
A variant of here documents, the format is:
<<< word
The word is expanded and supplied to the command on its standard
input.
Note that Here Strings would appear to be Bash-only, so, for improved portability, you'd probably be better off with the original Here Documents feature, as per PoltoS's answer:
( cat <<EOF
$variable
EOF
) | cmd
Or, a simpler variant of the above:
(cmd <<EOF
$variable
EOF
)
You can omit ( and ), unless you want to have this redirected further into other commands.
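For example, with further redirection the parentheses keep the here document and the command grouped so the whole thing can be piped on (other_cmd is just a placeholder):
( cmd <<EOF
$variable
EOF
) | other_cmd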
Try this:
echo "$variable" | command
If you came here from a duplicate, you are probably a beginner who tried to do something like
"$variable" >file
or
"$variable" | wc -l
where you obviously meant something like
echo "$variable" >file
echo "$variable" | wc -l
(Real beginners also forget the quotes; usually use quotes unless you have a specific reason to omit them, at least until you understand quoting.)
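A quick illustration of why the quotes matter (the value is hypothetical, chosen to show the damage):
variable='*  two  spaces  and  a  glob'
echo $variable      # unquoted: the value is word-split and the * is expanded against filenames
echo "$variable"    # quoted: the value is passed through exactly as it is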
I have a variable with some lines in it and I would like to pad it with a number of newlines defined in another variable. However it seems that the subshell may be stripping the trailing newlines. I cannot just use '\n' with echo -e as the lines may already contain escaped chars which need to be printed as is.
I have found I can print an arbitrary number of newlines using this.
n=5
yes '' | sed -n "1,${n}p;${n}q"
But if I run this in a subshell to store it in the variable, the subshell appears to strip the trailing newlines.
I can approximate the functionality but it's clumsy and due to the way I am using it I would much rather be able to just call echo "$var" or even use $var itself for things like string concatenation. This approximation runs into the same issue with subshells as soon as the last (filler) line of the variable is removed.
This is my approximation
n=5
var="test"
#I could also just set n=6
cmd="1,$((n+1))p;$((n+1))q"
var="$var$(yes '' | sed -n $cmd; echo .)"
#Now I can use it with
echo "$var" | head -n -1
Essentially I need a good way of appending a number of newlines to a variable which can then be printed with echo.
I would like to keep this POSIX compliant if at all possible, but at this stage a bash solution would also be acceptable. I am also using this as part of a tool for which I have set a challenge of minimizing line and character count while maintaining readability, but I can work that out once I have a workable solution.
Command substitutions with either $( ) or backticks will trim trailing newlines. So don't use them; use the shell's built-in string manipulation:
n=5
var="test"
while [ "$n" -gt 0 ]; do
var="$var
"
n=$((n-1))
done
Note that there must be nothing after the var="$var (before the newline), and nothing before the " on the next line (no indentation!).
A sequence of n newlines:
printf -v spaces "%*s" $n ""
newlines=${spaces// /$'\n'}
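Combined with the variable from the question, that could look like this bash sketch (reusing var and n from your example):
n=5
var="test"
printf -v spaces "%*s" "$n" ""     # n spaces
var="$var${spaces// /$'\n'}"       # append n newlines to var
printf '%s' "$var" | wc -l         # prints 5: the trailing newlines survive, no command substitution needed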
I am using the bash shell and want to execute a command that takes filenames as arguments; say the cat command. I need to provide the arguments sorted by modification time (oldest first) and unfortunately the filenames can contain spaces and a few other difficult characters such as "-", "[", "]". The files to be provided as arguments are all the *.txt files in my directory. I cannot find the right syntax. Here are my efforts.
Of course, cat *.txt fails; it does not give the desired order of the arguments.
cat `ls -rt *.txt`
The `ls -rt *.txt` gives the desired order, but now the blanks in the filenames cause confusion; they are seen as filename separators by the cat command.
cat `ls -brt *.txt`
I tried -b to escape non-graphic characters, but the blanks are still seen as filename separators by cat.
cat `ls -Qrt *.txt`
I tried -Q to put entry names in double quotes.
cat `ls -rt --quoting-style=escape *.txt`
I tried this and other variants of the quoting style.
Nothing that I've tried works. Either the blanks are treated as filename separators by cat, or the entire list of filenames is treated as one (invalid) argument.
Please advise!
Using --quoting-style is a good start. The trick is in parsing the quoted file names. Backticks are simply not up to the job. We're going to have to be super explicit about parsing the escape sequences.
First, we need to pick a quoting style. Let's see how the various algorithms handle a crazy file name like "foo 'bar'\tbaz\nquux". That's a file name containing actual single and double quotes, plus a space, tab, and newline to boot. If you're wondering: yes, these are all legal, albeit unusual.
$ for style in literal shell shell-always shell-escape shell-escape-always c c-maybe escape locale clocale; do printf '%-20s <%s>\n' "$style" "$(ls --quoting-style="$style" '"foo '\''bar'\'''$'\t''baz '$'\n''quux"')"; done
literal <"foo 'bar' baz
quux">
shell <'"foo '\''bar'\'' baz
quux"'>
shell-always <'"foo '\''bar'\'' baz
quux"'>
shell-escape <'"foo '\''bar'\'''$'\t''baz '$'\n''quux"'>
shell-escape-always <'"foo '\''bar'\'''$'\t''baz '$'\n''quux"'>
c <"\"foo 'bar'\tbaz \nquux\"">
c-maybe <"\"foo 'bar'\tbaz \nquux\"">
escape <"foo\ 'bar'\tbaz\ \nquux">
locale <‘"foo 'bar'\tbaz \nquux"’>
clocale <‘"foo 'bar'\tbaz \nquux"’>
The ones that actually span two lines are no good, so literal, shell, and shell-always are out. Smart quotes aren't helpful, so locale and clocale are out. Here's what's left:
shell-escape <'"foo '\''bar'\'''$'\t''baz '$'\n''quux"'>
shell-escape-always <'"foo '\''bar'\'''$'\t''baz '$'\n''quux"'>
c <"\"foo 'bar'\tbaz \nquux\"">
c-maybe <"\"foo 'bar'\tbaz \nquux\"">
escape <"foo\ 'bar'\tbaz\ \nquux">
Which of these can we work with? Well, we're in a shell script. Let's use shell-escape.
There will be one file name per line. We can use a while read loop to read a line at a time. We'll also need IFS= and -r to disable any special character handling. A standard line processing loop looks like this:
while IFS= read -r line; do ... done < file
That "file" at the end is supposed to be a file name, but we don't want to read from a file, we want to read from the ls command. Let's use <(...) process substitution to swap in a command where a file name is expected.
while IFS= read -r line; do
# process each line
done < <(ls -rt --quoting-style=shell-escape *.txt)
Now we need to convert each line with all the quoted characters into a usable file name. We can use eval to have the shell interpret all the escape sequences. (I almost always warn against using eval but this is a rare situation where it's okay.)
while IFS= read -r line; do
eval "file=$line"
done < <(ls -rt --quoting-style=shell-escape *.txt)
If you wanted to work with one file at a time, we'd be done. But you want to pass all the file names at once to another command. To get to the finish line, the last step is to build an array with all the file names.
files=()
while IFS= read -r line; do
eval "files+=($line)"
done < <(ls -rt --quoting-style=shell-escape *.txt)
cat "${files[#]}"
There we go. It's not pretty. It's not elegant. But it's safe.
Does this do what you want?
for i in $(ls -rt *.txt); do echo "FILE: $i"; cat "$i"; done
I've just written a bash script that takes some info from the mysql database and reads it line by line, extracting tab-separated columns into separate variables, something like this:
oldifs=$IFS
result=result.txt
$mysql -e "SELECT id,foo,bar,baz FROM $db.$table" -u $user --password=$pass -h $server > $result
cat $result | grep -e ^[0-9].*$ | while IFS=$'\t' read id foo bar baz
do
# some code
done
IFS=$oldifs
Now, while this works OK and I'm satisfied with the result (especially since I'm going to move the query to another script and let cron regenerate the result.txt file contents once a week or so, since I'm dealing with a table that changes maybe once or twice a year), I'm curious about the possibility of putting the query's result in a variable instead of a file.
I have noticed that in order to echo out backslash-escaped characters, I need to tell the command explicitly to interpret such characters as special chars:
echo -e "some\tstring\n"
But, being the bash noob that I am, I have no idea how to place the backslash-escaped characters (the tabs and newlines from the query) inside a variable and just work with it the same way I'm working with the external file (just replacing the cat with echo -e). I tried this:
result=`$mysql -e "SELECT id,foo,bar,baz FROM $db.$table" -u $user --password=$pass -h $server`
but the backslash escaped characters are converted into spaces this way :(. How can I make it work?
To get the output of a command, use $(...). To avoid word splitting and other bash processing, you will need to quote it. Single quotes ('$(...)') will not work, as the quoting is too strong and the command substitution would not happen at all.
Note that once the output is in your variable, you will probably need to (double) quote it wherever you use it if you need to preserve anything that's in $IFS.
$ listing="$(ls -l)"
$ echo "$listing"
Could you try putting double quotes around $result, i.e. echo -e "$result"?
% awk '/^[0-9]/ { print $2, $3, $4, $5 }' <<SQL | set -- -
> $("${mysql}" -e "SELECT id,foo,bar,baz FROM $db.$table" -u $user --password=$pass -h $server)
> SQL
% printf '%s\t' "${@}"
<id> <foo> <bar> <baz>
You might get some use out of this. The heredoc should obviate any escaping issues, awk will separate on tabs by default, and set accepts the input as a builtin argv array. printf isn't necessary, but it's better than echo - especially when working with escape characters.
You could also use read as you did above - but to better handle backslashes use the -r argument if you go that route. The above method would work best as a function and you could then iterate over your variables with shift and similar.
-Mike