I'm trying to write a bash script that iterates over the arguments and builds a string like the following:
Usage:
./myScript a b c d
Expected output:
-e "a" -e "b" -e "c" -e "d"
The script looks like the following:
#!/bin/bash
pattern=""
for arg in "$@"; do
pattern=$pattern" -e \"$arg\""
done
echo $pattern
The actual output is missing the first -e, i.e., the output is:
"a" -e "b" -e "c" -e "d"
What am I doing wrong? What is the correct way to append -e?
You are doing nothing wrong. It is just that echo takes -e as one of its own options.*
$ pattern='-e asdf -e ghjk'
$ echo $pattern
asdf -e ghjk
If you quote the variable it works as expected.
$ echo "$pattern"
-e asdf -e ghjk
* man echo
-e enable interpretation of backslash escapes
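Applied to the original script, the only change needed is to quote the expansion when echoing it (a minimal sketch of the same fix):
#!/bin/bash
pattern=""
for arg in "$@"; do
pattern=$pattern" -e \"$arg\""
done
# quoted, so echo receives a single argument and does not treat -e as its own option
echo "$pattern"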
I'm trying to count the number of lines that match a regex pattern with grep, where the pattern is received in a variable. When I put the pattern directly in the command substitution, it works. When I use a variable for the pattern, it doesn't.
#!/bin/bash
pattern="'^\\\".*\\\"$'"
echo "pattern : $(echo $pattern)"
NB=$(grep -c -E -v -e ${pattern} abc.txt)
NB2=$(grep -v -c -E -e '^\".*\"$' abc.txt)
echo " -> $NB , $NB2"
Besides what's in the code, I've tried:
NB=$(grep -c -E -v -e $(echo $pattern) abc.txt)
No success.
cmd="grep -c -E -v -e ${pattern} abc.txt"
NB="$($cmd)"
No success.
In the example, the abc.txt file contains 3 lines:
"abc"
"abc
abc"
The pattern in the variable seems ok:
pattern : '^\".*\"$'
I'm expecting the two numbers NB and NB2 to be the same. With the code above, the actual result is:
pattern : '^\".*\"$'
-> 3 , 2
I expect:
pattern : '^\".*\"$'
-> 2 , 2
NB2=$(grep -v -c -E -e '^\".*\"$' abc.txt)
If that works, then assign that exact regex to $pattern. Don't add more backslashes and quotes.
pattern='^\".*\"$'
It's always a good idea to quote variable expansions to prevent unwanted wildcard expansion and word splitting.
NB=$(grep -c -E -v -e "${pattern}" abc.txt)
#                     ^          ^
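Putting the pieces together, a corrected version of the script could look like this (a sketch, assuming the same abc.txt as above):
#!/bin/bash
pattern='^\".*\"$'
NB=$(grep -c -E -v -e "${pattern}" abc.txt)
NB2=$(grep -v -c -E -e '^\".*\"$' abc.txt)
echo " -> $NB , $NB2"
# prints: -> 2 , 2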
I have a shell script that takes up to 5 parameters. There are files with placeholders, and I would like to apply a sed script file chosen depending on one of the variables. The problem is that when I reference variables inside the sed script, their values are not substituted into the placeholders.
#!/bin/bash
A=$1
B=$2
echo "Some string with _PH1_ place holders _PH2_"|sed -i -f script1.sed >> file.txt
One of the sed script files:
#Sed script 1
s/_PH1_/${A}/g
s/_PH2_/${B}/g
If your sed script is that short, you might as well inline it in bash:
#!/bin/bash
A=$1
B=$2
#your sed commands but still part of the bash script:
sed -i -e "s/_PH1_/${A}/g" file.txt
sed -i -e "/_PH2_/${B}/g" file.txt
You need to create the sed script as part of your bash script so the variable substitution takes place:
#!/bin/bash
A=$1
B=$2
cat > script1.sed << EOF
s/_PH1_/${A}/g
s/_PH2_/${B}/g
EOF
echo "Some string with _PH1_ place holders _PH2_"|sed -i -f script1.sed >> file.txt
$ cat tst.sh
#!/bin/bash
set -a
A="$1"
B="$2"
sedScript=$(mktemp)
printf 'sed "\n' >> "$sedScript"
cat script1.sed >> "$sedScript"
printf '"\n' >> "$sedScript"
echo "Some string with _PH1_ place holders _PH2_" | "$sedScript"
rm -f "$sedScript"
$ ./tst.sh foo bar
Some string with foo place holders bar
I would like to use brace expansion to save some typing.
My desired expansion is:
-e uncore_imc0/cas_count_read/ -e uncore_imc1/cas_count_read/ -e uncore_imc2/cas_count_read/ -e uncore_imc3/cas_count_read/ -e uncore_imc4/cas_count_read/ -e uncore_imc5/cas_count_read/ -e uncore_imc6/cas_count_read/ -e uncore_imc7/cas_count_read/
I've tried:
-e uncore_imc{0..7}/cas_count_read/
but this only expands to (with -e only at the beginning):
-e uncore_imc0/cas_count_read/ uncore_imc1/cas_count_read/ uncore_imc2/cas_count_read/ uncore_imc3/cas_count_read/ uncore_imc4/cas_count_read/ uncore_imc5/cas_count_read/ uncore_imc6/cas_count_read/ uncore_imc7/cas_count_read/
I also tried:
{-e, uncore_imc{0..7}/cas_count_read/}
and
"-e uncore_imc{0..7}/cas_count_read/"
but neither of those expands at all.
You can use printf:
printf -- '-e uncore_imc%d/cas_count_read/ ' {0..7}; echo
-e uncore_imc0/cas_count_read/ -e uncore_imc1/cas_count_read/ -e uncore_imc2/cas_count_read/ -e uncore_imc3/cas_count_read/ -e uncore_imc4/cas_count_read/ -e uncore_imc5/cas_count_read/ -e uncore_imc6/cas_count_read/ -e uncore_imc7/cas_count_read/
You can also store this expansion in a variable like this:
printf -v arg -- '-e uncore_imc%d/cas_count_read/ ' {0..7}
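The variable then has to be expanded unquoted, so the shell splits it back into separate words when it is used (someCommand is just a placeholder name):
someCommand $arg    # unquoted on purpose: relies on word splitting
Relying on word splitting is fragile, which is one reason the array approach below is more robust.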
If this is really for a script, don't use brace expansion. You only have to write the code once, so readability should be a higher priority. Create an array instead:
opts=()
for((i=0; i < 8; i++)); do
opts+=(-e "uncore_imc$i/cas_count_read/")
done
someCommand "${opts[@]}"
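To check what the array expands to, you can print each element in its own delimiters (a quick sanity check, not part of the original answer):
printf '<%s> ' "${opts[@]}"; echo
# <-e> <uncore_imc0/cas_count_read/> <-e> <uncore_imc1/cas_count_read/> ... <-e> <uncore_imc7/cas_count_read/>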
I need to substitute a unique string in a JSON file, {FILES}, with the contents of a bash variable that holds thousands of paths, ${FILES}:
sed -i "s|{FILES}|$FILES|" ./myFile.json
What would be the most elegant way to achieve that? The content of ${FILES} is the result of an "aws s3" command and would look like:
FILES="/file1.ipk, /file2.ipk, /subfolder1/file3.ipk, /subfolder2/file4.ipk, ..."
I can't think of a solution where xargs would help me.
The safest way is probably to let Bash itself expand the variable. You can create a Bash script containing a here document with the full contents of myFile.json, with the placeholder {FILES} replaced by a reference to the variable $FILES (not the contents itself). Execution of this script would generate the output you seek.
For example, if myFile.json contains:
{foo: 1, bar: "{FILES}"}
then the script should be:
#!/bin/bash
cat << EOF
{foo: 1, bar: "$FILES"}
EOF
You can generate the script with a single sed command:
sed -e '1i#!/bin/bash\ncat << EOF' -e 's/\$/\\$/g;s/{FILES}/$FILES/' -e '$aEOF' myFile.json
Notice sed is doing two replacements; the first one (s/\$/\\$/g) to escape any dollar signs that might occur within the JSON data (replace every $ by \$). The second replaces {FILES} by $FILES; the literal text $FILES, not the contents of the variable.
Now we can combine everything into a single Bash one-liner that generates the script and immediately executes it by piping it to Bash:
sed -e '1i#!/bin/bash\ncat << EOF' -e 's/\$/\\$/g;s/{FILES}/$FILES/' -e '$aEOF' myFile.json | /bin/bash
Or even better, source the script instead of running it in a separate bash process (useful if $FILES is set without export):
sed -e '1i#!/bin/bash\ncat << EOF' -e 's/\$/\\$/g;s/{FILES}/$FILES/' -e '$aEOF' myFile.json | source /dev/stdin
Output:
{foo: 1, bar: "/file1.ipk, /file2.ipk, /subfolder1/file3.ipk, /subfolder2/file4.ipk, ..."}
Maybe perl would have fewer limitations?
perl -pi -e "s#{FILES}#${FILES}#" ./myFile.json
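A variant (a sketch, not part of the original answer) that avoids interpolating the shell variable into the perl code, and so keeps working if $FILES happens to contain the # delimiter, is to read it from the environment:
export FILES
perl -pi -e 's/\{FILES\}/$ENV{FILES}/' ./myFile.json
The replacement is still subject to perl's string interpolation, so this assumes the paths contain no $, @ or backslash characters.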
It's a little gross, but you can do it all within shell...
while IFS= read -r l
do
    if ! echo "$l" | grep -q '{FILES}'
    then
        echo "$l"
    else
        echo "$l" | sed 's/{FILES}.*$//'
        echo "$FILES"
        echo "$l" | sed 's/^.*{FILES}//'
    fi
done <./myFile.json >newFile.json
#mv newFile.json myFile.json
Obviously I'd leave the final line commented until you were confident it worked...
Maybe just don't do it? Can you just do:
echo "var f = " > myFile2.json
echo "$FILES" >> myFile2.json
And reference myFile2.json from within your other json file? (You should put the global f variable into a namespace if this works for you.)
Instead of putting all those paths in an environment variable, put them in a file. Then read that file in perl:
foo.pl:
open X, "$ARGV[0]" or die "couldn't open";
shift;
$foo = <X>;
while (<>) {
s/world/$foo/;
print;
}
Command to run:
aws s3 ... >/tmp/myfile.$$
perl foo.pl /tmp/myfile.$$ <myFile.json >newFile.json
Hopefully that bypasses the limits on environment variable space and argument length by doing all of the processing inside perl itself.
I am trying to pass in a string containing a newline to a PHP script via BASH.
#!/bin/bash
REPOS="$1"
REV="$2"
message=$(svnlook log $REPOS -r $REV)
changed=$(svnlook changed $REPOS -r $REV)
/usr/bin/php -q /home/chad/www/mantis.localhost/scripts/checkin.php <<< "${message}\n${changed}"
When I do this, I see the literal "\n" rather than an actual newline:
blah blah issue 0000002.\nU app/controllers/application_controller.rb
Any ideas how to translate '\n' to a literal newline?
By the way: what does <<< do in bash? I know < passes in a file...
try
echo -e "${message}\n${changed}" | /usr/bin/php -q /home/chad/www/mantis.localhost/scripts/checkin.php
where -e enables interpretation of backslash escapes (according to man echo)
Note that this will also interpret backslash escapes which you potentially have in ${message} and in ${changed}.
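If that is a concern, printf can join the two strings with a real newline without interpreting any backslashes inside them (a sketch of an alternative, not from the original answer):
printf '%s\n%s\n' "${message}" "${changed}" | /usr/bin/php -q /home/chad/www/mantis.localhost/scripts/checkin.php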
From the bash manual:
Here Strings
A variant of here documents, the format is:
<<<word
The word is expanded and supplied to the command on its standard input.
So I'd say
the_cmd <<< word
is equivalent to
echo word | the_cmd
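A trivial example to convince yourself of the equivalence:
$ cat <<< "hello world"
hello world
$ echo "hello world" | cat
hello world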
newline=$'\n'
... <<< "${message}${newline}${changed}"
The <<< is called a "here string". It's a one line version of the "here doc" that doesn't require a delimiter such as "EOF". This is a here document version:
... <<EOF
${message}${newline}${changed}
EOF
In order to avoid interpretation of potential escape sequences in ${message} and ${changed}, try concatenating the strings in a subshell (a newline is appended after each echo unless you specify the -n option):
( echo "${message}" ; echo "${changed}" ) | /usr/bin/php -q /home/chad/www/mantis.localhost/scripts/checkin.php
The parentheses execute the commands in a subshell (if no parentheses were given, only the output of the second echo would be piped into your php program).
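A quick way to see why the grouping matters, with trivial strings in place of ${message} and ${changed}:
$ echo one; echo two | tr a-z A-Z
one
TWO
$ ( echo one; echo two ) | tr a-z A-Z
ONE
TWO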
It is better to use here-document syntax:
cat <<EOF
copy $VAR1 $VAR2
del $VAR1
EOF
You can use the magical Bash $'\n' with a here string:
cat <<< "copy $VAR1 $VAR2"$'\n'"del $VAR1"
or pipe with echo:
{ echo copy $VAR1 $VAR2; echo del $VAR1; } | cat
or with printf:
printf "copy %s %s\ndel %s" "$VAR1" "$VAR2" "$VAR1" | cat
Test it:
VAR1=1 VAR2=2; printf "copy %s %s\ndel %s" "$VAR1" "$VAR2" "$VAR1" | cat