sed - How to ignore many special characters together in a string - shell

I have many shell commands in a large shell script and I want to comment out many of them. For example:
exec_cmd "mkdir -p $dockerHome/devicemapper/devicemapper"
I am able to replace this command with:
sed -i -e "s/exec_cmd \"mkdir \-p \$dockerHome\/devicemapper\/devicemapper\"/\#exec_cmd \"mkdir \-p \$dockerHome\/devicemapper\/devicemapper\"/g" check
Now, there are quite a few commands like this in the file. Is there a way to write a sed command that ignores all the special characters at once? Otherwise the only alternative seems to be putting a backslash in front of every special character in the strings to be replaced.

Perl to the rescue.
Its quotemeta function quotes all the metacharacters for you.
#! /usr/bin/perl -pl
use warnings;
BEGIN {
    $regex = join '|', map quotemeta, split /\n/, << '__LIST__';
exec_cmd "mkdir -p $dockerHome/devicemapper/devicemapper"
another command to be commented
__LIST__
    $regex = qr/^($regex)$/;
}
s/$regex/$1 ? "# $1" : $_/e
Just fill in the script lines to be commented out before the __LIST__ terminator. Save it to a file and run it as
perl script.pl input-file > output-file
To edit the input file directly, you can use the -i option.
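For example, to comment out the matching lines of the file check from the question, editing it in place (with or without a backup):
perl -i script.pl check
perl -i.bak script.pl check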

Related

Perl one-liner substitution without further expansion of a shell variable?

I have a script to replace a specific email address in various files. The replacement address is the first parameter to the script:
#!/bin/bash
perl -pi -e s/'name\@domain\.org'/$1/ file-list
This doesn't work, as the @ character in $1 is interpolated by perl. Is there a straightforward fix for this? Running the script as subst foo\@bar.com, subst foo\\@bar.com, subst "foo@bar.com", and so on, doesn't work. Is there a sed script that could handle this more easily?
Instead of expanding a shell variable directly in the perl code, you can pass it in as an argument by using the -s switch:
#!/usr/bin/env bash
perl -i -spe 's/name\@domain\.org/$replacement/' -- -replacement="$1" file1.txt file2.txt
In perl's s///, without the e or ee modifiers, the contents of variables interpolated into the replacement part are inserted literally, so you don't need to escape them.
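As a quick illustration (with made-up values), a replacement containing /, $ and @ is inserted untouched:
printf 'name@domain.org\n' |
perl -spe 's/name\@domain\.org/$replacement/' -- -replacement='a/b$c@d.org'
This prints a/b$c@d.org.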
This works, but requires you to pass the new mail address to the script with the @ character preceded by \\:
#!/bin/bash
perl -pi -e "s/name\#domain.org/$1/" file-list
If the script is subst, run as:
subst newname\\@example.com
This is a better alternative, which uses sed to carry out the escaping:
#!/bin/bash
ADR="$(echo "$1" | sed -e 's/#/\\\#/')"
perl -pi -e "s/name\#domain.org/$ADR/" file-list
Of course, in this case it's probably better to use sed to do the whole thing.
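A sketch of that sed-only variant (it assumes GNU sed for -i, and file-list again stands for the files to edit); the inner sed escapes the characters that are special in a sed replacement (\, / and &):
#!/bin/bash
ADR="$(printf '%s' "$1" | sed -e 's/[\/&]/\\&/g')"
sed -i -e "s/name@domain\.org/$ADR/g" file-list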

How to replace a hashtag curly bracket string with an environment variable by using sed

I have been trying to write a bash script that searches recursively in a directory and replaces multiple strings, e.g. #{DEMO_STRING_1}, with an environment variable, e.g. $sample1.
Full script:
#!/bin/sh
find /my/path/here -type f -name '*.js' -exec sed -i \
    -e 's/#{DEMO_STRING_1}/'"$sample1"'/g' \
    -e 's/#{DEMO_STRING_2}/'"$sample2"'/g' \
    -e 's/#{DEMO_STRING_3}/'"$sample3"'/g' \
    -e 's/#{DEMO_STRING_4}/'"$sample4"'/g' \
    -e 's/#{DEMO_STRING_5}/'"$sample5"'/g' \
    -e 's/#{DEMO_STRING_6}/'"$sample6"'/g' \
    -e 's/#{DEMO_STRING_7}/'"$sample7"'/g' \
    -e 's/#{DEMO_STRING_8}/'"$sample8"'/g' \
    {} +
I cannot figure out how to replace the strings with a hash sign and curly brackets.
I tried the examples from "sed find and replace with curly braces" and "Environment variable substitution in sed", but I cannot figure out how to combine them.
What am I missing? I also searched for the characters that need to be escaped, e.g. "What characters do I need to escape when using sed in a sh script?", but not for the ones I actually need.
The specific format is throwing the following error:
sed: bad option in substitution expression
Where am I going so wrong?
Update: Sample of environment variables:
https://www.example.com
/sample string/
12345-abcd-54321-efgh
base64 string
All the cases above are values of environment variables that I would like to use as replacements. All the environment variables are within double quotes.
It is important to understand that the environment variable references are expanded by the shell, as it prepares to execute the command, not by the command itself (sed in this case). The command sees only the results of the expansions.
In your case, that means that if any of the environment variables' values contain characters that are meaningful to sed in context, such as unescaped (to sed) slashes (/), then sed will attribute special significance to them instead of interpreting them as ordinary characters. For example, given a sed command such as
sed -e "s/X/${var}/" <<EOF
Replacement: X
EOF
, if the value of $var is Y then the output will be
Replacement: Y
, but if the value of $var is /path/to/Y then sed will fail with the same error you report. This happens because the sed command actually run is the same as if you had typed
sed -e s/X//path/to/Y
, which contains an invalid s instruction. Probably the best alternative would be to escape the replacement-string characters that otherwise would be significant to sed. You can do that by interposing a shell function:
escape_replacement() {
    # Replace all \ characters in the first argument with double backslashes.
    # Note that in order to do that here, we need to escape them from the shell.
    local temp=${1//\\/\\\\}
    # Replace all & characters with \&
    temp=${temp//&/\\&}
    # Replace all / characters with \/, and write the result to standard out.
    # Use printf instead of echo to avoid edge cases in which the value to print
    # is interpreted to be or start with an option.
    printf -- "%s" "${temp//\//\\/}"
}
Then the script would use it like this:
find /my/path/here -type f -name '*.js' -exec sed -i \
-e 's/#{DEMO_STRING_1}/'"$(escape_replacement "$sample1")"'/g' \
...
Note that you probably also want to use a shebang line that explicitly specifies a shell that supports substitution references (${parameter/pattern/replacement}), because these are not required by POSIX, and you might run into a system where /bin/sh is a shell that does not support them. If you're willing to rely on Bash then that should be reflected in your shebang line. Alternatively, you could prepare a version of the escape_replacement function that does not rely on substitution references.
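For instance, such a variant could delegate the escaping to sed; a sketch, assuming the values contain no newlines:
escape_replacement() {
    # Prefix \, / and & with a backslash so sed treats them as ordinary
    # characters in the replacement text.
    printf '%s' "$1" | sed -e 's/[\/&]/\\&/g'
}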
If you use perl, you don't need to escape anything.
With your shell variable exported you can access it via $ENV{name} inside perl.
Examples:
samples=(
    https://www.example.com
    '/sample string/'
    12345-abcd-54321-efgh
    'base64 string'
    $'multi\nline'
)
for sample in "${samples[@]}"
do
    echo '---'
    export sample
    echo 'A B #{DEMO_STRING_1} C' |
        perl -pe 's/#{DEMO_STRING_1}/$ENV{sample}/g'
done
echo '---'
Output:
---
A B https://www.example.com C
---
A B /sample string/ C
---
A B 12345-abcd-54321-efgh C
---
A B base64 string C
---
A B multi
line C
---
To edit the files in place, add the -i option: perl -pi -e 's///'
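Applied to the find command from the question, that could look like this (a sketch; it assumes the sample variables are exported and shows only the first two substitutions):
export sample1 sample2
find /my/path/here -type f -name '*.js' -exec perl -pi -e \
    's/#\{DEMO_STRING_1\}/$ENV{sample1}/g; s/#\{DEMO_STRING_2\}/$ENV{sample2}/g' {} +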

replace all $ signs by `jQuery` with bash

I need to write a bash script that will first combine a few files into one and then replace all $ signs with the string jQuery. I'm doing well until the replacing part:
#!/usr/bin/env bash
# Run this script in this directory to compile the fixed query widget
# where we'll write the output to
OUT="compliled.js"
# path to dependencies
# combine all the dependencies into a single file in the right order (without jquery)
cat first_component.js <(echo) \
    second_component.js <(echo) \
    third_component.js <(echo) \
    > $OUT
#replace all $ with jQuery
sed -i 's/$/jQuery/g' $OUT
But this way the result is the string jQuery at the end of each line, with all the $'s untouched. Could someone please explain to me what is going on here and how to fix it?
$ is a special character denoting the end of each line in regular expressions, so you should escape it with a backslash:
sed -i 's/\$/jQuery/g' $OUT
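To see the difference (an illustrative one-liner, not part of the build script):
printf '%s\n' 'var x = $("#id");' | sed 's/$/jQuery/g'    # appends jQuery to the end of the line
printf '%s\n' 'var x = $("#id");' | sed 's/\$/jQuery/g'   # prints: var x = jQuery("#id");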

replace a variable content with another variable content in a file using perl

I am writing a shell script that uses perl to replace the content of a variable, which is an integer, with the content of another.
#!/bin/sh
InitialFileStep="$1"
CurrentStage=$((($i*(9*4000))+$InitialFileStep))
PreviousStage=$((($(($(($i-1))*(9*4000)))) + (InitialFileStep)))
perl -pi -e 's/$PreviousStage/$CurrentStage/g' file.txt
echo "Hey!"
It seems it cannot find the variable content in the file.
I don't know what the problem is; is it because the variables are integers and not strings?
The shell does not interpret anything inside single quotes ('$ThisIsASimpleStringForTheShell'), so try:
perl -pi -e 's/'"$PreviousStage"'/'"$CurrentStage"'/g' file.txt
The double quotes prevent possible spaces in the values from messing up your command.
Mixing single and double quotes also lets you add regex operators to the command without the shell interpreting them before perl does. This command substitutes the contents of $PreviousStage with the contents of $CurrentStage only if the former is alone on a line:
perl -pi -e 's/^'"$PreviousStage"'$/'"$CurrentStage"'/g' file.txt
The variables only exist in your shell script; you can't directly use them in your Perl script.
I hate attempting to generate Perl code from the shell as the previous solutions suggest. That way madness lies (though it works here since you're just dealing with integers). Instead, pass the values as arguments or some other way.
perl -i -pe'BEGIN { $S = shift; $R = shift; } s/\Q$S/$R/g' \
"$PreviousStage" "$CurrentStage" file.txt
or
export PreviousStage
export CurrentStage
perl -i -pe's/\Q$ENV{PreviousStage}/$ENV{CurrentStage}/g' file.txt
or
S="$PreviousStage" R="$CurrentStage" perl -i -pe's/\Q$ENV{S}/$ENV{R}/g' file.txt

perl inside bash: How to call perl on a script saved in a string

I need to execute the same perl script multiple times on different files.
To ease the process, I am trying to save the perl script as a bash string
and call perl on that string, as in the "doesn't work" part of the code below:
#!/bin/sh
# works
perl -e 'print 1;'
# doesn't work
S="'print 1;'"
perl -e $S
perl -e $S
I get the following output:
1Can't find string terminator "'" anywhere before EOF at -e line 1.
Can't find string terminator "'" anywhere before EOF at -e line 1.
What am I doing wrong here? Can I achieve the same effect in some other way?
You simply have too many quotes in your string $S:
#!/bin/sh
# works
perl -e 'print 1;'
# also works
S='print 1;'
perl -e "$S"
I have also added some double quotes around "$S", which prevents problems with word splitting.
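The quotes start to matter once the stored code contains whitespace (an illustrative snippet):
S='print "hello\n";'
perl -e "$S"    # prints hello
perl -e $S      # after word splitting, the -e argument is just "print", and nothing is printed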
Another option is to use the -x switch to Perl:
#!/bin/sh
perl -x "$0"
echo <<EOF >/dev/null
#!/usr/bin/env perl
my $a=5;
print "$a\n";
__END__
EOF
echo 'something else'
$0 is the name of the current script, so Perl looks for the first line starting with #! and containing perl, and interprets everything from there up to __END__ as a Perl script. The here-document fed to echo (with its output discarded to /dev/null) keeps the shell from trying to execute the embedded Perl lines itself.
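Assuming the script above is saved as embedded.sh (a made-up name), running it looks roughly like this:
$ sh embedded.sh
5
something else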
