calling and changing a file using sed command within a function - bash

Hi, I have wrapped a sed command (which works on its own) within a shell function.
#!/bin/bash
snp2fasta() {
sed -i "s/^\(.\{'$2'\}\)./\1'$3'/" $1;
}
and call it with
$ ./snp2fasta input.txt 45 A
no changes are made to input.txt
However if I simply do
$ sed -i 's/^\(.\{45\}\)./\1A/' input.txt
then this works and the file is changed by changing the 45th character to an A.
However when wrapping into a shell script (to handle command line variables) the shell script snp2fasta.sh runs fine, but no changes are made to the file.
why is this?

If you put it into a script, there is no need for the function wrapper any more; use the sed command directly inside the script. As stated in the related post (Use argument to...), assign the positional parameters to named variables first (to make the $1, $2 and $3 content unambiguous):
#!/bin/bash
# argument passed to script (or any other source if needed like intern to script)
File=$1
Place=$2
NewChar=$3
# sed with unambiguous variable content
sed -i "s/^\(.\{${Place}\}\)./\1${NewChar}/" ${File}
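To see why the original quoting fails: inside the double-quoted sed program, the single quotes around '$2' and '$3' are not shell quoting at all; they become literal quote characters in the regex, so nothing matches. A minimal sketch of the fixed form (assuming GNU sed for -i; demo.txt is just a scratch file):

```shell
# Scratch demo: replace the character following the first N characters.
printf 'ABCDEFGHIJ\n' > demo.txt

Place=4
NewChar=X
# Plain double quotes let the shell expand the variables into the regex.
sed -i "s/^\(.\{${Place}\}\)./\1${NewChar}/" demo.txt
cat demo.txt   # ABCDXFGHIJ: the 5th character was replaced
```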

Related

Perl replace content of a file with variable

I have an a.md file:
Start file
```
replace content here by a variable or result of a command such as pwd,
the whole content in this code block should be something like /Users/danny/MyProjects/CurrentDir
```
End file
And a script in update.sh to update file above:
#!/bin/sh
t=$(pwd)
echo $t
perl -i -p0e 's/(?<=```shell\n)[\s\S]*(?=\n```)/$t/s' a.md
It works fine with a string but I cannot replace with a variable such as $(pwd).
A Perl command-line program in that shell script cannot use the shell's variables directly; we need to pass them to it somehow.
There are a few ways to do that,† and perhaps using the -s switch is simplest here
#!/bin/bash
t=$(pwd)
echo $t
perl -i -0777 -s -pe's/(?<=```shell\n)[\s\S]*(?=\n```)/$d/s' -- -d="$t" a.md
# or
# perl -i -0777 -s -pe'...' -- -d="$(pwd)" a.md
The -s switch for perl enables basic support for command-line switches for the program itself.
With it, -d=... becomes available as $d in the program, either with the assigned value (here the value of the bash variable $t) or 1 if no value is assigned (a bare -d). We can pass multiple variables this way, each in its own switch.
The -- after the program marks the start of its arguments.
We do not even need a shell variable; we can use command output directly, as in -d="$(pwd)".
† Aside from using storage (files, databases, etc) or pipes, there are two more ways to directly pass arguments given to this command-line ("one-liner") program
Pass the variable as an argument and read it from @ARGV in the program
t=$(pwd)
perl -i -0777 -pe'BEGIN { $d = shift }; s/.../$d/s' "$t" a.md
We need to also remove it from @ARGV so that the files can then be processed, which is what shift does, and need to do this in a BEGIN block since -p sets a loop.
Export the variable in bash, making it an environment variable; a Perl script can then use it via the %ENV hash
export t=$(pwd)
perl -i -0777 -pe's/.../$ENV{t}/s' a.md
Note, it's t in the %ENV hash ($ENV{t}), the name of the variable, not $t (value)

bash: filenames as parameter, perform action in cycle

My current script goes like:
#!/bin/bash
victims=*asci*
for f in $victims ; do
awk /some blah blah here/ ;done
so basically takes all files containing ascii in their name and performs an action on them.
I wanted, however, the filenames be entered as a parameter. Like:
bash myscript.sh *log* for example.
When using
#!/bin/bash
victims="$1"
for f in $victims ; do
awk /some blah blah here/ ;done
it doesn't do what I expected; it performs the action only on the first file (as far as I remember).
May I ask for help? I want the script to perform an action over a bunch of files whose names contain the parameter. I'm not very experienced in bash, honestly. Thanks, cheers!
If you're just calling awk then you don't even need the for loop. Just pass it all of the file names at once.
awk '/whatever/' "$@"
If you do want to loop over all the command-line arguments, write:
for f in "$@"; do
...
done
Or, since in "$@" is implied:
for f; do
...
done
If you want to store them in an intermediate variable, you need to use an array:
victims=("$@")
for f in "${victims[@]}"; do
...
done
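The difference can be seen with a small function standing in for the script (count_args is just an illustrative name):

```shell
# count_args plays the role of myscript.sh: it reports what it receives.
count_args() {
    echo "first: $1"    # only the first argument
    echo "count: $#"    # number of arguments
    for f in "$@"; do   # every argument, safely quoted
        echo "file: $f"
    done
}

count_args a.log another.log logmore.txt   # "first:" shows only a.log, "count:" shows 3
```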
Also, you should avoid explicitly invoking bash. Run the script directly, so it uses whatever shell is listed in its shebang line. Instead of:
bash myscript.sh *log*
write:
./myscript.sh *log*
You need to watch out how you call your script. Suppose your script myscript.sh is simply
victims="$1"
echo "$victims"
and your cwd contains files a.log, another.log and logmore.txt.
Then, executing
myscript.sh *log*
will simply print
a.log
because *log* is expanded by the shell before myscript.sh is called. In fact, you're executing
myscript.sh a.log another.log logmore.txt
and your script only handles the first parameter. Another quirk: when your cwd contains no file with "log" in its name, your script will print:
*log*
So, your call should be:
myscript.sh "*log*"
and your script should handle the fact that its input may be a glob pattern instead of an existing filename.
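A sketch of that quoted-pattern variant (run in a scratch directory; the variable is deliberately left unquoted in the for list so the glob expands inside the script):

```shell
# Simulate: myscript.sh "*log*" with the expansion done by the script itself.
dir=$(mktemp -d) && cd "$dir"
touch a.log another.log logmore.txt notes.md

victims="*log*"          # what the script receives as "$1"
for f in $victims; do    # unquoted on purpose: the glob expands here
    echo "processing: $f"
done                     # three files match; notes.md is skipped
```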

Procmail - passing body to bash script

I have one liner mails that I wish to send from procmail into a bash script. I only want the body to be sent, nothing else.
Currently my .procmailrc looks like this:
:0
* ^Subject.*Xecute Command$
{
:0 bf
| /bin/bash /path/to/script
}
And my Bash script is simple:
#!/bin/bash
echo -e "\rLT 4>$0\r\r" > /dev/ttyS1
I don't get any input or output from anywhere.
Any pointers?
If the intention is to add some decorations to the email message and print it to a serial port (?), try a recipe like
:0b
* ^Subject.*Xecute Command$
| ( printf '\rLT 4>'; cat -; printf '\r\r' ) > /dev/ttyS1
The b flag applies to the action line if the condition matches, so you don't need the braces and a new conditionless recipe; the f flag makes no sense at all in this context. (Though if you want to keep the message for further processing, you'll want to add a c flag.)
Also, for the record, $0 in Bash is the name of the currently running script (or bash itself, if not running a script), and "$@" is the list of command-line arguments. But Procmail doesn't use either of these when it pipes a message to a script; the message is simply made available on standard input.
If you want the action in an external script, that's fine, too, of course; but a simple action like this is probably better written inline. You don't want or need to specify bash explicitly if the script file is executable and has a proper shebang; the reason to have a shebang is to make the script self-contained.
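The decoration pipeline itself can be tried outside procmail by feeding a one-line body on stdin and substituting an ordinary file for /dev/ttyS1:

```shell
# out.txt stands in for /dev/ttyS1; the subshell mirrors the recipe's action.
printf 'hello world\n' | ( printf '\rLT 4>'; cat -; printf '\r\r' ) > out.txt

od -c out.txt   # inspect the raw bytes, \r characters included
```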
In response to comments, here is a simple Perl script to extract the first line of the first body part, and perform substitutions.
perl -nle 'next if 1../^$/;
s/\bfoo\b/bar/g;
print "\rLT 4>$_\r\r"; exit(0)'
This would not be hard to do in sed either, provided your sed understands \r.
Write your script like this:
{
echo -e "\rLT 4>"
cat
echo -e "\r\r"
} > /dev/ttyS1
formail is your friend!
Pipe the message into:
:0
* ^Subject.*Xecute Command$
| formail -I "" | your-bash-script

How to pass a shell script argument as a variable to be used when executing grep command

I have a file called fruit.txt which contains a list of fruit names (apple, banana, orange, kiwi, etc.). I want to create a script that lets me pass an argument when calling it, i.e. script.sh orange, which will then search fruit.txt for the variable (orange) using grep. I have the following script...
script name and argument as follows:
script.sh orange
script snippet as follows:
#!/bin/bash
nameFind=$1
echo `cat fruit.txt` | grep | $nameFind
But I get the grep info usage command and it seems that the script is awaiting some additional command etc. Advice greatly appreciated.
The piping syntax is incorrect there. You are piping the output of grep into the variable name nameFind, and grep itself receives only the contents of fruit.txt on standard input, with no pattern argument; that is why it prints its usage message. Do this instead:
#!/bin/bash
nameFind=$1
grep "$nameFind" fruit.txt
Something like this should work:
#!/bin/bash
name="$1"
grep "$name" fruit.txt
There's no need to use cat and grep together; you can simply pass the name of the file as an argument after the pattern to be matched. If you want to match fixed strings (i.e. no regular expressions), you can also use the -F option:
grep -F "$name" fruit.txt
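The difference -F makes shows up as soon as the search string contains a regex metacharacter (fruits_demo.txt below is just a scratch file):

```shell
printf 'a.b\naxb\n' > fruits_demo.txt

grep -c 'a.b' fruits_demo.txt      # 2: the dot matches any character
grep -c -F 'a.b' fruits_demo.txt   # 1: with -F the dot is literal
```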

Passing multiple arguments in a bash script

The simple script below does not work when, rather than passing a single file name, I want to pass multiple files through expansion characters like *
#!/bin/bash
fgrep -c '$$$$' $1
If I give the command script.sh file.in the script works. If I give the command script.sh *.in it doesn't.
Use "$@" to pass multiple file names to fgrep. $1 only passes the very first file name.
fgrep -c '$$$$' "$@"
