bash script + grep + perl commands

I'm working on a bash script that greps a file and then runs a perl command. I know the commands work, since I have been using them standalone, but I can't seem to get them to work inside the bash script. I would appreciate any help.
When I echo $1 and the other positional parameters they have the right values, but when I run the grep command with them in it, I get "file can't be found" or blank output.
#! /bin/bash
usage()
{
echo "Usage: $0"
echo $'\a'Supply 4 arguments
echo $0 arg1 arg2 arg3 arg4
echo "1. search parameters for bookmarks"
echo "2. What file to grep"
echo "3. file to temporarily store results"
echo "4. what file to store the final version"
exit 1
}
main()
{
# if [ "$#" -lt "4" ]
# then
# usage
# else
echo "beginning grep"
grep -ir "$1" "$2" >> "$3"
echo "grep complete"
# echo "beginning perl command"
# perl -0ne 'print "$2\n$1\n" while (/a href=\"(.*?)\">(.*?)<\/a>/igs)' "$3" >> "$4"
# echo "perl command complete"
# echo "done"
# exit 1
# fi
}
main
echo $0
echo $1
echo $2
echo $3
echo $4

Remember that when a bash function is called, the positional parameters are temporarily replaced with the function's parameters. So either don't make your mainline a function or pass your main function the input parameters. To pass the script's parameters to your main function, do this:
main "$@"
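A minimal sketch of the pitfall: inside a function, the positional parameters are the function's own arguments, so the script's arguments are invisible unless you forward them with "$@":

```shell
#!/bin/bash
show_args() {
    # inside a function, $1, $2, ... are the function's own arguments,
    # not the script's
    echo "function sees $# argument(s)"
}

set -- alpha beta      # simulate the script being called with two arguments
show_args              # prints: function sees 0 argument(s)
show_args "$@"         # prints: function sees 2 argument(s)
```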

Related

Parse multiple echo values in bash script

I am trying to return a value from one script to another. However, the child script echoes several things, so I'm not sure how to retrieve a specific one in the parent script: if I do return_val=$(./script.sh), then return_val holds all of the output. Any solution here?
script 1:
status=$(script2.sh)
if [ $status == "hi" ]; then
echo "success"
fi
script 2:
echo "blah"
status="hi"
echo $status
Solution 1) For this specific case, you could get the last line printed by script2 using the tail -1 command, like this:
script1.sh
#!/bin/bash
status=$( ./script2.sh | tail -1 )
if [ "$status" == "hi" ]; then
echo "success"
fi
script2.sh
#!/bin/bash
echo "blah"
status="hi"
echo $status
The restriction is that it will only work for the cases where you need to check the last string printed by the called script.
Solution 2) If the previous solution doesn't apply to your case, you could also prefix the specific string that you want to check with an identifier, as shown below:
script1.sh
#!/bin/bash
status=$( ./script2.sh | grep "^IDENTIFIER: " | cut -d':' -f 2 )
if [ $status == "hi" ]; then
echo "success"
fi
script2.sh
#!/bin/bash
echo "blah"
status="hi"
echo "IDENTIFIER: $status"
The grep "^IDENTIFIER: " command will filter the strings from the called script, and the cut -d':' -f 2 will split the "IDENTIFIER: hi" string and get the second field, separated by the ':' character.
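A quick way to see what that pipeline produces. One subtlety worth noting: cut -d':' -f2 keeps the space after the colon, so $status is actually " hi"; the unquoted [ $status == "hi" ] test still matches because word splitting strips the leading space again:

```shell
#!/bin/bash
# reproduce the pipeline on a literal line instead of calling script2.sh
status=$(echo "IDENTIFIER: hi" | grep "^IDENTIFIER: " | cut -d':' -f2)
echo "[$status]"                      # prints: [ hi] - note the leading space
[ $status == "hi" ] && echo "match"   # unquoted expansion drops the space
```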
You might capture the output of script2 into a bash array and access the element in the array you are interested in.
Contents of script2:
#!/bin/bash
echo "blah"
status="hi"
echo $status
Contents of script1:
#!/bin/bash
output_arr=( $(./script2) )
if [[ "${output_arr[1]}" == "hi" ]]; then
echo "success"
fi
Output:
$ ./script1
success
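Note that the unquoted $(...) inside the array assignment splits on every run of whitespace, not just on newlines, so a multi-word line like "blah blah" would shift the indices. If you want one array element per line of output, a sketch using bash's mapfile builtin (bash 4+), with printf standing in for ./script2's output:

```shell
#!/bin/bash
# capture each LINE of output as one array element (bash 4+)
mapfile -t output_arr < <(printf 'blah blah\nhi\n')

echo "${output_arr[0]}"   # prints: blah blah
echo "${output_arr[1]}"   # prints: hi
```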
Script 1:
#!/bin/bash
cat > ed1 <<EOF
1p
q
EOF
next () {
[[ -z $(ed -s status < ed1 | grep "hi") ]] && main
[[ -n $(ed -s status < ed1 | grep "hi") ]] && end
}
main () {
sleep 1
next
}
end () {
echo $(ed -s status < ed1)
exit 0
}
# start the polling loop
main
Script 2:
#!/bin/sh
echo "blah"
echo "hi" >> status

Run script against a directory or a file

I have a working script that parses a text file and creates a new file from the output. How do I run this script against a single file OR a directory of files instead? Below is a general overview of the working script. Thank you for the help.
#!/usr/bin/env bash
if [ -f "$1" ]; then
*Run Some Commands against file* "$1" >> NewFile.txt
echo "Complete. Check NewFile.txt"
else
echo "Expected a file at $1, but it doesn't exist." >&2
fi
You could check if the passed argument is a directory and if so, write a loop to process the files in that directory:
#!/usr/bin/env bash
if (($# == 0)); then
echo "No arguments given" >&2
exit 2
fi
arg=$1
if [ -f "$arg" ]; then
*Run Some Commands against file* "$1" >> NewFile.txt
echo "Complete. Check NewFile.txt"
elif [ -d "$arg" ]; then
shopt -s nullglob
for file in "$arg"/*; do
# run command against "$file"
done
else
echo "Expected a file or directory as $1, but it doesn't exist." >&2
fi
An easier solution (which also recurses into subdirectories) is to have the script call itself:
#!/usr/bin/env bash
if [ -d "$1" ]; then
for i in "$1"/*; do
# start another instance of this script ($i already includes the directory)
"$0" "$i"
done
fi
if [ -f "$1" ]; then
*Run Some Commands against file* "$1" >> NewFile.txt
echo "Complete. Check NewFile.txt"
else
echo "Expected a file at $1, but it doesn't exist." >&2
fi

How to pass asterisk as argument to another script in bash

At first I want to assure you that I have been looking for the answer for a few hours now and I've read a lot of similar questions, but none of them solved my problem.
Straight to the point now:
I have two scripts in bash: one is a "tool" that does some stuff for me and the second one is the main "for user" script.
I want to pass various patterns (like "[A-Za-z0-9]*" or "&") to the tool script.
And here is some code:
#!/bin/bash
SET() {
wz1=`./PREP2.sh $1 $2 '[0-9A-Za-z]\*'`
wz2=`./PREP2.sh $1 $2 '&'`
echo $wz1
echo $wz2
}
SET $1 $2
Tool script is actually working if I declare patterns inside like this:
line='[0-9A-Za-z]*'
But when I pass the same pattern with
'\*'
I can't get rid of the "\" without "*" being interpreted as "show all files in the directory".
I've been trying to use eval inside the tool like this:
eval echo '$3'
But it didn't work.
Full code follow.
User script:
#!/bin/bash
SET() {
#echo '[0-9A-Za-z]*'
wzor1=$(./PREP2.sh "$1" "$2" '[0-9A-Za-z]*')
wzor2=`./PREP2.sh $1 $2 '&'`
echo $wzor1
echo $wzor2
}
SET $1 $2 $4
Tool code
#!/bin/bash
PREP2() {
#echo "$3"
wzor="`./PREP.sh $1 $2 | tee linie.txt`"
#tmp="`echo $wzor | sed 's/,/,%/'`"
#echo $tmp;
./ZAMIEN_WSZYSTKIE_WYSTAPIENIA.sh linie.txt , #%
#tmp="`echo $wzor | tr '#' '\n x' | tee linie.txt`"
tmp="`tr '#' '\n x' < linie.txt | tee linie.txt`"
llini=`echo "$tmp" | wc -l`
#echo liczba lini $llini
i=1
wzor=""
while [ $i -le $llini ];
do
linia="`eval sed -n -e $i\p linie.txt | cut -d '%' -f2`"
if [ -z "$linia" ];then
#linia='[0-9A-Za-z]*'
linia=`eval '$3'`
#echo $linia
fi
if [ $i -ne 1 ];then
#echo "kolejna wartosc"
wzor=$wzor\,$linia
else
#echo "pierwsza wartosc"
wzor=$linia
fi
i=`expr $i + 1`
done
echo $wzor
#wynik="`grep -v "$wzor" $1`"
#echo "$wynik" > $1
#echo $nowy_wpis >> $1
}
eval echo "$3"
#PREP2 $1 $2 $3
And just to clear things up: I don't actually call the procedure, because I know it works weirdly because of the arguments I put into it.
Quotes, quotes, quotes and more quotes. And prefer $() to backticks; it avoids some quoting problems.
#!/bin/bash
SET() {
wz1=$(./PREP2.sh "$1" "$2" '[0-9A-Za-z]*')
wz2=$(./PREP2.sh "$1" "$2" '&')
echo "$wz1"
echo "$wz2"
}
SET "$1" "$2"
(BTW: it's unusual to have function names all uppercase. That's usually for environment variables.)
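A small demonstration of what the quoting buys you: without the quotes, an argument that contains spaces is split back into separate words before the callee sees it:

```shell
#!/bin/bash
count_args() { echo "$#"; }

arg="two words"
count_args $arg      # unquoted: prints 2
count_args "$arg"    # quoted:   prints 1
```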

Shell quoting with quotation marks in bash

I want to implement a bash function that runs its arguments as a command, while (maybe optionally) printing the command before. Think of an installation script or test runner script.
Just using
function run () {
echo "Running $#"
"$#"
}
would not allow me to distinguish between a call to run foo arg1 arg2 and run foo "arg1 arg2", so I need to properly escape the arguments.
My best shot so far is
function run () {
echo -n "Running"
printf " %q" "$#"
echo
"$#"
}
Which works:
$ run echo "one_argument" "second argument" argument\"with\'quotes
Running echo one_argument second\ argument argument\"with\'quotes
one_argument second argument argument"with'quotes
but is not very elegant. How can I achieve an output of
$ run echo "one_argument" "second argument" argument\"with\'quotes
Running echo one_argument "second argument" "argument\"with'quotes"
one_argument second argument argument"with'quotes
i.e. how can I make printf put quotation marks around the arguments that need quoting, and properly escape quotes within them, so that the output can be copy'n'pasted correctly?
This will quote everything:
run() {
printf "Running:"
for arg; do
printf ' "%s"' "${arg//\"/\\\"}"
done
echo
"$#"
}
run echo "one_argument" "second argument" argument\"with\'quotes
Running: "echo" "one_argument" "second argument" "argument\"with'quotes"
one_argument second argument argument"with'quotes
This version only quotes arguments containing double quotes or whitespace:
run() {
local fmt arg
printf "Running:"
for arg; do
[[ $arg == *[\"[:space:]]* ]] && fmt=' "%s"' || fmt=" %s"
printf "$fmt" "${arg//\"/\\\"}"
done
echo
"$#"
}
run echo "one_argument" "second argument" argument\"with\'quotes
Running: echo one_argument "second argument" "argument\"with'quotes"
one_argument second argument argument"with'quotes
I don't think there's an elegant solution to what you want, because "$#" is handled by bash before any command ever gets to see it. You'll have to manually re-construct the command-line:
#!/bin/bash
function run() {
echo -n "Running:"
for arg in "$@"; do
arg="$(sed 's/"/\\&/g' <<<"$arg")"
[[ $arg =~ [[:space:]\\\'] ]] && arg="\"$arg\""
echo -n " $arg"
done
echo ""
"$#"
}
run "$#"
Output:
$ ./test.sh echo arg1 "arg 2" "arg3\"with'other\'\nstuff"
Running: echo arg1 "arg 2" "arg3\"with'other\'\nstuff"
arg1 arg 2 arg3"with'other\'\nstuff
Note that there are some corner cases where you won't get the exact input command line. This happens when you pass arguments that bash expands before passing them on, e.g.:
$ ./test.sh echo foo'bar'baz
Running: echo foobarbaz
foobarbaz
$ ./test.sh echo "foo\\bar"
Running: echo "foo\bar"
foobar
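As an aside not covered in the answers above: bash 4.4 added the ${var@Q} expansion, which produces a shell-quoted form of a variable's value that can be pasted back into a shell. A sketch of the same run wrapper using it; note that @Q prefers single quotes (or $'...'), so the output style differs from the double-quoted form asked for, but it round-trips correctly:

```shell
#!/bin/bash
# Requires bash 4.4+ for the ${var@Q} expansion.
run() {
    local line="Running:" arg
    for arg; do
        line+=" ${arg@Q}"    # shell-quote each argument for copy'n'paste
    done
    echo "$line"
    "$@"
}

run echo "second argument" argument\"with\'quotes
```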

Make Bash script exit and print error message if users invoke the script incorrectly

The script I needed was:
#!/bin/bash
# Check if there are two arguments
if [ $# -eq 2 ]; then
# Check if the input file actually exists.
if ! [[ -f "$1" ]]; then
echo "The input file $1 does not exist."
exit 1
fi
else
echo "Usage: $0 [inputfile] [outputfile]"
exit 1
fi
# Run the command on the input file
grep -P "^[\s]*[0-9A-Za-z-]+.?[\s]*$" "$1" > "$2"
Edit, the script has changed to
grep -P "^[\s]*[0-9A-Za-z-]+.?[\s]*$" $*
if [ ! -f "$1" ]; then
echo 'Usage: '
echo
echo './Scriptname inputfile > outputfile'
exit 0
fi
Invoking the script with no parameters gives no errors and just sits there blank.
Usage:
./Scriptname inputfile > outputfile
I have bit of code
grep -P "^[\s]*[0-9A-Za-z-]+.?[\s]*$" $*
This code pulls lines that have a single word on them and pumps the output to a new file, so for example
This is a multi word line
this
the above line is not
now
once again wrong
The output would be
This
now
The code works, users invoke the code using ./scriptname file > newfile
However, I am trying to expand the code to give users an error message if they invoke the script incorrectly.
For the error message, I'm thinking of echoing something back like scriptname file_to_process > output_file.
I did try
if [ invoked incorrectly - unsure what to type here ]
echo $usage
exit 1
Usage="usage [inputfile] [>] [outputfile]"
However I have had little luck. The code runs but does nothing if I invoke with just the script name. Also, if I invoke the script with just the scriptname and the input file, it will output the results instead of exiting with the error message.
Other ones I have tried are
if [ ! -n $1 ]; then
echo 'Usage: '
echo
echo './Scriptname inputfile > outputfile'
exit 0
fi
Given replies I have received so far, my code now is
#!/bin/bash
grep -P "^[\s]*[0-9A-Za-z-]+.?[\s]*$" $*
if [ ! -f "$1" ]; then
echo 'Usage: '
echo
echo './Scriptname inputfile > outputfile'
exit 0
fi
When invoking the script without an input file, the script does nothing (grep blocks, reading from standard input, because $* is empty) and has to be aborted with Ctrl+C; I'm still trying to get it to echo the usage message.
When you are invoking the script like ./scriptname file > newfile, the shell interprets file as the only argument to ./scriptname. This is because > is the standard output redirection operator.
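You can see this by printing the argument count from a tiny test script (argcount.sh is a hypothetical name); the shell strips the redirection before the script ever runs:

```shell
#!/bin/bash
# argcount.sh - report how many arguments the script received (on stderr,
# so the message stays visible even when stdout is redirected)
echo "got $# argument(s): $*" >&2
```

Running ./argcount.sh file > newfile reports got 1 argument(s): file, so newfile never appears in the argument list.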
I would like to propose 2 possible alternatives:
Alternative 1:
Maybe you can try passing it as one argument, like this:
./scriptname 'file > newfile'
In that case one way to check the format would be
#!/bin/bash
# Check if the format is correct
if [[ $1 =~ (.+)' > '(.+) ]]; then
# Check if the input file actually exists.
if ! [[ -f "${BASH_REMATCH[1]}" ]]; then
echo "The input file ${BASH_REMATCH[1]} does not exist!"
exit 1
fi
else
echo "Usage: $0 \"[inputfile] [>] [outputfile]\""
exit 1
fi
# Redirect standard output to the output file
exec > "${BASH_REMATCH[2]}"
# Run the command on the input file
grep -P "^[\s]*[0-9A-Za-z-]+.?[\s]*$" "${BASH_REMATCH[1]}"
Note: If you are checking whether the arguments are valid or not, it's generally better to run commands only after the checking is done.
Alternative 2:
Passing 2 arguments like
./scriptname file newfile
The script looks like this
#!/bin/bash
# Check if there are two arguments
if [ $# -eq 2 ]; then
# Check if the input file actually exists.
if ! [[ -f "$1" ]]; then
echo "The input file $1 does not exist."
exit 1
fi
else
echo "Usage: $0 [inputfile] [outputfile]"
exit 1
fi
# Run the command on the input file
grep -P "^[\s]*[0-9A-Za-z-]+.?[\s]*$" "$1" > "$2"
I'd use parameter expansion for this:
inputfile=${1:?Usage: $(basename $0) inputfile > outputfile}
If the script is called without arguments (i.e. $1 is unset) the ${var:?error message} expansion causes the shell to display an error with the given message and exit. Otherwise the first argument is assigned to $inputfile.
Try to add double quotes around $1 and use -f to check for exists and is normal file:
if [ ! -f "$1" ]; then
echo 'Usage: '
echo
echo './Scriptname inputfile > outputfile'
exit 0
fi
Also you can check the parameter count with $# and cat a usage message:
if [ ! $# -eq 1 ]; then
cat << EOF
Usage:
$0 'input_file' > output_file
EOF
exit 1
fi
