The simple script below does not work when, rather than passing a single file name, I want to pass multiple files through expansion characters like *
#!/bin/bash
fgrep -c '$$$$' $1
If I give the command script.sh file.in the script works. If I give the command script.sh *.in it doesn't.
Use "$#" to pass multiple file names to fgrep. $1 only passes the very first file name.
fgrep -c '$$$$' "$#"
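As a quick illustration (the script name and file names here are hypothetical), a tiny demo script shows the difference between $1 and "$@":
#!/bin/bash
# demo.sh: a minimal sketch showing $1 versus "$@"
echo "first argument only: $1"
echo "all arguments: $@"
Running bash demo.sh *.in in a directory containing a.in and b.in prints a.in on the first line and a.in b.in on the second.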
Related
I am writing a shell script, practice.sh. I want to pass my first command-line argument $1 to the ls command inside the script, e.g. if I run my script in a terminal as
$ bash practice.sh *.mp3
I want to use the argument *.mp3 with the ls command:
#!/bin/bash
output=$ls $1
It doesn't work. Any help?
The obvious answer for what you say you want is just
#!/bin/bash
ls "$1"
which will run ls, passing it (just) the first argument to the script.
However, you also say you want to run this like: practice.sh *.mp3 which runs the script with many arguments (not just one) -- the *.mp3 will be expanded to be all of the .mp3 files in the current directory. For that, you likely want something more like
#!/bin/bash
ls "$#"
which will pass all of the arguments to your script (however many there are) to the ls command.
These scripts will just run ls with its stdout connected to whatever your script has its stdout connected to, so the output will (likely) just appear on your terminal. If you instead want to capture the output of the ls command (so you can do something else with it), you need something like
#!/bin/bash
output=$(ls "$#")
which will run ls with all the arguments, and capture the output in the variable $output. You can then do things with that variable.
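For example, a sketch of doing something with the captured output (the wc -l count is just an illustration):
#!/bin/bash
# a sketch: capture the output of ls for all arguments, then reuse it
output=$(ls "$@")
echo "$output"                          # print the captured listing
echo "$(echo "$output" | wc -l) lines of output"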
Use command substitution (a form of shell expansion) to record the output of the command in the variable output:
output=$(ls $1)
This will record the output of the command ls $1 in the variable output.
You can then use echo $output to print out your output.
You can read more about shell expansion in the GNU Bash reference manual.
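A small, hypothetical illustration of why quoting matters when you print the variable back out:
# minimal sketch: unquoted versus quoted expansion of the captured output
output=$(ls "$1")
echo $output       # unquoted: the shell word-splits the text, so it prints on one line
echo "$output"     # quoted: any newlines in the captured text are preserved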
I have a frequent situation where I want to run a given command over all files that match a certain pattern. As such I have the following:
iterate.sh
#!/bin/bash
for file in $1; do
    $2
done
So I can use it like this: iterate.sh "*.png" "echo $file"
The problem is that when the command is run it doesn't seem to have access to the $file variable, as that sample command just outputs a blank line for every file.
How can I reference the iterator $file from the arguments of the program?
#!/bin/bash
for file in $1; do
    eval $2
done
iterate.sh "*.png" 'echo $file'
You need single quotes around the argument containing $file so it isn't expanded on the command line, and you need eval in the loop to actually execute the command in the argument rather than just expanding it.
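If you would rather avoid eval, one hypothetical alternative (not from the original answer) is to take the command as separate arguments and append each file name to it:
#!/bin/bash
# iterate2.sh: run the given command once per file matching the pattern,
# passing the file name as the command's last argument
pattern=$1
shift
for file in $pattern; do
    "$@" "$file"
done
Called as iterate2.sh "*.png" echo, it runs echo file.png for each match without re-parsing a command string.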
I am creating a bash script that will simply use grep to look through a bunch of logs for a certain string.
Something interesting happens though.
For testing purposes, the log files are named test1.log, test2.log, test3.log, etc.
When using the grep command:
grep -oHnR TEST Logs/test*
The output contains all instances from all files in the folder as expected.
But when using the same command from within the bash script below:
#!/bin/bash
#start
grep -oHnR $1 $2
#end
The output displays the instances from only 1 file.
When running the script I am using the following command:
bash test.bash TEST Logs/test*
Here is an example of the expected output (what occurs when simply using grep):
Logs/test2.log:8:TEST
Logs/test2.log:20:TEST
Logs/test2.log:41:TEST
Logs/test.log:2:TEST
Logs/test.log:18:TEST
and here is an example of the output received when using the bash script:
Logs/test2.log:8:TEST
Logs/test2.log:20:TEST
Logs/test2.log:41:TEST
Can someone explain to me why this happens?
When you call the line
bash test.bash TEST Logs/test*
this will be translated by the shell to
bash test.bash TEST Logs/test1.log Logs/test2.log Logs/test3.log Logs/test4.log
(if you have four log files).
The command line parameters TEST, Logs/test1.log, Logs/test2.log, etc. will be given the names $1, $2, $3, etc.; $1 will be TEST, $2 will be Logs/test1.log.
When you use only $2, you ignore the remaining parameters and pass just one log file to grep.
A correct version would be this:
#!/bin/bash
#start
grep -oHnR "$#"
#end
This will pass all the parameters through properly and also take care of nastiness like spaces in file names (your version would have had trouble with those).
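To see the quoting point concretely, here is a small sketch (the file name with a space is hypothetical):
#!/bin/bash
# args.sh: compare unquoted and quoted expansion of the positional parameters
printf 'unquoted: <%s>\n' $@      # "my log.txt" is split into two words
printf 'quoted:   <%s>\n' "$@"    # each argument stays intact
Running bash args.sh TEST "my log.txt" prints three chunks for the unquoted form and two for the quoted one.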
To understand what's happening, you can use a simpler script:
#!/bin/bash
echo $1
echo $2
That outputs the first two arguments, as you asked for.
You want to use the first argument, and then use all the rest as input files. So use shift like this:
#!/bin/bash
search=$1
shift
echo "$1"
echo "$#"
Notice also the use of double quotes.
In your case, because you want the search string and the filenames to be passed to grep in the same order, you don't even need to shift:
#!/bin/bash
grep -oHnR -e "$#"
(I added the -e in case the search string begins with -)
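The -e matters for a hypothetical pattern that starts with a dash; without it grep would try to parse the pattern as an option instead of a search string:
$ bash test.bash -ERROR Logs/test*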
The unquoted * is being affected by globbing when you are calling the script.
Using set -x to output what is running from the script makes this more clear.
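For reference, greptest.sh here is just the two-argument script from the question with tracing turned on (a sketch; the exact file contents are assumed):
#!/bin/bash
set -x                  # print each command after expansion before running it
grep -oHnR $1 $2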
$ ./greptest.sh TEST test*
++ grep -oHnR TEST test1.log
$ ./greptest.sh TEST "test*"
++ grep -oHnR TEST test1.log test2.log test3.log
In the first case, your interactive shell expands the * into the list of file names before the script runs; in the second case the literal test* is passed into the script, and the unquoted $2 inside the script expands it when grep is invoked. In the first case you actually have more than 2 args (each expanded filename becomes its own arg) - adding echo $# to the script shows this too:
$ ./greptest.sh TEST test*
++ grep -oHnR TEST test1.log
++ echo 4
4
$ ./greptest.sh TEST "test*"
++ grep -oHnR TEST test1.log test2.log test3.log
++ echo 2
2
You probably want to escape the wildcard on your bash invocation:
bash test.bash TEST Logs/test\*
That way the literal Logs/test* reaches the script in $2 and is only expanded when the script runs grep; otherwise your shell expands it up front into every file in the Logs dir whose name starts with test, and your script only uses the first of them.
Alternatively, change your script to allow more than one file on the command line:
#!/bin/bash
hold=$1
shift
grep -oHnR "$hold" "$@"
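With that shift-based version, the original invocation forwards every expanded file name to grep:
$ bash test.bash TEST Logs/test*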
I'm writing a very small bash script to merge some files in a directory.
Say I have a directory full of files:
deb_1
deb_2
deb_3
deb_4
...
I want to write a small bash script to merge them all into a file, and delete the originals
So I would run, mrg deb* outputfile, and the resulting directory would look like:
outputfile
Containing all of the deb files merged. The way I do it normally is cat deb* > outputfile && rm deb* -f
However trying to convert this to a bash script doesn't quite work out:
#!/bin/bash
cat $1 > $2 && rm $1 -f
The wildcard expansion makes $1 -> deb_1, $2 -> deb_2, and so on.
Keep your script as is:
#!/bin/bash
cat $1 > $2 && rm $1 -f
But apply single quotes to the first argument when calling it:
bash myscript.sh 'deb*' outputfile
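A quick way to see why that works (file names are hypothetical): with the glob quoted, $1 reaches the script as the literal string deb*, and it is the unquoted $1 inside the script that finally expands it. A hypothetical trace:
$ bash -x myscript.sh 'deb*' outputfile
+ cat deb_1 deb_2 deb_3 deb_4
+ rm deb_1 deb_2 deb_3 deb_4 -f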
Along the lines of what @Eugeniu mentioned in his comment, something like
$ myscript outputfile files*
would be possible, given the following definition of myscript:
#!/bin/bash
OUTPUT="$1";shift
cat "$#" > "$OUTPUT" && rm "$#" -f
$@ is a list of all command line arguments,
and "$@" is a list of separately-quoted command line arguments: "deb_1" "deb_2" "deb_3".
shift is used to remove the output file from the list of parameters so that it does not appear in the expansion of $@.
cat concatenates the files together into your output file,
and rm removes the originals.
Usage
The general form of the command becomes:
myscript OUTPUT [INPUT_FILE]...
In your case, you'd want to call this as
$ myscript outputfile deb_*
which passes every file matching deb_* as a command line argument to myscript
Alternate implementation
Using the last argument as the output file is probably possible (e.g. via ${!#}), but requires more work to remove the last parameter from $@.
A simple way to overcome this — with the obvious modification to the script to use stdout — would be:
$ myscript input files... > outputfile
Since redirections are applied by the shell (truncating and opening outputfile for writing) before the command is run, the rm would still be "safe" in exactly the same sense that it is now, which is questionable, IMO.
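A minimal sketch of that stdout variant (using the same hypothetical script name):
#!/bin/bash
# concatenate every argument to stdout, then delete the originals
cat "$@" && rm -f "$@"
which would be used as myscript deb_* > outputfile; the shell opens outputfile for writing before the script runs, which is exactly the caveat described above.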
I have a file called fruit.txt which contains a list of fruit names (apple, banana, orange, kiwi, etc.). I want to create a script that lets me pass an argument when calling the script, i.e. script.sh orange, which will then search the file fruit.txt for that value (orange) using grep. I have the following script...
script name and argument as follows:
script.sh orange
script snippet as follows:
#!/bin/bash
nameFind=$1
echo `cat fruit.txt` | grep | $nameFind
But I get the grep usage/info message, and it seems that the script is waiting for some additional input. Advice greatly appreciated.
The piping syntax is incorrect there. You are piping the output of grep as input to the variable named nameFind. So when the grep command tries to execute, it only receives the contents of fruit.txt on its input and no search pattern at all. Do this instead:
#!/bin/bash
nameFind=$1
grep "$nameFind" fruit.txt
Something like this should work:
#!/bin/bash
name="$1"
grep "$name" fruit.txt
There's no need to use cat and grep together; you can simply pass the name of the file as an argument to grep, after the pattern to be matched. If you want to match fixed strings (i.e. no regular expressions), you can also use the -F option:
grep -F "$name" fruit.txt