Issue with wildcards in arguments of a Bash function - bash

Using the solution from the linked question, I am trying to compare a group of source files (here Fortran 90, i.e. *.f90) between two directories.
To do this, and to see which sources differ, I have put the following into my ~/.bashrc:
function diffm { for file in "$1"/"$2"; do diff -qs "$file" "$3"/"${file##*/}"; done ;}
But unfortunately, if I use the current directory for argument $1, i.e. if I execute:
$ diffm . '*.f90' ../../dir2
The result is: cannot access './*.f90'. The *.f90 sources do exist, but the wildcard is not expanded.
Is the problem the double quotes around my function's arguments ($1, $2, $3)?
More generally, this function doesn't work well.
How can I modify this function so that it works in all cases, even when the current directory "." is the first argument $1 or the third argument $3?

If I understood what you are trying to do correctly, this should work:
diffm () {
    dir="$1"; shift          # first argument: the directory to compare against
    for file in "$@"; do     # remaining arguments: the files to compare
        filename=$(basename "$file")
        diff -qs "$file" "$dir/$filename"
    done
}
Usage
diffm ../../dir2 ./*.f90
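The key point is that the shell expands ./*.f90 before diffm runs, so the function receives one file name per argument. With, say, a.f90 and b.f90 present (made-up names), the call above is equivalent to:
diffm ../../dir2 ./a.f90 ./b.f90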

Filename generation does not occur within quotes. Hence, you pass the literal string *.f90 to the function, and this string is used there literally too. If you know for sure that there is exactly one .f90 file in your directory, don't use quotes and write
diffm . *.f90 ../../dir2
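To see the effect of the quotes quickly (a sketch; run it in a directory that contains a couple of .f90 files):
pat='*.f90'
printf '%s\n' "$pat"    # quoted: prints the literal string *.f90
printf '%s\n' $pat      # unquoted: the shell expands the glob to the matching names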
Things get ugly if the file name has a space embedded (which, BTW, is one reason why I prefer Zsh over bash - you don't have to care about this in Zsh). To deal with this case, you could capture the match in an array and pass its single element, quoted:
myfile=(*.f90)
diffm . "${myfile[0]}" ../../dir2
But sooner or later, you will be bitten by the fact that, for whatever reason, you have more than one .f90 file, and your strategy will break. Therefore, a better solution would be to use a loop, which also works for the border case of having only one file:
iterating=0
for myfile in *.f90
do
    if [[ ! -f $myfile ]]
    then
        echo "No files matching $myfile in this directory"
    elif ((iterating == 0))
    then
        ((iterating+=1))
        diffm . "$myfile" ../../dir2
    else
        echo "ERROR: More than one f90-file. Don't know which one to diff" 1>&2
    fi
done
The [[ ! -f $myfile ]] branch just cares for the case that you don't have any f90 files at all. In that case, the loop body is executed exactly once, and myfile contains the unexpanded wildcard pattern, which matches no existing file.
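An alternative sketch (assuming bash and the diffm function from the question): with shopt -s nullglob, a pattern that matches nothing simply disappears, so an array check can replace the counting logic:
shopt -s nullglob           # unmatched globs expand to nothing instead of themselves
files=(*.f90)               # collect all matches in an array
if (( ${#files[@]} == 0 )); then
    echo "No .f90 files in this directory"
elif (( ${#files[@]} == 1 )); then
    diffm . "${files[0]}" ../../dir2
else
    echo "ERROR: More than one f90-file. Don't know which one to diff" 1>&2
fi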

Related

bash finding files in directories recursively

I'm studying the bash shell and have lately realized I'm not getting recursive calls involving file searching right. I know find is made for this, but I've been asked to implement such a search one way or another.
I wrote the following script:
#!/bin/bash
function rec_search {
for file in `ls $1`; do
echo ${1}/${item}
if[[ -d $item ]]; then
rec ${1}/${item}
fi
done
}
rec $1
The script takes a file as an argument and looks for it recursively.
I find it a poor solution of mine and have a few improvement questions:
how to find files that contain spaces in their names
can I efficiently use the pwd command to print the absolute path (I tried, but unsuccessfully)
every other reasonable improvement of the code
Your script currently cannot work:
The function is defined as rec_search, but then it seems you mistakenly call rec
You need to put a space after the "if" in if[[
There are some other serious issues with it too:
for file in `ls $1` goes against the recommendation to "never parse the output of ls", won't work for paths with spaces or other whitespace characters
You should indent the body of if and for to make it easier to read
The script could be fixed like this:
rec() {
    # "for path" with no "in ..." list loops over the function's arguments
    for path; do
        echo "$path"
        if [[ -d "$path" ]]; then
            rec "$path"/*
        fi
    done
}
But it's best to not reinvent the wheel and use the find command instead.
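For instance (a sketch; the start directory and name pattern are placeholders), find already handles the recursion, file names with spaces, and absolute paths:
find /some/start/dir -print                # everything below the directory, spaces are no problem
find /some/start/dir -name '*.txt' -print  # only entries whose name matches a pattern
find "$PWD" -name '*.txt' -print           # an absolute starting point yields absolute paths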
If you are using bash 4 or later (which is likely, unless you are running this under Mac OS X), you can use the ** operator.
rec () {
    shopt -s globstar
    for file in "$1"/**/*; do
        echo "$file"
    done
}
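One side effect of the version above is that shopt -s globstar stays enabled in the calling shell after the function returns, because shopt is not scoped to functions. If that matters, a sketch of a variant whose body is a subshell (parentheses instead of braces) keeps the option local:
rec () (
    shopt -s globstar
    for file in "$1"/**/*; do
        echo "$file"
    done
)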

Passing more than one argument through to a command in a shell wrapper

I am trying to write a custom command to copy files into a specific directory.
I am not sure of the best way to do this. Right now, the script is this:
#!/bin/sh
cp -rf $1 /support/save/
I called this command filesave. It works great for one file, but if you pass *.sh or something similar, it only copies the first file. This makes sense, as that is the point of $1. Is there an input variable that will collect all the inputs, not just one specific one?
#!/bin/sh
cp -rf -- "$#" /support/save
Use "$#" to expand to your entire argument list. It is essential that this be placed in double-quotes, or else it will behave identically to $* (which is to say, incorrectly).
The -- is a widely implemented extension which ensures that all following arguments are treated as literal arguments rather than parsed as options, thus making filenames starting with - safe.
To demonstrate the difference, name the following script quotdemo.
#!/bin/sh
printf '$@: '; printf '<%s>\n' "$@"
printf '$*: '; printf '[%s]\n' $*
...and try running:
touch foo.txt bar.txt "file with spaces.txt" # create some matching files
quotdemo *.txt # ...then test this...
quotdome "*.txt" # ...and this too!

bash to print certain file names to text

I have spent a lot of time over the past few weeks posting on here. I finally think I am much closer to learning bash, but I have one problem with my code and cannot for the life of me figure out why it will not run. I can run each line in the terminal and it returns a result, but for some reason when I run the script, it does nothing. I get a syntax error: word unexpected (expecting "do").
#!/bin/bash
image="/Home/Desktop/epubs/images"
for f in $(ls "$image"*.jpg); do
fsize=$(stat --printf= '%s' "$f");
if [ "$fsize" -eq "40318" ]; then
echo "$(basename $f)" >> results.txt
fi
done
What am I missing???
The problem might be in line endings. Make sure your script file has unix line endings, not the Windows ones.
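If the script does have Windows (CRLF) line endings, converting it in place is a one-liner (a sketch; yourscript.sh is a placeholder name, and dos2unix may not be installed everywhere):
dos2unix yourscript.sh
sed -i 's/\r$//' yourscript.sh    # alternative with GNU sed if dos2unix is not available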
Also, do not iterate over the output of ls. Use globbing right in the shell:
for f in "$image"/*.jpg ; do
Your for loop appears to be missing a list of values to iterate over:
image="/Home/Desktop/epubs/images"
for f in $(ls "$image"*.jpg); do
Because $image does not end with a /, your ls command expands to
for f in $(ls /Home/Desktop/epubs/images*.jpg); do
which probably results in
for f in ; do
causing the syntax error. The simplest fix is
for f in $(ls "$image"/*.jpg); do
but you should take the advice in the other answers and skip ls:
for f in "$image"/*.jpg; do
Here's how I would do that.
#!/bin/bash -e
image="/Home/Desktop/epubs/images"
(cd "$image"
for f in *.jpg; do
    let fsize=$(stat -c %s "$f")
    if (( fsize == 40318 )); then
        echo "$f"
    fi
done) >results.txt
The -e means the script will exit if anything goes wrong (can't cd into the directory, for instance). Saves a lot of error checking when you're happy with that behavior.
The parentheses mean that the cd command is in a subshell; the surrounding script (including the redirection into results.txt) is still in whatever directory you started in.
Now that we're in the directory, we can just look for *.jpg, no directory prefix, and no need to call basename on anything.
Using let and (( == )) treats the size value as a number instead of a string, so we won't get tripped up by any wonkiness in the way stat chooses to format the value.
We just redirect the output of the entire loop into the result file instead of appending every time through; it's more efficient. If you have existing contents in results.txt that you want to keep, you can just change the > back to a >>, but leaving it around the whole loop is still more efficient than opening the file and appending to it on every iteration.

Basename puts single quotes around variable

I am writing a simple shell script to make automated backups, and I am trying to use basename to create a list of directories and then parse this list to get the first and the last directory from the list.
The problem is: when I use basename in the terminal, all goes fine and it gives me the list exactly as I want it. For example:
basename -a /var/*/
gives me a list of all the directories inside /var without the trailing /, one per line.
BUT, when I use it inside a script and pass a variable to basename, it puts single quotes around the variable:
while read line; do
dir_name=$(echo $line)
basename -a $dir_name/*/ > dir_list.tmp
done < file_with_list.txt
When running with -x:
+ basename -a '/Volumes/OUTROS/backup/test/*/'
and, therefore, the result is not what I need.
Now, I know there must be a thousand ways to go around the basename problem, but then I'd learn nothing, right? ;)
How to get rid of the single quotes?
And if my directory name has spaces in it?
If your directory name could include spaces, you need to quote the value of dir_name (which is a good idea for any variable expansion, whether you expect spaces or not).
while read line; do
    dir_name=$line
    basename -a "$dir_name"/*/ > dir_list.tmp
done < file_with_list.txt
(As jordanm points out, you don't need to quote the RHS of a variable assignment.)
Assuming your goal is to populate dir_list.tmp with a list of directories found under each directory listed in file_with_list.txt, this might do.
#!/bin/bash
inputfile=file_with_list.txt
outputfile=dir_list.tmp
rm -f "$outputfile" # the -f makes rm fail silently if file does not exist
while read line; do
    # basic syntax checking
    if [[ ! ${line} =~ ^/[a-z][a-z0-9/-]*$ ]]; then
        continue
    fi
    # collect targets using globbing
    for target in "$line"/*; do
        if [[ -d "$target" ]]; then
            printf "%s\n" "$target" >> $outputfile
        fi
    done
done < $inputfile
As you develop whatever tool will process your dir_list.tmp file, be careful of special characters (including spaces) in that file.
Note that I'm using printf instead of echo so that targets whose first character is a hyphen won't cause errors.
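A quick illustration of that last point (a sketch with a made-up value):
var='-n'
echo "$var"             # bash's echo takes -n as an option and prints nothing
printf '%s\n' "$var"    # printf treats arguments after the format as data and prints -n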
This might work
while read; do
    find "$REPLY" >> dir_list.tmp
done < file_with_list.txt

Renaming a file extension without specifying

I am creating a bash shell script that will rename a file's extension without having to specify the old extension. If I enter "change foo *" in the terminal in Linux, it should change every file's extension to foo.
So let's say I've got four files: "file1.txt", "file2.txt.txt", "file3.txt.txt.txt" and "file4."
When I run the command, the files should look like this: "file1.foo", "file2.txt.foo", "file3.txt.txt.foo" and "file4.foo"
Can someone look at my code and correct it? I would also appreciate it if someone could implement this for me.
#!/bin/bash
EXT=$1
shift
for FILE in "$@"
do
    TEMP=`echo $FILE | sed -n '/^[a-Z0-9]*\./p'`
    if test "${TEMP}X" == 'X'; then
        NEW_FILE="$FILE.$EXT"
    else
        NEW_FILE=`echo $FILE | sed "s/\(.*\)\..*/\1.$EXT/"`
    fi
    mv $FILE $NEW_FILE
done
exit
exit
Always use double quotes around variable substitutions, e.g. echo "$FILE" and not echo $FILE. Without double quotes, the shell splits the value of the variable on whitespace and expands any glob characters (*, ?, [) in it. (There are cases where you don't need the quotes, and sometimes you do want word splitting, but that's for a future lesson.)
I'm not sure what you're trying to do with sed, but whatever it is, I'm sure it's doable in the shell.
To check if $FILE contains a dot: case "$FILE" in *.*) echo yes;; *) echo no;; esac
To strip the last extension from $FILE: ${FILE%.*}. For example, if $FILE is file1.txt.foo, this produces file1.txt. More generally, ${FILE%SOME_PATTERN} expands to $FILE with the shortest suffix matching SOME_PATTERN stripped off. If there is no matching suffix, it expands to $FILE unchanged. The variant ${FILE%%SOME_PATTERN} strips the longest suffix. Similarly, ${FILE#SOME_PATTERN} and ${FILE##SOME_PATTERN} strip a prefix (a short demonstration follows this list of remarks).
test "${TEMP}X" == 'X' is weird. This looks like a misremembered trick from the old days. The normal way of writing this is [ "$TEMP" = "" ] or [ -z "$TEMP" ]. Using == instead of = is a bash extension. There used to be buggy shells that might parse the command incorrectly if $TEMP looked like an operator, but these have gone the way of the dinosaur, and even then, the X needs to be at the beginning, because the problematic operators begin with a -: [ "X$TEMP" == "X" ].
If a file name begins with a -, mv will think it's an option. Use -- to say “that's it, no more options, whatever follows is an operand”: mv -- "$FILE" "$NEW_FILE".
This is very minor, but a common (not universal) convention is to use capital letters for environment variables and lowercase letters for internal script variables.
Since you're using only standard shell features, you can start the script with #!/bin/sh (but #!/bin/bash works too, of course).
exit at the end of the script is useless.
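To make the suffix/prefix stripping concrete, here is the demonstration promised above (a sketch with a made-up file name):
FILE=file1.txt.foo
echo "${FILE%.*}"     # file1.txt   (shortest suffix matching .* removed)
echo "${FILE%%.*}"    # file1       (longest suffix matching .* removed)
echo "${FILE#*.}"     # txt.foo     (shortest prefix matching *. removed)
echo "${FILE##*.}"    # foo         (longest prefix matching *. removed)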
Applying all of these, here's the resulting script.
#!/bin/sh
ext="$1"; shift
for file in "$@"; do
    base="${file%.*}"
    mv -- "$file" "$base.$ext"
done
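Called on the files from the question (assuming the script is saved as change somewhere on your PATH), it should behave like this:
$ ls
file1.txt  file2.txt.txt  file3.txt.txt.txt  file4.
$ change foo *
$ ls
file1.foo  file2.txt.foo  file3.txt.txt.foo  file4.foo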
Not exactly what you are asking about, but have a look at the perl rename utility. Very powerful! man rename is a good start.
Use: for file in *.gif; do mv -- "$file" "${file%.gif}.jpg"; done
Or see How to rename multiple files
For me this worked:
for FILE in *
do
    NEW_FILE=${FILE%.*}
    NEW_FILE=${NEW_FILE}${EXT}
done
I just want to point out NEW_FILE=${FILE%.*}.
Here NEW_FILE gets the file name with its last extension stripped. You can use it however you want.
I tested in bash with uname -a = "Linux 2.4.20-8smp #1 SMP Thu Mar 13 17:45:54 EST 2003 i686 i686 i386 GNU/Linux"
