Bash parameters with wildcard

I'm trying to write a bash script that will find & copy similar files to a destination directory.
For example, I'm passing a parameter 12300 to a script and I want to copy all files that start with 12300... to a new directory.
like this:
sh script.sh 12300
and here's the script:
if [ -f /home/user/bashTest/$@*.jpg ]
then
cp /home/user/bashTest/$@*.jpg /home/user/bashTest/final/
fi
This just doesn't work. I have tried all kinds of solutions but nothing has worked.
The question is: How can I use wildcard with parameter?

When you're checking for multiple files with -f or -e it can get nasty. I recommend kenfallon's blog. This is something like what he would recommend:
#! /bin/bash
ls -l /home/user/bashTest/$1*.jpg > /dev/null 2>&1
if [ "$?" = "0" ]
then
cp /home/user/bashTest/$1*.jpg /home/user/bashTest/final/
fi
Not sure how the $@ would play in here, or if it's required.
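For what it's worth, $1, $@ and $# mean different things, which is the source of the confusion here. A quick illustrative sketch (the function name is made up):

```shell
#!/bin/bash
# show_args: demonstrates the three positional-parameter variables.
#   $1 : the first positional parameter
#   $@ : all positional parameters, one word each when quoted ("$@")
#   $# : the NUMBER of positional parameters, not their contents
show_args() {
    echo "first: $1"
    echo "count: $#"
    for arg in "$@"; do
        echo "arg: $arg"
    done
}

show_args 12300 456
# first: 12300
# count: 2
# arg: 12300
# arg: 456
```

So `$#*.jpg` expands to something like `2*.jpg`, which is why the original script never matched anything.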

Enclose the thing that expands to the parameters in {}, i.e. /home/user/bashTest/${@}*.jpg. You should use $1 instead of $@ in your case, however, as you only seem to handle the first argument given to the script. $1 expands to the first argument, $2 to the second, etc.
You also need a loop to iterate over all files that this glob expands to, e.g.
for file in /tmp/${1}*.jpg
do
if [ -f "$file" ]
then
echo "$file"
fi
done

Here is a solution:
#!/bin/bash
cp /home/user/bashTest/${1}*.jpg /home/user/bashTest/final/
Discussion
In this case, a simple cp command will do.
I have tested it with files that have embedded spaces.

Write this in script.sh:
cp /home/user/bashTest/$1*.jpg /home/user/bashTest/final/
That's all.
UPD: @macduff's solution is useful too.

This will find all of them in your $HOME directory and subdirectories (you may wish to tweak find to follow or not follow symlinks, and/or adjust the $HOME base directory where it starts the search):
#!/bin/sh
DEST=/your/dest/folder
for FILE in `find "$HOME" -iname "$1*"`;do
[ -f "$FILE" ] && mv "$FILE" "$DEST/"
#or ln -s ... if you want to keep it in its original location
done
if you want to do multiple patterns using $@
for PATTERN in "$@"; do
for FILE in `find "$HOME" -iname "$PATTERN*"`;do
[ -f "$FILE" ] && mv "$FILE" "$DEST/"
done
done


Make directory based on filenames before fourth underscore

I have these test files:
ABCD1234__12_Ab2_Hh_3P.mp4
ABCD1234__12_Ab2_Lw_3P.wmv
ABCD1234__12_Ab2_SSV.mov
EFGH56789__13_Mn1_SSV.avi
EFGH56789__13_Mn1_Ve1_3P.mp4
EFGH56789__13_Mn1_Ve2_3P.webm
I want to create a context service in Automator that makes directories based on the filename prefixes above like so:
ABCD1234__12_Ab2
EFGH56789__13_Mn1
...and move the files into those two accordingly. The only consistent variables in the names are underscores, so I was thinking I could delineate by those, preferably capturing the name before the fourth one.
I originally started with this very simple script:
for file in "$@"
do
mkdir "${file%.*}" && mv "$file" "${file%.*}"
done
Which makes a folder for every file and moves each file into its own folder.
I tried adding variables, various if/thens, etc. but to no avail (not a programmer by trade).
I also wrote another script to do it in a slightly different way, but with the same results to mess around with:
for folder in "$@"
do
cd "$1"
find . -type f -maxdepth 1 -exec bash -c 'mkdir -p "${0%.*}"' {} \; \
-exec bash -c 'mv "$0" "${0%.*}"' {} \;
done
I feel like there's something obvious I am missing.
Your script is splitting on dot, but you say you want to split on underscore. If the one you want to split on is the last one, the fix is trivial:
for file in "$@"
do
mkdir -p "${file%_*}" && mv "$file" "${file%_*}"
done
To get precisely the fourth, try
for file in "$@"
do
tail=${file#*_*_*_*_}
dir=${file%_"$tail"}
mkdir -p "$dir" && mv "$file" "$dir"
done
The addition of the -p option is a necessary bug fix if you want to use && here; mkdir without this option will fail if the directory already exists.
Perhaps see also the section about parameter expansions in the Bash Reference Manual which explains this syntax and its variations.
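As a quick sanity check, here is how those two expansions behave on one of the sample filenames from the question:

```shell
#!/bin/bash
# Demonstrate the prefix/suffix removal used above on a sample name.
file='ABCD1234__12_Ab2_Hh_3P.mp4'

tail=${file#*_*_*_*_}   # shortest prefix through the 4th underscore removed
dir=${file%_"$tail"}    # then "_$tail" removed from the right

echo "$tail"   # Hh_3P.mp4
echo "$dir"    # ABCD1234__12_Ab2
```

Note that the double underscore in `ABCD1234__` counts as two underscores, which is exactly why counting to the fourth one lands on the desired prefix.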
You could do something like this:
#!/bin/bash
shopt -s extglob
for file
do
if [[ $file =~ ^(([^_]*_){3}[^_]*) ]]
then
echo "${BASH_REMATCH[0]}"
else
echo "${file%.*}"
fi
done
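Applied to the sample filenames from the question, the regex captures everything before the fourth underscore. A small sketch wrapping the same test in a function (the function name is made up):

```shell
#!/bin/bash
# The regex matches three "run of non-underscores plus one underscore"
# segments, then one final run of non-underscores: i.e. the prefix
# before the fourth underscore. BASH_REMATCH[0] holds the whole match.
prefix_of() {
    local file=$1
    if [[ $file =~ ^(([^_]*_){3}[^_]*) ]]; then
        echo "${BASH_REMATCH[0]}"
    else
        echo "${file%.*}"   # fallback: strip the extension
    fi
}

prefix_of 'ABCD1234__12_Ab2_Hh_3P.mp4'   # ABCD1234__12_Ab2
prefix_of 'EFGH56789__13_Mn1_SSV.avi'    # EFGH56789__13_Mn1
```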

How to iterate over a directory and display only filename

I want to iterate over the contents of a directory and list only regular files.
The path of the directory is given as user input. The script works if the input is the current directory, but not with others.
I am aware that this can be done using ls, but I need to use a for .. in control structure.
#!/bin/bash
echo "Enter the path:"
read path
contents=$(ls $path)
for content in $contents
do
if [ -f $content ];
then
echo $content
fi
done
ls is only returning the file names, not including the path. You need to either:
Change your working directory to the path in question, or
Combine the path with the names for your -f test
Option #2 would just change:
if [ -f $content ];
to:
if [ -f "$path/$content" ];
Note that there are other issues here; ls may make changes to the output that break this, depending on wrapping. If you insist on using ls, you can at least make it (somewhat) safer with:
contents="$(command ls -1F "$path")"
You have two ways of doing this properly:
Either loop through the * pattern and test file type:
#!/usr/bin/env bash
echo "Enter the path:"
read -r path
for file in "$path/"*; do
if [ -f "$file" ]; then
echo "$file"
fi
done
Or using find to iterate a null delimited list of file-names:
#!/usr/bin/env bash
echo "Enter the path:"
read -r path
while IFS= read -r -d '' file; do
echo "$file"
done < <(
find "$path" -maxdepth 1 -type f -print0
)
The second way is preferred since it will properly handle files with special characters and offload the file-type check to the find command.
Use find, set to search for regular files (-type f) from the $path directory:
find "$path" -type f
Here is what you could write:
#!/usr/bin/env bash
path=
while [[ ! $path ]]; do
read -p "Enter path: " path
done
for file in "$path"/*; do
[[ -f $file ]] && printf '%s\n' "$file"
done
If you want to traverse all the subdirectories recursively looking for files, you can use globstar:
shopt -s globstar
for file in "$path"/**; do
printf '%s\n' "$file"
done
In case you are looking for specific files based on one or more patterns or some other condition, you could use the find command to pick those files. See this post:
How to loop through file names returned by find?
Related
When to wrap quotes around a shell variable?
Why you shouldn't parse the output of ls
Is double square brackets [[ ]] preferable over single square brackets [ ] in Bash?

Add .old to files without .old in them, having trouble with which variable to use?

#!/bin/bash
for filenames in $( ls $1 )
do
echo $filenames | grep "\.old$"
if [ ! $filenames = 0 ]
then
$( mv "$1/$filenames" "$1/$filenames.old" )
fi
done
So I think most of the script works. It is intended to take the output of ls for a directory given as the first parameter, and search for any files ending in .old. Any files that do not end in .old should then be renamed.
The script successfully renames the files, but it will add .old to a file already containing the extension. I am assuming that the if variable is wrong, but I cannot figure out which variable to use in this case.
The answer is in the key, but if anyone needs to do this, here is an even easier way:
#!/bin/bash
for filenames in $( ls $1 | grep -v "\.old$" )
do
$( mv "$1/$filenames" "$1/$filenames.old" )
done
Use `find` for this
find /directory/here -type f ! -iname "*.old" -exec mv {} {}.old \;
Problems in the original approach:
for filenames in $( ls $1 ) — never parse ls output.
Variables are not double quoted, as in if [ ! $filenames = 0 ]. This results in word-splitting. Use "$filenames" unless you actually intend word splitting.
So the final script would be
#!/bin/bash
if [ -d "$1" ]
then
find "$1" -type f ! -iname "*.old" -exec mv {} {}.old \;
# use -maxdepth 1 with find if you don't wish to recursively check subdirectories
else
echo "Directory : $1 doesn't exist !"
fi
Usage
./script '/path/to/directory'
Don't use ls in scripts.
#!/bin/bash
for filename in "$1"/*
do
case $filename in *.old) continue;; esac
mv "$filename" "$filename.old"
done
I prefer case over if because it supports wildcard matching naturally and portably. (You could run this with /bin/sh just as well.) If you wanted to use if instead, that'd be
if echo "$filename" | grep -q '\.old$'; then
or more idiomatically, but recent shells only,
if [[ "$filename" == *.old ]]; then
You want to avoid calling additional utilities if simple shell builtins will do. Why? Each additional utility you call (grep, etc.) spawns and runs in a separate process of its own. If you spawn a process for every iteration of your loop, things will really slow down. If the shell doesn't provide a feature, then sure, calling a utility is the right thing to do.
As mentioned above, shell globbing along with parameter expansion with substring removal provides a simple test for determining if a file has an .old extension. All you need is:
for i in "$1"/*; do
[ "${i##*.}" = "old" ] || mv "$i" "${i}.old"
done
(note: this will skip adding the .old extension to a single file named 'old', but that can be handled separately if needed -- unlikely. Additionally, the solution with find is a fine approach as well)
I solved the problem; I was misled by my instructor!
$? is the variable that holds the exit status of the most recently executed foreground pipeline (which here is grep), not its output. The new code is unedited except for
if [ ! $? = 0 ]
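To illustrate that point: $? holds only the exit status of the last foreground command, so it has to be checked immediately after the command whose status you care about (the filenames below are made up):

```shell
#!/bin/bash
# grep -q prints nothing; it only sets an exit status:
# 0 on a match, 1 on no match.
echo "backup.old" | grep -q '\.old$'
echo "status for backup.old: $?"   # 0 (match)

echo "notes.txt" | grep -q '\.old$'
echo "status for notes.txt: $?"    # 1 (no match)
```

This is also why the original script's test failed: it compared the filename itself against 0 instead of inspecting the exit status of grep.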

Comparing relative paths in bash script

I am trying to build a bash script capable of comparing two directories given as arguments $1 and $2, and changing the files' timestamps from the second directory ( if they are not different than a given timestamp $3 ) to be the same as the files with the same name in the first directory. I'm doing okay with that, but I don't see how to access the folders inside the given directories, and compare the files inside those folders.
For example, if I have Directory1 and Directory2 given as arguments:
Directory1 contains:
-text.txt
-folder1/secondfile.txt
-folder2/thirdfile.txt
and Directory2 contains:
-text.txt
-folder1/secondfile.txt
-folder3/thirdfile.txt
so in this case I want my script to modify the files text.txt and secondfile.txt, but not the thirdfile.txt because the relative paths are not the same. How would my script access folders in the directory and how would it compare relative paths? I have managed to do what I wanted with files from the directory, but not folders, and I have no idea how to compare relative paths, even though I searched around I couldn't find it.
So far I've done this script (with help from SO):
#!/bin/bash
cd "$1"
function check {
for i in *; do
if [[ -d "$i" ]]; then
cd "$i"
check
cd -
fi
if [[ -f "$i" ]]; then
if [[ $(stat -c %y "$i") == "$3" ]]; then
#if [[ path check ]]; then
#touch -r "$i" "$2/path/$i"
fi
fi
done
}
and I don't know how to do the [[path check]] which should check if both files have the same relative path (relative to the directories given as arguments).
EDIT:
As the answer suggests, is the following code the right way to do it?
#!/bin/bash
cd "$1"
shopt -s globstar
for i in **; do
if [[ $(stat -c %y "$i") == "$3" ]]; then
if [[ "$1/$i" == "$2/$i" ]]; then
touch -r "$i" "$2/$i"
fi
fi
done
There was an answer here before, which I wanted to reply to, suggesting using shopt -s globstar and ** instead of *.
The additional question was something along the lines of "Would I be able to compare the relative paths?".
Yes, you would. With shopt -s globstar, ** expands recursively, and each match keeps its path relative to the current directory. So it would return text.txt, folder1/secondfile.txt and folder2/thirdfile.txt (plus the directories themselves, which a -f test filters out).
EDIT:
You should not need to cd "$1" either, since after changing directory the relative paths "$1" and "$2" would no longer resolve. Try something along the lines of:
#!/usr/bin/env bash
shopt -s globstar
for i in $(cd "$1"; echo **); do
if [[ $(stat -c %y "$1/$i") == "$3" ]]; then
if [[ -f "$1/$i" ]] && [[ -f "$2/$i" ]]; then
touch -r "$1/$i" "$2/$i"
fi
fi
done

Recursive Shell Script and file extensions issue

I have a problem with this script. The script is supposed to go through all the files and all sub-directories and sub-files (recursively). If a file ends with the extension .txt, I need to replace a char/word in the text with a new char/word and then copy the file into an existing directory. The first argument is the directory where the search starts, the second is the old char/word, the third the new char/word, and the fourth the directory to copy the files to. The script goes through the files but only does the replacement and copies the files from the original directory. Here is the script
#!/bin/bash
funk(){
for file in `ls $1`
do
if [ -f $file ]
then
ext=${file##*.}
if [ "$ext" = "txt" ]
then
sed -i "s/$2/$3/g" $file
cp $file $4
fi
elif [ -d $file ]
then
funk $file $2 $3 $4
fi
done
}
if [ $# -lt 4 ]
then
echo "Need more arg"
exit 2;
fi
cw=$1
a=$2
b=$3
od=$4
funk $cw $a $b $od
You're using a lot of bad practices here: lack of quoting, parsing the output of ls... all this will break as soon as a filename contains a space or other funny symbol.
You don't need recursion if you either use bash's globstar optional behavior, or find.
Here's a possibility with the former, that will hopefully show you better practices:
#!/bin/bash
shopt -s globstar
shopt -s nullglob
funk() {
local search=${2//\//\\/}
local replace=${3//\//\\/}
for f in "$1"/**/*.txt; do
sed -i "s/$search/$replace/g" -- "$f"
cp -nvt "$4" -- "$f"
done
}
if (($#!=4)); then
echo >&2 "Need 4 arguments"
exit 1
fi
funk "$@"
The same function funk using find:
#!/bin/bash
funk() {
local search=${2//\//\\/}
local replace=${3//\//\\/}
find "$1" -name '*.txt' -type f -exec sed -i "s/$search/$replace/g" -- {} \; -exec cp -nvt "$4" -- {} \;
}
if (($#!=4)); then
echo >&2 "Need 4 arguments"
exit 1
fi
funk "$@"
In cp I'm using
the -n switch: no clobber, so as not to overwrite an existing file. Use it if your version of cp supports it, unless you actually want to overwrite files.
the -v switch: verbose, will show you the moved files (optional).
the -t switch: -t followed by a directory tells cp to copy into that directory. It's a very good thing to use cp this way: imagine that instead of an existing directory you give an existing file: without this feature, that file would get overwritten several times (well, if you omit the -n option)! With this feature the existing file remains safe.
Also notice the use of --. If your cp and sed support it (GNU sed and cp do), always use it! It means end of options. Without it, a filename starting with a hyphen would confuse the command into interpreting it as an option. With --, it is safe to pass a filename that may start with a hyphen.
Notice that in the search and replace patterns I replaced all slashes / by their escaped form \/ so as not to clash with the separator in sed if a slash happens to appear in search or replace.
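That escaping step can be checked in isolation (the sample pattern is made up):

```shell
#!/bin/bash
# ${var//\//\\/} replaces every / in var with \/ so the pattern can sit
# safely between the / delimiters of the sed s/// command.
search='a/b'
escaped=${search//\//\\/}

printf '%s\n' "$escaped"                    # a\/b
printf 'x a/b y\n' | sed "s/$escaped/Z/g"   # x Z y
```

Without the escaping, sed would see s/a/b/Z/g and fail with an error about an unknown option to s.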
Enjoy!
As pointed out, looping over find output is not a good idea. This answer also doesn't handle slashes in the search and replace strings.
Check gniourf_gniourf's answer.
How about using find for that?
#!/bin/bash
funk () {
local dir=$1; shift
local search=$1; shift
local replace=$1; shift
local dest=$1; shift
mkdir -p "$dest"
for file in `find "$dir" -name '*.txt'`; do
sed -i "s/$search/$replace/g" "$file"
cp "$file" "$dest"
done
}
if [[ $# -lt 4 ]] ; then
echo "Need 4 arguments"
exit 2;
fi
funk "$@"
Though if you have files with the same name in different subdirectories, those will be overwritten. Is that an issue in your case?
