I take pictures for my job on three different cameras. I am looking to automate the process of organising these into folders based on the first 3 letters of the filename and then the file type.
Unfortunately my bash script syntax knowledge is non-existent, so I can't figure out more than creating the directories.
An example of the incoming files:
HAM1234.JPG
HAM1234.RAW
HDR1234.JPG
HDR1234.RAW
STL1234.JPG
STL1234.RAW
These would go into 3 folders
HAM - REF/{RAW,JPG}
HDR - HDRI/{RAW,JPG}
STL - STILLS/{RAW,JPG}
With the file types sorted into the matching subfolders.
Any help would be much appreciated!
Jason
With your shown examples:
#!/bin/bash
mkdir -p {HAM\ -\ REF,HDR\ -\ HDRI,STL\ -\ STILLS}/{RAW,JPG}
shopt -s failglob
mv HAM*.JPG "HAM - REF/JPG"
mv HAM*.RAW "HAM - REF/RAW"
mv HDR*.JPG "HDR - HDRI/JPG"
mv HDR*.RAW "HDR - HDRI/RAW"
mv STL*.JPG "STL - STILLS/JPG"
mv STL*.RAW "STL - STILLS/RAW"
From man bash:
failglob: If set, patterns which fail to match filenames during pathname expansion result in an expansion error.
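To illustrate why failglob matters here, a minimal hypothetical example (the ZZZ prefix is made up and matches nothing):
#!/bin/bash
shopt -s failglob
# No file matches ZZZ*.JPG, so bash reports an expansion error
# (something like "bash: no match: ZZZ*.JPG") and mv is never run.
mv ZZZ*.JPG "HAM - REF/JPG"
# Without failglob, mv would instead receive the literal string 'ZZZ*.JPG'
# and fail with its own "No such file or directory" message.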
Please refer to a very basic bash script based on your requirements and the information you provided:
#!/bin/bash
for files in *
do
    if [ -f "$files" ]; then
        fileprefix=${files%%[0-9]*.*}
        case $fileprefix in
            HAM)
                if [ ! -d REF ]; then
                    mkdir REF
                fi
                mv "$fileprefix"*.* REF
                ;;
            HDR)
                if [ ! -d HDRI ]; then
                    mkdir HDRI
                fi
                mv "$fileprefix"*.* HDRI
                ;;
            STL)
                if [ ! -d STILLS ]; then
                    mkdir STILLS
                fi
                mv "$fileprefix"*.* STILLS
                ;;
        esac
    fi
done
The first piece of logic, for files in *, assumes the script itself and all the image files reside in the same directory, i.e. the PWD.
if [ -f "$files" ]; then
Only run the logic if the matching item in the iteration is a file
fileprefix=${files%%[0-9]*.*}
Extract the first 3 letters and store them in the variable fileprefix.
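For example, with one of the filenames from the question, the %% expansion removes the longest suffix starting with a digit:
files='HAM1234.JPG'
echo "${files%%[0-9]*.*}"    # prints: HAM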
After that a simple switch-case follows.
You can modify the script as per your needs.
Hope this answers your question.
Related
I'm trying to loop through files in a directory, where the directory is passed through as an argument. I currently have the following script saved in test.sh:
#!/bin/bash
for filename in "$1"/*; do
echo "File:"
echo $filename
done
And I am running the above using:
sh test.sh path/to/loop/over
However, the above doesn't output the files at the directory path/to/loop/over, it instead outputs:
File:
path/to/loop/over/*
I'm guessing it's interpreting path/to/loop/over/* as a string and not a directory. My expected output is the following:
File:
foo.txt
File:
bar.txt
Where foo.txt and bar.txt are files in the path/to/loop/over/ directory. I found this answer, which suggested adding a /* after the $1; however, this doesn't seem to help (neither do these suggestions).
Iterate over content of directory
Compatible answer (not only bash)
As this question is tagged shell, there is a POSIX-compatible way:
#!/bin/sh
for file in "$1"/* ;do
[ -f "$file" ] && echo "Process '$file'."
done
This will be enough (it works with filenames containing spaces):
$ myscript.sh /path/to/dir
Process '/path/to/dir/foo'.
Process '/path/to/dir/bar'.
Process '/path/to/dir/foo bar'.
This works well with any POSIX shell. Tested with bash, ksh, dash, zsh and busybox sh.
#!/bin/sh
cd "$1" || exit 1
for file in * ;do
[ -f "$file" ] && echo "Process '$file'."
done
This version won't print the path:
$ myscript.sh /path/to/dir
Process 'foo'.
Process 'bar'.
Process 'foo bar'.
Some bash ways
Introduction
I don't like to use shopt when not needed... (it changes standard bash behaviour and makes scripts less readable).
There is an elegant way of doing this using standard bash, without requiring shopt.
Of course, the previous answer works fine under bash, but there are some interesting ways of making your script more powerful, flexible, pretty, detailed...
Sample
#!/bin/bash
die() { echo >&2 "$0 ERROR: $@"; exit 1; }        # Emergency exit function
[ "$1" ] || die "Argument missing."               # Exit unless argument submitted
[ -d "$1" ] || die "Arg '$1' is not a directory." # Exit if argument is not a dir
cd "$1" || die "Can't access '$1'."               # Exit unless dir is accessible
files=(*)                                         # All file names in array $files
[ -f "$files" ] || die "No files found."          # Exit if no files found
for file in "${files[@]}"; do                     # For each file:
    echo Process "$file"                          # Process file
done
Explanation: considering globbing vs real files
When doing:
files=(/path/to/dir/*)
variable $files becomes an array containing all files contained under /path/to/dir/:
declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")
But if nothing matches the glob pattern, the star won't be replaced and the array becomes:
declare -p files
declare -a files=([0]="/path/to/dir/*")
From there, looking at $files is like looking at ${files[0]}, i.e. the first field in the array. So
[ -f "$files" ] || die "No files found."
will execute the die function unless the first field of the array files is a file ([ -e "$files" ] to check for an existing entry, [ -d "$files" ] to check for an existing directory, and so on... see man bash or help test).
But you could replace this filesystem test with a string-based test, like:
[ "$files" = "/path/to/dir/*" ] && die "No files found."
or, using array length:
((${#files[@]}==1)) && [ "${files##*/}" = "*" ] && die "No files found."
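For instance (the path here is purely illustrative), both string-based tests fire on a directory with no matching files:
files=(/path/to/empty/*)      # nothing matches, so the literal pattern is kept
declare -p files              # declare -a files=([0]="/path/to/empty/*")
[ "$files" = "/path/to/empty/*" ] && echo "No files found."
((${#files[@]}==1)) && [ "${files##*/}" = "*" ] && echo "No files found."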
Dropping paths by using Parameter expansion:
To suppress the path from the filenames, instead of cd $path you could do:
targetPath=/path/to/dir
files=($targetPath/*)
[ -f "$files" ] || die "No files found."
Then:
declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")
You could then do:
printf 'File: %s\n' "${files[@]#$targetPath/}"
File: bar
File: baz
File: foo
This would happen if the directory is empty, or misspelled. The shell (in its default configuration) simply doesn't expand a wildcard if it has no matches. (You can control this in Bash with shopt -s nullglob; with this option, wildcards which don't match anything are simply removed.)
You can verify this easily for yourself. In a directory with four files,
sh$ echo *
a file or two
sh$ echo [ot]*
or two
sh$ echo n*
n*
And in Bash,
bash$ echo n*
n*
bash$ shopt -s nullglob
bash$ echo n*
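Applied to the script from the question, nullglob keeps the loop body from ever seeing the unexpanded pattern; this is simply the original test.sh with the option added:
#!/bin/bash
shopt -s nullglob             # unmatched globs expand to nothing instead of themselves
for filename in "$1"/*; do
    echo "File:"
    echo "$filename"
done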
I'm guessing you are confused about how the current working directory affects the resolution of directory names; maybe read Difference between ./ and ~/
Below is a bash script to move files around and rename them. The problem is it doesn't work when there is more than one file in the directory. I'm assuming that's because the last parameter in the mv command is a file. Any suggestions?
'#!/bin/bash'
'INPUTDIR="/home/southern-uniontn/S001007420"'
'OUTPUTDIR="/mnt/edi-06/southern-uniontn/flats-in"'
'BACKUPDIR="/backup/southern-uniontn/S001007420"'
YEAR=`date +%Y`
MONTH=`date +%m`
DAY=`date +%d`
HOUR=`date +%H`
MINUTE=`date +%M`
######## Do some error checking #########
# Does backup dir exist?
if [ ! -d $BACKUPDIR/$YEAR ]
then
    mkdir $BACKUPDIR/$YEAR
fi
if [ ! -d $BACKUPDIR/$YEAR/$MONTH ]
then
    mkdir $BACKUPDIR/$YEAR/$MONTH
fi
if [ ! -d $BACKUPDIR/$YEAR/$MONTH/$DAY ]
then
    mkdir $BACKUPDIR/$YEAR/$MONTH/$DAY
fi
if [[ $(find $INPUTDIR -type f | wc -l) -gt 0 ]];
then
    ###### Rename the file, move it to Backup, then copy to the Output Directory #####
    for f in $INPUTDIR/*
    do
        echo "`date` - Move recurring txt flat file to BackupDir for Union TN from Southern"
        mv $INPUTDIR/* $BACKUPDIR/$YEAR/$MONTH/$DAY/UnionTN-S001007420-$YEAR$MONTH$DAY-$HOUR$MINUTE.txt
        sleep 2
        echo "`date` - Copy backup file to the Union TN Output Directory"
        cp $BACKUPDIR/$YEAR/$MONTH/$DAY/UnionTN-S001007420-$YEAR$MONTH$DAY-$HOUR$MINUTE.txt $OUTPUTDIR/
    done;
fi
Some notes:
Get out of the habit of using ALLCAPS variable names, leave those as reserved by the shell. One day you'll write PATH=something and then wonder why your script is broken.
mkdir -p can create parent directories, and will not error if the dir already exists
store the filenames in an array. Then the shell does not have to duplicate the work, and you don't need to count how many there are: if there are no files, the loop has zero iterations
if you want to keep the same directory hierarchy in the outputdir, you need to do that by hand
use read to get the date parts
with bash v4.2+, printf can be used instead of calling out to date
use magic value "-1" to mean "now".
printf '%(%Y-%m-%d)T\n' -1 prints "2021-10-25" (as of the day I write this)
This is, I think, what you want:
#!/bin/bash
inputdir='/home/southern-uniontn/S001007420'
outputdir='/mnt/edi-06/southern-uniontn/flats-in'
backupdir='/backup/southern-uniontn/S001007420'
read year month day hour minute < <(printf '%(%Y %m %d %H %M)T\n' -1)
# create backup dirs if not exists
date_dir="$year/$month/$day"
mkdir -p "$backupdir/$date_dir"
mkdir -p "$outputdir/$date_dir"
mapfile -t files < <(find $inputdir -type f)
for f in "${files[@]}"
do
    ###### Rename the file, move it to Backup, then copy to the Output Directory #####
    backup_file="UnionTN-S001007420-$year$month$day-$hour$minute.txt"
    printf '%(%c)T - Move recurring txt flat file to backupdir for Union TN from Southern\n' -1
    mv "$f" "$backupdir/$date_dir/$backup_file"
    printf '%(%c)T - Copy backup file to the Union TN Output Directory\n' -1
    cp "$backupdir/$date_dir/$backup_file" "$outputdir/$date_dir/$backup_file"
done
When using a glob with mv, the target must be an existing directory, and all matching files will be moved inside that directory.
In your case,
mv $INPUTDIR/* $BACKUPDIR/$YEAR/$MONTH/$DAY/UnionTN-S001007420-$YEAR$MONTH$DAY-$HOUR$MINUTE.txt
tells mv to move all files matching $INPUTDIR/* to a single destination named $BACKUPDIR/$YEAR/$MONTH/$DAY/UnionTN-S001007420-$YEAR$MONTH$DAY-$HOUR$MINUTE.txt, which only works when there is exactly one match.
I'm not sure exactly what you're trying to do, but I hope this helps.
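A minimal reproduction of the difference (the file names are made up, and the error text is roughly GNU mv's):
mkdir -p backup
touch a.txt b.txt
# Target is an existing directory: both files are moved into it.
mv *.txt backup/
# With more than one source and a non-directory target, mv refuses,
# failing with something like: mv: target 'combined.txt' is not a directory
touch a.txt b.txt             # recreate the files so the glob matches again
mv *.txt combined.txt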
Some more advice you could use (a short sketch applying several of these tips follows the list):
Don't put the shebang (the first line beginning with "#") and the first three variable declarations inside single-quotes.
Some argue it is more portable and better to write #!/usr/bin/env bash instead of #!/bin/bash in the shebang
if [ CONDITION ]; then ACTION; fi statements can be simplified by writing [ CONDITION ] && ACTION
You reduce your likelihood of encountering unexpected behaviour by double-quoting your strings and variables (i.e. write "${year}/${month}/" instead of $year/$month).
No need to call mkdir a, followed by mkdir a/b, then mkdir a/b/c and so on, you can just call mkdir -p a/b/c. The -p flag tells mkdir to create parent directories if they don't already exist.
It is unnecessary to validate the existence of a directory before calling mkdir since mkdir already validates that for you.
As pointed out by commenters, all-caps variable names are conventionally reserved for special POSIX-related variables. You should use another style of casing.
You could use date to do the formatting for you: date +%Y/%m/%d will print 2021/10/25
Strings without interpolation can have single-quotes.
(Optional, prevent undesired behaviors) Put set -e at the beginning of your scripts, after the shebang, to tell bash to halt if an error is encountered
And finally, use man <command_name> for built-in documentation!
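A short sketch putting several of these tips together (the directory name is a placeholder, not the one from the question):
#!/usr/bin/env bash
set -e                                 # halt on the first error

backupdir='/backup/example'            # lowercase name, single quotes (no interpolation needed)
date_dir=$(date +%Y/%m/%d)             # let date do the formatting, e.g. 2021/10/25

mkdir -p "${backupdir}/${date_dir}"    # creates missing parents, no existence check required
[ -d "${backupdir}/${date_dir}" ] && echo "backup directory is ready"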
I'm trying to print all directories/subdirectories from a given start directory.
for i in $(ls -A -R -p); do
    if [ -d "$i" ]; then
        printf "%s/%s \n" "$PWD" "$i"
    fi
done;
This script returns all of the directories found in the . directory and all of the files in that directory, but for some reason the test fails for subdirectories. All of the directories end up in $i and the output looks exactly the same.
Let's say I have the following structure:
foo/bar/test
echo $i prints
foo/
bar/
test/
While the contents of the folders are listed like this:
./foo:
file1
file2
./bar:
file1
file2
However the test statement just prints:
PWD/TO/THIS/DIRECTORY/foo
For some reason it returns true for the first level directories, but false for all of the subdirectories.
(ls is probably not a good way of doing this and I would be glad for a find statement that solves all of my issues, but first I want to know why this script doesn't work the way you'd think.)
As pointed out in the comments, the issue is that the directory names include a :, so -d is false.
I guess that this command gives you the output you want (although it requires Bash):
# enable globstar for **
# disabled in non-interactive shell (e.g. a script)
shopt -s globstar
# print each path ending in a / (all directories)
# ** expands recursively
printf '%s\n' **/*/
The standard way would be either to do the recursion yourself, or to use find:
find . -type d
Consider your output:
dir1:
dir1a
Now, the following will be true:
[ -d dir1/dir1a ]
but that's not what your code does; instead, it runs:
[ -d dir1a ]
To avoid this, don't attempt to parse ls; if you want to implement recursion in baseline POSIX sh, do it yourself:
callForEachEntry() {
    # because calling this without any command provided would try to execute all found files
    # as commands, checking for safe/correct invocation is essential.
    if [ "$#" -lt 2 ]; then
        echo "Usage: callForEachEntry starting-directory command-name [arg1 arg2...]" >&2
        echo " ...calls command-name once for each file recursively found" >&2
        return 1
    fi

    # try to declare variables local, swallow/hide error messages if this fails; code is
    # defensively written to avoid breaking if recursing changes either, but may be faulty if
    # the command passed as an argument modifies "dir" or "entry" variables.
    local dir entry 2>/dev/null ||: "not strict POSIX, but available in dash"

    dir=$1; shift

    for entry in "$dir"/*; do
        # skip if the glob matched nothing
        [ -e "$entry" ] || [ -L "$entry" ] || continue

        # invoke user-provided callback for the entry we found
        "$@" "$entry"

        # recurse last, in case we are on a baseline platform where the "local" above failed.
        if [ -d "$entry" ]; then
            callForEachEntry "$entry" "$@"
        fi
    done
}
# call printf '%s\n' for each file we recursively find; replace this with the code you
# actually want to call, wrapped in a function if appropriate.
callForEachEntry "$PWD" printf '%s\n'
find can also be used safely, but not as a drop-in replacement for the way ls was used in the original code -- for dir in $(find . -type d) is just as buggy. Instead, see the "Complex Actions" and "Actions In Bulk" section of Using Find.
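For reference, a sketch of one safe way to consume find's output in bash, using NUL-delimited records so unusual file names survive; the loop body is just an example action:
#!/bin/bash
# Read each directory name emitted by find, delimited by NUL bytes,
# so names containing spaces or newlines are handled correctly.
while IFS= read -r -d '' dir; do
    printf 'Found directory: %s\n' "$dir"
done < <(find . -type d -print0)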
I have a bash script which goes through a list of directories, and if a directory contains zip files it binds the zip file name to a variable, performs some actions on it and then moves on to the next one in that dir. Unfortunately, it only works when there is one zip file per directory. If there are more, it gives the error "Binary operator expected".
Script:
if [ -e $currdir/*.zip ]; then
for file in $currdir/*.zip; do
echo the zip is "${file##*/}"
done
Please help me to rework the script accordingly.
If you need exactly that check, then you can use:
if [[ -n $(echo "$currdir"/*.zip) ]]; then
for f in "$currdir"/*.zip; do
echo "Processing $f file..";
done
fi
But I'd prefer just looping over the files that have the .zip extension:
for f in "$currdir"/*.zip; do
echo "Processing $f file..";
done
Use
for file in "$currdir"/*.zip; do
[ -e "$file" ] || continue
echo the zip is "${file##*/}"
done
As pointed out in the comments, the glob is expanded by the shell and then [ is called with the result, i.e.:
[ -e * ]
will become:
[ -e Desktop Documents Downloads ... ]
Therefore expanding inside the for loop and checking each file individually will work.
Please see: http://mywiki.wooledge.org/WordSplitting and http://wiki.bash-hackers.org/syntax/expansion/globs
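For reference, a quick reproduction of the original error with two matching files (file names are made up; the exact wording may vary between shells):
$ touch one.zip two.zip
$ [ -e *.zip ] && echo found
bash: [: one.zip: binary operator expected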
I think the case construct is too often overlooked.
case *.jpg in *.jpg ) echo found files ;; * ) echo no files found ;; esac
produces the correct message in my dir with 1000s+ jpgs ;-)
Change both references from jpg to zip and see if it works for you.
IHTH
My IM stores the logs according to the contact name. I have created a file with the list of active contacts. My problem is the following:
I would like to create a bash script which reads the active contacts' names from the file and compares them with the directories. If a directory name isn't found on the list, it would be moved to another directory (let's call it "archive"). I'll try to visualise it for you.
content of the list:
contact1
contact2
content of the dir
contact1
contact2
contact3
contact4
after running the script, the content of the dir:
contact1
contact2
contact3 ==> ../archive
contact4 ==> ../archive
You could use something like this:
mv $(ls | grep -v -x -F -f ../file.txt) ../archive
Where ../file.txt contains the names of the directories that should not be moved. It is assumed here that the current directory only contains directories; if that is not the case, ls should be replaced with something else. Note that the command fails if there are no directories that should be moved.
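With the directory contents and contact list from the question, the grep keeps exactly the names that are not on the list (-F: fixed strings, -x: whole-line match, -v: invert, -f: read patterns from a file):
$ cat ../file.txt
contact1
contact2
$ ls
contact1  contact2  contact3  contact4
$ ls | grep -v -x -F -f ../file.txt
contact3
contact4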
Since in the comments to the other answer you state that directories with whitespace in the name can occur, you could replace this by:
for i in *
do
    echo "$i" | grep -v -x -q -F -f ../file.txt && mv "$i" ../archive
done
This is an improved version of marcog's answer. Note that the associative array requires Bash 4.
#!/bin/bash
sourcedir=/path/to/foo
destdir=/path/to/archive
contactfile=/path/to/list
declare -A contacts
while read -r contact
do
    contacts[$contact]=1
done < "$contactfile"

for contact in "$sourcedir"/*
do
    if [[ -f $contact ]]
    then
        index=${contact##*/}
        if [[ ! ${contacts[$index]} ]]
        then
            mv "$contact" "$destdir"
        fi
    fi
done
Edit:
If you're moving directories instead of files, then change the for loop above to look like this:
for contact in "$sourcedir"/*/
do
    index=${contact/%\/}
    index=${index##*/}
    if [[ ! ${contacts[$index]} ]]
    then
        mv "$contact" "$destdir"
    fi
done
There might be a more concise solution, but this works. I'd strongly recommend prefixing the mv with echo to test it out first, otherwise you could end up with a serious mess if it doesn't do what you want.
declare -A contacts
for contact in "$@"
do
    contacts[$contact]=1
done

ls a | while read contact
do
    if [[ ! ${contacts[$contact]} ]]
    then
        mv "a/$contact" ../archive
    fi
done