Bash script to move all png files in a folder and its subfolders to another directory? - bash

In ~/Desktop/a/ I have .png files, and there are also subfolders within it that contain .png files.
I'd like to move all of those .png files to another folder.
This is my code so far. It runs, but nothing is placed into the target folder. What is the problem?
#!/bin/bash
cd ~/Desktop/a/
for f in $(find . -type f -name "*.png")
do
mv $f ~/Desktop/new/
done

My guess is that the filenames contain spaces or other special characters, which breaks the unquoted $(find ...) loop. Let find run mv itself instead:
find ~/Desktop/a/ -type f -name "*.png" -exec mv "{}" ~/Desktop/new/ \;
or
find ~/Desktop/a/ -type f -name "*.png" -print0 | xargs -0 -I{} mv "{}" ~/Desktop/new/
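If your mv is GNU mv (it supports -t to name the target directory first), find's -exec ... {} + form can also batch many files into a single mv call instead of forking one mv per file; a minimal sketch under that assumption:
find ~/Desktop/a/ -type f -name "*.png" -exec mv -t ~/Desktop/new/ {} +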

If your bash is new enough, you can also use globstar:
cd ~/Desktop/a || exit 1
shopt -s globstar
mv -- **/*.png ~/Desktop/new
Or (if there are too many files to fit in a single command line):
shopt -s globstar
for f in ~/Desktop/a/**/*.png; do
    mv -- "$f" ~/Desktop/new
done
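One caveat with flattening subfolders this way: files that share a basename will overwrite each other in ~/Desktop/new. A sketch of a guard, assuming GNU mv (which supports --backup=numbered):
shopt -s globstar
for f in ~/Desktop/a/**/*.png; do
    mv --backup=numbered -- "$f" ~/Desktop/new/
done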

Related

copy files with the base directory

I am searching specific directory and subdirectories for new files, I will like to copy the files. I am using this:
find /home/foo/hint/ -type f -mtime -2 -exec cp '{}' ~/new/ \;
It is copying the files successfully, but some files have same name in different subdirectories of /home/foo/hint/.
I will like to copy the files with its base directory to the ~/new/ directory.
test@serv> find /home/foo/hint/ -type f -mtime -2 -exec ls '{}' \;
/home/foo/hint/do/pass/file.txt
/home/foo/hint/fit/file.txt
test@serv>
~/new/ should look like this after copy:
test@serv> ls -R ~/new/
/home/test/new/pass/:
file.txt
/home/test/new/fit/:
file.txt
test@serv>
platform: Solaris 10.
Since you can't use rsync or fancy GNU options, you need to roll your own using the shell.
The find command lets you run a full shell in your -exec, so you should be good to go with a one-liner to handle the names.
If I understand correctly, you only want the parent directory, not the full tree, copied to the target. The following might do:
#!/usr/bin/env bash
findopts=(
    -type f
    -mtime -2
    -exec bash -c 'd="${0%/*}"; d="${d##*/}"; mkdir -p "$1/$d"; cp -v "$0" "$1/$d/"' {} ./new \;
)
find /home/foo/hint/ "${findopts[@]}"
Results:
$ find ./hint -type f -print
./hint/foo/slurm/file.txt
./hint/foo/file.txt
./hint/bar/file.txt
$ ./doit
./hint/foo/slurm/file.txt -> ./new/slurm/file.txt
./hint/foo/file.txt -> ./new/foo/file.txt
./hint/bar/file.txt -> ./new/bar/file.txt
I've put the options to find into a bash array for easier reading and management. The script for the -exec option is still a little unwieldy, so here's a breakdown of what it does for each file. Bear in mind that in this form the arguments passed after the script are numbered from zero: the {} becomes $0 and the target directory becomes $1...
d="${0%/*}" # Store the source directory in a variable, then
d="${d##*/}" # strip everything up to the last slash, leaving the parent.
mkdir -p "$1/$d" # create the target directory if it doesn't already exist,
cp "$0" "$1/$d/" # then copy the file to it.
I used cp -v for verbose output, as shown in "Results" above, but IIRC it's not supported by Solaris either, so the -v can be safely dropped.
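If the $0/$1 numbering looks odd, this throwaway example (using one of the paths from the question) shows how bash -c assigns the arguments that follow the script:
bash -c 'echo "arg0=$0 arg1=$1"' /home/foo/hint/do/pass/file.txt ./new
# prints: arg0=/home/foo/hint/do/pass/file.txt arg1=./new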
The --parents flag should do the trick:
find /home/foo/hint/ -type f -mtime -2 -exec cp --parents '{}' ~/new/ \;
Try testing with rsync -R, for example:
find /your/path -type f -mtime -2 -exec rsync -R '{}' ~/new/ \;
From the rsync man:
-R, --relative
Use relative paths. This means that the full path names specified on the
command line are sent to the server rather than just the last parts of the
filenames.
The problem with the answers by @Mureinik and @nbari might be that the full absolute path of each file is recreated inside the target directory. In this case you might want to switch to the base directory before the command and go back to your current directory afterwards:
path_current=$PWD; cd /home/foo/hint/; find . -type f -mtime -2 -exec cp --parents '{}' ~/new/ \; ; cd "$path_current"
or
path_current=$PWD; cd /home/foo/hint/; find . -type f -mtime -2 -exec rsync -R '{}' ~/new/ \; ; cd "$path_current"
Both ways work for me on a Linux platform. Let's hope that Solaris 10 knows about rsync's -R! ;)
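A subshell keeps the directory change contained, so there is nothing to restore afterwards (same cp --parents assumption as above):
( cd /home/foo/hint/ && find . -type f -mtime -2 -exec cp --parents '{}' ~/new/ \; )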
I found a way around it:
cd ~/new/
find /home/foo/hint/ -type f -mtime -2 -exec nawk -v f={} '{n=split(FILENAME, a, "/");j= a[n-1];system("mkdir -p "j"");system("cp "f" "j""); exit}' {} \;
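For readability, the same nawk call spread over several lines with comments (behavior unchanged; note that the block only runs if the file has at least one line of input):
find /home/foo/hint/ -type f -mtime -2 -exec nawk -v f={} '
{
    n = split(FILENAME, a, "/")   # split the path of the file being read on "/"
    j = a[n-1]                    # second-to-last component = parent directory name
    system("mkdir -p " j)         # create it under the current directory (~/new/)
    system("cp " f " " j)         # copy the file into it
    exit                          # one input record is enough; stop reading
}' {} \;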

Rename files in several subdirectories

I want to rename a file present in several subdirectories using bash script.
my files are in folders:
./FolderA/ABCD/ABCD_Something.ctl
./FolderA/EFGH/EFGH_Something.ctl
./FolderA/WXYZ/WXYZ_Something.ctl
I want to rename all of the .ctl files to the same name (name.ctl).
I tried several commands using mv or rename but they didn't work.
Working from FolderA:
find . -name '*.ctl' -exec rename *.ctl name.ctl '{}' \;
or
for f in ./*/*.ctl; do mv "$f" "${f/*.ctl/name .ctl}"; done
or
for f in $(find . -type f -name '*.ctl'); do mv $f $(echo "$f" | sed 's/*.ctl/name.ctl/'); done
Can you help me do this using bash?
Thanks
You can do this in one line with:
find . -name '*.ctl' -exec sh -c 'mv "$1" "$(dirname "$1")"/name.ctl' x {} \;
The x just lets the filename become positional parameter $1 rather than $0, which (in my opinion) is wrong to use as a parameter.
Try this:
find . -name '*.ctl' | while IFS= read -r f; do
    dn=$(dirname "${f}")
    # remove the echo after you sanity-check the output
    echo mv "${f}" "${dn}/name.ctl"
done
find should get all the files you want, dirname will get just the directory name, and mv will perform the rename. You can remove the quotes if you're sure that you'll never have spaces in the names.
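If your find has -execdir (GNU and BSD find both do), the rename can also be done without dirname, since the command runs from inside each file's directory; a sketch:
find . -name '*.ctl' -execdir mv -- {} name.ctl \;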

bash script optimization file rename

I am a total noob, but I figured out this script for doing the following:
I have a folder called "unrar"; inside it are subfolders with unknown names, each with a rar file inside.
I enter the unknown subfolder, find the rar file and unrar it in that subfolder.
After that I find the new file, rename it to the unknown subfolder's name, and move it to ./unrar.
#!/bin/bash
cd /home/user/unrar/
for dir in /home/user/unrar/*;
do (cd "$dir" && find -name "*.rar" -execdir unrar e -r '{}' \;); done
echo "$(tput setaf 2)-> unrar done!$(tput sgr0)"
for dir in /home/user/unrar/*;
do (cd "$dir" && find -name "*.mkv" -exec mv '{}' "${PWD##*\/}.mkv" \;); done
for dir in /home/user/unrar/*;
do (cd "$dir" && find -name "*.mp4" -exec mv '{}' "${PWD##*\/}.mp4" \;); done
for dir in /home/user/unrar/*;
do (cd "$dir" && find -name "*.avi" -exec mv '{}' "${PWD##*\/}.avi" \;); done
cd /home/user/unrar
find -name "*.mkv" -exec mv '{}' /home/user/unrar \;
find -name "*.mp4" -exec mv '{}' /home/user/unrar \;
find -name "*.avi" -exec mv '{}' /home/user/unrar \;
This works fine with most files, but in some cases it doesn't.
I want to find *.rar in DIR and unrar it. The new file (.mkv|.avi|.mp4) should be renamed to DIR(.mkv|.avi|.mp4) and moved to ./unrar.
This is my filestructure.
./unrar/
- unknownsubfolder/
  - file.rar
  - file.r00
  - ....
- unknownsubfolder1/
  - s01/
    - file.rar
    - file.r00
    - ....
  - s02/
    - file.rar
    - file.r00
    - ....
  - ....
In case 1, unrar "/unknownsubfolder/file.rar" and get "x.mkv". The file is renamed from "x.mkv" to "unknownsubfolder.mkv" and moved to "./unrar/unknownsubfolder.mkv"
(same with *.avi + *.mp4) == perfect
In case 2, with my script unknownsubfolder1/s01/file.rar will be unrarred, but renamed to unknownsubfolder1.mkv instead of s01.mkv.
(If there are more, like s02, s03, s04 ..., I always end up with a single unknownsubfolder1.mkv file in ./unrar.) == wrong output
So I guess I have 3 questions:
How do I get the right DIR name for renaming the file? Or how do I enter unknownsubfolder1/s01 ...?
Is there a way to exclude a word from the find? Sometimes "unknownsubfolder" contains another folder + file called "sample(.mkv|.avi|.mp4)". I would like to exclude that, to prevent the original file from being overwritten by the sample file, which happens sometimes.
I am sure I can combine some of the code to make it even shorter. Could someone explain how? E.g. how do I combine the mkv, avi and mp4 handling into one line?
(EDIT: for better understanding)
UPDATE:
I adjusted the solution to work with unrar. Since I did not have unrar installed previously, I used gunzip to construct the solution and then simply replaced it with unrar. The problem with this approach was that, by default, unrar extracts to the current working directory. Another difference is that the name of the extracted file can be completely different from the archive's name; it is not just a matter of different extensions. The original archive is also not deleted after extraction.
Here is the solution specifically tailored to work with unrar with respect to aforementioned behavior:
#!/bin/bash
path="$1"
omit="$2"
while IFS= read -r f; do
    unrar e -r "${f}" "${f%/*}" > /dev/null
done < <(find "${path}" -type d -name "${omit}" -prune -o -type f -print)

while IFS= read -r f; do
    new="${f%/*}"
    new="${new##*/}"
    mv "${f}" "${path}/${new}"
done < <(find "${path}" -type d -name "${omit}" -prune -o -type f -a \! -name '*.rar' -print)
You can save the script, e.g., as rename-script (do not forget to make it executable), and then call it like
./rename-script /path/to/unrar omitfolder
Notice that inside the script there is no cd. You will have to provide at least the location of the unrar folder as the first parameter, otherwise you will get an error. In the OP's case this would be /home/user/unrar. The omitfolder is not a path, it is just the name of the folder that you want to omit. So in the OP's case this would be sample.
./rename-script /home/user/unrar sample
As requested by OP in the comments, you can read about the bash read-builtin and process substitution in order to understand how the while-loop works and how it assigns the filenames returned by find to the variable f.
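The -prune idiom that both find calls rely on can be tried on its own first; for the OP's layout this lists the files under /home/user/unrar while skipping any directory named "sample":
find /home/user/unrar -type d -name sample -prune -o -type f -print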

Looping through all files in a given directory [duplicate]

Here is what I'm trying to do:
Give a parameter to a shell script that will run a task on all files of jpg, bmp, tif extension.
Eg: ./doProcess /media/repo/user1/dir5
and all jpg, bmp, tif files in that directory will have a certain task run on them.
What I have now is:
for f in *
do
imagejob "$f" "output/${f%.output}" ;
done
I need help with the for loop to restrict the file types, and also with some way of starting under a specified directory instead of the current directory.
Use shell expansion rather than ls
for file in *.{jpg,bmp,tif}
do
imagejob "$file" "output/${file%.output}"
done
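Since the question passes the directory as $1, the glob can be anchored there as well; with nullglob, patterns that match nothing expand to nothing instead of staying literal (a sketch, keeping the question's imagejob call verbatim):
shopt -s nullglob
for file in "$1"/*.{jpg,bmp,tif}
do
    imagejob "$file" "output/${file%.output}"
done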
Also, if you have bash 4.0+, you can use globstar:
shopt -s globstar
shopt -s nullglob
shopt -s nocaseglob
for file in **/*.{jpg,bmp,tif}
do
# do something with $file
done
for i in `ls $1/*.jpg $1/*.bmp $1/*.tif`; do
imagejob "$i";
done
This assumes you're using a bash-like shell where $1 is the first argument given to the script.
You could also do:
find "$1" -iname "*.jpg" -or -iname "*.bmp" -or -iname "*.tif" \
-exec imagejob \{\} \;
You could use a construct with backticks and ls (or any other command, of course):
for f in `ls *.jpg *.bmp *.tif`; do ...; done
The other solutions here are either Bash-only or recommend the use of ls in spite of it being a common and well-documented antipattern. Here is how to solve this in POSIX sh without ls:
for file in *.jpg *.bmp *.tif; do
... stuff with "$file"
done
If you have a very large number of files, perhaps you also want to look into
find . -maxdepth 1 -type f \( \
-name '*.jpg' -o -name '*.bmp' -o -name '*.tif' \) \
-exec stuff with {} +
which avoids the overhead of sorting the file names alphabetically. The -maxdepth 1 says to not recurse into subdirectories; obviously, take it out or modify it if you do want to recurse into subdirectories.
The -exec ... + predicate of find is a relatively new introduction; if your find is too old, you might want to use -exec ... \; or replace the -exec stuff with {} + with
find ... -print0 |
xargs -r0 stuff with
where, however, the -print0 option and the corresponding -0 option for xargs are again GNU extensions.
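Spelled out for this question (GNU find and xargs assumed; one file per invocation of the question's imagejob command, with the output argument left out here):
find "$1" -maxdepth 1 -type f \( \
    -name '*.jpg' -o -name '*.bmp' -o -name '*.tif' \) -print0 |
xargs -r0 -n1 imagejob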

using find with exec

I want to copy the files found by find (with the -exec cp option), but I'd like to change the names of those files, e.g. find ... -exec cp '{}' test_path/"test_"'{}', which should copy all files found by find into test_path, but with the prefix 'test_'. It doesn't work, though.
I'd be glad if anyone could give me some ideas on how to do it.
best regards
for i in `find . -name "FILES.EXT"`; do cp $i test_path/test_`basename $i`; done
It is assumed that you are in the directory that has the files to be copied and test_path is a subdir of it.
If you have Bash 4.0+ and assuming you are looking for txt files (globstar has to be enabled for ** to recurse):
cd /path
shopt -s globstar
for file in ./**/*.txt
do
    echo cp "$file" "/test_path/test${file}"
done
or with GNU find:
find /path -type f -iname "*.txt" -print0 | while IFS= read -r -d '' FILE
do
    cp "$FILE" "test_${FILE}"
done
OR another version of GNU find+bash
find /path -type f -name "*txt" -printf "cp '%p' '/tmp/test_%f'\n" | bash
OR this ugly one if you don't have GNU find
$ find /path -name '*.txt' -type f -exec basename {} \; | xargs -I file echo cp /path/file /destination/test_file
You should put the entire test_path/"test_"'{}' in double quotes.
Like:
find ... -exec cp "{}" "test_path/test_{}" \;
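Keep in mind that {} expands to the whole path that find produced, so the prefix ends up in front of the path rather than the file name. A sketch that prefixes only the basename, assuming a POSIX sh and an existing test_path directory:
find . -type f -name '*.txt' -exec sh -c '
    for f in "$@"; do
        cp "$f" "test_path/test_${f##*/}"   # strip the directory part, prefix the basename
    done
' sh {} +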
I would break it up a bit, like this:
for line in `find /tmp -type f`; do FULL=$line; name=`echo $line|rev|cut -d / -f -1|rev` ; echo cp $FULL "new/location/test_$name" ;done
Here's the output:
cp /tmp/gcc.version new/location/test_gcc.version
cp /tmp/gcc.version2 new/location/test_gcc.version2
Naturally, remove the echo from the last part so it's not just echoing what it would have done, and actually runs cp.
