Script for moving all *.512.png to a new folder - bash

Can you make a bash script for moving all the files ending in *.512.png to a new folder like res512 (which will be a new branch), keeping all the subfolders, for this repo? I tried for a really long time but I can't figure it out.

You're not very specific with what you're asking.
If you want to move all files that have the suffix .512.png from within your current directory to a new directory, you can use the following:
mkdir res512
cp -r *.512.png res512/
If you want to move all files that have the suffix .512.png from within your directory and all child directories into a new directory, you can use:
mkdir res512
for f in $(find . -type f -name "*.512.png")
do
cp "$f" res512/
done
(Note that this breaks on paths containing whitespace, because the command substitution is word-split.)
If you want to move all files that have the suffix .512.png including their directory structure into a new directory, you can use:
find . -name '*.512.png' -exec cp --parents \{\} res512/ \;
(res512 must already exist, e.g. from the mkdir res512 above.)
Replace cp with mv if you want to move the files instead of copying them. Note, however, that mv has no --parents option, so the last variant works only with cp; see the sketch below for a structure-preserving move.
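Putting the pieces together for the original question, a minimal sketch (assuming GNU find and cp, and that res512 sits at the top of the repo, so it is pruned from the search):
mkdir -p res512
# Copy, keeping the directory structure; prune res512 so its contents are not re-scanned.
find . -path ./res512 -prune -o -type f -name '*.512.png' -exec cp --parents {} res512/ \;
# To move instead, create each target directory first, then mv:
# find . -path ./res512 -prune -o -type f -name '*.512.png' \
#   -exec sh -c 'd="res512/${1%/*}"; mkdir -p "$d" && mv "$1" "$d"' sh {} \;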

Related

Copy all files with a certain extension from all subdirectories while preserving the subdirectory structure

How can I copy specific files from all directories and subdirectories to a new directory while preserving the original subdirectory structure?
This answer:
find . -name \*.xls -exec cp {} newDir \;
copies all xls files from all subdirectories into the same directory newDir. That is not what I want: the original directory structure is lost and everything is copied into one folder on top of each other. If an xls file is in /s1/s2/, it should be copied to newDir/s1/s2.
You can try:
find . -type f -name '*.xls' -exec sh -c \
'd="newDir/${1%/*}"; mkdir -p "$d" && cp "$1" "$d"' sh {} \;
This applies the d="newDir/${1%/*}"; mkdir -p "$d" && cp "$1" "$d" shell script to every xls file, that is, it first creates the target directory and then copies the file into it.
If you have a lot of files and performance issues you can try to optimize a bit with:
find . -type f -name '*.xls' -exec sh -c \
'for f in "$@"; do d="newDir/${f%/*}"; mkdir -p "$d" && cp "$f" "$d"; done' sh {} +
This second version processes the files in batches and thus spawns fewer shells.
This should do:
# Ensure that newDir exists and is empty. Omit this step if you
# don't want it.
[[ -d newDir ]] && rm -r newDir && mkdir newDir
# Copy the xls files.
rsync -a --include='**/*.xls' --include='*/' --exclude='*' . newDir
The trick here is the combination of include and exclude. By default, rsync copies everything below its source directory (. in your case). We change this by excluding everything, but also including the xls files.
In your example, newDir is itself a subdirectory of your working directory and hence part of the directory tree searched for copying. I would rethink this decision.
NOTE: This would also copy directories whose names end in .xls, and it recreates the whole directory structure of your source tree (even where there are no xls files in it), populating it only with the xls files.
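If that empty directory skeleton is unwanted, rsync's -m (--prune-empty-dirs) option skips directories that end up containing no files. A hedged variant (also excluding newDir itself, since it lives inside the source tree):
rsync -am --exclude='/newDir' --include='*/' --include='*.xls' --exclude='*' . newDir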
Thanks for the solutions.
Meanwhile I also found:
find . -name '*.xls' | cpio -pdm newDir
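For reference, cpio's -p (pass-through) mode reads the file list from stdin, -d creates the leading directories and -m preserves modification times. A hedged, whitespace-safe variant (assuming GNU find and GNU cpio, which accept NUL-delimited lists):
find . -name '*.xls' -print0 | cpio --null -pdm newDir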

Making a directory and moving files into that directory that match a pattern

I can pattern match files and move them into a directory using the line below. But I need to make the directory first.
(must make testdir directory first)
find . -type f -name '*-bak*' -exec mv '{}' ./testdir ';'
What I'm trying to do now is have the line of code also create the directory and move the files that match that pattern into that directory using the same line of code.
mkdir -p testdir && find . -type f -name '*-bak*' -exec mv {} testdir/ ';'
Be careful though, if you get two backups with the same name in different folders, you'll only be left with a single copy and all other copies overwritten!
EDIT: use mv -i to get prompted in that case instead of overwriting the files
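If the run has to be unattended, a hedged alternative (assuming GNU mv, which supports numbered backups) keeps every clashing file instead of prompting; the -path/-prune part just keeps find from descending into testdir itself:
mkdir -p testdir && find . -path ./testdir -prune -o -type f -name '*-bak*' -exec mv --backup=numbered {} testdir/ ';'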

How can I copy the contents of a directory located in multiple locations using the find command while preserving the directory structure?

I have a folder named accdb under multiple directories, all under one parent directory dist. I want to copy the contents of accdb for all of those directories while preserving the directory structure.
I succeeded in making the recursive folder structure with:
cd ~/dist; find . -name "accdb" -type d -exec mkdir -p -- ~/acc_trial/{} \;
But I am failing to copy the contents of accdb; this command only recreates the structure down to the accdb directory itself.
I tried
find . -name "accdb" -type d -exec mkdir -p -- ~/acc_trial/{} \ && cp -r {} ~/acc_trial/{} \;
I get an error:
find: missing argument to `-exec'
I don't know if this is possible using only a find expression; I'm pretty sure it is not. Besides, you must consider that if you have one accdb subfolder nested inside another accdb folder you'll probably get an error, which is why in the script I've made I decided to use rsync:
#!/bin/bash
DEST='/home/corronx/provisional/destination_dir'
#Clean destination directory, PLEASE BE CAREFUL IT MUST BE A REMOVABLE DIRECTORY
rm -rf $DEST/*
FIND='test'
LOOK_PATH='/home/corronx/provisional'
#Run find from LOOK_PATH so the paths handled below stay relative
cd "$LOOK_PATH" || exit 1
FILES=($(find . -type d -name "$FIND"))
for ((i=0; i<${#FILES[@]};i++))
do
#Remove first character .
FILES[$i]=${FILES[$i]:1:${#FILES[$i]}}
#Create directories in destination path
mkdir -p $DEST${FILES[$i]}
rsync -aHz --delete ${FILES[$i]:1:${#FILES[$i]}}/ $DEST${FILES[$i]}
echo $i
done
Explanation
First of all, I'd recommend using full paths in your script, because an rm -rf expression inside a script is pretty dangerous. (If you prefer, comment that line out and delete the destination folder by hand before running the script.)
DEST= Destination path.
FIND= Subfolder name that you are looking for.
LOOK_PATH= Path where you want to execute find.
I create an array called FILES that contains all the folders returned by the find expression; after that I just create the destination directories and run rsync to copy the files. I've used rsync because I think it copes better if there is any subdirectory with the same name.
PLEASE BE CAREFUL with the rm -rf expression: if DEST is not set you'll delete everything on your machine.
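For completeness, the single-command approach the asker attempted can be made to work by wrapping both steps in one shell invocation per directory. A minimal sketch, not from the thread, assuming no accdb folder is nested inside another accdb:
find . -name "accdb" -type d -exec sh -c 'mkdir -p ~/acc_trial/"$1" && cp -r "$1"/. ~/acc_trial/"$1"' sh {} \;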

Get all content from a folder with subfolder copied to another directory with bash

I have a directory that has many folders in it, which have subfolders again, and then there are some files. Can I write a bash script that copies all the files from the given directory into one single folder, so that I do not have to navigate through every single folder and copy the contents to another folder?
In the topmost dir under which you want the files to be copied:
find . -type f -exec cp {} /some/new/location \;
Finds all the normal files and then copies them to /some/new/location
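A hedged variant of the same idea that batches many files into each cp call (assuming GNU cp, which provides the -t target-directory option):
find . -type f -exec cp -t /some/new/location {} +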
You could use find to list all files inside the folder:
find ./source -type f
and then use the output as arguments for cp; it would look like this:
cp $(find ./source -type f) destination
There would be a problem if some files within the original directory tree have conflicting names. In that case cp would refuse to copy additional files with the same name, with an error like:
cp: will not overwrite just-created 'destination/t22' with './source/test/t2/t22'
To make copies of files with the same name, you can use the --backup option, like this:
cp --backup=numbered $(find ./source -type f) destination
If you want to see what is happening, use the -v (verbose) option:
cp -v --backup=numbered $(find ./source -type f) destination
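Note that the command substitution word-splits the file list, so paths containing spaces or newlines will break it. A hedged whitespace-safe equivalent (again assuming GNU cp for -t and --backup):
find ./source -type f -exec cp -v --backup=numbered -t destination {} +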

Copy files from the find function into a new folder without overwriting any files Terminal

I want to copy all files (which all have the same name) from all subfolders to a new folder:
$ cd /My/Folder
$ find /Users/My/Other/Folder -name "result.xml" -exec cp '{}' ./ \;
Because all the files have the same name, they keep overwriting each other. I would like it to create a new name for every file instead.
The new files should be called result.xml, result1.xml, result2.xml...
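This last question has no answer in the thread. One possible sketch, assuming bash, that numbers the copies explicitly (result.xml, result1.xml, result2.xml, ...):
cd /My/Folder
n=0
find /Users/My/Other/Folder -type f -name 'result.xml' -print0 |
while IFS= read -r -d '' f; do
  # The first copy keeps the plain name, later ones get a counter appended.
  if [ "$n" -eq 0 ]; then dest='result.xml'; else dest="result${n}.xml"; fi
  cp "$f" "./$dest"
  n=$((n + 1))
done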
