I want to copy all files (which all have the same name) from all subfolders to a new folder:
$cd /My/Folder
$find /Users/My/Other/Folder -name "result.xml" -exec cp '{}' ./ \;
Because all the files have the same name, each copy overwrites the previous one. I would like it to create a new name for every file instead.
The new files should be called result.xml, result1.xml, result2.xml...
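One way to get exactly that naming scheme is to number the copies yourself in a loop. A sketch (the temporary demo tree below is a stand-in for /Users/My/Other/Folder):

```shell
# demo setup: three subfolders each containing a result.xml
src=$(mktemp -d); dest=$(mktemp -d)
for d in a b c; do
  mkdir -p "$src/$d"
  printf '%s\n' "$d" > "$src/$d/result.xml"
done

# copy each result.xml, appending a counter to every name after the first
i=0
find "$src" -name 'result.xml' -type f | sort | while IFS= read -r f; do
  if [ "$i" -eq 0 ]; then
    cp "$f" "$dest/result.xml"
  else
    cp "$f" "$dest/result$i.xml"
  fi
  i=$((i + 1))
done
ls "$dest"
```

This relies on the file paths not containing newlines; GNU cp's --backup=numbered (shown in an answer further down) is an alternative if you can live with names like result.xml.~1~ instead.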
I want to unzip all zip files in a directory, but I don't know how the contents were zipped (with a directory? under what name?), so I want to place the unzipped contents into a directory named after the original zip file (in my case these are zip files students submitted to our Learning Management System).
I originally found the reverse of this, but it wasn't what I needed.
find . -name '*.zip' -print -exec unzip '{}' -d '{}'-unzipped \;
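Note that embedding {} inside a larger argument like '{}'-unzipped only works with GNU find; POSIX find substitutes {} only when it is an argument all by itself. A portable shell loop that builds the same per-zip directory name:

```shell
# for each zip in the current directory, extract into <name>-unzipped
for z in ./*.zip; do
  [ -e "$z" ] || continue   # skip the literal pattern when no zips match
  unzip "$z" -d "${z%.zip}-unzipped"
done
```

The ${z%.zip} parameter expansion strips the .zip suffix, so hw1.zip is extracted into hw1-unzipped.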
Can you make a bash script for moving all the files ending in *.512.png to a new folder like res512 (which will be a new branch), keeping all the subfolders?
I tried for a really long time with this repo but I can't figure it out.
You're not very specific about what you're asking.
If you want to move all files that have the suffix .512.png from within your current directory to a new directory, you can use the following
mkdir res512
cp -r *.512.png res512/
If you want to move all files that have the suffix .512.png from within your directory and all child directories into a new directory, you can use
mkdir res512
for f in $(find . -type f -name "*.512.png")
do
    cp "$f" res512/
done
If you want to move all files that have the suffix .512.png including their directory structure into a new directory, you can use
find . -name '*.512.png' -exec cp --parents \{\} res512/ \;
Replace cp with mv if you want to move the files instead of copy them.
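The --parents flag is GNU-specific. As a sketch of a more portable way to keep the directory structure, you can pipe a NUL-separated file list through tar (works with GNU tar and bsdtar; the demo tree here is a stand-in for your repo):

```shell
# demo tree with matching and non-matching files
work=$(mktemp -d); cd "$work"
mkdir -p icons/small
touch icons/app.512.png icons/small/app.512.png notes.txt

# archive only the *.512.png files and unpack them under res512,
# preserving their relative paths
mkdir -p res512
find . -name '*.512.png' -print0 | tar -c --null -T - -f - | tar -x -C res512 -f -
```

Afterwards res512 mirrors the original layout (res512/icons/app.512.png and so on) while unrelated files such as notes.txt are left out.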
I have a directory that has many folders in it, which have subfolders in turn, and then there are some files. Can I write a bash script that copies all the files from the given directory into one single folder? So that I do not have to navigate through every single folder and copy the contents to another folder.
In the topmost dir under which you want the files to be copied:
find . -type f -exec cp {} /some/new/location \;
This finds all the regular files and copies them to /some/new/location.
You could use find to list all files inside the folder:
find ./source -type f
and then use the output as arguments for cp; it would look like this:
cp $(find ./source -type f) destination
There would be a problem if there are files within the original directory tree with conflicting names. In that case cp refuses to copy additional files with the same name, with an error like:
cp: will not overwrite just-created 'destination/t22' with './source/test/t2/t22'
To keep copies of files with the same name, you can use the backup option, like this:
cp --backup=numbered $(find ./source -type f) destination
If you want to see what is happening, use the -v (verbose) option:
cp -v --backup=numbered $(find ./source -type f) destination
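As a concrete demo of the numbered backups (GNU cp; the temporary directories are stand-ins for ./source and destination in the commands above):

```shell
# two files that share the name t22 at different depths
src=$(mktemp -d); dest=$(mktemp -d)
mkdir -p "$src/test/t2"
echo first  > "$src/t22"
echo second > "$src/test/t2/t22"

# the second t22 would normally be refused; --backup=numbered keeps it
cp -v --backup=numbered $(find "$src" -type f) "$dest"
ls "$dest"
```

The colliding copy is preserved under a numbered backup name, t22.~1~, alongside t22.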
I tried the search, but couldn't find the answer to my specific problem.
When I use,
find /recovered_files "*.jpg" -type f -exec cp {} /out \;
to copy all .jpg files from directories within the /recovered_files directory, the /out directory gets filled with every single file (jpg, txt, xml etc etc) from within the source directories.
Can anyone please explain wherein my stupidity lies, pleeeeeease???
Many thanks, Mark.
The immediate problem is that "*.jpg" is not preceded by -name, so find treats it as another starting path and the only remaining test, -type f, matches every file. You want:
find /recovered_files -name "*.jpg" -type f -exec cp {} /out \;
Beyond that, this is equivalent to calling cp /dir/dir/dir/file.jpg /out for each file, so all of the files are flattened into the same directory.
rsync allows filters to select only certain files to be copied. Change from and to to the appropriate directories in the following:
rsync -r from/* to --include='*.jpg' --filter='-! */' --prune-empty-dirs
Credit to this post for this solution.
Edit: changed to rsync solution. Original as follows:
find from -name "*.jpg" -type f -exec cp --parents {} to \;
(GNU cp --parents creates the intermediate directories itself; adding a separate -exec mkdir -p to/$(dirname {}) step does not work, because the shell expands the $(dirname {}) substitution before find ever runs.)
You should replace from and to with the appropriate locations, and this form won't quite work if from begins with /. Just cd to / first if you need to. Also, you'll end up with all the files inside to underneath the entire directory structure of from, but you can just move them back out again.
I have a folder named Octave in my Documents folder, and in this Octave folder there are numerous other subfolders which contain sed scripts to delete certain files contained in those subfolders. Is there a top-level command I can run in the terminal that will run all these "delete" sed files in the subfolders at once, or sequentially? I wish to do this to clean up the Octave folder prior to backing it up.
something like this should work (not tested):
for file in $(find "$(pwd)" -name "*.sed"); do
    cd "$(dirname "$file")"
    "$file"
done
What about find's -exec option? Something like:
find /home/babelproofreader/Documents/Octave -type f -name "*.sed" -exec {} \;
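If the scripts use relative paths to delete files, -execdir is safer than -exec: it runs each script from the directory containing it, which is what the cd in the loop above was trying to achieve. A sketch (GNU/BSD find; the demo tree and clean.sed script are stand-ins for the real Octave folder, and the scripts must be executable with a shebang line, as the -exec answer already assumes):

```shell
# demo: a subfolder with an executable "delete" script and a file it removes
demo=$(mktemp -d)
mkdir -p "$demo/sub"
cat > "$demo/sub/clean.sed" <<'EOF'
#!/bin/sh
rm -f junk.txt
EOF
chmod +x "$demo/sub/clean.sed"
touch "$demo/sub/junk.txt"

# run every *.sed script from its own directory, so relative paths
# like junk.txt resolve next to the script
find "$demo" -type f -name '*.sed' -execdir {} \;
```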