Bash: delete and copy multiple files with multiple extensions - bash

I kind of got stuck in a situation. I wanted to copy certain files from one directory to another (non recursive).
I have multiple files with extensions like .txt, .so, .so.mem, .so.lib, .lib and multiple directories in a directory called base. I'd like to copy all the files non-recursively (only from the base directory) to another directory called test.
I did the following:
Try 1
pushd $base
find -not -name '*.so' -not -name '*.so.*' -type f -print -exec rm -f {} +
cp -f $base/* $test
In the above try, find somehow deleted everything except the .so files, even though I wrote -not -name '*.so.*', i.e. the .so.mem and .so.lib files should not have been deleted.
Am I doing something wrong?

If I understood right, that's what you want:
mkdir test
# move all required files to the 'test' directory
mv base/*.txt base/*.so base/*.so.* base/*.lib test
# delete all remaining files from the 'base' directory
rm base/*
In the end, you'll have an empty base directory (assuming no hidden files) and a test directory full of *.txt, *.so and *.lib files.
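If you'd rather keep the find-based approach from the question, the usual fix is to restrict the search to the top level with `-maxdepth 1` and chain the `-not`/`!` tests correctly. A minimal sandboxed sketch (all file names here are invented for illustration; `-maxdepth` and `-delete` are GNU find extensions):

```shell
# throwaway copy of the layout described in the question
mkdir -p demo1/base/subdir demo1/test
touch demo1/base/a.txt demo1/base/b.so demo1/base/b.so.mem demo1/base/c.lib

# delete only top-level regular files that are neither *.so nor *.so.*
find demo1/base -maxdepth 1 -type f ! -name '*.so' ! -name '*.so.*' -delete

# copy the surviving top-level files (non-recursively) into the test directory
find demo1/base -maxdepth 1 -type f -exec cp -t demo1/test {} +
```

After this, b.so and b.so.mem survive in base (and land in test), while a.txt and c.lib are deleted; the subdirectory is untouched because of `-type f`.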

Related

How can I copy the contents of a directory located in multiple locations using the find command, preserving the directory structure?

I have a folder named accdb under multiple directories, all under one parent directory dist. I want to copy the contents of accdb for all directories while preserving the directory structure.
I succeeded in making the recursive folder structure with:
cd ~/dist; find . -name "accdb" -type d -exec mkdir -p -- ~/acc_trial/{} \;
But I am failing to copy the contents of accdb; this command only recreates the structure up to the accdb directories.
I tried
find . -name "accdb" -type d -exec mkdir -p -- ~/acc_trial/{} \ && cp -r {} ~/acc_trial/{} \;
I get an error:
find: missing argument to `-exec'
I don't know if this is possible using only a find expression; I'm pretty sure it is not. You must also consider that if you have an accdb subfolder nested inside another accdb folder, you'll probably get an error. That's why I decided to use rsync in the script I've made:
#!/bin/bash
DEST='/home/corronx/provisional/destination_dir'
# Clean destination directory. PLEASE BE CAREFUL: IT MUST BE A REMOVABLE DIRECTORY
rm -rf "$DEST"/*
FIND='test'
LOOK_PATH='/home/corronx/provisional'
cd "$LOOK_PATH"
FILES=($(find . -type d -name "$FIND"))
for ((i=0; i<${#FILES[@]}; i++))
do
    # Remove the first character '.' so each entry becomes a path suffix
    FILES[$i]=${FILES[$i]:1}
    # Create the matching directory under the destination path
    mkdir -p "$DEST${FILES[$i]}"
    # Trailing / makes rsync copy the folder's contents, not the folder itself
    rsync -aHz --delete "${FILES[$i]:1}/" "$DEST${FILES[$i]}"
    echo "$i"
done
Explanation
First of all, I'd recommend using full paths in your script, because an rm -rf expression inside a script is pretty dangerous. (If you prefer, comment that line out and delete the destination folder manually before running the script.)
DEST= destination path.
FIND= subfolder name you are looking for.
LOOK_PATH= path where you want to execute find.
I create an array called FILES containing all the folders that the find expression returns. After that, I create the destination directories and run rsync to copy the files. I've used rsync because I think it copes better in case there is any subdirectory with the same name.
PLEASE BE CAREFUL with the rm -rf expression: if DEST is not set, you'll delete everything on your machine.
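For completeness, the two-command `-exec` the question attempted can also be written by chaining two separate `-exec` clauses, each terminated by its own `\;`. A hedged sketch on a throwaway tree (directory names invented; note that substituting `{}` inside a larger argument like `{}/.` is a GNU find behavior, not guaranteed by POSIX):

```shell
# throwaway stand-in for the dist/accdb layout from the question
mkdir -p demo2/dist/proj1/accdb demo2/dist/proj2/accdb demo2/acc_trial
touch demo2/dist/proj1/accdb/one.txt demo2/dist/proj2/accdb/two.txt

(
  cd demo2/dist
  # first -exec creates the target directory, second copies the contents;
  # {}/. copies what is inside accdb rather than nesting accdb/accdb
  find . -name accdb -type d \
      -exec mkdir -p ../acc_trial/{} \; \
      -exec cp -r {}/. ../acc_trial/{} \;
)
```

The original error (`find: missing argument to '-exec'`) came from trying to join two commands with `&&` inside a single `-exec`; find has no shell between `-exec` and `\;`, so each command needs its own clause.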

Copy files recursively to all sub-domain folders

On my server, I have x number of sub-domains
Their folder names are like this:
example1.mydomain.com , example2.mydomain.com, examplex.mydomain.com
and they all exist in one location.
I am trying to write a simple bash script to copy all folders and files in folder SOURCE for example to all of those sub-domain folders (and replace existing).
In other words, I want to copy my files from source to any folder with the name *.mydomain.com
I tried rsync but couldn't do the *.mydomain.com part.
I suggest:
for i in *.mydomain.com; do rsync -aSv "SOURCE/" "$i"; done
The trailing / after SOURCE is important.
You can use find command to search all files and then use this output to copy them, e.g. assuming you are searching in /home and copying to /target
find /home -name "*.mydomain.com" -exec cp -r {} /target/ \;
One problem I see with the above solution is that it might also match regular files with these names and copy them nonetheless (and I'm not sure it will maintain the same folder hierarchy). If you are only looking for directories, try this instead:
find /home -name "*.mydomain.com" -type d -exec cp -r {} /target/ \;
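If `rsync` is not available, the same loop works with plain `cp`; the `/.` suffix (like rsync's trailing slash) copies the *contents* of SOURCE into each target rather than the directory itself. A sandboxed sketch with invented names:

```shell
# throwaway stand-ins for SOURCE and two sub-domain folders
mkdir -p demo3/SOURCE demo3/example1.mydomain.com demo3/example2.mydomain.com
touch demo3/SOURCE/index.html

(
  cd demo3
  for i in *.mydomain.com; do
    # copy the contents of SOURCE into each sub-domain dir, overwriting existing files
    cp -a SOURCE/. "$i"/
  done
)
```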

Copying Multiple Files within different Folders to another Folder in BASH

I have a lot of audio files located in different folders and I only want to copy the files that contains "LR" in their names to another folder.
To find all regular files that exist under sourcedir and that contain LR in their names and copy them to destdir:
find sourcedir -type f -name '*LR*' -exec cp -t destdir {} +
How it works
find sourcedir
This begins a find command and tells it to start in the directory sourcedir.
-type f
This tells find to look only for regular files.
-name '*LR*'
This tells find to limit its search to files whose names contain LR.
-exec cp -t destdir {} +
For any such files found, this runs the cp command to copy them to destdir.
For more information, see man find.
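Put together in a disposable directory (all names below are invented for the demo; `cp -t` is a GNU coreutils option), the command behaves like this:

```shell
# throwaway tree: two LR files at different depths, one non-matching file
mkdir -p demo4/sourcedir/sub demo4/destdir
touch demo4/sourcedir/song_LR.wav demo4/sourcedir/sub/mix_LR.wav demo4/sourcedir/other.wav

# copy every regular file whose name contains LR into destdir
find demo4/sourcedir -type f -name '*LR*' -exec cp -t demo4/destdir {} +
```

Afterwards destdir contains song_LR.wav and mix_LR.wav but not other.wav; note the copies are flattened into destdir, so files with identical names in different source folders would clobber each other.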

Get all content from a folder with subfolder copied to another directory with bash

I have a directory that has many folders in it, which have subfolders again, and then there are some files. Can I write a bash script that copies all the files from the given directory into one single folder, so that I do not have to navigate through every single folder and copy the contents to another folder?
In the topmost dir under which you want the files to be copied:
find . -type f -exec cp {} /some/new/location \;
Finds all the normal files and then copies them to /some/new/location
You could use find to list all files inside the folder:
find ./source -type f
and then use the output as argument for cp, it would look like this:
cp $(find ./source -type f) destination
There would be a problem if some files within the original directory tree have conflicting names. In that case cp would refuse to copy additional files with the same name, with an error like:
cp: will not overwrite just-created 'destination/t22' with './source/test/t2/t22'
To make copies of files with the same name, you can use the backup option, like this:
cp --backup=numbered $(find ./source -type f) destination
If you want to see what is happening, use the -v (verbose) option:
cp -v --backup=numbered $(find ./source -type f) destination
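A quick way to see the numbered backups in action, in a throwaway sandbox (paths invented; `--backup=numbered` is a GNU cp option):

```shell
# two files with the same name in different subfolders
mkdir -p demo5/source/a demo5/source/b demo5/destination
echo first  > demo5/source/a/t22
echo second > demo5/source/b/t22

# whichever t22 arrives second bumps the existing copy to t22.~1~
cp -v --backup=numbered $(find demo5/source -type f) demo5/destination
```

Note the unquoted `$(find …)` relies on word splitting, so this breaks on file names containing spaces; for real trees, the `find … -exec cp` forms shown earlier are safer.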

Problems using find and cp to copy just .jpg files from a LOT of directories to one new path

I tried the search, but couldn't find the answer to my specific problem.
When I use,
find /recovered_files "*.jpg" -type f -exec cp {} /out \;
to copy all .jpg files from directories within the /recovered_files directory, the /out directory gets filled with every single file (jpg, txt, xml etc etc) from within the source directories.
Can anyone please explain wherein my stupidity lies, pleeeeeease???
Many thanks, Mark.
The root problem is that there is no -name before "*.jpg", so find treats "*.jpg" as a second starting path rather than a filename filter; the expression that remains is just -type f, which matches every regular file under /recovered_files. You want find /recovered_files -name "*.jpg" -type f -exec cp {} /out \;. Separately, this is equivalent to calling cp /dir/dir/dir/file.jpg /out for each file, so all of the copies are flattened into the same directory.
rsync allows filters to select only certain files to be copied. Change from and to to the appropriate directories in the following:
rsync -r from/* to --include=*.jpg --filter='-! */' --prune-empty-dirs
Credit to this post for this solution.
Edit: changed to rsync solution. Original as follows:
find from -name "*.jpg" -type f -exec mkdir -p to/$(dirname {}) \; -exec cp --parents {} to \;
You should replace from and to with the appropriate locations, and this form won't quite work if from begins with /. Just cd to / first if you need to. Also, you'll end up with all the files inside to underneath the entire directory structure of from, but you can just move them back out again.
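On a throwaway tree, the structure-preserving variant looks like this (directory names are illustrative; `cp --parents` is GNU-specific and creates the missing intermediate directories itself, so the extra `mkdir` pass can usually be dropped):

```shell
# throwaway 'from' tree containing a jpg and a non-jpg file
mkdir -p demo6/from/photos/2019 demo6/to
touch demo6/from/photos/2019/pic.jpg demo6/from/photos/2019/note.txt

(
  cd demo6
  # copy only the jpg, recreating from/photos/2019 underneath 'to'
  find from -name '*.jpg' -type f -exec cp --parents {} to \;
)
```

Afterwards the jpg sits at to/from/photos/2019/pic.jpg while note.txt is not copied at all.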
