Copying files from multiple directories in the terminal - macOS

I am new to using the command line for operations, so forgive me if this question is obvious. I would like to move all files of a certain type (.aiff) out of one directory that contains many sub-directories. The file structure looks like this:
directory
- subdir1
-- sound.aiff
-- other.txt
- subdir2
-- sound2.aiff
-- other2.txt
I've tried using something like cp -R /Users/me/directory/*.aiff /Users/me/newdirectory but I get a "no such file or directory" error. I don't know how to specify that only the .aiff files in the subdirectories should be copied.

Try this:
cp -R /Users/me/directory/*/*.aiff /Users/me/newdirectory
But probably the destination /Users/me/newdirectory is missing.
You could verify this by doing:
file /Users/me/newdirectory
If the directory doesn't exist, file will print an error like:
/Users/me/newdirectory: cannot open `/Users/me/newdirectory' (No such file or directory)
Create the directory with:
mkdir /Users/me/newdirectory
Next, try to copy the files again. If you want to move them instead of copying, use mv instead of cp.
Another way is to use the command find, for example:
find /Users/me/directory -type f -iname "*.aiff" -exec mv {} /Users/me/newdirectory \;
In this example, the command find searches the directory /Users/me/directory for regular files (-type f) whose names end in .aiff, case-insensitively (-iname "*.aiff"). For each file found, it executes the command mv {} /Users/me/newdirectory; the {} is a placeholder that find replaces with the path of each matched file.
Before moving you could test the command by just finding the desired types:
find . -iname "*.aiff"
This searches for matching files within the directory where the command is executed; notice the . instead of /Users/me/directory/
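Putting the advice together, here is a minimal end-to-end sketch. It builds a throwaway tree mirroring the question's layout (all paths below are invented for the demo), lists the candidates first, then moves them:

```shell
# Build a sandbox that mirrors the question's layout
base=$(mktemp -d)
mkdir -p "$base/directory/subdir1" "$base/directory/subdir2"
touch "$base/directory/subdir1/sound.aiff" "$base/directory/subdir1/other.txt"
touch "$base/directory/subdir2/sound2.aiff"

# 1. Dry run: list the files that would be affected
find "$base/directory" -type f -iname '*.aiff'

# 2. Make sure the destination exists
mkdir -p "$base/newdirectory"

# 3. Move the matches (use cp instead of mv to copy)
find "$base/directory" -type f -iname '*.aiff' -exec mv {} "$base/newdirectory" \;
```

Afterwards the .aiff files sit in newdirectory while other.txt stays behind in its subdirectory.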


How to write a bash script to copy files from one base to another base location

I have a bash script I'm trying to write
I have 2 base directories:
./tmp/serve/
./src/
I want to go through all the directories in ./tmp and copy the *.html files into the same folder path in ./src
i.e
if I have an html file in ./tmp/serve/app/components/help/help.html -->
copy it to ./src/app/components/help/ and recursively do this for all subdirectories in ./tmp/
NOTE: the folder structures should already exist, so I just need to copy the files. If a folder doesn't exist, hopefully the script could create it for me (not what I want), but with GIT I can track these folders and manually handle those loose html files.
I got as far as
echo $(find . -name "*.html")\n
But not sure how to actually extract the file path with pwd and do what I need to, maybe it's not a one liner and needs to be done with some vars.
something like
for i in `echo $(find /tmp/ -name "*.html")\n
do
cp -r $i /src/app/components/help/
done
going so far to create the directories would take some more time for me.
I'll try to do it on my own and see if I come up with something
but for argument sake if you do run pwd and get a response the pseudo code for that:
pwd
get response
if that directory does not exist in src create that directory
copy all the original directories contents into the new folder at /src/$newfolder
(possibly running two for loops, one to check the directory tree, and then one to go through each original directory, copying all the html files)
You can use process substitution to loop over the output of your find command, create the destination directory (or directories), and then copy the file(s):
#!/bin/bash
# accept the first two parameters to the script as src_dir and dest,
# or fall back to default values if no parameter(s) are passed
src_dir=${1:-/tmp/serve}
dest=${2:-src}
while read -r orig_path ; do
    # To replace the first occurrence of a pattern with a given string,
    # use ${parameter/pattern/string}
    dest_path="${orig_path/tmp\/serve/${dest}}"
    # Use dirname to remove the filename from the destination path
    # and create the destination directory.
    dest_dir=$(dirname "${dest_path}")
    mkdir -p "${dest_dir}"
    cp "${orig_path}" "${dest_path}"
done < <(find "${src_dir}" -name '*.html')
This script copies .html files from the src directory to the dest directory, creating subdirectories if they do not exist.
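For illustration, here is the same strip-the-prefix-and-copy idea run against a throwaway tree. It uses a pipe into the while loop and a POSIX prefix removal (${orig_path#...}) instead of the bash-only substitution, so it also runs under plain sh; all paths are demo values:

```shell
work=$(mktemp -d)
cd "$work"
mkdir -p tmp/serve/app/components/help
echo '<p>help</p>' > tmp/serve/app/components/help/help.html

src_dir=tmp/serve
dest=src
find "$src_dir" -name '*.html' | while read -r orig_path; do
    # Strip the source prefix, graft the remainder onto $dest,
    # mirror the directory, then copy the file
    dest_path="$dest${orig_path#"$src_dir"}"
    mkdir -p "$(dirname "$dest_path")"
    cp "$orig_path" "$dest_path"
done
```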
Find the files, then remove the src directory name and copy them into the destination directory.
#!/bin/bash
for i in $(find src/ -name "*.html")
do
    # Strip the leading src/ so the path is relative to the source root
    file=${i#src/}
    # --parents recreates the remaining directory path under des/
    (cd src && cp -r --parents "$file" ../des)
done
Not sure if you must use bash constructs or not, but here is a GNU tar solution (if you use GNU tar), which IMHO is the best way to handle this situation because all the metadata for the files (permissions, etc.) are preserved:
find ./tmp/serve -name '*.html' -type f -print0 | tar --null -T - -c | tar -x -v -C ./src --strip-components=3
This finds all the .html files (-type f) in the ./tmp/serve directory and prints them nul-terminated (-print0). These filenames are sent via stdin to tar as nul-terminated literals (--null) for inclusion (-T -), creating (-c) an archive that is piped to a second tar instance, which extracts (-x) it, optionally printing its contents along the way (-v), after changing directory to the destination (-C ./src) and stripping (--strip-components=3) the ./tmp/serve/ prefix from the files. (You could also cd ./tmp/serve beforehand, use find . instead, and change -C to ../../src.)
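The cd variant mentioned at the end can be sketched like this (self-contained, assuming GNU tar; directory and file names are invented for the demo). Because find now emits relative paths, no --strip-components is needed:

```shell
top=$(mktemp -d)
mkdir -p "$top/tmp/serve/app" "$top/src"
echo '<h1>hi</h1>' > "$top/tmp/serve/app/index.html"
echo 'notes'       > "$top/tmp/serve/app/notes.txt"

cd "$top/tmp/serve"
# Paths from find are relative (./app/...), so they extract directly under src
find . -name '*.html' -type f -print0 | tar --null -T - -c | tar -x -C ../../src
```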

How to copy recursively files with multiple specific extensions in bash

I want to copy all files with specific extensions recursively in bash.
Edit:
I've written the full script. I have a list of names in a csv file. I iterate through each name in that list, create a directory with that same name somewhere else, then search my source directory for the directory with that name; inside it there are a few files with extensions xlsx, tsv, html, and gz, and I'm trying to copy all of them into the newly created directory.
sample_list_filepath=/home/lists/papers
destination_path=/home/ds/samples
source_directories_path=/home/papers_final/new
cat $sample_list_filepath/sample_list.csv | while read line
do
echo $line
cd $source_directories_path/$line
cp -r *.{tsv,xlsx,html,gz} $source_directories_path/$line $destination_path
done
This works, but it copies all the files there, with no discrimination by extension.
What is the problem?
An easy way to solve your problem is to use find with a regex:
find src/ -regex '.*\.\(tsv\|xlsx\|gz\|html\)$' -exec cp {} dest/ \;
find looks recursively in the directory you specify (in this example src/); it lets you filter with -regex and apply a command to each matching result with -exec.
For the regex part:
.*\.
matches the file name up to and including the dot before the extension, and
\(tsv\|xlsx\|gz\|html\)$
checks the extension against the ones you want.
The -exec block is what you do with the files matched by the regex:
-exec cp {} dest/ \;
In this case, you copy each match ({} is replaced by the matched path) to the destination directory.
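A quick self-contained check of that find/-regex command (assuming GNU find; the temporary paths and file names are invented for the demo):

```shell
root=$(mktemp -d)
mkdir -p "$root/src/deep" "$root/dest"
touch "$root/src/a.tsv" "$root/src/deep/b.xlsx" "$root/src/deep/c.log"

# Only the whitelisted extensions are copied, recursively
find "$root/src" -regex '.*\.\(tsv\|xlsx\|gz\|html\)$' -exec cp {} "$root/dest" \;
```

After the run, dest contains a.tsv and b.xlsx but not c.log.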

How can I copy the contents of a directory located in multiple locations using the find command, preserving directory structure?

I have a folder named accdb under multiple directories, all under one parent directory dist. I want to copy the contents of every accdb folder while preserving the directory structure.
I succeeded in making the recursive folder structure with:
cd ~/dist; find . -name "accdb" -type d -exec mkdir -p -- ~/acc_trial/{} \;
But I am failing to copy the contents of accdb: this command only recreates the structure up to each accdb directory.
I tried
find . -name "accdb" -type d -exec mkdir -p -- ~/acc_trial/{} \ && cp -r {} ~/acc_trial/{} \;
I get an error:
find: missing argument to `-exec'
I don't know if this is possible using only a find expression; I'm pretty sure it is not. Besides, you must consider that if you have an accdb subfolder inside another accdb folder you'll probably get an error, which is why the script I've made uses rsync:
#!/bin/bash
DEST='/home/corronx/provisional/destination_dir'
#Clean destination directory, PLEASE BE CAREFUL, IT MUST BE A REMOVABLE DIRECTORY
rm -rf "$DEST"/*
FIND='test'
LOOK_PATH='/home/corronx/provisional'
cd "$LOOK_PATH"
FILES=($(find . -type d -name "$FIND"))
for ((i=0; i<${#FILES[@]}; i++))
do
    #Remove the leading . so the path can be appended to $DEST
    FILES[$i]=${FILES[$i]:1}
    #Create directories in destination path
    mkdir -p "$DEST${FILES[$i]}"
    rsync -aHz --delete ".${FILES[$i]}/" "$DEST${FILES[$i]}"
    echo $i
done
Explanation
First of all, I'd recommend using full paths in your script, because an rm -rf expression inside a script is pretty dangerous. (If you prefer, comment out that line and delete the destination folder manually before running the script.)
DEST= Destination path.
FIND= Subfolder name that you are looking for.
LOOK_PATH= Path where you want to execute find.
I create an array called FILES containing all the folders returned by the find expression. After that I create the destination directories and run rsync to copy the files; I've used rsync because I think it handles the case of a subdirectory with the same name better.
PLEASE BE CAREFUL with the rm -rf expression: if DEST is not set you'll delete everything on your machine.
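If GNU cp is available, a much shorter alternative sketch is find plus cp --parents, which recreates each matched path under the destination (the layout and names below are demo values, not from the question):

```shell
parent=$(mktemp -d)
mkdir -p "$parent/dist/proj1/accdb" "$parent/dist/proj2/sub/accdb" "$parent/acc_trial"
touch "$parent/dist/proj1/accdb/a.mdb" "$parent/dist/proj2/sub/accdb/b.mdb"

cd "$parent/dist"
# --parents keeps each ./proj.../accdb path when copying into the destination
find . -type d -name accdb -exec cp -r --parents {} "$parent/acc_trial" \;
```

Note that, like the find-only attempts, this still copies nested accdb folders twice if one accdb sits inside another.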

Get all content from a folder with subfolder copied to another directory with bash

I have a directory that has many folders in it, which in turn have subfolders, and then there are some files. Can I write a bash script that copies all the files from the given directory into one single folder, so that I do not have to navigate through every single folder and copy its contents to another folder?
In the topmost dir under which you want the files to be copied:
find . -type f -exec cp {} /some/new/location \;
This finds all the regular files and copies them to /some/new/location.
You could use find to list all files inside the folder:
find ./source -type f
and then use the output as arguments for cp; it would look like this:
cp $(find ./source -type f) destination
There would be a problem if there were files within the original directory tree with conflicting names. In that case cp would refuse to copy additional files with the same name, with an error like:
cp: will not overwrite just-created 'destination/t22' with './source/test/t2/t22'
To make copies of files with same name, you can use backup option, like this:
cp --backup=numbered $(find ./source -type f) destination
If you want to see what is happening, use the -v (verbose) option:
cp -v --backup=numbered $(find ./source -type f) destination
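A self-contained demonstration of the numbered-backup behaviour (GNU cp; note that the unquoted $(find …) still breaks on filenames containing spaces, a limitation shared with the commands above):

```shell
sandbox=$(mktemp -d)
cd "$sandbox"
mkdir -p source/a source/b destination
echo one > source/a/data.txt
echo two > source/b/data.txt

# The second data.txt would clash, so cp renames the earlier copy to data.txt.~1~
cp --backup=numbered $(find ./source -type f) destination
```

Both files survive in destination: one as data.txt and one as data.txt.~1~.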

Bash script to recursively copy files and folders when subdir is not present

I have lots of projects archived under a directory tree, some of which have a .git folder in them.
What I'd like to do is recursively copy those files and directories to a new destination, keeping the current structure - EXCEPT for those directories containing a .git folder, in which case the script should run a command (let's say "echo", I'll change it later) followed by the folder name, without creating or copying it.
Any help would be much appreciated.
Edit: I'll try to explain myself better: I need to copy every single file and directory, except for those containing .git, which should be skipped and their path should be passed to another command. In this example, path a/b/c/d and its subfolders should be skipped entirely and a/b/c/d should be displayed using echo (just for brevity, I'll replace it with a different command later):
a
a/b
a/b/c
a/b/c/d/.git
a/b/c/d/e
a/b/c/d/f/g
a/b/c/e
a/b/d
a/c
b
b/c
...
IIUC, the following find one-liner will do the job:
find . -mindepth 1 -maxdepth 1 -type d -exec sh -c "test -e '{}/.git' && echo not copy '{}' || cp -r -- '{}' /tmp/copy-here" \;
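A variant of the same one-liner passes the directory name to sh -c as a positional parameter instead of embedding {} inside the quoted script, which is safer with unusual directory names. The sandbox layout and destination below are invented for the demo:

```shell
area=$(mktemp -d)
mkdir -p "$area/projects/with-git/.git" "$area/projects/plain" "$area/copy-here"
touch "$area/projects/plain/file.txt"

cd "$area/projects"
find . -mindepth 1 -maxdepth 1 -type d -exec sh -c '
    if [ -e "$1/.git" ]; then
        echo "not copy $1"    # hand the path to another command here
    else
        cp -r -- "$1" "$2"
    fi' sh {} "$area/copy-here" \;
```

Like the original, this only inspects top-level directories (-maxdepth 1); nested projects would need a deeper traversal.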
