Preserve directory tree while copying files with cp - bash

I have about 1000 folders, and I want to extract a single file from each to upload to a server, but I need to preserve the directory tree.
cp */myFile.txt ../newTree
is basically what I want to do, but instead of each file being saved to ../newTree/myFile.txt I want it saved to ../newTree/*/myFile.txt, where the * is the wildcard from the cp command.
I couldn't find a flag for this in the man page, so I'm thinking I may need another utility besides cp.

With rsync:
find ./ -name myFile.txt -print0 | rsync -0adv --files-from=- ./ ../newTree/
Here -0 tells rsync that the file list is NUL-separated (matching find's -print0), --files-from=- reads that list from stdin (and implies --relative, which preserves the directory tree), and -a keeps permissions and timestamps.
Without rsync:
You can find all files, for each file you create the directory in the newTree, and copy the file to it.
for file in */myFile.txt; do
    dir=$(dirname "$file")
    mkdir -p "../newTree/$dir"
    cp "$file" "../newTree/$dir"
done

Store all the files in a tar archive, then extract it on the server; tar records the relative paths, so the directory tree is preserved.
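A minimal sketch of that approach, assuming the same */myFile.txt layout as in the question (the archive name is illustrative):
tar -cf myFiles.tar */myFile.txt
# then copy myFiles.tar to the server and, from the target directory, run:
tar -xf myFiles.tar
Since tar stores the relative paths, extraction recreates one directory per folder.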

Related

How to gunzip all the files with a given extension and send to another folder?

I have a folder with 28 gz files with the extension .gz and 28 files with the extension .gz.bam.
I would like to unzip all 28 .gz files and send them to another folder. I was doing it one by one as follows:
gunzip -c file1.gz > /mnt/s3/data_transfer/file1
How can I specify I want the .gz and not the .gz.bam?
This might be what you're looking for:
for f in *.gz; do gunzip -c "$f" > /mnt/s3/data_transfer/"${f%.gz}"; done
The expansion ${f%.gz} strips the trailing .gz, so file1.gz becomes /mnt/s3/data_transfer/file1. Since the glob *.gz only matches names that end in .gz, the .gz.bam files are left alone.
You can remove the .gz files with rm *.gz afterwards, if you want.
Or, alternatively
cp *.gz /mnt/s3/data_transfer/ && cd /mnt/s3/data_transfer && gunzip *.gz
Note that the latter command will gunzip all .gz files in the directory /mnt/s3/data_transfer, including any that were already there before the cp command ran. If you want to remove the original .gz files, replace cp with mv.
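If pre-existing .gz files in the destination are a concern, a per-file variant of the same idea (same destination path as in the question) touches only the files just copied:
for f in *.gz; do
    cp "$f" /mnt/s3/data_transfer/ && gunzip /mnt/s3/data_transfer/"$f"
done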
First, let's find the files:
find . -type f -name "*.gz" > foo.sh
We find the files whose names end in .gz and save the list into foo.sh. Let's open it:
vim foo.sh
Now, we prepend the gunzip command to each line. Press the : key to enter command-line mode at the bottom of the screen, then enter this command:
%s/^/gunzip -c /
This replaces the start of each line with the text we want to prepend.
Then we redirect each command's output into the destination path, pressing : again and entering the following (the \(.*\) group captures the filename, so each file is written under /some/path with its original name):
%s#gunzip -c .*/\(.*\)#& > /some/path/\1#
Of course, instead of /some/path you will need to use your own path; using # as the substitution delimiter avoids having to escape the slashes.
Finally, add
cd /some/path
rename 's/\.gz$//' *
at the end of the script to remove the .gz extensions from the files in /some/path (replace this with your actual path), then save foo.sh and run it with bash foo.sh.
Note that you may need to install rename, and here I assumed that the target path already exists.
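For illustration, after both substitutions each line of foo.sh would look something like this (the path and filename here are hypothetical):
gunzip -c ./data/file1.gz > /some/path/file1.gz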

How to write a bash script to copy files from one base to another base location

I have a bash script I'm trying to write
I have 2 base directories:
./tmp/serve/
./src/
I want to go through all the directories in ./tmp and copy the *.html files into the same folder path in ./src
i.e. if I have an HTML file in ./tmp/serve/app/components/help/help.html -->
copy it to ./src/app/components/help/, and recursively do this for all subdirectories in ./tmp/
NOTE: the folder structure should already exist, so the files just need to be copied over. If a folder doesn't exist, it would be nice if the script could create it (not what I want), but with Git I can track those folders and manually handle any loose HTML files.
I got as far as
echo $(find . -name "*.html")\n
But I'm not sure how to actually extract the file path with pwd and do what I need to; maybe it's not a one-liner and needs to be done with some variables.
something like
for i in `echo $(find /tmp/ -name "*.html")\n
do
cp -r $i /src/app/components/help/
done
Going so far as to create the directories would take some more time for me.
I'll try to do it on my own and see if I come up with something,
but for argument's sake, if you do run pwd and get a response, the pseudocode for that would be:
pwd
get response
if that directory does not exist in src create that directory
copy all the original directories contents into the new folder at /src/$newfolder
(possibly running two for loops, one to check the directory tree, and then one to go through each original directory, copying all the html files)
You can use process substitution to loop over the output from your find command, creating the destination directory(ies) and then copying the file(s):
#!/bin/bash

# Accept the first parameters to the script as the src_dir and dest values,
# or simply use the default values if no parameter(s) are passed.
src_dir=${1:-/tmp/serve}
dest=${2:-src}

while read -r orig_path ; do
    # To replace the first occurrence of a pattern with a given string,
    # use ${parameter/pattern/string}.
    dest_path="${orig_path/tmp\/serve/${dest}}"
    # Use dirname to remove the filename from the destination path
    # and create the destination directory.
    dest_dir=$(dirname "${dest_path}")
    mkdir -p "${dest_dir}"
    cp "${orig_path}" "${dest_path}"
done < <(find "${src_dir}" -name '*.html')
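For example, assuming the script is saved as copy_html.sh (the name is illustrative) and made executable, it can be run with the question's paths, or with no arguments to fall back on the defaults:
./copy_html.sh ./tmp/serve src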
This script copies .html files from the src directory to the des directory, creating subdirectories if they do not exist:
find the files, strip off the src directory name, and copy them into the destination directory.
#!/bin/bash
mkdir -p des
for i in $(find src/ -name "*.html"); do
    # Strip the leading src/ so --parents re-creates only the subpath under des/
    file=${i#src/}
    (cd src && cp -r --parents "$file" ../des)
done
Not sure if you must use bash constructs or not, but here is a GNU tar solution (if you use GNU tar), which IMHO is the best way to handle this situation because all the metadata for the files (permissions, etc.) are preserved:
find ./tmp/serve -name '*.html' -type f -print0 | tar --null -T - -c | tar -x -v -C ./src --strip-components=3
This finds all the .html files (-type f) in the ./tmp/serve directory and prints them NUL-terminated (-print0), then sends those filenames via stdin to tar as NUL-terminated literals (--null) for inclusion (-T -), creating (-c) an archive. The archive is piped to a second tar instance, which extracts it (-x), printing the contents along the way (optional: -v), after changing directory to the destination (-C ./src) and stripping the ./tmp/serve/ prefix from the file names (--strip-components=3). (You could also cd ./tmp/serve beforehand, use find . instead, change -C to ../../src, and reduce --strip-components to 1.)

Unix command CP to copy a file to multiple directories

I have folder structure like this:
/home/
    /folder1/
        /backup/
    /folder2/
        /backup/
    /folder3/
    /folder4/
        /backup/
    /folder5/
(As you can see, not all "folder" directories have a "backup" directory.)
I need to copy the script "checker.php" to all "backup" directories only.
"checker.php" is at:
/home/checker.php
I am using this command:
cp /home/checker.php /home/*/backup/checker.php
But it is not working. Please help.
The cp command doesn't allow multiple destination directories: the shell expands the wildcard into several paths, cp treats only the last argument as the target, and since that path is not an existing directory the command fails.
A way forward is to loop through the folders:
for d in /home/*/backup; do
    cp /home/checker.php "$d"
done

How can I copy the contents of a directory located in multiple locations using the find command, preserving the directory structure?

I have a folder named accdb under multiple directories, all under one parent directory dist. I want to copy the contents of every accdb directory while preserving the directory structure.
I succeeded in making the recursive folder structure with:
cd ~/dist; find . -name "accdb" -type d -exec mkdir -p -- ~/acc_trial/{} \;
But I am failing to copy the contents of accdb; this command only recreates the structure down to each accdb directory.
I tried
find . -name "accdb" -type d -exec mkdir -p -- ~/acc_trial/{} \ && cp -r {} ~/acc_trial/{} \;
I get an error:
find: missing argument to `-exec'
I don't know if this is possible using only a find expression; I'm pretty sure it is not. Besides, you must consider that if you have one accdb subfolder inside another accdb folder you'll probably get an error, which is why in the script I've made I decided to use rsync:
#!/bin/bash
DEST='/home/corronx/provisional/destination_dir'
# Clean the destination directory. PLEASE BE CAREFUL: IT MUST BE A REMOVABLE DIRECTORY
rm -rf "${DEST:?}"/*
FIND='test'
LOOK_PATH='/home/corronx/provisional'
# Run find from LOOK_PATH so the results are relative paths starting with ./
cd "$LOOK_PATH" || exit 1
FILES=($(find . -type d -name "$FIND"))
for ((i=0; i<${#FILES[@]}; i++))
do
    # Remove the first character (the .) so the path can be appended to $DEST
    FILES[$i]=${FILES[$i]:1}
    # Create the matching directory tree under the destination path
    mkdir -p "$DEST${FILES[$i]}"
    # Copy the contents; stripping the leading / makes the source path relative again
    rsync -aHz --delete "${FILES[$i]:1}/" "$DEST${FILES[$i]}"
    echo "$i"
done
Explanation
First of all, I'd recommend using full paths in your script, because an rm -rf expression inside a script is pretty dangerous (if you prefer, comment that line out and delete the destination folder manually before running the script).
DEST= destination path.
FIND= subfolder name that you are looking for.
LOOK_PATH= path where you want to execute find.
I create an array called FILES that contains all the folders returned by the find expression; after that I just create the destination directories and run rsync to copy the files. I've used rsync because I think it is better in case there is any subdirectory with the same name.
PLEASE BE CAREFUL WITH the rm -rf expression: the ${DEST:?} guard aborts the script if DEST is unset, instead of deleting everything on your machine.
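For what it's worth, the command quoted in the question fails because the shell splits it at &&, so find never receives the terminating \;. The two steps can be combined by letting -exec invoke a shell; a sketch using the question's paths:
find . -name accdb -type d -exec sh -c 'mkdir -p ~/acc_trial/"$1" && cp -r "$1"/. ~/acc_trial/"$1"' _ {} \;
Here _ fills in $0 for the inline script, and each directory that find locates is passed to it as $1.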

BASH: Copy all files and directories into another directory in the same parent directory

I'm trying to make a simple script that copies all of my $HOME into another folder in $HOME called Backup/. This includes all hidden files and folders, and excludes Backup/ itself. What I have right now for the copying part is the following:
shopt -s dotglob
for file in $HOME/*
do
    cp -r $file $HOME/Backup/
done
Bash tells me that it cannot copy Backup/ into itself. However, when I check the contents of $HOME/Backup/ I see that $HOME/Backup/Backup/ exists.
The copy of Backup/ inside itself is useless. How can I get bash to copy over all the folders except Backup/? I tried using extglob with cp -r $HOME/!(Backup)/ but it didn't copy over the hidden files that I need.
Try rsync; it can exclude files and directories.
This is a good reference:
http://www.maclife.com/article/columns/terminal_101_using_rsync_locally
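A minimal sketch of that approach (the leading / anchors the pattern at the source root, so only the top-level Backup directory is excluded and the destination is never copied into itself):
rsync -a --exclude='/Backup' "$HOME/" "$HOME/Backup/"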
Hugo,
A script like this is good, but you could try this:
cp -r * Backup/
cp -r .[!.]* Backup/
Note that * does not match hidden (dot) files, hence the second command; the pattern .[!.]* is used instead of .* because .* would also match . and .. themselves.
Another tool used with backups is tar, which also compresses your backup to save disk space.
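A minimal tar sketch of that idea, assuming the archive should land in Backup/ (the archive name is illustrative; excluding ./Backup keeps the archive from trying to include itself):
tar -czf "$HOME/Backup/home-backup.tar.gz" --exclude='./Backup' -C "$HOME" .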
I agree that using rsync would be a better solution, but there is an easy way to skip a directory in bash:
for file in "$HOME/"*
do
    # skip the destination itself
    [[ $file == "$HOME/Backup" ]] && continue
    cp -r "$file" "$HOME/Backup/"
done
(With shopt -s dotglob still in effect, as in your script, the hidden files are included too.)
This doesn't answer your question directly (the other answers already did that), but try cp -ua when you want to use cp to make a backup. This recurses into directories, copies symlinks rather than following them, preserves permissions, and only copies a file if it is newer than the copy at the destination.
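For example, substituting it into the loop above (same paths as before):
cp -ua "$file" "$HOME/Backup/"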
