Source path: /var/www/html/20170101/*.jpg
Destination path: /Backup/html/20170101/
There are many source directories named by date, such as 20170101 through 20171231, and each directory contains many images. How do I write an rsync shell script for this? Right now I'm using:
ex:
rsync -avzh --progress /var/www/html/20170101/ /Backup/html/20170101/
I want a script that makes the date part a variable, so that when one day's directory finishes, it moves on to the next day's directory and continues the rsync.
Is this what you want:
Put all the directory names in a text file, one per line:
cat dirnamefile
20170101
20171231
Now write a shell script to read the file and substitute the directory each time:
i=0
while read line
do
    arr[$i]=$line
    i=$((i + 1))
done < dirnamefile
for var in "${arr[@]}"
do
    rsync -avzh --progress /var/www/html/"$var"/ /Backup/html/"$var"/
done
The above script reads in the list of directory names and then loops through them, running rsync once per directory.
Is this what you are looking for?
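If maintaining a list file is a chore, the same loop can run straight off the directory names on disk. Below is a self-contained sketch of that idea: the mktemp directories are scratch stand-ins for /var/www/html and /Backup/html, and the echo keeps it a dry run; drop the echo to perform the real transfer.

```shell
#!/bin/bash
# Sketch: loop over every dated directory found on disk instead of a
# hand-maintained list file. The mktemp dirs are scratch stand-ins for
# /var/www/html and /Backup/html; echo keeps this a dry run.
set -eu
SRC=$(mktemp -d)
DST=$(mktemp -d)
mkdir -p "$SRC/20170101" "$SRC/20170102"   # sample dated directories

count=0
for path in "$SRC"/*/; do
    dir=$(basename "$path")
    echo rsync -avzh --progress "$SRC/$dir/" "$DST/$dir/"
    count=$((count + 1))
done
```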
Related
I have hundreds of datalogger files in a directory, and I want to write a bash script that takes files sharing the same date in the filename (an example file name is "2016-06-15T170000_SMARTFLUX.data", where 2016-06-15 is the date) and stores them in a folder named after that date. I am using a Terminal window on a Mac, which I believe is Unix-like (I apologize for my ignorance in computer terminology).
So far I have:
#Type of file (extension) to process:
FILES=*.data
#get date string from file name to use for newly created file
DATE=`ls $FILES|head -n 1|cut -c 1-10`
Any help would be greatly appreciated. I have only modified a bash script that combines these types of files into a text, and I have not created any folders or moved files.
Assuming your script is in the same dir as the data files:
#!/bin/bash
for filename in *.data; do
    target_dir=${filename:0:10}    # first 10 characters: YYYY-MM-DD
    if [[ ! -d $target_dir ]]; then
        mkdir "$target_dir"
    fi
    mv "$filename" "$target_dir"
done
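Since mkdir -p is a no-op when the directory already exists, the if-check can be dropped. Here is a scratch-directory demo of the same move-by-date idea, using two made-up datalogger filenames:

```shell
#!/bin/bash
# Demo in a scratch directory: two fake datalogger files get sorted into
# folders named after the date prefix in their filenames.
set -eu
work=$(mktemp -d)
cd "$work"
touch 2016-06-15T170000_SMARTFLUX.data 2016-06-16T170000_SMARTFLUX.data

for filename in *.data; do
    target_dir=${filename:0:10}   # first 10 characters: YYYY-MM-DD
    mkdir -p "$target_dir"        # -p: no error if it already exists
    mv "$filename" "$target_dir"/
done
```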
Hi everyone. I have some files to be copied to a database server, but every time I have to type something like "cp *.csv /tmp" by hand, so please suggest a shell script command that copies all the files at once. Thank you
#!/bin/sh
file=/pathtoyour/textfile.txt
while read line
do
    cp "$line" /tmp/
done < "$file"
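If the goal is simply every .csv file in one directory, no list file is needed at all; a glob does it in one command. A sketch with scratch stand-ins for the real source directory and /tmp:

```shell
#!/bin/bash
# Sketch: copy all .csv files at once with a glob. The mktemp dirs stand
# in for the real source directory and /tmp.
set -eu
src=$(mktemp -d)
dst=$(mktemp -d)
touch "$src/a.csv" "$src/b.csv"

cp "$src"/*.csv "$dst"/
```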
src_dir="/export/home/destination"
list_file="client_list_file.txt"
file=".csv"
echo "src directory="$src_dir
echo "list_file="$list_file
echo "file="$file
cd /export/home/destination
touch $list_file
x=`ls *$file | sort >$list_file`
if [ -s $list_file ]
then
echo "List File is available, archiving now"
y=`tar -cvf mystuff.tar $list_file`
else
echo "List File is not available"
fi
The above script works fine: it creates a list file of all the .csv files and tars it.
However, I want to run the script from a different directory. It should go to the destination directory, build a list file of all the .csv files there, and make a .tar from the list file (i.e. archive the list file).
I am not sure what to change.
There are a lot of tricks in filename handling. One thing you should know is that filename handling under POSIX is messy: commands like ls or find may not return the expected result (though 99% of the time they will). Here is how to build the list of files reliably:
for file in "$src_dir"/*.csv; do
    basename "$file" >> "$src_dir/$list_file"
done
tar cvf "$src_dir/mystuff.tar" "$src_dir/$list_file"
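To run the whole thing from any working directory, tar's -C flag is handy: it changes into the given directory before reading the file names, so no cd is needed. A self-contained sketch, where the scratch directory stands in for /export/home/destination:

```shell
#!/bin/bash
# Sketch: build the list with a glob loop, then archive it from anywhere
# using tar -C. The mktemp dir stands in for /export/home/destination.
set -eu
src_dir=$(mktemp -d)
list_file="client_list_file.txt"
touch "$src_dir/one.csv" "$src_dir/two.csv"

# glob loop instead of parsing ls output
for f in "$src_dir"/*.csv; do
    basename "$f"
done > "$src_dir/$list_file"

tar -cf "$src_dir/mystuff.tar" -C "$src_dir" "$list_file"
```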
You may also want to work through a bash tutorial in a serious manner, and try searching before asking a question on SO next time:
http://www.gnu.org/software/bash/manual/html_node/index.html#SEC_Contents
http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO.html
I've got a collection of hundreds of directories ordered alphabetically, each with differently named files inside. I want to copy these directories over to another location using rsync.
I don't want to go over all the directories manually; instead I want to use rsync's --include option or a bash loop to iterate over them.
So far I've tried the bash script below, but with no success yet.
for dir in {A..Z}; do
    echo "$dir"
    rsync --progress --include="$dir*" --exclude='*' -rt -e ssh username@192.168.1.123:/source/directory/ ~/target/directory/
done
Does anyone know what would be the correct way to go over the directories using rsync's --include option?
Update:
The bash script above was more to try out the loop to go over my directories and see what comes out. The command I actually wanted to use was this one:
for dir in /*; do
    rsync --progress --include="$dir*" --exclude='*' --bwlimit=2000 -rt -e ssh username@192.168.1.123:/source/directory/ ~/target/directory/
done
I know bash can do something like {A..Z}, but this doesn't seem to get me the result I want. I've already copied half of the alphabet of directories, so I was trying {F..Z} as a range.
Update
I've come up with the following script to run from my source directories location.
#!/bin/bash
time=$(date +"%Y-%m-%d %H:%M:%S")  # for time indication
dir=/source/directory/[P-Z]*       # glob matching directories whose names start with "P" through "Z"
printf "[%s] Transferring: %s\n" "$time" "$dir"
rsync -trP -e 'ssh -p 123' --bwlimit=2000 $dir username@192.168.1.123:/target/directory
This will transfer all directories from the source directory with names starting with character "P" to "Z" over ssh using port 123.
This works for me in a shell script. I'm sure there are better ways to do this in a single line command, but this one I just came up with to help myself out.
Sounds like you want recursive rsync. I'd go with:
rsync -r / --restOfYourRsyncArgs
That walks over every file/folder/subfolder in / (could be A LOT, consider excludes and/or a different target path) and uploads/downloads. Set excludes for files and folders you don't want sent.
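Returning to the original per-letter idea: rsync filter rules need to include both the top-level directory itself and its contents before the catch-all exclude, otherwise the exclude wins. A dry-run sketch that just prints one candidate command per letter (the host, paths, and letter set are placeholders):

```shell
#!/bin/bash
# Sketch: one rsync invocation per leading letter. The two --include
# rules admit a top-level directory starting with that letter plus
# everything under it; --exclude=* drops the rest. echo keeps this a
# dry run, and the host/paths are placeholders.
for letter in F G H; do
    cmd="rsync -rt --progress --include=/${letter}*/ --include=/${letter}*/** --exclude=* user@192.168.1.123:/source/directory/ ~/target/directory/"
    echo "$cmd"
done
```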
I am relatively new to bash scripting.
I need to create a script that will loop through a series of directories, go into subdirectories with a certain name, and then move their file contents into a common folder for all of the files.
My code so far is this:
#!/bin/bash
#used to gather usable pdb files
mkdir -p usable_pdbFiles
#loop through directories in "pdb" folder
for pdbDirectory in */
do
    #go into usable_* directory
    for innerDirectory in usable_*/
    do
        if [ -d "$innerDirectory" ] ; then
            for file in *.ent
            do
                mv $file ../../usable_pdbFiles
            done < $file
        fi
    done < $innerDirectory
done
exit 0
Currently I get
usable_Gather.sh: line 7: $innerDirectory: ambiguous redirect
when I try and run the script.
Any help would be appreciated!
The redirections < $innerDirectory and < $file are invalid and this is causing the problem. You don't need to use a loop for this, you can instead rely on the shell's filename expansion and use mv directly:
mkdir -p usable_pdbFiles
mv */usable_*/*.ent usable_pdbFiles
Bear in mind that this solution, and the loop based one that you are working on, will overwrite files with the same name in the destination directory.
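If the tree ever nests deeper than */usable_*/, find can collect the .ent files at any depth. A scratch-tree demo of the same gather operation (all names below are made up):

```shell
#!/bin/bash
# Demo in a scratch tree: gather .ent files from usable_* subdirectories
# into one folder with find, which also works for deeper nesting. All
# directory and file names here are made up.
set -eu
work=$(mktemp -d)
cd "$work"
mkdir -p pdb1/usable_a pdb2/usable_b usable_pdbFiles
touch pdb1/usable_a/x.ent pdb2/usable_b/y.ent

# the leading ./*/ keeps files already in ./usable_pdbFiles out of the match
find . -path './*/usable_*/*.ent' -exec mv {} usable_pdbFiles/ \;
```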