We use this command in our .gitlab-ci.yml during the deploy stage. It uses lftp to mirror(ish).
But we recently noticed a big problem, which indicates we might not understand the syntax for the --exclude option of lftp.
It seems that --exclude .env is excluding every file whose path contains "env" preceded by any character, like this one for instance:
application/migrations/20220805084313_rajout_champ_cronjobs_id_environnement.php
- lftp -e "set ftp:ssl-allow no ; mirror -p -Rev ./ public_html/project/ --parallel=10 --exclude-glob .git* --exclude .env" -u $LOGIN,$PWD ftp://$SERVER
Is that normal behavior? If so, how do I exclude only files named ".env"? Or is it a bug?
Thanks.
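For what it's worth, a hedged explanation and fix: lftp's mirror documents --exclude as taking an (unanchored) extended regular expression, so in ".env" the dot matches any character and the pattern matches anywhere in the path, which is why a migration file containing "environnement" gets caught too. Two untested sketches of a narrower rule, either reusing the glob form you already use for .git* (which should be enough if your .env sits at the top level of the mirrored directory), or anchoring the regex (quoting may need adjusting inside the -e "..." string):
--exclude-glob .env
--exclude '(^|/)\.env$'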
I am trying to run a simple rsync command and exclude directories that have a flag file in them. The flag file flagged.txt is a text file that contains only *. The command below runs fine in a terminal:
rsync -avP --delete --filter="dir-merge,n- flagged.txt" \
"/mnt/c/Documents/somefolder" "/mnt/j/"
However, when I put it in a bash script, the flag file is ignored.
Tried
rsync -avP --delete --filter='dir-merge,n- flagged.txt' \
"/mnt/c/Documents/somefolder" "/mnt/j/"
and
filter=(--filter="dir-merge,n- flagged.txt")
rsync -avP --delete "${filter[@]}" "/mnt/c/Documents/somefolder" "/mnt/j/"
and
filter=(--filter='dir-merge,n- flagged.txt')
rsync -avP --delete "${filter[@]}" "/mnt/c/Documents/somefolder" "/mnt/j/"
All failed. Any idea? Thanks in advance!
I have an rsync script that backs up a user's home folder using this line of code:
/usr/bin/caffeinate -i /usr/bin/rsync -rlptDn --human-readable --progress --ignore-existing --update $PATH/$NAME/ --exclude=".*" --exclude="Public" --exclude="Library" /Volumes/Backup/Users/$NAME\ -\ $DATE
How do I ignore everything in ~/Library/ but their ~/Library/Mail/? I wanted to include this rsync flag, --include="/Library/Mail", but I'm not sure if I should depend too much on rsync exclusions & inclusions, as they can be unreliable and vary between different versions of OS X rsync.
Maybe a command-line regex tool would be more useful? Example:
ls -d1 ~/Library/* | grep -v '/Mail$' > "$ALIST"
exec < "$ALIST"
while read -r SRC
do
    ...
    $RSYNC ...etc...
done
rsync's --include and --exclude options obey what is called "precedence", so there is a firm rule you can rely on: whatever you explicitly include before you exclude is what will be sent.
In your case, add --include ~/Library/Mail before the first --exclude.
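Applied to the command in the question, that might look something like the sketch below (untested; note that rsync filter patterns are matched relative to the transfer root, so "/Library/..." is used rather than a ~ path, the parent "/Library/" is included too, and "***" pulls in Mail's contents; behaviour can differ between rsync versions, as you say):
/usr/bin/caffeinate -i /usr/bin/rsync -rlptDn --human-readable --progress --ignore-existing --update \
    --include="/Library/" --include="/Library/Mail/***" \
    --exclude=".*" --exclude="Public" --exclude="/Library/**" \
    $PATH/$NAME/ /Volumes/Backup/Users/$NAME\ -\ $DATE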
Assume there are some folders with this structure:
/bench1/1cpu/p_0/image/
/bench1/1cpu/p_0/fl_1/
/bench1/1cpu/p_0/fl_2/
/bench1/1cpu/p_0/fl_3/
/bench1/1cpu/p_0/fl_4/
/bench1/1cpu/p_1/image/
/bench1/1cpu/p_1/fl_1/
/bench1/1cpu/p_1/fl_2/
/bench1/1cpu/p_1/fl_3/
/bench1/1cpu/p_1/fl_4/
/bench1/2cpu/p_0/image/
/bench1/2cpu/p_0/fl_1/
/bench1/2cpu/p_0/fl_2/
/bench1/2cpu/p_0/fl_3/
/bench1/2cpu/p_0/fl_4/
/bench1/2cpu/p_1/image/
/bench1/2cpu/p_1/fl_1/
/bench1/2cpu/p_1/fl_2/
/bench1/2cpu/p_1/fl_3/
/bench1/2cpu/p_1/fl_4/
....
What I want to do is to scp the following folders
/bench1/1cpu/p_0/image/
/bench1/1cpu/p_1/image/
/bench1/2cpu/p_0/image/
/bench1/2cpu/p_1/image/
As you can see, I want to use scp recursively but exclude all folders named "fl_X". It seems that scp has no such option.
UPDATE
scp has no such feature. Instead I used the following command:
rsync -av --exclude 'fl_*' user@server:/my/dir
But it doesn't work. It only transfers the list of folders, something like ls -R.
Although scp supports recursive directory copying with the -r option, it does not support filtering of the files. There are several ways to accomplish your task, but I would probably rely on find, xargs, tar, and ssh instead of scp.
find . -type d -wholename '*bench*/image' \
| xargs tar cf - \
| ssh user@remote tar xf - -C /my/dir
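If some of the image paths might contain spaces, a null-delimited variant avoids the word-splitting that xargs does by default; this sketch assumes GNU find and GNU tar (for --null/--files-from):
find . -type d -path '*bench*/image' -print0 \
| tar cf - --null --files-from=- \
| ssh user@remote tar xf - -C /my/dir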
The rsync solution can be made to work, but you are missing some arguments (among other things, a destination: with only a source path, rsync just lists it, hence the ls -R-like output). rsync also needs the -r switch to recurse into subdirectories. Also, if you want the same security as scp, you need to do the transfer over ssh. Something like:
rsync -avr -e "ssh -l user" --exclude 'fl_*' ./bench* remote:/my/dir
You can specify GLOBIGNORE and use the pattern *
GLOBIGNORE='ignore1:ignore2' scp -r source/* remoteurl:remoteDir
You may wish to have general rules which you combine or override by using export GLOBIGNORE, but for ad-hoc usage simply the above will do. The : character is used as the delimiter for multiple values.
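A small sketch of the export form mentioned above; the directory and extension names are just placeholders. Setting the variable in the current shell (rather than as a one-shot prefix) makes sure it is already in effect when the shell expands source/*, and as far as I know the patterns are matched against the whole expanded path, hence the source/ prefix:
export GLOBIGNORE='source/node_modules:source/*.log'
scp -r source/* remoteurl:remoteDir
unset GLOBIGNORE   # restore normal globbing afterwards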
Assuming the simplest option (installing rsync on the remote host) isn't feasible, you can use sshfs to mount the remote locally, and rsync from the mount directory. That way you can use all the options rsync offers, for example --exclude.
Something like this should do:
sshfs user#server: sshfsdir
rsync --recursive --exclude=whatever sshfsdir/path/on/server /where/to/store
Note that the effectiveness of rsync (only transferring changes, not everything) doesn't apply here. This is because for that to work, rsync must read every file's contents to see what has changed. However, as rsync runs only on one host, the whole file must be transferred there (by sshfs). Excluded files should not be transferred, however.
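A slightly fuller sketch of the same idea, assuming sshfs/FUSE is installed; "sshfsdir" is just a placeholder mount point:
mkdir -p sshfsdir
sshfs user@server: sshfsdir
rsync --recursive --exclude=whatever sshfsdir/path/on/server /where/to/store
fusermount -u sshfsdir   # unmount when done (use umount on macOS/BSD)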
If you use a pem file to authenticate, you can use the following command (which will exclude files with a .something extension):
rsync -Lavz -e "ssh -i <full-path-to-pem> -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" --exclude "*.something" --progress <path inside local host> <user>@<host>:<path inside remote host>
The -L means follow links (copy files not links).
Use the full path to your pem file, not a relative one.
Using sshfs is not recommended since it is slow. Also, the combination of find and scp that was presented above is a bad idea, since it will open an ssh session per file, which is too expensive.
You can use extended globbing as in the example below:
#Enable extglob
shopt -s extglob
cp -rv !(./excludeme/*.jpg) /var/destination
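As far as I know, extglob negation is applied per path component, so a slash inside !(...) may not behave as expected; a more typical use is excluding one or more names at a single level (the names below are placeholders):
shopt -s extglob
# copy everything in the current directory except the "excludeme" entry
cp -rv !(excludeme) /var/destination
# or exclude several names at once
cp -rv !(excludeme|node_modules) /var/destination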
This one works fine for me, as the directory structure is not important to me.
scp -r USER@HOSTNAME:~/bench1/?cpu/p_?/image/ .
Assuming /bench1 is in the home directory of the current user. Also, change USER and HOSTNAME to the real values.
I am trying to write a simple bash script for my local (Mac OS X) machine to move files from a directory on my machine to a remote machine. This line is failing:
rsync --verbose --progress --stats --compress --rsh=ssh \
--recursive --times --perms --links --delete \
--exclude "*bak" --exclude "*~" \
/repository/* $DEV_SERVER:$REMOTE_DIR
$DEV_SERVER and $REMOTE_DIR are defined previously, and I echo them to verify they're accurate.
The error I'm getting is:
rsync: link_stat /Users/myusername/mycurrentdirectory failed: No such file or directory (2)
Of note here is that rather than using the defined directory (/repository, which is in the root of the machine), it uses my working directory. What is causing this?
Check that your \ characters have no whitespace after them at the end of the line. Trailing whitespace there stops Bash from interpreting the line wrap correctly, giving the rsync error above.
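A quick way to spot this is to search the script for a backslash followed by trailing blanks (the filename is a placeholder; works with GNU or BSD grep):
grep -nE '\\[[:blank:]]+$' myscript.sh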
Remove the '*' from the source location; rsync knows to look inside the directory if you specify a trailing '/',
like this:
rsync --verbose --progress --stats --compress --rsh=ssh --recursive --times --perms --links --delete --exclude "*bak" --exclude "*~" /repository/ $DEV_SERVER:$REMOTE_DIR
I was encountering the same error while doing some rsync work. I had the wrong character for specifying options, which I must have picked up by copying and pasting the command from elsewhere:
− (a Unicode dash/minus character)
rather than the correct character below:
- (a plain ASCII hyphen)
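If you suspect a pasted dash like this, one way to find any non-ASCII character hiding in the script is something like the following (assumes GNU grep built with PCRE support; the filename is a placeholder):
grep -nP '[^\x00-\x7F]' myscript.sh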
The source path (here: /repository/*) does contain a wildcard (*).
There is some info in the man pages for rsync:
"Note that the expansion of wildcards on the commandline (*.c) into a list of files is handled by the shell before it runs rsync and not by rsync itself (exactly the same as all other posix-style programs)."
I.e. if in bash you put double quotes around the source path including the wildcard symbol (*), for instance via a quoted variable in a script, the glob is not expanded by the shell; the same rsync command that succeeds on the interactive command line will then fail inside the script with exactly the error described here:
rsync: link_stat "<source path>*" failed: No such file or directory (2)
=> make sure not to put the wildcard(s) in quotes in an rsync source path.
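A minimal illustration of the difference, reusing the paths from the question:
rsync -av /repository/* $DEV_SERVER:$REMOTE_DIR     # shell expands the glob; rsync sees real paths
rsync -av "/repository/*" $DEV_SERVER:$REMOTE_DIR   # rsync gets a literal '*' and reports something like:
                                                    # rsync: link_stat "/repository/*" failed: No such file or directory (2)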
This:
rsync --verbose --progress --stats --compress --rsh=ssh \
--recursive --times --perms --links --delete \
--exclude "*bak" --exclude "*~" \
/repository/* $DEV_SERVER:$REMOTE_DIR
should be this:
rsync --verbose --progress --stats --compress --rsh=ssh --recursive --times --perms --links --delete --exclude "*bak" --exclude "*~" /repository/* $DEV_SERVER:$REMOTE_DIR
Bash may be interpreting the \ character differently than your interactive command line does, or perhaps there is a stray invisible character after it.
I am trying to copy a project to my server with rsync.
I have project-specific install scripts in a subdirectory
project/specs/install/project1
What I am trying to do is exclude everything in the project/specs directory except the project-specific install directory: project/specs/install/project1.
rsync -avz --delete --include=specs/install/project1 \
--exclude=specs/* /srv/http/projects/project/ \
user@server.com:~/projects/project
But like this, the content of the specs directory gets excluded while the install/project1 directory does not get included.
I have tried everything, but I just can't seem to get this to work.
Sometimes it's just a detail.
Just change your include pattern by adding a trailing / at the end, and it'll work:
rsync -avz --delete --include=specs/install/project1/ \
--exclude=specs/* /srv/http/projects/project/ \
user@server.com:~/projects/project
Or, alternatively, prepare a filter file like this:
$ cat << EOF >pattern.txt
> + specs/install/project1/
> - specs/*
> EOF
Then use the --filter option:
rsync -avz --delete --filter=". pattern.txt" \
/srv/http/projects/project/ \
user@server.com:~/projects/project
For further info go to the FILTER RULES section in the rsync(1) manual page.
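To preview what the filter file will actually match before letting --delete touch the destination, the same command can be run with rsync's dry-run flag (-n), for example:
rsync -avzn --delete --filter=". pattern.txt" /srv/http/projects/project/ user@server.com:~/projects/project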
The other solution is not working here.
Reliable way
You have no choice but to manually descend through each level of your sub-directory. There is no risk of including unwanted files, as including a directory does not automatically include the files inside it.
1) Create an include filter file, for instance "include_filter.txt":
+ /specs/
+ /specs/install/
+ /specs/install/project1/***
- /specs/**
2) Run it:
rsync -avz --delete --include-from=include_filter.txt \
/srv/http/projects/project/ \
user@server.com:~/projects/project
Don't forget the leading slash "/"; otherwise you may match sub-directories at any depth, as in "**/specs/install/project1/".
By choosing an include type filter (--include-from=FILE), the starting plus "+" signs are actually optional, as this is the default action with no sign. (You can have the opposite "-" by default with --exclude-from=FILE.)
The double stars "**" mean "any path".
The triple stars "***" mean "any path, including this very directory".
Easy way
You can start your filter file with a rule that includes every directory ("+ /**/" below), allowing rsync to descend into all your sub-levels. This is convenient but:
All directories will be included, albeit empty. This can be fixed with the rsync option -m (--prune-empty-dirs), but then all empty dirs will be skipped.
1) Create an include filter file, for instance "include_filter.txt":
+ /**/
+ /specs/install/project1/***
- /specs/**
2) Run it:
rsync -avzm --delete --include-from=include_filter.txt \
/srv/http/projects/project/ \
user@server.com:~/projects/project
Note the added option -m.
The order of --include and --exclude affects what is included or excluded.
When particular subdirectories need to be included, place their --include rules first.
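For the question above, that means putting the include rules before the exclude, along the lines of the earlier answers (an untested sketch):
rsync -avz --delete --include='specs/' --include='specs/install/' --include='specs/install/project1/***' --exclude='specs/**' /srv/http/projects/project/ user@server.com:~/projects/project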