I have lots of files under dir1/ on the server.
I want to copy all the .png files under dir1/ (i.e. dir1/*.png) to Google Drive.
However,
rclone copy dir1/*.png gdrive:dir2/
gives an error:
Usage:
rclone copy source:path dest:path [flags]
Flags:
--create-empty-src-dirs Create empty source dirs on destination after copy
-h, --help help for copy
Use "rclone [command] --help" for more information about a command.
Use "rclone help flags" for to see the global flags.
Use "rclone help backends" for a list of supported services.
Command copy needs 2 arguments maximum: you provided 66 non flag arguments:
Is there a way to solve this issue?
"perfect" with --include :-)
rclone copy dir1/ gdrive:dir2/ --include "*.png"
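As a side note, if you only want the .png files directly under dir1/ and not those in subdirectories, rclone's filter rules (as I understand them) let you anchor the pattern to the root of the transfer with a leading slash:
rclone copy dir1/ gdrive:dir2/ --include "/*.png"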
OK, I have found a way to resolve this issue from the other direction.
I used the --exclude flag.
This way, I exclude "directory_I_do_not_want_to_copy_under_dir1" and all of its contents.
Therefore I copy the rest of dir1/, including the *.png files.
Not perfect, but an approximate solution :)
rclone copy dir1/ gdrive:dir2/ --exclude="/directory_I_do_not_want_to_copy_under_dir1/**"
I'm not allowed to use rsync on the cluster I'm working on, so I need to use cp. I want to copy a large directory, including all files and subfolders, but without any folders named "outdir".
I tried cp -r -v ./!(outdir) ../target-directory/
but it still copies all folders named outdir (and their contents) in deeper directories; it only excluded the outdir folders in the top-level directory.
I also tried cp -r ./*/!(outdir) ../target-directory/ but that one copied all files into the target folder without keeping any hierarchy or subfolders.
I also tried various find commands, but they didn't work; maybe I was just doing something stupid. I'm a beginner with bash, so if you could explain your answer and what the flags do, that would be really helpful. I've been trying forever now on something I think shouldn't be that hard to do.
Instead of cp, you can use tar with the --exclude option to control what does or does not get copied.
The full command is:
tar --exclude="outdir" -cvpf - . | (cd TARGET_DIRECTORY; tar -xpf -)
So any path that contains the "outdir" pattern will be excluded.
Without the --exclude option, it will copy the entire structure of your current directory under TARGET_DIRECTORY.
You can replace the . in the first tar by your desired source directory.
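If tar is not an option either, a find-based approach along the lines the question already attempted should also work. This is only a sketch and assumes GNU cp (for --parents): -prune stops find from descending into any directory named outdir, and cp --parents recreates each file's directory hierarchy under the target:
find . -type d -name outdir -prune -o -type f -exec cp --parents {} ../target-directory/ \;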
Good morning,
I want to transfer specific files in a directory to a remote machine while keeping the structure of the subdirectories. Moreover, I only want to transfer files that have a particular extension (e.g. ".txt").
I have tried the following:
rsync -aP --include "*.txt" ./sourceDirectory user@hostIP:destDirectory
but it copies all the files to the destination (destDirectory), not only those matching the .txt pattern.
Could anybody help me with this riddle?
P.S.: obviously, the directory sourceDirectory contains subdirectories where my .txt files are located.
I used to have the same problem when rsyncing videos to my home NAS. If you read the rsync manual, the --include flag really means "do not exclude". It has to work together with an --exclude flag.
The following command will do the job:
rsync -aP --include="*/" --include="*.txt" --exclude="*" sourceDirectory destDirectory
The command literally means: exclude everything, except for subfolders and .txt files.
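One refinement that might be worth adding: with this filter set, rsync still creates subdirectories even when they end up containing no .txt files. If you want those dropped as well, the -m (--prune-empty-dirs) option should take care of it:
rsync -aPm --include="*/" --include="*.txt" --exclude="*" sourceDirectory destDirectory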
I'm looking for a way to check if the files were downloaded correctly by using rclone. Maybe the option -i is able to do that?
rclone sync -i SOURCE remote:DESTINATION
Option --checksum is able to do that.
https://rclone.org/docs/#c-checksum
When rclone has finished copying a file, it compares checksums. Add option -vv to see hashes.
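Applied to the command above, that would look something like this; as far as I know, rclone check can also be run afterwards to compare source and destination:
rclone sync --checksum -vv SOURCE remote:DESTINATION
rclone check SOURCE remote:DESTINATION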
I currently have rsync working well. It copies all my files from one directory to another directory. The only thing is that it physically copies the files.
I have a lot of large files and I don't want duplicates of all of them. I just want to create symbolic links in the new directory so that I can serve the data on a webpage. The source directory has some scripts and files I don't want the public to see. I'm moving the safe data to the web root (destination).
What I would like rsync to do is create links in the destination for any new files in the source directory. That way I am not using up my hard drive space like I currently am. What I have works perfectly, except for the symbolic-link aspect. Is there a way to have rsync track new files and create symbolic links to them?
rsync -aP --exclude="file.sql" --exclude="*~" --exclude=".*" --exclude="*.sh" . ${destination}
It's not a symlink, but you might be able to work with --link-dest=DIR. It creates a hard link, which is simply a new name for the same file (a sketch follows the list below). This will behave similarly to a soft link as long as:
Both files are on the same filesystem
You don't plan to delete the original and not the copy (the symlink would break but a hard-link won't)
You don't have anything explicitly checking to see if it's a softlink
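A sketch of how that might look with the command from the question, pointing --link-dest back at the source directory itself so that identical files are hard-linked into the destination instead of copied. Note that a relative --link-dest path is interpreted relative to the destination, so an absolute path such as $PWD is safest, and the same-filesystem caveat above still applies:
rsync -aP --exclude="file.sql" --exclude="*~" --exclude=".*" --exclude="*.sh" --link-dest="$PWD" . ${destination}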
You could use cp -aR -s (Linux or FreeBSD) or cp --archive --recursive --symbolic-link (Linux) to create symbolic links to the source files in the destination directory instead of copies. Note that -s is non-standard.
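For example, assuming GNU cp and that ${destination} already exists, something like this should build a matching tree of symlinks (GNU cp generally wants an absolute source path with -s, and unlike the rsync command above there is no way to exclude files here):
cp -as "$PWD/." "${destination}/"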
Could lndir be useful to you? According to its manual, it creates a shadow directory of symbolic links to another directory tree.
I think master_delivery is probably the best tool for this. With rsync's already-mentioned --link-dest option, files which are not identical will still be copied. If you don't mind copies and hard links being mixed, you can use rsync, but if you want to eliminate duplicates completely, use master_delivery.
Usage is:
gem install master_delivery
master_delivery -m <path_to_master> -d <path_to_delivery_root>
Is anybody able to point me in the right direction for writing a batch script for a UNIX shell that moves files into a zip one at a time and then deletes the originals?
I can't use the standard zip approach because I don't have enough space to hold the zip being created alongside the originals.
So, any suggestions, please?
Try this:
zip -r -m source.zip *
Not a great solution, but simple: I ended up finding a Python script that recursively zips a folder, and just added a line to delete each file after it is added to the zip.
You can achieve this using find as follows:
find . -type f -print0 | xargs -0 -n1 zip -m archive
This will move every file into the zip while preserving the directory structure. You are then left with empty directories that you can easily remove (see the command below). Moreover, using find gives you a lot of freedom over which files you want to compress.
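For the leftover empty directories, assuming GNU find (for -empty and -delete), a single pass should clear them out:
find . -type d -empty -delete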
I use:
zip --move destination.zip src_file1 src_file2
Here is the detail of the --move option from the man page:
--move
Move the specified files into the zip archive; actually, this
deletes the target directories/files after making the specified zip
archive. If a directory becomes empty after removal of the files, the
directory is also removed. No deletions are done until zip has
created the archive without error. This is useful for conserving disk
space, but is potentially dangerous so it is recommended to use it in
combination with -T to test the archive before removing all input
files.
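Following that recommendation, a version that tests the archive before anything is deleted could look like:
zip --move -T destination.zip src_file1 src_file2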