I have a problem transferring files from one server to another. I tried using PuTTY's PSCP; it worked the first time from local to server. What I'm trying to do is zip all the files and then transfer them to another server. What commands should I use to achieve this?
pscp -P 22 test.zip root@domain.com:/root
This line works when transferring from local to a remote server. However, I want to copy files from one server directly to another remote server, or at least remote to local and then local to remote, whichever method is possible. Compressing the files first isn't really an option because they total almost 50 GB, so I'm looking for a faster way to achieve this.
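Something like streaming a tar archive over SSH straight from one server to the other is what I had in mind; a rough sketch (the paths and hostname are placeholders, and this assumes SSH access from the source server):
# run on the source server; streams a compressed tar straight into the destination
tar czf - /var/www/files | ssh root@server2.example.com "tar xzf - -C /root"
Would that be the right approach, or is there something better for ~50 GB?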
Related
I want to create a watcher that will automatically sync file changes from a local directory to a remote Docker container. I need to find a way to transfer the files efficiently. I will also need a one-time upload command that transfers a complete folder from a local directory to a remote Docker container.
I figure one solution would be to scp the files to a tmp directory on the remote host, and then run docker cp via ssh to copy them from the tmp directory into the container. Is that a good solution? Is there anything better?
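Concretely, I mean something like this; the container name and all paths are made-up placeholders:
scp -r ./src user@remotehost:/tmp/src
ssh user@remotehost "docker cp /tmp/src mycontainer:/app/src && rm -rf /tmp/src"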
By the way, if anyone knows a file sync utility for that use case, please let me know. I tried to search, but it seems this isn't the most popular development workflow.
I would try using rsync for local-to-remote syncing. From there, volume-mount the directory into the Docker container.
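A rough sketch of that approach; the hosts, paths, and image name are placeholders:
# sync local changes up to a directory on the remote host
rsync -avz --delete ./src/ user@remotehost:/srv/app/src/
# on the remote host, bind-mount that directory into the container
docker run -d -v /srv/app/src:/app/src myimage
Re-running the rsync command (or wrapping it in a file watcher) keeps the mounted directory, and therefore the container, up to date without any docker cp step.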
How do I download file(s) from a remote server directory to the local machine in PuTTY?
I found the command for copying a file from the local machine to a remote directory, but it is not working for me, even though there is no error message.
pscp c:\documents\foo.txt fred@example.com:/tmp/foo
(Question is probably more suited to Superuser)
You have your parameters in the wrong order. Please refer to the documentation:
https://the.earth.li/~sgtatham/putty/0.70/htmldoc/Chapter5.html#pscp
To download, you need:
pscp [options] [user@]host:source target
What you have there is the opposite; it's for doing an upload.
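Reusing the names from your command, the download direction would look something like this:
pscp fred@example.com:/tmp/foo c:\documents\foo.txt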
This may be asking too much from an already very powerful tool, but is there a chance that lftp mirror can execute a command during the mirroring process (from remote directory to the local machine)?
Specific example: lftp is asked to mirror a remote directory of XML files into a local folder, and as soon as each file is downloaded/updated, it converts the file to JSON format using xml2json.
I can think of a solution that relies on monitoring the local copy of the mirrored folder for changes via find and then executing xml2json on the new/updated files, but perhaps there is a simpler way?
You can use the xfer:verify and xfer:verify-command settings to run a local command on every transferred file.
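For example, something along these lines inside an lftp session; /usr/local/bin/convert-xml is a hypothetical wrapper script that runs xml2json on the file lftp passes it as its argument:
set xfer:verify true
set xfer:verify-command /usr/local/bin/convert-xml
mirror /remote/xml/dir /local/xml/dir
Note that the verify command's exit status is taken as the verification result, so the wrapper should exit 0 when the conversion succeeds.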
I am doing website development on OS X, and fairly often I find myself in situations where I move some part of a live website (running Linux/LAMP) to a development server running on my own machine. One such instance involves downloading images (user-generated content, e.g. via FTP download), processing them in one way or another, and then putting them back on the production site.
The image files involved, having been created on a Linux machine, appear to have their filenames encoded in UTF-8 using NFC normalization. OS X's HFS+ file system, on the other hand, does not allow NFC-normalized filenames and converts them to NFD. However, once I am done and want to upload the files, their names will now be in NFD form, which Linux happily accepts since it supports both. As a result, the newly uploaded (and in some cases replaced) files will not be accessible at the expected URL.
I'm looking for a way to change the Unicode normalization of the filenames during transfer (preferably) or after it (convmv looks like a good option, but since I don't have sufficient permissions on this server, it's not possible in this particular case), since I'm guessing it's impossible to do beforehand. I've tried FTP upload using Transmit and rsync (via a deploy script I normally use) to no avail. The --iconv option in rsync seemed ideal, but unfortunately my server, running rsync 2.6.9, did not recognize it.
I'm guessing quite a few people are having similar issues; I'll be happy to hear any solution or workaround!
UPDATE: In this case I ended up rsyncing the files to a virtual machine running Ubuntu, running convmv on them there, and then rsyncing again to my staging server. While this works fairly well, it is a bit time-consuming. Perhaps it would be possible to mount an ext file system on OS X and just store the files there instead, keeping their original NFC-normalized file names?
Also, to avoid these problems altogether on future WordPress installs, which was my use case, you can add a simple add_filter('sanitize_file_name', 'remove_accents'); before uploading any files and you should be fine.
It seems that rsync --iconv is the best solution, as you can transfer the files and transcode the names all in one step. You just need to convince your host to upgrade their rsync. Given that the --iconv feature was introduced in rsync 3.0.0, which was released in 2008, it's a bit odd that your host is still running rsync 2.6.9.
If you can't convince your host to install an up-to-date rsync, you could compile your own rsync, upload it somewhere like ~/bin on the server, and put that directory on your PATH ahead of the system-installed rsync. Then you should be able to use the --iconv option. This works as long as you are using rsync over SSH (the default), not the rsync daemon, because rsync over SSH works by SSHing to the remote machine and running rsync --server with the same options that you passed to your local rsync.
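Alternatively, instead of relying on PATH lookup, you can point your local rsync at the uploaded binary explicitly with the --rsync-path option. A sketch, with placeholder paths, and assuming your local rsync is itself 3.x (otherwise it won't understand --iconv either):
rsync --rsync-path=~/bin/rsync --iconv=UTF-8-MAC,UTF-8 -av /local/path/ username@server.example.com:/remote/path/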
Or you could find a host that has up-to-date tools and Perl installed.
Currently I'm using rsync --iconv like this:
Given a Linux server and an OS X machine:
Copying files from server to machine
You should execute this command from the server (it won't work from OS X):
rsync --iconv=UTF-8,UTF-8-MAC /home/username/path/on/server/ 'username@your.ip.address.here:/Users/username/path/on/machine/'
Copying files from machine to server
You should execute this command from the machine:
rsync --iconv=UTF-8-MAC,UTF-8 /Users/username/path/on/machine/ 'username@server.ip.address.here:/home/username/path/on/server/'
I am trying to FTP a RAR (zipped) file to another server but am having problems doing so. This is a Windows environment. I know that my FTP connection is set up correctly because I have already transferred over several other RARs. But the difference, from what I can tell, is that the RAR that is failing is larger: 761 MB. So when I try to "put" it onto the other server, I get the following:
200 PORT command successful.
150 Opening BINARY mode data connection for WCU.rar.
> WCU.rar:Permission denied
226 Transfer complete.
However, the file is never transferred over. Is there a size limitation? And FYI, WCU.rar is a zipped directory, not a file. But I was able to successfully FTP over several other zipped directories.
It could be a size limitation, not just on stored data but also on transferred data.
Did you try to transfer a small file? A small file in the same format? I would say permissions, but you said that you have already uploaded files to this server.
Just to help you debug, you can add both of these commands to your FTP session (hash prints a # for each block transferred, so you can see progress, and bin forces binary transfer mode):
ftp> hash
ftp> bin
WCU.rar:Permission denied
You don't have permission to write to that directory. You need write permission on the folder in order to upload there.
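To confirm it's a permissions problem rather than a size limit, you could try putting a tiny throwaway file into the same directory; small.txt here is any small local file:
ftp> put small.txt
If that also fails with "Permission denied", the directory simply isn't writable for your FTP user, and the fix is on the server side (directory permissions or the FTP server's access settings), not in how you transfer the file.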