I have a remote server running the latest Ubuntu version. I was wondering what the best, most secure and easiest way is to download large files (> 1 GB) from this server.
The download MUST be:
Encrypted
With resume support (because of my slow download speed)
At the moment I'm using BitTorrent Sync and it works pretty well. The problem is that it is not open source and I cannot trust it 100%.
What other ways do you recommend? rsync? I don't think scp has resume support.
NOTE: I only need to download files; I don't need a way to keep files and/or folders in sync.
EDIT: I have no web server installed on this server, so I can't download these files using wget, curl or the like.
You mention it's a server, so why not use wget? Just use wget -c http://example.com/file to resume the download.
Alternatively, rsync is also an option; its --partial option keeps partially transferred files so an interrupted download can be resumed.
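For example, to pull a large file over SSH with resume support (user, host and path are placeholders):
# --partial keeps incomplete files so re-running resumes the transfer,
# -z compresses in transit, --progress shows status; rsync runs over SSH by default, so it is encrypted
rsync -avz --partial --progress user@server:/path/to/largefile.iso ./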
I'm using CoreFTP to automatically pull files daily from an external FTP server via SFTP. I'm able to pull the files; however, despite using the 'delsrc' flag, it won't actually delete the source files, meaning they may build up. I think it may have to do with the fact that I can't push to the server, yet I can delete the files through the CoreFTP GUI. Thanks for the help.
I recommend using FluentFTP.
It is better than CoreFTP.
Problem: on Windows 7, since we can't use NFS (at least not without hacks), the performance of the Vagrant/Magento combination is really poor.
After much research, I found that the best (maybe the only) way to solve the problem is to use rsync. I managed to get it working, and the performance is now really good!
But I found a problem: rsync seems to be one-directional. What do I mean? Suppose I install Magento successfully and then run the "vagrant rsync" command. It will perform a new sync of the folders and, because it syncs the guest file structure from the host file structure, it will "delete" the app/etc/local.xml file that Magento built for me during installation, just because it doesn't exist on the host.
Now, I've read about solutions like excluding folders or files from the sync, but I don't think that's a great way to solve the problem.
Does anyone have a better solution? Is there a way to sync the two file structures bidirectionally?
UPDATE
I tried to find a solution.
1) I tried to use Unison, but I ran into an error I can't understand.
2) I tried to use the Vagrant plugin rsync-back, but it seems it can't find the right folders to sync.
3) I finally chose to run rsync from within the virtual machine: connect with vagrant ssh and execute the command "rsync -av /var/www/ /vagrant".
It seems to work.
So, for the moment, the solution to improve the performance of Vagrant and Magento is to use the rsync synced folder. To work around the one-directional sync problem, I run the rsync command from inside the VM when I need to sync from guest to host (for the other direction, vagrant rsync is enough), as shown below.
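Concretely, from the host I run this in one shot (the paths are the ones from my setup):
# runs the guest-to-host sync without having to log in to the VM interactively
vagrant ssh -c "rsync -av /var/www/ /vagrant"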
If you have a better way, please share it!
The best option I know of is Unison (real-time bidirectional folder sync).
The vagrant-unison plugin from https://github.com/mrdavidlaing/vagrant-unison is outdated and no longer functioning.
Get an updated version of the vagrant-unison plugin here: https://github.com/dmatora/vagrant-unison
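If you prefer to skip the plugin, plain Unison can also be run by hand; a rough sketch, where the local folder, user, SSH port and guest path are assumptions about a typical Vagrant setup:
# two-way sync between a host folder and the guest's web root over SSH
unison ./src ssh://vagrant@127.0.0.1//var/www -sshargs "-p 2222" -auto -batch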
This is indeed the case right now; Vagrant's rsync support is not bidirectional—it only syncs from your host machine to the VM, and not back. There is an open issue to add two-way sync (rsync-push and rsync-pull), but I'm not sure when this issue will see the light of day.
Some other options, in the meanwhile:
vagrant rsync-back plugin - currently allows for only manual one-time rsync pulls, no support for rsync-auto
vagrant unison plugin - uses Unison to synchronize only one folder at a time
The main reason I like rsync is because it's one of the simplest/most robust file sync tools available for Mac/Windows/Linux, and since it's already installed on 2/3 of those platforms, only the Windows devs need to do any extra work to get it going. Most other options (NFS, Unison, etc.) require extra software for everyone, and don't offer much in the way of a performance gain over rsynced files.
The problem I had using rsync was that it is one-directional, so if my app creates a file on the remote server, it will be deleted the next time rsync runs.
I tried the above suggestions but ended up using FTP.
I'm using PhpStorm with automatic deployment, so if I change a file or create a new one, it automatically uploads the files to the remote server (VM).
It's still one way but it works for me.
I would like to download the latest update 11 of JDK 7 from the Oracle page using wget, but I can't figure out how.
wget --no-check-certificate http://download.oracle.com/otn-pub/java/jdk/7u11-b21/jdk-7u11-macosx-x64.dmg
Oracle products require the license agreement to be accepted before downloading; that's why this will not work with wget without special care.
There are several ways of downloading Oracle products with wget. Most often you will come across the method of copying a cookie file to your server and using it with wget, but that doesn't seem to work anymore.
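Another variant you will often see is to send the license-acceptance cookie directly in the request; it may or may not still work, since the exact cookie Oracle expects has changed over time:
wget --no-check-certificate --no-cookies --header "Cookie: oraclelicense=accept-securebackup-cookie" http://download.oracle.com/otn-pub/java/jdk/7u11-b21/jdk-7u11-macosx-x64.dmg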
The easiest way I have found to download an Oracle product from the console using wget is the following.
Easiest way of downloading
Use any graphical browser (Firefox, Chrome, IE, etc.) on any machine to accept the license agreement of the product you mean to download, and start downloading the file.
Cancel the download right away, then right-click on the download entry to copy the download URL.
Use the link you copied in step 2 and just download it with wget in the normal way:
wget URL_YOU_HAVE_COPIED
Apparently there is a timeout before the URL becomes invalid after you copy it from the browser. I'm not sure how long that is, but it has worked like a charm every time I've used it.
Does anyone have an easier method? Please let us know.
I'm trying to figure out a way to automate deployment to our QA environment. The problem is that our release is quite big, so it needs to be zipped, FTP'd and then unzipped on the QA server. I'm not sure how best to unzip remotely.
I can think of a few options, but none of them sound right:
Use PsExec to execute a remote command-line call on the QA server to unzip the release (a sketch is included below).
Host a web service on the QA server that unzips the release and copies it to the right place. This service can be called by our build when it's done uploading the files.
Host a Windows service on the QA server that monitors a file location and does the unzipping.
None of these are pretty though. I wonder how others have solved this problem?
PS: we use CruiseControl.NET to execute a NAnt script that does the building, zipping and FTP.
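For reference, the PsExec route (option 1) would look roughly like this, assuming 7-Zip is installed on the QA box; the server name, credentials and paths are just placeholders:
psexec \\qa-server -u DOMAIN\deploy -p secret cmd /c "7z x -y D:\drops\release.zip -oD:\sites\app"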
Instead of compressing and uncompressing manually, you can use a tool like rsync, which can transparently compress data during the file transfer. The -z option tells rsync to use compression (see the example after the list below).
But I assume you are in a Windows environment, in which case you could use cwRsync (which is "rsync for Windows").
Depending on your access to the QA box this might not be a viable solution. You'll need to:
install the cwRsync server on the remote machine and
allow the traffic through any firewalls.
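Once that is set up, a single compressed transfer straight from the build box could look like this (host and paths are placeholders):
# -a preserves permissions and timestamps, -z compresses on the wire, --delete mirrors removals
rsync -avz --delete ./build/output/ deploy@qa-server:/deploy/app/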
At the last place I worked, we had a guy write a Windows service on the CI box to do the unzipping. TFS (Team Foundation Server) finished the build and notified a service to zip the completed build and copy it to the CI box. The CI box picked up on the new file and unzipped it. It may have been a bit heavy, but it worked well, and he was careful to log all actions to the event log, so it was easy to diagnose if a server had been reset and the service hadn't started.
Update: One thing that we would have liked to improve on that process was to have the service on the CI box check for zip files and uncompressed files that were older than x months, for purging purposes. We routinely ran out of disk space (it was a VM that we rarely looked at), and had to manually purge old builds when it happened.
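For the purging part, even a scheduled one-liner would probably have been enough; a sketch for Windows, where the path and the 90-day cutoff are just example values:
forfiles /P D:\Builds /S /M *.zip /D -90 /C "cmd /c del @path"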
I'm looking for a good non-interactive, command-line FTP client to be run from a Rakefile. Like Weex, but better. Weex has a few problems (for me):
It stores its config file in my home directory. I want the FTP config to be part of my project, and Weex doesn't have a --config-file option or anything similar.
The file-ignoring behavior seems to be completely buggy. It doesn't remove files that it should, it doesn't let me specify relative paths even though I follow the man page's instructions, etc. I've been struggling with it for an hour now and it is just completely inexplicable.
I tried running rsync over FTPFS/FUSE, but that is dead slow because FTP doesn't store mtimes, which makes rsync diff every file. Plus, there are some refresh problems and other bugs that cause access failure (http://bugs.gentoo.org/208168).
I'm stuck with FTP, unfortunately. Any help is appreciated.
Perhaps something from the ncftp suite (http://www.ncftp.com/ncftp/)? This has the ability to specify a config file of your choice and tools to operate non-interactively (ncftpget/ncftpput).
It doesn't appear to have ignore functionality, but hopefully this is helpful to you.
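For example, a non-interactive run from a script could look like this (host details live in login.cfg, and the paths are placeholders):
# login.cfg holds host/user/pass lines so credentials stay out of the Rakefile
ncftpput -f login.cfg -R /remote/site ./public
ncftpget -f login.cfg ./downloads /remote/site/report.csv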
I've used lftp in the past with good results. It's installed by default in many distributions and offers pretty sophisticated functionality (including a couple of ways to exclude files).
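For example, a one-shot mirror upload that skips a couple of paths might look like this (host, credentials and paths are placeholders):
# mirror -R uploads local -> remote; --delete removes remote files that are gone locally
lftp -u user,password ftp.example.com -e "mirror -R --delete --exclude-glob *.log --exclude-glob .git/ ./public /htdocs; quit"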
Try sitecopy: http://www.manyfish.co.uk/sitecopy/
The trouble with lftp is that it is very slow for mirroring, which I suppose you want to do since you have been using Weex.
Unfortunately, both Weex and sitecopy have very limited proxy handling, so if you need to go through an HTTP proxy, lftp may still be your best bet.