I often have to transfer large files (>50 GB, sometimes >100 GB) between drives, both internal and external, during backups of our network's email servers. What is the best method of transferring these files? A command-line tool such as XCOPY? Ideally something robust enough to resume the transfer if it is interrupted by time limits or network issues.
Check out robocopy. From Wikipedia:
robocopy, or "Robust File Copy", is a command-line directory replication command. It was available as part of the Windows Resource Kit, and introduced as a standard feature of Windows Vista and Windows Server 2008.
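A minimal sketch of a restartable copy (the paths here are hypothetical):

    robocopy D:\backups\mail E:\archive\mail /E /Z /R:5 /W:10 /LOG:C:\logs\mailcopy.log

/E copies subdirectories (including empty ones), /Z enables restartable mode so an interrupted copy resumes instead of starting over, /R and /W set the retry count and wait time, and /LOG writes a log you can check afterwards.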
For free, I use SyncToy (from Microsoft). That way if something fails it doesn't abort the whole transfer.
The next best for non-repetitive tasks IMHO is XCopy.
I have used Teracopy with good success.
I get asked this question every now and again, and I always say the same thing: Microsoft Background Intelligent Transfer Service (BITS). This is the same technology used to deliver large service packs and such to workstations. Some of the features:
Network Throttling
Asynchronous Transfers
Auto-Resume
Priority Levels for Downloads
Proven Transfer Mechanism
For those not wanting to deal with the command-line syntax, you can explore wrapper applications, such as SharpBITS.NET, that provide a GUI.
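For the command line itself, a minimal sketch using the built-in bitsadmin tool (the server and paths are hypothetical; BITS can pull from SMB shares as well as HTTP):

    bitsadmin /transfer mailbackup /download /priority normal \\mailserver\backups\mailstore.bak D:\archive\mailstore.bak

The job survives network interruptions and reboots, and resumes where it left off once the source is reachable again.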
I use CopyHandler and find it does the job well.
Well, I use http://itrnsfr.com to transfer my big files online. I wish they would extend the quota beyond the 2 GB they currently offer to free users.
I have used InstallAware and InstallShield before, and they are pretty difficult to work with, and when something goes wrong it is very difficult to find and resolve the issue.
My question is: why can't we use a Windows application written in C# to do this?
I understand that the .NET Framework may not be installed on the destination computer, so I wonder why no one has ever used this architecture:
I will create a simple installer using InstallShield (or any other similar tool) that just installs the .NET Framework and then extracts and runs my own Windows application, which I have written in C#, in elevated mode. My application will run a wizard with Back and Next buttons, and I will take care of everything in it (copying files, creating and starting Windows services, adding registry values, creating firewall exceptions, etc.).
Has anyone ever done this, and is there anything that prevents people from doing this?
In essence: don't try to re-invent the wheel. Use an existing deployment tool and stay with your day job :-). There are many such tools available. See links below.
And below, prolonged, repetitive musing:
Redux: IMHO and with all due respect, if I may say so, making your own installer software is reinventing the wheel for absolutely no gain whatsoever, I am afraid. I believe you will "re-discover" the complexities found by others who have walked the path that is involved in deployment as you create your own installer software, and find that such software can be quick to make, but very hard to perfect. In the process you will expend lots of effort trying to wrap things up - and "the last meter is very long" as you curse yourself dealing with trifles that take up your time at the expense of what would otherwise pay the bills. Sorting out the bugs in any toolkit, for whatever technical feature, can take years or even decades. And no, I am not making it up. It is what all deployment software vendors deal with.
Many Existing Tools: there are many existing tools that implement such deployment functionality already - which are not based on Windows Installer (Inno Setup, NSIS, DeployMaster and heaps of other lesser-known efforts):
There is a list of non-MSI installer software here.
There is another list of MSI-capable software here.
My 2 cents - if you do not like MSI, choose one of the free, non-MSI deployment tools. How to create windows installer.
Corporate Deployment: The really important point (for me) is that corporate deployment relies on standardized packaging formats - such as MSI - to allow reliable, remote management of your software's deployment. Making your own installer will not impress any system administrators or corporate deployment specialists (at least until you sort out years of bugs and deficiencies). They want a standardized format that they know how to handle (that does not imply that they are that impressed with existing deployment technology). Doing your deployment with standardized deployment formats can get you corporate approval for your software. If you make a weird deployment format that does unusual things on install and can't be easily captured and deployed on a large scale, your software is head-first out of any large corporation. No mercy - for real. These are busy environments and you will face little understanding for your unusual solution.
"File-Pushers": Those of us who push files around for a living know that the field of deployment is riddled with silly problems that quickly kill your productiveness in other endeavors - the ones that make you stand out in your field - your day job. Deployment is a high profile, low status endeavor - and we are not complaining. It is just what it is: a necessity that is harder to deal with than you might think. Just spend your time more wisely is what I would conclude.
Complexity: Maybe skim the section "The Complexity of Deployment" here: Windows Installer and the creation of WiX. It is astonishing to deal with all the silly bugs that happen in deployment. It is not just a file copy, though it might be easy to think it is. And if it happens to be just a file copy, then there are existing tools that do the job. Free ones too. See links above. And if you think deployment is only file-copy in general, then please skim this list of tasks a deployment task should be capable of supporting: What is the benefit and real purpose of program installation?
Will your home-grown package handle the following? (just some random thoughts)
A malware-infected terminal server PC in Korea with Unicode characters in the path?
Symbolic links and NTFS junction point paths?
A laptop which shuts itself off in the middle of your file copy because it is out of battery?
Out of disk space situations? What about disk errors? And copy timeouts?
What about reboot requirements? For in-use files or some other reason. How are they to be handled? What if the system is in a reboot pending state and you need to detect it before kicking off your install?
How will you reliably install, configure and start and stop services?
How will you support uninstall and cleanup for your application?
Security software which flags your unknown, unrecognized, non-standard package as a security threat and quarantines it? How would you begin to deal with this? Who do you contact to get into the good graces of a "recognized binary" for elevation?
Non-standard NTFS permissions (ACLs) and NT privileges? How do you detect them and degrade gracefully when you get permission denied (for whatever reason)?
Deployment of necessary runtimes for your application to work? (this has been done by many others before). Download of the latest runtimes if your embedded ones are out of date? Etc...
Provide a standardized way to extract files from your installation binary?
Provide help and support for your setup binaries for users who try to use them?
Etc... This was just a random list of whatever came to mind quickly. There are obviously many issues.
This was a bit over the top for what you asked, but don't be fooled into thinking deployment is something you can sort out a solution for in a few hours. And definitely don't take the job promising to do so - if that is what you are being asked. Just my two cents.
The above issues, and many others, are what people discover they have to handle when creating deployment software - for all but the most trivial deployments. Don't waste your time - use some established tool.
Transaction: If you are working in a corporation and just need to get your files to your testers, you can deploy using batch files, for that matter - if you would like to. But you have to support it, and I guarantee you it will take a lot of your time. What do you do when the batch file fails halfway through due to a network error, and your testers are testing files that are inconsistent? Future deployment technologies may be better for such lightweight tasks. Perhaps the biggest feature of a deployment tool is to report whether the deployment completed successfully or not, to log the errors, and to roll the machine back to a stable state if something failed. Windows Installer does a lot of this work for you.
Distribution: A lot of people feel they can "just replicate my build folder to the user's computers". The complexities involved here are many. There is a network involved, and networks can never be assumed to be reliable; you need lots of error handling here. Then there is the issue of transactions: how do you know when the computer is in a stable state and you should stop replicating? How often do you replicate - only on demand? How do you deal with the few computers that failed to replicate? How do you tell the users? These are distribution issues. Corporations have huge tools such as SCCM to deal with all these error conditions. Trying to re-implement all these checks, logging and features will take a long time. In the end you will have re-created an existing distribution system. Full circle. And how do you do inventory of your computers when there is no product registered as installed, since only a batch file or script ran? And if you start replicating a lot of packages, how many times do you scan each file to determine whether it is up to date? How much network traffic do you want to create? Where does it end? The answer: I guess transactions must be implemented with full logging, error tracking and rollback. Then you are full circle back to a distribution system like I mentioned above, and a supported package format as well.
This "just replicate my build folder to my users" ideas somehow remind me of this list: https://en.wikipedia.org/wiki/Fallacies_of_distributed_computing. Not a 100% match, but the issues are reminiscent. When networking is involved, things start to become very unpredictable and you need logging, error control, transactions, rollback, network communication, etc... We have re-discovered large scale deployment - the beast that it is.
Network: and let's say you want to replicate your build folder to 10,000 desktop machines in your enterprise. How do you kick off the replication? Do you start all replications at once and take down the trading floor of the bank as file replication takes over the whole network like a DDoS attack? Sorry - it is getting out of hand - please pardon the lunacy - but it really is upsetting that this replication approach is seen as viable for large-scale deployment with current technology approaches. Built-in Windows features could help, but they still need to be tested properly. You need scheduling, queuing, caching, regional distribution shares, logging, reporting / inventory, and God knows what else that a packaging / deployment system gives you already. And re-implementing it will be a pain train of brand-new bugs to deal with.
Maybe we one day will see automatic output folder replication based on automatic package generation which really works via an intelligent and transacted distribution system. Many corporate teams are trying, and by using existing tools they get closer with standard package formats used. I guess current cloud deployment systems are moving in this direction with online repositories and easy, interactive installation, but we still need to package our software intelligently. It will be interesting to see what the future holds and what new problems result for packaging and distribution in the age of the cloud.
As we pull files directly from online repositories on demand, will we see a bunch of new problems? Malware, spoofing and injection? (already problematic, but it could get worse). Remote files deleted without warning (to get rid of vulnerable releases that should no longer be used - leaving users stranded)? Certificate and signature problems? Firewall & proxy issues? Auto-magic updates with unfortunate bugs hitting everyone immediately and unexpectedly? And the fallacies of the network and other factors as linked to above. Beats me. We will see.
OK, it became a rant as usual - and that last paragraph is going overboard with speculation (and some of the issues already apply to current deployment). Sorry about that. But do try to get management approval to use an existing packaging & deployment solution - that is my only advice.
Links:
Stefan Kruger's Installsite.org twitter feed: https://twitter.com/installsite
Choosing a deployment tool:
How to create windows installer
What installation product to use? InstallShield, WiX, Wise, Advanced Installer, etc
Windows Installer and the creation of WiX
WiX quick start tips
More on dark.exe (a bit down the page)
I've searched far and wide, but nothing is available, as far as I can see.
TLDR: How can I use rsync with a SharePoint installation? (Or something like rsync)
Long description
We have a large install base of Macs (~50%), Windows (~40%), and Linux (~10%), so our environment is pretty heterogeneous. Given the experimental nature of our work, we produce a considerable number of experimental datasets that we need to share and, more importantly, back up.
Right now we use external hard drives to store these files and folders, since our computers cannot hold this amount of data (50 GB+, for instance, per dataset). And when we need to share, we "physically" share. We mainly use rsync with some kind of backend (what kind is not important), but this solution requires computers to be left turned on to act as servers.
For reasons that I will not bother you with, we cannot leave a computer on after work.
Having OneDrive for Business seemed a very promising technology to use, since we have more than 1 TB per user. We could start syncing our datasets from our computers and hard drives, and we could share even when computers are turned off.
We are aware that we may hit some drawbacks, such as not being able to actually share, or limits on the number of objects (files/directories), but we will handle those later.
I prefer rsync, but right now we're open to any solution.
OneDrive for Business has a download that will allow you to synchronize a directory locally. https://onedrive.live.com/about/en-us/download/
For a Linux platform, you should be able to use onedrive-d found here:
https://github.com/xybu/onedrive-d
I know that it's an old question, but it's unanswered. Maybe a solution could be https://rclone.org/. Rclone is a command line program to sync files and directories to and from the cloud.
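A minimal sketch (the remote name onedrive and the paths are hypothetical; the remote is set up beforehand with rclone config):

    rclone sync /data/experiment1 onedrive:datasets/experiment1 --progress

rclone sync makes the destination match the source, deleting extra files there; rclone copy is the non-destructive alternative.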
Because of some web-host issues, I have to write a file backup/synchronisation tool for the common OSes in the server sector (Windows/Linux). Most Linux root servers offer an SSH interface for secure communication, so I could use the SSH File Transfer Protocol, but what's the best solution on the Windows side (on the fly)? And are there good D libraries (or C alternatives)?
I'm writing here and not on the admin or Windows stacks for one reason: it's important that there are existing libraries. An easy implementation is more important than the existence of an interface or protocol. Simplicity and language features, not raw capability, have priority.
All in all, I am looking for an easy way to implement an OS-independent tool for file exchange. For the synchronisation work it has to be possible to access some file information (last write time, modified time, file size, etc.).
edit:"My Version" of a synconisation tool should work on a new system without extra sotfware installation (maybe some automated installation over ssh-windows-equivalent [if there is one])
You only enter your access data and it should work. Furthermore I also need a protocol and this is the biggest problem. Because ssh doesn't work on windows on the fly - is there an equivalent?
rsync is a popular file synchronization tool, best suited to files being added, deleted, or extended. It's been very well debugged and is quite simple to set up. (rsync -avzP username@hostname:/path/to/source/ /path/to/dest/ or rsync -avzP /path/to/source/ username@hostname:/path/to/dest/ are common.)
rsync is frequently tunneled over ssh; it does have its own protocol if you don't mind it being publicly open.
But if you've got a lot of data that is being slightly moved, or frequent renames, a tool like git can make much better use of bandwidth. It does carry the downside of keeping history on both sides, which might be less disk-efficient than you'd like, but it can more than compensate if your bandwidth is a bit less amazing.
git is also frequently tunneled over ssh; it also has its own protocol if you don't mind it being publicly open.
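A rough sketch of the git-over-ssh route (host and paths are hypothetical):

    # one-time: create a bare repository on the server
    ssh user@server 'git init --bare /srv/sync/data.git'

    # on the machine holding the files
    git init
    git add -A && git commit -m "initial snapshot"
    git remote add origin user@server:/srv/sync/data.git
    git push origin master

    # on any other machine
    git clone user@server:/srv/sync/data.git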
I doubt either one has D library bindings, but C bindings ought to be easy to come by. :)
This might sound like a strange request, but I'm hoping I have more luck here than I've had googling for the same topic.
I'm searching for a Windows based application that allows me to upload files to an FTP server via the command line, across as many threads as possible.
I'm currently trialling WinSCP, which has a simple scripting interface that I can invoke from the command line. However, whilst it's a) Windows based and b) command-line driven/scriptable, it doesn't make use of any multithreading to synchronise uploads of large files.
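For reference, a minimal WinSCP script of the kind I'm driving (host, credentials and paths are hypothetical), saved as upload.txt:

    open ftp://user:password@ftp.example.com/
    put C:\data\largefile.zip /uploads/
    exit

and invoked with:

    winscp.com /script=upload.txt /log=upload.log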
It seems I'm forever limited to achieving 2 of my 3 goals. For example, FileZilla is a) Windows based and b) multithreaded for uploads, but unfortunately lacks any command-line or scripting capabilities :/
Does anyone know of anything that might be able to achieve all 3 of my desires?
Well, FileZilla is GPL, so you could fork it and create a command-line/scriptable version sans the GUI. You'd have to implement the scripting engine, though.
Alternatively, you could implement a client on top of Twisted's FTP support (in Python).
When I develop web applications I frequently need to sync files from a working folder to an external server or another folder. I like keeping my code separate from the web server.
In the open-source world there is Eclipse with File Sync, which does the job pretty well. Unfortunately, I can't find any good replacement for Visual Studio.
I've only found two generic solutions:
- WinSCP, which is pretty good but gets stuck when a file is locked and asks for confirmation, which is quite annoying.
- DSynchronize, which works pretty well (i.e. doesn't ask questions) but doesn't have filters, so I can't tell it not to sync my .svn files or web.config :(.
Do you know any good way to achieve real-time synchronization in Visual Studio or Windows?
It doesn't have to have a GUI; in fact I would love to see a command-line solution, like a PowerShell command that outputs modified files.
I've ended up using Mercurial (to skip the .svn files) and DSynchronize to sync files.
I would give a try to the immortal classic: rsync. There is a Cygwin-based implementation for Windows called cwRsync: http://www.itefix.no/i2/node/10650. With proper configuration (potentially with some fine-tuning via scripting) it will do the job perfectly.
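A minimal sketch with exclusion filters (paths and host are hypothetical), which incidentally also covers the .svn/web.config filtering problem mentioned above:

    rsync -avz --exclude '.svn/' --exclude 'web.config' /path/to/working/ user@server:/var/www/site/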
If you would like bidirectional synchronization, Unison may be the answer:
http://www.cis.upenn.edu/~bcpierce/unison/
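A sketch of a Unison run over ssh (the roots are hypothetical):

    unison /path/to/local ssh://user@server//path/to/remote -auto -batch

-auto accepts Unison's default reconciliation choices and -batch suppresses prompts, which suits unattended runs.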
If you are looking for something even fancier, you might give a try to one of the distributed file systems available, like Coda (I'm afraid decent Windows systems aren't supported yet): http://www.coda.cs.cmu.edu - or the native DFS solution from Microsoft. However, I'm afraid the setup is too much hassle (if not impossible in your case), since it's targeted at enterprise solutions:
http://technet.microsoft.com/en-us/library/cc753479(WS.10).aspx
Of course, the DFS option probably won't support the filtering you are interested in.