I'm working on a Windows platform and want to auto-sync my files one way, 'on change', to my virtual Windows or Linux web server. I also need to be able to filter file types. I can connect to the remote machine via network drives.
I'm ideally looking for a free, easy-to-set-up solution. A commercial product that does what I need is ViceVersa, but it's a little overkill and costs money :)
Thanks
Josh
I'd use rsync - simple, easy to set up, and it provides the filters you need. It's also very light on bandwidth after the first pass.
Here is a link explaining how to get it working in Windows
While rsync doesn't do 'on-change' auto-syncing, it is very fast when it scans a synced directory (even a very large one), so you could schedule a frequent sync to work around this.
Edit: You could combine it with a program like this to trigger an rsync when the folder contents change. Cheaper than ViceVersa.
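To make the 'trigger on change' part concrete, here is a minimal sketch in Python. It assumes the third-party watchdog package and an rsync binary on the PATH (for example cwRsync or Cygwin's rsync on Windows); the paths, destination and filter patterns below are only placeholders.

import subprocess
import time

from watchdog.events import PatternMatchingEventHandler
from watchdog.observers import Observer

LOCAL_DIR = "C:/projects/site"              # folder to watch (placeholder)
RSYNC_SRC = "/cygdrive/c/projects/site/"    # same folder in the path style a Cygwin-based rsync expects
REMOTE = "user@webserver:/var/www/site/"    # rsync destination (placeholder)

def run_sync():
    # One-way push; *.tmp and *.log are filtered out as an example.
    subprocess.run(["rsync", "-az",
                    "--exclude=*.tmp", "--exclude=*.log",
                    RSYNC_SRC, REMOTE], check=False)

class MarkDirty(PatternMatchingEventHandler):
    def __init__(self):
        super().__init__(ignore_patterns=["*.tmp", "*.log"], ignore_directories=True)
        self.dirty = False

    def on_any_event(self, event):
        self.dirty = True                   # something changed; sync on the next tick

if __name__ == "__main__":
    handler = MarkDirty()
    observer = Observer()
    observer.schedule(handler, LOCAL_DIR, recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(2)                   # crude debounce: sync at most every couple of seconds
            if handler.dirty:
                handler.dirty = False
                run_sync()
    except KeyboardInterrupt:
        observer.stop()
    observer.join()

If you'd rather not leave a watcher running, drop the watchdog part and just call run_sync() from a scheduled task, as suggested above.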
For other users, it's worth mentioning lsyncd: it auto-syncs on changes between two machines (by default deferring to rsync). It only works on Linux, but if that's not a problem it works great.
It also seems that SparkleShare (a Dropbox clone) has finally released some working code. I haven't tried it myself, but it does cross-platform syncing and you can set up your own server.
Our IT folks are telling us (the dev group) we shall not have ANY files stored on our local hard drives, including our TFS working folders. This is ridiculous for a variety of reasons but until I'm convinced it's a good idea, I'll play along and when no one is looking make a local working folder.
Does anyone have their working folder on a network share? How well does it work? Each developer would have their own folder in the share, but it would be on the network. My main concerns are performance and that we would need to be connected at all times in order to work.
From a TFS point of view it works without issue, but stay away from the Local Workspaces of TFS/VS 11.
I strongly feel for you on the compiling side: compiling a solution stored on the network is an absolute disaster in terms of performance.
You didn't mention it, but I assume your network share is accessed via a mapped network drive.
By the way, can I ask why these guys don't want you to store files locally?
While it's not something I would typically recommend, if that is the policy and you have to adhere to it, it might be worthwhile to consider simply having server-side development VMs that your devs RDP into. I've seen companies do this before, and the big downside is that if you're not connected to the network you can't do anything.
There are some upsides too, though. You can easily increase resources (RAM, disk space, CPU, etc.) thanks to the virtualization infrastructure. If somebody's laptop dies, they are not out of commission; they just find a loaner machine, RDP into their VM, and they're up and running. If somebody leaves, you have a copy of their entire working machine that you can give to their replacement. All machines can be easily backed up. And so on. Compiling, and working within VS in general, should also be much faster than trying to work with a local Visual Studio reading from and writing to a network drive.
I need to move entire directories from one computer on the network to another (in a platform-independent way). Basically, I am working on an automation tool to help the developers do Build Verification Tests; for this, I have been directed to automate the installation and uninstallation of the product on multiple platforms. So, first I will need to copy the files!
And this is where I need some help, both conceptual and practical.
Firstly, let me mention that using something like FileZilla or WinSCP is out of the question since I need things to happen automatically and not through button clicks. But please let me know if these tools have any command line utilities!
I tried Perl's Net::FTP, and while it looked promising, I was wondering whether it was the best way to go. Also, I want to know the prerequisites before I can use FTP: would I need Perl installed on the other end as well? I keep reading that Perl's FTP commands actually try to connect to an FTP host; does this mean it's not going to work if I haven't configured the remote host in some way? And if so, what is this extra piece of configuration?
Apart from this, is there any other way I could solve my problem? I am looking for APIs that would help me do platform-independent file transfers. But once again, I cannot use tools that need button clicks, because I am doing automation and everything needs to be done programmatically and automatically.
Also, I think this is a very generic problem statement: "moving files between computers connected by a LAN". So it would be wonderful if we could have a list of (possibly) many options, i.e. ways to solve the problem, as answers to this post.
Thanks in advance for any help that you wish to provide.
If nearly all of the files in your directory have changed, creating an archive, sending it over the network, and unarchiving makes sense. If your LAN is fast enough, though, it may be faster not to compress the archive at all; just use plain tar.
If only some of the files have changed, rsync, a command-line tool, will download only the changes. It can be used with ssh like this:
rsync -ae ssh username@hostname:/path/to/files /store/here/locally
http://www.thegeekstuff.com/2010/09/rsync-command-examples/
On Linux and OS X, cron and crontab allow you to schedule scripts to run periodically. Windows provides the Windows Task Scheduler.
FTP is fine if you don't care about encryption over your LAN. Otherwise, SSH would be preferable.
rsync is available on OS X and Linux, but I think you can use it on Windows through Cygwin.
I suggest making an archive (e.g. a .tar.gz file) on the source host, transferring it with scp, and unarchiving it on the target host.
You could also use unison or rsync.
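A rough sketch of that archive / copy / unarchive flow, in Python with the third-party paramiko package handling the SSH and SFTP part. Host name, credentials and paths are placeholders, and the target is assumed to be Unix-like with tar available.

import os
import tarfile

import paramiko

SRC_DIR = "/home/build/output"          # directory to ship (placeholder)
ARCHIVE = "/tmp/output.tar.gz"
REMOTE_HOST = "target.example.com"      # placeholder
REMOTE_DIR = "/opt/product"             # placeholder

# 1. Archive the directory locally (gzip-compressed; drop ":gz" on a fast LAN).
with tarfile.open(ARCHIVE, "w:gz") as tar:
    tar.add(SRC_DIR, arcname=os.path.basename(SRC_DIR))

# 2. Copy the archive over SFTP.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(REMOTE_HOST, username="builduser", password="secret")
sftp = client.open_sftp()
sftp.put(ARCHIVE, "/tmp/output.tar.gz")
sftp.close()

# 3. Unarchive on the target host.
stdin, stdout, stderr = client.exec_command(
    "tar -xzf /tmp/output.tar.gz -C " + REMOTE_DIR)
stdout.channel.recv_exit_status()       # wait for the remote tar to finish
client.close()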
I would suggest developing your own FTP client in .NET. This way you will have complete control over the application, and instead of button clicks you can schedule it with the Windows Task Scheduler. Here is an article about how to create your own FTP client in VB.NET:
http://dot-net-talk.blogspot.com/2008/12/how-to-create-ftp-client-in-vbnet.html
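If .NET isn't a hard requirement, the same idea takes only a few lines with Python's standard ftplib; either way, the only prerequisite on the remote machine is a running FTP server, nothing else. Host, credentials and paths below are placeholders.

import os
from ftplib import FTP

LOCAL_DIR = r"C:\build\drop"            # files to push (placeholder)
REMOTE_DIR = "/incoming"                # placeholder

with FTP("ftp.example.com") as ftp:     # an FTP server must already be running there
    ftp.login("builduser", "secret")
    ftp.cwd(REMOTE_DIR)
    for name in os.listdir(LOCAL_DIR):
        path = os.path.join(LOCAL_DIR, name)
        if os.path.isfile(path):
            with open(path, "rb") as fh:
                ftp.storbinary("STOR " + name, fh)   # binary-mode upload

Like the VB.NET version, this can be run unattended from the Windows Task Scheduler or cron.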
I'm currently working on a migration from SVN to Mercurial. My needs are plain and simple: I need source control over an intranet in our company. I see examples everywhere for setting up remote repos over IIS. I just don't see the point when I can simply make a share on a server.
Can I still set up authorization and authentication on repos using NTFS permissions?
Am I missing something?
Thank you
Putting a repository on a file share works, but it's not the way recommended by the Mercurial team.
See the "shared disk" part of Publishing Repositories on the HG wiki:
generally restricted to intranets, not generally recommended due to general issues with network filesystem reliability
Be sure to check out Chris Becke's answer as well, because he points out another valid disadvantage (people with write access deleting stuff from the network share, be it intentionally or not).
If you are aware of (and can live with) these things, putting the repositories on the network share is without a doubt the easiest way to set up.
My personal experience is that it works perfectly as long as the Windows share is on a "real" Windows machine.
At work we're using a share on a real Windows server without problems, but at home I ran into issues with a NAS (which behaves like a Windows share but actually runs on Linux).
You can read more about my experiences here:
Can you 'push' to network share using Mercurial on 64bit Windows 7?
There are a number of reasons to prefer, well, anything at all to a writable file share.
In essence it comes down to this: there is a limited amount of damage someone can do when all they can do is push via a web method.
A read/write share, on the other hand, is necessary to do a push, but it also allows a user to delete an entire repo, history and all.
Without even invoking malicious intent, people (or rogue software agents) have been known to navigate to random network shares and accidentally drag a file to someplace it doesn't belong.
The best reason to lock your PC is not because your co-workers find it amusing to use an unlocked email account to send porn to HR, but because it's amazing what a cleaning lady can do with a rag and a keyboard. It's also amazing what music library applications can find while scanning all the shares in a workgroup, and carefully "move" and catalog into someone's library.
Is there any free software capable of automatic file upload? Let's say I edit PHP code on my local computer with my favorite IDE. I won't change my IDE; it's great. I want something that detects when a file in my project directory has changed and uploads it over FTP/SFTP to the remote server. That's it - just that simple.
What I've already tried:
FTPDrive + FileSync Eclipse plugin - it's quite slow, uploads ALL the files way too often, and is buggy under Vista and Windows 7.
WinSCP automatic synchronization - bugs again; it randomly refuses to upload files. It would be the best option if it worked right.
Eclipse's native SFTP support - it's USELESS! You cannot use PDT projects with this feature. PDT without projects is no better than Notepad++.
Aptana FTP feature. It's worse than manual! Gawd, it sucks!
Running my own PHP/MySQL server under Windows - first, it took me ages to set it up, and then it didn't work EXACTLY like my production environment, so I wasn't able to test my code correctly.
How should it work? I change a file here, and it's uploaded there. It would be best if it sat quietly in the tray and bothered me only if an upload error occurred.
Ok, if it's not free, maybe there's something cheap at least?
If there's nothing like it, is there something like FTPDrive?
rsync does exactly what you're asking.
Well, almost: it doesn't watch your filesystem and automatically upload files - you'd have to set up a task to run it every minute or whatever. But it does efficiently upload only the changes. If you're on Linux, lsyncd does the watching part and drives rsync to do the efficient upload part.
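If all you need is "push whatever changed over SFTP every minute or so", a small script run from the Task Scheduler (or cron) is enough. Here is a rough sketch using the third-party paramiko package; host, credentials and paths are placeholders, and the remote directory tree is assumed to already exist.

import os
import time

import paramiko

LOCAL_DIR = "C:/projects/site"          # placeholder
REMOTE_DIR = "/var/www/site"            # placeholder
STAMP_FILE = "C:/projects/.last_sync"   # remembers when the last upload ran

last_sync = os.path.getmtime(STAMP_FILE) if os.path.exists(STAMP_FILE) else 0

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("webserver.example.com", username="deploy", password="secret")
sftp = client.open_sftp()

for root, _dirs, files in os.walk(LOCAL_DIR):
    for name in files:
        local_path = os.path.join(root, name)
        if os.path.getmtime(local_path) <= last_sync:
            continue                     # unchanged since the last run
        rel = os.path.relpath(local_path, LOCAL_DIR).replace(os.sep, "/")
        sftp.put(local_path, REMOTE_DIR + "/" + rel)   # remote folder must exist

sftp.close()
client.close()

with open(STAMP_FILE, "w") as fh:
    fh.write(str(time.time()))          # record this run for the next invocation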
In the Rails world, we tend to use source control and a deployment tool like Vlad or Capistrano. It's a bit safer and more consistent than FTP. Here is a guide on how to use it with SVN and PHP: http://www.simplisticcomplexity.com/2006/08/16/automated-php-deployment-with-capistrano/
You really should try to get your development server running on your personal machine. It's a much better way and it is worth the initial pain of trying to make it work. There are good tutorials on that out there somewhere.
You can use WebDrive or ExpanDrive to mount a complete remote directory as a local disk drive and edit your files directly on the server. However, this depends heavily on your connection and on how your tools are written. Another approach could be to use one of these tools and sync all the changes asynchronously with another tool.
We have tremendous problems with Visual Studio (2008, if that matters) locking up and slowing down when accessing projects over a network drive. It can take several minutes to open a large Web site project through a mapped drive, and saving even a single file can take a minute or more.
I fired up Wireshark and watched the traffic. VS, it seems, requests a massive number of files from the network; there's an enormous amount of SMB traffic. I've done some research, and this traffic seems to stem from two situations.
VS has to have everything in its own process to provide Intellisense.
VS needs to have all the source in order to compile the project.
All the advice I've read seems to boil down to the same thing: work locally, not on a remote machine, then push your code to an integration server via source control.
This would certainly solve our problems (VS is quite fast working locally), but what if you can't work locally? What if the project and the infrastructure required to run it are too large and complicated to be replicated on everyone's individual machines?
We've gone 'round this problem a couple times, and the only way we can figure to work on these projects is direct access via a mapped drive. However, the VS slowness and lockups are really becoming a problem.
One solution: we installed VS on the server and work on the projects directly on the servers via RDP. Seriously.
So, I ask:
What does everyone else do? Do you work over the network, or do you replicate projects locally? If remotely, do you suffer from VS performance issues?
We work locally and use SVN to keep all our code on the server.
I find VS 2008 quite slow working locally sometimes so I wouldn't fancy working on a network share.
Trying to compile over a network share using Visual Studio is horribly slow. Your start-up times will be bad as the IntelliSense database is regenerated. Each compilation has to go over the network multiple times. Linking takes forever.
If you need the output of your compilation on the network, I'd recommend doing your compile locally and defining a post-build command to copy the results to your share.
If, as you say, you cannot pull everything locally, then I'd suggest your project is too big and needs to be broken up into more manageable chunks. For a multi-tier application, break it up by tier and invest in some form of continuous integration (e.g. CruiseControl) to automatically build individual pieces. That way you can work locally on a particular piece and pull the pre-built portions from CI for the other pieces of the application.
I'm not terribly surprised that using VS to load projects over a network share has performance issues. VS (in any language) is constantly getting information from files in the project. Once you start loading this over a network you're at the mercy of the underlying network connection. All lags and access issues will directly translate into VS having an issue loading file contents.
I would advise copying the solution locally and using some form of source code control to sync the project on the share.
If the code is too complicated to install on everyone's machine, then don't put it on everyone's machine. Does everyone need to have everything in order to do productive work?
I have 79 projects in my solution that I work with. Several hundred thousand lines of code. I pull my source down everyday from TFS and build it; it's a lot of code, but it's a far better solution than trying to work over a network share.
A more legitimate situation for having the source code on a share is when you have a non-Windows host on which one or more virtual Windows machines are running.
I have this exact situation: my desktop machine (the host) runs Debian, and I use VMware to run various virtual Windows machines (the guests), including one that has Visual Studio installed so that I can target Windows OSes. Having the source code on a Samba share on the host machine has the following pros:
The source is not duplicated, so there is no way to confuse different copies while working on several virtual machines at the same time.
I have full control over the source from my preferred OS.
I can turn on and off any of the virtual machines, or roll back to a snapshot, without the risk of losing changes.
I can build (etc.) from the same source on several machines without having to commit changes before the source is fully tested (reason: I have to use Subversion < 1.5).
The only problem with this setup is that Visual Studio (6,7,8,9) is painfully slow.
I have mounted the partition (on which the share lies) with "relatime", and this helps insofar as the disk activity on the share stays moderate, but Visual Studio keeps the (virtual) network card occupied all the time.
Any solutions to this would be greatly appreciated.
I encountered similar problems every time I worked (where "work" means anything other than just copying/pasting files) over a network drive. The problem occurred with Zend Studio and Eclipse.
Why not use any kind of source control?
When working on Windows-based projects I've always worked locally.
Once, at a Unix shop (AIX, IIRC), developers would work via an NFS mount and check code in and out via RCS...
I'm using VS 2005 against a network share and not having any performance issues. However, it is a new server (Windows Server 2008). I don't have any other data points for VS, since using it at work is relatively new for me.
However, here are some data points from using NetBeans for previous projects on a network share... The local build time for my project was 2 minutes on Vista, on a fast dual-core AMD 64-bit machine. For the same project on a network share on a Server 2003 box, it was 20 minutes. Building that same project locally on an ancient Tablet PC (1 GHz, single core) running XP took around 5 minutes. Interestingly enough, the Tablet PC could build from the Server 2003 share in the same 5 minutes.
For those asking "why" on the network share. The network share is automatically backed up, archived, etc. Also, that way I can very easily look at the same projects from multiple machines without having to worry about pushing back into the repository, etc. Once you've gone to having your dev stuff on a device where you can get to it from anywhere/anything, you'll never want to do local storage again!
I have performance problems with anything over a network; networks just aren't good enough yet.
I thought it was common knowledge that disk-speed is one of the major "slowness" factors when it comes to using VS in Windows. Most dev machines I've built have had projects located on 10k RPM RAID0 drives, or at least a single 10k RPM drive. And even then it seems slow sometimes. Just the way it is, I suppose, until VS2009/VS2010 fixes it? :)
From my experience, this lag when working on a network share is 99% due to Intellisense. Disable it and you'll see.
Disabling IntelliSense does indeed dramatically speed up saving and opening files through a UNC share:
http://blogs.msdn.com/saraford/archive/2007/12/03/did-you-know-how-to-turn-off-intellisense-by-default.aspx
But then again, as stated in other comments, at that point you might as well use a good text editor.
I've also experienced the problems with performance mentioned above. It seems to vary from project to project, but I did find one way of speeding up performance significantly for some project types.
Following the advice in this article made a previously unusable project on a network location (it would take minutes to open one file) perform almost like a local project. The basic gist is that you need to grant FULL TRUST to the network location:
To grant permission to all your projects in your Visual Studio Projects folder located on the network, follow these 8 steps:
1. Open Microsoft .NET Framework 1.1 (or 2.0) Configuration, which you'll find under Administrative Tools in the Control Panel.
2. Expand Runtime Security Policy | Machine | Code Groups | All_Code | LocalIntranet_Zone.
3. In the right-hand pane, click Add a Child Code Group.
4. In the dialog that follows, choose Create a new code group and fill in a Name like Visual Studio Projects.
5. Optionally, provide a Description for the Code Group. (You'll see the description when you click a Code Group in the left tree, helping you identify the various Code Groups you may have.)
6. In the Condition Type drop-down, choose URL.
7. For the URL field, type something like this: file://YourServer/My Documents/Visual Studio Projects/*
8. Under Use existing permission set, choose FullTrust (that is, if you trust your own applications; if you don't, choose a different permission set or create a new one).
Not sure why this works, but it made a previously unusable .NET 2.0 project perform significantly better.
Original article: http://imar.spaanjaars.com/364/how-do-i-allow-my-visual-studio-net-projects-to-run-from-a-network-location
I was having the same problem. I have a local copy of our build system, which expects certain drive letters, and was also experiencing slowness.
I have solved the problem by adding the following registry keys:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices]
"R:"="\DosDevices\D:\devel\build
"S:"="\DosDevices\D:\devel\src"
Note that the double '\'s above are part of the .reg file format. When using regedit use single '\' throughout.
My build times were divided by 3. :)
I found the info in the wikipedia article on the SUBST command.
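If you'd rather script those registry entries than merge a .reg file, here is a tiny sketch using Python's standard winreg module; it writes the same values as the .reg snippet above. Run it elevated, and note that DOS Devices mappings typically only take effect after a reboot; the drive letters and paths are the example's.

import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices"

# Open (or create) the DOS Devices key and add the two drive mappings.
key = winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE)
winreg.SetValueEx(key, "R:", 0, winreg.REG_SZ, r"\DosDevices\D:\devel\build")
winreg.SetValueEx(key, "S:", 0, winreg.REG_SZ, r"\DosDevices\D:\devel\src")
winreg.CloseKey(key)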