I'm currently working on a migration from SVN to Mercurial. My needs are plain and simple: I need source control over an intranet in our company. I see examples everywhere for setting up remote repos over IIS. I just don't see the point when I can simply make a share on a server.
Can I still setup authorizations and authentications on repos using NTFS permissions?
Am I missing something?
Thank you
Putting a repository on a file share works, but it's not the way recommended by the Mercurial team.
See the "shared disk" part of Publishing Repositories on the HG wiki:
generally restricted to intranets, not generally recommended due to general issues with network filesystem reliability
Be sure to check out Chris Becke's answer as well, because he points out another valid disadvantage (people with write access deleting stuff from the network share, whether intentionally or not).
If you are aware of (and can live with) these things, putting the repositories on the network share is without a doubt the easiest way to set up.
My personal experience is that it works perfectly as long as the Windows share is on a "real" Windows machine.
At work we're using a share on a real Windows server without problems, but at home I ran into issues with a NAS (which behaves like a Windows share but actually runs on Linux).
You can read more about my experiences here:
Can you 'push' to network share using Mercurial on 64bit Windows 7?
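For what it's worth, a minimal sketch of the share-based workflow from the command line looks like this (the server and share names are placeholders, and the share must be writable for you):

    REM create the central repository on the share, then clone it locally
    hg init \\fileserver\hg\myproject
    hg clone \\fileserver\hg\myproject C:\work\myproject

    REM after committing locally, push back to the share
    cd C:\work\myproject
    hg push

Because the clone records the share as its default path, a plain hg push or hg pull is enough; there is no server component to install.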
There are a number of reasons to prefer, well, anything at all to a writable file share.
In essence it comes down to this: there is a limited amount of damage someone can do when all they can do is push via a web method.
A read/write share, on the other hand, is necessary to do a push, but it also allows a user to delete an entire repo, history and all.
Without even invoking malicious intent, people (or rogue software agents) have been known to navigate to random network shares and accidentally drag a file to someplace it doesn't belong.
The best reason to lock your PC is not because your co-workers find it amusing to use an unlocked email account to send porn to HR, but because it's amazing what a cleaning lady can do with a rag and a keyboard. It's also amazing what music library applications can find while scanning all the shares in a workgroup, and carefully "move" and catalog to someone's library.
We're transitioning from Rackspace dedicated boxes to a completely cloud-based Azure environment, for both production and development servers, and as an MS shop we're going to be using Visual Studio Team Services. As an MS ISV partner we have a bunch of MSDN seats, so our developers will all have an MSDN w/ VS Premium account which we'll use with Team Services/TFS. We're replicating our production web server on a virtual machine, but after some refactoring we will eventually move to an Azure website.
My question is about when users leave the company. Right now we have everyone log into a development server using RDP. They develop on that server. When someone is gone we shut their access off to that server.
With Team Services when the user opens up a project do they automatically get the entire project downloaded to their local development environment/machine? If someone leaves the company is there a process using VSO that secures that code and removes it from them or makes it inaccessible? Any way to lock it down when we need to? I can't seem to find a procedure to do this.
To add or remove someone from the account, go to the Users hub on the home page for your account. If you remove a user from it, that user will no longer be able to access your account.
When users connect to your account, they'll need to take some action to get source code. That would be cloning in the case of using Git or creating a workspace and running get for TFVC.
If the user has source code, for example, on a machine, there is no way to remotely remove it. They won't be able to get updates, etc., but there's nothing running on the computer that would be able to erase the code the user has already obtained.
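To make that concrete for the Git case, the download is an ordinary clone (the account and project names below are placeholders):

    # initial download of the code to the local machine
    git clone https://yourcompany.visualstudio.com/DefaultCollection/_git/YourProject

    # after the user is removed from the account, further fetches and pushes are refused,
    # but the files already in the local clone are untouched
    git fetch

In other words, removing the user cuts off future access to the server, but it cannot reach out and delete what is already on their disk.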
All source control tools I know of allow zipping up or browsing the local repository, including VS Team Services.
Daniel Mann is correct. Developing on shared servers via RDP is terrible for productivity: development is graphics- and disk-intensive, it often requires admin rights and reboots, crashes and debugging trigger system interrupts, and out-of-memory loops are great fun on a shared machine, i.e. they stuff everybody else around. (Even with RDP you can copy and paste, map a network drive locally, or upload to the net.)
If you're doing critical stuff, the ONLY thing that really works is physically bringing people in to a non-internet-connected machine/network with USB disabled. However, these mechanisms, especially denying internet access, will halve productivity.
This is why most organizations rely on legal contracts. On a 2M project, is it worth making it a 4M project? There are cases where this is required, normally around national security / CIA / defence, but not for IP; there are better / trickier ways.
Pretty much all binaries can be reverse engineered with little effort if you really want to; obfuscation does very little.
Our IT folks are telling us (the dev group) we shall not have ANY files stored on our local hard drives, including our TFS working folders. This is ridiculous for a variety of reasons but until I'm convinced it's a good idea, I'll play along and when no one is looking make a local working folder.
Does anyone have their working folder on a network share? How well does it work? Each developer would have their own folder in the share, but it would be on the network. My main concerns are performance, and that we would need to be connected at all times in order to work.
From a TFS point of view it works without issue, but stay away from the Local Workspaces of TFS/VS 11.
I strongly feel for you on the compiling front: compiling a solution stored over the network is an absolute disaster in terms of performance.
You did not mention it, but I assume your network share is mapped as a network drive.
By the way, may I ask why these guys don't want you to store files locally?
While it's not something I would typically recommend, if that is the policy and you have to adhere to it, it might be worthwhile to consider simply having server-side development VMs that your devs RDP into. I've seen companies do this before, and the big downside is that if you're not connected to the network you can't do anything.
There are some upsides too, though. You can easily increase resources (RAM, disk space, CPU, etc.) thanks to the virtualization infrastructure. If somebody's laptop dies they are not out of commission: just find a loaner machine, RDP into their VM, and they're up and running. If somebody leaves, you have a copy of their entire working machine that you can give to their replacement. All machines can be easily backed up, and so on. Compiling, and working within VS in general, should also be much faster than trying to work with a local Visual Studio reading/writing to a network drive.
I searched, so hopefully this isn't a question that has already been posted.
Basically, when we have users connect to the network via VPN, even though the login script will run and map their network drives, their home share drive (in this case P:) does not reflect the network version and shows the "offline" version.
The problem is they don't see all their files, and of course don't know how to trigger synchronization. Ideally we would just turn it off and there would be no problem, but as most of you know, working in a corporate environment we're bound by the decisions of the guys in another department.
So, is there a way to trigger folder synchronization? Or is there a way to force Windows, when mapping the drive, to look at the network version? I tried the true switch on the MapNetworkDrive method of the WSH Network object, but no joy.
Found a simple solution: you can call "mobsync.exe /logon". Closing thread.
Details are available from Microsoft; their pages move around, so this is just a search.
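In case it helps anyone else, a rough logon-script sketch along these lines illustrates where the call fits in (the server and share path are placeholders for your own home share):

    REM map the home drive, then kick off Offline Files synchronization
    net use P: \\fileserver\home\%USERNAME%
    mobsync.exe /logon

mobsync.exe lives in the Windows system directory, so it should already be on the PATH when the logon script runs.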
I am developing a webapp that will mostly be used on a LAN. I have different locations where I have deployed this app. Some of the locations run Windows and some run Linux (no X Window System). I need to know if there is software out there that can easily synchronize my files, stored somewhere in the cloud (the cloud service can be provided by the app developers, or different clouds can be used), on both Linux and Windows machines. My English is a bit rusty, so I'm going to explain this in simple words.
I will work on my local machine. I want to upload the files somewhere in the cloud, and the clients installed on the LAN servers should synchronize the files. The client must be available for Linux on the console (as a daemon if possible), while on Windows it can be something like Dropbox or Ubuntu One.
Does somebody know of such an app?
Dropbox is available for Linux.
You could also investigate unison.
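As a rough illustration (the host and paths are made up, and unison must be installed on both ends), a one-shot run between a local tree and a Linux server over SSH looks something like:

    # synchronize the local webapp directory with the copy on the LAN server, non-interactively
    unison ~/webapp ssh://deploy@lanserver//srv/webapp -batch

It runs happily from a plain console or cron on the Linux side, which covers the no-X11 requirement.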
I think "Git" is the best solution to develop your project in different machine.
You can sync your code with easy command through this app, and it will record all the version of your code.
Just google "Git tutorial", and you will find many useful introductions.
I think there is a great tool called Syncthing that should be considered, 8 years later.
https://syncthing.net/
Syncthing is a continuous file synchronization program. It synchronizes files between two or more computers and replaces proprietary sync and cloud services with something open, trustworthy and decentralized. Your data is your data alone and you deserve to choose where it is stored, if it is shared with some third party and how it's transmitted over the internet.
Check the list of Syncthing's goals for more details.
I'm working on a Windows platform and want to be able to auto-sync my files one way, 'on change', to my virtual Windows or Linux web server. I also need to be able to filter file types. I can connect to the remote machine via network drives.
I'm ideally looking for a free, easy-to-set-up solution. A commercial product that does what I need is called ViceVersa, but it's a little overkill and it costs money. :)
Thanks
Josh
I'd use rsync: simple, easy to set up, and it provides the filters you need. It is also very low on bandwidth after the first pass.
Here is a link explaining how to get it working on Windows.
Whilst rsync doesn't allow 'on-change' auto-syncing, it is very fast when it scans a synced directory (even a very large one), so you could schedule a frequent sync to overcome this (see the sketch below).
Edit: You could combine it with a program like this to trigger an rsync when folder contents change. Cheaper than ViceVersa.
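For reference, a filtered one-way rsync of a local web folder up to the server might look like the following; the paths, host, and extensions are placeholders, and the local path assumes a cwRsync-style install where C:\work\site appears as /cygdrive/c/work/site:

    # push only PHP/CSS/JS files from the local working copy to the web server
    rsync -avz \
        --include='*/' --include='*.php' --include='*.css' --include='*.js' --exclude='*' \
        /cygdrive/c/work/site/ user@webserver:/var/www/site/

The --include='*/' rule lets rsync descend into subdirectories before the final --exclude='*' throws everything else away; scheduling this command (Task Scheduler on Windows, cron elsewhere) gives you the frequent sync mentioned above.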
For other users, it's worth mentioning lsyncd; it will auto-sync on changes between two machines (by default deferring to rsync). It will only work on Linux, though, but if that's not a problem it works great.
It also seems that SparkleShare (a Dropbox clone) has finally released some working code. I haven't tried it myself, but it does cross-platform syncing and you can set up your own server.