Are there any drawbacks to running Visual Studio remotely? - visual-studio

Let's say you have a slow laptop which can't handle Visual Studio but a blazingly fast desktop that can. Let's also say that you want to develop in several rooms in your house. Are there any drawbacks to having Visual Studio running on the desktop and simply using the laptop as a way to access it remotely? I'd guess that the only thing that you would be concerned about would be the network latency, but if the two computers are on the same network that should be minimal.

Do it.
Since you are running Visual Studio on your own local network, the main drawbacks (security and latency) don't apply. In addition, you get the speed of your desktop and the mobility of your laptop.

I do this a lot, even over broadband, and I've never found speed to be a problem.

This is my standard working practice at work. There are times when you hit issues, such as TFS document attachments failing to open, but overall the experience is fine.
An added bonus is that you can leave it running continually (i.e. overnight / weekends): kick off a build before you leave for the evening and come back to a packaged installer (or an error :) ).
I'm looking forward to (in a year or two) being able to do this over Hyper-V - then the application will run as though it IS on my laptop, with no Remote Desktop required.

No big drawbacks. I've been running VS 2008 remotely on a server 400 miles away, using GNU/Linux and rdesktop on my laptop, with the server (of course) running Windows. The only problem I encounter is that moving files between the two is a mess - but if the desktop is near and you can install anything you like (FTP programs, for example), I can't see any drawbacks.

In the corporate environments where I've tried this, I never found it particularly joyful. I tried both MSTSC and VNC.
Having a desktop with multiple monitors and trying to view it through a smaller laptop display is typically quite painful - never enough space.
Even with both PCs on the same switch, there always seemed to be some delay in mouse movement and typing. I'm sure you could adjust to it; I just found it a bit annoying.
We haven't tried serving up DevStudio from a Citrix server yet; that might be worth a go.

I work a lot with Visual Studio over broadband, which is OK.
If you are running Linux on your laptop, rdesktop is your friend. It has many options for gaining speed, like using 8-bit color instead of 16 or more; I don't know whether mstsc offers the same. Visual Studio 2008 also has many speed-related options you can change if the connection is too slow: disable fancy menus, etc.
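For example (the host name is made up): -a sets the color depth, -z enables RDP compression, and -x m selects the low-bandwidth "modem" experience, which turns off wallpaper and menu animations:

    rdesktop -a 8 -z -x m -g 1280x800 fastdesktop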

I think that a dual (or more) monitor set-up does beat the mobility of a laptop connecting to a remote desktop. I work at home at least two days a week using my laptop (a 17", 1920x1200 "desktop replacement"), connected to VS and TFS over VPN, and I find that experience worse than at work, where I have the 17" laptop screen AND a 24" TFT (also 1920x1200).
I have also found that running VS (or SQL Server Management Studio, for example) over an RDP session is just not like the real thing. It gets the job done, but the "feel" isn't quite the same.

Related

What's a good way to do testing a plug-in on multiple Windows and Outlook versions?

We're building a plug-in for Outlook that should work on multiple Windows versions (XP, Vista, 7) and also with different Outlook versions (2003, 2007, 2010).
The testing problem I am facing right now is that I can't figure out a good/convenient/thorough way to test the application on multiple Windows and Outlook versions.
At the moment I have VirtualBox running many virtual machines with different Windows and Outlook versions. So I have one virtual machine with Windows 7 testing Outlook 2010, another with Windows 7 testing Outlook 2007, one with Windows Vista and Outlook 2010, and so on through some of the possible combinations. It kind of gets the job done, although it is cumbersome and testing takes a long time.
Some of the testing is unit testing, but even that is rather tied to the machine I test on (Windows 7 with Outlook 2010). For example, I was using ManagementObject recently, which worked fine on my system (and thus passed the unit test for that method); however, using that object threw an exception on another person's system, which crashed the application.
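As an illustration of guarding such environment-dependent calls, the WMI query can be wrapped so that a failure degrades gracefully instead of crashing the add-in. This is only a minimal sketch under names of my own choosing, not the actual code in question:

    using System.Management; // requires a reference to System.Management.dll

    static class OsInfo
    {
        // Returns the OS caption, or null on machines where the WMI query fails.
        public static string TryGetOsCaption()
        {
            try
            {
                using (var searcher = new ManagementObjectSearcher(
                    "SELECT Caption FROM Win32_OperatingSystem"))
                {
                    foreach (ManagementObject os in searcher.Get())
                        return os["Caption"] as string;
                }
                return null;
            }
            catch (ManagementException)
            {
                // WMI behaviour varies between machines and OS versions;
                // fall back rather than letting the exception kill Outlook.
                return null;
            }
        }
    }

A unit test that only asserts on the happy path will still pass on one machine and mean nothing on another, which is exactly why the per-OS virtual machines remain necessary.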
I work on Visual Studio 2010 Ultimate.
The questions: Is there a more elegant way to make the testing process more streamlined and efficient? Are there any other testing methods you would recommend? How would you deal with this problem?
Thanks! Looking forward to your replies.
I've worked in similar situations (in my case, testing 20 different language versions of Windows). The basic idea is the same: each virtual machine maps to a specific condition that you need to test.
The cumbersome part comes from inappropriate tool usage. AFAIK, there are several ways to achieve automation (to a certain degree):
Use VMware ESX, which supports nice scripting. Unfortunately, it costs a small fortune.
Use some screen record/replay tools to automate what you are doing by hand right now. There are quite a few on the market and the price difference is huge. Only you can tell what fits you best.
VirtualBox can be reverted via the command line (see the sketch after this list). So if you can set up a bunch of machines hosting VirtualBox, each configured to revert its guest on reboot, and each guest image scripted to run your tests, then the remaining problem is rebooting those machines as you wish - an easier problem, since there are quite a few ways to reboot a machine remotely.
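A minimal sketch of that revert-and-relaunch step with VBoxManage (the VM, snapshot, and host names here are hypothetical):

    REM revert the guest to a known-clean snapshot, then boot it headless
    VBoxManage snapshot "Win7-Outlook2010" restore "clean-install"
    VBoxManage startvm "Win7-Outlook2010" --type headless

    REM reboot a remote Windows host so that its startup script re-runs the lines above
    shutdown /m \\vbox-host-01 /r /t 0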

Team Foundation Server 2010 - some setup questions relating to performance

We set up TFS 2010 on a pretty old server (actually an old desktop machine running Server 2003 - single-core, pre-Core 2 P4, so outdated...).
I'm finding that the first add and first get of a website with about 700 files is quite slow (over 20 minutes, admittedly over a VPN line).
Once past that, the check-in / check-out operations are reasonably OK.
One thing I haven't tried yet is having one of the guys at work make a change and then doing a get latest from home. We were running VSS until now, and that operation used to be a killer!
Anyway, a few questions:
1) We set it up as a basic installation on SQL Server 2008 Express. Would there be any performance gain with full SQL Server 2008?
2) We have the option of moving the drive to a better Core 2 machine that should be a lot faster - will that make any difference?
Or are we simply running into typical TFS slowness over a LAN (bearing in mind that we work mainly in the office as a team, but sometimes from home over VPN, when the speed issue seems to get worse)?
TFS in itself isn't slow. We run TFS on a dedicated VM, and even with the other VMs on the same server taking up ticks, our TFS is decently fast and reliable - even when checking code in and out, running reports, etc. So maybe the Core 2 machine would help, but your P4 shouldn't be that bad at running it. 700 files should be fairly quick - within a minute or so. I think it's your VPN that makes it slow. Everyone knows how slow VPNs can be.
Just an update: we moved to a faster machine (since we were getting one anyway). The speed of the machine made no difference.
The good news is, it's only slow the first time you add a project to TFS or pull it down from TFS.
Daily check-in / check-out is fine. Also, doing a get latest from home is much improved over VSS - it no longer does that horrible freeze while it spends 10 minutes figuring out whether files have changed.
It was worth the upgrade for this alone.

Moving from XP to Windows 7

This week I'm going to start the move from Windows XP to Windows 7 on my development PC at work. I've downloaded the Windows Easy Transfer app for going from XP to Win7; that should take care of My Documents. My concern is the development environment itself. In particular, I'm worried about re-establishing things like the Windows services that host my WCF services, etc. They use TCP and various ports. Plus there are the various ASP.NET apps on my machine. What caveats should I be aware of before I start?
I strongly recommend against migrating. If I were you, I'd back up those files, format the PC, reinstall everything and re-set up the websites. However much pain that causes, it's still less than the potential pain of the migration tool going wrong, which would eventually force you to do it the proper way anyway.

Is Win7 enough as a dev-machine operating system?

So far I have Windows Server 2003 as the operating system on my dev machine. Now my manager is asking whether I can switch to Windows 7 because of licensing costs.
Does anybody know any good reasons for/against doing this?
I've been using Windows XP on my dev machine at work without any problem - Linux would have been just as fine (or better) too.
I don't see why Windows 7 wouldn't be OK for a development machine, as long as you have all the drivers you need - you probably don't need a server environment to develop, do you?
Of course, I wouldn't say the same for a testing/staging server, which has to be as close as possible to the production one.
We don't know what you're developing, but I don't recall ever being hampered by not developing on a server-grade OS, apart from debugging OS-specific issues (which will always happen whatever you pick).
I've been running my dev environment on Windows 7 since the RTM hit. Developing with Visual Studio 2008, SQL Server 2008 and Visual Basic 6, and running VMware - all with no problems.
I'd say the best reason for doing it is that your manager has asked you to.
Seriously, if the company has decided that's the way it wants to go, in my experience you don't usually have a lot of choice.

Do you have performance problems when you work on Visual Studio projects via a network share?

We have tremendous problems with Visual Studio (2008, if that matters) locking up and slowing down when accessing projects over a network drive. It can take several minutes to open a large Web site project through a mapped drive, and saving even a single file can take a minute or more.
I fired up Wireshark and watched the traffic. VS, it seems, requests massive numbers of files from the network - there's an enormous amount of SMB traffic. I've done some research, and this traffic seems to stem from two situations:
VS has to have everything in its own process to provide IntelliSense.
VS needs to have all the source in order to compile the project.
All the advice I've read seems to boil down to the same thing: work locally, not on a remote machine, then push your code to an integration server via source control.
This would sure solve our problems (VS is quite fast working locally), but what if you can't work locally? What if the project and the infrastructure required to run it is too large and complicated to be replicated on everyone's individual machines?
We've gone 'round this problem a couple times, and the only way we can figure to work on these projects is direct access via a mapped drive. However, the VS slowness and lockups are really becoming a problem.
One solution: we installed VS on the server and work on the projects directly on the servers via RDP. Seriously.
So, I ask:
What does everyone else do? Do you work over the network, or do you replicate projects locally? If remotely, do you suffer from VS performance issues?
We work locally and use SVN to keep all our code on the server.
I find VS 2008 quite slow working locally sometimes, so I wouldn't fancy working over a network share.
Trying to compile over a network share is horribly slow in Visual Studio. Your start-up times will be bad while the IntelliSense database is regenerated, each compilation has to go over the network multiple times, and linking takes forever.
If you need the output of your compilation on the network, I'd recommend compiling locally and defining a post-build command to copy the results to your share.
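A sketch of such a post-build event using Visual Studio's standard build macros, $(TargetDir) and $(ProjectName) (the share path is made up; /I makes xcopy create the destination directory if needed):

    xcopy "$(TargetDir)*.*" "\\buildserver\drop\$(ProjectName)\" /E /Y /I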
If, as you say, you cannot pull everything locally, then I'd suggest your project is too big and needs to be broken up into more manageable chunks. For a multi-tier application, break it up by tier and invest in some form of continuous integration (e.g. CruiseControl) to automatically build the individual pieces. That way you can work locally on a particular piece and pull the pre-built portions from CI for the other pieces of the application.
I'm not terribly surprised that loading projects over a network share gives VS performance issues. VS (in any language) is constantly reading files in the project, and once that goes over a network you're at the mercy of the underlying connection. Any lag or access issue translates directly into VS struggling to load file contents.
I would advise copying the solution locally and using some form of source code control to sync the project on the share.
If the code is too complicated to install on everyone's machine, then don't put it on everyone's machine. Does everyone need to have everything in order to do productive work?
I have 79 projects in the solution I work with - several hundred thousand lines of code. I pull my source down from TFS every day and build it; it's a lot of code, but it's a far better solution than trying to work over a network share.
A more legitimate case for having the source code on a share is when you have a non-Windows host running a number of virtual Windows machines.
I have this exact situation: my desktop machine (the host) runs Debian, and I use VMware to run various virtual Windows machines (the guests), including one with Visual Studio installed so that I can target Windows OSes. Having the source code on a Samba share on the host machine has the following pros:
The source is not duplicated, so there is no way to confuse different copies while working on several virtual machines at the same time.
I have full control over the source from my preferred OS.
I can turn any of the virtual machines on and off, or roll back to a snapshot, without the risk of losing changes.
I can build (etc.) from the same source on several machines without having to commit changes before the source is fully tested (reason: I have to use Subversion <1.5).
The only problem with this setup is that Visual Studio (6,7,8,9) is painfully slow.
I have mounted the partition (on which the share lies) with "relatime", and this works insofar as disk activity on the share stays moderate, but Visual Studio keeps the (virtual) network card occupied all the time.
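For reference, the relevant mount configuration looks like this (device, filesystem, and mount point are made up for the sketch):

    # /etc/fstab entry for the partition backing the Samba share
    /dev/sda3  /srv/samba/src  ext3  defaults,relatime  0  2

or, applied on the fly: mount -o remount,relatime /srv/samba/src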
Any solutions to this would be much appreciated.
I encountered similar problems every time I worked (where "worked" means anything other than just copying/pasting files) over a network drive. The problem occurred with Zend Studio and Eclipse.
Why not use any kind of source control?
When working on Windows based projects I've always worked locally.
Once, at a Unix shop (AIX, IIRC), developers would work via NFS mount and check in/out via RCS...
I'm using VS2005 over a network share and not having any performance issues. However, it is a new server (Windows Server 2008). I don't have any other data points for VS, since using it at work is relatively new to me.
However, some data points from using NetBeans on a network share for previous projects... The local build time for my project was 2 minutes on Vista, on a fast dual-core 64-bit AMD machine. For the same project on a network share on a Server 2003 box, it was 20 minutes. Building that project locally on an ancient Tablet PC (1 GHz, single-core) running XP took around 5 minutes. Interestingly enough, the Tablet PC could build against the Server 2003 box in the same 5 minutes.
For those asking "why" about the network share: the share is automatically backed up, archived, etc. Also, that way I can very easily look at the same projects from multiple machines without having to worry about pushing back into the repository. Once you've had your dev stuff on a device you can reach from anywhere on anything, you'll never want to go back to local storage!
I have performance problems with anything over a network; networks just aren't good enough yet.
I thought it was common knowledge that disk speed is one of the major "slowness" factors when using VS on Windows. Most dev machines I've built have had projects located on 10k RPM RAID 0 drives, or at least a single 10k RPM drive. And even then it seems slow sometimes. Just the way it is, I suppose, until VS2009/VS2010 fixes it? :)
In my experience, the lag when working on a network share is 99% due to IntelliSense. Disable it and you'll see.
Disabling IntelliSense does indeed dramatically speed up saving and opening files through a UNC share:
http://blogs.msdn.com/saraford/archive/2007/12/03/did-you-know-how-to-turn-off-intellisense-by-default.aspx
But then again, as stated in other comments, you might as well use a good text editor.
I've also experienced the problems with performance mentioned above. It seems to vary from project to project, but I did find one way of speeding up performance significantly for some project types.
Following the advice in this article made a previously unusable project on a network location (it would take minutes to open one file) perform almost like a local project. The basic gist is that you need to grant FullTrust to the network location:
To grant permission to all your projects in your Visual Studio Projects folder located on the network, follow these 8 steps (a command-line equivalent is sketched after the article link below):
1. Open Microsoft .NET Framework 1.1 (or 2.0) Configuration, which you'll find under Administrative Tools in the Control Panel.
2. Expand Runtime Security Policy | Machine | Code Groups | All_Code | LocalIntranet_Zone.
3. In the right-hand pane, click Add a Child Code Group.
4. In the dialog that follows, choose Create a new code group and fill in a Name like Visual Studio Projects.
5. Optionally, provide a Description for the Code Group. (You'll see the description when you click a Code Group in the left tree, helping you identify the various Code Groups you may have.)
6. In the Condition Type drop-down, choose URL.
7. For the URL field, type something like this: file://YourServer/My Documents/Visual Studio Projects/*
8. Under Use existing permission set, choose FullTrust (that is, if you trust your own applications; if you don't, choose a different permission set or create a new one).
Not sure why this works, but it made a previously unusable .NET 2.0 project perform significantly better.
Original article: http://imar.spaanjaars.com/364/how-do-i-allow-my-visual-studio-net-projects-to-run-from-a-network-location
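The same code group can also be created from the command line with caspol.exe, which ships with the .NET Framework. This is my own sketch of the equivalent, not from the article, reusing the example's server name and path:

    caspol -machine -addgroup LocalIntranet_Zone -url "file://YourServer/My Documents/Visual Studio Projects/*" FullTrust -name "Visual Studio Projects"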
I was having the same problem. I have a local copy of our build system, which expects certain drive letters, and was also experiencing slowness.
I have solved the problem by adding the following registry keys:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices]
"R:"="\DosDevices\D:\devel\build
"S:"="\DosDevices\D:\devel\src"
Note that the double '\'s above are part of the .reg file format. When using regedit use single '\' throughout.
My build times were divided by 3. :)
I found the info in the Wikipedia article on the SUBST command.
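For a quick, non-persistent test before touching the registry, the plain SUBST command creates the same mappings (they disappear at reboot):

    subst R: D:\devel\build
    subst S: D:\devel\src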
