Slowdown of Microsoft Visual Studio due to different virus scanners - visual-studio

What is the least slow virus scanner to use with Microsoft Visual Studio?
I have just had Microsoft Visual Studio “go slow” on me again due to my virus checker… (Hundreds of projects, some with over 100 C# files in them, so anything that slows down builds is bad.)
We all know that development tools do a lot of file access, so they are badly affected by virus scanners. Most of us have to run a virus scanner for any number of reasons.
So has anyone measured the effect of different virus scanners (and settings) on the speed of Microsoft Visual Studio?
Has anyone tried Microsoft Security Essentials with Visual Studio?
See also (if you have the rep, please extend the list)
What Really Slows Windows Down (This has some real data)
Visual Studio and Virus Scan of Temp folder
Visual Studio Optimizations
Development machines and anti-virus policy (Sophos Anti-Virus)
Antivirus (Symantec Endpoint) configuration for developer machine
Least intrusive antivirus software for development PC?
Suggestions for a productive hardware setup with excellent virus protection.
Choosing Anti-Anti-Virus Software (Coding Horror)
What are people's experiences with Visual Studio 2010 and virus checkers?
I got this as part of a helpful email from someone (who will remain nameless) at Microsoft, speaking on his own behalf.
It’s not clear that we (Microsoft) would be able to endorse 3rd party products. With that in mind, I did notice that in the posting Ian linked to (this question) that Computer Associates was listed by someone as one of the best performing virus checkers for development environments, which interestingly enough is a product that I believe many Microsoft developers use on their desktops.
Since asking this question, I have had the fewest problems with Microsoft Security Essentials; however, I have no facts or measurements to back this up.

I haven't really done any measurements, but what I usually do is exclude my development folder (usually my :\Projects folder) from real-time scanning.
That way, the compiler can work as fast as possible during my everyday repetitive tasks.
I do have a daily scan that includes the folder in question in its path, in order to catch any possible threats.
On a subjective note, I prefer to use NOD32.

Based on previous installations at various jobs, empirically rated from slowest (very annoying) to quickest (almost no impact):
Symantec (awful)
McAfee
AVG
ESET
Computer Associates (excellent)
I wouldn't bother with the speed tests, etc. shown at the AV review sites since most of these are in controlled environments, often with review-mode enabled. The impact will also vary depending on your network environment (workgroup or domain) and administrator-enforced policies.
Disclosure: I used to work on another, now-obsolete anti-virus package back in the '90s.

I'd have to agree with the first answer.
I've seen such issues differ between jobs according to the ferocity of the admins' insistence on leaving configs unchanged for devs. Correctly set-up virus scanners still hinder development, but at least it's bearable.
So I edit the scan lists to:
Exclude all dev code directories
Exclude the Temporary ASP.NET Files folders
Exclude ReSharper caches
I find this reduces the disk thrashing that otherwise occurs with Visual Studio, ReSharper and a virus scanner all hammering the drive. As always, Sysinternals' FileMon (now superseded by Process Monitor) can help you target rogue services/processes. A small helper for assembling these exclusion paths is sketched below.
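To make those exclusions less error-prone, here is a minimal C# sketch that assembles and sanity-checks the usual candidate folders before you paste them into your scanner's exclusion list. The C:\Projects root and the ReSharper cache location are assumptions; adjust them for your setup and ReSharper version.

    // Prints the folders a developer typically wants excluded from real-time
    // scanning and whether they exist on this machine.
    using System;
    using System.IO;

    class ScanExclusions
    {
        static void Main()
        {
            string localAppData = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
            string windows = Environment.GetFolderPath(Environment.SpecialFolder.Windows);

            string[] candidates =
            {
                @"C:\Projects",                                              // assumed dev code root
                Path.Combine(windows, @"Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files"),
                Path.Combine(localAppData, @"JetBrains\ReSharper")           // assumed ReSharper cache root
            };

            foreach (string folder in candidates)
                Console.WriteLine("{0} {1}", Directory.Exists(folder) ? "[exists] " : "[missing]", folder);
        }
    }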

We have Trend Micro antivirus at work, and it's terrible. It seems particularly bad at doing checkouts.
We commissioned a new build machine recently, and because the IS team hadn't set up exclusions for the build drives, it was taking 45 minutes to check out source code from TFS. With the AV turned off, the exact same source code took about 1 minute 30 seconds to check out.
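If you want to put your own numbers on this kind of impact, a rough sketch is to time the same file-heavy operation (a checkout, a rebuild, or simply copying the source tree, as below) with and without the exclusions in place. The paths here are placeholders, and the copy is only a stand-in for whatever disk-heavy step you care about.

    // Times a file-heavy operation (copying a source tree) so runs with and
    // without AV exclusions can be compared. Paths are placeholders.
    using System;
    using System.Diagnostics;
    using System.IO;

    class AvImpactTimer
    {
        static void Main()
        {
            string source = @"C:\Projects\BigSolution";   // assumed source tree
            string target = @"C:\Temp\AvBenchmarkCopy";   // scratch location

            var stopwatch = Stopwatch.StartNew();
            CopyTree(source, target);
            stopwatch.Stop();

            Console.WriteLine("Copied tree in {0:F1} seconds", stopwatch.Elapsed.TotalSeconds);
        }

        static void CopyTree(string from, string to)
        {
            Directory.CreateDirectory(to);
            foreach (string file in Directory.GetFiles(from))
                File.Copy(file, Path.Combine(to, Path.GetFileName(file)), true);
            foreach (string dir in Directory.GetDirectories(from))
                CopyTree(dir, Path.Combine(to, Path.GetFileName(dir)));
        }
    }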

I also don't have measurements, but some experiences:
Don't use McAfee: we had serious performance problems (and other, more serious ones) on a number of installations with it.
Use Avira AntiVir: reportedly the highest success rates, and no noticeable delay. I have used it for years.

I would comment on the answers from @MagnusJohannsson or @Rodrigo but don't have enough reputation. Just to agree, really, and +1 for both.
I have NOD32 4.x on two very similar machines: 2nd-gen Intel SSDs, plenty of RAM, overclocked dual/quad cores, clean installs of Windows 7, and VS 2010.
I have used NOD32 for years on many different boxes and many different builds without any problems, but I had a horrible issue on one of the machines after a hardware upgrade and OS reinstall, where ekrn.exe (NOD32's service) would go nuts and eat up all the CPU, leaving me having to physically shut down the box.
After lots of to-and-fro with ESET support, it was decided it was due to Visual Studio's file access looking suspicious / being too quick. In the end I excluded my project folders, and it has been fine since. Interestingly, it was the project folder for a solution I was not using at the time, so maybe a TFS thing?
Anyhow, this link is a simple guide for anyone having the same problem with NOD32's ekrn.exe eating CPU:
Excluding files or folders from real time scans

Having Fusion assembly binding logging enabled in combination with a virus scanner can result in performance problems during startup of an application. Either disable the Fusion logging or add the folder that it logs to as an exclusion in your virus scanner.
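If you are not sure whether Fusion logging is actually switched on (or where it writes), a small C# sketch like the one below reads the relevant registry values; the normal way to toggle them is through fuslogvw.exe rather than by hand.

    // Reads the Fusion binding-log settings. A non-zero EnableLog/ForceLog means
    // logging is on; LogPath is the folder to disable or exclude from scanning.
    using System;
    using Microsoft.Win32;

    class FusionLogCheck
    {
        static void Main()
        {
            using (RegistryKey fusion = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\Microsoft\Fusion"))
            {
                if (fusion == null)
                {
                    Console.WriteLine("No Fusion key found; logging is not configured.");
                    return;
                }

                Console.WriteLine("EnableLog = {0}", fusion.GetValue("EnableLog", 0));
                Console.WriteLine("ForceLog  = {0}", fusion.GetValue("ForceLog", 0));
                Console.WriteLine("LogPath   = {0}", fusion.GetValue("LogPath", "(not set)"));
            }
        }
    }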

You really need to weigh the capabilities and support of the antivirus program against the slowness. In my case, I've used several different ones, and the best choice was Avast. The Home edition is free, and they are one of the best about updating their virus definitions as new threats appear.

Don't use Kaspersky (the old Tech Review one); it slows down normal Explorer folder opening by almost 10 seconds (yes, you need to wait 10 seconds before opening each folder). And yes, it affects Visual Studio. The new version does not seem to have the problem. NOD32 does not seem to have this problem, and is a bit faster than Kaspersky (I don't even know if it scans as much as Kaspersky does).
But for whatever reason, the NOD32 firewall is bad!

Exclude your project folders and the Visual Studio application folder from real-time scanning, and schedule a scan as often as you need to feel safe.

Well, to be honest, my work machine doesn't have a virus scanner installed, and for almost 2 years I've never had a problem with viruses, because I'm constantly behind corporate web patrol, and other things keep me pretty safe.
At home, though, I use NOD32, and on 3 different machines all using Visual Studio, I've never noticed any slowdowns. I apologize for not having any benchmarks to measure, just wanted to throw out my "answer."


Performance tips for making Visual Studio 2010 faster? [closed]

I don't know if anybody else has had an issue with the performance of Visual Studio 2010, but I close it daily and reopen it, and within an hour it starts to really bog down and can't even keep up with my typing. Is there some obvious setting I am missing that would help to speed it up?
I am also using ReSharper, but even if I remove that, it only marginally increases the speed.
Since a couple of people have asked for my machine specs:
Intel Q9550 @ 2.83 GHz
4 Cores
8GB Physical RAM
2x 60GB SSD in RAID0 combination for solution/project
VS2010 RTM Ultimate
Windows Server 2008 x64 R2 (Performance set for Applications)
Although it is sad to hear that the answer is "buy faster hardware" when my hardware is actually pretty good.
EDIT: Including a link to the Visual Studio Performance Diagnostics tool suggested by TimothyP
I recommend you consider installing some hotfixes from http://connect.microsoft.com/VisualStudio/Downloads. I also had heavy performance problems before and saw messages like "insufficient memory" during cut & paste operations. This problem and some others (including various memory-leak problems) are already fixed. After installing some hotfixes from that site, the performance of Visual Studio on my computer is much better.
Adding my own answer here. I really didn't think ReSharper did that much. After trying every option and only loading one project, it was still lagging even when simply joining two lines together (a couple of backspace presses).
I uninstalled ReSharper and along with all of the other tweaks I have done, the thing is blazingly fast.
One note re RAID 0 SSDs: you should make sure your RAID controller (and driver!) supports TRIM for RAIDed SSDs. Most RAID controllers - especially the Intel chipset controllers - do NOT do this. The consequence is that I/O performance will quickly degrade very significantly in SSD-based RAID arrays.
Windows 7 and Windows 2008R2 support the TRIM command - when your controller and driver implement it.
There is lots on this subject at sites like tomshardware.com or anandtech.com. If all else fails, you might consider either using the SSDs in a non-RAID configuration, or using an SSD for the OS and a normal, large HD for databases etc. You could also look up your specific SSDs on one of the aforementioned sites; performance varies much more widely than you might think.
Try turning off IntelliTrace. I've had numerous problems related to slowness and instability because of this feature (it could just be me). The setting is under Tools > Options > IntelliTrace > Enable IntelliTrace.
Of course, RAM is always very important for a large development environment like Visual Studio, especially the 2010 version and especially if you're using the Ultimate edition which includes such fairly memory intensive features as IntelliTrace and the Architecture and Modelling Diagrams.
However, one of the main things that is often overlooked, but can make a big difference to the overall performance of Visual Studio, is Hard-Drive Speed.
Scott Guthrie (Microsoft's Corporate Vice President of the .NET Developer Platform) wrote a very interesting article about this exact subject.
It's a few years old, and was written around the time of Visual Studio 2005, however, it is still very relevant today since the way that Visual Studio continues to work (specifically the way the compilers work) has not changed so much over that time.
Scott writes:
People often ask me at conferences for PC hardware recommendations. Specifically - "what type of machine do you recommend I get for doing development with Visual Studio?" and/or "your laptop seems really fast, what type is it?"
Some of my recommendations on this topic are fairly standard and obvious: ideally you want to get a dual core or better CPU. I also always recommend getting at least 2GB or more of RAM.
The recommendation I make that often seems to take people a little by surprise is to make sure you always get the fastest possible hard-drive when buying a new machine - and where necessary trade off purchasing additional CPU processor speed in favor of investing in a faster disk instead.
Also:
Why does hard drive speed matter?
Multi-core CPUs on machines have gotten fast enough over the past few years that in most common application scenarios you don't usually end up blocking on available processor capacity in your machine.
What you are much more likely to block on is the Seek and I/O speed capacity with which your computer accesses your hard drive. If you are using an application that needs to read/write a lot of files, it is not atypical for your CPU processor utilization to be really low - since the application might be spending most of its time just waiting for the disk operations to complete.
When you are doing development with Visual Studio you end up reading/writing a lot of files, and spend a large amount of time doing disk I/O activity. Large projects and solutions might have hundreds (or thousands) of source files (including images, css, pages, user controls, etc). When you open a project Visual Studio needs to read and parse all source files in it so as to provide intellisense. When you are enlisted in source control and check out a file you are updating files and timestamps on disk. When you do a compilation of a solution, Visual Studio will check for updated assemblies from multiple disk path locations, write out multiple new assemblies to disk when the compilation is done, as well as persist .pdb debugger symbol files on disk with them (all as separate file save operations). When you attach a debugger to a process (the default behavior when you press F5 to run an application), Visual Studio then needs to search and load the debugger symbols of all assemblies and DLLs for the application so as to setup breakpoints.
If you have a slow hard-drive, Visual Studio will end up being blocked as it waits for it to complete these read/write operations - which can really slow down your overall development experience.
You can read the full article here:
Tip/Trick: Hard Drive Speed and Visual Studio Performance
Do you have the Desktop Experience component enabled in your Server 2008 R2 install? Unlike prior versions, Visual Studio 2010 makes heavy use of WPF and its performance benefits greatly from hardware acceleration. Enabling Desktop Experience will enable the Desktop Window Manager, which improves overall WPF performance.
For the same reason, you should ensure you are using the newest video drivers available.
If you have many projects building together in your solution, I recommend setting the ones you are not currently working on to not build (in the solution's Configuration Manager). This is what I do to speed up mine. The difference is most evident in compile time...
With that kind of hardware, IMHO you shouldn't have any kind of trouble with performance almost no matter what you do. (2 x SSD in RAID-0? -- you're a maniac!!)
It looks like you've already solved this problem (is there anything specific that you did that you could share?), but another thing to check is to make sure your video drivers are up to date. It's surprising, but they can affect a lot of things you wouldn't suspect.
I suppose another culprit could be a hyperactive anti-virus package, too...
I love the suggestion of upgrades when clearly the machine is blazing. My suggestion, if possible, would be to try out 2008 and see how it runs. I had several problems myself with 2010, not least of which was the performance issue. For the sake of productivity I switched back to 2008.
If you can confirm that the problem does or doesn't occur on an older version we can have more of an idea where the problem lies.
The brand/controller of your SSDs is more important than the fact that they are SSDs. Don't buy a cheap/budget SSD - you'd be better off with a good platter drive. Splurge on high-end SSDs and you'll experience major gains.
If you're editing XAML, you can just use the source code editor instead of the XAML editor. The performance difference is phenomenal:
http://msdn.microsoft.com/en-us/library/bb907321(v=vs.90).aspx
It's worth noting that if you open the XAML editor at any point, you'll need to restart Visual Studio to get the performance back to normal. It's not sufficient to close the XAML editor.
The only way to get SSDs in RAID 0 with TRIM support is with the new OCZ Revo 3 (1.5 GB/s read / 1.25 GB/s write). A more affordable and stable alternative to RAIDed SSDs with no TRIM is to buy a single SSD on SATA 3 (550 MB/s).
As Matt mentioned, you might want to add more RAM to your machine, but if it really "bogs down" every time you leave it open for an hour, you might want to get in contact with the VS team (http://connect.microsoft.com), file a bug report and run the performance diagnostics tools they will send you. Those guys have really helped me a lot in the past and I'm sure they'd be willing to help you track down the real reason behind the slowdowns.
Aside from that, I can tell you that my main development machine has 8 Xeon cores and 12GB of RAM. On that machine large solutions compile in just a few seconds, while they can take up to a minute to compile on my dual-core MacBook. But since the RTM version I have not experienced any slowdowns like the ones you are describing.
I'm assuming you are using the RTM version here; can you give us the specs of your machine? Hardware + software? It's very hard to help you based on limited information.
I'm assuming you're using C#, but if you're using C++, maybe you could try turning off IntelliSense? I thought it was supposed to be better in 2010, but the previous versions always got a speed boost when I hacked out IntelliSense.
If it's the same as 2005, you can disable IntelliSense by renaming feacp.dll in [vs root dir]\vc\vcpackages.
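For completeness, here is a hedged sketch of that rename. The vcpackages path shown is an assumption for a default VS 2008 install, so adjust it to your own [vs root dir], keep the backup, and run it elevated.

    // Renames feacp.dll (the native C++ IntelliSense parser) so VS stops loading it.
    // Reverse by renaming the .bak file back. The path is an assumption; run as administrator.
    using System;
    using System.IO;

    class DisableCppIntelliSense
    {
        static void Main()
        {
            string vcPackages = @"C:\Program Files\Microsoft Visual Studio 9.0\VC\vcpackages"; // assumed
            string feacp = Path.Combine(vcPackages, "feacp.dll");

            if (File.Exists(feacp))
            {
                File.Move(feacp, feacp + ".bak");
                Console.WriteLine("Renamed {0} to {0}.bak", feacp);
            }
            else
            {
                Console.WriteLine("feacp.dll not found in {0}", vcPackages);
            }
        }
    }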

VS 2010 very slow

I have just upgraded to VS 2010, and I have performance problems which I did not have before (in VS 2008).
The most annoying thing is that it freezes while I work in the text editor. Sometimes when it freezes I see that it is saving auto recovery information, but not always.
Almost anything I do gives an unacceptably long delay, like saving, starting to debug, ending a debug session, switching between design and code view, and doing WinForms designing.
I have some parts of my home directory on a mapped network drive. I suspect that that might be a part of the problem. Is it possible to configure VS 2010 to use exclusively local disk for its "internal" work perhaps?
Any hints would be appreciated! Has anyone else experienced these kinds of problems?
Edit:
I forgot to give my specs:
Win 7 64-bit
4 GB memory
No addins, just standard installation
The project folder is on the network drive
One interesting thing is that I feel that I have better performance in a VM running XP (where the VM runs on the same PC).
VS is great if you do what Microsoft recommends and work on a local copy of your projects.
As soon as you start trying to open projects in remote locations, you will get this issue.
Recommendations:
Use a source control solution.
Create a copy of your project locally and run the solution from that.
Also ...
I think it does its clever stuff in the background; I found the more I use it, the faster it gets, especially on long-running projects that I regularly go back to.
If you think it might be the aforementioned WPF framework, you may want to try switching off Aero (as a test). If that helps, the problem is likely that your chosen graphics hardware is not very good at effects or 3D-based output, so it's struggling.
Also try reducing the number of background services and apps you have running.
On Windows 7 these days 4 GB of RAM is considered standard, so whilst it should perform fine, maybe consider putting more RAM in if you are trying to handle large datasets or similar business applications.
Another thing you could try is running a repair install over the top of your existing installation; it may not have installed something cleanly... unlikely, but it may help.
If you can, buy an SSD disk and move all your projects locally.
I find VS2010 super intensive on disk.
It flies on my home machine with an SSD, but it's almost unusable on my work machine (Win 7, 4 GB RAM, but a standard disk).
Try setting the number of parallel builds to half the number of cores you have (I think it's under Tools > Options > Projects and Solutions > Build and Run). I had it set to 8, which was too much: it spawned 8 msbuild.exe processes, and rebuilding a solution with 70 projects bottlenecked the disk when they all tried to read/write similar pre-compiled headers. Those msbuild.exe processes stick around even after you close the IDE.
Also, I disabled "gather browsing info for implicit files", which made IntelliSense parsing quicker.
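The same "half the cores" idea can be applied when building outside the IDE. Below is a rough sketch using the MSBuild 4.0 API (reference Microsoft.Build.dll); the solution path is a placeholder and the target is just an example.

    // Rebuilds a solution with parallelism capped at half the logical cores,
    // mirroring the "maximum number of parallel project builds" IDE setting.
    using System;
    using System.Collections.Generic;
    using Microsoft.Build.Execution;

    class CappedParallelBuild
    {
        static void Main()
        {
            int parallelProjects = Math.Max(1, Environment.ProcessorCount / 2);

            var parameters = new BuildParameters { MaxNodeCount = parallelProjects };
            var request = new BuildRequestData(
                @"C:\Projects\BigSolution.sln",      // assumed solution path
                new Dictionary<string, string>(),    // no extra global properties
                null,                                // default tools version
                new[] { "Rebuild" },                 // target to build
                null);

            BuildResult result = BuildManager.DefaultBuildManager.Build(parameters, request);
            Console.WriteLine("Build finished: {0} ({1} parallel nodes)",
                              result.OverallResult, parallelProjects);
        }
    }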
An old post I know, but in case it helps others (as the previous answers focused on source code)...
I found that it wasn't my source code that was the issue (that was held locally along with all the references), but the default locations (projects, project templates and item templates), as these were held on a network drive. These can be altered under Tools -> Options -> Projects and Solutions.
Alternatively you could change the frequency of the saves or turn them off altogether via Tools -> Options -> AutoRecover

Can I improve Visual Studio performance through virtualization?

It stands to reason that Visual Studio (.NET compiling and the IDE) would run better on a $5000 server than a $500 desktop.
Does anyone have experience running Visual Studio in a virtual machine hosted on a server in this price range, with access via RDP? (Assume modern hardware available for the stated prices.)
Obviously, there will be other VMs on that server, but not everyone will be doing intensive tasks such as compiling at the same time, etc. As a starting point, you can assume 4GB of memory and 4 virtual CPUs are allocated to the VM, but feel free to offer other configuration suggestions.
Any insights? How did it work out? I am looking for practical ways to maximize the speed of the compile/run cycle and general IDE performance.
(I'm on the fence as to whether this belongs on Stack Overflow or Server Fault. Since it has to do with Visual Studio and might be of general interest to programmers trying to improve the development experience, I decided to post it here. Please move it if this is not okay.)
If you have a decent multi-core processor on your desktop machine, it's probably the disk that is the bottleneck. When compiling, VS must access many files (in large, multi-project solutions). So I am assuming that the CPU is not the problem.
What you can do:
reorganize your projects - if you use Copy Local then DLLs are copied to multiple places (the Visual Studio project reference + Copy Local = true option is evil!)
buy additional RAM, set up a RAM disk, and do your compilation there (beware that if you restart your machine you'll lose the RAM disk content - this can be mitigated with a stable OS, a version control system, ...). See Speeding up build times dramatically or Speeding up the build – ditch the SSD and go for the RAM drive
buy an SSD disk to do compilation on
It should work out for you; it wouldn't be as good as running it locally on a better machine--but it sounds like it could still be an improvement.
The version of Visual Studio is another large factor, VS2008 has significant performance gains over VS2005.
C# development is also typically less resource-intensive than VB.NET development, since VB.NET runs a background compiler to provide near-real-time feedback about code errors.
And finally, make sure to disable any unused plug-ins/add-ins that might be slowing you down.

Visual Studio performance and add-ins

Do the useful Visual Studio add-ins (ReSharper, StyleCop, etc.) speed up your work? Or do the tools need too many resources, so you have to wait until each add-in completes execution?
[Update]: By the way, has anybody noticed whether the performance of the IDE + ReSharper is better for solutions that contain web sites or web applications?
I can say very strongly that ReSharper definitely speeds up my productivity greatly. Past versions of ReSharper had some bad performance issues with the IDE, but I have had no issues with the most recent version.
I use some add-ins as long as they don't affect the performance of Visual Studio. To that end, tools like StyleCop, MZ-Tools, and Visual Studio Commands are the clear winners.
The problems I have with tools like Refactor! and ReSharper are that:
They degrade performance, particularly for large solutions.
You become dependent on the shortcut keys, etc. they provide, and become completely useless when working in another environment that doesn't have them installed.
Yes, tools like Refactor! and Resharper are excellent for what they do and can increase your typing productivity but I don't think the gain is worth the dependence. This, of course, depends largely on how you use them. For things like refactoring method parameters, changing fields into properties, etc. they can be very useful and potentially save a lot of time. Again, while it can save a lot of time it is still important to know what these tools are actually doing for you so you can still be productive without them.
ReSharper definitely puts a demand on hardware resources, particularly when using site wide analysis on a large project. Having said that, the extent of the performance hit is highly dependent on the host machine. On my work laptop (32 bit XP, 3Gb RAM, 7200 RPM HDD, 2.2 GHz dual core) it suffers but on my home PC (64 bit Win 7, 8Gb RAM, 7200 RPM HDD, 2.9 GHz quad core) it flies and I barely notice the performance hit. That said, I still couldn’t live without it even on the lower specced hardware. The productivity gain still outweighs the downtime in waiting for slower processes.
I use Refactor! all the time. Just the time it saves me encapsulating private variables into properties is worth it, in my opinion.
That being said... a lot of the "benefits" of these programs are negated if you program correctly to begin with.
For example, if you already habitually use "WITH" statements properly, you probably do not need something to clean up your style.
However in corporate America (and elsewhere I am sure), coding practices are not always followed by everyone, and rework and modifications are always coming in, so usually you will end up needing them eventually.
I personally have not experienced any noticeable difference in performance with these type of tools.
I have ReSharper, ReSharper Scout and Team Explorer + TFS Power Tools. My Visual Studio definitely feels a little sluggish compared to barebones, but if you want super speed over features, why not work in Notepad? For me, ReSharper is definitely worth the viscosity.

Tips to upgrade workstations for development team?

I have secured the budget to upgrade the individual workstations and laptops. While newer, bigger screens were welcomed with enthusiasm, the thought of re-installing tools and settings caused most of them to blanch, and I got one "Do I really have to?"
How much downtime do you usually have when you move to a new machine?
Do you employ tools or scripts to set up your dev environment, tools, DBs, debuggers, etc., specifically for a Windows environment?
Is there a standard image that you keep and then let devs move in and tweak the machine as necessary?
My company essentially virtualized in order to stop wasting so much time with upgrades/system failures.
Whenever a desktop/laptop failed, we'd have to spend a better part of a day fixing it and reloading the software.
So, we went out, bought iMacs for everyone and loaded Parallels (a VMware-like product for OS X) on them. Then we made a standard dev image for everyone, and just copied it to everyone's machines.
Essentially, if anyone's configuration got messed up, we just loaded in a fresh image and kept on truckin'. Saved a lot of time.
Some additional benefits:
When new software is out, we just make a new image and distribute it. No OS re-installs or anything like that.
If hardware changes, it doesn't matter, just move the image.
You can run multiple OSes concurrently for testing.
You can take "snapshots" of your current image and revert if you really messed something up.
Multiple builds on the same machine... since you can run multiple OSes.
Surprisingly the overhead of a virtualized system is quite low.
We only run the software on a real machine for performance tuning/testing purposes.
One day is generally enough for upgrades. I do keep digital copies of VS.NET, which are much easier to install.
When it comes to other tools, it's generally just better to go to the websites and install the latest versions.
Also, it's a good idea to install tools whenever you need them, instead of trying to install everything at the same time.
The last time I upgraded to a new machine, I think it took about 4 hours to get most of the necessary tools reinstalled. Over time, I've had to re-install quite a few more tools, but I think it's worth it.
If you can get a ghost/image of the default tool set (Visual Studio 2003-2008, Eclipse, NetBeans, or whatever you're using), and all the major service packs, that would help a lot with the initial setup.
I think the downtime is definitely worth it, a new, faster machine will make anyone more productive.
You can have 0 downtime by having both machines available. You will not have as much productivity.
This depends on the number of tools needed by the development team. Tools such as Rational Software Architect can take hours to install on their own. The exercise of having the developers list the applications they need before moving can help you optimize your deployment strategy. Both machines should be available for a fixed period of time, and having them available allows developers to both work and kick off long-running installs at the same time.
Creating a standard image based on the list provided to you can improve efficiency. Having the relevant software on a share could also let them cherry-pick as needed and give the development team the feeling that they can go back as necessary.
Tools to assist in capturing user settings exist. I have only ever had experience with Doctor Mover. If you have 100 or more developers to move, it may be worth the cost. I can't complain too much, but it wasn't perfect.
I have never had a problem with just getting a list of all the software a particular user uses. In fact, I have never found the base install to be much of an issue. The parts I tend to spend the most time on are re-configuring all of the users' custom settings (very common with developers, I find). This is where it is very valuable to have the old machine around for a while, so that the user can at a minimum remote-desktop to it and see how they have things set up.
Depending on how your team works, I would highly recommend having every user receiving a new computer get the latest source tree from your source control repository rather than by copying entire directories. And, I would also recommend doing that before actually sending the old workstation elsewhere or even disconnecting it.
One of the great things about tools like CVS and SVN is that it is quite easy for developers to end up with an unofficial "personal branch" from things that are not properly checked in, merged, etc.
While it will cost time to deal with the shift if things are not properly synchronized, it is an invaluable opportunity to catch those things before they come back to haunt you later.
