Build impact from antivirus even with folder exclusions - Windows

We recently updated one of our Windows nodes for Jenkins (from Windows Server 2012 to Windows Server 2019), and the build times for all projects pretty much doubled. It got better when we excluded the work areas on that node. We used McAfee; now we are trying things with Windows Defender. Still, our build times are close to double what they used to be, on an otherwise identical system (apart from the antivirus and OS version). Does anyone have an idea? I only find articles where it seemed to be enough to exclude certain folders. Are we missing folders?
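For illustration, here is a minimal sketch (PowerShell, run elevated) of setting both folder and process exclusions. The paths and process names below are placeholders, not the actual setup from the question; substitute whatever the node's workspaces and toolchain really use:

# Placeholder paths/processes -- adjust to the actual Jenkins workspace and build tools
Add-MpPreference -ExclusionPath "D:\Jenkins\workspace", "C:\Program Files (x86)\Jenkins"
Add-MpPreference -ExclusionProcess "java.exe", "MSBuild.exe", "cl.exe", "link.exe"

# Verify what Defender currently excludes
Get-MpPreference | Select-Object ExclusionPath, ExclusionProcess

Process exclusions cover files those processes touch outside the excluded folders (temp directories, tool caches), which folder exclusions alone don't, so they are worth testing alongside the folder list.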

Related

How Can I Find The Current Windows Defender Executable Location? And Why Are There Many?

Microsoft has multiple versions of the Defender executable (MpCmdRun.exe) installed on my computer. There is an obvious one in "C:\Program Files\Windows Defender\MpCmdRun.exe" but then two others in "C:\ProgramData\Microsoft\Windows Defender\Platform\4.18.2010.7-0\MpCmdRun.exe" and "C:\ProgramData\Microsoft\Windows Defender\Platform\4.18.2011.6-0\MpCmdRun.exe". The folders all have different versions of MpCmdRun.exe.
Per Microsoft, the latest version is the 4.18.2011.6-0 version, but how would I know this if I hadn't researched? And if I encode some dependency on this location (see below), how would I know when it's been superseded?
My goal is to create a custom scheduled task for Defender that runs full scans rather than quick scans. I tried whacking on the existing Windows Defender task definitions (in Task Scheduler -> Task Scheduler Library -> Microsoft -> Windows -> Windows Defender), but the tasks periodically modify themselves (after updates, etc.) and my changes are lost. I can readily create my own custom task, but I have to know the location of MpCmdRun.exe which, as I pointed out above, seems to move around.
Does anyone know of a reliable way to determine what the location of the latest Defender executable is, preferably easy enough to use in a command line?
Also, anyone have any clues about why Microsoft did it this way? Why not just keep the latest version in "C:\Program Files\Windows Defender"? And why leave old versions lying around?
Slow down.
I found the instructions in 30 seconds.
https://learn.microsoft.com/en-us/windows/security/threat-protection/microsoft-defender-antivirus/scheduled-catch-up-scans-microsoft-defender-antivirus
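For what it's worth, one way to resolve the newest Platform folder from PowerShell and point a custom full-scan task at it. This is a sketch only, and the versioned path can still go stale after the next platform update:

# Pick the highest-versioned Platform folder (folder names look like 4.18.2011.6-0)
$platform = 'C:\ProgramData\Microsoft\Windows Defender\Platform'
$latest = Get-ChildItem $platform -Directory |
    Sort-Object { [version]($_.Name -replace '-.*$', '') } |
    Select-Object -Last 1
$mpcmdrun = Join-Path $latest.FullName 'MpCmdRun.exe'

# Run a full scan directly (-ScanType 2 = full, 1 = quick)
& $mpcmdrun -Scan -ScanType 2

# Or register a simple weekly task pointing at it; the embedded path may need
# re-registering after a platform update
$action  = New-ScheduledTaskAction -Execute $mpcmdrun -Argument '-Scan -ScanType 2'
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 3am
Register-ScheduledTask -TaskName 'Defender Full Scan' -Action $action -Trigger $trigger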

Visual Studio 2013 is idling at 40 percent CPU usage after upgrade to "Update 3"

I'm running Visual Studio 2013 that I use for rather basic MVC-5 web dev on a fairly ok Windows 8.1 machine and never encountered any real performance problems.
I installed Update 3 yesterday. Since then I can barely use the IDE anymore as it is idling at around 40% CPU usage and takes around 5 seconds to register that I actually typed something.
When I reboot my computer it might work for around a minute or so, but then it starts maxing out my CPU usage again without any special cause.
Is anyone aware of a problem with Update 3 that could cause this? How can I debug what's going on?
I've just had this happen to me -- CPU at 30% constantly. Restarted and closed the solution to no avail. It was the massive Git commit that was in the pipeline.
Update: I started using SourceTree; the processing still happens, but now I don't have to wait as long for the Git sync to occur.
I figured out my problem, and it's to do with how we have our solution set up.
We have a very large solution, and one of the projects is actually a "Web site" project. We have this because we wanted a project in our solution that is used purely to edit HTML, CSS, and JS files, and we use grunt to build these files into our main web app project. A "web site" project is really the only way we could get VS to just list files to edit without building a DLL for that project. To do that we had to manually add a website project to the .sln file.
This has worked for us in the past but now with Update 3 it's killing the CPU even when VS is just idling. I removed this website proj from our sln file and the CPU is back to normal.
This is probably an edge-case scenario, but if you have a website project in your .sln file along with other projects like class libraries and web apps, then this is probably your CPU issue. I haven't tested whether just having a normal website project open also has the same CPU issues (it might actually just be a problem with websites, not sure).
Update:
I opened just the folder structure we have for these HTML/CSS/JS files as a standalone website in VS, and CPU was through the roof even when idle. So I think the problem is website projects in VS themselves.

Hosting Visual Studio projects in Dropbox

I develop on both my desktop and laptop, and I frequently switch between them. Are there any problems that could arise from keeping a project folder in my Dropbox and always accessing/editing from there? I'm running VS2010 on both, but W7 on one and W8 on the other.
I use it often, but I do experience some issues. It seems that sometimes VS and Dropbox conflict. This shows up as leftover temporary source files or as compilation errors about files being locked.
In fact, I came here while looking for how to solve them. But they are still only a minor issue, and I have kept using it that way for a long time.
EDIT: It is not just me. See Visual studio 2012 and dropbox don't play nice together question on SuperUser.
I'm using Dropbox to host my project, and I edit and build directly on there and have experienced no problems, ever. Win7, VS2010, CPP. I find Dropbox to be simpler than, and just as robust as, version control software. I'm a big fan. I should say Microsoft OneDrive once failed me, horribly, and I no longer trust it. With Dropbox, I always check the icon in the systray carefully to make sure it has finished updating before I turn my computer off.
I use both git and Dropbox, as I also switch which machine I'm working on. This way I can use source control with the rest of my team, while also able to pick up where I left off. My 2 PCs that sync are my one at work and at home. Both desktops, both almost always on and running dropbox.
Rarely I get conflicts, when a machine is offline or something. The solution 99% of the time is to simply delete any conflicting files. Because I'm constantly up to date with git, it's fine if I ever have to delete all my local code, since I can always get it back.
So it's really for nothing other than being able to run out of work on an urgent task and then resume where I left off when I get home.

Do you have performance problems when you work on Visual Studio projects via a network share?

We have tremendous problems with Visual Studio (2008, if that matters) locking up and slowing down when accessing projects over a network drive. It can take several minutes to open a large Web site project through a mapped drive, and saving even a single file can take a minute or more.
I fired up Wireshark and watched the traffic. VS, it seems, requests massive amounts of files from the network -- there's an enormous amount of SMB traffic. I've done some research, and this traffic seems to stem from two situations.
VS has to have everything in its own process to provide Intellisense.
VS needs to have all the source in order to compile the project.
All the advice I've read seems to boil down to the same thing: work locally, not on a remote machine, then push your code to an integration server via source control.
This would sure solve our problems (VS is quite fast working locally), but what if you can't work locally? What if the project and the infrastructure required to run it are too large and complicated to be replicated on everyone's individual machines?
We've gone 'round this problem a couple times, and the only way we can figure to work on these projects is direct access via a mapped drive. However, the VS slowness and lockups are really becoming a problem.
One solution: we installed VS on the server and work on the projects directly on the servers via RDP. Seriously.
So, I ask:
What does everyone else do? Do you work via the network, or do you replicate projects locally? If remotely, do you suffer from VS performance issues?
We work locally and use SVN to keep all our code on the server.
I find VS 2008 quite slow working locally sometimes so I wouldn't fancy working on a network share.
Trying to compile over a network share is horribly slow using Visual Studio. Your start times will be bad as the IntelliSense database is regenerated. Each compilation has to go over the network multiple times. Linking takes forever.
If you need the output of your compilation on the network, I'd recommend doing your compile locally and defining a post-build command to copy the results to your share.
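For example (a sketch only; the local output folder and share name are placeholders), the copy step could be as simple as this, whether run by hand or wired into a post-build step:

# Placeholder paths -- copy the local build output to the share after building
$out   = 'C:\src\MyApp\bin\Release'
$share = '\\buildserver\drop\MyApp'
Copy-Item -Path (Join-Path $out '*') -Destination $share -Recurse -Force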
If, as you say, you cannot pull everything locally, then I'd suggest your project is too big and needs to be broken up into more manageable chunks. For a multi-tier application, break it up by tier and invest in some form of continuous integration (e.g. CruiseControl) to automatically build individual pieces. In this way you can work locally on a particular piece and pull the pre-built portions from CI for the other pieces of the application.
I'm not terribly surprised that using VS to load projects over a network share has performance issues. VS (in any language) is constantly getting information from files in the project. Once you start loading this over a network you're at the mercy of the underlying network connection. All lags and access issues will directly translate into VS having an issue loading file contents.
I would advise copying the solution locally and using some form of source code control to sync the project on the share.
If the code is too complicated to install on everyone's machine, then don't put it on everyone's machine. Does everyone need to have everything in order to do productive work?
I have 79 projects in my solution that I work with. Several hundred thousand lines of code. I pull my source down every day from TFS and build it; it's a lot of code, but it's a far better solution than trying to work over a network share.
A more legitimate situation for having the source code on a share is when one has a non-Windows host on which a number of virtual Windows machines are running.
I have this exact situation where my desktop machine (the host) is running Debian and I use VMware to run various virtual Windows machines (the guests), including one that has Visual Studio installed so that I can target Windows OS's. Having the source code on a Samba share on the host machine has the following pro's:
The source is not duplicated, so there is no way to confuse different copies while working on several virtual machines at the same time.
I have full control over the source from my preferred OS.
I can turn on and off any of the virtual machines, or roll back to a snapshot, without the risk of losing changes.
I can build (etc.) from the same source on several machines without having to commit changes before the source is fully tested (reason: I have to use Subversion < 1.5).
The only problem with this setup is that Visual Studio (6,7,8,9) is painfully slow.
I have mounted the partition (on which the share lies) with "relatime", and this works insofar as the disk activity on the share is moderate, but Visual Studio keeps the (virtual) network card occupied all the time.
Any solutions to this would be very appreciated.
I encountered similar problems every time I worked (work = anything other than just copying/pasting files) over a network drive. The problem occurred with ZendStudio and Eclipse.
Why not use any kind of source control?
When working on Windows based projects I've always worked locally.
Once, at a Unix shop (AIX, IIRC), developers would work via an NFS mount and check in/check out via RCS...
I'm using VS2005 across a network share and not having any performance issues. However, it is a new server (Windows Server 2008). I don't have any other data points for VS since using it at work is relatively new for me.
However, some data points from using NetBeans for previous projects on a network share... Local build time for my project was 2 minutes on Vista, on a fast dual-core AMD 64-bit machine. For a network share project, on a Server 2003 box, it was 20 minutes. Building that same project from an ancient Tablet PC (1 GHz, single core) running XP locally was around 5 minutes. Interestingly enough, the Tablet PC could build on the Server 2003 box in the same 5 minutes.
For those asking "why" on the network share. The network share is automatically backed up, archived, etc. Also, that way I can very easily look at the same projects from multiple machines without having to worry about pushing back into the repository, etc. Once you've gone to having your dev stuff on a device where you can get to it from anywhere/anything, you'll never want to do local storage again!
I have performance problems via network anything, they just aren't good enough yet.
I thought it was common knowledge that disk speed is one of the major "slowness" factors when it comes to using VS on Windows. Most dev machines I've built have had projects located on 10k RPM RAID 0 drives, or at least a single 10k RPM drive. And even then it seems slow sometimes. Just the way it is, I suppose, until VS2009/VS2010 fixes it? :)
From my experience, this lag when working on a network share is 99% due to Intellisense. Disable it and you'll see.
Disabling IntelliSense indeed speeds up saving and opening files through a UNC share dramatically.
http://blogs.msdn.com/saraford/archive/2007/12/03/did-you-know-how-to-turn-off-intellisense-by-default.aspx
But then again, as stated in other comments, you might as well use a good text editor.
I've also experienced the problems with performance mentioned above. It seems to vary from project to project, but I did find one way of speeding up performance significantly for some project types.
Following the advice in this article made a previously unusable project on a network location (it would take minutes to open one file) perform almost like a local project. The basic gist is that you need to grant FULL TRUST to the network location:
To grant permission to all your projects in your Visual Studio Projects folder located on the network, follow these 8 steps:
1. Open Microsoft .NET Framework 1.1 (or 2.0) Configuration, which you'll find under Administrative Tools in the Control Panel.
2. Expand Runtime Security Policy | Machine | Code Groups | All_Code | LocalIntranet_Zone.
3. In the right-hand pane, click Add a Child Code Group.
4. In the dialog that follows, choose Create a new code group and fill in a Name like Visual Studio Projects.
5. Optionally, provide a Description for the Code Group. (You'll see the description when you click a Code Group in the left tree, helping you identify the various Code Groups you may have.)
6. In the Condition Type drop-down, choose URL.
7. For the URL field, type something like this: file://YourServer/My Documents/Visual Studio Projects/*
8. Under Use existing permission set, choose FullTrust (that is, if you trust your own applications; if you don't, choose a different permission set or create a new one).
Not sure why this works, but it made a previously unusable .NET 2.0 project perform significantly better.
Original article: http://imar.spaanjaars.com/364/how-do-i-allow-my-visual-studio-net-projects-to-run-from-a-network-location
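If you'd rather script it than click through the configuration tool, caspol.exe (shipped with the .NET Framework) can add the same kind of code group from the command line. This is only a sketch: list the groups first to confirm the label of LocalIntranet_Zone on your machine, and the share URL below is the same placeholder used in the steps above:

# List code groups to find the label of LocalIntranet_Zone (often 1.2, but verify)
& "$env:WINDIR\Microsoft.NET\Framework\v2.0.50727\caspol.exe" -m -lg

# Add a FullTrust child group for the network location (label and URL are examples)
& "$env:WINDIR\Microsoft.NET\Framework\v2.0.50727\caspol.exe" -m -ag 1.2 -url "file://YourServer/My Documents/Visual Studio Projects/*" FullTrust -n "Visual Studio Projects"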
I was having the same problem. I have a local copy of our build system, which expects certain drive letters, and was also experiencing slowness.
I have solved the problem by adding the following registry keys:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices]
"R:"="\\DosDevices\\D:\\devel\\build"
"S:"="\\DosDevices\\D:\\devel\\src"
Note that the double '\'s above are part of the .reg file format. When using regedit use single '\' throughout.
My build times were divided by 3. :)
I found the info in the Wikipedia article on the SUBST command.
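For comparison, SUBST does the same mapping but does not survive a reboot, which is why the registry entries above are handy. A quick sketch with the same example paths:

subst R: D:\devel\build
subst S: D:\devel\src
subst    # with no arguments, lists the current substitutions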

TortoiseSVN file overlay performance - are there any ways of improving it?

I'm using TortoiseSVN on my development machine (running Windows Server 2003) and VisualSVN Server on the server side. Both are the latest versions (against Subversion 1.6.5).
Everything works well generally; however I'm getting a little frustrated with the TortoiseSVN file overlays (the little icons that show locked or modified statuses on the files in Explorer). Sometimes these overlays seem to update instantly after a commit or lock, sometimes they only change after a couple of refreshes, and sometimes they show completely the wrong status until the next reboot.
It might be an impossible question to answer, given the amount of variables (other installed software, for example), but are there any known tricks to speed up the updating of these overlays?
By far the biggest performance increase I got was to set the client's Icon Overlays to not process the whole hard drive, only the locations my SVN files live in.
To do this, open the settings (right-click in Explorer->TortoiseSVN->Settings), select Icon Overlays, then in the Exclude paths: enter c:\*
In the Include paths: enter the paths to your Subversion working copy directories (for me all are under c:\subversion\* and c:\workspaces\*)
Use a newline to separate entries.
This made the client seem a thousand times quicker.
That way you exclude the whole of the C: and D: drives while still including just the relevant directories.
You can get some more performance tips from the TortoiseSVN docs.
You could disable TSVNCache.exe altogether. I decided I was willing to live without updated icons if it meant I could open and close Visual Studio orders of magnitude faster.
You could also lower the priority of TSVNCache, which is what my boss did with success. The main problem we were having was startup and shutdown times of Visual Studio, so he wrote a batch file that lowered the priority of the process and then started VS.
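The original batch file isn't shown, but the same idea is easy to sketch in PowerShell (the devenv.exe path below is just an example for VS 2008; adjust it to your install):

# Lower TSVNCache's priority (if it's running), then start Visual Studio
Get-Process TSVNCache -ErrorAction SilentlyContinue |
    ForEach-Object { $_.PriorityClass = 'BelowNormal' }
Start-Process "${env:ProgramFiles(x86)}\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe"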
