DebugDiag Perf Analysis fails with error System.ArgumentException - windows

I am trying to debug high CPU usage in an IIS app pool on a Windows 2012 R2 server. I have installed DebugDiag v2 Update 3 and collected dumps of the IIS app pool whenever its CPU usage goes above a specified threshold.
I was able to get 11 dumps generated - 10 mini dumps and 1 full dump. However, when I open the DebugDiag Analysis tool, load any (or all) of the dump files, and run a PerfAnalysis, it always fails with a System.ArgumentException, no matter what I try. I have tried this multiple times and get the same error every time I analyze the files.
Has anyone been able to successfully do a Perf Analysis? I tried the crash analysis and it seems to work fine; it's only the Perf Analysis which errors out.
Here is the kind of error I see every time I try to do a PerfAnalysis:

Related

Merging of ETL files has failed (0x80070070) (Flags: 0x0000011f)

While trying to profile my command-line application in VS 2017 on a Windows 10 machine with the April 2018 Update, VS fails to create the report.
The UI says Microsoft Visual Studio was unable to create a diagnostics report. Check Output window for errors.
The Output window says
Profiling of 'Program' started.
Program has exited.
Profiling of 'Program' stopped.
Diagnostics session stopped with errors.
Merging of ETL files has failed (0x80070070) (Flags: 0x0000011f).
Previous searches give few answers as to why, but the problem seems related to Windows' event logging service. Comments on this similar question suggest it's related to disk usage, but without a source. My SSD is indeed almost full, but with 6 GB of free space.
I resolved the issue, the multiple times it happened, by restarting Windows and then starting the profiling as the first thing once the OS is ready. On a fresh start it works, but as I keep making changes and profiling, the error eventually appears again.
If someone knows a long-term solution, feel free to add it.
Code 0x80070070 means "There is not enough space on the disk." Your disk was full, or became full during a build or other operation as temporary files were created.
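One quick pre-flight check is to look at the free space on the drive holding the temp directory before starting a profiling session, since the ETL merge writes large temporary files there. Below is a minimal Python sketch; the 20 GB threshold is an assumed safety margin, not a documented requirement.

```python
import shutil
import tempfile

REQUIRED_FREE_GB = 20  # assumed safety margin, not an official figure

# The ETL merge step writes large temporary files, so check the drive
# that holds the temp directory rather than the project drive.
temp_dir = tempfile.gettempdir()
free_gb = shutil.disk_usage(temp_dir).free / (1024 ** 3)

print(f"Temp directory: {temp_dir}")
print(f"Free space: {free_gb:.1f} GB")
if free_gb < REQUIRED_FREE_GB:
    print("Warning: low disk space -- the ETL merge may fail with 0x80070070.")
```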

Dump File analysis

Recently I started facing an issue on a few servers where the CPU started consuming more resources than the usual trend. I am trying to find the root cause, so I took a dump of the w3wp process from Task Manager (right-click on the process and create a dump file).
The dump file is now 14 GB, and when I try to analyze it in WinDbg the tool does not work and shows a message:
I also took a few minidumps; some of them open fine while a few do not, so it's not a 32-bit vs. 64-bit confusion. (The collected dump is 64-bit.)
I am trying to find out what is causing this issue. Is it the file size, or am I not taking the dump properly?
I checked the link but it's not helpful.
WinDbg is not the right tool for this job. Dumps are only snapshots, so you have no idea what happened before. Use ETW, specifically CPU sampling, which sums all calls and shows you the CPU usage in detail.
Install the Windows Performance Toolkit, which is part of the Windows 10 SDK (the V1607 SDK works on Windows 8/8.1 (Server 2012/R2) and Windows 10; use the V1511 SDK if you are on Windows 7/Server 2008 R2). Run WPRUI.exe, select CPU Usage, and press Start. Capture 1-2 minutes of the high CPU usage, then click Save. Open the generated ETL file with WPA.exe (Windows Performance Analyzer), drag and drop the CPU Usage (Sampled) graph onto the analysis pane, and load the debug symbols. Now select your process in the graph, zoom in, and expand the stack; there you see the CPU usage weight of all calls.
In this sample, most of Internet Explorer's CPU usage comes from HTML handling.
For .NET applications, WPA shows you .NET-related groupings like GC or JIT:
Expand the stack of the w3wp process to see what it is doing. From the names you should get a clue about what is happening.
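If you prefer to script the capture instead of clicking through WPRUI, the toolkit also installs wpr.exe, which exposes the same built-in profiles from the command line. Here is a minimal sketch (Python driving wpr.exe); it assumes wpr.exe is on the PATH, an elevated prompt, and that the output folder already exists (C:\traces is an example path).

```python
import subprocess
import time

# Start CPU sampling with the built-in "CPU" profile (requires elevation).
subprocess.run(["wpr", "-start", "CPU"], check=True)
try:
    # Reproduce the high-CPU scenario during this window (1-2 minutes).
    time.sleep(90)
finally:
    # Stop recording and write the merged trace to an ETL file for WPA.exe.
    subprocess.run(["wpr", "-stop", r"C:\traces\highcpu.etl"], check=True)
```

Open the resulting ETL file in WPA and proceed with the CPU Usage (Sampled) graph as described above.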

What could be the reason for mini-dump generation failing while a Complete Windows memory dump is generated successfully?

I was running the Windows SVVP (Server Virtualization Validation Program) test and observed unexpected behavior. When I provide 80 GB of storage for a 60-core system, I am able to get the "Complete" memory dump. However, with the same configuration, the mini-dump gets stuck at 0%.
I searched many documents, including MSDN, but could not find a good reason for this behavior. I would be very thankful if someone could provide a possible explanation.

Debug Diagnostic not generating dumps on Crash

I have configured DebugDiag to monitor all app pools for crashes. One of our app pools stopped today, but DebugDiag did not generate any dump files. Event Viewer shows the error:
Application pool is being automatically disabled due to a series of failures in the processes serving that application pool.
Currently there are no breakpoints added in DebugDiag, and first-chance exceptions are not enabled.
Are they really required to get the dump generated? Any special IIS Config required?
I'm on Windows Server 2003 with DebugDiag 1.2.
It has generated memory dumps in the past, but I haven't seen any crash dumps so far.
Your help will be greatly appreciated; please guide me.

Visual Studio Ultimate 2013 slow to start debugging

I have installed Visual Studio 2013 Ultimate on Windows 8 Enterprise edition.
When I start debugging an MVC project (which is pretty empty), it takes 27 seconds for debugging to start. I assume it's because IIS Express 8 is loading symbols and hangs somewhere.
I have tried an empty MVC project and it starts in 10 seconds, which is still unacceptable.
I have tried:
- deleting all breakpoints
- enabling Just My Code
- unchecking symbol downloading from Microsoft servers and caching them in a local folder on the computer
- disabling IntelliTrace (it was already disabled when I checked)
- disabling just-in-time debugging (it was already disabled when I checked)
- unplugging the Ethernet cable (yes, I am pretty desperate)
- no antivirus is turned on
The first request (when I launch debugging) always takes 27 seconds according to Glimpse. The controller runs in under 1 second, which is "acceptable". All subsequent requests are fine.
But I can't work with a 27-second wait each time I launch debugging.
Can someone help me? I do not know what to do next.
My computer is a dual-core 3 GHz machine with 4 GB of RAM and a 7200 rpm hard drive. I don't think it's hardware related.
Thank you very much.
UPDATE:
As soon as I start using NLog in the code, it takes 30 seconds to launch debug mode.
If I comment out all the places where I log something, it takes 10 seconds, sometimes less.
How long does it take you to launch debug mode?
It's quite possible you are referencing a dead or slow symbol path. For example, you're at home but accessing a symbol path on your company's server. Check it under Tools -> Options -> Debugging -> Symbols. If that's OK, check your system as follows.
Make sure there is no other process exhausting your hardware resources.
First check whether CPU usage is too high after starting debugging. If it is, use Process Explorer to check what activities VS is performing. If they are in an extension's thread, disable that extension. If they are in VS's own thread, it's most likely a VS bug you can report to Microsoft.
Check whether memory usage is too high. If VS's memory usage is too high, given that you are just starting a simple debugging session, it's a VS bug.
If both CPU and memory are OK, it's probably related to I/O operations. Use Process Monitor to check which files are being accessed, especially files on a remote machine.
This is how I troubleshot the same problem on my machine. Hope it helps you.
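To rule out a dead or slow symbol server quickly, one option is a small reachability probe against the HTTP(S) entries in the symbol path. The sketch below is Python; the default path is only an illustrative example, so substitute the value from Tools -> Options -> Debugging -> Symbols or your _NT_SYMBOL_PATH environment variable.

```python
import os
import time
import urllib.error
import urllib.request

# Probe each HTTP(S) symbol server in a symbol path to spot dead or slow
# servers. The fallback path below is only an illustrative example.
symbol_path = os.environ.get(
    "_NT_SYMBOL_PATH",
    r"srv*C:\symbols*https://msdl.microsoft.com/download/symbols",
)

for part in symbol_path.split("*"):
    if not part.lower().startswith("http"):
        continue  # skip the srv marker and local cache directories
    start = time.monotonic()
    try:
        urllib.request.urlopen(part, timeout=5)
        status = "reachable"
    except urllib.error.HTTPError as exc:
        status = f"reachable (HTTP {exc.code})"  # server answered, just not 200
    except Exception as exc:
        status = f"unreachable: {exc}"  # dead server, DNS failure, or timeout
    elapsed = time.monotonic() - start
    print(f"{part}: {status} ({elapsed:.1f}s)")
```

A probe that times out or takes several seconds here is a strong hint that the symbol settings, not the project, are the bottleneck.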
