Windows Server 2012 Memory Leak - visual-studio-2013

I have a non-paged pool memory value of about 3 GB out of a total of 6 GB on Windows Server 2012. I think it's a memory leak; see the screenshots:
Top process:
Bamboo uses Microsoft Visual Studio 2013 to build the project and run tests.
How can I determine the bad process?
And how can I solve the problem? Is it possible to get rid of this leak?

You have a memory leak caused by a driver, not by an application. Look at the high value of nonpaged kernel memory. In your case this is 2.7 GB. You can use poolmon to see which driver is causing the high usage.
Install the Windows WDK and run poolmon. Press P to sort by pool type so that nonpaged is on top, and B to sort by bytes, to see which tag uses the most memory. Run poolmon by going to the folder where the WDK is installed, opening the Tools folder (for example C:\Program Files (x86)\Windows Kits\8.1\Tools\x64) and starting poolmon.exe.
Now look at which pool tag uses the most memory, as shown here:
Now open a cmd prompt and run the findstr command. To do this, type cd C:\Windows\System32\drivers to go to the drivers directory, then type findstr /s __ *.*, where __ is the tag that you see in poolmon. This shows which driver uses that tag:
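For example, a minimal sketch of that lookup (Abcd is a hypothetical placeholder tag; use the one poolmon showed at the top):
rem Search the driver binaries for the pool tag reported by poolmon
cd /d C:\Windows\System32\drivers
findstr /s Abcd *.*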
Now, go to the drivers folder (C:\Windows\System32\drivers) and right-click the driver in question (intmsd.sys in the image example above). Click Properties and go to the Details tab to find the product name. Then look for an update for that product.
If you can't find a driver for the pool tag, check pooltag.txt to see whether the tag is used by a Windows driver.
If you find the tag in pooltag.txt, you need to capture the growth of the pool usage with xperf. First, install the Windows Performance Toolkit. Next, open a cmd prompt (cmd.exe) as admin and run this:
xperf -on BASE+Pool -stackwalk PoolAlloc+PoolFree -buffersize 2048 -MaxFile 1024 -FileMode Circular && timeout -1 && xperf -d C:\trace_pool_alloc.etl
Now open it in WPA.exe, load the debug symbols, and look for the tag that you saw in poolmon under AIFO (allocated inside, freed outside), then expand the stack. From the function names you may get an idea of what is going on.
In this example the FILE tag usage comes from a tool called locate32 which scans the HDD to build up its search index.

Memory consumption of Windows application

I hope the question is not too vague and someone can shed some light on my problem.
I created a Windows application (makefile-based) with the Chromium project (I already asked about this problem in the Chromium forum) and Visual Studio 2019.
The application starts some processes, each of which uses around 20 KB of memory, but strangely the same application uses over 200 KB per process on some PCs with the same Windows version.
(Memory usage after starting the application, nothing else done)
I have been fighting with the compiler/linker options for a couple of days with no success. Still huge memory usage.
The Chromium examples did not show this problem when built with my makefile, which confused me even more.
In the end I changed the name of the exe file from app.exe to app1.exe and... problem gone; normal memory usage on all the PCs which had shown this problem.
I changed the name in the makefile to generate the same executable with a different name, and I also renamed the original, problematic exe file in Windows Explorer, with the same positive result.
I renamed the good app1.exe back to app.exe in Windows Explorer and the problem appeared again...
I am now searching for some kind of Windows configuration or program which could cause this problem, but no luck so far. Windows Firewall is already disabled.
Or could this be some kind of virus?
Problem solved... thanks to the ProcessHacker tool I found out that the library "verifier.dll" was loaded in the bad case.
This DLL is part of the Windows Application Verifier tool. I do not remember ever using this tool before; it is not even activated.
I then found out that there is a registry entry:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\Windows NT\CurrentVersion\Image File Execution Options\
where I could find the name of my application. After renaming this entry everything works as expected.
I will now investigate what the variables inside this entry mean:
GlobalFlag = 0x02000000 and PageHeapFlags = 0x2
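For what it's worth, GlobalFlag 0x02000000 appears to be the page heap bit (FLG_HEAP_PAGE_ALLOCS), which would fit the PageHeapFlags value and verifier.dll being loaded. A minimal sketch of checking for such an entry from a cmd prompt (app.exe is a hypothetical executable name; substitute your own):
rem List any Image File Execution Options entry for app.exe in the 32-bit registry view
reg query "HKLM\SOFTWARE\WOW6432Node\Microsoft\Windows NT\CurrentVersion\Image File Execution Options\app.exe"
rem GlobalFlag / PageHeapFlags values here mean debugging instrumentation is enabled for that exe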

VB6 Debugging - compiled

My scenario is that I'm supporting a VB6 app at the place I work, and in the last few weeks it has started crashing more often than it ever used to. It uses both a local Access MDB database and a remote SQL Server DB for different types of storage. The good news is that we are writing a replacement app; the bad news is that I need to support this one in the meantime, and the vendor is long gone from this world.
What are some ways I could try to diagnose what is causing the crash? For example, so far I've tried ODBC tracing (for the MDB component), SQL Profiler tracing, and ProcMon on a client PC.
Is there anything else I could try to discover what the app was trying to do at the time of the crash?
You can also start it in a debugger.
Use windbg or ntsd (ntsd is a console program and may already be installed). Both are from Debugging Tools for Windows.
Download and install Debugging Tools for Windows
http://msdn.microsoft.com/en-us/windows/hardware/hh852363
Install the Windows SDK but just choose the debugging tools.
Create a folder called Symbols in C:\
Start Windbg. File menu - Symbol File Path and enter
srv*C:\symbols*http://msdl.microsoft.com/download/symbols
then
windbg -o -g -G c:\windows\system32\cmd.exe /k batfile.bat
You can press F12 to stop it, and kb will show the call stack (g continues the program). If there are errors, it will also stop and show them.
Type lm to list loaded modules, x *!* to list the symbols, and bp symbolname to set a breakpoint.
Use db address (as in db 01244) to see what's at that memory address.
If programming in VB6, the environment variable LINK=/pdb:none stores the symbols in the DLL rather than in separate files. Make sure you compile the program with No Optimisations and tick the box for Create Symbolic Debug Info; both are on the Compile tab in the Project's Properties.
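A minimal sketch of wiring this up from one console, assuming the VB6 IDE (or command-line compile) is started from that same console so it inherits the LINK variable, and reusing the paths and batfile.bat from the steps above:
rem Keep the VB6 symbols in the compiled binary instead of separate files
set LINK=/pdb:none
rem Symbol path for windbg (equivalent to entering it in the File menu)
set _NT_SYMBOL_PATH=srv*C:\symbols*http://msdl.microsoft.com/download/symbols
rem Launch the app under the debugger via the batch file mentioned above
windbg -o -g -G c:\windows\system32\cmd.exe /k batfile.bat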

Not enough storage is available to complete this operation

Environment:
Visual Studio Ultimate 2010
Windows XP
WPF Desktop Application using .NET 4.0
We have a desktop application which plays a video. This video is part of a project and the project is packaged into the installer. Every once in a while building the installer project shows this error message:
Not enough storage is available to complete this operation
If I restart Visual Studio it works.
Is there a way to avoid this? Is there a better way to package videos in an installer?
This usually happens when the build process needs a lot of RAM and cannot get it. Since restarting Visual Studio fixes the problem, this is most likely also your case.
Try closing some of the running applications. You can also try adding more RAM to your machine or increasing the page file.
I came across this question when trying to compile my C# solution in Visual Studio 2010 on Windows XP. One project had a fair number of embedded resources in it (the size of the resultant assembly was ~140 MiB) and I couldn't compile the solution because I was getting the
Not enough storage is available to complete this operation
error in my build output.
None of the answers on this question helped, but I did find an answer to "Not enough storage is available to complete this operation" by ScottBurton42 on social.msdn.microsoft.com. It suggests adding the /3GB switch to the Boot.ini file and making devenv.exe large-address aware. Adding the /3GB switch to my Boot.ini file was what worked for me (I think devenv.exe for Visual Studio 2010 and above is already large-address aware).
My answer is based on that answer.
Solution 1: Set the /3GB Boot.ini switch
The page Memory Support and Windows Operating Systems on MSDN says:
The virtual address space of processes and applications is still limited to 2 GB unless the /3GB switch is used in the Boot.ini file.
The /3GB switch allocates 3 GB of virtual address space to an application that uses IMAGE_FILE_LARGE_ADDRESS_AWARE in the process header. This switch allows applications to address 1 GB of additional virtual address space above 2 GB.
The following example shows how to add the /3GB parameter in the Boot.ini file to enable application memory tuning:
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(2)\WINNT
[operating systems]
multi(0)disk(0)rdisk(0)partition(2)\WINNT="????" /3GB
Note "????" in the previous example is be the programmatic name of the operating system.
In Windows XP, the Boot.ini file can be modified by going to
System Properties → Advanced → Startup and Recovery → Settings → System Startup → Edit
The page on the /3GB switch on MSDN says:
On 32-bit versions of Windows, the /3GB parameter enables 4 GT RAM Tuning, a feature that enlarges the user-mode virtual address space to 3 GB and restricts the kernel-mode components to the remaining 1 GB.
The /3GB parameter is supported on Windows Server 2003, Windows XP, and Windows 2000. On Windows Vista and later versions of Windows, use the IncreaseUserVA element in BCDEdit.
Restarting the machine will then cause the setting to take effect.
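On Windows Vista and later, where Boot.ini no longer exists, the equivalent setting mentioned above is applied with BCDEdit from an elevated prompt (a quick sketch; 3072 MB is just an example value):
rem Give large-address-aware 32-bit processes up to 3 GB of user address space
bcdedit /set IncreaseUserVA 3072
rem Reboot afterwards for the change to take effect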
Solution 2: Make devenv.exe large address aware:
Open up a Visual Studio Command Prompt (or a Developer Command Prompt, depending on the version of Visual Studio)
Type and execute the following command line:
editbin /LARGEADDRESSAWARE {path}\devenv.exe
where {path} is the path to devenv.exe (you can find this by going to the properties of the Visual Studio shortcut).
This will allow devenv.exe to access 3GB of memory instead of 2GB.
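You can check whether the flag took effect with dumpbin from the same prompt (a sketch; a large-address-aware image reports "Application can handle large (>2GB) addresses" in its header characteristics):
rem Look for the large-address-aware line in the image's header output
dumpbin /headers "{path}\devenv.exe" | find "large"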
Problem
In my case, the problem was with a test project containing a very large (1.5GB) test file as an embedded resource. I have 16GB RAM in my machine with 8GB free when this occurred, so RAM was not the issue.
It is possible that we are hitting the 2 GB limit that the CLR has on any single object. Without delving into what MSBuild is doing under the hood, I can only speculate that during compile time, the embedded resource is loaded into an object graph that is hitting this limit.
The error message is very unhelpful. My first thought when I saw it was, "Have I run out of disk space?"
Solution
It is a file validation test project. One of the requirements is to be able to handle files of this size, so at face value my team thought it reasonable to embed it for use in test cases.
We fixed the error by moving the file onto the network (in the same way that it would be accessed by the validator in production) and marking the test as an integration test instead of a unit test. After all, aren't unit tests supposed to be fast-running?
Cleaning and rebuilding the solution worked for me.
For Visual Studio, you can try to do the following:
Close All Visual Studio instances.
Open the Visual Studio Developer Command Prompt in Administrator mode.
Navigate to:
C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\Common7\IDE.
Type the following:
editbin /LARGEADDRESSAWARE devenv.exe.
It's also worth restarting the PC.
Hope this helps :)
In my case, I had very little space left on the C drive. I cleared a few items from the C drive and tried again. It worked.
I might be late to answer but for future reference, you might want to check the Windows dump file settings (and probably set it to none).
In my case the server I was executing the code on couldn't handle my parallelized code.
Normally I'm running a setup like the following
new ParallelOptions { MaxDegreeOfParallelism = Math.Max(1, Environment.ProcessorCount / 2) }
Introducing a variable that allows locking the core count down to 1 (resulting in code like the following) resolved this issue for me.
new ParallelOptions { MaxDegreeOfParallelism = 1 }
The key for me:
We had embedded a huge database template (testing had filled it with lots of data) into the application. I have not seen this issue arise since removing the Embedded Resource and moving the database to a resource folder; the project now builds properly.
I fixed this problem by deleting or disabling (excluding) the large *.rpt files, and I optimized my reports.
I am late to answer, but this may be useful for others:
In my case, just restarting Visual Studio fixed the problem.

Issue with Visual C++ 2010 (Express) External Tools command

I posted this on SuperUser...but I was hoping the pros here at SO might have a good idea about how to fix this as well....
Normally we develop in VS 2005 Pro, but I wanted to give VS 2010 a spin. We have custom build tools based on GNU make that are called when creating an executable.
This is the error that I see whenever I call my external tool:
...\gnu\make.exe): *** couldn't commit memory for cygwin heap, Win32 error 487
The caveat is that it still works perfectly fine in VS 2005, as well as when called straight from the command line. Also, my external tool is set up exactly the same as in VS 2005.
Is there some setting somewhere that could cause this error to be thrown?
From problem with heap, win32 error 487:
Each Cygwin app gets a special heap area to hold stuff which is inherited by child processes. E.g. all file descriptor structures are stored in that heap area (called the "cygheap"). The cygheap has room for at least 4000 file descriptor structures. But - that's the clue - it's fixed size. The cygheap can't grow. Its size is reserved at the application's start and its blocks are committed on demand.
For some reason your server application needs all the cygheap space when running under the described conditions.
A possible solution might be found in Changing Cygwin's Maximum Memory:
Cygwin's heap is extensible. However, it does start out at a fixed size and attempts to extend it may run into memory which has been previously allocated by Windows. In some cases, this problem can be solved by adding an entry in either the HKEY_LOCAL_MACHINE (to change the limit for all users) or HKEY_CURRENT_USER (for just the current user) section of the registry.
Add the DWORD value heap_chunk_in_mb and set it to the desired memory limit in decimal MB. It is preferred to do this in Cygwin using the regtool program included in the Cygwin package. (For more information about regtool or the other Cygwin utilities, see the section called "Cygwin Utilities" or use the --help option of each util.) You should always be careful when using regtool since damaging your system registry can result in an unusable system. This example sets the memory limit to 1024 MB:
regtool -i set /HKLM/Software/Cygwin/heap_chunk_in_mb 1024
regtool -v list /HKLM/Software/Cygwin
Exit all running Cygwin processes and restart them. Memory can be allocated up to the size of the system swap space minus the size of any running processes. The system swap should be at least as large as the physically installed RAM and can be modified under the System category of the Control Panel.
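If you would rather not use regtool, setting the same value with plain reg.exe from a Windows prompt should be equivalent (a sketch; on 64-bit Windows with 32-bit Cygwin the key may live under the WOW6432Node view instead):
rem Set Cygwin's heap limit to 1024 MB (decimal) for all users
reg add "HKLM\Software\Cygwin" /v heap_chunk_in_mb /t REG_DWORD /d 1024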
It wouldn't hurt to ensure that the maximum size of your Windows swap file is large enough.
To summarize: the environment doesn't allocate enough heap space for the Cygwin executables. For some reason the problem is more acute with VS 2010 Express. You need to either fix the environment, use a port other than Cygwin, or use Microsoft utilities.
From the cygwin email lists it looks like other people have run into similar situations, even when not running via Visual Studio, to which they've found that the solution is often to play with Cygwin's maximum memory settings:
http://www.cygwin.com/cygwin-ug-net/setup-maxmem.html
(note: it's worth reading this conversation, from above, about some values that did and didn't work).
Others have also reported issues with anti-virus software (the recommendation is to unload it from memory, for some reason), and possibly also compatibility settings (try setting it to XP), which can affect Cygwin in certain cases. See: http://www.avrfreaks.net/index.php?name=PNphpBB2&file=viewtopic&p=377066
As for Visual Studio: are you on a 64-bit machine, and if so, are you usually running the tool in a 64-bit environment?
I've found that because Visual Studio 2010 runs as 32-bit, tools launched from it are launched as 32-bit processes (for a good illustration of this, add "cmd" as a tool). I'm not sure why this wouldn't affect 2005 (unless 2005 lets the system launch the process (64-bit) and 2010 handles it itself (32-bit)).

How can I simulate a disk full error in a Windows environment?

I have to write a bat script for a test scenario where the software that we are testing fails to write to a file due to a disk full error. The test script must be automated, so that we can run it in overnight tests, for example. The test script must also work on different computers, so installing software like a virtual machine wouldn't be the best solution in this case.
How can I simulate that error in a Windows environment?
You could try writing to a full floppy disk.
Edit:
In light of your edited question, you could set up a network share with no free disk space (or an exhausted quota) and write to that. The error will then be produced regardless of the logged-on user or machine.
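As a rough sketch of that approach (\\testserver\fullshare is a hypothetical share sitting on a volume with no free space):
rem Map the full share and attempt a write; the copy should fail with a disk-full style error
net use X: \\testserver\fullshare
copy large_test_file.dat X:\
net use X: /delete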
For Windows XP or later:
This command can get the amount of free space for the c:\ drive:
for /f "usebackq tokens=1-5" %%A in (`dir c:\ ^| find "bytes free"`) do (
set FREE_SPACE=%%C
)
Replace c:\ with your drive, as needed.
You can then take some space away from this value so you have a little room to work with:
set /a FREE_SPACE=FREE_SPACE-1024
or however much space you want to keep free.
You can use the fsutil command to create a file to fill up the free space on the disk:
fsutil file createnew c:\spacehog.dat %FREE_SPACE%
Run your test, writing to the drive. After you write 1024 bytes or so you should run out of space.
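Putting those steps together, an end-to-end sketch of the fill/test/clean-up cycle might look like the following (spacehog.dat and the 1024-byte margin are just the example values above; dir's /-c switch is added so the byte count has no thousands separators, as in the Windows 7 variant further down):
@echo off
rem Read the free bytes on C: from dir's "bytes free" line
for /f "usebackq tokens=1-5" %%A in (`dir /-c c:\ ^| find "bytes free"`) do set FREE_SPACE=%%C
rem Leave a small margin so the test itself can trigger the disk-full error
rem (note: set /a uses 32-bit signed arithmetic, so this only works with under 2 GB free;
rem  with more free space, skip the margin and let the test's first write fail)
set /a FREE_SPACE=FREE_SPACE-1024
rem Reserve the remaining space in one placeholder file
fsutil file createnew c:\spacehog.dat %FREE_SPACE%
rem ... run the application under test here ...
rem Release the space again once the test has finished
del c:\spacehog.dat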
Download and install TrueCrypt. You can then create a virtual partition of whatever size you want (a couple of megabytes), mount it and then fill it with a couple of documents.
Best Option: Microsoft's consume program
Reasons:
It tests the system disk (vs a separate drive)
It's fast - run the program to fill the disk instantly, stop when no longer needed
It's easy - No creating and deleting files. No extra test partition hanging around. Installation is required, but you can use a simple command afterward.
It's scriptable (see the sketch after the command output below)
Steps:
Install the Windows Server 2003 Resource Kit Tools (Works fine on Windows 7)
cd "%ProgramFiles(x86)%\Windows Resource Kits\Tools" (or whereever it's installed)
consume.exe -disk-space
Command output:
C:\Program Files (x86)\Windows Resource Kits\Tools>consume.exe
Universal Resource Consumer - Just an innocent stress program, v 0.1.0
Copyright (c) 1998, 1999, Microsoft Corporation
consume RESOURCE [-time SECONDS]
RESOURCE can be one of the following:
-physical-memory
-page-file
-disk-space
-cpu-time
-kernel-pool
C:\Program Files (x86)\Windows Resource Kits\Tools>consume.exe -disk-space
Consume: Message: Total disk space: 96049 Mb
Consume: Message: Free disk space: 14705 Mb
Consume: Message: Free per user space: 14705 Mb
Consume: Message: Attempting to use: 14705 Mb
Consume: Message: Reattempting to use: 14705 Mb
Consume: Message: Sleeping ...
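Since consume.exe only holds the space while it runs, it is easy to drop into a batch step for an unattended run; a sketch using the -time option from the usage text above (600 seconds is just an example duration):
rem Launch consume in the background so the test step can run while the disk is full
cd /d "%ProgramFiles(x86)%\Windows Resource Kits\Tools"
start "" consume.exe -disk-space -time 600
rem ... run the disk-full test here; consume exits and frees the space after 600 seconds ...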
Other Options:
Windows 7 has a virtual hard drive feature. Basically do the following: Computer Management > Disk Management > Action Menu > Create VHD > Right-click disk and Initialize > Right-click ...
Generate large files (this should be instant) until your disk is full, using a shell command or a Dummy File Generator program. Another program: SpaceHog.
It might seem like a bit much, but one thing I can think of is to use a virtual machine and set its virtual disk to be just big enough to fit the OS on. Fill it with some garbage files to tip it over the edge, then run your program.
Create a secondary partition, fill it with junk and then run your program there.
You could setup a small ramdisk and write to that. See this page for some free ramdisk products.
Create a new user account, set a quota for it, and use runas to run your app as that user (not exactly the same as disk full, but it should have similar consequences).
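A rough sketch of that quota approach (TestUser is a hypothetical local account, C:\path\to\YourApp.exe stands in for your application, and NTFS quota enforcement is assumed to already be enabled on the volume, e.g. via the drive's Quota tab):
rem Warn at ~8 MB and hard-limit TestUser to ~10 MB on C: (values are in bytes)
fsutil quota modify C: 8388608 10485760 %COMPUTERNAME%\TestUser
rem Run the application under test as that user so its writes hit the quota
runas /user:%COMPUTERNAME%\TestUser "C:\path\to\YourApp.exe"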
The operating system will respond differently to its system drive filling than to other drives filling, and as such your application will too, surely? Simply filling a drive irrespective of what physical media is used isn't going to be an accurate test.
Can't you mock the file system event for a full disk? Why would you want to wait until the disk is full? Wouldn't you want to monitor disk space periodically and warn the user when the disk is within a percentage margin of filling? Rather than wait until the disk space situation is terminal, simply prevent your application from working until the issue is resolved; not doing so could affect any data I/O and be unrecoverable!
If the test has to be a hard integration test, then automating a virtual machine, deploying the application, and then filling the remaining space with a recursive script is feasible.
The best thing that works on every computer (as testing is not necessarily done on a dedicated machine) would be a RAM drive/ramdisk that could be set up on the fly.
I have only found a Virtual Disk SDK so far (see here) that could maybe be included in your build process.
A different idea: maybe your testing computers could be set up to write to a shared network folder (that is full) mounted as a drive?
I have made a modification to the above script to make it compatible with Windows 7, essentially adding the switch /-c to the for statement. This removes the thousands separator, as fsutil does not like it in the statement.
for /f "usebackq tokens=1-5" %%A in (`dir /-c d:\ ^| find "bytes free"`) do (set FREE_SPACE=%%C)
fsutil file createnew d:\largefile.txt %FREE_SPACE%
Use a very small iSCSI target.
