What can my 32-bit app be doing that consumes gigabytes of physical RAM?

A co-worker mentioned to me a few months ago that one of our internal Delphi applications seems to be taking up 8 GB of RAM. I told him:
That's not possible
A 32-bit application only has a 32-bit virtual address space. Even if there were a memory leak, the most memory it could consume is 2 GB; after that, allocations would fail, since there would be no free space left in the virtual address space. And in the case of a memory leak, the leaked virtual pages would be swapped out to the pagefile, freeing up physical RAM.
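(As an aside, that limit is easy to demonstrate. The following is a minimal sketch of mine, not part of the application: compiled as a 32-bit console program, it reserves address space in 64 MB chunks until VirtualAlloc fails, which happens at roughly 2 GB no matter how much RAM is installed.)

Program AddressSpaceLimit;
{$APPTYPE CONSOLE}

uses
  Windows;

var
  Total: Cardinal;
begin
  Total := 0;
  // Reserve (not commit) 64 MB at a time until the 32-bit address space runs out.
  while VirtualAlloc(nil, 64 * 1024 * 1024, MEM_RESERVE, PAGE_NOACCESS) <> nil do
    Inc(Total, 64);
  WriteLn('Reserved ', Total, ' MB before running out of address space');
  ReadLn;
end.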
But he noted that Windows Resource Monitor indicated that less than 1 GB of RAM was available on the system. And while our app was only using 220 MB of virtual memory, closing it freed up 8 GB of physical RAM.
So I tested it
I let the application run for a few weeks, and today I finally decided to test it.
First I looked at memory usage before closing the app, using Process Explorer:
the working set (RAM) is: 241 MB
total virtual memory used: 409 MB
And I used Resource Monitor to check memory used by the app, and total RAM in use:
virtual memory allocated by application: 252 MB
physical memory in use: 14 GB
And then memory usage after closing the app:
physical memory in use: 6.6 GB (7.4 GB less)
I also used Process Explorer to look at a breakdown of physical RAM use before and after. The only difference is that 8 GB of RAM really had been uncommitted and was now free:
Item                               Before        After
Commit Charge (K)                  15,516,388    7,264,420
Physical Memory Available (K)      1,959,480     9,990,012
Zeroed Paging List (K)             539,212       8,556,340
Note: It's somewhat interesting that Windows would spend time zeroing out all that memory immediately, rather than simply putting it on a standby list and zeroing it only as needed (i.e. as memory requests have to be satisfied).
None of those things explain what the RAM was doing (What are you doing just sitting there! What do you contain!?)
What is in that memory?
That RAM must contain something useful; it must have some purpose. For that I turned to SysInternals' RAMMap. It can break down memory allocations.
The only clue that RAMMap provides is that the 8 GB of physical memory was associated with something called Session Private. These Session Private allocations are not associated with any process (i.e. not my process):
Item               Before      After
Session Private    8,031 MB    276 MB
Unused             1,111 MB    8,342 MB
I'm certainly not doing anything with EMS, XMS, AWE, etc.
What could possibly be happening in a 32-bit non-Administrator application that is causing Windows to allocate an additional 7 GB of RAM?
It's not a cache of swapped-out items, and it's not a SuperFetch cache. It's just there, consuming RAM.
Session Private
The only information about "Session Private" memory is from a blog post announcing RAMMap:
Session Private: Memory that is private to a particular logged in session. This will be higher on RDS Session Host servers.
What kind of app is this?
This is a 32-bit native Windows application (i.e. not Java, not .NET). Because it is a native Windows application, it of course makes heavy use of the Windows API.
It should be noted that I wasn't asking people to debug the application; I was hoping a Windows developer out there would know why Windows might hold memory that I never allocated. Having said that, the only thing that has changed recently (in the last 2 or 3 years) that could cause such behavior is the feature that takes a screenshot every 5 minutes and saves it to the user's %LocalAppData% folder. A timer fires every five minutes:
QueueUserWorkItem(TakeScreenshotThreadProc);
And pseudo-code of the thread method:
void TakeScreenshotThreadProc(Pointer data)
{
   String szFolder = GetFolderPath(CSIDL_LOCAL_APPDATA);
   ForceDirectoryExists(szFolder);

   String szFile = szFolder + "\\" + FormatDateTime("yyyyMMdd'_'hhnnss", Now()) + ".jpg";

   Image destImage = new Image();
   try
   {
      CaptureDesktop(destImage);

      JPEGImage jpg = new JPEGImage();
      jpg.CopyFrom(destImage);
      jpg.CompressionQuality = 13;
      jpg.Compress();

      HANDLE hFile = CreateFile(szFile, GENERIC_WRITE,
            FILE_SHARE_READ | FILE_SHARE_WRITE, null, CREATE_ALWAYS,
            FILE_ATTRIBUTE_ARCHIVE | FILE_ATTRIBUTE_ENCRYPTED, 0);
      //error checking elided
      try
      {
         Stream stm = new HandleStream(hFile);
         try
         {
            jpg.SaveToStream(stm);
         }
         finally
         {
            stm.Free();
         }
      }
      finally
      {
         CloseHandle(hFile);
      }
   }
   finally
   {
      destImage.Free();
   }
}

Most likely, somewhere in your application you are allocating system resources and not releasing them. Any WinAPI call that creates an object and returns a handle is a suspect. For example (be careful running this on a system with limited memory; if you don't have 6 GB free it will page badly):
Program Project1;
{$APPTYPE CONSOLE}

uses
  Windows;

var
  b : Array[0..3000000] of byte;
  i : integer;
begin
  for i := 1 to 2000 do
    CreateBitmap(1000, 1000, 3, 8, @b);
  ReadLn;
end.
This consumes 6 GB of session memory due to the allocation of bitmap objects that are not subsequently released: 2,000 bitmaps of 1,000 × 1,000 pixels at 3 bytes per pixel (3 planes × 8 bits) ≈ 6 GB. Application memory consumption remains low because the objects are not created on the application's heap.
Without knowing more about your application, however, it is very difficult to be more specific. The above is one way to demonstrate the behaviour you are observing. Beyond that, I think you need to debug.
In this case, there are a large number of GDI objects allocated - this isn't necessarily indicative, however, since there are often a large number of small GDI objects allocated in an application rather than a large number of large objects (The Delphi IDE, for example, will routinely create >3000 GDI objects and this is not necessarily a problem).
In @Abelisto's example (in comments), by contrast:
Program Project1;
{$APPTYPE CONSOLE}

uses
  SysUtils;

var
  i : integer;
  sr : TSearchRec;
begin
  for i := 1 to 1000000 do
    FindFirst('c:\*', faAnyFile, sr);
  ReadLn;
end.
Here the returned handles are not GDI objects but rather search handles (which fall under the general category of Kernel Objects), and none of them is ever released with FindClose. Process Explorer shows a large number of handles held by the process. Again, process memory consumption is low, but there is a large increase in session memory used.
Similarly, the objects might be User Objects: these are created by calls such as CreateWindow or CreateCursor, or by setting hooks with SetWindowsHookEx. For a list of WinAPI calls that create objects and return handles of each type, see:
Handles and Objects: Object Categories -- MSDN
This can help you start to track down the issue by narrowing it to the type of call that could be causing the problem. It may also be in a buggy third-party component, if you are using any.
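As a quick way to narrow things down from inside the process itself, here is a minimal sketch (mine, not from the question) that prints the process's GDI and USER object counts using the GetGuiResources API from user32.dll; older Delphi Windows units may not declare it, hence the explicit import:

Program HandleCounts;
{$APPTYPE CONSOLE}

uses
  Windows;

const
  GR_GDIOBJECTS  = 0;
  GR_USEROBJECTS = 1;

// GetGuiResources exists in user32.dll on Windows 2000 and later.
function GetGuiResources(hProcess: THandle; uiFlags: DWORD): DWORD; stdcall;
  external 'user32.dll' name 'GetGuiResources';

begin
  WriteLn('GDI objects  : ', GetGuiResources(GetCurrentProcess, GR_GDIOBJECTS));
  WriteLn('USER objects : ', GetGuiResources(GetCurrentProcess, GR_USEROBJECTS));
  ReadLn;
end.

Logging these two counters (plus the handle count shown in Process Explorer) before and after the suspect operation tells you whether the leak is in GDI objects, User Objects, or kernel handles.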
A tool like AQTime can profile Windows allocations, but I'm not sure whether there is a version that supports Delphi 5. There may be other allocation profilers that can help track this down.

Related

How to have my program (running without debugging in Visual Studio) keep expanding RAM usage until RAM is full, and even then use the SSD as "RAM"?

I got an "Out of memory" exception and my 32 GB of RAM isn't even full. The program is downloading pictures; I want it to keep downloading and expanding RAM usage overnight, and even use the SSD as alternative "RAM" once the RAM is full (I have a 2 TB SSD with 390 GB free).
And why is RAM usage growing at all? I'm running a loop that contains a lot of this:
using (WebClient wc = new WebClient())
{
    wc.DownloadFileAsync(new Uri(url), @"g:\Folder1\" + i.ToString() + ".jpg");
}
So why isn't the RAM being released each time a file (image) finishes downloading?
Set the reference to the object to null to encourage garbage collection.
That said, languages such as Java and C# that rely on garbage collection, rather than the true destructors of C++ and Delphi (Object Pascal) that can be called explicitly, may be slow to put out the trash.

How to (temporarily) release memory from VirtualAlloc?

When using VirtualAlloc I can (ab)use the following property to simplify memory management.
Actual physical pages are not allocated unless/until the virtual addresses are actually accessed.
I run the following code to allocate the block.
type
  PArrayMem = ^TArrayMem;    //pointer
  TArrayMem = packed record  //as per documentation
    RefCount: Integer;
    Length: NativeInt;
    Elements: Integer;
  end;

var
  a: array of integer;       //dynamic array, structure see above

procedure TForm38.Button1Click(Sender: TObject);
const
  AllocSize = 1024 * 1024 * 1024; //1 GB
var
  ArrayMem: PArrayMem;
begin
  //SetLength(a, 1024*1024*1024); //1G x 8*16
  ArrayMem:= VirtualAlloc(nil, AllocSize, MEM_COMMIT or MEM_RESERVE, PAGE_READWRITE);
  ArrayMem.RefCount:= 1;
  ArrayMem.Length:= AllocSize div SizeOf(Integer);
  a:= @ArrayMem.Elements;    //a:= AddressOf(elements)
  a[1]:= 10;                 //testing, works
  a[0]:= 4;
  a[500000]:= 56;            //works, auto-commits, only adds a few KB to the used memory
  button1.Caption:= IntToStr(a[500000]); //displays '56'
end;
All this works great. If my structure grows to 1,000,000 elements, everything works.
However, suppose afterwards my structure shrinks back to 1,000 elements.
How do I release the RAM so that it will get auto-magically committed when needed again?
WARNING
David warned me that allocating and committing large (huge) contiguous blocks of memory carries a large cost.
So it may be more advantageous to split the array up into smaller blocks and abstract away the internals using a class/record.
You can decommit pages using VirtualFree passing the MEM_DECOMMIT flag. Then you can commit again using VirtualAlloc.
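A minimal sketch of what that might look like for the 1 GB region allocated above; ShrinkRegion/GrowRegion and their parameters are hypothetical helpers, not part of the original answer:

uses
  Windows;

procedure ShrinkRegion(BaseAddr: Pointer; TotalSize, KeepBytes: NativeUInt);
begin
  // Give the physical storage behind the tail back to the OS.
  // The address range stays reserved, so pointers into it remain valid,
  // but the decommitted pages must not be touched until re-committed.
  VirtualFree(Pointer(NativeUInt(BaseAddr) + KeepBytes),
    TotalSize - KeepBytes, MEM_DECOMMIT);
end;

procedure GrowRegion(BaseAddr: Pointer; TotalSize, KeepBytes: NativeUInt);
begin
  // Re-commit the tail before using it again; the new pages come back zeroed.
  VirtualAlloc(Pointer(NativeUInt(BaseAddr) + KeepBytes),
    TotalSize - KeepBytes, MEM_COMMIT, PAGE_READWRITE);
end;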
Or you may use the DiscardVirtualMemory function introduced in Windows 8.1.
Use this function to discard memory contents that are no longer needed, while keeping the memory region itself committed. Discarding memory may give physical RAM back to the system. When the region of memory is again accessed by the application, the backing RAM is restored, and the contents of the memory is undefined.
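DiscardVirtualMemory is not declared in older Delphi Windows units, so a hedged sketch of importing and calling it (Windows 8.1+ only; DiscardTail is a hypothetical helper) might look like this:

uses
  Windows;

// Returns ERROR_SUCCESS (0) on success, otherwise a Win32 error code.
function DiscardVirtualMemory(VirtualAddress: Pointer; Size: NativeUInt): DWORD;
  stdcall; external kernel32 name 'DiscardVirtualMemory';

procedure DiscardTail(BaseAddr: Pointer; TotalSize, KeepBytes: NativeUInt);
begin
  // The pages stay committed, but their physical RAM may be reclaimed;
  // their contents are undefined the next time they are read.
  // The return value is ERROR_SUCCESS (0) on success and should be checked.
  DiscardVirtualMemory(Pointer(NativeUInt(BaseAddr) + KeepBytes),
    TotalSize - KeepBytes);
end;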
You may find something useful in the comments to this related question: New Windows 8.1 APIs for virtual memory management: `DiscardVirtualMemory()` vs `VirtualAlloc()` and `MEM_RESET` and `MEM_RESET_UNDO`

Does Windows clear memory pages?

I know that Windows has an option to clear the page file when it shuts down.
Does Windows do anything special with the actual physical/virtual memory when it goes in or out of scope?
For instance, let's say I run application A, which writes a recognizable string to a variable in memory, and then I close the application. Then I run application B. It allocates a large chunk of memory, leaves the contents uninitialized, and searches it for the known string written by application A.
Is there ANY possibility that application B will pick up the string written by application A? Or does Windows scrub the memory before making it available?
Windows does "scrub" freed memory returned by a process before allocating it to other processes; there is a kernel thread dedicated specifically to this task.
The zero page thread runs at the lowest priority and is responsible for zeroing out free pages before moving them to the zeroed page list[1].
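Here is a small sketch of mine (not from the answer) that makes this concrete: scan a freshly committed VirtualAlloc block for any non-zero byte. You will never find one, because newly committed pages are handed out demand-zero from the zeroed page list.

Program FreshPagesAreZero;
{$APPTYPE CONSOLE}

uses
  Windows;

var
  p: PAnsiChar;
  i: Integer;
  nonZero: Boolean;
begin
  // Commit 1 MB of brand-new pages; Windows supplies them zero-filled.
  p := VirtualAlloc(nil, 1024 * 1024, MEM_COMMIT or MEM_RESERVE, PAGE_READWRITE);
  nonZero := False;
  for i := 0 to 1024 * 1024 - 1 do
    if (p + i)^ <> #0 then
      nonZero := True;
  WriteLn('Found non-zero byte: ', nonZero); // always FALSE
  ReadLn;
end.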
Rather than worrying about retaining sensitive data in the paging file, you should worry about continuing to retain it in memory (after use) in the first place. Clearing the page file on shutdown is not the default behavior. Also, a system crash dump will contain any sensitive info that you have in plain text in RAM.
Windows does NOT "scrub" the memory as long as it is allocated to a process (obviously). Rather, it is left to the program(mer) to do so. For this very purpose one can use the SecureZeroMemory() function.
This function is defined as the RtlSecureZeroMemory() function (see WinBase.h). The implementation of RtlSecureZeroMemory() is provided inline and can be used on any version of Windows (see WinNT.h).
Use this function instead of ZeroMemory() when you want to ensure that your data will be overwritten promptly, as some C++ compilers can optimize a call to ZeroMemory() by removing it entirely.
WCHAR szPassword[MAX_PATH];

/* Obtain the password */
if (GetPasswordFromUser(szPassword, MAX_PATH))
{
    UsePassword(szPassword);
}

/* Before continuing, clear the password from memory */
SecureZeroMemory(szPassword, sizeof(szPassword));
Don't forget to read this interesting article by Raymond Chen.

what is the size of windows semaphore object?

How to find size of a semaphore object in windows?
I tried using sizeof(), but we cannot give the name of the semaphore object as an argument to sizeof; it has to be the handle. sizeof(HANDLE) gives us the size of the handle, not the semaphore.
This is what is known as an "opaque handle". There is no way to know how big it really is, what it contains, or how any of the functions work internally. This gives Microsoft the ability to completely rewrite the implementation with each new version of Windows if they want to, without worrying about breaking existing code. It's a similar concept to having a public and private interface to a class. Since we are not working on the Windows kernel, we only get to see the public interface.
Update:
It might be possible to get a rough idea of how big they are by creating a bunch and monitoring what happens to your memory usage in Process Explorer. However, since there is a good chance that they live in the kernel and not in user space, they might not show up at all. In any case, there are no guarantees about any other version of Windows, past or future, including patches/service packs.
It's something "hidden" from you. You can't say how big it is. And it's a kernel object, so it probably doesn't even live in your address space. It's like asking "how big is the Process Table?", or "how many MB is Windows wasting?".
I'll add that I ran a small test on my 32-bit Windows 7 machine: 100,000 kernel semaphores (named X{number}, with 0 <= number < 100000) cost about 4 MB of kernel memory and 8 MB of user space (both measured with Task Manager). That's about 40 bytes per semaphore in kernel space and 80 bytes per semaphore in user space! (This is in Win32; in 64-bit it will probably double.)
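For reference, a minimal sketch (mine; the naming scheme is just for illustration) of how such a measurement can be reproduced: create the semaphores, leave every handle open, and compare the memory figures in Task Manager or Process Explorer before pressing Enter.

Program SemaphoreCost;
{$APPTYPE CONSOLE}

uses
  Windows, SysUtils;

var
  i: Integer;
begin
  // Create 100,000 named semaphores and deliberately never close the handles.
  for i := 0 to 99999 do
    CreateSemaphore(nil, 0, 1, PChar('X' + IntToStr(i)));
  WriteLn('Semaphores created - check memory usage, then press Enter to exit.');
  ReadLn;
end.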

Memory Usage in R

After creating large objects and running out of RAM, I will try and delete the objects in my current environment using
rm(list=ls())
When I check my RAM usage, nothing has changed. Even after calling gc() nothing has changed. I can only replenish my RAM by quitting R.
Anybody have advice for dealing with memory-intensive objects within R?
Memory for deleted objects is not released immediately. R uses a technique called "garbage collection" to reclaim memory for deleted objects. Periodically, it cycles through the list of accessible objects (basically, those that have names and have not been deleted and can therefore be accessed by the user), and "tags" them for retention. The memory for any untagged objects is returned to the operating system after the garbage-collection sweep.
Garbage collection happens automatically, and you don't have any direct control over this process. But you can force a sweep by calling the command gc() from the command line.
Even then, on some operating systems garbage collection might not reclaim memory (as reported by the OS). Older versions of Windows, for example, could increase but not decrease the memory footprint of R. Garbage collection would only make space for new objects in the future, but would not reduce the memory use of R.
On Windows, the technique you describe works for me. Try the following example.
Open the Windows Task Manager (CTRL+SHIFT+ESC).
Start RGui. RGui.exe memory usage is 27,460 K.
Type
gcinfo(TRUE)
x <- rnorm(1e8)
RGui.exe memory usage is now 811,100 K.
Type rm("x"). RGui.exe memory usage is still 811,100 K.
Type gc(). RGui.exe memory usage is now 28,332 K.
Note that gc should be called automatically if you have removed objects from your workspace and then try to allocate more memory for new variables.
My impression is that multiple forms of gc() are tried before R reports failed memory allocation. I'm not aware of a solution for this at present, other than restarting R as you suggest. It appears that R does not defragment memory.
An old question, I realize, but I've found that (on macOS Mojave) invoking pryr::mem_used() in the R session causes Activity Monitor to immediately update the reported memory usage to reflect only the objects retained in the R environment.
