When I monitor my application using Instruments with the "Allocations" instrument, I see a large amount of memory marked as "dirty". What does that mean? I have no memory leaks in my application, yet this pile of "dirty" memory keeps increasing.
In this context, "dirty" denotes memory pages that your process has written to. Unlike clean pages (such as memory-mapped files or untouched allocations), the OS can't simply discard dirty pages and recreate them later, so they count against your app's real footprint. Growing dirty memory isn't necessarily a leak, but it is memory your app is responsible for.
"...I am trying to find out what is using up all my memory."
The WWDC 2010 Session 311 presentation, Advanced Memory Analysis with Instruments, includes a section on 'Responding to Memory Warnings' (at ~38:40 in the video) with a demo that illustrates how to find "resident, dirty memory" with the Instruments VM Tracker and one way to flush it.
Related
I am building an app in Xcode and am now deep into the memory management portion of the project. When I use Allocations and Leaks I seem to get entirely different results from what I see in Xcode's debug panel: particularly the debug panel seems to show much higher memory usage than what I see in Allocations and it also seems to highlight leaks that as far as I can tell (1) do not exist and (2) are confirmed to not exist by the Leaks tool. Is this thing useless, or even worse, misleading?
Here was a new one: today it told me I was using >1 GB of memory but its little memory meter read significantly <1 GB (and was still wrong if the Allocations data is accurate). Picture below.
UPDATE: I ran VM Tracker in a 38-minute session and it does appear virtual memory accounts for the difference between allocations / leaks and the memory gauge. Picture below. I'm not entirely sure how to think about this yet. Our game uses a very large number of textures that are swapped. I imagine this is common in most games of our scale (11 boards, 330 levels; each board and map screen has unique artwork).
You are probably using the Memory Gauge while running in the Simulator using a Debug build configuration. Both of those will give you misleading memory results. The only reliable way to know how memory is being managed is to run on a device using a Release build. Instruments uses the Release build configuration, so it's already going to be better than just running and using the Memory Gauge.
Moreover, it is a known flaw that the Xcode built-in memory tools, such as the Memory Debugger, can generate false positives for leaks.
However, Instruments has its flaws as well. My experience is that, for example, it fails to catch leaks generated during app startup. Another problem is that people don't always understand how to read its output. For example, you say:
the debug panel seems to show much higher memory usage than what I see in Allocations
Yes, but Allocations is not the whole story. You are probably failing to look at the VM allocations. Those are shown separately and often constitute the reason for high memory use (because they include the backing stores for images and the view rendering tree). The Memory Gauge does include virtual memory, so this alone might account for the "difference" you think you're seeing.
So, the answer to your question is: No, the Memory Gauge is not useless. It gives a pretty good idea of when you might need to be alert to a memory issue. But you are then expected to switch to Instruments for a proper analysis.
I am new to swift and coding in general. I have made my first OS X app over the last few days. It is a simple ticker app that lives in the menu bar.
My issue is that over the space of 3 hours, my app goes from 10 MB of RAM being used to over 1 GB. It slowly uses more and more. I noticed that after about 6 hours the app stops working; I can only assume that OS X has killed the process because it's hogging too much memory?
Anyway, I have looked online and I have used Xcode instruments to try and find a memory leak, but I don't know exactly how to pin point it. Can anyone give me some general good ways to find memory leaks and sources of bugs when using Xcode? Any general practices are good too.
If the memory loss is not due to a leak (run Leaks and the Analyzer), the loss is due to inadvertently retained and unused memory.
Use Instruments to check for leaks, and for memory loss due to retained but not leaked memory. The latter is unused memory that is still pointed to. Use Mark Generation (Heapshot) in the Allocations instrument in Instruments.
For how to use Heapshot to find memory creep, see: bbum blog
Basically the method is to run Instruments allocate tool, take a heapshot, run an iteration of your code and take another heapshot repeating 3 or 4 times. This will indicate memory that is allocated and not released during the iterations.
To interpret the results, disclose each generation to see the individual allocations.
If you need to see where retains, releases and autoreleases occur for an object use instruments:
Run in instruments, in Allocations set "Record reference counts" on (For Xcode 5 and lower you have to stop recording to set the option). Cause the app to run, stop recording, drill down and you will be able to see where all retains, releases and autoreleases occurred.
When confronted with a memory leak, the first thing I do is look at where variables are created and destroyed, especially if they are created inside looping logic (which is generally not a good idea anyway).
Generally most memory leaks come from there. I would venture a guess that the leak occurs somewhere in the logic that tracks your timed iterations.
Good luck!
I've been reading a lot about tracking memory usage in Instruments but have found little in combination with Monotouch.
There seem to be three opposing claims here:
Use the Allocations utility of Instruments. The number of "live bytes" is the amount of physical memory used by the application.
Use the Memory Monitor plugin. From the list of processes, pick your app and check the "Real memory" column. That's the amount of RAM currently in use.
Use VM Tracker and make automatic snapshots. The "Dirty Size" is what you're after.
From what I've noticed:
"Real Memory" drops as soon as GC is triggered
Even if my "Live Bytes" remain around 30MB I will eventually catch memory warnings
With constant "Live Bytes", "Real Memory" can increase significantly and easily grow to 200MB or more.
While using QLPreviewController and viewing an insanely big Word document (1000 pages), scrolling through that document will grow real memory like crazy. If a memory warning is received, neither real memory nor live bytes drop at all. Eventually, the app will crash; Monotouch problem or Apple's problem?
Sometimes, real memory seems to grow and nothing can stop it. Then again, GC seems to clear big chunks of it. There is no real pattern in this.
So what is the correct answer? Is there exactly one?
EDIT: I attached two images. One shows memory usage at a stage in the middle of my app's life, and the second one from much later. Both images reflect memory usage at the same point in the UI, where nothing but two controllers are on screen. Maybe somebody can still comment on what can be read from those numbers, especially the magic "Memory Tag 70".
Instruments is somewhat of a black box, but here is how I think it works:
There seem to be three opposing claims here:
1. Use the *Allocations* utility of Instruments. The number of "live bytes" is the amount of physical memory used by the application.
I don't know exactly what "Live Bytes" is, but it's not the amount of physical memory used by the application. I think it is the amount of physical memory used by all Objective-C objects. If this theory is correct, "Live Bytes" does not include any memory used by managed code, nor any memory used indirectly by Objective-C objects (such as image data), which seems to match what I see. "Live Bytes" is definitely useful if you want to track down leaked objects, but it's not (necessarily) a good indicator of how much memory is actually in use.
2. Use the Memory Monitor plugin. From the list of processes, pick your app and check the "Real memory" column. That's the amount of RAM currently in use.
This is a bit closer: "Real Mem" is the amount of physical memory the app is using which isn't shared with other apps. The total amount of memory the app has mapped is "Virtual Mem", but big chunks of "Virtual Mem" are shared between apps (a shared library will of course use memory once it's loaded, but since it's immutable it is only loaded once for all processes; it is, however, added to each process's "Virtual Mem", so if you add up the "Virtual Mem" of all processes you will go way beyond the physical memory your device actually has).
3. Use VM Tracker and make automatic snapshots. The "Dirty Size" is what you're after.
Correct. "Dirty Size" is what you're after - this is however closely related to "Real Mem", it's just "Real Mem" split into categories so you can easily see what's using the memory.
For the typical case of using a lot of memory due to leaking images, the process goes like this:
1. Verify with the Memory Monitor that your app really has a memory problem.
2. See in VM Tracker / "Dirty Size" that a lot of memory is used by image data (that's the magic "Memory Tag 70").
3. Use Allocations to find out where CGImages are created, see the corresponding stack trace and track down why those images aren't freed.
Each app is different though, so it's not possible to come up with a short recipe which works for all cases.
"Real Memory" drops as soon as GC is triggered
Even if my "Live Bytes" remain around 30MB I will eventually catch memory warnings
With constant "Live Bytes", "Real Memory" can increase significantly and easily grow to 200MB or more.
All these are explained above.
While using QLPreviewController and viewing an insanely big Word document (1000 pages), scrolling through that document will grow real memory like crazy. If a memory warning is received, neither real memory nor live bytes drop at all. Eventually, the app will crash; Monotouch problem or Apple's problem?
It could be your problem too :) It's impossible to tell without actually knowing where the memory goes.
Sometimes, real memory seems to grow and nothing can stop it. Then again, GC seems to clear big chunks of it. There is no real pattern in this.
You mean you're watching real memory grow while your app is doing absolutely nothing? If you're actually doing something in your app this is completely normal.
Okay so this is my issue and I apologize if its a duplicate. I searched but couldn't find anything I considered relevant.
When I run Instruments from Xcode and begin testing my application for memory leaks or allocations, my iMac eventually begins to run very, very slowly.
This caused me to run Activity Monitor while using Instruments, and I noticed that every second Instruments is open it takes up more and more real memory. Roughly 100 MB a second.
It doesn't take long for it to consume all my iMac's free memory (2 GB), and then it starts to lag.
Anyway, this doesn't occur with every application. I've done the same test with some applications/projects I downloaded, and Instruments only seems to use about 250 MB and doesn't dramatically increase.
Is there something obvious that I'm doing incorrectly? Any help would be appreciated.
Thanks.
instruments consumes a lot of memory.
depending on what you are recording, you may reduce its memory usage. for example, you can often specify what (or what not) to record, or lower sampling frequencies (if applicable).
100MB/s is unusually high. can you give a more exact description of what you are recording in that time? (instruments you use, what the process you record is doing, etc).
Xcode 3 used a lot less memory - not sure if that's also the case for Instruments.
You can reduce the memory usage somewhat by running the toolset as 32 bit processes.
lastly, 2GB physical memory is nothing for Xcode + Instruments + iOS Sim. fwiw, i regularly exhaust physical memory with 8 or more GB. boo. fortunately, memory is cheap when you want 4 or 8GB.
Update
I tried using instruments for Allocations, Leaks and Zombies
You can run these tests individually, if you must.
Allocations
By itself, allocations should not consume a ton of memory if your app is not creating a lot of allocations.
To reduce memory with this instrument, you can disable some options you are not interested in:
do not record each ref count operation
only track active allocs
disable zombie detection
do not identify c++ objects
Leaks
Implies the Allocations instrument only if you want a history of leaks.
Leaks detection itself can consume a lot of memory because it scans memory, basically cloning your allocations. say you have 100MB allocated - leaks will periodically pause the process, clone the memory and scan it for patterns. this could consume more memory than your app. iirc, it's executed as a subprocess in instruments.
Zombies
Implies the Allocations instrument.
Zombie detection usually implies ref count recording. When debugging zombies, it's most effective to never free them. If you do free them, you may only detect transient zombies (not sure if there's an option for that in instruments...). Never freeing objc allocations will obviously consume more memory. Running leaks on a process will then consume more memory because your heap sizes will be larger - leaks and zombies should not be combined.
you should be able to reduce total consumption by disabling some of these options and testing for them individually.
Notes
The bleeding edge Developer Tools releases can be really shaky. If you are having problems, it helps to stick to the official releases.
I can run an OS X unit test (primarily C/C++ APIs) with allocations alone; it consumes about 1MB/s when recording. something seems wrong, but perhaps that indicates an issue in your program (many transient allocations?).
changing the way the data is displayed and/or the charge/focus settings can require a lot of memory. e.g. "Restore All" can require a few GB to process a large sample.
if 100MB/s is an accurate number, i'd file a bug. I know Instruments consumes a lot of memory, but that's very high for recording an idle app, even with the expectation that instruments consumes a lot of memory.
good luck
I have a long-running memory hog of an experimental program, and I'd like to know its actual memory footprint. The Task Manager says (in Windows 7 64-bit) that the app is consuming 800 MB of memory, but the total amount of memory allocated, also according to the Task Manager, is 3.7 GB. The sum of all the allocated memory does not equal 3.7 GB. How can I determine, on the fly, how much memory my application is actually consuming?
Corollary: What memory is the task manager actually reporting? It doesn't seem to be all the memory that's allocated to the app itself.
As I understand it, Task manager shows the Working Set;
working set: The set of memory pages recently touched by the threads of a process. If free memory in the computer is above a threshold, pages are left in the working set of a process even if they are not being used. When free memory falls below a threshold, pages are trimmed from the working set.
via http://msdn.microsoft.com/en-us/library/cc432779(PROT.10).aspx
You can get Task Manager to show Virtual Memory as well.
I usually use perfmon (Start -> Run... -> perfmon) to track memory usage, using the Private Bytes counter. It reflects memory allocated by your normal allocators (new/HeapAlloc/malloc, etc).
Memory is a tricky thing to measure. An application might reserve lots of virtual memory but not actually use much of it. Some of the memory might be shared; that is, a shared DLL might be loaded in to the address space of several applications but it is only loaded in to physical memory once.
A good measure is the working set, which is the set of pages in its virtual address space that have been accessed recently. What the meaning of 'accessed recently' is depends on the operating system and its page replacement algorithm. In other words, it is the actual set of virtual pages that are mapped in to physical memory and are in use at the moment. This is what the task manager shows you.
The virtual memory usage is the number of virtual pages that have been reserved (note that not all of these will have actually been committed, that is, had physical backing store allocated for them). You can add this to the display in Task Manager by clicking View -> Select Columns.
The most important thing though: If you want to actually measure how much memory your program is using to see if you need to optimize some of it for space or choose better data structures or persist some things to disk, using the task manager is the wrong approach. You should almost certainly be using a profiler.
That depends on what memory you are talking about. Unfortunately there are many different ways to measure memory. For instance ...
Physical Memory Allocated
Virtual Memory Allocated
Virtual Memory Reserved (but not committed)
Private Bytes
Shared Bytes
Which metric are you interested in?
I think most people tend to be interested in the "Virtual Memory Allocated" category.
The memory statistics displayed by task manager are not nearly all the statistics available, nor are particularly well presented. I would use the great free tool from Microsoft Sysinternals, VMMap, to analyse the memory used by the application further.
If it is a long-running application and the memory usage grows over time, it is going to be the heap that is growing. Parts of the heap may or may not be paged out to disk at any time, but you really need to optimize your heap usage. In this case you need to profile your application. If it is a .Net application, then I can recommend Redgate's ANTS profiler. It is very easy to use. If it's a native application, then the Intel VTune profiler is pretty powerful. You don't need the source code for the process you are profiling with either tool.
Both applications have a free trial. Good luck.
P.S. Sorry I didn't include more hyperlinks to the tools, but this is my first post, and stackoverflow limits first posts to one hyperlink :-(