Is Xcode's debug navigator useless?

I am building an app in Xcode and am now deep into the memory management portion of the project. When I use Allocations and Leaks I get entirely different results from what I see in Xcode's debug panel: in particular, the debug panel seems to show much higher memory usage than Allocations does, and it also highlights leaks that, as far as I can tell, (1) do not exist and (2) are confirmed not to exist by the Leaks tool. Is this thing useless, or even worse, misleading?
Here was a new one: today it told me I was using >1 GB of memory but its little memory meter read significantly <1 GB (and was still wrong if the Allocations data is accurate). Picture below.
UPDATE: I ran VM Tracker in a 38-minute session and it does appear that virtual memory accounts for the difference between Allocations/Leaks and the memory gauge. Picture below. I'm not entirely sure how to think about this yet. Our game uses a very large number of textures that are swapped in and out. I imagine this is common in most games of our scale (11 boards, 330 levels; each board and map screen has unique artwork).

You are probably using the Memory Gauge while running in the Simulator using a Debug build configuration. Both of those will give you misleading memory results. The only reliable way to know how memory is being managed is to run on a device using a Release build. Instruments uses the Release build configuration, so it's already going to be better than just running and using the Memory Gauge.
Moreover, it is a known flaw that the Xcode built-in memory tools, such as the Memory Debugger, can generate false positives for leaks.
However, Instruments has its flaws as well. My experience is that, for example, it fails to catch leaks generated during app startup. Another problem is that people don't always understand how to read its output. For example, you say:
the debug panel seems to show much higher memory usage than what I see in Allocations
Yes, but Allocations is not the whole story. You are probably failing to look at the VM allocations. Those are shown separately and often constitute the reason for high memory use (because they include the backing stores for images and the view rendering tree). The Memory Gauge does include virtual memory, so this alone might account for the "difference" you think you're seeing.
So, the answer to your question is: No, the Memory Gauge is not useless. It gives a pretty good idea of when you might need to be alert to a memory issue. But you are then expected to switch to Instruments for a proper analysis.
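To make the VM point concrete, here is a rough, hypothetical sketch (not from the question's project): the UIImage object itself is a tiny heap allocation that Allocations sees, but forcing a decode allocates a bitmap backing store of roughly width x height x 4 bytes in a VM region, which shows up in VM Tracker and the Memory Gauge rather than in the heap allocation list.

```swift
import UIKit

// Hypothetical sketch: the UIImage wrapper is a small heap object, but the
// decoded bitmap backing store (~width * height * 4 bytes) is VM-backed memory
// that VM Tracker and the Memory Gauge report, not the Allocations heap list.
func decodedTexture(named name: String) -> UIImage? {
    guard let image = UIImage(named: name) else { return nil }
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    image.draw(at: .zero)                        // forces the backing store to be allocated
    let decoded = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return decoded
}
```

For a game that swaps hundreds of unique textures, these backing stores can easily dwarf the heap numbers Allocations shows.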

Related

My OS X app slowly consumes more and more ram over time, how can I debug this and find the cause?

I am new to Swift and coding in general. I have made my first OS X app over the last few days. It is a simple ticker app that lives in the menu bar.
My issue is that over the space of 3 hours, my app goes from using 10 MB of RAM to over 1 GB. It slowly uses more and more. I noticed that after about 6 hours the app stops working; I can only assume OS X has killed the process because it's hogging too much memory?
Anyway, I have looked online and used Xcode Instruments to try to find a memory leak, but I don't know exactly how to pinpoint it. Can anyone give me some good general ways to find memory leaks and sources of bugs when using Xcode? Any general practices are welcome too.
If the memory growth is not due to a leak (run Leaks and the Static Analyzer), the loss is due to inadvertently retained and unused memory.
Use Instruments to check for leaks and for memory loss due to retained but not leaked memory. The latter is unused memory that is still pointed to. Use Mark Generation (Heapshot) in the Allocations instrument.
For how to use Heapshot to find memory creep, see: bbum blog
Basically, the method is to run the Allocations instrument, take a heapshot, run an iteration of your code, and take another heapshot, repeating 3 or 4 times. This will indicate memory that is allocated but not released during the iterations.
To interpret the results, disclose each heapshot to see the individual allocations.
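As a hypothetical illustration (the names below are invented, not taken from the question), this is the kind of growth Heapshot exposes even though Leaks stays silent: every iteration leaves more reachable memory behind.

```swift
import Foundation

// Hypothetical sketch: each refresh appends data that stays reachable forever,
// so every heapshot generation shows net growth, while Leaks reports nothing
// because the memory is still pointed to.
final class TickerHistory {
    private var snapshots: [String] = []          // grows on every refresh, never trimmed

    func refresh() {
        let line = String(repeating: "quote ", count: 1_000)
        snapshots.append(line)                    // retained but never used again
    }
}
```

Repeating refresh() a few times between heapshots would show the String allocations accumulating in each new generation.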
If you need to see where retains, releases and autoreleases occur for an object, use Instruments:
Run in Instruments; in Allocations, turn on "Record reference counts" (for Xcode 5 and lower you have to stop recording to set the option). Run the app, stop recording, then drill down and you will be able to see where all retains, releases and autoreleases occurred.
When confronted with a memory leak, the first thing I do is look at where objects are created and destroyed, especially if they are created inside looping logic (which is generally not a good idea anyway).
Most memory leaks come from there. I would venture a guess that the leak occurs somewhere in the logic that drives your timed iterations.
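For example, here is a hypothetical sketch (the types and names are invented for illustration) of the kind of inadvertent retain that often hides in timer-driven code: a repeating timer's closure captures self strongly, so the object and everything it accumulates can never be deallocated, even though nothing is a leak in the malloc sense.

```swift
import Foundation

// Hypothetical sketch: the run loop retains the repeating timer, the timer
// retains its closure, and the closure captures `self` strongly, so the
// fetcher and its growing `results` array stay alive for the app's lifetime.
// Leaks reports nothing, but memory keeps climbing.
final class QuoteFetcher {
    private var timer: Timer?
    private var results: [Data] = []

    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { _ in
            self.results.append(Data(count: 64 * 1024))   // strong capture of self
        }
    }

    func stop() {
        timer?.invalidate()                               // required to break the cycle
        timer = nil
    }
}
```

Capturing self weakly ([weak self] in the closure) and invalidating the timer when you are done breaks the cycle.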
Good luck!

MS Application Verifier bloats stack?

Does anyone have an idea how Application Verifier works?
I am currently working on a tree-parsing application which heavily uses recursion. The program seems to work as intended; however, I do use "new" in a few places, so I thought of checking for memory leaks with Application Verifier. AV doesn't report any errors, but within a couple of minutes the image of the application grows to about a gigabyte, whereas without it it only got to around 60 MB.
I can't seem to find any memory leaks, and seeing how much recursion is going on, I am starting to suspect that AV places extra items on the stack for testing purposes, and as the recursion goes deeper the extra "junk" builds up and crashes the program.
Does anyone have any insight into the matter?
It may depend on which AppVerifier features you've turned on. There's a heap-checking feature that puts each allocation in its own page and allocates guard pages between allocations. If you're allocating lots of small objects, this feature will dramatically increase memory usage, since each small allocation now occupies at least one full page; a few hundred thousand allocations that would normally total tens of megabytes can easily balloon to a gigabyte. This is normal behavior for this kind of testing and not something to worry about.
Off hand, I don't know of any features that affect stack usage. I believe it would be difficult to mess with the stack without recompiling the code with instrumentation, and AppVerifier doesn't require compiling with instrumentation.

Xcode Instruments using lots of memory.

Okay, so this is my issue, and I apologize if it's a duplicate; I searched but couldn't find anything I considered relevant.
When I run Instruments from Xcode and begin testing my application for memory leaks or allocations, my iMac eventually begins to run very slowly.
This caused me to run Activity Monitor while using Instruments, and I noticed that for every second Instruments is open it takes up more and more real memory, roughly 100 MB a second.
It doesn't take long for it to consume all my iMac's free memory (2 GB), and then it starts to lag.
Anyway, this doesn't occur with every application. I've done the same test with some applications/projects I downloaded, and Instruments only uses about 250 MB and doesn't dramatically increase.
Is there something obvious that I'm doing incorrectly? Any help would be appreciated.
Thanks.
Instruments consumes a lot of memory.
Depending on what you are recording, you may be able to reduce its memory usage. For example, you can often specify what (or what not) to record, or lower sampling frequencies (if applicable).
100 MB/s is unusually high. Can you give a more exact description of what you are recording in that time (which instruments you use, what the recorded process is doing, etc.)?
Xcode 3 used a lot less memory - not sure if that's also the case for Instruments.
You can reduce the memory usage somewhat by running the toolset as 32-bit processes.
Lastly, 2 GB of physical memory is nothing for Xcode + Instruments + the iOS Simulator. FWIW, I regularly exhaust physical memory with 8 or more GB. Fortunately, memory is cheap if you want 4 or 8 GB.
Update
I tried using Instruments for Allocations, Leaks and Zombies.
You can run these tests individually, if you must.
Allocations
By itself, allocations should not consume a ton of memory if your app is not creating a lot of allocations.
To reduce memory with this instrument, you can disable some options you are not interested in:
Do not record each ref count operation
Only track active allocations
Disable zombie detection
Do not identify C++ objects
Leaks
Implies the Allocations instrument only if you want a history of leaks.
Leak detection itself can consume a lot of memory because it scans memory, basically cloning your allocations. Say you have 100 MB allocated: Leaks will periodically pause the process, clone the memory, and scan it for pointer patterns. This could consume more memory than your app itself. IIRC, it's executed as a subprocess in Instruments.
Zombies
Implies the Allocations instrument.
Zombie detection usually implies ref count recording. When debugging zombies, it's most effective to never free them; if you do free them, you may only detect transient zombies (not sure if there's an option for that in Instruments). Never freeing Obj-C allocations will obviously consume more memory, and running Leaks on such a process will consume even more because your heap sizes will be larger, so Leaks and Zombies should not be combined.
You should be able to reduce total consumption by disabling some of these options and testing for them individually.
Notes
The bleeding edge Developer Tools releases can be really shaky. If you are having problems, it helps to stick to the official releases.
I can run an OS X unit test (primarily C/C++ APIs) with Allocations alone, and it consumes about 1 MB/s when recording. Something seems wrong; perhaps that indicates an issue in your program (many transient allocations?).
Changing the way the data is displayed and/or the charge/focus settings can require a lot of memory; e.g. "Restore All" can require a few GB to process a large sample.
If 100 MB/s is an accurate number, I'd file a bug. I know Instruments consumes a lot of memory, but that is very high even by that standard.
Good luck!

What is "dirty" memory in Instruments?

When I monitor my application using Instruments and the Allocations instrument, I see a large amount of memory being marked as "dirty". What does that mean? I have no memory leaks in my application, yet this pile of "dirty" memory keeps increasing.
"Dirty" is a general computing term for cached or in-memory data that has been modified and not yet written back. In the Instruments/VM Tracker sense, dirty memory is memory your process has written to; the kernel cannot simply discard those pages and recreate them on demand (as it can with clean, file-backed pages), so they count against your app's footprint.
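As a rough, hypothetical illustration of that distinction (not part of the original answer): allocating a large buffer mostly reserves clean, zero-fill pages; it is writing to them that makes them dirty in VM Tracker.

```swift
// Hypothetical sketch: allocation alone mostly reserves clean (untouched,
// zero-fill) pages; writing to the buffer is what dirties the pages and
// makes them count against the app's dirty footprint.
let byteCount = 50 * 1024 * 1024
let buffer = UnsafeMutablePointer<UInt8>.allocate(capacity: byteCount)
// Still mostly clean here: the pages have not been touched yet.
buffer.initialize(repeating: 0xFF, count: byteCount)   // touching the pages dirties them
buffer.deinitialize(count: byteCount)
buffer.deallocate()
```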
"...I am trying to find out what is using up all my memory."
The WWDC 2010 Session 311 presentation, Advanced Memory Analysis with Instruments, includes a section on 'Responding to Memory Warnings' (at ~38:40 in the video) with a demo that illustrates how to find "resident, dirty memory" with the Instruments VM Tracker and one way to flush it.

Help w/ memory management...allocations shows no leaks but program leaks like crazy

So I have autoreleased/released every object that I alloc/init/copy, and the Allocations instrument seems to show minimal leaks; however, my program's memory usage does not stop increasing. I have included a screenshot of my Allocations run (I have run Allocations for longer, but it remains relatively constant; it certainly does not compare to the amount the program gains when actually running). When running my program, it will double in memory over the course of about 10 hours. The memory increases drastically in the first 5 minutes (2-3 MB), however, and just keeps on going. I don't understand why Allocations would remain constant when running in Instruments but my program would just keep gaining memory when actually run.
Since I can't post images yet...here is the link to the screenshot:
allocations run
UPDATE: Here are some screenshots from my memory heapshot analysis. I am not allocating these objects explicitly and don't really know where they are coming from. Almost all of them have a source similar to what the second screenshot details on the right (lots of HTTP and URL frames in the call tree). Does anybody know where these are coming from? I know I've read about some NSURLConnection leaks, but I have tried all of the cache clearing those posts suggest, to no avail. Thanks for all the help so far!
memory heap analysis 1
memory heap analysis 2
Try heapshots.
Are you running with different environment variables when you run in different environments?
For example, you could have NSZombieEnabled set when you launch your app (causing none of your objects to ever be freed) but not when you run in Instruments.
Just as a sanity check: how are you determining memory usage? You say that memory usage keeps going up, but not when you run in Instruments. Given that Instruments is a reliable way of measuring memory usage (the most reliable way?), this sounds a little odd, a bit like saying memory keeps going up except when I try to measure it.
If you are creating autoreleased objects (like [NSString stringWithFormat:]) in a loop, the pool won't be drained until that loop exits and the program is allowed to complete the main event loop, at which point the autorelease pool is drained and a new one is instantiated.
If you have code like this, the solution is to instantiate a new autorelease pool before entering your loop and drain it periodically during the loop (re-instantiating the autorelease pool after you drain it).
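In modern code the same idea is expressed by wrapping each chunk of loop work in an autorelease pool block. Here is a minimal Swift sketch of the pattern (illustrative only, not the asker's code):

```swift
import Foundation

// Minimal sketch: without the autoreleasepool block, temporary objects created
// by these Foundation calls pile up in the outer pool until the loop finishes;
// with it, each iteration's temporaries are released right away.
for index in 0..<100_000 {
    autoreleasepool {
        let line = NSString(format: "item %d", index)     // autoreleased temporary
        _ = line.components(separatedBy: " ")             // more temporaries
    }
}
```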
You can use Instruments to find out where your allocations are originating. When running Instruments in Allocations mode:
Move your mouse over the Category field in the Object Summary
Click on the Grey Circle with an arrow which appears next to the field name
This will bring up a list of locations where the objects in that category have been instantiated from, and the stats of how many allocations each have made.
If your memory usage is rising (but not leaking) you should be able to see where that memory was created, and then track down why it is hanging around.
This tool is also very useful in reducing your memory profile for mobile applications.
