I have a Windows application I wrote.
I installed it on a virtual server (VMware) running Windows Server 2008, and for some reason the application keeps getting bigger and bigger. I used perfmon to check whether there is a memory leak, but as far as I can tell, there isn't:
Here is the process in Task Manager:
There are also two processes that use a lot of memory and CPU, but they are steady and not growing like SimeserManager.exe.
The memory growth slows down browsing on the sites that this server hosts.
Before this week I ran my application on a physical server with Windows Server 2003 and there were no problems with browsing. I can't reproduce the situation on the physical machine since I don't have it anymore, but I don't believe there was a memory problem when using the physical server.
The application is written in C# .NET using Visual Studio 2010.
What can be the problem?
Where can I get some clues?
UPDATE
I got ANTS Memory Profiler and tried to track down the problem. I created memory snapshots and here are the results:
Now I'm really lost.
I tried the standard filters but didn't manage to find a clue to the problem. In the image you can see there is an increase in private bytes. Does that mean there is a memory leak?
Can anyone give me some clues how to continue?
Thanks!
We don't have enough information here to really debug your application. However, there are tools you can use to identify and solve this issue in your application. I would suggest you use the ANTS Memory Profiler from RedGate to help you look for your problem. Here is a link to it:
http://www.red-gate.com/products/dotnet-development/ants-memory-profiler/
It isn't free but it is cheap and extremely effective. Get the 14-day free trial and run it on your application. I would go as far as to say that if it doesn't find the issue, the issue probably isn't with your application.
As for the other processes that are taking a lot of memory, this is normal. SQL Server tries to get as much memory as possible. Running other applications on the same box as a SQL server may cause you performance issues if you aren't careful. Here is a good article on how SQL Server uses memory:
http://sqlblog.com/blogs/jonathan_kehayias/archive/2009/08/24/troubleshooting-the-sql-server-memory-leak-or-understanding-sql-server-memory-usage.aspx
As for IIS memory usage (the other process that was using lots of memory), there could be multiple reasons for this. I would suggest you read through this forum to get a better idea of what it could be (if it truly is an issue):
http://forums.iis.net/t/1150494.aspx
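Before (or alongside) a full profiler run, you can also get a quick hint from the standard Windows performance counters: if the process's Private Bytes keeps climbing while the managed heap stays flat, the growth is more likely in unmanaged memory (handles, native resources) than in managed objects. Here is a minimal C# sketch that polls both counters; the instance name is assumed to match the SimeserManager process from your screenshot, and this is only an illustration, not a replacement for a profiler:

    using System;
    using System.Diagnostics;
    using System.Threading;

    class MemoryWatch
    {
        static void Main()
        {
            // The counter instance name is the process name without ".exe".
            const string instance = "SimeserManager";

            var privateBytes = new PerformanceCounter("Process", "Private Bytes", instance);
            var managedHeap = new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps", instance);

            while (true)
            {
                // Log both values every five seconds so you can compare the trends.
                Console.WriteLine("{0:T}  private bytes: {1:N0}  managed heap: {2:N0}",
                    DateTime.Now, privateBytes.NextValue(), managedHeap.NextValue());
                Thread.Sleep(5000);
            }
        }
    }

If both values grow together, the leak is probably rooted in managed objects (ANTS will show you what is holding them); if only Private Bytes grows, look at unmanaged resources and anything that should be getting disposed.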
Related
My question is simple; I just can't seem to find much information on it by googling around. If I set up Windows Error Reporting to create memory dumps when an IIS app pool crashes (see the article here for a rundown), could it cause any noticeable degradation in performance in terms of IIS serving up apps and websites?
We are looking to set this up in production to assist in tracking down issues when an app pool crashes. Also, is something like this recommended for production servers?
I assume you're using the LocalDumps Registry key to achieve this.
That Registry key has a setting called DumpType where you can specify the amount of data being collected.
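For reference, here is a sketch of what that configuration looks like when set from code; the w3wp.exe subkey (which scopes the setting to the IIS worker process), the dump folder path, and the dump count are assumptions you would adjust, and you could just as easily set the same values with regedit:

    // Illustrative only: writes the LocalDumps settings described above.
    // Must run elevated, since it writes under HKEY_LOCAL_MACHINE.
    using Microsoft.Win32;

    class EnableLocalDumps
    {
        static void Main()
        {
            const string key = @"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows" +
                               @"\Windows Error Reporting\LocalDumps\w3wp.exe";

            Registry.SetValue(key, "DumpFolder", @"D:\CrashDumps", RegistryValueKind.ExpandString);
            Registry.SetValue(key, "DumpCount", 5, RegistryValueKind.DWord);
            Registry.SetValue(key, "DumpType", 2, RegistryValueKind.DWord); // 2 = full dump, 1 = mini dump
        }
    }

DumpType 2 is what produces the multi-gigabyte full dumps discussed below; DumpType 1 (mini dump) is much cheaper to write but captures far less state.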
The time it takes to capture the crash dump will mainly depend on the amount of data to be written to disk. A full memory crash dump of IIS can easily take 4 GB, which may take 8 seconds to be written to disk at a 500 MB/s disk throughput rate.
During those 8 seconds, another instance of IIS may still serve pages from cached files very fast, but might have trouble serving files that need to be read from disk.
You could mitigate this a bit by dedicating an extra disk or a different partition to the crash dumps. You'll then just have the memory, CPU and SATA overhead.
Anyway, you'll not leave this setting on for very long. Just capture a few crash dumps and then disable it. You'll make your customers happier if you resolve the crash. Therefore, the performance impact IMHO is acceptable.
If you want to know the exact impact, you'd need to set up a load test system, serve pages and implement a website that crashes. You'll then notice (or not) the performance degradation and you'll be able to measure it.
I am tired of how slow VS2010 is. I know there are a lot of topics here about tuning the settings, and I've read and applied them all without much luck. These are the things I've already done:
removed all the extensions
never had ReSharper installed
tuned the settings to get maximum performance
tried SSD and RAM disks
Nothing helped; it is still unacceptably slow. I know what I am saying because with VS2008 I never had such problems.
Now, I am working on quite a big C# solution with about 20 projects in it. Visual Studio works quite fast when it has just been opened, but as time goes by it starts lagging and eventually gets so slow that I have to restart it. Resource Monitor shows that it consumes about 200 MB at the beginning, goes up to ~600 MB, and then doesn't go any higher. I have 8 GB of total RAM on an x64 laptop, with about 4 GB always free. I find it strange how little memory VS uses, and common sense tells me that the more memory it uses, the faster the app should work. So I believe my question is how to make VS use more of the available memory.
PS
I tried the recipe from "Configure Visual Studio to use more ram", but it didn't work out.
There is no way to make Visual Studio use more memory. The application itself has no preset limit; it will simply use the amount of memory that the operating system grants it (just like other apps).
The reason you see it increase to 600 MB and then stop is just a side effect of how the managed GC works. As Visual Studio performs operations like displaying IntelliSense, applying edits, and so on, more managed objects are created. Eventually the GC is triggered; it reclaims all of the free objects, and the longer-lived ones are promoted. Overall, memory usage is lowered, but not as far down as before you started editing. Then you edit some more, and this process continues until it reaches the appearance of a steady state. If you analyze it closely you'll see that it's actually more of a saw-tooth graph of memory usage.
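You can see the same pattern in any managed process, not just Visual Studio. A tiny throwaway console sketch that churns short-lived objects and then collects (the loop counts are arbitrary) shows the rise-and-fall shape:

    using System;

    class GcSawTooth
    {
        static void Main()
        {
            for (int i = 0; i < 5; i++)
            {
                // Simulate "editing" by creating lots of short-lived objects.
                for (int j = 0; j < 100000; j++)
                {
                    var temp = new string('x', 100);
                }

                // The heap grows while garbage piles up, then drops after a collection.
                Console.WriteLine("before collect: {0:N0} bytes", GC.GetTotalMemory(false));
                GC.Collect();
                Console.WriteLine("after collect:  {0:N0} bytes", GC.GetTotalMemory(true));
            }
        }
    }

The exact numbers don't matter; the point is that the plateau you see in Resource Monitor is the GC holding memory at a working level, not Visual Studio hitting a cap.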
As to why your particular instance of Visual Studio is slow, that's hard to determine remotely. Twenty projects is a larger solution, but performance should still be acceptable even with that many. A couple of things to try in order to isolate the problem:
Try editing a smaller solution. It's possible there is one project in particular which is giving VS a problem; breaking the solution down into smaller ones could help isolate it.
Try disabling Aero on your computer. It's possible that WPF is a problem here.
Hi
I am trying to track down a memory leak at a client site. They are using our application, and over time the application runs out of memory and throws an OutOfMemory exception. It would not be easy to replicate the issue in-house, since we would have to sit for hours to replicate their workflow. So I need to put a tool (preferably free) on their machine that can tell me how memory is being used by the application, along with some generation info. Does anybody know of a tool that can achieve this, or can anybody point me in a direction that can help me find the problem without profiling the whole application in the dev environment?
Assuming Java, you can set the -XX:+HeapDumpOnOutOfMemoryError flag, which will cause the JVM to dump the heap when it runs out of memory. Then you can take the dump and run it through jhat to see where the memory is being allocated. (There's also an Eclipse-based heap dump analysis tool.) I've used this in the past with great success.
It so happens that the new CLR Profiler with support for .NET 2.0 through 4.0 was just released by Microsoft:
David Broman's CLR Profiling API Blog: CLRProfiler V4 Released
It's free.
I'm trying to reproduce a bug that seems to appear when a user is using up a lot of RAM. What's the best way either to limit the RAM the computer can use or to fill most of it up? I'd prefer to do this without physically removing memory and without running a bunch of arbitrary, memory-intensive programs (e.g. Photoshop, Quake, etc.).
Use a virtual machine and set resource limits on it to emulate the conditions that you want.
VMware is one of the leaders in this area, and they have the free VMware Player, which lets you do this.
I'm copying my answer from a similar question:
If you are testing a native/unmanaged/C++ application, you can use AppVerifier and its Low Resource Simulation setting, which uses fault injection to simulate errors in memory allocations (among many other things). It's also really useful for finding a ton of other subtle problems that often lead to application crashes.
You can also use consume.exe, which is part of the Microsoft Windows SDK for Windows 7 and .NET Framework 3.5 Service Pack 1, to easily use up memory, disk space, CPU time, the page file, or kernel pool and see how your application handles the lack of available resources. (Does it crash? How is the performance affected? etc.)
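If you can't install the SDK on the target box, a throwaway allocator gives you the memory-pressure part on its own. Here is a minimal C# sketch; the 2 GB target and 64 MB chunk size are arbitrary assumptions, and a 64-bit build is assumed so the process can actually address that much:

    using System;
    using System.Collections.Generic;

    class MemoryHog
    {
        static void Main()
        {
            const long targetBytes = 2L * 1024 * 1024 * 1024; // how much RAM to hold
            const int chunkSize = 64 * 1024 * 1024;           // 64 MB per allocation
            var chunks = new List<byte[]>();
            long held = 0;

            while (held < targetBytes)
            {
                var chunk = new byte[chunkSize];

                // Touch every page so the OS actually commits the memory
                // instead of just reserving address space.
                for (int i = 0; i < chunk.Length; i += 4096)
                    chunk[i] = 1;

                chunks.Add(chunk);
                held += chunkSize;
                Console.WriteLine("Holding {0} MB", held / (1024 * 1024));
            }

            Console.WriteLine("Press Enter to release the memory...");
            Console.ReadLine();
            GC.KeepAlive(chunks);
        }
    }

Run one or more copies until the machine is under the pressure you want, reproduce the bug, then close them.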
Use either a job object or ulimit(1).
Create a virtual machine and set the ram to what you need.
The one I use is VirtualBox from Sun.
http://www.virtualbox.org/
It is easy to set up.
If you are developing in Java, you can set the memory limits for the JVM at startup.
I'm involved with a project using DotNetNuke version 05.01.04 Community Edition. We are building our new Intranet using it, but performance is terrible.
We have five people adding pages and content to it and every 15-30 seconds they experience a pause of 10 seconds or longer before the system continues and the next screens loads.
The server is Windows 2003, 3.8GHz with 1GB of RAM. I'm told by our server admin that the CPU and memory performance don't appear to be the bottleneck.
We currently have 350 pages in the system, and we plan to add 1,000. So we need to resolve this performance problem so that we can enter the content and go live.
I just can't see where the bottleneck is. Is there a good way to determine the bottleneck when using DotNetNuke?
Modules installed
Publish:Engage (not currently in use)
Page Blaster (doesn't appear to provide caching when users are logged in using Integrated Authentication)
SimpleGallery
XMod
Content Manager
IIS Setup
Application recycling completely disabled (Apart from a 2am recycle)
New findings: 18th March 2010
The main bottleneck was due to version 5.1.4 having a bug that caused 1,300 database round trips on an average page, due to broken in-memory database caching. We've upgraded to 5.2.4, which has resolved this bottleneck.
Now the next biggest bottleneck is the navigation. We've used both DDR:Menu and DDN:Nav, but both have a major impact on performance.
Is there a navigation interface out there that doesn't drain performance so badly?
I think you need to start investigating this using performance profiling tools. For the DNN application itself I'd grab something like JetBrains DotTrace or Red Gate's ANTS Performance Profiler.
For the database SQL Server Profiler would be the first choice or a tool such as Red Gate's SQL Response.
Without profiling the application, you're going to be grasping at straws.
And, as Tim pointed out in his comment, install Firebug in Firefox with the YSlow add-on to see which resources are taking the longest to serve to the browser.
Mitchel Sellers has some good tutorials and checklists to go through with regards to performance in DNN. Start with Explaining High Performance DotNetNuke Configuration and Management (which points to some of his earlier articles).
I have several years of DNN development and maintenance experience. When I have this kind of problem, I start with database cleanup. The next thing is to look for missing indexes and/or rebuild all the indexes periodically (with a SQL job scheduled for that), but the major performance gain usually comes from cleaning up the tables.
Other good things to consider would be disabling trace, setting debug mode to false, and turning off the DNN features that you don't use (the scheduler is the first one to turn off).
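For the trace/debug part, the relevant switches live in web.config; a minimal sketch of just those two settings (your file will have many more attributes on these elements):

    <!-- web.config: make sure tracing is off and the site is not compiled in debug mode -->
    <system.web>
      <compilation debug="false" />
      <trace enabled="false" />
    </system.web>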
Edit: consider keep alive as well
Hope this helps
Is your database on that server? If so, just throw in some more RAM, or get a faster disk array...
Have you considered creating that many pages directly through T-SQL? It's not hard to do and may save you a lot of time.