How to use WSAdmin to analyze Heap Dump files (.dmp) - wsadmin

I have a .dmp file from a WAS machine. How can I use wsadmin.sh to analyze the heap dump?
There are not many resources on this.

wsadmin is only for WAS administrative operations and, at least as far as I'm aware, does not provide any tooling for dump analysis.
I think what you're looking for is something like the Eclipse Memory Analyzer Tool (MAT), available at https://www.eclipse.org/mat/. It's specifically designed for processing and analyzing Java heap dumps.

Related

get a log file on system performance

I need to monitor my computer's processor, RAM, and disk usage over a certain time period and get a log file or some other text output with the details of that usage. I don't need complex data; simple usage over time is enough. I tried a few tools, including Windows's built-in Performance Monitor, but had no luck. I only need usage over time as a log file or in some other simple text form.
My OS is Windows, and I don't need to do this on any other OS.
Can anybody help me? Even a pointer in the right direction would be helpful.
You can create a Data Collector Set, which will store the metrics of your choice in a .blg file. That file can be opened with Windows Performance Monitor, so you will be able to see the counter values as zoomable charts.
If you need the data in a text format, e.g. CSV, you can use the Relog program to convert it.
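If a small console program is acceptable instead of a Data Collector Set, the same kind of counters can also be sampled directly from .NET and written straight to CSV. A minimal C# sketch, assuming an English-language Windows (counter names are localized); the file name and sampling interval are arbitrary:

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Threading;

class ResourceLogger
{
    static void Main()
    {
        // Counter names assume English Windows; localized systems use translated names.
        var cpu  = new PerformanceCounter("Processor", "% Processor Time", "_Total");
        var ram  = new PerformanceCounter("Memory", "Available MBytes");
        var disk = new PerformanceCounter("PhysicalDisk", "% Disk Time", "_Total");
        cpu.NextValue(); disk.NextValue(); // first reading of a rate counter is always 0

        using (var log = new StreamWriter("usage.csv"))
        {
            log.WriteLine("timestamp,cpu_percent,available_mb,disk_percent");
            for (int i = 0; i < 300; i++) // one sample per second for five minutes
            {
                Thread.Sleep(1000);
                log.WriteLine($"{DateTime.Now:O},{cpu.NextValue():F1},{ram.NextValue():F0},{disk.NextValue():F1}");
            }
        }
    }
}
```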
An alternative option, which might be easier, is the Servers Performance Monitoring plugin designed for use with Apache JMeter; this way you will be able to integrate Windows performance counter data into the performance test report. Moreover, it will work on any operating system supported by the underlying SIGAR libraries. See the How to Monitor Your Server Health & Performance During a JMeter Load Test article if interested.

WebSphere - built-in memory profiler

I have tried some Java profilers to profile WAS memory, but I guess due to the whole Sun/IBM Java split, they don't support WAS.
Is there any built-in way to profile memory, analyze the heap dump a bit more, or perhaps something in tracing and monitoring?
If not, I know of some products, but the thing is that we are in a closed environment: I can't just download and run something. So if there is anything that ships with WAS, I would like to know.
Thank you.
Check out the Health Center:
http://www.ibm.com/developerworks/java/jdk/tools/healthcenter/
It should be a reasonable fit for you!
I am not aware of any profiler that comes with WebSphere, but you can download the free IBM Support Assistant and, through it, download HeapAnalyzer, which is very good for analyzing heap dumps.
Personally, I have not tried v5 yet, but I have used the heap analyzer a lot.
There is no built-in way to do that, but IBM offers a graphical GC and heap analysis tool, which you will have to download from IBM.
Please have a look here:
https://www.ibm.com/developerworks/java/jdk/tools/gcmv/
The Eclipse Memory Analyzer can analyse both IBM and non-IBM heap dumps. It's also available in the IBM Support Assistant as a supported tool.

Performance Testing Tool That Can Produce a Graph

Does anybody know a good testing tool that can produce a graph of CPU cycles and RAM usage?
For example, I will run an application, and while it is running the testing tool will record the CPU cycles and RAM usage, producing a graph as output.
Basically, what I'm trying to test is how heavy a load an application puts on RAM and CPU.
Thanks in advance.
If this is Windows, the easiest way is probably Performance Monitor (perfmon.exe).
You can configure the counters you are interested in (such as Processor Time, Committed Bytes, etc.) and create a Data Collector Set that measures these counters at the desired interval. There are even templates for a basic System Performance Report, or you can add counters for the particular process you are interested in.
You can schedule when the sampling should run, and you will be able to view the result in PerfMon or export it to a file for further processing.
Video tutorial for the basics: http://www.youtube.com/watch?v=591kfPROYbs
A good sample showing how to monitor SQL Server:
http://www.brentozar.com/archive/2006/12/dba-101-using-perfmon-for-sql-performance-tuning/
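If you later want those counters inside your own tooling rather than in the PerfMon UI, the per-process counters mentioned above can also be read programmatically. A hedged C# sketch, where the process name "MyApp" is a placeholder for the application under test:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class ProcessMonitor
{
    static void Main()
    {
        // "MyApp" is a placeholder: use the process name as PerfMon shows it,
        // i.e. the executable name without the ".exe" extension.
        var cpu = new PerformanceCounter("Process", "% Processor Time", "MyApp");
        var mem = new PerformanceCounter("Process", "Working Set - Private", "MyApp");
        cpu.NextValue(); // the first reading of a rate counter is always 0

        while (true)
        {
            Thread.Sleep(1000);
            // "% Processor Time" is per core, so divide by the core count for a 0-100 scale.
            double cpuPct = cpu.NextValue() / Environment.ProcessorCount;
            double memMb = mem.NextValue() / (1024 * 1024);
            Console.WriteLine($"{DateTime.Now:O} cpu={cpuPct:F1}% mem={memMb:F0}MB");
        }
    }
}
```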
LoadRunner is the best I can think of, but it's very expensive too! Depending on what you are trying to do, there might be cheaper alternatives.
Any tool which can hook into the standard Windows or 'NIX system utilities can do this. This has been a de facto feature set on just about every commercial tool for the past 15 years (HP, IBM, Micro Focus, etc.). Some of the web-only commercial tools (but not all) and the hosted services offer this as well. For the hosted services you will generally need to punch a hole through your firewall so they can access the hosts for monitoring purposes.
On the open source front this is a totally mixed bag. Some have it, some don't. Some support one platform but not others (e.g. Windows but not 'NIX, or vice versa).
What tools are you using? It is unfortunately common for people to have performance tools in use and not be aware of their existing toolset's monitoring capabilities.
All of the major commercial performance testing tools have this capability, as well as a fair number of the open source ones. The ability to integrate monitor data with response time data is key to the identification of bottlenecks in the system.
If you have a commercial tool and your staff is telling you that it cannot be done then what they are really telling you is that they don't know how to do this with the tool that you have.
It can be done using JMeter: once you install the agent on the target machine, you just need to add the PerfMon monitor to your test plan.
It will produce two result files: the perfmon file and the requests log.
You could also build a plot that compares resource consumption to load and throughput. The throughput stops increasing when some resource's capacity is exceeded; in such a plot you would typically see CPU time increase as the load increases.
JMeter perfmon plugin: http://jmeter-plugins.org/wiki/PerfMon/
I know this is an old thread, but I was looking for the same thing today, and as I did not find anything that was simple to use and produced graphs, I made this helper program for apachebench:
https://github.com/juanluisbaptiste/apachebench-graphs
It will run apachebench and plot the results and percentile files using gnuplot.
I hope it helps someone.

How can I monitor disk access for a certain file?

I want to use performance monitors to determine when a file is being accessed (read/write). Is this possible? If not, is there any other way?
My OS is Windows Server 2008 R2, and I am writing the code in C#.
For what it's worth, you can use FileSystemWatcher to monitor writes to a specific file.
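A minimal sketch of that approach, with a placeholder directory and file name; note that FileSystemWatcher raises change (write) notifications but will not reliably tell you about reads:

```csharp
using System;
using System.IO;

class FileWatchDemo
{
    static void Main()
    {
        // Directory and file name are placeholders: point them at the file you care about.
        var watcher = new FileSystemWatcher(@"C:\data", "target.txt")
        {
            NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size
        };
        watcher.Changed += (s, e) =>
            Console.WriteLine($"{DateTime.Now:O} {e.ChangeType}: {e.FullPath}");
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching; press Enter to stop.");
        Console.ReadLine();
    }
}
```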
Unfortunately, I don't think there is an API available for doing this from managed code. If you need to hook file system read or write events, you should look into writing a filter driver. Filter drivers are pretty low-level constructs, and if this is only for performance monitoring, it's probably not worth it. That API is often used by anti-virus and backup/replication software developers.

DTS vs. SSIS vs. Informatica vs. PL/SQL Scripting

In the past, I have used Informatica for some ETL (Extraction Transformation Loading) but found it rather slow and usually replaced it with some PL/SQL scripts (was using Oracle at the time).
(questions revised based on feedback in answers)
I gather that DTS was Microsoft's ETL tool prior to SSIS.
Would it be difficult to convert an existing application using DTS to SSIS?
Given that SSIS is a Microsoft tool and tightly integrated with SQL Server (virtually a part of it), are there any drawbacks to using it? I don't foresee any efficiency issues, since I imagine that anything you could do for ETL without SSIS, you can also do within it.
I believe SSIS is Microsoft's ETL tool today, replacing DTS.
It's important to remember that ETL performance has as much to do with your schema and how you're doing the transfer as it does with the tool. For example, if you've got indexes in place, the load will run slower than if you do a bulk transfer and create the indexes after it's done. If you do one large batch all at once, you're creating rollback logs that grow in size and slow the process down. It could be that smaller batches will run faster, because the rollback log doesn't have to be as big.
Don't give in to the knee-jerk reaction and blame the tool. Look critically at how you're doing it to make sure that you're not shooting yourself in the foot.
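To make the batching point concrete, here is a hedged C# sketch using SqlBulkCopy; the connection string, data source, and target table are placeholders. BatchSize bounds how many rows each transaction commits, which keeps the log from ballooning:

```csharp
using System.Data;
using System.Data.SqlClient;

static class Loader
{
    // source and connectionString are placeholders supplied by the caller.
    public static void BulkLoad(IDataReader source, string connectionString)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.TargetTable"; // hypothetical target
            bulk.BatchSize = 10000;   // commit every 10,000 rows so each batch's log stays small
            bulk.BulkCopyTimeout = 0; // disable the timeout for long-running loads
            bulk.WriteToServer(source);
        }
    }
}
```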
That's correct: DTS was Microsoft's ETL tool prior to SSIS. While I have never seen DTS, I believe SSIS is much more user-friendly and GUI-based in comparison to DTS. Speaking of user-friendliness, my first experience with ETL was with Informatica, and I strongly believe that the user-friendliness of Informatica beats SSIS. Industry does recognize Informatica as much more stable and advanced than SSIS.
SSIS has its problems:
- It does not work with Excel correctly (because of mixed data types; a well-known problem).
- It does everything in memory, so you need a lot of memory, especially for sorting large files.
- You cannot specify which sorting algorithm to use. For example, it would be nice to be able to use a merge sort, because it does not require a lot of memory (see the sketch after this list).
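For illustration, here is a minimal C# sketch of the kind of bounded-memory external merge sort the last point wishes SSIS offered, sorting a large text file line by line; the chunk size and ordinal comparison are arbitrary choices:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

static class ExternalSort
{
    public static void Sort(string inputPath, string outputPath, int chunkLines = 100000)
    {
        var chunkFiles = new List<string>();

        // Phase 1: read fixed-size chunks, sort each in memory, spill to temp files.
        // Memory use is bounded by chunkLines regardless of total file size.
        using (var reader = new StreamReader(inputPath))
        {
            var buffer = new List<string>(chunkLines);
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                buffer.Add(line);
                if (buffer.Count == chunkLines)
                {
                    chunkFiles.Add(WriteSortedChunk(buffer));
                    buffer.Clear();
                }
            }
            if (buffer.Count > 0) chunkFiles.Add(WriteSortedChunk(buffer));
        }

        // Phase 2: k-way merge of the sorted chunks, one line per chunk in memory.
        var readers = chunkFiles.Select(f => new StreamReader(f)).ToList();
        var heads = readers.Select(r => r.ReadLine()).ToList();
        using (var writer = new StreamWriter(outputPath))
        {
            while (true)
            {
                int min = -1;
                for (int i = 0; i < heads.Count; i++)
                    if (heads[i] != null && (min == -1 || string.CompareOrdinal(heads[i], heads[min]) < 0))
                        min = i;
                if (min == -1) break; // all chunks exhausted
                writer.WriteLine(heads[min]);
                heads[min] = readers[min].ReadLine();
            }
        }
        readers.ForEach(r => r.Dispose());
        chunkFiles.ForEach(File.Delete);
    }

    static string WriteSortedChunk(List<string> lines)
    {
        lines.Sort(StringComparer.Ordinal);
        var path = Path.GetTempFileName();
        File.WriteAllLines(path, lines);
        return path;
    }
}
```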
Your information is badly out of date. The current Microsoft ETL tool is SQL Server Integration Services (SSIS).
