Profiling visualization tools? - performance

I need to display profiling information pulled from a deeply embedded CPU, presenting it in a way which other developers on my team will be able to act upon. The profiling data is a snapshot of a cycle counter at the entry and exit of every function, so we have a call graph annotated with sub-microsecond timing accuracy. I'd prefer not to just dump out function names and timing like gprof, I'm looking for something easier to understand and act upon.
Has anyone worked with a particularly good profiling tool (on any platform) which made it easy to identify areas of the code to drill into? I'm looking for an inspirational example to follow for how to display the call graph, but if there is a good tool with an input format I can massage my data into, I'll use it. I could use Windows, Linux, or Mac OS X to run the visualization tool.
A profiling article on IBM DeveloperWorks led me to GraphViz, with a profiling example on their site. Barring another suggestion here, I'll use GraphViz and mimic their profiling example.
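To give a concrete idea of the massaging involved, here is a rough sketch of the direction I'm considering; the trace format (function, enter/exit, cycle count) and the clock rate are made up for illustration, not our actual instrumentation:

```python
# Rough sketch: pair entry/exit cycle-counter events into a call graph
# annotated with inclusive time, then emit a Graphviz DOT description.
# The event format (func, 'enter'|'exit', cycles) is hypothetical.
from collections import defaultdict

CPU_HZ = 200e6  # assumed clock rate of the embedded CPU

def trace_to_dot(events):
    stack = []                      # (function name, entry cycle count)
    func_cycles = defaultdict(int)  # inclusive cycles per function
    edge_calls = defaultdict(int)   # caller -> callee call counts
    for func, kind, cycles in events:
        if kind == 'enter':
            if stack:
                edge_calls[(stack[-1][0], func)] += 1
            stack.append((func, cycles))
        else:  # 'exit'
            name, entered = stack.pop()
            func_cycles[name] += cycles - entered
    lines = ['digraph callgraph {']
    for name, cyc in func_cycles.items():
        lines.append('  "%s" [label="%s\\n%.1f us"];' % (name, name, cyc / CPU_HZ * 1e6))
    for (caller, callee), n in edge_calls.items():
        lines.append('  "%s" -> "%s" [label="%dx"];' % (caller, callee, n))
    lines.append('}')
    return '\n'.join(lines)

if __name__ == '__main__':
    demo = [('main', 'enter', 0), ('read_sensor', 'enter', 10),
            ('read_sensor', 'exit', 250), ('main', 'exit', 400)]
    print(trace_to_dot(demo))
```

The output can be fed straight to dot to produce the kind of annotated graph the DeveloperWorks example shows.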

Another neat tool to visualize profiling data is the gprof2dot.py python script.
It can be used to visualize several different formats: "This is a Python script to convert the output from prof, gprof, oprofile, Shark, AQtime, and python profilers into a dot graph."

I use Kprof
http://kprof.sourceforge.net/
It is old, but I have never found a better tool for inspecting the results from gprof.

How about "GTKWave"?
But you have to insert the probes into your code yourself.

Valgrind does profiling (and more), and there are GUIs for visualization.

I suggest you drop gprof+graphviz for OProfileUI, unless you don't have a choice.

JetBrains dotTrace (has a trial demo you can play with). It organizes the call stacks and can easily find the trouble spots. Has a lot of filtering capabilities as well. Very easy to navigate and find what you're looking for.

IE 8b2 offers a simple display of the call tree for JavaScript that I believe is much more useful than the GraphViz chart.
The GraphViz chart is wonderful for visualizing the call tree but makes it very difficult to visualize timing issues (IMHO the more important data).
Edit: I think it is worth pointing out that all of the tools suggested use a grid-based tree to visualize the call tree. This allows you to see the calling structure without downplaying the timing data, as I believe the GraphViz chart does.

You can use Senseo, a plugin for Eclipse. It shows you performance, memory allocation, objects created and time spent, and the actual methods invoked; it annotates method signatures and calls on hover, and provides a call context tree, package explorer integration, and more.

I've written a browser-based visualization tool, profile_eye, which operates on the output of gprof2dot.
gprof2dot is great at grokking many profiling-tool outputs, and does a great job at graph-element placement. The final rendering is a static graphic, which is often very cluttered.
Using d3.js it's possible to remove much of that clutter, through relative fading of unfocused elements, tooltips, and a fisheye distortion.
For comparison, see profile_eye's visualization of the canonical example used by gprof2dot.

Related

how to extract an object from an image

I want to extract an object such as a man, a car, or something like that from an image. The image is just an ordinary image, not a medical image or another type intended for a specific purpose.
I have searched for a long time and found that automatic image segmentation algorithms just segment the image into a set of regions or give the contours in the image, not a semantic object. So I turned to interactive image segmentation algorithms and found some popular ones like interactive graph cuts and SIOX. I think these algorithms meet my needs.
Furthermore, I also downloaded two interactive image segmentation tools: the first is the interactive segmentation tool, the second is the interactive segmentation tool-box.
So my questions are:
1. Is an interactive image segmentation algorithm the right solution for my task, given that performance is the most important factor?
2. If I want to use an automatic image segmentation algorithm, what should I do next?
Any suggestion will be appreciated.
If you want to pick out an object from a single static image with just a few scribbles, I recommend you have a read of
'Closed-form solution to image matting',
'Spectral matting',
or 'Lazy snapping',
though in my tests the last doesn't perform as well as the first two methods when dealing with subtle objects like hair.
You can find their MATLAB source code very easily via Google.
The first two methods aren't so pleasant to use in practice, though; I think you'll need to do a lot of modification to make them easy to use. Their main problem, IMHO, is that they require very careful scribbles on the image: if you draw some extra scribbles, or draw them in the wrong positions, you'll ruin your object cutout.
Apart from these, you may try Bayesian matting, Poisson matting, etc., which all require a helper image called a trimap, and that is really hard to draw.
Extracting objects from an image, especially an ordinary photograph, is not as easy as you might think; you may want to take a look at the OpenCV project.
OpenCV
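One concrete option in OpenCV is its GrabCut implementation, an interactive, graph-cut-based segmenter driven by a rough bounding box (and optionally scribbles). A minimal sketch; the file name and rectangle coordinates are placeholders:

```python
# Minimal GrabCut sketch with OpenCV: segment the object inside a
# user-supplied rectangle (image path and coordinates are placeholders).
import cv2
import numpy as np

img = cv2.imread('photo.jpg')                 # any ordinary photo
mask = np.zeros(img.shape[:2], np.uint8)      # GrabCut working mask
bgd_model = np.zeros((1, 65), np.float64)     # internal background model
fgd_model = np.zeros((1, 65), np.float64)     # internal foreground model
rect = (50, 50, 300, 400)                     # rough box around the object

cv2.grabCut(img, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

# Pixels marked as sure or probable foreground form the extracted object.
fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype('uint8')
cut_out = img * fg[:, :, np.newaxis]
cv2.imwrite('object.png', cut_out)
```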
Other than OpenCV, I would suggest looking at ITK. It is very popular in medical image analysis projects, because there it is known that semi-automatic segmentation tools provide the best results. I think that the methods apply to natural images as well.
Try looking at tools like livewire segmentation and level-set based image segmentation. ITK has several demos that let you play with these tools on your own images. Demo applications like these are part of the open source distribution, and they can also be downloaded directly from the ITK servers (look around for instructions).
If this is a business case, you'd better look for companies specializing in "video content analysis". I mean it: reliable people and vehicle detection is not a one-person project.
General-purpose segmentation tools won't do the trick because they have no notion of what a man or a car looks like. All they are designed to do is find uniform regions in an image.
It is quite late, but there is an algorithm called connected component labeling which you may find useful.
Here is the Wikipedia link for the algorithm.
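As a quick illustration (assuming you already have a binary foreground mask, which is the hard part of this task), SciPy's ndimage module can do the labeling:

```python
# Connected-component labeling sketch: given a binary foreground mask,
# label each connected blob and measure its size in pixels.
import numpy as np
from scipy import ndimage

mask = np.array([[0, 1, 1, 0, 0],
                 [0, 1, 0, 0, 1],
                 [0, 0, 0, 1, 1]])

labels, num = ndimage.label(mask)                      # each blob gets a distinct label
sizes = ndimage.sum(mask, labels, range(1, num + 1))   # pixel count per blob
print(num, 'components, pixel counts:', sizes)
```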

How to do algorithm visualization?

I am looking for an algorithm visualization library/tool that is well documented and you can call from your source code.
I took a look at jhave - example of usage. I liked it; it seems to have some documentation, but I do not trust its future.
I found this article about Algorithm Explorer; it has a nice idea. It is implemented as a C++ API, but I cannot find it anywhere.
My main idea is that I want to do some unit tests for the brain.
So I construct various exercises, and in the future, when I want to test my knowledge, I redo them.
I found that images stick with me longer, so that is why I want to visualize algorithms in certain states. (I might remember a tricky case better, like what happens when the data is sorted in reverse and I use quicksort, if I view it.)
An ideal tool:
1. Has to integrate with any language.
2. Has to be well documented with a growing community and examples.
3. Should be implemented on top of a capable rendering engine (Ogre, XNA).
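To make the idea concrete, this is roughly the kind of state capture I have in mind; the rendering side would then draw each snapshot (plain Python, purely a sketch of what I'd want visualized, not tied to any particular library):

```python
# "Unit tests for the brain": record quicksort's intermediate states so
# each snapshot can later be rendered by a visualization tool.
def quicksort(a, lo, hi, snapshots):
    if lo >= hi:
        return
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    snapshots.append((list(a), i))   # state after each partition step
    quicksort(a, lo, i - 1, snapshots)
    quicksort(a, i + 1, hi, snapshots)

data = [5, 4, 3, 2, 1]               # the tricky reverse-sorted case
states = []
quicksort(data, 0, len(data) - 1, states)
for arr, pivot_pos in states:
    print(arr, 'pivot settled at index', pivot_pos)
```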
Here is the place you need to visit: The Algorithm Visualization Portal!

3d modeling for data structures

I'm looking for 3D modeling/animation software. Honestly, I don't know if this is something achievable - but what I want to have is some kind of visual representation of various ideas.
Speaking in future tense: if I were to read about the boot process of an OS, I would visualize the various data structures building up, and I could step through the process with a sliding bar or so. If I were to think about a complex data structure, I would have a 3D representation of the various links and relations within it. Another would be a Git repository at work - how commits/trees/blobs are linked in space, and how they progress as time passes. And all of these would be interactive.
The reason I want to do this is that it'd be very easy to explain the process, not just to others, but also to myself. I can revisit my model, and it'd be a quick brush-up.
I'm sure there is no ready-to-use software for this. What I could think of are Flash with ActionScript, Blender 3D (Python scripting?), or Synfig. Whatever it is, I'll have to learn it from scratch, and I'm looking for suggestions as to which (even one not on my list) is the right one to choose.
Thanks
I've used Blender, but it requires a large upfront investment of time, especially to learn the UI. Blender is all about the hotkeys. Once you have them memorized, it's great. But getting there takes a while.
Alice might be worth a look. It looks easy to use and supports scripting.
There are many tools available for 3D modeling. I'm a fan of 3D Studio Max, but there are also Blender, Maya, and trueSpace.
You may want to take a look at the field of visualization to help with illustrating your message.
I suspect that packages such as 3D Studio Max and Blender are too powerful, in the sense that your relatively simple requirements will force you onto too long a learning path. Try Googling for "data structure animations" to get an idea of what others have used. Also, head over to Information Aesthetics; they recently featured a tool for visualising commits and checkouts to/from repositories, and similar things.
My favourite is very nearly the Lego Designer: very good for 3D block animations, but so far I haven't figured out how to add text to the blocks.

Tools/Techniques to use our ability to think spatially

What software/UI techniques can leverage our spatial memory? I think and remember in physical space; often the location of something is as important as its content. For instance, I keep an untidy desk, but I know where to find things, and I use different parts of my (multiscreen) desktop for different windows/icons. I annotate books (with post-its) and can remember the facing page, top/bottom, etc. In the good old days we used to file things so we could find them later; now we use search, but that doesn't really use our spatial abilities. Google Maps etc. are brilliant, but they're only really being used for the real world - what about our internal locations? How can we leverage the wetware to best advantage?
EDIT -> I've thought about a code tool that would profile the running code and then build a visualisation with classes/methods scaled to match their use, with large/small motorways/footpaths between them. Spatial layout still escapes me though - UI at the top, DB at the bottom, but how do you position a class in 3D based on its usage?
Slightly off topic, since it's not code per se, but I've built my own tools to translate some of our complicated XML config files into DOT format and run them through Graphviz so that I could visualise them. We were able to strip out lots of pointless stuff from them after just looking at them.
Wetware win :o)

Calculate code metrics [closed]

Are there any tools available that will calculate code metrics (for example number of code lines, cyclomatic complexity, coupling, cohesion) for your project and over time produce a graph showing the trends?
On my latest project I used SourceMonitor. It's a nice free tool for code metrics analysis.
Here is an excerpt from SourceMonitor official site:
Collects metrics in a fast, single pass through source files.
Measures metrics for source code written in C++, C, C#, VB.NET, Java, Delphi, Visual Basic (VB6) or HTML.
Includes method and function level metrics for C++, C, C#, VB.NET, Java, and Delphi.
Saves metrics in checkpoints for comparison during software development projects.
Displays and prints metrics in tables and charts.
Operates within a standard Windows GUI or inside your scripts using XML command files.
Exports metrics to XML or CSV (comma-separated-value) files for further processing with other tools.
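Since it exports to CSV, producing the trend graph over checkpoints is a short script away. A rough sketch; the file and column names below are placeholders and should be checked against the actual export format:

```python
# Sketch: plot a metric trend from a sequence of per-checkpoint CSV exports.
# File names and column names are placeholders; adjust them to the real export.
import csv
import matplotlib.pyplot as plt

checkpoints = ['metrics_2009_01.csv', 'metrics_2009_02.csv', 'metrics_2009_03.csv']
loc_per_checkpoint = []

for path in checkpoints:
    with open(path, newline='') as f:
        rows = list(csv.DictReader(f))
        loc_per_checkpoint.append(sum(int(r['Lines']) for r in rows))

plt.plot(range(len(checkpoints)), loc_per_checkpoint, marker='o')
plt.xticks(range(len(checkpoints)), checkpoints, rotation=45)
plt.ylabel('Total lines of code')
plt.title('Code size trend across checkpoints')
plt.tight_layout()
plt.show()
```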
For .NET, besides NDepend, which is simply the best tool, I can recommend vil.
The following tools can perform trend analysis:
CAST
Klocwork Insight
Sonar is definitely a tool that you must consider, especially for Java projects. However, it will also handle PHP, C/C++, Flex, and Cobol code.
Here is a screenshot that shows some metrics on a project:
http://sonar.codehaus.org/wp-content/uploads/2009/05/squid-metrics.png
Note that you can try the tool by using their demo site at http://nemo.sonarsource.org
NDepend for .net
I was also looking for a code metrics tool/plugin for my IDE, but as far as I know there are none (for Eclipse, that is) that also show a graph of the complexity over a specified time period.
However, I did find the Eclipse Metrics plugin; it can handle:
McCabe's Cyclomatic Complexity
Efferent Couplings
Lack of Cohesion in Methods
Lines Of Code in Method
Number Of Fields
Number Of Levels
Number Of Locals In Scope
Number Of Parameters
Number Of Statements
Weighted Methods Per Class
And while using it, I didn't actually miss the graphing option you are seeking.
I think that if you don't find any plugins/tools that can handle the graphing over time, you should look at the tool that suits you best and offers all the information you need, even if that information is only for the current build of your project.
As a side note, the Eclipse Metrics plugin allows you to export the data to an external file (the link goes to an example), so if you use a source control tool - and you should! - you can always export the data for a specific build and store the file along with the source code. That way you still have a (basic) way to go back in time and check the differences.
Keep in mind: what you measure is what you get. LOC says nothing about productivity or efficiency.
Rate a programmer by lines of code and you will get... lines of code.
The same argument goes for other metrics.
OTOH, http://www.crap4j.org/ is a very conservative and useful metric. It sets complexity in relation to coverage.
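For reference, the CRAP score that crap4j reports combines cyclomatic complexity and test coverage roughly as follows; this is a quick Python rendering of the published formula, offered as a sketch rather than a reimplementation of the tool:

```python
# CRAP score as described by the crap4j project: combines cyclomatic
# complexity with test coverage, so complex *and* untested methods score worst.
def crap(complexity, coverage_percent):
    cov = coverage_percent / 100.0
    return complexity ** 2 * (1.0 - cov) ** 3 + complexity

# A complex, untested method vs. the same method fully covered.
print(crap(10, 0))    # 110.0 -> risky
print(crap(10, 100))  # 10.0  -> complexity alone
```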
NDepend - I am using it, and it's the best for this purpose.
Check this:
http://www.codeproject.com/KB/dotnet/NDepend.aspx
Concerning the tool NDepend, it comes with 82 different code metrics, from Number of Lines of Code to Method Rank (popularity), Cyclomatic Complexity, Lack of Cohesion of Methods, Percentage Coverage (extracted from NCover or VSTS), Depth of Inheritance...
With its rule system, NDepend can also find issues and estimate technical debt, which is an interesting code metric (amount of dev effort needed to fix problems vs. amount of dev time lost per year by leaving the problems unfixed).
All these metrics are detailed here.
If you're in the .NET space, Developer Express' CodeRush provides LOC, Cyclomatic Complexity and the (rather excellent, IMHO) Maintenance Complexity analysis of code in real-time.
(Sorry about the Maintenance Complexity link; it's going to Google's cache. The original seems to be offline ATM).
Atlassian FishEye is another excellent tool for the job. It integrates with your source control system (currently supports CVS, SVN and Perforce), and analyzes all your files that way. The analysis is rather basic though, and the product itself is commercial (but very reasonably priced, IMO).
You can also get an add-on for it called Crucible that facilitates peer code reviews.
For Visual Studio .NET (at least C# and VB.NET) I find the free StudioTools to be extremely useful for metrics. It also adds a number of features found in commercial tools such as ReSharper.
Code Analyzer is a simple tool which generates these kinds of metrics.
For Python, pylint can provide some code quality metrics.
There's also a code metrics plugin for reflector, in case you are using .NET.
I would recommend the Code Metrics Viewer Extension for Visual Studio.
It is very easy to analyze a solution at once, and also to compare results to see whether you have made progress ;-)
Read more about the features here.
On the PHP front, I believe for example phpUnderControl includes metrics through phpUnit (if I am not mistaken).
Keep in mind that metrics are often flawed. For example, a coder who's working on trivial problems will produce more code, and therefore look better on your graphs, than a coder who's cracking the complex issues.
If you're after some trend analysis, does it really mean anything to measure beyond SLOC?
Even if you're just doing a grep for trailing semicolons and counting the number of lines returned, what you are after is consistency in the SLOC measurement technique. That way, today's measurement can be compared with last month's measurement in a meaningful way.
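Even something as crude as the following is enough for a trend, as long as you apply the same rules every time; the directory and file extensions are placeholders:

```python
# Crude but consistent SLOC: count lines ending in a semicolon across a
# source tree. The point is repeatability, not accuracy.
import os

def sloc(root, extensions=('.c', '.h', '.cpp')):
    total = 0
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name.endswith(extensions):
                with open(os.path.join(dirpath, name), errors='ignore') as f:
                    total += sum(1 for line in f if line.rstrip().endswith(';'))
    return total

print(sloc('src'))  # run with the same rules each month and compare
```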
I can't really see what a trend of McCabe cyclomatic complexity would give. I think that CC should be used more as a snapshot of quality to provide feedback to the developers.
Edit: Ooh. Just thought of a couple of other measurements that might be useful: comments as a percentage of SLOC, and test coverage. Neither of these is something you want to let slip. Coming back to retrofit either of them is never as good as doing them "in the heat of the moment!"
HTH.
cheers,
Rob
Scitools' Understand does have the capability to generate a lot of code metrics for you. I don't have a lot of experience with the code metrics features, but the static analysis features in general were nice and the price was very reasonable. The support was excellent.
Project Code Meter gives a differential development history report (in Excel format) which shows your coding progress metrics in SLOC, time, and productivity percentage (its time estimation is based on cyclomatic complexity and other metrics). Then in Excel you can easily produce the graph you want.
See this article, which describes it step by step:
http://www.projectcodemeter.com/cost_estimation/help/FN_monsizing.htm
For Java, you can try our tool, QualityGate, which computes more than 60 source code metrics, tracks all changes through time, and also provides an overall rating for the maintainability of the source code.
