OSGI Integration Testing and Code Coverage

We have a desktop app deployed as OSGi bundles, and we have integration tests that exercise the bundles loaded in an OSGi container.
I am looking for a tool that calculates code coverage for integration tests against OSGi bundles.
Currently we are trying JaCoCo and Sonar, which are good for integration-test coverage in general, but we aren't sure whether they are good enough to handle OSGi integration-test coverage.
Are there any other tools available for calculating OSGi integration-test coverage?

Most, if not all, code coverage tools should work with OSGi. Their general strategy is to post-process the bytecode, injecting extra code that allows them to measure coverage. The biggest issue this causes is that the instrumented code now usually depends on extra code (the code coverage library). One option is to make those dependencies explicit (by adding Import-Package statements), just like any other dependency.
The other option is to add the code coverage library to your boot classpath so you don't need those extra imports. This breaks modularity, which is normally not something you want, but in this case it is irrelevant. Once you solve this problem, the rest is a matter of instrumenting the right bundles and aggregating the results of multiple test runs.
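As a sketch of how the agent-based route can look with JaCoCo in a Maven build (the version number and output file name here are assumptions, not taken from the question):

```xml
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.11</version>
  <executions>
    <execution>
      <id>prepare-agent-it</id>
      <!-- attaches the JaCoCo agent to the JVM that hosts the tests -->
      <goals><goal>prepare-agent</goal></goals>
      <configuration>
        <destFile>${project.build.directory}/jacoco-it.exec</destFile>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Depending on the container, you may additionally need to delegate the agent's runtime packages to the boot classloader (e.g., via the framework's org.osgi.framework.bootdelegation property) so that instrumented bundles can reach the JaCoCo runtime without explicit imports.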

We proceeded with the second approach and it worked: JaCoCo is able to provide test coverage for our OSGi integration tests and show it in the Sonar dashboard.

Related

In Maven, what is the difference between a unit test and an integration test?

I am adding basic regression testing to a Maven project that has no automated testing. My initial idea was to create a number of test classes, named IT<whatever>.java, to run in the integration-test phase. However, during packaging we do obfuscation and optimization, and I want to be sure that the tests run against the final JAR (or at least the final classes).
The thing is, I can't tell from reading the docs what the actual difference is between the two kinds of test. The docs mention that integration-test runs after package, which sounds promising, but the tests are excluded from the JAR, so it's unlikely they're running against the final artifact. Are they run against the packaged classes? Or is the only distinction between the two test types that they run in different phases of the build lifecycle?
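For reference, the usual way to bind IT*.java classes to the integration-test phase is the Failsafe plugin, which by default picks up classes matching IT*.java, *IT.java, and *ITCase.java and runs them after package (the version number below is an assumption):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>3.2.5</version>
  <executions>
    <execution>
      <goals>
        <!-- run ITs in integration-test, fail the build in verify -->
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Note that even with Failsafe, tests run against the classes on the test classpath (normally target/classes), not the packaged JAR, unless you explicitly put the JAR on the classpath instead.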

Writing a Sonar plugin to measure class usage

I need to write a Sonar plugin to keep track of the library classes that are used the most in a project.
So far I read the Coding a Plugin guide but I am a little bit confused. Does Sonar provide any facility to perform analysis (Something like parsing of Java code, creation of Abstract Syntax Trees, ...) or should I look for an external tool that does it and use Sonar only as a reporting tool?
Sonar provides a framework for publishing your own code analysis results into Sonar so that they are all in a single place. Although it does some analysis of its own, it mostly relies on other static code analysis tools and integrates them into the lifecycle; e.g., test coverage can be implemented by Cobertura or Clover.
It sounds to me, though, like you just want a measure of the afferent couplings, which can be configured for a single library. I am not sure how you would manage it for cross-library dependencies, as most of the plugins work by instrumenting the code at compile time, which is not possible for classes already in a JAR.
If you just want to generate an AST, then you should check out this question.

Maven Multi-Module plus Extrenal Tests plus JaCoCo plus Sonar

In our company we have several modules in the project, and each module has several unit tests, but we also have system tests that are based on classes, not modules. Our system tests use several classes from each module (not all). We cannot calculate integration coverage and unit coverage for these tests. We want to merge the results of the system tests to calculate coverage for the whole product.
Does anyone have an idea how we can do this? Can anyone provide a tutorial with examples?
You can find a sample application that reproduces this case here: https://github.com/SonarSource/sonar-examples/tree/master/projects/code-coverage/combined%20ut-it/combined-ut-it-multimodule-maven-jacoco
This should help you.
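The core of that sample is merging the per-run .exec files before reporting. A minimal sketch with the jacoco-maven-plugin merge goal (paths and the execution id are assumptions):

```xml
<execution>
  <id>merge-coverage</id>
  <phase>verify</phase>
  <goals><goal>merge</goal></goals>
  <configuration>
    <fileSets>
      <fileSet>
        <!-- collect every .exec produced by unit and system test runs -->
        <directory>${project.basedir}</directory>
        <includes><include>**/*.exec</include></includes>
      </fileSet>
    </fileSets>
    <destFile>${project.build.directory}/jacoco-merged.exec</destFile>
  </configuration>
</execution>
```

The merged file can then be fed to the report goal (or to Sonar) as a single coverage source covering the whole product.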

How do I configure maven-cobertura-plugin to instrument a dependency jar

I have classes in two modules. One of the modules contains some integration tests that exercise some classes from the other module. I would like my coverage reports to include classes from both modules but I can't find out how to configure the cobertura plugin so that it will instrument the other module's jar file.
I think that is not possible: the unit-test metrics for project A should be complete on their own, without executing anything from project B. Unit tests should be written to cover the code completely. However, you may consider re-using the testing code between A and B (see the test-jar goal of maven-jar-plugin).
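For completeness, sharing test code via test-jar looks roughly like this: the producing module attaches its tests as an artifact, and the consuming module declares a dependency of type test-jar (group/artifact ids here are hypothetical):

```xml
<!-- in the module that owns the tests -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <executions>
    <execution>
      <goals><goal>test-jar</goal></goals>
    </execution>
  </executions>
</plugin>

<!-- in the module that re-uses them -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>module-a</artifactId>
  <version>${project.version}</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>
```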

How can I boost the Integration build using Maven

I have around 20 modules that need to be built; how can I achieve this faster using Maven 2.0+?
Currently my build starts from the root pom.xml using Hudson,
which just calls mvn clean install.
I do not want to skip any of the unit tests either.
Currently it is taking almost an hour. Please advise.
Thanks
I wish I had a more "high-tech" suggestion, but I would start by splitting the tests into two groups:
"Fast" ones, which will be part of every build, and
"Slow" ones, which will be part of a "Slow" build that is run several times a day.
Usually, unit tests are fast and become part of the "Fast" build, while integration tests are part of the "Slow" build.
I would be careful with parallel tests: they can create more problems than they solve.
You can read more on integration test implementation in Maven here: http://docs.codehaus.org/display/MAVENUSER/Maven+and+Integration+Testing
What Sasha O said is really important. If your build takes so much time, maybe your tests are not simple unit tests, so splitting the tests into two categories is a good idea. Your fast tests should be real unit tests, which means they should not involve a Spring context, significant disk I/O, or a real database connection (you can use an in-memory database, such as H2 or HSQLDB).
If you are using TestNG for your tests, you can use the groups feature to split your tests. If you are using JUnit, the @Category annotation can be useful. With JUnit, @Category is not perfect, and it was a problem for me; I solved it by creating a custom JUnit Runner (that solution is not perfect either, but it can be helpful).
Another piece of advice: maybe you could consider using better hardware for your continuous integration server. Indeed, compilation and test execution can be really improved by a machine with better performance. Also, if you are not using a recent JDK, you may want to migrate to JDK 1.6, for example.
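As a sketch of wiring JUnit categories into the "fast" build with Surefire (the marker interface name com.example.FastTests is hypothetical; it would be an empty interface used in @Category annotations on your test classes):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- run only tests annotated with @Category(FastTests.class) -->
    <groups>com.example.FastTests</groups>
  </configuration>
</plugin>
```

The "slow" build would use excludedGroups instead (or a separate profile), so the two builds partition the same test suite without duplicating configuration.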
Finally, there is an interesting option with Maven since version 2.1: the incremental build. This option is available when you click on the "Advanced" button of the Maven build options if you are using Hudson / Jenkins.
The principle is to build only the modules that are impacted by your changes (i.e. commits). Let's take an example. I have this project:
project
+- commons
+- persistence
+- business
The business project depends on persistence, which depends on commons. Now, if you commit a change to the persistence project, why would you need to compile, test, and package commons, as that project did not change? The incremental build option will rebuild only the modified module, as well as the modules that depend on it. In my example, persistence will be rebuilt, as well as business.
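The same principle can be applied by hand with Maven's reactor options (the module name matches the example above):

```shell
# rebuild only the persistence module plus the modules that depend on it
# -pl  selects the project(s) to build
# -amd ("also make dependents") pulls in business, but not commons
mvn -pl persistence -amd clean install
```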
