We have a number of unit tests written using IID for the modules we've developed, and we want them to run on our CI server.
We use Maven for builds and JUnit to run the tests. Is there a way to mavenise BPM tests and run them via JUnit? If not, how could we implement a build and deploy to our CI server?
Thanks
Actually, you can. Have a look at http://www.ibm.com/developerworks/bpm/bpmjournal/1412_cai/1412_cai.html .
The solution you are looking for is called the IBM Business Process Manager Testing Asset.
There is one catch: you have to contact IBM Software Services for WebSphere to get it.
I recently started creating jacoco reports for maven projects, covering unit and integration tests, and they seem to work correctly.
Now I have encountered a different scenario which I am not sure how to approach.
I have one workspace that contains the integration test cases (application A), but the source code does not live in the same workspace/code base. The code that actually runs when these integration tests are invoked is in a different workspace/code base (application B); the tests invoke it via REST API calls to localhost URLs, with a JBoss server started for application B so that the localhost context is up.
The aim is to invoke these integration tests from application A, which in turn exercises the code in application B, and to generate a jacoco report of the code coverage for application B.
I am not actually sure how to achieve this.
Can someone provide some input?
Thanks.
If I understand you correctly, you actually have 2 different processes in your scenario:
The "client" process that runs the integration tests, to which jacoco can easily be applied, but that is not the coverage you need
The "server" process that runs the actual JBoss server and executes the actual code.
The client process contacts the server via HTTP.
In this case, I'm afraid jacoco won't be able to provide coverage for you if you're running the tests from maven/gradle, because jacoco only instruments bytecode on the JVM it runs in. So you have to be "creative" here :)
I'll list some possible approaches here.
Disclaimer: I haven't tried them myself (I didn't work with jboss/java ee), but maybe you'll be able to at least borrow some ideas.
The first approach would be running the tests together with the application somehow, like it's done, for example, in Spring tests (I'm not sure whether JBoss provides similar capabilities).
The idea is simple:
You run the integration test, it runs JBoss "embedded in the same JVM", and you can inject beans / EJB session beans into the test (like autowiring with Spring).
The advantage of such a method is that you'll be able to just use the jacoco maven plugin and it will instrument everything for you.
I don't know how easy achieving this architecture will be technically. I know that recent JBoss versions support an embedded mode, so maybe you'll find this link to be a useful foundation.
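To make the idea concrete, here is what the Spring flavour of "test and application in one JVM" looks like. This is only an illustrative sketch; AppConfig, CheckoutService and placeOrder are made-up names, not anything from your project, and the JBoss embedded equivalent would look different:

    import static org.junit.Assert.assertNotNull;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.test.context.ContextConfiguration;
    import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

    @RunWith(SpringJUnit4ClassRunner.class)
    @ContextConfiguration(classes = AppConfig.class) // hypothetical application config
    public class CheckoutServiceIT {

        @Autowired
        private CheckoutService checkoutService; // injected straight into the test, same JVM

        @Test
        public void placesAnOrder() {
            // because everything runs in one JVM, the jacoco maven plugin sees this call
            assertNotNull(checkoutService.placeOrder("item-42"));
        }
    }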
Another direction is to take a look at the Arquillian project. They have a jacoco extension that will probably help, but I've never tried it.
And the last approach I can think of is running the JBoss server with the jacoco agent directly, instead of relying on the build system to run jacoco for you.
The idea here is to stream the results of covered server code into some file / tcp endpoint. So you run JBoss with -javaagent:[yourpath/]jacocoagent.jar and it starts streaming the results wherever you need them streamed. After the tests you should gather these results and prepare a report. You can find more information about this approach here.
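As a rough sketch of that last approach (the paths, the standalone.sh launcher and running the report from application B's project are assumptions you would adapt to your setup):

    # start JBoss (application B) with the jacoco agent writing coverage to a file
    export JAVA_OPTS="$JAVA_OPTS -javaagent:/path/to/jacocoagent.jar=destfile=/tmp/jacoco-it.exec,output=file"
    ./bin/standalone.sh &

    # run the integration tests from application A against localhost
    mvn verify

    # stop JBoss so the agent flushes /tmp/jacoco-it.exec, then build the report
    # from application B's project, pointing the report goal at the exec file
    mvn org.jacoco:jacoco-maven-plugin:report -Djacoco.dataFile=/tmp/jacoco-it.exec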
I need a few answers to clear up my doubts:
pact-mock-service vs pact-jvm-server: are the two the same? Please describe this.
I am implementing Pact in Java with Maven.
I am able to run these:
https://github.com/anha1/microservices-pact-maven
https://github.com/warmuuh/pactbroker-maven-plugin
Help me understand how these relate to pact-mock-service and pact-jvm-server.
Pact-mock-service is a general mock server built into the pact libraries to support mocking out the other dependency in an integration during a consumer test. If you use any of the consumer test support libraries, you do not need to use it directly.
pact-jvm-server is a controllable server that bundles the pact-mock-service and allows you to set up and tear down mock servers via HTTP requests. It exists for people who cannot, or do not wish to, use the consumer test support libraries.
For people using Maven, there is a plugin provided as part of the pact-jvm project that can run provider verification tests and publish to a pact broker. The consumer tests just run as JUnit tests, so you don't need any Maven-specific plugin for those.
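For reference, the provider-side wiring looks roughly like this. This is a sketch only: the artifact suffix and version depend on your pact-jvm/Scala version, and my-provider plus the port are illustrative values:

    <plugin>
      <groupId>au.com.dius</groupId>
      <artifactId>pact-jvm-provider-maven_2.11</artifactId>
      <version>3.5.24</version>
      <configuration>
        <serviceProviders>
          <serviceProvider>
            <name>my-provider</name>
            <protocol>http</protocol>
            <host>localhost</host>
            <port>8080</port>
            <!-- verify against pact files from a local directory (a broker can be configured instead) -->
            <pactFileDirectory>target/pacts</pactFileDirectory>
          </serviceProvider>
        </serviceProviders>
      </configuration>
    </plugin>

With that in place, mvn pact:verify runs the provider verification against the running provider; publishing to a broker is handled by the plugin's publish goal, configured separately.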
Of the two links you posted, the first is an example project using a spring-boot application, and the second is a maven plugin that only provides publishing to a pact broker.
How can maven be configured to support this type of workflow:
One-time setup: invoke Maven to do a one-time setup of a developer's machine, such as:
Create a custom version of Tomcat configured for this application
Create a local Postgres database on the developer's machine
Load sample data into the database
Run a JUnit test to configure other resources needed to run the application
Integration tests: invoke Maven to run the integration tests, which should do the following:
Create an integration test db
Set up the db
Run command-line integration tests against the db
Run a test version of Tomcat with the application in it
Run command-line JUnit tests that test the RESTful services exposed by the application
Release build: invoke Maven to do a release build of the system:
Do all the steps for an integration test
Generate resources and configurations that are used on the server rather than production
Deposit the end result in a git repo, commit, and push the changes to production
Test build: invoke Maven to do a test build of the system:
Do all the steps of a release build, but configure the test release package with the test server configuration
The main thing I am struggling with is that Maven has a single build life-cycle with a well-defined sequence of phases, and I am not sure if the workflow I want to build is a good fit for it.
Can Maven be configured for this type of workflow? If yes, what are the key features of Maven that allow for the different configurations of the four main ways in which I want to use it?
Update: What I mean by this workflow is that I want to be able to do something like:
mvn setup
mvn integration
mvn prod-release
mvn test-release
I know the above examples look like Ant; I am a long-time Ant user and a total noob with Maven.
You could set up Maven to do all that...
You probably would use (shock horror) profiles to achieve some of this...
BUT you don't want to do that.
You are following Ant-style thinking... if you like that style of thinking then use Ant or Gradle and be happy.
If you want to follow the Maven way, then you will solve the problem differently.
Coming from the Maven way, here are my thoughts:
Why do you need one-time setup? I usually have a run profile that dynamically provisions the correct application server and starts it with the app deployed, tearing down everything afterwards when I hit ^C. Typically this involves starting up a database server or two... hence things I have developed like the cassandra-maven-plugin. That way, when I am working on a different project (which could be in 10 minutes' time), I don't have to worry about background database servers eating up all my laptop's RAM. A sketch of such a profile is below.
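One way such a run profile might look, using the Cargo plugin to provision Tomcat on the fly (the container id, versions and profile name here are illustrative choices, not part of the original answer):

    <profile>
      <id>run</id>
      <build>
        <plugins>
          <plugin>
            <groupId>org.codehaus.cargo</groupId>
            <artifactId>cargo-maven2-plugin</artifactId>
            <version>1.7.7</version>
            <configuration>
              <container>
                <containerId>tomcat8x</containerId>
                <!-- cargo downloads and installs the container itself, so
                     nothing is installed permanently on the developer's machine -->
                <artifactInstaller>
                  <groupId>org.apache.tomcat</groupId>
                  <artifactId>tomcat</artifactId>
                  <version>8.5.50</version>
                </artifactInstaller>
              </container>
            </configuration>
          </plugin>
        </plugins>
      </build>
    </profile>

Running mvn -Prun org.codehaus.cargo:cargo-maven2-plugin:run then blocks with the webapp deployed until you hit ^C, at which point Cargo stops and removes the container.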
Integration tests are actually trivial when you have the above working... in fact I created the Maven Failsafe Plugin to make it easy to have plugin execution tied to the appropriate phases for integration testing. The Maven convention is to have a profile called run-its for running integration tests, along the lines of the sketch below.
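A minimal run-its profile might look like this (the plugin version is an illustrative assumption):

    <profile>
      <id>run-its</id>
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-failsafe-plugin</artifactId>
            <version>2.22.2</version>
            <executions>
              <execution>
                <goals>
                  <!-- integration-test runs the *IT tests; verify fails the build
                       afterwards, so the server can still be torn down cleanly -->
                  <goal>integration-test</goal>
                  <goal>verify</goal>
                </goals>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </profile>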
Release builds being different from test builds... ugh! You should be building environment-agnostic artifacts. Have them pick up their configuration from the environment they are deployed in. That removes the worry that something has changed between the "test" build and the "production" build. If you really need to bundle the config, then I usually resort to a separate module that takes the agnostic artifact and rebundles it with the required configuration. That way it is easy to prove that you have a reproducible transformation and that nothing has changed in between what went to QA vs what is going to Ops.
I always make the release builds include the integration testing.
So typically I have my projects such that
$ mvn -Prun
will fire up the application starting from zero. Hitting ^C will tear everything back down again, and mvn clean (or, in extreme situations where I have a more complex setup process and need some caching, mvn post-clean, think really clean) will remove anything that the run profile put into play.
To run the integration tests I typically do
$ mvn -Prun-its verify
To make a release I typically do
$ mvn release:prepare release:perform -B
That is (in my view) the ideal way of handling the above steps you need.
HTH.
BTW, I have not had to use PostgreSQL specifically (typically my integration tests and run profile can get away with a pure Java database such as Derby or HSQLDB, and because the artifacts are environment agnostic it is easy to have the integration-test/dev flyweight app server inject the correct JDBC URL), so you may hit some issues with regard to PostgreSQL.
I would like to set up an infrastructure for integration testing.
Currently we bootstrap Tomcat using maven and then execute HttpUnit tests.
But the current solution has a few drawbacks.
Any changes committed to the database need to be rolled back manually at the end of the test.
Running code coverage on integration tests is not straightforward (we are using Sonar).
My goals are:
Allow automatic rollback between tests (hopefully using Spring's @Transactional and @Rollback)
Simple, straightforward code coverage
Using @RunWith so that the system is bootstrapped from JUnit and not externally
Interacting with live servlets and JavaScript (I am considering switching from HttpUnit to Selenium...)
Reasonable execution time (at least not longer than the existing execution time)
The goals above look reasonable to me and common to many Java/J2EE projects.
I was thinking of achieving those goals by using Arquillian and the Arquillian Spring Framework Extension.
See also https://github.com/arquillian/arquillian-showcase/
Does anyone have any experience with Arquillian and with the Arquillian Spring Framework Extension?
Can you share issues, best practices and lessons learned?
Can anyone suggest an alternative approach to the above?
I can't fully answer your question, only offer some tips.
Regarding the automatic rollback: in my case, I use Liquibase to init the test data on HSQLDB or H2, which can be run in-memory. Then there is no need to roll back, since every test run starts from a fresh database.
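As an illustration, a Spring test configuration along these lines (TestDbConfig and the changelog path are made-up example names):

    import javax.sql.DataSource;
    import liquibase.integration.spring.SpringLiquibase;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jdbc.datasource.SimpleDriverDataSource;

    @Configuration
    public class TestDbConfig {

        @Bean
        public DataSource dataSource() {
            // in-memory H2 database; it disappears with the JVM, so no rollback is needed
            return new SimpleDriverDataSource(new org.h2.Driver(),
                    "jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1", "sa", "");
        }

        @Bean
        public SpringLiquibase liquibase(DataSource dataSource) {
            // runs the changelog against the fresh database when the test context starts
            SpringLiquibase liquibase = new SpringLiquibase();
            liquibase.setDataSource(dataSource);
            liquibase.setChangeLog("classpath:db/changelog.xml");
            return liquibase;
        }
    }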
For Arquillian: it's a good approach to real testing. What I learned is that the "Arquillian Spring Framework Extension" is just an extension; you have to bind to a specific container like JBoss, GlassFish or Tomcat to make the tests run.
But I don't know how to apply it to a Spring-based Java SE program which does not need application server support.
My lesson learned is about a JBoss port conflict: jboss-dist sets 8080 as the default HTTP port, but our company proxy also uses 8080, so I couldn't use maven to get the jboss-dist artifact.
Hope others can give more info.
I have developed a stack of web services based on:
Spring WS 2.0 with the jaxb2 maven plugin (to ease the pain).
Hibernate.
PostgreSQL.
We are using the following to test:
JUnit tests with Mockito.
Spring Test for the DAO & service layers.
The new Spring WS Test & Smock APIs.
The SoapUI API for testing, via their maven plugin.
We have TracWiki for the wiki side.
All is fully automated in a maven build with Hudson, even the deployment of the webapp to a distant server with cargo.
We have 5 virtual servers on a single Debian machine (using vserver).
We don't have a single performance test, and we don't have any tools to monitor the webapp.
What do you recommend to go a step further?
I'm really looking for new ways and/or tools to improve everything.
Hey.
Incorporate Sonar into your builds. You will get lots of information about your code.
I don't see you mentioning any code coverage tools. While coverage isn't everything, it can help find the parts of your code which aren't covered by the tests (or are perhaps even dead). A minimal jacoco setup is sketched below.
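For example, a minimal jacoco-maven-plugin wiring (the version shown is just an illustrative choice):

    <plugin>
      <groupId>org.jacoco</groupId>
      <artifactId>jacoco-maven-plugin</artifactId>
      <version>0.8.8</version>
      <executions>
        <execution>
          <!-- attaches the coverage agent to the test JVMs -->
          <goals>
            <goal>prepare-agent</goal>
          </goals>
        </execution>
        <execution>
          <id>report</id>
          <phase>verify</phase>
          <!-- writes the coverage report, which Sonar can pick up -->
          <goals>
            <goal>report</goal>
          </goals>
        </execution>
      </executions>
    </plugin>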