I would like to set up an infrastructure for integration testing.
Currently we bootstrap Tomcat using Maven and then execute HttpUnit tests.
But the current solution has a few drawbacks.
Any changes committed to the database need to be rolled back manually at the end of the test.
Running code coverage on integration tests is not straightforward (we are using Sonar).
My goals are:
Allow automatic rollback between tests (hopefully using Spring's transaction and rollback support; see the sketch after this list)
Simple, straightforward code coverage
Using @RunWith so that JUnit bootstraps the system, rather than an external process
Interacting with live servlets and JavaScript (I'm considering switching from HttpUnit to Selenium…)
Reasonable execution time (at least not longer than the existing execution time)
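To make the first goal concrete, here is a minimal sketch of the kind of test I have in mind, using Spring's test support (the context file name and the users table are made up; the context must define a DataSource, a transaction manager, and a JdbcTemplate bean):

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical context file name, for illustration only.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml")
@Transactional // Spring rolls this transaction back after each test by default
public class RollbackSketchTest {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Test
    public void insertIsRolledBackAutomatically() {
        // "users" is a made-up table name.
        jdbcTemplate.update("insert into users(name) values (?)", "alice");
        // No manual cleanup needed: the surrounding transaction is rolled back.
    }
}
```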
The goals above look reasonable to me and common to many Java/J2EE projects.
I was thinking of achieving those goals by using Arquillian and the Arquillian Spring Framework Extension component.
See also https://github.com/arquillian/arquillian-showcase/
Does anyone have any experience with Arquillian and with the Arquillian Spring Framework Extension?
Can you share issues, best practices, and lessons learned?
Can anyone suggest an alternative approach to the above?
I can't fully answer your question, but here are some tips.
Regarding automatic rollback: in my case I use Liquibase to initialize the test data on HSQLDB or H2, which can run in-memory, so every run starts from a fresh database and there is no need to roll back at all.
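A minimal sketch of that setup in Spring Java config (the changelog path and JDBC URL are assumptions; Liquibase's Spring integration and H2 do the rest):

```java
import javax.sql.DataSource;

import liquibase.integration.spring.SpringLiquibase;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class TestDbConfig {

    @Bean
    public DataSource dataSource() {
        // In-memory H2: the database disappears when the JVM exits,
        // so there is nothing to roll back between runs.
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setDriverClassName("org.h2.Driver");
        ds.setUrl("jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1");
        return ds;
    }

    @Bean
    public SpringLiquibase liquibase(DataSource dataSource) {
        // Hypothetical changelog location.
        SpringLiquibase lb = new SpringLiquibase();
        lb.setDataSource(dataSource);
        lb.setChangeLog("classpath:db/changelog-master.xml");
        return lb;
    }
}
```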
For Arquillian: it's a good, realistic testing approach. What I learned is that the Arquillian Spring Framework Extension is just an extension; you have to bind it to a specific container like JBoss, GlassFish, or Tomcat to make the tests run.
But I don't know how to apply it to a Spring-based Java SE program that does not need application server support.
Another lesson learned is the JBoss port conflict: jboss-dist uses 8080 as its default HTTP port, but our company proxy also uses 8080, so I couldn't use Maven to fetch the jboss-dist artifact.
Hope others can give more info.
Related
We are developing test cases for a microservice using Spring Boot. One of the requirements is that for each JUnit test case we need to:
start the project
test a unit case, and
then stop the project.
I feel this is an anti-pattern, but this is the requirement.
I looked around the internet but couldn't find a solution. I was able to start a web server, but it gave no response, probably because the application was not deployed to that server.
Does anyone have any idea on how this can be achieved?
PS: We don't want to use Mockito.
First, I want to make clear that this is a very bad practice and should be avoided. This approach does not implement the unit testing concept correctly, because you are bringing an entire system up for each test, so JUnit wouldn't be the correct tool.
I poked around and couldn't find a Runner able to do this (which doesn't surprise me). The most similar Runner may be SpringJUnit4ClassRunner, which provides a complete Spring context in your test, but it won't go live with the application.
If you really want to go down this path, an approach I'd suggest is to use a tool like REST Assured to run end-to-end API-layer tests against the live application. This implies that you have to find another way to start the app and then point the REST Assured tests at it: maybe a shell script that starts the app, runs the REST Assured test suites, and shuts the server down when the suite ends.
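For illustration, a REST Assured test against an already-running instance might look like this (the base URI, port, and the /health endpoint with its response body are assumptions):

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import io.restassured.RestAssured;
import org.junit.BeforeClass;
import org.junit.Test;

public class HealthEndpointIT {

    @BeforeClass
    public static void pointAtRunningApp() {
        // Hypothetical location of the externally started application.
        RestAssured.baseURI = "http://localhost";
        RestAssured.port = 8080;
    }

    @Test
    public void healthEndpointRespondsOk() {
        given()
            .accept("application/json")
        .when()
            .get("/health") // made-up endpoint path
        .then()
            .statusCode(200)
            .body("status", equalTo("UP"));
    }
}
```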
I highly suggest you talk with your product/management teams to avoid this kind of setup, since the tests will take forever to run and you will be polluting your local or remote databases (or other systems reached through REST or SOAP calls) if you persist data.
I recently started creating JaCoCo reports for Maven projects, covering both unit and integration tests, and they seem to work correctly.
Now I have encountered a different scenario which I am not sure how to approach.
I have one workspace that contains the integration test cases (application A), but the source code under test does not live in the same workspace/code base. The code that actually runs when these integration tests are invoked is in a different workspace/code base (application B); the tests call it through REST API calls against localhost URLs, with a JBoss server started for application B so that the localhost context is up.
The aim is to invoke the integration tests from application A, which in turn exercises the code in application B, and to generate a JaCoCo report of the code coverage for application B.
I am not sure how to achieve this.
Can someone provide some input?
Thanks.
If I understand you correctly, you actually have 2 different processes in your scenario:
The "client" process that runs the integration tests and for which jacoco can be easily applied, but it's not what you need
The "server" process that runs the actual JBoss server and executes the actual code.
Client process contacts the server via HTTP.
In this case, I'm afraid JaCoCo won't provide coverage for you if you're running the tests from Maven/Gradle, because JaCoCo instruments bytecode only on the JVM it is attached to, and the build's JVM is not the JVM where the server code runs. So you have to be "creative" here :)
I'll list here some possible approaches
Disclaimer: I haven't tried them myself (I haven't worked with JBoss/Java EE), but maybe you'll be able to at least borrow some ideas.
The first approach would be to run the tests together with the application somehow, like it's done for example in Spring tests (I'm not sure whether JBoss provides similar capabilities).
The idea is simple:
You run the integration test; it runs JBoss "embedded in the same JVM", and you can inject beans / EJB session beans into the test (like autowiring with Spring).
The advantage of such a method is that you'll be able to just use the JaCoCo Maven plugin, and it will instrument everything for you.
I don't know how technically easy achieving this architecture will be. I know that recent JBoss versions support an embedded mode, so maybe you'll find this link to be a useful foundation.
Another direction is to take a look at the Arquillian project. They have a JaCoCo extension that will probably help, but I've never tried it.
And the last approach I can think of is running the JBoss server with the JaCoCo agent directly, instead of relying on the build system to run JaCoCo for you.
The idea here is to stream the coverage results of the server code into a file or a TCP endpoint. You run JBoss with -javaagent:[yourpath/]jacocoagent.jar (the agent accepts options such as destfile=... for file output, or output=tcpserver,address=...,port=... for a TCP endpoint), and it streams the results wherever you need. After the tests you gather these results and prepare a report. You can find here more information about this approach.
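If you go the tcpserver route, a small client based on JaCoCo's ExecDumpClient can fetch the execution data after the test run; a sketch (the host, port, and output path are assumptions and must match the agent's settings):

```java
import java.io.File;

import org.jacoco.core.tools.ExecDumpClient;
import org.jacoco.core.tools.ExecFileLoader;

public class DumpServerCoverage {

    public static void main(String[] args) throws Exception {
        ExecDumpClient client = new ExecDumpClient();
        client.setDump(true);   // ask the agent for its execution data
        client.setReset(false); // keep the agent's counters intact

        // Hypothetical address/port: must match output=tcpserver,address=...,port=...
        ExecFileLoader loader = client.dump("localhost", 6300);
        loader.save(new File("target/jacoco-it.exec"), true);
        // Feed target/jacoco-it.exec to the JaCoCo report goal afterwards.
    }
}
```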
I have a large Spring project, using XML configuration. I'm looking for a quick way to verify changes to the XML configuration.
I can load the whole project locally; the problem is that this takes more than 5 minutes and loads a huge amount of data.
My XML editor catches XML formatting errors.
I'm looking for something intermediate - to catch obvious problems like references to beans that aren't defined, or calling constructors with the wrong arguments. Is there a quick way to do this, without having to actually invoke all the constructors and bring up the whole environment?
I'm building with Maven and editing with Eclipse, although my question isn't specific to either.
Since you already use Eclipse, you could try Spring Tool Suite (comes either standalone or as an add-on). It's essentially Eclipse with extra Spring-specific features, like Beans Validator. I'm not sure how thorough the validation is, but it should catch most configuration problems.
It's maintained by SpringSource, so its integration with Spring "just works", and it's guaranteed to stay more or less in sync with the Spring Framework release cycle.
Beanoh:
http://beanoh.org/overview.html#Verify
This project does exactly what I'm looking for: it verifies obvious problems with the Spring config, but without the overhead of initializing everything.
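For comparison, a rough zero-dependency approximation of the same idea using plain Spring APIs: it parses and validates the bean definitions without calling any constructors, though it catches fewer problems than Beanoh (the context file name is made up):

```java
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.core.io.ClassPathResource;

public class ContextParseCheck {

    public static void main(String[] args) {
        DefaultListableBeanFactory factory = new DefaultListableBeanFactory();
        XmlBeanDefinitionReader reader = new XmlBeanDefinitionReader(factory);
        // Parses and validates the XML and registers bean definitions,
        // but does not instantiate any beans.
        int count = reader.loadBeanDefinitions(
                new ClassPathResource("applicationContext.xml")); // assumed file name
        System.out.println("Parsed " + count + " bean definitions");
    }
}
```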
You can use Spring's testing support to integration-test your Spring configuration. However, if loading the context takes 5 minutes, the tests will take the same amount of time. Spring does cache the context, so if multiple tests share the same set of Spring contexts, the tests should be very quick once the context is cached.
I can suggest a few ways to more efficiently test your configuration:
Organize your project in modules, with each module responsible for its own Spring configuration; this way, each module can be developed and tested independently.
With a modular structure, testing can be made more localized by mocking out the dependent modules; again, this is for speed. A sketch of such a module-scoped test follows.
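A minimal sketch of what that could look like (the context file names and the bean name are made up; Spring caches this context across all tests that use the same locations):

```java
import static org.junit.Assert.assertNotNull;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

// Loads only one module's context plus a small file of test stubs
// standing in for the neighbouring modules.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {
        "classpath:orders-module-context.xml", // hypothetical module config
        "classpath:orders-test-stubs.xml" })   // hypothetical stubs for other modules
public class OrdersModuleWiringTest {

    @Autowired
    private ApplicationContext context;

    @Test
    public void wiringIsComplete() {
        assertNotNull(context.getBean("orderService")); // hypothetical bean name
    }
}
```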
I'm a web developer who ended up doing some Java EE development (RichFaces, Seam 2, EJB 3.1, JPA). To test JPA I use Hypersonic (HSQLDB) and Mockito, but I lack deeper EJB knowledge.
Some may argue that we should use OpenEJB and Arquillian, but for what?
When do I need to do container dependent tests? What are the possible test scenarios where I need OpenEJB and Arquillian?
Please enlighten me :)
There are two aspects in this case.
Unit tests. These are intended to be very fast (the whole test suite executes in seconds). They test very small chunks of your code, e.g. one method. To achieve this kind of granularity, you need to mock the whole environment using e.g. Mockito. You're not interested in:
invoking EntityManager and putting entities into the database,
testing transactions,
making asynchronous invocations,
hitting the JMS Endpoint, etc.
You mock this whole environment and just test each method separately. Unit tests are fine-grained and blazingly fast, so you can execute them every time you make an important change in the code; if they were more complex and time-consuming, developers wouldn't hit the 'test' button as often as they should.
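To illustrate the unit level, a self-contained Mockito sketch (the service and repository are made up and kept inline so the example compiles):

```java
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class OrderServiceTest {

    // Made-up collaborator and class under test, inline for self-containment.
    interface PriceRepository { double priceOf(String sku); }

    static class OrderService {
        private final PriceRepository prices;
        OrderService(PriceRepository prices) { this.prices = prices; }
        double totalFor(String sku, int quantity) { return prices.priceOf(sku) * quantity; }
    }

    @Test
    public void totalIsComputedFromRepositoryData() {
        // Mock the whole environment: no database, no transactions, no container.
        PriceRepository repository = mock(PriceRepository.class);
        when(repository.priceOf("book")).thenReturn(10.0);

        OrderService service = new OrderService(repository);

        assertEquals(30.0, service.totalFor("book", 3), 0.001);
        verify(repository).priceOf("book");
    }
}
```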
Integration tests. These are slower, as you want to test the integration between your modules. You want to test whether they 'talk' to each other appropriately, e.g.:
are transactions propagated the way you expect,
what happens if you invoke your business method with no transaction at all,
do the changes sent from your web service client really hit your endpoint method and add the data to the database,
what if my JMS endpoint throws an ApplicationException: will it properly roll back all the changes?
As you see, integration tests are coarse-grained, and since they're executed in the container (basically, in a production-like environment) they're much slower. These tests are normally not executed by the developer after each code change.
Of course, you can run the EJB container in embedded mode, just as you can execute JPA in Java SE. The point is that this artificial environment gives you the basic services, but you'll end up tweaking it and still have less flexibility than in the real container.
Arquillian gives you the ability to recreate the production environment on the container of your choice and execute tests in that environment (using the datasources, JMS destinations, and the whole lot of other configuration you expect to see in the production environment). A minimal sketch of such a test is below.
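For illustration, assume a trivial @Stateless GreeterBean with a greet(String) method exists in your sources; the container adapter comes from your pom:

```java
import javax.ejb.EJB;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.JavaArchive;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(Arquillian.class)
public class GreeterBeanIT {

    @Deployment
    public static JavaArchive createDeployment() {
        // Package only the classes under test into a micro-deployment.
        return ShrinkWrap.create(JavaArchive.class)
                .addClass(GreeterBean.class) // hypothetical @Stateless bean
                .addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml");
    }

    @EJB
    private GreeterBean greeter; // injected by the real container

    @Test
    public void greetsInsideTheContainer() {
        Assert.assertEquals("Hello, Duke!", greeter.greet("Duke"));
    }
}
```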
Hope it helps.
I attended Devoxx this year and got a chance to ask the JBoss guys this question.
Some of the test scenarios (the stuff I managed to scribble down):
Configuration of the container
Container integration
Transaction boundaries
Entity callback methods
Integration tests
Selenium recordings
I have developed a stack of web services based on:
Spring WS 2.0 with the jaxb2 Maven plugin (to ease the pain).
Hibernate.
PostgreSQL.
We are using the following to test:
JUnit tests with Mockito.
Spring Test for the DAO & service layers.
The new Spring WS test support & the Smock API.
The SoapUI API for testing, with their Maven plugin.
We have TracWiki for the wiki side.
Everything is fully automated in a Maven build with Hudson, even the deployment of the webapp to a remote server with Cargo.
We have 5 virtual servers on a single machine on Debian (using vserver).
We don't have a single performance test, and we don't have any tools to monitor the webapp.
What do you recommend to go a step further?
I'm really looking for new ways and/or tools to improve everything.
Hey.
Incorporate Sonar into your builds. You will get lots of information about your code.
I don't see you mentioning any code coverage tools. While coverage isn't everything, it can help you find the parts of your code that aren't covered by the tests (or are perhaps even dead).