Testing RabbitMQ and Spring Integration

I'm developing an application with Spring Integration and RabbitMQ, and I'm wondering how to test it (integration tests).
I think SoapUI could be a great solution, but it doesn't support RabbitMQ. hermesjms.com has support for Qpid, so I thought it would be easy to write a new plugin to support RabbitMQ, but it's turning out to be more difficult than I expected because the project is a little old and has a bunch of dependencies.
So I'm starting to think about doing something myself, like a DSL in Python, something like this:
tests = [{'name': 'start',
          'routing_key': 'returned',
          'payload': 'xxxxx',
          'timeOut': '10000',
          'expected': '',
          'threads': '1'},
         {'name': 'second', .....
]
And then use Pika to execute the actions and check the results.
I know it's very crude and SoapUI is huge and awesome, but at least it would let me write small tests.
What would you recommend?

RabbitMQ provides a web frontend (the so-called Management view).
So: what exactly do you want to test? Let's say you want to verify that an incoming message on requestChannel flows down to the service and back; you could just autowire the channel directly (i.e. @Autowired private MessageChannel requestChannel;) and put a message onto it.
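For illustration, a minimal sketch of such a test with spring-test and JUnit 4 (the context file name and channel are assumptions for the example):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.integration.support.MessageBuilder;
import org.springframework.messaging.MessageChannel;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("/test-context.xml") // hypothetical context defining requestChannel
public class RequestChannelTest {

    @Autowired
    private MessageChannel requestChannel;

    @Test
    public void sendsMessageIntoTheFlow() {
        // Put a message directly onto the channel; the downstream endpoints process it.
        requestChannel.send(MessageBuilder.withPayload("xxxxx").build());
    }
}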
However, this works only if you design your architecture right: each step of your process can then be tested using mocks or specially prepared injected dependencies.
In addition to your own components, this testability applies to the Spring components (interfaces). Let's say you have implemented your own router: test and verify its input and output. The same goes for a transformer, as the sketch below shows.
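For example, a custom transformer can be verified in complete isolation; a minimal sketch (the transformer is a made-up stand-in for your own):

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class UpperCaseTransformerTest {

    // Hypothetical transformer under test: a plain POJO used by the flow
    static class UpperCaseTransformer {
        public String transform(String payload) {
            return payload.toUpperCase();
        }
    }

    @Test
    public void transformsPayload() {
        assertEquals("XXXXX", new UpperCaseTransformer().transform("xxxxx"));
    }
}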
If you try to verify the "big picture", you will have to rebuild the complete scenario. But this should not be too complicated with non-persistent and non-durable queues and messages.
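A throwaway broker setup for such a scenario might declare its queues as non-durable and auto-delete, e.g. with the RabbitMQ Java client (the queue name is made up):

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class ThrowawayQueueSetup {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");
        try (Connection connection = factory.newConnection();
             Channel channel = connection.createChannel()) {
            // durable=false, exclusive=false, autoDelete=true: nothing survives the test run
            channel.queueDeclare("it.test.queue", false, false, true, null);
        }
    }
}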
Is there something else you want to test?

For RabbitMQ, my advice is to use a real RabbitMQ. This can be done by using Vagrant with Chef to provision RabbitMQ, and the Vagrant Maven plugin to start the box before the integration tests and halt it in the post-integration-test phase:
The Vagrant Maven plugin: http://nicoulaj.github.io/vagrant-maven-plugin/
Vagrant website: http://www.vagrantup.com/
Chef cookbook for RabbitMQ: https://github.com/opscode-cookbooks/rabbitmq
To summarize, you must:
Install Vagrant and create an empty box (CentOS or Ubuntu).
Provision the VM with the RabbitMQ cookbook.
Place the .box file into your home folder (rabbitMQ.box).
Configure your Maven project to start the VM with vagrant up (~/rabbitMQ.box) in the pre-integration-test phase.
Configure your Maven project to stop the VM with vagrant halt (~/rabbitMQ.box) in the post-integration-test phase.
Hope this helps.

RabbitMQ now has an HTTP API, so you could use that instead of JMS:
http://hg.rabbitmq.com/rabbitmq-management/raw-file/rabbitmq_v2_8_4/priv/www/api/index.html
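For instance, a quick check of a queue through the management API could look like the sketch below (this assumes a broker on localhost:15672 with the default guest/guest credentials and reuses the made-up queue name from above; adjust to your setup):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class QueueCheck {
    public static void main(String[] args) throws Exception {
        // Basic-auth header for the default guest/guest account (assumption)
        String auth = Base64.getEncoder().encodeToString("guest:guest".getBytes());
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:15672/api/queues/%2F/it.test.queue"))
                .header("Authorization", "Basic " + auth)
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The JSON body contains, among other fields, the queue's message count
        System.out.println(response.body());
    }
}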

Related

EmbeddedKafka Spring Boot test fails only on GitHub Actions but not locally

I am creating a demo application in Groovy using Spring Boot with Kafka and Elasticsearch.
I used the @EmbeddedKafka annotation in my Spock tests, and they work really nicely locally, both on Windows and Ubuntu. They work from within IntelliJ by just running or debugging, no issue. It's the same when running "./gradlew test" in my shell. Everything is good.
As soon as I pushed it to github.com, my GitHub action fails, even though it runs the same command.
the action definition: https://github.com/besessener/GroovySpringBootKafkaElasticsearchDemo/blob/main/.github/workflows/test.yml
remote failing test case:
https://github.com/besessener/GroovySpringBootKafkaElasticsearchDemo/blob/main/src/test/groovy/me/spring/GroovyDemo/stream/KafkaSendAndReceiveTest.groovy
action: https://github.com/besessener/GroovySpringBootKafkaElasticsearchDemo/runs/3019862203?check_suite_focus=true
The only thing that looks like an error to me in the action's output is this:
2021-07-08 14:05:35.896 WARN 2693 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-UserGroup-1, groupId=UserGroup] Error while fetching metadata with correlation id 4 : {topic-user=LEADER_NOT_AVAILABLE}
I read many things about not using static ports for Kafka tests, but this is my only Kafka test, so I don't really understand how there could be a conflict. Furthermore, LEADER_NOT_AVAILABLE could be a problem with a non-existing topic, or maybe the consumer is simply not able to properly connect to the broker. But I don't see any of this.
I still have the feeling it is more related to "localhost:9092" as brokerProperties. Is there an issue in regard to that when using GitHub Actions? Or anything else I am missing?
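For reference, the "no static ports" advice usually means letting the embedded broker pick a free port and wiring it in through a property instead of hard-coding localhost:9092. A minimal sketch of that pattern in Java (the class and topic names follow the question; the rest is an assumption):

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.test.context.EmbeddedKafka;

@SpringBootTest
// No brokerProperties with a fixed port: the embedded broker chooses a free one
// and exposes it through the spring.kafka.bootstrap-servers property.
@EmbeddedKafka(partitions = 1, topics = "topic-user",
        bootstrapServersProperty = "spring.kafka.bootstrap-servers")
class KafkaSendAndReceiveTest {

    @Test
    void sendAndReceive() {
        // Producers/consumers configured from spring.kafka.bootstrap-servers
        // now talk to the random-port embedded broker.
    }
}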

Running jacoco report where integration tests are in one code base and source code is in another code base

I recently started working on creating jacoco reports for Maven projects, including unit and integration tests, and they seem to work out correctly.
Now I have encountered a different scenario which I am not sure how to approach.
I have one workspace which contains the integration test cases (application A), but the source code does not exist in the same workspace/code base. The source code that actually runs when these integration tests are invoked lives in a different workspace/code base (application B); the tests invoke it through REST API calls against localhost URLs, with the JBoss server started for application B so that the localhost context is up.
The aim is to invoke these integration tests from application A, which in turn calls the source code in application B, generating a jacoco report of the code coverage for application B.
I am not actually sure how to achieve this.
Can someone provide some input?
Thanks.
If I understand you correctly, you actually have two different processes in your scenario:
The "client" process that runs the integration tests, for which jacoco can easily be applied, but that's not what you need.
The "server" process that runs the actual JBoss server and executes the actual code.
The client process contacts the server via HTTP.
In this case, I'm afraid jacoco won't be able to provide coverage for you if you're running the tests from Maven/Gradle, because jacoco only instruments bytecode on the JVM it is attached to. So you have to be "creative" here :)
I'll list here some possible approaches
Disclaimer: I haven't tried them though (didn't work with jboss/java ee), but maybe you'll be able to at least borrow some ideas
The first approach would be running the tests together with the application somehow, like it's done for example in Spring tests (I'm not sure whether JBoss provides similar capabilities).
The idea is simple:
You run the integration test; it runs JBoss "embedded in the same JVM", and you can inject beans / EJB session beans into the test (like autowiring with Spring).
The advantage of such a method is that you'll be able to just use the jacoco maven plugin, and it will instrument everything for you. A sketch of the Spring flavour of this idea follows below.
I don't know how easy achieving this architecture will be technically. I know that recent JBoss versions support an embedded mode, so maybe you'll find this link to be a useful foundation.
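For comparison, the Spring flavour of "tests and application in one JVM" looks roughly like this self-contained sketch (whether JBoss offers an equivalent is exactly the open question above):

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = InSameJvmTest.Config.class)
public class InSameJvmTest {

    // Made-up service standing in for the application code under test
    static class SomeService {
        String ping() { return "pong"; }
    }

    @Configuration
    static class Config {
        @Bean
        SomeService someService() { return new SomeService(); }
    }

    @Autowired
    private SomeService someService;

    @Test
    public void beanRunsInTheTestJvm() {
        // Application code and test share one JVM, so the jacoco maven
        // plugin instruments both and the coverage shows up in the report.
        assertEquals("pong", someService.ping());
    }
}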
Another direction is to take a look at the Arquillian project. It has a jacoco extension that will probably help, but I've never tried it.
And the last approach I can think of is running the JBoss server with the jacoco agent directly, instead of relying on the build system to run jacoco for you.
The idea here is to stream the coverage results of the server code to a file or TCP endpoint. So you run JBoss with -javaagent:[yourpath/]jacocoagent.jar, and it streams the results wherever you need. After the tests you gather these results and prepare a report. You can find here more information about this approach.

Reset database container on openshift

I have a multi-module Vert.x application deployed on OpenShift. For integration testing purposes, I would like to deploy a database container with pre-defined data and destroy it when the test is finished.
How can I achieve this?
My application uses JUnit and the fabric8 Maven plugin to deploy containers on OpenShift.
This is something that can be done relatively easily using arquillian-cube, which does support Kubernetes and OpenShift.
What arquillian-cube can do for you is (optionally) create an ephemeral project, deploy everything you need for your test and, once everything is up and running, start your tests. In the end it can also do the cleanup for you.
It is quite flexible, so it can work with either ephemeral or fixed projects, according to your needs and requirements. There are also plenty of configuration options when it comes to cleaning up.
Last but not least, it plays quite nicely with the fabric8 Maven plugin.
https://github.com/arquillian/arquillian-cube/blob/master/docs/kubernetes.adoc

PACT: java-maven

I need a few answers to clear up my doubts:
pact-mock-service vs. pact-jvm-server: are they the same? Please describe this.
I am implementing Pact in Java with Maven.
I was able to run these:
https://github.com/anha1/microservices-pact-maven
https://github.com/warmuuh/pactbroker-maven-plugin
Help me understand how these relate to pact-mock-service and pact-jvm-server.
pact-mock-service is a general mock server built into the Pact libraries to support mocking out the other dependency in an integration during a consumer test. If you use any of the consumer test support libraries, you do not need to use it directly.
pact-jvm-server is a controllable server that bundles the pact-mock-service and allows you to set up and tear down mock servers via HTTP requests. It exists for people who cannot, or do not wish to, use the consumer test support libraries.
For people using Maven, there is a plugin provided as part of the pact-jvm project that can run provider verification tests and publish to a Pact Broker. The consumer tests just run as JUnit tests, so you don't need any Maven-specific plugin; a sketch follows below.
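A hypothetical consumer test sketch, assuming the pact-jvm-consumer-junit library (the provider/consumer names and endpoint are made up):

import java.net.HttpURLConnection;
import java.net.URL;

import org.junit.Assert;
import org.junit.Rule;
import org.junit.Test;

import au.com.dius.pact.consumer.Pact;
import au.com.dius.pact.consumer.PactProviderRuleMk2;
import au.com.dius.pact.consumer.PactVerification;
import au.com.dius.pact.consumer.dsl.PactDslWithProvider;
import au.com.dius.pact.model.RequestResponsePact;

public class ConsumerPactTest {

    // The rule starts and stops the bundled mock server around each test
    @Rule
    public PactProviderRuleMk2 provider = new PactProviderRuleMk2("some-provider", this);

    @Pact(provider = "some-provider", consumer = "some-consumer")
    public RequestResponsePact createPact(PactDslWithProvider builder) {
        return builder
                .uponReceiving("a request for users")
                .path("/users")
                .method("GET")
                .willRespondWith()
                .status(200)
                .body("{\"users\": []}")
                .toPact();
    }

    @Test
    @PactVerification("some-provider")
    public void consumesTheMockedEndpoint() throws Exception {
        // Hit the mock server; the rule verifies the interaction afterwards
        URL url = new URL(provider.getUrl() + "/users");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        Assert.assertEquals(200, con.getResponseCode());
    }
}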
Of the two links you posted, the first is an example project using a Spring Boot application, and the second is a Maven plugin that only provides publishing to a Pact Broker.

Integration tests with Arquillian and Arquillian Spring Framework Extension

I would like to set up an infrastructure for integration testing.
Currently we bootstrap Tomcat using Maven and then execute HttpUnit tests.
But the current solution has a few drawbacks:
Any changes committed to the database need to be rolled back manually at the end of the test.
Running code coverage on integration tests is not straightforward (we are using Sonar).
My goals are:
Allow automatic rollback between tests (hopefully using Spring's @Transactional and @Rollback; see the sketch after this list)
Simple, straightforward code coverage
Using @RunWith to bootstrap the system from JUnit and not externally
Interacting with live servlets and JavaScript (I am considering switching from HttpUnit to Selenium…)
Reasonable execution time (at least not longer than the existing execution time)
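As referenced in the first goal, a minimal sketch of the automatic-rollback idea with plain spring-test (no Arquillian; the context file is a made-up name):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("/test-context.xml") // hypothetical context with DataSource and tx manager
@Transactional // each test runs in a transaction that is rolled back afterwards
public class RepositoryIT {

    @Test
    public void insertIsRolledBackAutomatically() {
        // Any data written here disappears when the test transaction rolls
        // back, so no manual cleanup between tests is needed.
    }
}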
The goals above look reasonable to me and common to many Java/J2EE projects.
I was thinking to achieve those goals by using Arquillian and Arquillian Spring Framework Extension component.
See also https://github.com/arquillian/arquillian-showcase/
Does anyone have experience with Arquillian and with the Arquillian Spring Framework Extension?
Can you share issues, best practices, and lessons learned?
Can anyone suggest an alternative approach to the above?
I can't fully answer your question, only give some tips.
Regarding the automatic rollback: in my case, I use Liquibase to initialize the test data on "hsqldb" or "h2", which can run in-memory, so there is no need to roll back.
For Arquillian: it's a good real-testing approach. What I learned is that the "Arquillian Spring Framework Extension" is just an extension; you have to bind to a specific container like JBoss, GlassFish, or Tomcat to make the tests run.
But I don't know how to apply it to a Spring-based Java SE program which does not need application server support.
My lesson learned is about a JBoss port conflict: jboss-dist sets 8080 as the default HTTP port, but our company proxy also uses 8080, so I couldn't use Maven to get the jboss-dist artifact.
Hope others can give more info.
