GCP Cloud Build - Buildpack -> Gradle -> Testcontainers

We are having an issue after switching to Cloud Build. On our previous platform we simply started the Gradle build ourselves. We use Spring Boot and Testcontainers for tests. Now, in Cloud Build, the Gradle project is built by a buildpack: Gradle builds our project and runs the tests. The integration tests are failing because Testcontainers cannot start the required containers. What can be enabled in the cloudbuild.yml to make this possible?
steps:
- name: gcr.io/k8s-skaffold/pack
  args:
  - build
  - '$_GCR_HOSTNAME/$PROJECT_ID/$_SERVICE_NAME:$COMMIT_SHA'
  - '--env'
  - 'BP_GRADLE_BUILD_ARGUMENTS=$_GRADLE_ARGS'
  - '--tag=$_GCR_HOSTNAME/$PROJECT_ID/$_SERVICE_NAME:$_TAG_2'
  - '--builder=paketobuildpacks/builder:base'
  - '--path=.'
  id: Buildpack
  entrypoint: pack
Thank you in advance.

To keep this question from going completely unanswered, I recommend that anyone who wishes to perform multi-container integration tests use the following GitHub repository as a reference:
https://github.com/GoogleCloudPlatform/cloudbuild-integration-testing
And to answer the OP's question specifically:
There is nothing specific that needs to be enabled in order to run tests, but integration testing of containers is best performed by other containers that wait for the containers under test to be built before running the tests, as can be seen in this file.
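For illustration, here is a rough sketch of that pattern adapted to the posted pipeline. It is only a sketch: the image names, the docker-compose.yml, the test image, and the assumption that the compose file attaches the services to Cloud Build's local "cloudbuild" network are all illustrative; the linked repository contains complete, working variants of this setup.
steps:
# 1. Build the application image with the buildpack, much as in the question.
- name: gcr.io/k8s-skaffold/pack
  entrypoint: pack
  id: Buildpack
  args: ['build', 'app-under-test', '--builder=paketobuildpacks/builder:base', '--path=.']
# 2. Start the freshly built image together with its backing services
#    (database etc.) via an assumed docker-compose.yml in the repository.
- name: 'docker/compose:1.29.2'
  id: Start stack
  args: ['up', '-d']
# 3. Run a dedicated test container that waits for the stack to become
#    reachable and then executes the integration tests against it.
- name: gcr.io/cloud-builders/docker
  id: Integration tests
  args: ['run', '--rm', '--network=cloudbuild', 'app-integration-tests']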

Related

How to use Kubernetes e2e framework to build custom kubernetes tests

I stumbled upon this blog post, which describes the Kubernetes e2e test suite.
Unfortunately, there are not enough resources explaining how to use the framework to build custom tests that can run as part of a CI pipeline.
How can I use that Go framework? Or am I better off building my own framework using one of the Kubernetes client libraries?

Reset database container on openshift

I have a multi-module Vert.x application deployed on OpenShift. For integration testing purposes, I would like to deploy a database container with pre-defined data and destroy it when the test is finished.
How can I achieve this?
My application uses JUnit and the Fabric8 Maven plugin to deploy containers on OpenShift.
This is something that can be done relatively easily using arquillian-cube, which supports both Kubernetes and OpenShift.
What arquillian-cube can do for you is (optionally) create an ephemeral project, deploy everything you need for your test, and, once everything is up and running, start your tests. At the end it can also do the cleaning up for you.
It is quite flexible, so depending on your needs and requirements it can work with either ephemeral or fixed projects. There are also plenty of configuration options when it comes to cleaning up.
Last but not least, it plays quite nicely with the Fabric8 Maven plugin.
https://github.com/arquillian/arquillian-cube/blob/master/docs/kubernetes.adoc
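As a rough sketch of what such a test can look like (the class name and the assertion are made up for illustration; the exact dependencies and configuration are described in the linked documentation), arquillian-cube provisions the project, injects a client, and tears everything down afterwards:
import io.fabric8.kubernetes.client.KubernetesClient;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.arquillian.test.api.ArquillianResource;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.junit.Assert.assertFalse;

@RunWith(Arquillian.class)
public class DatabaseIT {

    // Injected by arquillian-cube once the (possibly ephemeral) project
    // has been created and the test resources have been deployed.
    @ArquillianResource
    KubernetesClient client;

    @Test
    public void databasePodShouldBeRunning() {
        // The database container was deployed with its pre-defined data as
        // part of the test setup; after the test, arquillian-cube cleans up.
        assertFalse(client.pods().list().getItems().isEmpty());
    }
}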

Gradle - Compile submodules in parallel

I have a project with two submodules:
Client - a UI based on Google's Web Toolkit
Server - a Spring Boot based server
Now, in my Gradle config (build file) for the server, I'm creating a jar file from the client and then including it in the server through the snippet below. Lastly, I create a .war file based on the server config.
dependencies {
    compile project(':client')
}
The architecture is similar to Spring Boot's proposed way of handling resources.
Now, when I run the Gradle build, because the server depends on the client, the server compilation doesn't start until the client compilation and tests are done.
I feel that I'm not making use of Gradle's parallelism with this way of compiling client and server.
Is there a way to compile and run test cases in parallel and then create a .war file only when both submodules' tasks are complete? How do I access the configurations of the client and server modules and then create a new war file in the rootProject?
You can try adding the --parallel flag to your Gradle command. However, this is still an incubating feature. I also noticed a significant improvement in build time when running the Gradle daemon, so you can try that out as well.
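For example (both features were incubating or optional at the time, so behaviour may vary between Gradle versions):
gradle build --parallel --daemon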
No, this level of parallelism is not currently available. I think the team are slowly working towards general parallel task execution, as described in their spec. That should allow for the kind of behaviour you're requesting.
That said, you can run tests in parallel if they're independent, via the maxParallelForks and forkEvery options. MrHaki gives a short how-to for this on his blog. Note that this only applies to a single Test task instance.
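A minimal sketch of those options in the build script of the module whose tests should fork (the numbers are only illustrative):
test {
    maxParallelForks = 4  // run up to four test JVMs concurrently
    forkEvery = 50        // recycle each forked JVM after 50 test classes
}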

Mavenising WebSphere/BPM tests and running via JUnit

We have a number of unit tests written using IID for the modules we've developed. We want them to run on our CI server.
We use Maven for builds and JUnit to run the tests. Is there a way to Mavenise the BPM tests and run them via JUnit? If not, how could we implement a build and deploy to our CI server?
Thanks
Actually, you can. Have a look at http://www.ibm.com/developerworks/bpm/bpmjournal/1412_cai/1412_cai.html .
The solution you are looking for is called the IBM Business Process Manager Testing Asset.
There is one caveat - you have to contact IBM Software Services for WebSphere to get it.

What is the "maven way" for this ant development workflow?

How can Maven be configured to support this type of workflow:
One Time Setup: Invoke Maven to do a one-time setup of a developer's machine, such as:
Create a custom version of Tomcat configured for this application
Create a local PostgreSQL database on the developer's machine
Load sample data into the database
Run a JUnit test to configure other resources needed to run the application
Integration Tests: Invoke Maven to run integration tests, which should do the following:
Create an integration test DB
Set up the DB
Run command-line integration tests against the DB
Run a test version of Tomcat with the application in it
Run command-line JUnit tests that exercise the RESTful services exposed by the application
Release Build: Invoke Maven to do a release build of the system:
Do all the steps for an integration test
Generate resources and configurations that are used on the server rather than production
Deposit the end result in a Git repo, commit, and push the changes to production
Test Build: Invoke Maven to do a test build of the system:
Do all the steps of a release build, but configure the test release package with the test server configuration
The main thing I am struggling with is that Maven has a single build lifecycle with a well-defined sequence of phases; I am not sure the workflow I want to build is a good fit for Maven.
Can Maven be configured for this type of workflow? If yes, what are the key features of Maven that allow for the different configurations of the four main ways in which I want to use it?
Update: What I mean by this workflow is that I want to be able to do something like
mvn setup
mvn integration
mvn prod-release
mvn test-release
I know the above examples look like Ant; I am a long-time Ant user and a total noob with Maven.
You could set up Maven to do all that...
You probably would use (shock horror) profiles to achieve some of this...
BUT you don't want to do that.
You are following Ant-style thinking... if you like that style of thinking then use Ant or Gradle and be happy.
If you want to follow the Maven way, then you will solve the problem differently.
Coming from the Maven way, here are my thoughts:
Why do you need one-time setup? I usually have a run profile that dynamically provisions the correct application server and starts it with the app deployed, tearing everything down afterwards when I hit ^C. Typically this involves starting up a database server or two... hence things I have developed like the cassandra-maven-plugin. That way, when I am working on a different project (which could be in 10 minutes' time), I don't have to worry about background database servers eating up all my laptop's RAM.
Integration tests are actually trivial when you have the above working... in fact I created the Maven Failsafe Plugin to make it easy to have plugin execution tied to the appropriate phases for integration testing. The Maven convention is to have a profile called run-its for running integration tests.
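For illustration, a minimal sketch of what that convention can look like in a pom.xml; the plugin version is an assumption, and by default Failsafe picks up tests named *IT:
<profile>
  <id>run-its</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-failsafe-plugin</artifactId>
        <version>3.2.5</version> <!-- assumed version; use whatever is current -->
        <executions>
          <execution>
            <goals>
              <!-- integration-test runs the *IT tests, verify fails the build afterwards -->
              <goal>integration-test</goal>
              <goal>verify</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>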
Release builds being different from test builds... ugh! You should be building environment-agnostic artifacts. Have them pick up their configuration from the environment they are deployed in. That removes the worry that something has changed between the "test" build and the "production" build. If you really need to bundle the config, then I would usually resort to a separate module that takes the agnostic artifact and rebundles it with the required configuration. That way it is easy to prove that you have a reproducible transformation and that nothing has changed in between what went to QA and what is going to Ops.
I always make the release builds include the integration testing.
So typically I have my projects such that
$ mvn -Prun
will fire up the application starting from zero. Hitting ^C will tear everything back down again, and mvn clean (or, in extreme situations where I have a more complex setup process and need some caching, mvn post-clean - think really clean) will remove anything that the run profile put into play.
To run the integration tests I typically do
$ mvn -Prun-its verify
To make a release I typically do
$ mvn release:prepare release:perform -B
That is (in my view) the ideal way of handling the above steps you need.
HTH.
BTW, I have not had to use PostgreSQL specifically (typically my integration tests and run profile can get away with a pure Java database such as Derby or HSQLDB, and because the artifacts are environment-agnostic it is easy to have the integration test/dev flyweight app server inject the correct JDBC URL), so you may hit some issues with regard to PostgreSQL.
