How to build Quarkus container based on profile with Gradle?

I'm able to build my container image with Jib by running:
./gradlew clean build -Dquarkus.container-image.build=true
I can then run it with docker run.
The resulting container runs with the production profile.
I have a separate dev/staging container environment where I actually deploy, and the configuration for passwords and domains is obviously different there.
Is there a way to specify the profile during the container build, so that, for example, the container uses the dev profile configuration when it runs?

This did the trick:
./gradlew clean build -Dquarkus.container-image.build=true -Dquarkus.profile=dev
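The profile passed at build time is what bakes the profile-specific build-time configuration into the image. A minimal sketch of also selecting the runtime configuration when starting the container, assuming a hypothetical image name:
# quarkus.profile maps to the QUARKUS_PROFILE environment variable at runtime (image name is hypothetical):
docker run -e QUARKUS_PROFILE=dev myorg/myapp:latest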

Related

Run Gradle build offline in a docker container

I am trying to run gradle build test for my Kotlin app in a Docker container that has no access to the internet, so I copy Gradle in the Dockerfile:
ADD https://services.gradle.org/distributions/gradle-${GRADLE_VERSION}-bin.zip /usr/bin/gradle.zip
The problem is that my app uses the following plugin:
plugins {
    kotlin("jvm") version "1.3.40"
}
How would I go about adding these plugins to my container before running Gradle offline in the container?
You should put your entire GRADLE_USER_HOME into the container as well. It contains Gradle wrapper distributions, plugins and dependencies. By default, GRADLE_USER_HOME is the ~/.gradle directory.
I suppose what you can do is run your full build once in a fresh container with internet access and then just copy /root/.gradle (or wherever your configured user's home is inside the container) from it. Or just use that existing container as a base for your build image.
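A minimal sketch of that warm-up approach, assuming the project ships the Gradle wrapper and using illustrative image and path names:
# Run the build once with internet access so the wrapper pulls Gradle, plugins and dependencies into /root/.gradle:
docker run --name gradle-warmup -v "$PWD":/app -w /app eclipse-temurin:11 ./gradlew build
# Freeze the populated cache into a reusable base image (the bind-mounted /app is not captured, /root/.gradle is):
docker commit gradle-warmup gradle-with-deps
# Later builds can then run fully offline against the cached plugins and dependencies:
docker run --rm -v "$PWD":/app -w /app gradle-with-deps ./gradlew --offline build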

Running two maven projects in separate docker containers

I have two Maven Spring Boot applications, and I have set up two Dockerfiles for them.
Inside each container, I run the Maven install.
Both containers download a lot of dependencies before finally packaging the application.
Since the two containers are built sequentially, can I share the first container's local Maven repository with the second, so that the second container's Maven install skips the locally available dependencies and only fetches the extra libraries mentioned in its pom?
Yes, you can.
We do something similar so that our builds are always clean, but to save time on Maven downloads we use a Docker volume mounted at the .m2 directory so that the downloads can be reused between builds and containers.
docker run -v m2Repository:/root/.m2 some-image
docker run -v m2Repository:/root/.m2 some-other-image
The first run takes a while, but subsequent builds are much faster.
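Docker creates the named volume automatically on first use; it can also be created explicitly up front:
docker volume create m2Repository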
You did not mention which environment you run your builds on, so I would like to share my solution for Bitbucket Pipelines CI.
We build our ~20 containers in Bitbucket Pipelines, and every container is a Java application with almost the same set of dependencies. Bitbucket Pipelines uses Docker images to run builds (yes, they use Docker images to build Docker images), and there is an option to specify which Docker image to use for builds.
To avoid downloading all dependencies over and over and to reduce build time, I built a custom Docker image that contains all external dependencies of all our modules. The dependencies were gathered by running the following Maven command in each module:
mvn -Dmaven.repo.local=c:/projects/bitbucket-pipelines-baseimage/local-maven-repo clean install
After that I removed our own project's artifacts from the temporary repository "c:/projects/bitbucket-pipelines-baseimage/local-maven-repo" and built a Docker image that includes it. That image was pushed to Docker Hub, and now all our builds in Bitbucket Pipelines use it. Build times were reduced drastically!
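A minimal sketch of that flow; the repository path, group id and image name are illustrative, and the Dockerfile is assumed to COPY the local repository into the image:
# Gather all external dependencies into a throwaway local repository:
mvn -Dmaven.repo.local=./local-maven-repo clean install
# Remove our own artifacts so only third-party dependencies remain:
rm -rf ./local-maven-repo/com/mycompany
# Bake the repository into a base image and publish it:
docker build -t myorg/pipelines-baseimage .
docker push myorg/pipelines-baseimage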

Setting up docker agents in TFS

We are using TFS in our organisation and we manage our whole build through shell scripts (which isn't great)...
Our agents have Docker installed, and we run our build scripts inside Docker. We have several images for Maven, Gradle, Node.js, ...
Because of our use of Docker, we can't use the Maven plugin, for example.
I am wondering if I can somehow benefit from the Maven plugin while still running on Docker.
You could use the Docker Integration extension directly instead of managing the build through shell scripts.
The Docker extension adds a task that enables you to build Docker images, push Docker images to an authenticated Docker registry, run Docker images or execute other operations offered by the Docker CLI. It also adds a Docker Compose task that enables you to build, push and run multi-container Docker applications or execute other operations offered by the Docker Compose CLI. Lastly, it offers a Docker Deploy task that enables you to deploy single Docker images or multi-container Docker applications into Azure.

Running integration tests remotely with maven

How can I run integration tests in an environment other than the one I'm building from (where Maven runs)? I suppose I should use the failsafe plugin, but how would it find the artifact remotely, run the tests, and return the results?
Specifically: I want to run my tests in a controlled environment, a Docker container, regardless of whether I build from the build server or from a dev machine.

Running both default and production profile during release

My build is as follows:
The first is the normal build (mvn clean install)
The other is a profile activated by property (mvn clean install -Dbuild=prod)
The first deploys to Nexus.
The second profile deploys to a production server.
How can I run both builds during the Maven release cycle?
I would separate the nexus-deploy out to a different profile and use multiple target execution:
Create a different profile to cater for the normal build and execute both targets on the build server like so:
mvn clean install -Dbuild=prod -Pdeploy
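During an actual release, the same switches can be forwarded to the forked build via the release plugin's arguments property; a sketch, assuming maven-release-plugin is configured for the project:
mvn release:perform -Darguments="-Dbuild=prod -Pdeploy"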
Maybe Cargo can do this. Look at AppFuse for example; it uses mvn jetty:run-war to deploy to Jetty and mvn cargo:start to deploy to Tomcat.
