What is the best practice when you run Spring Boot Tests that depend on Testcontainers on Google Cloud Build?
Currently we have the following setup (irrelevant steps omitted):
steps:
# Git
- name: 'gcr.io/cloud-builders/git'
  args: ['-c', 'http.sslVerify=false', 'clone', '<my source repository>',
         '--branch', 'my-tag', '--single-branch']
# Build
- name: 'gcr.io/cloud-builders/mvn'
  args: ['package', '-DskipTests=true']
  dir: src-dir
We have added some tests that use Testcontainers, and we want to enable the tests for the build on the cloud, but to run them there we need a cloud builder that has both Maven and Docker.
As far as I know, I can't run the tests in a separate step from the build. Is there a cloud builder that combines Maven and Docker, or is the only way to create a custom one?
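For illustration, this is the kind of custom builder I have in mind: a Dockerfile that adds the Docker CLI on top of a Maven image (untested sketch; the base image tag and builder name are placeholders, and Testcontainers additionally needs to reach a Docker daemon, e.g. via /var/run/docker.sock):

# Hypothetical custom builder: Maven plus the Docker CLI
FROM maven:3-jdk-11
RUN apt-get update && apt-get install -y --no-install-recommends docker.io \
    && rm -rf /var/lib/apt/lists/*

which could then run the tests in a step of its own:

# Test (placeholder image name; build and push it to your project's registry first)
- name: 'gcr.io/$PROJECT_ID/mvn-docker'
  args: ['verify']
  dir: src-dir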
Thanks!
I have a Spring Boot app which is deployed to AWS Elastic Beanstalk via Bitbucket Pipelines.
In bitbucket-pipelines.yml, I first create the app.jar file, which contains 2 script files. Then, in the script section, I deploy the jar file to AWS Elastic Beanstalk as follows:
script:
  - pipe: atlassian/aws-elasticbeanstalk-deploy:0.6.7
    variables:
      AWS_ACCESS_KEY_ID: '$AWS_ACCESS_KEY_ID'
      AWS_SECRET_ACCESS_KEY: '$AWS_ACCESS_KEY_SECRET'
      AWS_DEFAULT_REGION: 'xxxxxx'
      APPLICATION_NAME: 'xxxxx'
      ENVIRONMENT_NAME: 'xxxxx'
      ZIP_FILE: 'myapp-$version.jar'
      S3_BUCKET: '$S3_BUCKET'
      VERSION_LABEL: 'myapp-${BITBUCKET_BUILD_NUMBER}'
      COMMAND: 'all'
This works fine and the app is getting deployed. Now I want to run the 2 scripts, which I put inside the jar file, after the deployment. Basically these scripts do some tasks just before the app is ready to be used (soon after the successful deployment). Where do I define how to run those 2 scripts after the deployment to Elastic Beanstalk?
I know bitbucket-pipelines.yml can have an after-script section, but that runs before the Elastic Beanstalk deployment, and it does not happen inside the EC2 instance; it probably happens on the Bitbucket build server.
I also noticed that there are post-deploy hooks in Elastic Beanstalk. How do I declare those scripts in bitbucket-pipelines.yml? Or where do I put those scripts inside the myapp.jar file? Inside myapp.jar I have BOOT-INF, META-INF and org.springframework stuff. I can copy those scripts into BOOT-INF/classes via the Maven build.
Has anyone done this using Spring Boot, Maven, Bitbucket Pipelines, AWS Elastic Beanstalk, etc.?
The application source bundle I created using Maven is the Spring Boot executable jar file (myapp.jar). Should I create an archive wrapping myapp.jar and include those .platform/hooks/postdeploy/ scripts... etc.? Something like the layout sketched below is what I have in mind.
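Hypothetical source bundle layout, from my reading of the hooks docs (script names are placeholders; the hooks would sit at the root of the bundle rather than inside the jar, and must be executable):

myapp-bundle.zip
├── myapp.jar
├── Procfile
└── .platform/
    └── hooks/
        └── postdeploy/
            ├── 01_first_script.sh    # placeholder names
            └── 02_second_script.sh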
I'm new to GitHub Actions (coming from GitLab CI) and I'm trying to run an integration test with Testcontainers in the pipeline, but I'm stuck. Here is my current definition:
name: Run Gradle
on: push
jobs:
  gradle:
    strategy:
      matrix:
        os: [ ubuntu-18.04 ]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v1
      - uses: actions/setup-java@v1
        with:
          java-version: 11
      - uses: eskatos/gradle-command-action@v1
        with:
          build-root-directory: backend
          wrapper-directory: backend
          arguments: check assemble
How can I ensure that the Docker daemon Testcontainers needs is available during my run?
You can check the installed packages/software for each GitHub Actions runner in the virtual-environments GitHub repository.
For ubuntu-18.04 you can find the list here. Docker and Docker Compose are already installed on the runner, and you can use them with Testcontainers without any additional configuration.
I'm using GitHub Actions for a lot of projects that make heavy use of Testcontainers without any problems.
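If you want to sanity-check a runner, an optional step before the Gradle step will confirm the daemon is reachable (docker info just queries the preinstalled daemon):

      # Optional sanity check: confirms the preinstalled Docker daemon is up
      - name: Check Docker
        run: docker info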
We have the AutoDevOps feature implemented with the help of GitLab Runner, and we manage the CD stage with ArgoCD. The CI pipeline builds a Docker image, pushes it to the GitLab registry, and the CD stages use the pushed image to deploy the application with the help of ArgoCD. On every commit, GitLab Runner triggers the pipeline. Is there a way to use ArgoCD alone to handle this scenario, so that the pipeline gets triggered automatically without having to configure runners?
To avoid having both GitLab Runner and ArgoCD running in your cluster, you would configure a GitLab webhook pointing at ArgoCD, as described in its Git Webhook Configuration documentation.
Your ArgoCD application would then handle all the rest.
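Roughly (the host name is a placeholder), you would add a webhook in your GitLab project under Settings > Webhooks that fires on push events and points at ArgoCD's webhook endpoint:

URL: https://<your-argocd-host>/api/webhook
Trigger: Push events

ArgoCD then refreshes the matching application immediately instead of waiting for its polling interval.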
I have a Spring Boot application which I want to auto-deploy to App Engine. I don't want to create a Docker image and then deploy it. The build is failing due to a 'Cloud SDK not found' error:
[ERROR] Failed to execute goal com.google.cloud.tools:appengine-maven-plugin:1.3.2:deploy (default-cli) on project location-finder-rest-api: Execution default-cli of goal com.google.cloud.tools:appengine-maven-plugin:1.3.2:deploy failed: The Google Cloud SDK could not be found in the customary locations and no path was provided.
I followed all the guidelines at https://cloud.google.com/source-repositories/docs/quickstart-triggering-builds-with-source-repositories.
As per the documentation, the app.yaml file is created at src/main/appengine. The contents of app.yaml are:
# [START runtime]
runtime: java
env: flex
handlers:
- url: /.*
  script: this field is required, but ignored
runtime_config: # Optional
  jdk: openjdk8
manual_scaling:
  instances: 1
# [END runtime]
In order to trigger the build, I have to specify the cloudbuild.yaml file. The contents of this file are:
steps:
- name: 'gcr.io/cloud-builders/mvn'
  args: ['appengine:deploy', '-Pprod']
The official document for the cloud builder suggests using 'install' as an argument to the mvn step, but that step does not deploy the application.
Am I missing any configuration?
Under the hood, the appengine:deploy goal uses the Cloud SDK to actually deploy your app, and the Cloud SDK isn't provided by the gcr.io/cloud-builders/mvn image (each Cloud Build step runs in its own container).
You could use separate build steps to install and deploy your app, something like:
steps:
- name: 'gcr.io/cloud-builders/mvn'
  args: ['install']
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['app', 'deploy']
It worked after making slight modifications to the solution suggested above by LundinCast. Additionally, the App Engine Maven plugin needs to be updated to 2.0.0+, since that version automatically downloads the necessary dependencies.
steps:
- id: 'Stage app using mvn appengine plugin on mvn cloud build image'
  name: 'gcr.io/cloud-builders/mvn'
  args: ['package', 'appengine:stage', '-Pprod']
- id: "Deploy to app engine using gcloud image"
  name: 'gcr.io/cloud-builders/gcloud'
  args: ['app', 'deploy', 'target/appengine-staging/app.yaml']
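For reference, the plugin upgrade mentioned above is just a version bump in pom.xml (2.0.0 shown for illustration; any 2.x release works the same way):

<plugin>
  <groupId>com.google.cloud.tools</groupId>
  <artifactId>appengine-maven-plugin</artifactId>
  <version>2.0.0</version> <!-- illustrative; pick a current 2.x release -->
</plugin>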
In my project I use Travis-CI for continuous integration (builds on every MR to the master branch) and also for deploying the artifact to Heroku. Here is my .travis.yml file:
language: java
jdk: oraclejdk8
branches:
  only:
    - master
script:
  mvn package
deploy:
  provider: heroku
  api_key: $HEROKU_API_KEY
notifications:
  email:
    on_success: never
    on_failure: always
And here is my Procfile:
web: java -Dserver.port=$PORT -jar target/my-artifact.jar
Here you can see that I use the PORT Heroku variable, but I also use a few custom variables. Sometimes I need to update their values after a new build. Previously I did this manually, but I'm looking for a way to automate it: I need to update Heroku environment variables with values that I determine during the Travis-CI build. How can I do that?
You can set your environment variables using the Heroku platform API: https://devcenter.heroku.com/articles/platform-api-reference#config-vars
In Travis, you can run a task pre-deploy using the 'before_deploy' step (https://docs.travis-ci.com/user/customizing-the-build#The-Build-Lifecycle)
So create a script that uses the Heroku Platform API to update your environment variables, and run it as part of your before_deploy step, something like the sketch below.
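For example (the script name, app name, and variable names are placeholders; the PATCH /apps/{app}/config-vars call is from the Platform API reference above):

#!/usr/bin/env bash
# update-heroku-config.sh (hypothetical helper)
# Pushes a value computed during the Travis build into a Heroku config var.
# HEROKU_APP_NAME, MY_VAR and MY_VALUE are placeholders.
curl -sf -X PATCH "https://api.heroku.com/apps/$HEROKU_APP_NAME/config-vars" \
  -H "Accept: application/vnd.heroku+json; version=3" \
  -H "Authorization: Bearer $HEROKU_API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"MY_VAR\": \"$MY_VALUE\"}"

and hook it into .travis.yml:

before_deploy:
  - ./update-heroku-config.sh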