How to deploy an image to Kubernetes with Jib and Maven

I have an environment where I can simply push images (created with Jib) to a local repository. I now want to be able to deploy this on Kubernetes, but from the "safety" of Maven.
I know I can spin up some Skaffold magic, but I'd rather not have it installed separately. Is there some Jib-Skaffold workflow I can use to continuously force Skaffold to redeploy on source change (without running it on the command line)?
Is there some Skaffold plugin? I really like what they have here, but the proposed kubernetes-dev-maven-plugin is probably internal only.

Skaffold can monitor your local code and detect changes that trigger a build and deployment in your cluster. This is built into Skaffold's dev mode, so it solves the redeploy-on-source-change part.
As for the workflow, Jib is a supported builder for Skaffold, so the same dynamic applies.
Although these features automate the tasks, it is still necessary to run skaffold dev once and let it run in the "background".
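For reference, a minimal skaffold.yaml using the Jib builder might look roughly like this; the image name and manifest path are assumptions to adjust to your project:
apiVersion: skaffold/v2beta29
kind: Config
build:
  artifacts:
    # Built with the Jib Maven plugin instead of a Dockerfile.
    - image: my-registry/my-app   # assumed image name
      jib: {}
deploy:
  kubectl:
    manifests:
      - k8s/*.yaml                # assumed location of your Kubernetes manifests
With a config like this, skaffold dev watches the sources, rebuilds the image with Jib on every change, and redeploys the manifests to the cluster.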

Related

Deploying a Maven project with multiple executables on Cloud Run

I have a Maven project structured like this:
/root
    /CommonProject
    /executable1
    /executable2
    /subroot
        /subrootCommon
        /...
So far I am trying to deploy just executable1.
I wanted the project to use Java 19, but I am fine with Java 17 if that's easier.
When I activate Cloud Shell, I am able to:
change $JAVA_HOME to JDK 17,
clone the project,
package it with Maven,
run it in Cloud Shell.
However, my project has no mapping for "/", just specific endpoints like "/test/hello", and I do not see anything in Web Preview on port 8080.
I have tried different ways to deploy. I am not familiar enough with Docker, so I tried Cloud Run with Cloud Build triggered from source.
Here lies my current problem: every build has failed so far. It is using JDK 11, which is a problem (or at least one of them).
I have also tried adding a cloudbuild.yaml or a local Dockerfile just to deploy a jar built manually, but I am still failing.
FROM openjdk:17
# Cloud Run sends requests to the port in $PORT (8080 by default), so the jar must listen there.
COPY root/target/executable1-1.0-SNAPSHOT-jar-with-dependencies.jar /home/user/var/run/executable1.jar
CMD ["java", "-jar", "/home/user/var/run/executable1.jar"]
I have followed the same deployment steps that were shown in the how-to guides and the available online labs, so I suspect the issue is that buildpacks do not correctly handle projects with module dependencies.
executable1 and executable2 depend on CommonProject. Do I need to split my big Maven project into separate projects and build each of them individually?
I have tried a Dockerfile, a cloudbuild.yaml, and something like a project.toml.
For now I would like to deploy just one project; at some point in the future, all the executable projects from this Maven build.
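For what it's worth, here is a rough cloudbuild.yaml sketch that keeps the Dockerfile approach working from a Cloud Build trigger. The module name, jar path, service name, and region are assumptions, and the COPY path in the Dockerfile may need to point at the module's own target directory:
steps:
  # Build the whole multi-module project so executable1 also gets CommonProject (module name assumed).
  - name: maven:3.9-eclipse-temurin-17
    entrypoint: mvn
    args: ['package', '-pl', 'executable1', '-am', '-DskipTests']
  # Build and push the image using the Dockerfile above.
  - name: gcr.io/cloud-builders/docker
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/executable1', '.']
  - name: gcr.io/cloud-builders/docker
    args: ['push', 'gcr.io/$PROJECT_ID/executable1']
  # Deploy the pushed image to Cloud Run (service name and region assumed).
  - name: gcr.io/google.com/cloudsdktool/cloud-sdk
    entrypoint: gcloud
    args: ['run', 'deploy', 'executable1', '--image', 'gcr.io/$PROJECT_ID/executable1', '--region', 'us-central1']
images:
  - gcr.io/$PROJECT_ID/executable1
This sidesteps the buildpacks JDK question entirely, because the Java version is pinned by the Maven image and by the base image in the Dockerfile.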

Implement GitLab Auto DevOps with ArgoCD

We have the Auto DevOps feature implemented with the help of GitLab Runner, and we manage the CD stage with ArgoCD. The CI pipeline builds a Docker image and pushes it to the GitLab registry, and the CD stages use the pushed image to deploy the application with ArgoCD. On every commit, GitLab Runner triggers the pipeline. Is there a way to use ArgoCD alone to handle this scenario, so that the pipeline gets triggered automatically without having to configure runners?
To avoid having both GitLab Runner and ArgoCD running in your cluster, you would configure a GitLab webhook pointing to ArgoCD's Git webhook endpoint.
Your ArgoCD Application would then handle all the rest.
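As a rough sketch (names and URLs are placeholders), the ArgoCD side is an Application with automated sync, plus a GitLab webhook aimed at ArgoCD's /api/webhook endpoint so a push triggers an immediate sync instead of waiting for the polling interval:
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: my-app                  # placeholder application name
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://gitlab.example.com/my-group/my-app.git   # placeholder repository
    targetRevision: main
    path: k8s                   # directory containing the manifests that reference the pushed image
  destination:
    server: https://kubernetes.default.svc
    namespace: my-app
  syncPolicy:
    automated:
      prune: true
      selfHeal: true
The GitLab webhook would then point at https://<your-argocd-host>/api/webhook.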

Using Kubernetes plugin for Jenkins project instead of pipeline

I'm using the Kubernetes plugin for Jenkins (https://github.com/jenkinsci/kubernetes-plugin).
Using their documentation, I was able to write a Jenkins pipeline script that creates a pod and runs some Maven commands inside the pod's maven container. There is another kubectl container running some kubectl commands. I haven't done anything fancy with it yet other than trying it out.
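(For context, a pod definition of the kind described above, as consumed by the Kubernetes plugin, might look roughly like the following; the container images are just examples.)
apiVersion: v1
kind: Pod
spec:
  containers:
    # Container used for the Maven build steps.
    - name: maven
      image: maven:3.9-eclipse-temurin-17
      command: ['sleep']
      args: ['99d']
    # Container used for the kubectl deployment steps.
    - name: kubectl
      image: bitnami/kubectl:latest
      command: ['sleep']
      args: ['99d']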
I would like to create two Jenkins projects (or jobs): one for the Maven step and the other for the kubectl step, and then combine the two jobs into a single pipeline. At the end, there would be two individual jobs and one pipeline running those two jobs, equivalent to what I described in the previous paragraph. I did not see a way to do this for the Kubernetes parts; specifically, I did not see a way, with a Jenkins project (unlike a Jenkins pipeline), to create a pod with a maven container and do something within that container.
Is it possible to do what I'm saying by using the Kubernetes Plugin or not?
Is there a better way to do this?
If not possible, is there another way to do something similar?

How to configure an Azure DevOps release pipeline task to kick off automated UI test scripts in Xcode on a MacBook?

I have set up the build pipeline in Azure DevOps to generate a build of an Xcode automation project. For that, I have used a Microsoft-hosted macOS agent on my MacBook. Now I want to set up a release pipeline to kick off the automated test scripts from TFS/Azure DevOps Server on the same MacBook. I am not sure what configuration I need to use in the release pipeline task. If someone has done this, could you please help me step by step?
Did you mean that you want the automated tests from Azure DevOps to run against your local MacBook?
If that is your intention, you need to set up a self-hosted macOS agent on your local MacBook.
Please check here to create a self-hosted agent.
In the release pipeline, associate your release pipeline with the build artifacts from your build pipeline. Make sure the build artifacts include your test code; if not, you may need to add a publish-artifact task to your build pipeline so the test code is part of the artifacts downloaded and used in the release pipeline.
In your stage, create an agent job with the agent pool set to the pool containing your self-hosted agent, and add an Xcode task to run your tests. When you run the release pipeline, the tests will run on your local MacBook.
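If you ever move from the classic release editor to a YAML pipeline, the equivalent stage might look roughly like this; the pool name, scheme, and workspace path are assumptions:
stages:
  - stage: UITests
    jobs:
      - job: RunXcodeUITests
        pool:
          name: MySelfHostedMacPool   # pool containing the self-hosted agent on your MacBook
        steps:
          # Run the UI test scheme with the Xcode task on the self-hosted agent.
          - task: Xcode@5
            inputs:
              actions: 'test'
              configuration: 'Debug'
              sdk: 'iphonesimulator'
              scheme: 'MyUITestScheme'
              xcWorkspacePath: '**/MyApp.xcworkspace'
              xcodeVersion: 'default'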
Here are documents about how to build, test, and deploy with Xcode. Hope you find the above helpful.

Continuous Deployment of builds onto servers from build server

I'm using Ansible to deploy and install builds onto my servers, but I have to feed Ansible the build name for it to grab and deploy. I would like to close this loop, since I have to deploy builds three times a day. Is there a tool that, every time it sees a new build, will automatically invoke the Ansible playbook, or should I go ahead and write my own tool to do this? I'm open to suggestions.
Ansible itself can't do this for you.
But there are actually plenty of other options available, from a simple crontab script to complete CI/CD tools such as Jenkins.
I have used Jenkins for a while and I can confirm that Jenkins can do this for you: once a commit is done, it can compile your solution and deploy it to the required environment.
