Using Kubernetes plugin for Jenkins project instead of pipeline - maven

I'm using the Kubernetes plugin for Jenkins (https://github.com/jenkinsci/kubernetes-plugin).
Using their documentation, I was able to create a Jenkins Pipeline to create a pod and run some maven commands inside the maven container within the pod with the use of a Jenkins pipeline script. There is another kubectl container running some kubectl commands. I haven't done anything fancy with it yet other than trying it out.
I would like to create two Jenkins projects (or jobs), one for the Maven step and one for the kubectl step, and then combine the two jobs into a single pipeline. At the end there would be two individual jobs, plus one pipeline running those two jobs, equivalent to what I described in the previous paragraph. I did not see a way to do this for the Kubernetes parts; specifically, I did not see a way for a regular Jenkins project (as opposed to a Jenkins Pipeline) to create a pod with a Maven container and do something within that container.
Is it possible to do what I'm saying by using the Kubernetes Plugin or not?
Is there a better way to do this?
If not possible, is there another way to do something similar?
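For reference, here is a minimal sketch of the kind of pipeline described above, using the plugin's podTemplate/container steps; the image tags and shell commands are placeholders, not taken from the question:

podTemplate(containers: [
    containerTemplate(name: 'maven', image: 'maven:3.9-eclipse-temurin-17', command: 'sleep', args: '99d'),
    containerTemplate(name: 'kubectl', image: 'bitnami/kubectl:latest', command: 'sleep', args: '99d')
]) {
    node(POD_LABEL) {
        stage('Maven') {
            container('maven') {
                // run the Maven build inside the maven container
                sh 'mvn -B clean package'
            }
        }
        stage('kubectl') {
            container('kubectl') {
                // run kubectl commands inside the kubectl container
                sh 'kubectl get pods'
            }
        }
    }
}

One possible way to get the two-jobs-plus-one-pipeline layout, if the individual jobs exist, is a small orchestrating pipeline that calls the built-in build step (build job: 'maven-job', then build job: 'kubectl-job'), though the individual jobs would still need some way to run inside pod containers.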

Related

Spring boot environment based configuration with docker in Pipeline

I have one microservice that runs through a pipeline onto an AWS EKS cluster, so it passes through each environment. I need to update this application so that, whenever it passes through a different environment, it picks up environment-specific variables. My plan was to add an application.properties file for each environment's build and set SPRING_PROFILES_ACTIVE=prod|dev|test as required. But I am new to this pipeline stuff and need to understand where to add these profile-specific properties, so that each time the build runs in a different environment it activates the matching profile.
If you are using a Jenkins pipeline (or any other pipeline) for the build, have an environment variable for each environment; during the Docker build, the image will then be built for whichever environment you select. Something like this:

docker build --build-arg ENVIRONMENT=$ENVIRONMENT .

(where $ENVIRONMENT can come from a pipeline parameter), with the Dockerfile turning the build argument into the active profile:

ARG ENVIRONMENT
ENV SPRING_PROFILES_ACTIVE=$ENVIRONMENT
Alternatively, you can have one properties file per environment and have that specific file picked up during the Docker build.
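As a minimal Jenkinsfile sketch of the first approach (the parameter name DEPLOY_ENV and the image name my-service are assumptions, not from the question):

pipeline {
    agent any
    parameters {
        choice(name: 'DEPLOY_ENV', choices: ['dev', 'test', 'prod'], description: 'Target environment')
    }
    stages {
        stage('Build image') {
            steps {
                // pass the selected environment into the Docker build;
                // the Dockerfile maps it onto SPRING_PROFILES_ACTIVE
                sh "docker build --build-arg ENVIRONMENT=${params.DEPLOY_ENV} -t my-service:${params.DEPLOY_ENV} ."
            }
        }
    }
}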

How to deploy image to kubernetes with jib and maven

I have an environment where I can simply push images (created with Jib) to a local repository. I now want to be able to deploy this to Kubernetes, but from the "safety" of Maven.
I know I can spin up some Skaffold magic, but I don't want to have it installed separately. Is there some Jib-Skaffold workflow I can use to continuously force Skaffold to redeploy on source changes (without running it on the command line)?
Is there some Skaffold plugin? I really like what they have here, but the proposed kubernetes-dev-maven-plugin is probably internal-only.
Skaffold can monitor your local code and detect changes that trigger a build and deployment in your cluster. This is built into Skaffold's dev mode, so it solves the redeploy-on-source-change part.
As for the workflow, Jib is a supported builder for Skaffold, so the same dynamic applies.
Although these features automate the tasks, you still need to run skaffold dev once and let it run in the "background".
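As a rough sketch of what that looks like in skaffold.yaml (the image name my-app, the manifest path, and the apiVersion are assumptions to adapt to your project):

apiVersion: skaffold/v2beta29
kind: Config
build:
  artifacts:
    - image: my-app
      jib: {}              # build the image with the Jib Maven/Gradle plugin
deploy:
  kubectl:
    manifests:
      - k8s/*.yaml         # the Kubernetes manifests to (re)apply on each change

With this in place, skaffold dev watches the sources, rebuilds via Jib, and redeploys the manifests on every change.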

Where does Jenkins store the project source

I have a Jenkins job that uses a script to build my project. The script fails on the following line: mvn -e -X -Dgit='$git' release:prepare.
Because I want to search for the cause of this, I want to go to the Jenkins server and run mvn -e -X -Dgit='$git' release:prepare from the command line, to see if it works.
Does Jenkins store the projects' source code somewhere, such that I can go to that folder and call Maven?
If yes, then where?
Yes, by default it stores the project files for the job at
/var/lib/jenkins/workspace/{your-job-name}
This is where Jenkins expects the project files to be, or where it pulls them from source control before it starts working/building with them.
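If you are unsure of the exact path on your installation, a one-step scripted pipeline like this will print it (env.WORKSPACE is set by Jenkins for every build):

node {
    // WORKSPACE points at the directory where this job's files live on the agent
    echo "Workspace: ${env.WORKSPACE}"
}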
Quote from Andrew M.:
"Hudson/Jenkins doesn't quite work that way. It stores configurations and job information in /var/lib/jenkins by default (if you're using the .deb package). If you want to setup persistence for a specific application, that's something you'll want to handle yourself - Hudson is a continuous integration server, not a test framework.
Check out the Wiki article on Continuous Integration for an overview of what to expect."
From this question on Server Fault.
This worked for me:
/var/jenkins/workspace/JobNameExample
but if your build machine (node) is different from the one where Jenkins is running (the controller), you need to specify it:
/var/jenkins/workspace/JobNameExample/label/NodeName
where the label is the one you define in the node's configuration.
Jenkins currently stores its workspace files in /var/jenkins_home/workspace/project_name (I am running it from Docker, though!).

Continuous Deployment of builds onto servers from build server

I'm using Ansible to deploy and install builds onto my servers, but I have to feed Ansible the build name for it to grab and deploy. I would like to close this loop, since I have to deploy builds three times a day. Is there a tool that, every time it sees a new build, automatically invokes the Ansible playbook? Or should I go ahead and write my own tool to do this? I'm open to suggestions.
Ansible itself can't do this for you.
But there are zillions of other options available, from a simple crontab script to complete CI/CD tools such as Jenkins.
I have used Jenkins for a while and I can confirm that it can do this for you. Once a commit is made, it can compile your solution and deploy it to the required environment.
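As a minimal sketch of such a deployment job as a Jenkins pipeline (the playbook name deploy.yml and the parameter BUILD_NAME are placeholders; the job could be triggered by the build job that produces the artifact):

pipeline {
    agent any
    parameters {
        string(name: 'BUILD_NAME', description: 'Name of the build to deploy')
    }
    stages {
        stage('Deploy') {
            steps {
                // hand the build name to the existing Ansible playbook
                sh "ansible-playbook deploy.yml -e build_name=${params.BUILD_NAME}"
            }
        }
    }
}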

Jenkins Multi-Branch Pipeline -- include another script into local Jenkinsfile

We are just starting out using Jenkins multi-branch pipelines. I like the idea of Jenkins automatically creating a new Jenkins job when a new branch is created; it ensures that all releasable development is being built in Jenkins. We have about 40 or 50 projects that get branched for almost every release, and creating those 40 or so jobs every time we branch is error-prone work.
However, I see there are two types of pipeline builds in Jenkins:
Regular Pipeline builds: You specify the location and branch in your Jenkins job. You can also choose whether to use a script written inside the Jenkins job configuration or a script from your source repository. This would allow us to maintain a single Jenkinsfile for all of our jobs: if we change something in the build procedure, we only have to edit a single Jenkinsfile.
Multi-Branch Pipeline builds: Jenkins automatically creates a new Jenkins job for you when a new branch is created, which means we no longer have to create dozens of new Jenkins projects when we branch. However, it looks like the Jenkinsfile must be located at the root of the project, so if you make a basic change to your build procedure, you have to update all Jenkins projects.
I'd like to be able to use the Multi-branch Pipeline build, but I want to either specify where to pull up the Jenkinsfile from our repository, or include a master Jenkinsfile from a repository URL.
Is there a way to do this with Jenkins Multi-branch pipelines?
If you have common build logic across repos, you can move most of the pipeline logic to a separate groovy script. This script can then be referenced in any Jenkinsfile.
This could be done either by checking out the repo that the Groovy script lives in to another directory and then doing a standard Groovy load, or (probably the better approach) by storing it as a Groovy script in the Jenkins Global Script Library, which is essentially a self-contained Git repo within Jenkins
(see https://github.com/jenkinsci/workflow-cps-global-lib-plugin/blob/master/README.md for more details).
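As a minimal Jenkinsfile sketch of the shared-library approach (the library name build-lib and the script name buildCommon are placeholders; the library itself is configured under Manage Jenkins):

// load the configured shared library; the trailing underscore is required
@Library('build-lib') _

// buildCommon is resolved from vars/buildCommon.groovy in that library
buildCommon()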
We had a similar requirement, and created a global Groovy method in a script that was maintained in Git and deployed to Jenkins' global script library under /vars/ whenever it changed.
For example, the script scriptName.groovy contains:
def someMethod() {
    // some build logic
    stage('Some Stage') {
        node() {
            // do something
        }
    }
}
That way the common function can be called in any Jenkinsfile via
scriptName.someMethod()
