Jenkins Multi-Pipeline Build Not Detecting Changes in Repository - jenkins-pipeline

We have a Subversion repository set up in this manner:
http://svn.vegicorp.net/svn/toast/api/trunk
http://svn.vegicorp.net/svn/toast/api/1.0
http://svn.vegicorp.net/svn/toast/data/trunk
http://svn.vegicorp.net/svn/toast/data/branches/1.2
http://svn.vegicorp.net/svn/toast/data/branches/1.3
I've set up a Jenkins Multi-Pipeline build for the entire toast project, including all sub-projects -- each sub-project produces a jar file. What I want is for Jenkins to fire off a new build of a sub-project each time any file in that sub-project changes. This way, if we create a new sub-project in toast, or a new branch in one of the toast sub-projects, Jenkins will automatically create a new build for it.
Here's my Jenkins Multi-Branch setup:
Branch Sources
Subversion
Project Repository Base: http://svn.vegicorp.net/svn/toast
Credentials: builder/*****
Include Branches: */trunk, */branches/*
Exclude Branches: */private
Property Strategy: All branches get the same properties
Build Configuration
Mode: By Jenkinsfile
Build Triggers (None selected)
Trigger builds remotely (e.g., from scripts)
Build periodically
Build when another project is promoted
Maven Dependency Update Trigger
Periodically if not otherwise run
Note that the list of Build Triggers does not include Poll SCM. Changes in the repository do not trigger any build. Jenkinsfiles are located at the root of each sub-project. If I force a reindex, all changed sub-projects get built and all new branches are found. I did originally check Periodically and reindex every minute to pick up changes, but that's klutzy and it seems to cause Jenkins to consume memory.
Triggering a build on an SCM change should be pretty basic, but I don't see a configuration parameter for this like I do with standard jobs. I also can't seem to go into sub-projects and set those to trigger builds either.
There must be something really, really simple that I am missing.
Configuration:
Jenkins 2.19
Pipeline 2.3
Pipeline API: 2.3
Pipeline Groovy: 2.17
Pipeline Job: 2.6
Pipeline REST API Plugin: 2.0
Pipeline Shared Groovy Libraries: 2.3
Pipeline: Stage View Plugin: 1.7
Pipeline: Supporting APIs: 2.2
SCM API Plugin: 1.2

I finally found the answer. I found an entry in the Jenkins Jira database that describes this exact issue, called "SCM polling is not being performed in multibranch pipeline with Mercurial SCM". Other users chimed in too.
The answer was that Jenkins Multi-branch projects don't need to poll the SCM because indexing the branches does that for you:
Branch projects (the children) do not poll in isolation. Rather, the multibranch project (the parent folder) subsumes that function as part of branch indexing. If there are new heads on existing branches, new branch project builds will be triggered. You need merely check the box Periodically if not otherwise run in the folder configuration.
So, I need to set up periodic reindexing of the branches. I'm not happy with this solution because it seems rather clumsy. I can add post-commit and post-push hooks in SVN and Git to trigger builds when a change takes place, and then reindex on a periodic basis (say once per hour). The problem is that this means configuring those hooks and then keeping them up to date. Each project needs its own POST action, which means updating the repository server every time a project changes. With polling, I didn't have to worry about hook maintenance.
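Depending on the Pipeline plugin versions, it also appears that each branch's Jenkinsfile can register its own polling trigger through the properties step, so the branch jobs poll between index runs. A rough sketch -- the schedule and build step below are only examples, and I have not verified this against our plugin versions:

    // Jenkinsfile at the root of a sub-project branch (sketch, unverified)
    properties([
        pipelineTriggers([
            pollSCM('H/15 * * * *')    // poll roughly every 15 minutes; 'H' spreads the load
        ])
    ])

    node {
        checkout scm                   // check out the branch this job was generated for
        stage('Build') {
            sh 'mvn -B clean verify'   // placeholder build step for one sub-project jar
        }
    }

Note that the trigger only takes effect after the branch job has run at least once, since Jenkins has to execute the Jenkinsfile before it sees the properties step.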

You never mentioned setting up a webhook for your repository, so this may be the problem (or part of it).
Jenkins by itself can't know when changes have been made to a repository. The repository needs to be configured to broadcast when changes are made. A webhook defines a URL that the repository can POST various bits of information to. Point it at a URL Jenkins listens on, and Jenkins can respond to the specific types of information it receives.
For example, if you were using GitHub, you could have Jenkins listen on a URL such as https://my-jenkins.com/github-webhook/. GitHub could be configured to send a POST as soon as a PR is opened, or a merge is performed. This POST not only signals that the action was performed, it also contains information about the action, such as the SHA, branch name, user performing the action, etc.
Both Jenkins and SVN should let you define the URLs they respectively listen on and POST to.
My knowledge lies more specifically with Git, but this may be a good place to start for SVN webhooks: http://help.projectlocker.com/knowledge_base/topics/how-do-i-use-subversion-webhooks
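To make the idea concrete, here is a sketch of what an SVN post-commit hook might send to Jenkins, written as a standalone Groovy script the hook could call. It assumes the Jenkins Subversion plugin's commit-notification endpoint; the Jenkins URL is a placeholder, and the exact endpoint and behaviour should be verified against your plugin version:

    // post-commit.groovy -- called from the repository's post-commit hook as:
    //   groovy /path/to/post-commit.groovy "$REPOS" "$REV"
    // Sketch only: the Jenkins URL below is a placeholder.
    def (repos, rev) = [args[0], args[1]]
    def uuid    = ['svnlook', 'uuid', repos].execute().text.trim()
    def changed = ['svnlook', 'changed', '--revision', rev, repos].execute().text
    def url     = new URL("https://my-jenkins.example.com/subversion/${uuid}/notifyCommit?rev=${rev}")

    def conn = url.openConnection()
    conn.requestMethod = 'POST'
    conn.doOutput = true
    conn.setRequestProperty('Content-Type', 'text/plain;charset=UTF-8')
    conn.outputStream.withWriter('UTF-8') { it << changed }   // body: list of changed paths
    println "Jenkins responded with HTTP ${conn.responseCode}"

Bear in mind this only tells Jenkins that something changed; which jobs actually react to the notification still depends on how those jobs are configured.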

Maybe you need something under version control in the base directory. Try putting a test file there, e.g. http://svn.vegicorp.net/svn/toast/test.txt. That may make the Poll SCM option show up.

Related

Can Bitbucket rerun pull request checks when target branch is modified?

I am currently setting up a CI system that will check for a passing deployment against a test environment as part of a pre-merge pull request check. This system uses Bamboo and Bitbucket, and will stop devs from merging their feature branches into the main branch if this validation fails. However, I am running into the (possibly common on my project) corner case of multiple pull requests being open at the same time, passing validation, and then being merged. In this scenario the PRs might each pass validation separately while all of them combined would break the build (e.g., PR #1 modifies a method name referenced by PR #2).
Is there a way to configure Bitbucket / Bamboo to rerun builds on pull requests if the target branch has been modified since the check last ran?
At the Git (Bitbucket) level you could make sure to synchronize any feature or bugfix branch that has an outgoing pull request by merging the latest common target branch (for example, develop) into it immediately after a successful pull request merge to that target. This invalidates the 'latest' build result on those feature or bugfix branches, and they get rebuilt because their Git commit hash has changed. If any rebuild of a feature branch fails, it won't be merged.
Ulrich

Converting DSL jobs into a pipeline in Jenkins

I'm trying to migrate old-fashioned Jenkins DSL jobs (written in Groovy) to the new declarative pipeline form.
Since I'm very new to pipelines and could not find any answer to my noob problem, I'll first describe my scenario here:
Suppose I have three DSL jobs: one that builds and stores the generated artifact in a repository like Artifactory, one that tags the master branch, and one that deploys to prod. All three jobs use the same Git repository.
The building job is usually run many times during development. It can be triggered manually or as a response to events in the Git repo, e.g. merge requests and pushes.
For simplicity, let's assume the tagging job only needs to tag the master branch in the repo. This will only be run once in a while, manually, when we are pretty sure the master branch will go to prod.
Artifact gets deployed using a third job, also manually.
So here are my questions:
As I understand it, we can only have one Jenkinsfile per branch in the repo, so how can I configure such a setup using a pipeline defined in a single Jenkinsfile?
How can I manually trigger only the tagging job (meaning compile/test/generate the artifact without uploading, and then, if everything tests OK, tag the version)?
In this situation, will it be easier for me if I just implement the building job in the pipeline and keep the others as DSL scripts?
Many thanks for any suggestions!
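One common shape for this, assuming a single parameterized pipeline is acceptable, is to keep everything in one Jenkinsfile and guard the tag and deploy stages behind manual parameters. A minimal sketch -- the parameter names, Maven goals, tag command, and deploy script are all placeholders, not a known-good configuration:

    pipeline {
        agent any
        parameters {
            booleanParam(name: 'TAG_RELEASE', defaultValue: false, description: 'Tag this build as a release')
            booleanParam(name: 'DEPLOY_PROD', defaultValue: false, description: 'Deploy the built artifact to prod')
        }
        stages {
            stage('Build & Test') {
                steps { sh 'mvn -B clean verify' }    // placeholder build/test step
            }
            stage('Publish artifact') {
                steps { sh 'mvn -B deploy' }          // e.g. upload to Artifactory
            }
            stage('Tag') {
                // runs only when started manually with TAG_RELEASE checked, and only on master
                when { expression { params.TAG_RELEASE && env.BRANCH_NAME == 'master' } }
                steps {
                    sh "git tag -a build-${env.BUILD_NUMBER} -m 'tagged by Jenkins'"
                    sh "git push origin build-${env.BUILD_NUMBER}"
                }
            }
            stage('Deploy') {
                when { expression { params.DEPLOY_PROD } }
                steps { sh './deploy.sh prod' }       // placeholder deploy script
            }
        }
    }

Here env.BRANCH_NAME assumes a multibranch job; push-triggered builds leave both parameters at false so only the build and publish stages run, while a manual "Build with Parameters" run can opt into tagging or deploying.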

Jenkins autobuild goes into a loop when github commit triggers the build

I created a webhook in Jenkins and connected it to the GitHub webhooks & services settings.
I came upon the following issue: when the build completes, the pom.xml is updated with the version and tag. This triggers the build job again, and it goes into a loop until I manually stop it.
I have set the build trigger to "Build when a change is pushed to GitHub".
How can I stop the build from being triggered when pom.xml is updated only as part of the build?
In the job configuration's Source Code Management section, add the Additional Behaviour "Polling ignores commits from certain users" and provide the user name your Jenkins job uses to check in pom.xml. You can also use "Polling ignores commits in certain paths" and provide the path to pom.xml.
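If the job is a Pipeline job rather than a freestyle one, the same two behaviours can, as far as I know, be expressed as extensions on the Git checkout step. Treat the class and field names below as assumptions to verify against your Git plugin version; the repository URL and user name are placeholders:

    checkout([$class: 'GitSCM',
        branches: [[name: '*/master']],
        userRemoteConfigs: [[url: 'https://github.com/example/repo.git']],       // placeholder repo
        extensions: [
            [$class: 'UserExclusion',   excludedUsers: 'jenkins-ci'],            // ignore commits made by the CI user
            [$class: 'PathRestriction', excludedRegions: 'pom\\.xml', includedRegions: '']  // ignore commits touching only pom.xml
        ]
    ])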
I'd suggest not committing the version update to the master branch, but creating a separate tag every time. Something like this:
    v1   v2   v3
   /    /    /
--A----B----C   (master)
I got this approach from Real-World Strategies for Continuous Delivery with Maven and Jenkins video (corresponding slides) - it contains other tips on setting up build pipelines with Maven as well.

Choose multiple branches when performing a build

We have multiple layers of our products split into different build configurations for continuous integration. For the sake of this question, let's just say we have a "Front-End CI" build, and a "API CI" build. The VCS roots are configured to pull in all branches, and triggered to run upon checkin, as should be expected for CI.
Now I have my User Acceptance project, where I use CloudFormation to dynamically spin up servers to deploy to. I have snapshot dependencies set up for the CI builds mentioned above, and everything works as expected for the default branches on each of the VCS roots and dependencies. I expect that a feature branch for the front-end may not necessarily require a corresponding branch of the API, and the current setup accounts for that as well.
That's where I begin to have issues. If I have to branch both the front-end and the API, I cannot get TeamCity to do what I want. My question is this: how do I tell TeamCity to run a UA build using branch "A" from the Front-End CI build config and branch "B" from the API CI build config, where "A" and "B" can be any arbitrary branches? Right now, all branches from both snapshots are shown when I look at the UA build config.
If I run api-branch, it always uses the default branch from the Front-End CI snapshot, and the same goes for any branch on the front-end snapshot. I cannot seem to find a way to specify this in the configuration or when starting a build.
I'm up for just about anything to address this, including build configs that are cloned off each other just to pin the branches, but I'm not seeing how I can do that either. Thanks!
Create a TeamCity template target which monitors both the front-end and API repositories and can trigger on changes. This should be one target (not two different targets). Parameterize the branch names so that the actual targets have to supply the branch name.
I would suggest creating a mapping of the frontend:api branches in a datastore (file, DB, NoSQL). Then dynamically create TeamCity targets (through the REST API) for each new or modified combination and explicitly set the branch names. Once the targets are created they will automatically run whenever there are any changes.

Trigger local Jenkins build with a push to a Bitbucket repository

So I currently have Jenkins set up on a Mac Mini that is connected to my local network. What I would like to do is have Jenkins execute a build when a push is made to my remote Git repository on Bitbucket. Based on the research I have done so far, some people use a Bitbucket POST hook to notify Jenkins when a push is made to the repository. However, this method seems to require Jenkins to be reachable from the internet. Is there a way to trigger a local Jenkins build from a remote Git repository? Perhaps there is a specific plugin that I should install?
If you don't want to expose your Jenkins machine to the world, you can instead have it poll your Git repository looking for changes:
Builds by source changes
You can have Jenkins poll your Revision Control System for changes. You can specify how often Jenkins polls your revision control system using the same syntax as crontab on Unix/Linux. However, if your polling period is shorter than it takes to poll your revision control system, you may end up with multiple builds for each change. You should either adjust your polling period to be longer than the amount of time it takes to poll your revision control system, or use a post-commit trigger. You can examine the Polling Log for each build to see how long it took to poll your system.
Alternatively, instead of polling on a fixed interval, you can use a URL trigger (described above), but with /polling instead of /build at the end of the URL. This makes Jenkins poll the SCM for changes rather than building immediately. This prevents Jenkins from running a build with no relevant changes for commits affecting modules or branches that are unrelated to the job. When using /polling the job must be configured for polling, but the schedule can be empty.
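If the job in question is a Pipeline job, the same polling behaviour can also be declared in the Jenkinsfile itself, using the crontab-like syntax described above. A minimal sketch -- the schedule and build step are only examples:

    pipeline {
        agent any
        triggers {
            pollSCM('H/10 * * * *')    // ask Bitbucket for changes roughly every ten minutes
        }
        stages {
            stage('Build') {
                steps { sh './gradlew build' }   // placeholder build step
            }
        }
    }

The trigger is only registered once the job has run at least once, so kick off one manual build after adding it.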
