So I currently have Jenkins set up on a Mac Mini that is connected to my local network. What I would like to do is have Jenkins execute a build when a push is made to my remote Git repository on Bitbucket. Based on the research I have done so far, some people use a Bitbucket POST hook to notify Jenkins when a push is made to the repository. However, this method only seems to work if Jenkins is hosted on a publicly reachable server. Is there a way to trigger a local Jenkins build from a remote Git repository? Perhaps there is a specific plugin that I should install?
If you don't want to expose your Jenkins machine to the world, you can instead have it poll your Git repository looking for changes:
Builds by source changes
You can have Jenkins poll your Revision Control System for changes. You can specify how often Jenkins polls your revision control system using the same syntax as crontab on Unix/Linux. However, if your polling period is shorter than it takes to poll your revision control system, you may end up with multiple builds for each change. You should either adjust your polling period to be longer than the amount of time it takes to poll your revision control system, or use a post-commit trigger. You can examine the Polling Log for each build to see how long it took to poll your system.
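For instance, the Poll SCM schedule field accepts crontab-style entries like the following (H is a Jenkins token that spreads polling load across jobs; the five-minute interval is just an example):

# poll the repository roughly every five minutes
H/5 * * * *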
Alternatively, instead of polling on a fixed interval, you can use a URL trigger (described above), but with /polling instead of /build at the end of the URL. This makes Jenkins poll the SCM for changes rather than building immediately. This prevents Jenkins from running a build with no relevant changes for commits affecting modules or branches that are unrelated to the job. When using /polling the job must be configured for polling, but the schedule can be empty.
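As a minimal sketch, assuming a job named my-job with a remote-trigger token configured under "Trigger builds remotely", and a placeholder Jenkins hostname:

# ask Jenkins to poll the SCM; a build runs only if relevant changes are found
curl "http://jenkins.local:8080/job/my-job/polling?token=SECRET"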
Question Summary
Is it possible to have "Wait for CI to pass before deploy" enabled on Heroku and have it wait for CircleCI only, not for any GitHub Actions?
Details
We have a Heroku setup linked to a GitHub repository (not using the Heroku git repo), which is set to automatically deploy from the default branch when CI (CircleCI) passes. On the GitHub repo we have a job that runs SonarQube over the repository.
The CI job itself takes ~7 minutes, but the SonarQube job is longer, taking upwards of 15 minutes to run.
Our issue is that Heroku appears to be waiting for the GitHub Actions job to finish before running the automatic deployment, which is not what we want. Analysis is separate from tests passing.
Tried
Originally, the GHA job was set to run automatically on pushes to master. We thought that might be the issue, so we changed the setup so that the GHA job only runs on a repository dispatch, which we then trigger from a CircleCI job on pushes to master. That CircleCI job runs to completion in seconds, leaving the GHA job to run on its own. Heroku still waits.
Ideas
It seems sensible to trigger the deploys from CircleCI jobs after the tests have all passed, but Heroku's APIs seem more focused on deploying by pushing to Heroku git repos than on triggering a build from a GitHub repo (see the sketch after this list).
Move completely to containerised builds rather than building on Heroku (longer term that's the plan, but we'd like to claw back the dead build time before that)
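On the first idea: Heroku's Platform API can create a build directly from a source tarball, so a CircleCI step could in principle trigger the deploy itself once tests pass. A hedged sketch; the app name, repo path, and HEROKU_API_KEY are placeholders, and a private repo would additionally need an authenticated tarball URL:

# create a Heroku build from a GitHub tarball; run as a CircleCI step
# after tests pass ($CIRCLE_SHA1 is provided by CircleCI)
curl -X POST "https://api.heroku.com/apps/my-app/builds" \
  -H "Accept: application/vnd.heroku+json; version=3" \
  -H "Authorization: Bearer $HEROKU_API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"source_blob\": {\"url\": \"https://api.github.com/repos/my-org/my-repo/tarball/master\", \"version\": \"$CIRCLE_SHA1\"}}"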
We have a Subversion repository set up in this manner:
http://svn.vegicorp.net/svn/toast/api/trunk
http://svn.vegicorp.net/svn/toast/api/1.0
http://svn.vegicorp.net/svn/toast/data/trunk
http://svn.vegicorp.net/svn/toast/data/branches/1.2
http://svn.vegicorp.net/svn/toast/data/branches/1.3
I've set up a Jenkins Multibranch Pipeline build for the entire toast project, including all sub-projects (each sub-project produces a jarfile). What I want is for Jenkins to fire off a new build of a sub-project each time any file in it changes. That way, if we create a new sub-project in toast, or a new branch in one of the toast sub-projects, Jenkins will automatically create a new build for it.
Here's my Jenkins Multi-Branch setup:
Branch Sources
Subversion
Project Repository Base: http://svn.vegicorp.net/svn/toast
Credentials: builder/*****
Include Branches: */trunk, */branches/*
Exclude Branches: */private
Property Strategy: All branches get the same properties
Build Configuration
Mode: By Jenkinsfile
Build Triggers (None selected)
Trigger builds remotely (e.g., from scripts)
Build periodically
Build when another project is promoted
Maven Dependency Update Trigger
Periodically if not otherwise run
Note that the list of Build Triggers does not include Poll SCM. Changes in the repository do not trigger any build. Jenkinsfiles are located at the root of each sub-project. If I force a reindex, all changed sub-projects get built and all new branches are found. I originally checked Build periodically and reindexed every minute to pick up changes, but that's klutzy and it seems to cause Jenkins to consume memory.
Triggering a build on an SCM change should be pretty basic, but I don't see a configuration parameter for this like I do with standard jobs. I also can't seem to go into sub-projects and set those to trigger builds either.
There must be something really, really simple that I am missing.
Configuration:
Jenkins 2.19
Pipeline 2.3
Pipeline API: 2.3
Pipeline Groovy: 2.17
Pipeline Job: 2.6
Pipeline REST API Plugin: 2.0
Pipeline Shared Groovy Libraries: 2.3
Pipeline: Stage View Plugin: 1.7
Pipeline: Supporting APIs: 2.2
SCM API Plugin: 1.2
I finally found the answer. I found an entry in the Jenkins Jira database that describes this exact issue: SCM polling is not being performed in multibranch pipeline with Mercurial SCM. Other users chimed in too.
The answer was that Jenkins Multi-branch projects don't need to poll the SCM because indexing the branches does that for you:
Branch projects (the children) do not poll in isolation. Rather, the multibranch project (the parent folder) subsumes that function as part of branch indexing. If there are new heads on existing branches, new branch project builds will be triggered. You need merely check the box Periodically if not otherwise run in the folder configuration.
So, I need to set up reindexing of the branches. I'm not happy with this solution because it seems rather clumsy. I can add post-commit and post-push hooks in SVN and Git to trigger builds when a change takes place, and then reindex on a periodic basis (say, once per hour). The problem is configuring these hooks and keeping them up to date: each project needs its own POST action, which means updating the repository server every time a project changes. With polling, I didn't have to worry about hook maintenance.
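For reference, a sketch of what the SVN side might look like, using the Jenkins Subversion plugin's notifyCommit endpoint (the Jenkins hostname is a placeholder):

#!/bin/sh
# hooks/post-commit on the SVN server: tell Jenkins which paths changed
# so it can poll/reindex the affected jobs
REPOS="$1"
REV="$2"
UUID=$(svnlook uuid "$REPOS")
svnlook changed --revision "$REV" "$REPOS" | \
  curl -s --data-binary @- \
    -H "Content-Type: text/plain;charset=UTF-8" \
    "http://jenkins.local:8080/subversion/${UUID}/notifyCommit?rev=${REV}"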
You never mentioned setting up a webhook for your repository, so this may be the problem (or part of it).
Jenkins by itself can't just know when changes to a repository have been made. The repository needs to be configured to broadcast when changes are made. A webhook defines a URL that the repository can POST various bits of information to. Point it at a URL that Jenkins can read, and Jenkins can respond to the specific types of information it receives.
For example, if you were using GitHub, you could have Jenkins listen on a URL such as https://my-jenkins.com/github-webhook/. GitHub could be configured to send a POST as soon as a PR is opened or a merge is performed. This POST not only signals that the action was performed, but also contains information about it, such as the SHA, the branch name, the user performing the action, etc.
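As a rough illustration (not an official payload), a push delivery to that endpoint looks something like this, trimmed to a few representative fields:

curl -X POST "https://my-jenkins.com/github-webhook/" \
  -H "Content-Type: application/json" \
  -H "X-GitHub-Event: push" \
  -d '{"ref": "refs/heads/master", "after": "<commit sha>", "pusher": {"name": "<user>"}}'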
Both Jenkins and SVN should be capable of defining the URLs they respectively POST to and listen on.
My knowledge lies more specifically with git. But this may be a good place to start for SVN webhooks: http://help.projectlocker.com/knowledge_base/topics/how-do-i-use-subversion-webhooks
Maybe you need something under version control in the base directory. Try putting a test file at http://svn.vegicorp.net/svn/toast/test.txt. That may make the Poll SCM option show up.
My Google-fu is failing me for what seems obvious if I can only find the right manual.
I have a GitLab server that was installed by our hosting provider.
The GitLab server has many projects.
For some of these projects, I want GitLab to automatically push to a remote repository (in this case GitHub) every time a local client pushes to GitLab.
Like this: client --> gitlab --> github
Any tags and branches should also be pushed.
AFAICT I have 3 options:
Configure the local client with two remotes, and push simultaneously to GitLab and GitHub. I want to avoid this because developers.
Add a git post-receive hook in the repository on the GitLab server. This would be the most flexible option (I have sufficient Linux experience to write shell scripts as git hooks) and I have found documentation on how to do this, but I want to avoid it too, because then the hosting provider would need to give me shell access.
Use webhooks in GitLab. I am unfamiliar with even the basics of webhooks, and I am unable to locate understandable documentation or even a simple step-by-step example. This is the documentation from GitLab that I found, and I do not understand it: http://demo.gitlab.com/help/web_hooks/web_hooks
I would appreciate good pointers, and I will summarize and document a solution when I find it.
EDIT
I'm using this Ruby code for a web hook:
require 'sinatra/base'
require 'json'

# minimal receiver: GitLab POSTs a JSON payload here on every push
class PewPewPew < Sinatra::Base
  post '/pew' do
    push = JSON.parse(request.body.read)
    puts "I got some JSON: #{push.inspect}"
  end
end
Next: find out how to tell the gitlab server that it has to push a repository. I am going back to the GitLab API.
EDIT
I think I have an idea. On the server where I run the webhook, I pull from GitLab and then push to GitHub. I can even do some "magic" (running tests, building jars, deploying to Artifactory, ...) before I push to GitHub. In fact, it would be great if Jenkins were able to push to a remote repository after a successful build; then I wouldn't need to write my own webhook, because I'm pretty sure Jenkins already provides a webhook for GitLab, either natively or via a plugin. But I don't know. Yet.
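A rough sketch of that pull-then-push step, with placeholder repository URLs; --mirror carries all branches and tags, which matches the requirement above:

# mirror-clone from GitLab, then mirror-push everything to GitHub
git clone --mirror git@gitlab.example.com:group/project.git
cd project.git
git push --mirror git@github.com:org/project.git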
EDIT
I solved it in Jenkins; see my answer below.
Your first option (two remotes on the client) would work, of course.
Your second option (a post-receive hook on the server) is possible but dangerous, because GitLab Shell automatically symlinks hooks into repositories for you, and those are necessary for permission checks: https://github.com/gitlabhq/gitlab-shell/tree/823aba63e444afa2f45477819770fec3cb5f0159/hooks so I'd rather stay away from it.
Webhooks (your third option) are not suitable directly: they make an HTTP request in a fixed format on certain events, in your case push; they do not make Git protocol requests.
Of course, you could write a server that consumes the hook, clones, and pushes, but a service (single push and no deployment) or GitLab CI (which already implements hook management) would be strictly better solutions.
Services would be the best option if someone implemented one: they live in the source tree, would do a single push, and require no extra deployment overhead.
GitLab CI or other CIs like Jenkins are the best option currently available. They are essentially ready-made servers for the webhooks, and they automatically clone for you: all you have to do then is push from them.
The keywords you want to Google for are "gitlab mirror github". That led me to Gitlab repository mirroring, for instance. There seems to be no perfect, easy solution today.
Also, this has already been proposed at the feature request forum: http://feedback.gitlab.com/forums/176466-general/suggestions/4614663-automatic-push-to-remote-mirror-repo-after-push-to Always check there ;) Go and upvote the request.
The key difficulty now is how to store the push credentials.
I solved it in Jenkins. You can set more than one git remote in a Jenkins job. I used Git Publisher as a Post-Build Action and it worked like a charm, exactly what I wanted.
I added "-publisher" jobs that run after "" is built successfully. I could have done it in one job, but I decided to split it up. The build jobs are triggered by a web hook in GitLab; the publisher jobs are using a #daily schedule from the BuildResultTrigger plugin.
Rather than having TeamCity log onto the gitolite server several tens of thousands of times each day, and rather than sitting around waiting for a poll to happen (or starting one manually), it would be nice if it were possible to set up gitolite hooks that inform TeamCity that the repository has changed.
Is such a configuration possible with TeamCity and gitolite?
I know Jenkins has a GitHub plugin that works nicely; I use that setup for some Minecraft CI I am running privately.
One way would be for gitolite (through a VREF hook) to call TeamCity through its REST API, in order to launch a build via a web request.
You just need to make a web request to the following URL:
http://YOURSERVER/httpAuth/action.html?add2Queue=btId
where btId is the build type ID, the unique identifier of each build configuration.
To get it, you can just look at the browser address bar when clicking on a build configuration, or use the TeamCity REST API for details.
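As a sketch, with placeholder credentials and a placeholder build type ID:

# queue a build via the endpoint above (httpAuth requires basic auth)
curl -u buildtrigger:password \
  "http://teamcity.example.com/httpAuth/action.html?add2Queue=MyProject_Build"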
The OP Morten Nilsen didn't need a VREF:
add a file "post-receive" to .gitolite/hooks/common and
run gitolite setup --hooks-only
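A sketch of what that post-receive file might contain, reusing the same trigger request as above (credentials and build type ID are again placeholders):

#!/bin/sh
# installed into every repository by gitolite setup --hooks-only;
# queues a TeamCity build on every push
curl -s -u buildtrigger:password \
  "http://teamcity.example.com/httpAuth/action.html?add2Queue=MyProject_Build" >/dev/null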
With the Hudson or Jenkins continuous integration servers, when a build is triggered either by an anonymous user or by the CI server polling the repository, a pseudo-user is created from data scraped from the commit information of the last commit.
How do I prevent this, as it's cluttering the list of registered users? I try to default to using post-receive hooks for scheduling builds, but for some repositories (e.g. those hosted by SourceForge) this is not an option, as the machine running the repository is prevented from accessing external URLs.
You can't prevent these from being created, as they are involved in how Jenkins logging and tracking works. However, if you need to see a list of only "real" users, you can do this easily by going to Manage Jenkins / Manage Users; users that lack a login will not appear.