How to prevent SonarQube with SCM integration (JGit) enabled from trying to mess with the global JGit config

We run automated builds for several projects on a shared pool of build machines. For security reasons, the build agents cannot write to the config files in the Git home directory (to prevent people from messing with the global Git configs).
We recently enabled SonarQube's SCM integration to get blame information (which is more or less required for meaningful PR decorations). It works like a charm, but I noticed that SonarQube (or JGit, to be precise) tries to write to the global JGit config ($HOME/.config/jgit/config).
Since the process is not allowed to write to $HOME, an exception is thrown. Everything still works as expected, but the exception is logged and clutters the build output.
Is there any way to either
Use a custom 'home' for JGit (and essentially write the changes to some file we trash anyway right after the build), or
Tell SonarQube to stop trying to mess with that configuration?
Any help and tips are much appreciated.
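For the first option, here is roughly what I had in mind (just a sketch, and it assumes JGit resolves its user config from $XDG_CONFIG_HOME/jgit/config before falling back to $HOME/.config/jgit/config; the scanner invocation stands in for whatever the build step actually runs):

# Point JGit's config lookup at a throwaway directory for the duration of the analysis.
export XDG_CONFIG_HOME="$(mktemp -d)"
sonar-scanner
rm -rf "$XDG_CONFIG_HOME"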

Related

Is it possible to run a TeamCity build with a VCS Root that is unavailable?

My team has been using TeamCity to automate some tiresome maintenance tasks, and over time we've found we want to re-use common pieces, so we've got some common functions in a repository on Bitbucket.
For better or for worse, our Bitbucket has a daily backup/maintenance period that, when active, blocks all of our builds from running with the following error:
Failed to collect changes ... Bitbucket is currently unavailable
I've looked at the various checkout modes, though we're generally limited to checking out files on the server (rather than the agent). I had figured that if the files are checked out to the server, then if Bitbucket were unavailable there would be some way to fall back on whatever is already there, especially as we don't have Clean build checked.
Is there some way that we can fall back on whatever is already checked out on the TeamCity server? Or do we need to set up some kind of redundancy?
Yes, you can.
Simply build the project into a directory on the TeamCity machine beforehand, and then run it from that location instead.
You can have a build configuration with no VCS root at all. Such a build will use the folder you mentioned instead of cloning from a VCS.

Getting Maven to Report Which File Has Local Modifications

I have a Jenkins job that builds a simple Maven project. If all I do is build, it works just fine. The problem arises when I try to do a release, whether a dry run or a regular one. It consistently fails with the "Cannot prepare the release because you have local modifications" error. I have wiped out the workspace, but the problem persists. Is there any way I can get Maven to tell me which file it thinks has been modified? I would assume that by wiping out the local workspace and immediately running the dry run release, there wouldn't be any opportunity for anything to get modified.
Please note, I do not have access to the Jenkins server or the slave that is running the actual release build, so I can't use any tools there (like SVN) to determine what is supposedly modified.
You can use the Maven SCM plugin to do a diff.
https://maven.apache.org/scm/maven-scm-plugin/diff-mojo.html
Basically, integrate the Maven plugin upstream of the failure and see if anything has been changed. I imagine you might be able to see the output in the log, but if you cannot, you might be able to move your "real" Maven pom.xml aside and replace it with one that generates a diff file and, with the help of the Maven Build Helper plugin, attaches that file as an additional artifact (to a pom target).
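For example, a couple of extra invocations in the job right before the release step could surface the culprit (a sketch, assuming you can add Maven goals to the job; scm:status and scm:diff are goals of the Maven SCM plugin linked above):

mvn scm:status                      # list the files the SCM provider considers locally modified
mvn scm:diff                        # write the actual differences to a .diff file in the project directory
mvn release:prepare -DdryRun=true   # then re-run the dry run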
It turned out the solution to my problem was to not use the "Local to the workspace" strategy for my private Maven repository in the Jenkins job configuration. By changing that to the "Local to the executor" strategy, the problem went away. I'm still not sure why it was having the problem in the workspace, but this solution resolved it for me, and it might work for others.

Can I upload TeamCity definitions as XML?

TeamCity appears to store the definitions for builds, projects, templates etc as XML internally.
This is exposed in the "Administration > Audit" view where you can see diffs that people made to individual configurations, at URLs like http://teamcityserver/admin/settingsDiffView.html?id=project:project10&versionBefore=8&versionAfter=9&actionId=3151
I'd like to manage a TeamCity setup partially from outside the web interface - for example, keep the build definitions in version control and perhaps generate them programmatically.
Is there any way I can directly upload definitions in this format (or any similar alternative)? I'm aware that there are various APIs and extension points to TeamCity but haven't managed to find any that gives direct access to anything like this.
I can live with the format changing with TeamCity versions if necessary - it would be a reasonable price to pay for the other benefits.
For TeamCity 9.x and newer
As reported by Ganesh in the comments to this answer, an option was added in 9.x that supports changes and versioning through Source Code Management (SCM) tools. Please see his answer for 9.x and beyond.
For TeamCity 8.x and older
It might not be the "approved" way, but you can edit the project files on disk, and those changes will appear in your build configs. I have successfully edited them outside of the Web UI after they were created.
So, you could probably open that folder up as a restricted network share, or set up SSH access.
You'll find it at $TeamCityData/config/projects/ and then they are stored in subfolders such as $projectName/buildTypes/$buildFile.xml
An example is:
E:\TeamCityData\config\projects\CSandbox\buildTypes\CSandbox_Project1TrunkBuildUnitTest.xml
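If the goal is to keep those definitions in version control and generate them programmatically, a crude way on 8.x is simply to copy the generated XML into that directory over the share or SSH access mentioned above (a rough sketch; the local repository layout and mount point are made up for illustration):

# The definitions live in a repository checked out locally; the server's data
# directory is reachable at the (made-up) mount point /mnt/teamcity-data.
git pull
cp projects/CSandbox/buildTypes/*.xml /mnt/teamcity-data/config/projects/CSandbox/buildTypes/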
TeamCity 9 adds a new "Versioned Settings" feature which keeps these XML files under version control and allows changes to be made via the VCS.
In TeamCity 9.0 this can be git or mercurial, and the upcoming TeamCity 9.1 will add support for Perforce and Subversion.
I've been using it with git for a few months and it works quite nicely in practice.
I sometimes have trouble persuading TeamCity to notice changes coming in from the VCS - particularly when deleting projects - but otherwise it's been really useful for standardising configuration and spinning up new job chains quickly.
Another slight annoyance is that you can't configure the location within the repository that the settings come from - it's always .teamcity in the root - so I've had to use multiple branches or repositories to manage multiple TeamCity servers.

SVN Post-Commit to Update Working Copy when Working Copy is on a Network Drive

I work for a fairly new web development company and we are currently testing subversion installations to implement a versioning system. One of the features we need the versioning system to perform is to update the development server with an edited file once it has been committed.
We would like to maintain one server for all of our SVN repositories, even though, due to system requirements, we need to maintain several separate development servers. I understand that the updates are fairly simple when the development server resides in the same location as SVN, but that is just not possible for us. So, we need to map separate network drives to the SVN server for each development server.
However, this errors on commit. Here is my working copy test directory, as referenced in the post-commit.bat file:
SET WORKING_COPY=Z:\testweb
This, however, results in an error...
post-commit hook failed (exit code 1) with output: svn: Error resolving case of 'Z:\testweb'
I'm sure this is because the server is not the same user as me and therefore does not have the share I need mapped to "Z" - I just have no idea how to work around this. Can anyone help?
UPDATE: The more I look into these issues, the more it appears that the real solution is to use a CI server to accomplish what I am attempting. I am currently looking into TeamCity and what it might do for us.
Don't do this through a post-commit hook. If you ever manage to get the hook to succeed, you'll be causing the person who did the commit to wait until the update is complete. Instead, I recommend that you use Jenkins which is a continuous build engine.
It is possible that you don't have anything to build. After all, if you're using PHP or JavaScript, there's nothing to compile. However, you can still use Jenkins to do the update for you.
I can't get into the nitty-gritty detail here, but one of the things you can do with Jenkins is redefine its working directory. You can do this by clicking the Advanced button when you define a job; it will ask you where you want the working directory. In this case, you can specify your server's working directory.
Another thing you can do with Jenkins is have it automatically run tests, or do a somewhat smoother update. For example, you might have to restart your web server when you change a few files, or you might need to make sure that if you're changing 100 files, they all get changed at once, or else your server isn't in a stable state. You could use Jenkins for this too. And, if there are any problems, you can have Jenkins email the person who is responsible for the server that the update failed.
Jenkins is easy to set up and use. You can download it and have it running in 10 minutes. Setting up a job might take you another 15 minutes, even if you've never seen Jenkins before and have no idea how it works.
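If you still want the update to start straight from the commit rather than from polling, the hook itself can shrink to a single HTTP request that pokes Jenkins (a sketch using the "Trigger builds remotely" token option; the job name, host, and token are placeholders):

# post-commit: don't touch the working copy here, just ask Jenkins to run the
# job that performs the checkout/update on the development server.
curl -s "http://jenkins.example.com/job/update-devserver/build?token=SECRET"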

Version Control for Hudson Continuous Integration Build Jobs

We have a continuous integration server with over 40 jobs that are constantly changing. I would like to version control continuous integration build jobs in Hudson so we can roll back changes if we have problems.
Is there a Hudson plugin that will do this, or some other existing solution, or should I keep the config.xml files in SVN?
Hudson Labs has a really great write up on this, Keeping your configuration and data in Subversion
This is the first bit of the article:
We all know that keeping important files in version control is critical, as it ensures problematic changes can be reverted and can serve as a backup mechanism as well. Code and resources are often kept in version control, but it can be easy to forget your continuous integration (CI) server itself! If a disk were to die or fall victim to a misplaced rm -rf, you could lose all the history and configuration associated with the jobs your CI server manages.
It’s pretty simple to create a repository, but it isn’t obvious which parts of your $HUDSON_HOME you’ll want to backup. You’ll also want to have some automation so new projects get added to the repository, and deleted ones get removed. Luckily we have a great tool to handle this: Hudson! We have a Hudson job which runs nightly, performs the appropriate SVN commands, and checks in
You only seem to be interested in the configuration, which is fine; just ignore or filter out the bits about the data and focus on the configuration.
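As a rough sketch, the shell step of such a nightly job, restricted to configuration only, might boil down to something like this (assuming $HUDSON_HOME has already been made an SVN working copy; the filtering and commit message are just illustrative):

cd "$HUDSON_HOME"
# pick up configs of newly created jobs, skipping anything already versioned
svn add --force --parents config.xml jobs/*/config.xml
# drop configs of jobs that were deleted since the last run
svn status | awk '/^!/ {print $2}' | xargs -r svn delete
svn commit -m "Nightly snapshot of Hudson configuration"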
This is one of the more recent threads about using version control with Hudson's configuration on the Hudson users list.
There are no plugins that store the configuration in an SCM right now (March 2010), though the backup plugin might do something close to what you want, albeit with less of a view of 'change' and more of just a snapshot at any given time.
The relatively new Job Config History plugin gets part of the way there - it doesn't actually store the configurations in source control, but it does provide history and auditing of changes to jobs.
You could look into the SCM Sync configuration plugin.
It automatically commits all of your Jenkins config changes to SVN. That way you can track configuration errors easily.
https://wiki.jenkins-ci.org/display/JENKINS/SCM+Sync+configuration+plugin
