Maven artifacts are deployed to the wrong location when invoking the Gradle install task on TeamCity CI - maven

I'm trying to set up a simple continuous integration system on my local PC. I use Gradle as my build system (via the Gradle wrapper). One of the steps in the build process is to deploy build artifacts to a local repository (located at "{user_dir}/.m2/repository"). It works fine when I run it from my local PC, but when it runs on TeamCity CI (version 9) it deploys them to "{windows_dir}\System32\config\systemprofile\.m2\repository". This is probably some configuration issue, but I couldn't manage to solve it. In the build logs I saw that it couldn't find the local repository in the settings.xml file. I tried to add it, but that didn't help. How can I configure TeamCity to use the local repository folder in the user directory?

I found out what the issue was. If you install the TeamCity system services to run under the system/admin account, it will always use the Windows system profile directory. In order to use the user's directory, you need to install the services under that user account.
Source: https://confluence.jetbrains.com/display/TCD9/Maven+Server-Side+Settings.
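To see which profile the build is actually using, a quick diagnostic build step helps; a minimal sketch, assuming maven-help-plugin 3.1.0 or newer for the -DforceStdout flag:

mvn help:evaluate -Dexpression=settings.localRepository -q -DforceStdout

Run as a service under the Local System account this typically prints {windows_dir}\System32\config\systemprofile\.m2\repository; run under a normal user account it prints {user_dir}/.m2/repository.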

Related

No agent found in pool Hosted which satisfies the specified demands: svn maven

Using VSTS, and following this step-by-step guide, I still get this error.
I am using the hosted pool and have verified it has Maven.
What could be missing?
Thanks in advance!
It's actually the svn capability that is missing on the hosted agent. You might have to add a user capability, and add a task that runs a non-installed (portable) version of SVN and sets its path on the current machine.
The other solution would be to set up your own build machine with all the tools you need.
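One way to confirm what the hosted agent actually provides is a throwaway command-line step; a minimal sketch (these are plain commands, nothing VSTS-specific is assumed):

svn --version
mvn -version

If the svn command fails, that confirms the missing capability rather than a Maven problem.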

Using TeamCity to build a local project in order to test settings

I want to experiment with my project output on TeamCity. I don't want to use Git or any other version control system.
Is there a way to turn off the VCS in TeamCity and make it build and execute local projects?
Just install TeamCity on your machine; it's free for up to 20 build configurations. VCS roots are completely optional: you can build solutions from a local path. VCS roots are just a means of getting your code locally from a repository so you can build and interact with it.

Create a job in Jenkins that calls SVN and Maven

For now I have a batch file with commands that update projects using SVN and call Maven 'clean install'. How can I create a Jenkins job for similar actions?
Should I write it as an Ant file (sorry if it's a stupid idea; I've just heard about Ant but I don't know what exactly it is or what I can do with it), or is there another way?
Thanks
Like arghtype suggested, you need to be using Jenkins' own Source Code Management by configuring SVN as the SCM source and supplying credentials as part of a Maven build job.
If you have to use your own local working copy, you are organizing it wrong: you will lose all the benefits of having Jenkins manage SVN changes, and in the end this organization will give you more unsolvable problems in the future. Think about the advice people are giving here and come up with a reason why you need a local workspace outside of Jenkins' management on a Jenkins build machine. My only guess is: your Jenkins and development machines are the same. That again is not how it should be organized. Jenkins is a CI server, not a personal build "automator".
Regardless, if you still want to do what you describe:
What you think you want
Create a new Freestyle job
Under Build Steps, click Add build step
Select Execute Windows batch command
Write your batch commands in there. Your working directory will be Jenkins's $WORKSPACE, so adjust your paths according to where you want to run it.
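For illustration, such a step might look like the following (the project path is hypothetical; REM lines are comments):

REM Update the working copy outside the Jenkins workspace
svn update C:\work\myproject
REM Switch to it (/d also changes the drive) and run the Maven build
cd /d C:\work\myproject
mvn clean install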
But with the above configuration, you might as well have put the batch file under the Windows scheduler... You are not really using Jenkins this way.
What you should do instead
Create a new maven2/3 build job
Under Source Code Management, select Subversion
Under Repository URL enter the remote SVN repo (e.g. http://your.svnserver.com/path/to/project)
Under Build, enter your Root POM location (this will be relative to the location of your SVN checkout, so if your POM is under http://your.svnserver.com/path/to/project/maven/pom.xml, enter maven/pom.xml).
Under Goals and options, enter clean install
Click Save
The Source Code Management section will take care of setting up a local workspace and checking out the repository into that workspace. By default, every time a new build is triggered, it will run svn update on that workspace for you.
The Maven Build step will take care of running Maven; however, note that it is configured to use the default ~/.m2/repository location. If your local Maven repo needs to be different, change this under Jenkins' global configuration.
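If changing the global configuration is not an option, the same thing can be done per build by appending a standard Maven property to the goals field (this is stock Maven behavior, not Jenkins-specific; the path is hypothetical):

clean install -Dmaven.repo.local=C:\jenkins\custom-m2-repo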
Create a new job.
In Source Code Management choose Subversion, and specify your repo and credentials.
Add a new build step - Maven build - and specify your Maven goals ('clean install').
Jenkins is a CI (continuous integration) server. It can be used to generate scheduled builds of Ant- or Maven-based projects. It can also start building projects on some triggering event, such as a commit to an SCM (Git, SVN, Mercurial, ...) connected to it. You really have to read its documentation to get a better understanding. It has nice tutorials.

cloud-based build for debian packages with network access?

I have some Debian packages which need network access at build time:
- one builds with Maven and needs to access the repositories
- the other tries to bind to 127.0.0.1 as part of some unit tests
I would use Launchpad for these, but the Launchpad buildd does not support either of these kinds of network operations.
I also build the packages with Travis, so I could upload only the binary packages to Launchpad, but that is unsupported as well.
I am looking for either a cloud-based Debian package builder with network access,
or a cloud-based Debian package repository where I can upload my binary and source packages.
Is there any?
I think you can use Jenkins as a continuous integration tool. CloudBees offers Jenkins as a Service, where you can just test the environment to see whether it meets your needs.
Since their slaves run on Fedora Linux machines, you can easily generate the .deb files and after that use a Debian repository as a service. Bintray, for example, lets you upload your .deb packages to the cloud. Bintray is part of JFrog, so you can easily enable the JFrog service through this PaaS.
You can upload your .deb package from the command line using this command:
curl -T <FILE.deb> -uXXXXXXXXX:<API_KEY> https://api.bintray.com/content/XXXXXXXXX/deb/<PACKAGE>/<VERSION>/<FILE.deb>
So my idea is that you could use the Jenkins instance to create your .deb package (build + tests), and then upload the .deb package to Bintray from the command line in a post-build step of your Jenkins job.
Once you have your .deb package on Bintray, you can easily access the repository to get the .deb for your builds/tests...
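For Debian packages in particular, Bintray's content API also expected the target distribution, component, and architecture as matrix parameters appended to the upload URL; a hedged sketch with placeholder values:

curl -T mypackage.deb -u<USERNAME>:<API_KEY> "https://api.bintray.com/content/<USERNAME>/deb/<PACKAGE>/<VERSION>/mypackage.deb;deb_distribution=trusty;deb_component=main;deb_architecture=amd64"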
The solution was the following:
For the one which needs a network socket for testing:
- I build with Travis and do the testing there, and I post the source packages to Launchpad from that build. Testing is turned off in debian/rules. This way the package builds for more Ubuntu revisions simultaneously.
For the Maven one, maybe Bintray would be the right answer. For now I build with drone.io and post to the SourceForge FRS, but there is no apt repo there.
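For reference, posting source packages to Launchpad from a build, as in the first case above, is usually done with the standard Debian tooling; a minimal sketch (the PPA name and package are placeholders):

debuild -S -sa
dput ppa:<user>/<ppa-name> ../<package>_<version>_source.changes

debuild -S builds a source-only package, and dput uploads the resulting .changes file to the PPA, where Launchpad builds it for each targeted Ubuntu revision.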

Jenkins Deploy scripts

So, I'm writing the build and deploy scripts. The build is created with Ant, and continuous builds are done with Jenkins.
The build generates 3 different artifacts:
The war file
A zip with layouts
A zip with images
So far, so good, but now I need to write the deploy script, which should:
Deploy the war (artifact 1) to the Tomcat running on server 1
Place artifact 2 on server 1 in a specific directory
Place artifact 3 on server 2 in a specific directory
So I was talking with my colleague, and he said that we should also generate an artifact (maybe a deploy.xml) that deploys these artifacts when placed on the correct server.
So there would be another script that would:
Download the Jenkins artifacts
scp to each server and place the deploy.xml there
remotely invoke the deploy.xml
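A minimal sketch of that wrapper script, assuming ssh access to the servers and an Ant-runnable deploy.xml; the job URL, host, and paths are hypothetical:

# Fetch the artifacts of the last successful build from Jenkins
wget https://jenkins.example.com/job/myjob/lastSuccessfulBuild/artifact/deploy.xml
wget https://jenkins.example.com/job/myjob/lastSuccessfulBuild/artifact/myapp.war
# Ship the deploy descriptor plus artifact and invoke it remotely
scp deploy.xml myapp.war user@server1:/opt/deploy/
ssh user@server1 "ant -f /opt/deploy/deploy.xml"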
What makes me a little uncomfortable is having the deploy.xml as a build artifact. The motivation behind this is to be able to deploy without needing access to the VCS repositories, so that a build is self-contained, i.e., any build could go into production using only what was generated by Jenkins.
Where should the deploy scripts be placed? Should they be only at the VCS or should they be build artifacts too?
Please provide sample deploy scripts, if you have any.
I wrote my own deployment framework, consisting of various shell, batch, Python, and other scripts. It neatly separates environment information from application information and allows me to quickly update deployment information and add new apps or environments. However, the orchestration of the different parts is done by Jenkins. When just copying files to a Windows server, my Jenkins master (running on Windows) simply copies the files to a network share that exposes the target directory. Services I can restart remotely using sc.exe. When crossing the border to AIX, I use Jenkins slaves that are started via ssh on the target system. So distribution is managed by Jenkins; the actual work is done by the scripts.
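As an illustration of the Windows part, stopping a service remotely, refreshing the binaries over the share, and starting it again needs only built-in tooling; the server, share, and service names here are hypothetical:

REM Bounce the service around a file copy to the exposed share
sc.exe \\appserver1 stop MyAppService
xcopy /Y /E build\output\* \\appserver1\deploy$\myapp\
sc.exe \\appserver1 start MyAppService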
