How can I copy the artifacts from TeamCity to another server?
Thanks
The way I have done this makes things a lot easier: set up another build configuration that pulls in, via artifact dependencies, all the files you need, then run a cmd script to xcopy/copy the files to another drive on the network. You can do this using a cmd script, VBS, Python, shell, etc.
Remember, you only need to refer to directories as if they were local, since your script runs from the same working directory,
e.g. cmd script: xcopy ".\my build artifacts" "\\path\to\drive\on\my\network\my build artifacts"
It doesn't get easier than that.
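A slightly fuller sketch of such a copy step, with placeholder paths (/E copies subdirectories including empty ones, /I treats the destination as a directory, /Y suppresses overwrite prompts):
xcopy ".\my build artifacts" "\\server\share\my build artifacts" /E /I /Y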
Naturally, if your artifacts are huge, you may want to consider a more complicated option. However, TeamCity currently has a ticket pending, which you can vote on, that would allow running multiple runners in one configuration - so you could just add your cmd script to the same configuration and save the copy time. Please vote if you can spare a minute:
http://youtrack.jetbrains.net/issue/TW-3660
There is a Deployer plugin that supports deployment via fileshare/SMB, FTP, SSH, and other means. The usage is basically the same as for artifact paths.
We have used just Samba; in that case you enter:
Target host path: //server/drive/myfolder
Username: mydomain\myusername (in our case we had to include the domain here too)
Password: ****
Domain: mydomain
and in the paths field just select the files as you would in artifact paths:
product/* => product.zip
and it will create the file //server/drive/myfolder/product.zip.
You can do it from your build script or externally.
If you are looking to get artifacts copied from a remote build agent to the primary TeamCity server, you may want to look into configuring Build Artifacts under the General Settings.
According to TeamCity's documentation on Build Artifacts (http://confluence.jetbrains.com/display/TCD7/Build+Artifact): "Upon build finish, TeamCity searches for artifacts in the build's checkout directory according to the specified artifact patterns. Matching files are then uploaded ("published") to the TeamCity server, where they become available for download through the web UI or can be used in other builds using artifact dependencies."
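For example, an artifact paths value like the one below (directory and target names are illustrative) publishes everything under dist into a deploy directory on the server:
dist/** => deploy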
I want to trigger a Jenkins job for a Maven (v3.5.3) project from my local workspace folder instead of configuring an SVN repository URL in the Source Code Management section. Is there a way to achieve this? I need to test code modifications in the project without committing the changes; that is the purpose.
I am using Jenkins (v2.161) and it is installed on another machine.
Thanks in advance.
Although it might look like a sort of tinkering, the source code can be pulled to the Jenkins host from your local machine, provided that the two machines are properly configured to communicate via ssh.
In the project build configuration on Jenkins' host:
Do not use "Source Code Management" (choose "None").
Check "Delete workspace before build starts", to avoid conflicts with previous changes.
As the very first build step, add "Execute shell" and write a few commands that pull the data, for example:
scp -r myusername@myhost:/path/to/myworkspace/myproject/src .
scp myusername@myhost:/path/to/myworkspace/myproject/pom.xml .
# etc. for all the files/dirs you need to build the project
Then continue with the build steps that were already used for building the project from SCM.
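For these scp steps to run non-interactively, the user Jenkins runs as needs key-based ssh access to your machine. A minimal one-time setup sketch, assuming OpenSSH on both ends (run on the Jenkins host as the Jenkins user; the user/host placeholders are the same as above):
ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519
ssh-copy-id -i ~/.ssh/id_ed25519.pub myusername@myhost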
There is a scenario in which I have to define a build step under a TeamCity deploy job configuration, wherein a compressed file needs to be copied from a shared drive to uDeploy. But first I am trying to copy to a local drive (Windows OS) for testing.
I am using the command below to perform the operation in PowerShell, and it works fine as expected locally.
Start-BitsTransfer -Source \\Shared_Drive_Path\File_Name.zip -Destination Destination_Path
Now I need to define this command under the TeamCity deploy job configuration for execution.
What configuration will I require to achieve this? Can anybody suggest something?
Thanks in advance!
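One way to wire this up, assuming the bundled PowerShell build runner (exact field names can vary between TeamCity versions): add a build step to the deploy configuration along the lines of
Runner type: PowerShell
Script: Source code
Script source: Start-BitsTransfer -Source \\Shared_Drive_Path\File_Name.zip -Destination Destination_Path
The step runs on the build agent, so the account the agent runs under needs read access to the share and write access to the destination. Also note that BITS can be restricted for non-interactive service accounts; if the transfer fails under the agent, plain Copy-Item over the same paths is a simpler fallback.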
I have a Jenkins job that uses a script to build my project. The script fails on the following line: mvn -e -X -Dgit='$git' release:prepare.
To find the cause of this, I want to go to the Jenkins server and run mvn -e -X -Dgit='$git' release:prepare from the command line, to see if it works.
Does Jenkins store the projects' source code somewhere, such that I can go to that folder and call Maven?
If yes, then where?
Yes, it stores the project files for the job by default at
/var/lib/jenkins/workspace/{your-job-name}
This is where Jenkins expects the project files to be; it checks them out from your source control into this directory before it starts working/building from them.
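For the failing release:prepare from the question, that means you can reproduce it on the server with something like this (the path assumes a default .deb install; substitute the real value your script passes for $git):
cd /var/lib/jenkins/workspace/your-job-name
mvn -e -X -Dgit='$git' release:prepare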
Quote from Andrew M.:
"Hudson/Jenkins doesn't quite work that way. It stores configurations and job information in /var/lib/jenkins by default (if you're using the .deb package). If you want to setup persistence for a specific application, that's something you'll want to handle yourself - Hudson is a continuous integration server, not a test framework.
Check out the Wiki article on Continuous Integration for an overview of what to expect."
From this Question on serverfault.
This worked for me:
/var/jenkins/workspace/JobNameExample
but if your build machine (node) is different from the one where Jenkins is running (the manager), you need to specify it:
/var/jenkins/workspace/JobNameExample/label/NodeName
where "NodeName" is the label you assigned to that node in its configuration.
Jenkins stores its workspace files currently in /var/jenkins_home/workspace/project_name.
I am running it from Docker, though!
For now I have a batch file with commands for updating projects using SVN and calling Maven 'clean install'. How do I create a job in Jenkins for similar actions?
Should I write it in an Ant file (sorry if that's a stupid idea - I've just heard about Ant but don't know what exactly it is or what I can do with it), or is there another way?
Thanks
Like arghtype suggested, you need to be using Jenkins's own Source Code Management by configuring SVN as the SCM source and supplying credentials as part of the Maven build job.
If you have to use your own local working copy, you are organizing it wrong; you will lose all the benefits of having Jenkins manage SVN changes, and in the end this organization will give you more unsolvable problems in the future. Think about the advice people are giving here and come up with a reason why you need a local workspace outside of Jenkins's management on a Jenkins build machine. My only guess is: your Jenkins and development machines are the same. That again is not how it should be organized. Jenkins is a CI server, not a personal build "automator".
Regardless, if you still want to do what you described:
What you think you want
Create a new Freestyle job
Under Build Steps, click Add build step
Select Execute Windows batch command
Write your batch command in there. Your working directory will be Jenkins's $WORKSPACE, so change your paths accordingly to where you want to run it (see the example below).
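For example, for the SVN-update-then-build flow from the question, the batch command body could be as simple as this (the working-copy path is a placeholder):
cd C:\path\to\your\working\copy
svn update
mvn clean install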
But with the above configuration, you might as well have put the batch file under the Windows scheduler... you are not really using Jenkins with the above.
What you should do instead
Create a new maven2/3 build job
Under Source Code Management, select Subversion
Under Repository URL enter the remote SVN repo (i.e. http://your.svnsever.com/path/to/project)
Under Build, enter your Root POM location (this is relative to the location of your SVN checkout, so if your POM is at http://your.svnserver.com/path/to/project/maven/pom.xml, enter maven/pom.xml).
Under Goals and options, enter clean install
Click Save
The Source Code Management section will take care of setting up a local workspace and checkout the repository into that workspace. By default, every time a new build is triggered, it will run svn update on that workspace for you.
The Maven build step will take care of running Maven; however, note that it is configured to use the default ~/.m2/repository location. If your local Maven repo needs to be different, change this under Jenkins's global configuration.
Create a new job.
In Source Code Management, choose Subversion and specify your repo and credentials.
Add a new build step - Maven build - and specify your Maven goals ('clean install').
Jenkins is a CI (continuous integration) server. It can be used to generate scheduled builds of Ant- or Maven-based projects. It can also start building projects on a triggering event, such as a commit to an SCM (Git, SVN, Mercurial, ...) connected to it. You really should read its documentation to get a better understanding; it has nice tutorials.
So, I'm writing the build and deploy scripts. To create the build, I used Ant. The continuous build is done with Jenkins.
The build generates 3 different artifacts:
The war file
A zip with layouts
A zip with images
So far, so good, but now I need to write the deploy script, which should:
Deploy the war (artifact 1) to the tomcat running at server 1
Place the artifact 2 at server 1 in a specific directory
Place the artifact 3 at server 2 in a specific directory
So I was talking with my colleague, and he said that we should also generate an artifact (maybe a deploy.xml) that deploys these artifacts when placed on the correct server.
So there would be another script that would (see the sketch after this list):
Download the Jenkins artifacts
scp to each server and place the deploy.xml there
Remotely invoke the deploy.xml
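A minimal shell sketch of that outer script; all hostnames, paths, artifact names, and Ant target names are placeholders:
#!/bin/sh
set -e
JOB_URL="http://jenkins.example.com/job/myproject/lastSuccessfulBuild/artifact"
# 1. Download the Jenkins artifacts (-f fails on HTTP errors, -O keeps the remote name)
curl -fO "$JOB_URL/myapp.war"
curl -fO "$JOB_URL/layouts.zip"
curl -fO "$JOB_URL/images.zip"
curl -fO "$JOB_URL/deploy.xml"
# 2. Place each artifact plus the deploy script on its server
scp myapp.war layouts.zip deploy.xml deployer@server1:/opt/deploy/
scp images.zip deploy.xml deployer@server2:/opt/deploy/
# 3. Remotely invoke the deploy script
ssh deployer@server1 'cd /opt/deploy && ant -f deploy.xml deploy-war deploy-layouts'
ssh deployer@server2 'cd /opt/deploy && ant -f deploy.xml deploy-images'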
What makes me a little uncomfortable is having the deploy.xml as a build artifact. The motivation behind this is to be able to deploy without needing access to the VCS repositories, so that a build is self-contained, i.e., any build could go into production with only what Jenkins generated.
Where should the deploy scripts be placed? Should they be only at the VCS or should they be build artifacts too?
Please share any sample deploy scripts, if you have them.
I wrote my own deployment framework, consisting of various shell, batch, Python, and other scripts. It neatly separates environment information from application information and allows me to quickly update deployment information and add new apps or environments. However, the orchestration of the different parts is done by Jenkins.
When just copying files to a Windows server, my Jenkins master (running on Windows) simply copies the files to a network share that exposes the target directory. Services I can restart remotely using sc.exe. When crossing the border to AIX, I use Jenkins slaves that are started via ssh on the target system. So distribution is managed by Jenkins; the actual work is done by the scripts.
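For example, the remote service restart can look like this (server and service names are placeholders; the calling account needs service-control rights on the target):
sc.exe \\targetserver stop MyAppService
sc.exe \\targetserver start MyAppService
Note that sc stop only requests the stop, so a robust script would poll sc.exe \\targetserver query MyAppService until it reports STOPPED before starting the service again.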