So, I'm writing the build and deploy scripts. The build is created with Ant, and the continuous build is done with Jenkins.
The build generates three different artifacts:
1. The WAR file
2. A zip with the layouts
3. A zip with the images
So far, so good, but now I need to write the deploy script, which should:
Deploy the WAR (artifact 1) to the Tomcat instance running on server 1
Place artifact 2 in a specific directory on server 1
Place artifact 3 in a specific directory on server 2
So I was talking with my colleague, and he said that we should also generate an artifact (maybe a deploy.xml) that deploys these artifacts when placed on the correct server.
So there would be another script that would:
Download the Jenkins artifacts
scp to each server and place the deploy.xml there
Remotely invoke the deploy.xml
What makes me a little uncomfortable is having the deploy.xml as a build artifact. The motivation behind this is to be able to deploy without needing access to the VCS repositories, so that a build is self-contained, i.e., any build could go into production with only what was generated by Jenkins.
Where should the deploy scripts be placed? Should they live only in the VCS, or should they be build artifacts too?
Please provide sample deploy scripts, if you have any.
I wrote my own deployment framework, consisting of different shell, batch, python, and .... scripts. It neatly separates environment information from application information and allows me to quickly update deployment information and add new apps or environments. However, the orchestration of the different parts is done by Jenkins.
When just copying files to a Windows server, my Jenkins master (running on Windows) simply copies the files to a network share that exposes the target directory. Services I can restart remotely using sc.exe. When crossing the border to AIX, I use Jenkins slaves that are started via ssh on the target system. So distribution is managed by Jenkins; the actual work is done by the scripts.
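For what it's worth, here is a minimal sketch of what such a deploy step could look like as a plain shell script; every host name, path, and artifact name below is a placeholder, not something taken from the question:

    #!/usr/bin/env bash
    # Hypothetical deploy script; assumes key-based ssh access to both servers.
    set -euo pipefail

    WAR=myapp.war          # artifact 1
    LAYOUTS=layouts.zip    # artifact 2
    IMAGES=images.zip      # artifact 3
    SERVER1=deploy@server1.example.com
    SERVER2=deploy@server2.example.com

    # Artifact 1: drop the WAR into Tomcat's webapps directory on server 1.
    scp "$WAR" "$SERVER1:/opt/tomcat/webapps/"

    # Artifact 2: unpack the layouts zip into its target directory on server 1.
    scp "$LAYOUTS" "$SERVER1:/tmp/"
    ssh "$SERVER1" "unzip -o /tmp/$LAYOUTS -d /var/www/layouts && rm /tmp/$LAYOUTS"

    # Artifact 3: unpack the images zip into its target directory on server 2.
    scp "$IMAGES" "$SERVER2:/tmp/"
    ssh "$SERVER2" "unzip -o /tmp/$IMAGES -d /var/www/images && rm /tmp/$IMAGES"

Whether a script like this lives only in the VCS or is also published as a build artifact, the script itself stays the same; publishing it just makes the build self-contained, as described in the question.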
I am working on a continuous integration project to automatically build and deploy an ETL workflow and an Autosys JIL file to a target environment.
We are using Perforce (P4) as the source code repository and Nexus as the artifact repository. Both the ETL and Autosys applications are hosted on a Linux server.
- Developers extract the workflow as XML using Informatica's Repository Manager and check it in to the source repository in Perforce.
- Developers extract the JIL file of the Autosys job and check it in to the source repository in Perforce.
Requirement:
As part of the CI process, when developers check in their code to the source repository, a build should be triggered that creates artifacts from the checked-in code and copies them to the artifact repository.
The deployment process should be triggered automatically when it finds new artifacts, and it should deploy them to the target environment.
I would highly appreciate it if someone could help me understand:
the build and deployment steps
whether a manifest file is required
Regarding the build/deployment steps, it's nothing more than:
open build configuration -> build steps
create a new step, e.g. in the following way:
Runner type: command line
step name: that_one_from_autosys
working directory: %system.autosys.home%
command executable: run_autosys_.bat
Please check this article; I fully support the author.
You should have JIL templates and environment contexts (plus other variables).
Then you need a script that generates the JIL files for each environment using the templates and the environment context.
Upload the generated JIL files to the artifact repository with a suitable version number.
Deploy the JIL files using a script that drives the Autosys CLI commands. For instance, you need to stop running jobs, load boxes before jobs, etc.
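As a rough illustration, here is a sketch of such a script, with made-up template, variable, and box names (envsubst from gettext is assumed for the template substitution, and the environment context is assumed to be a shell-sourceable KEY=value file):

    #!/usr/bin/env bash
    # Hypothetical sketch: render a JIL file for one environment, then load it via the Autosys CLI.
    set -euo pipefail

    ENV_CONTEXT=$1                       # e.g. env/dev.properties with BOX_NAME, MACHINE, OWNER, ...
    TEMPLATE=templates/etl_box.jil.tpl   # JIL template with ${BOX_NAME}, ${MACHINE}, ... placeholders
    OUT=generated/etl_box.jil

    # Export the environment context so envsubst can see the variables.
    set -a; source "$ENV_CONTEXT"; set +a

    mkdir -p "$(dirname "$OUT")"
    envsubst < "$TEMPLATE" > "$OUT"

    # Hold the box so nothing starts while definitions change,
    # load the generated definitions (the box and its jobs), then release it.
    sendevent -E JOB_ON_HOLD  -J "$BOX_NAME"
    jil < "$OUT"
    sendevent -E JOB_OFF_HOLD -J "$BOX_NAME"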
I am in the process of moving configuration parameters out of a Java application. I have found that the best approach is to extend the classpath and use .properties files (leaving ZooKeeper for another requirement).
So my WAR file no longer has any hosts/IPs/URLs or users/passwords.
DevOps distributes the configs manually across the test, stage, and stable installations.
Now it's time for Jenkins to run the tests, but they fail because the required .properties files are not on the classpath.
How can I load these config files into Jenkins, and how do I make them available on the test classpath?
The maven-surefire-plugin allows extending the classpath and passing system properties.
So the only question is how to get a separate directory on the Jenkins host, load files into this directory, and create an alias/placeholder/environment variable per build job to refer to this path in the build config.
This could be done with SSH access, but I think that is the "wrong way". I expect that this can be done via the Jenkins UI (any manager can upload a file in a web browser).
UPDATE: I have no requirement for distributed slave/master builds, but it would be nice to have a solution that migrates the configuration files to slaves automatically...
With that in mind, SSHing to the host or using FTP/SCP is a bad thing.
I read most of the Jenkins docs and asked on the mailing list and IRC. Yeah, the Jenkins community is silent. In the docs I found a link to the Config File Provider Plugin; after that I visited the http://builder.evil.com/jenkins/pluginManager/available page and looked for the config keyword.
There are a lot of related plug-ins of varying usefulness for my purpose (least useful first):
https://wiki.jenkins-ci.org/display/JENKINS/Envfile+Plugin - This plugin enables you to set environment variables via a file.
https://wiki.jenkins-ci.org/display/JENKINS/Credentials+Binding+Plugin - Allows credentials to be bound to environment variables for use from miscellaneous build steps.
https://wiki.jenkins-ci.org/display/JENKINS/Environment+Script+Plugin - Allows you to run a script before each build that generates environment variables for it.
https://wiki.jenkins-ci.org/display/JENKINS/EnvInject+Plugin - This plugin makes it possible to have an isolated environment for your jobs.
https://wiki.jenkins-ci.org/display/JENKINS/Copy+Data+To+Workspace+Plugin - Copies data to workspace directory for each project build.
https://wiki.jenkins-ci.org/display/JENKINS/Copy+To+Slave+Plugin - This plugin allows to copy a set of files, from a location somewhere on the master node, to jobs' workspaces. It also allows to copy files back from the workspaces of jobs located on a slave node to their workspaces on the master one.
https://wiki.jenkins-ci.org/display/JENKINS/Config+File+Provider+Plugin - Adds the ability to provide configuration files (i.e., settings.xml for maven, XML, groovy, custom files, etc.) loaded through the Jenkins UI which will be copied to the job's workspace.
Only the last plug-in, the Config File Provider Plugin, allows editing configs via the Jenkins web interface. And it has a sibling, the Managed Script Plugin, for uploading/managing/editing custom scripts. No question, I now use the Config File Provider Plugin!
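For completeness, this is roughly how it can be wired into a job; the target path and property name below are my own placeholders, and the plugin is assumed to have been configured (via its build-environment option) to copy the managed file into the workspace first. The location is then handed to the forked surefire JVM as a system property, matching the "passing system-properties" option mentioned above:

    # Hypothetical shell build step; assumes the Config File Provider Plugin has
    # already copied the managed file to config/test.properties in the workspace.
    # surefire's argLine forwards the location to the forked test JVM as a system property.
    mvn clean test -DargLine="-Dconfig.location=$WORKSPACE/config/test.properties"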
You should keep the configs required for the tests together with the rest of the source code, so that after compilation your unit tests can run.
After deploying the .war, the DevOps team should overwrite the in-war configs with whatever per-environment configs they have.
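As a sketch of what that overwrite could look like (all paths are placeholders, and the per-environment config directory is assumed to mirror the in-war layout):

    # Hypothetical overwrite step run by DevOps for the "stage" environment.
    # /opt/configs/stage contains WEB-INF/classes/app.properties, mirroring the WAR layout,
    # so 'jar uf' replaces that entry inside the WAR before the container picks it up.
    cd /opt/configs/stage
    jar uf /opt/deploy/myapp.war WEB-INF/classes/app.properties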
For now I have a batch file with commands that update the projects from SVN and call Maven 'clean install'. How do I create a Jenkins job that does the same?
Should I write it as an Ant file (sorry if it's a stupid idea; I've just heard about Ant but don't know what it is exactly or what I can do with it), or is there another way?
Thanks
As arghtype suggested, you need to use Jenkins's own Source Code Management by configuring SVN as the SCM source and supplying credentials as part of the Maven build job.
If you have to use your own local working copy, you are organizing it wrong: you will lose all the benefits of having Jenkins manage SVN changes, and in the end this organization will give you more unsolvable problems in the future. Think about the advice people are giving here and come up with a reason why you need a local workspace outside of Jenkins management on a Jenkins build machine. My only guess is: your Jenkins and development machine are the same. That, again, is not how it should be organized. Jenkins is a CI server, not a personal build "automator".
Regardless, if you still want to do what you describe:
What you think you want
Create a new Freestyle job
Under Build Steps, click Add build step
Select Execute Windows batch command
Write your batch commands in there (a sketch follows below). Your working directory will be Jenkins's $WORKSPACE, so adjust your paths accordingly to where you want to run it.
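For illustration only, this is the kind of command that would go into that step, written here as a shell script (the Windows batch version is essentially the same svn and mvn calls); the project path is a placeholder:

    # Runs in the job's workspace by default; cd elsewhere if your working copy lives outside it.
    cd /path/to/your/working/copy
    svn update
    mvn clean install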
But with the above configuration, you might as well have put the batch file under the Windows scheduler... You are not really using Jenkins with the above.
What you should do instead
Create a new maven2/3 build job
Under Source Code Management, select Subversion
Under Repository URL, enter the remote SVN repo (e.g. http://your.svnserver.com/path/to/project)
Under Build, enter your Root POM location (this is relative to the location of your SVN checkout, so if your POM is under http://your.svnserver.com/path/to/project/maven/pom.xml, then enter maven/pom.xml)
Under Goals and options, enter clean install
Click Save
The Source Code Management section will take care of setting up a local workspace and checkout the repository into that workspace. By default, every time a new build is triggered, it will run svn update on that workspace for you.
The Maven build step will take care of running Maven; however, note that it is configured to use the default ~/.m2/repository location. If your local Maven repo needs to be different, change this under Jenkins's global configuration.
Create a new job.
In Source Code Management, choose Subversion and specify your repo and credentials.
Add a new build step - maven build, specify your maven goals ('clean install').
Jenkins is a CI (continuous integration) server. It can be used to generate scheduled builds of Ant- or Maven-based projects. It can also start building projects on a triggering event, such as a commit to the SCM (Git, SVN, Mercurial, ...) connected to it. You really have to read its documentation to get a better understanding; it has nice tutorials.
I am working on a project where I want to copy the compiled files (built through Jenkins) from one Windows server to another through Jenkins. Jenkins is installed on a Windows server, and after building the code, the compiled files should be copied to another Windows server through Jenkins. Is there any way to achieve this?
Jenkins might be able to do it, via the script steps running the scp command; however, if this is part of a build, I would suggest attaching the file(s) to a project, and distributing them through the maven repository.
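If you go the scripted route, a minimal sketch of such a step could look like the following (host, user, and paths are placeholders; it assumes an scp client such as OpenSSH is available on the Windows build node, with key-based authentication set up):

    # Hypothetical post-build step: copy the compiled artifact to the second server.
    scp target/myapp.jar deployuser@second-server.example.com:/opt/releases/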
How can I copy the artifacts from TeamCity to another server?
Thanks
The way I have done this makes things a lot easier. Set up another configuration that pulls in, via artifact dependencies, all the files you need, then run a cmd script to xcopy/copy the files to another drive on the network. You can do this using a cmd script, VBS, Python, shell, etc.
Remember, you only need to refer to directories as if they were local, as you would have your script in the same working directory,
e.g. cmd script: xcopy .\"my build artifact(s)" \path\to\drive\on\my\network\"my build artifacts"
It doesn't get easier than that.
Naturally, if your artifacts are huge, then you may want to consider a more complicated option. However, TeamCity currently has a ticket pending, which you can vote on, that would allow you to run multiple runners in one configuration, so you could just add your cmd script to the same configuration to save the copy time; please vote if you can spare a minute:
http://youtrack.jetbrains.net/issue/TW-3660
There is a Deployer plugin that supports deployment via fileshare/SMB, FTP, SSH, and other means. The usage is basically the same as for Artifact paths.
We have used just Samba, so you enter:
Target host path: //server/drive/myfolder
Username: mydomain\myusername (in our case we had to include the domain here too)
Password: ****
Domain: mydomain
and in the path field you just select the files as with artifacts:
product/* => product.zip
and it will create the file //server/drive/myfolder/product.zip
You can do it from your build script or externally.
If you are looking to get artifacts copied from a remote build agent to the primary TeamCity server, you may want to look into configuring Build Artifacts under the General Settings.
According to TeamCity's wiki entry on Build Artifacts (http://confluence.jetbrains.com/display/TCD7/Build+Artifact): "Upon build finish, TeamCity searches for artifacts in the build's checkout directory according to the specified artifact patterns. Matching files are then uploaded ("published") to the TeamCity server, where they become available for download through the web UI or can be used in other builds using artifact dependencies."