Publish Maven artifacts on FTP with Hudson FTP Publisher Plugin

I'm building a number of artefacts (zip files for different environments: test, dev) with the maven-assembly-plugin, using a specialized Maven profile. I want to copy/collect these artefacts on an FTP server, keeping the version (01.07.10.16.Wed-1626) as a folder, so I need to copy from target/build/01.07.10.16.Wed-1626/ to ftp://my-ftp-server:21/projects/myserver-1.7/01.07.10.16.Wed-1626/
The layout for the Maven output is this:
target/
  build/
    01.07.10.16.Wed-1626/
      my-server-01.07.10.16.Wed-1626-dev.zip
      my-server-01.07.10.16.Wed-1626-test.zip
For copying the artefacts I'm using the FTP Publisher Plugin, but it seems I'm missing something: even though the build is OK and the artefacts are built without problems, the job finishes without copying them, and there is no log info in the console about copying the artefacts.
My FTP publisher config (FTP repository hosts) is:
Hostname: my-ftp-server
Port: 21
Timeout: 10000
Root Repository Path: projects
User Name: my-user
Password: my-pass
My Hudson job FTP publisher config (Publish artifacts to FTP) is:
FTP site: my-ftp-server
Files to upload
Source: target/build/**
Destination: myserver-1.7
Additional issues:
1: Is there any log (how can the FTP uploader log be enabled?) to check whether there are any FTP copy errors?
2: Is there any problem with the file pattern (source) or with the destination?
3: I could also use the maven-antrun-plugin for the upload, but since this post-build task should be used only by Hudson, it should be defined outside of the POM.

Have you looked at Hudson's Artifactory Plugin? http://wiki.hudson-ci.org/display/HUDSON/Artifactory+Plugin

Related

Jenkins Deployment Issue

I'm using Jenkins to deploy a war file into Tomcat. The build is successful but it gives a FileNotFound exception. I'm using Tomcat 7. I found some references on Google but didn't get a solution.
If you want to copy the war from the build server to the Tomcat server, use robocopy to copy it over, e.g.:
robocopy C:\abc \\tomcatserver\<path> abc.war
Most containers have a directory where you can "place" the war, in order to deploy it.
Therefore, in Jenkins you can set up "Send build artifacts over SSH".
To configure the server, go to Configure Jenkins -> Configure System -> SSH Servers and add the server you need to deploy to, the username (+ password), and the Remote directory: /opt/app/tomcat/webapps (or whatever)
More info here - https://wiki.jenkins.io/display/JENKINS/Publish+Over+SSH+Plugin
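In the job itself, you then add a transfer set under "Send build artifacts over SSH" that points the war at that remote directory. A minimal sketch, assuming the war ends up under target/ (adjust the pattern to your build):
Source files: target/*.war
Remove prefix: target
Remote directory: (leave empty, so the file lands in the /opt/app/tomcat/webapps directory configured above)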

How to extract and export artifact files (SNAPSHOT.jar) from Jenkins to a network drive

My team has code being built and tested in Jenkins, and when the build process is done Jenkins produces a SNAPSHOT.jar file. I need to unpack the snapshot.jar file and send the extracted files and folders to a network drive. What is the best way to do that?
I've tried a few Jenkins plugins, the most recent being artifactDeployer, but when the plugins deploy the artifacts as a post-build action, they don't unpack the jar files; I would have to execute a Windows batch command after they are deployed to unpack them, but I can't because the plugin runs as a "post-build action" and batch commands run before the post-build actions. Is there a way to deploy the artifacts and unpack them without using a plugin? Or is there a plugin that will do both? What is the best way to achieve this?
The way I accomplished this was by using 7-Zip in a Windows batch command as a post-step in the Jenkins project configuration.
The command is:
`7z x %WORKSPACE%\target\*.jar -oX:\"mapped network drive location" -y`
This extracts the artifacts out of the snapshot.jar file and places them on the network drive. I needed the files contained in the snapshot.jar to be sent to the network drive when the build completed. I am new to Jenkins, and the plugins I tried were post-build actions that only copied the snapshot.jar to a given location; they did not extract the artifacts out of the jar file. That is why I chose this route.
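If the mapped drive is not already available to the user the Jenkins service runs as, the same post-step can map it first. A rough sketch, where the server, share and drive letter are made up:
rem Map the network share (share path is illustrative)
net use X: \\fileserver\share /persistent:no
rem Extract the contents of the snapshot jar onto the mapped drive
7z x "%WORKSPACE%\target\*.jar" -oX:\extracted -y
rem Drop the mapping again when done
net use X: /delete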

Let Jenkins build a maven project from Perforce

I am trying to set up a Jenkins server to build Maven projects stored in a Perforce depot, but I am failing at the setup of the Perforce plugin.
This is what I have (independent of Jenkins):
A running Perforce server with my Maven project in a depot //components/myjavaproject/main (that's the path where the pom.xml is).
A Perforce user p4javabld with a client spec called p4javabld-cqm1
The client root is set to D:\p4client\p4javabld
When I set up a Jenkins project I set
P4PORT to my perforce server,
Username to p4javabld
Workspace to p4javabld-cqm1
I did not allow Jenkins to create any new Workspace or Workspace View
Client View type to stream with the stream //components/myjavaproject/main
On the config page I get "Unable to check workspace against depot" and "Unable to check stream against depot".
When I run a build it executes
p4 changes -s submitted -m 1 //p4javabld-cqm1/...
which makes no sense, I think. It then seems to scan the complete Perforce depot and does not concentrate on //components/myjavaproject/main.
What is the configuration error?
I was able to let Jenkins build a Maven project from Perforce. I used the P4 Plugin instead of the Perforce Plugin. Then I configured a manual workspace with a workspace view of
//components/myjavaproject/main/... //p4javabld-cqm1/...
with the pom.xml in the root of //components/myjavaproject/main/
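With that view in place, a quick sanity check from the command line (using the user and workspace names from above) is to preview what the client would sync; p4 sync -n only reports what it would do:
p4 -u p4javabld -c p4javabld-cqm1 sync -n //components/myjavaproject/main/...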

Jenkins Deploy scripts

So, I'm writing the build and the deploy scripts. To create the build, I used Ant. The continuous build is done with Jenkins.
The build generates 3 different artifacts:
The war file
A zip with layouts
A zip with images
So far, so good, but now I need to write the deploy script, which should:
Deploy the war (artifact 1) to the tomcat running at server 1
Place the artifact 2 at server 1 in a specific directory
Place the artifact 3 at server 2 in a specific directory
So I was talking with my colleague and he said that we should also generate an artifact (maybe deploy.xml) that deploys these artifacts when placed at the correct server.
So there would be another script, that would:
Download the jenkins artifacts
scp to each server and place the deploy.xml there
remotely invoke the deploy.xml
What makes me a little uncomfortable is having the deploy.xml as a build artifact. The motivation behind this would be to be able to deploy without needing access to the VCS repositories, so a build would be self-contained, i.e., any build could go into production with only what was generated by Jenkins.
Where should the deploy scripts be placed? Should they be only at the VCS or should they be build artifacts too?
Please provide sample deploy scripts, if any.
I wrote my own deployment framework, consisting of different shell, batch, Python, and ... scripts. It neatly separates environment information from application information and allows me to quickly update deployment information and add new apps or environments. However, the orchestration of the different parts is done by Jenkins. When just copying files to a Windows server, my Jenkins master (running on Windows) simply copies the files to a network share that exposes the target directory. Services I can restart remotely using sc.exe. When crossing the borders to AIX, I use Jenkins slaves that are started via ssh on the target system. So distribution is managed by Jenkins; the actual work is done by the scripts.
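As a rough illustration of that Windows-side copy-and-restart step, a Jenkins batch step could look something like this (the server, share and service names are invented for the example):
rem Copy the war and the two zips to the network share that exposes the target directory
robocopy "%WORKSPACE%\dist" \\appserver1\deploy\myapp *.war *.zip
rem Restart the service remotely with sc.exe; the timeout gives it time to stop
sc \\appserver1 stop MyAppService
timeout /t 15 /nobreak
sc \\appserver1 start MyAppService
One thing to watch: robocopy reports success with exit codes up to 7, so a Jenkins batch step may need to reset the error level afterwards to avoid the build being marked as failed.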

How can I copy the artifacts from Teamcity to another server?

The way I have done this makes things a lot easier: set up another configuration that pulls in, via artifact dependencies, all the files you need, then run a cmd script to xcopy/copy the files to another drive on the network. You can do this using a cmd script, VBS, Python, shell, etc.
Remember, you only need to refer to directories as if they were local, since your script runs in the same working directory.
i.e. cmd script: xcopy ".\my build artifact(s)" "\\path\to\drive\on\my\network\my build artifacts"
It doesn't get easier than that.
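For instance, the command-line build step of that dependent configuration could be as small as this (the target path is a placeholder):
rem The artifact dependencies are resolved into the working directory, so relative paths work
xcopy /E /I /Y ".\my build artifacts" "\\fileserver\deploy\my build artifacts"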
Naturally, if your artifacts are huge, you may want to consider a more complicated option. However, TeamCity currently has a ticket pending, which you can vote on, that would allow you to run multiple runners in one configuration, so you could just add your cmd script to the same configuration to save the copy time; please vote if you can spare a minute:
http://youtrack.jetbrains.net/issue/TW-3660
There is a Deployer plugin that supports deployment via fileshare/SMB, FTP, SSH and other means. The usage is basically the same as the artifact paths.
We have used just Samba, so you enter:
Target host path: //server/drive/myfolder
Username: mydomain\myusername (in our case we had to write the domain here too)
Password: ****
Domain: mydomain
and in path just select the files as in artifacts:
product/* => product.zip
and it will create file //server/drive/myfolder/product.zip
You can do it from your build script or externally.
If you are looking to get artifacts copied from a remote build agent to the primary TeamCity server, you may want to look into configuring Build Artifacts under the General Settings.
According to TeamCity's wiki entry on BuildArtifacts (http://confluence.jetbrains.com/display/TCD7/Build+Artifact) "Upon build finish, TeamCity searches for artifacts in the build's checkout directory according to the specified artifact patterns. Matching files are then uploaded ("published") to the TeamCity server, where they become available for download through the web UI or can be used in other builds using artifact dependencies."
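So, as a sketch, the artifact paths field in General Settings could contain entries like the following (paths are illustrative), using the same pattern => archive syntax as above:
target/*.jar
target/distributions/** => dist.zip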
