Share Timestamp within Jenkins - maven

I have a Jenkins job where I execute a command from within a Maven plugin, which runs an Ant build script. The job makes two Ant calls, one for each of two mirror servers. Something like this:
/usr/bin/ant -v -d -f /utils_repo/build.xml ${target} -propertyfile /tmp/myjob/install.properties
Maven connects to each server and executes something similar.
My question is: how can I share the timestamp of when the Jenkins job started between the two Ant calls? My Ant job has a backup step that runs before rolling in new code, and I need logic so that if the dump/backup was already done on the first host, it is not repeated on the second, since the two hosts share a MySQL instance and core files on an NFS mount. Right now there is no such logic, and when the second Ant call runs the dump on the second server, it overwrites the dump from the first instance with the new data and the updated MySQL.
So I was thinking of adding a touch task to touch some file, since I have a shared directory between the two servers. But I use the same build.xml for both server instances, so the touch would also execute on the second Ant call and overwrite the modification time set by the first.
I thought of sharing the Jenkins timestamp property of when the job starts between the two Ant runs, but I do not know if this is possible.
Thanks in advance for any advice.

I suggest you use the BUILD_NUMBER environment variable set by Jenkins and store it on your NFS mount, as a property file for instance.
If that property file doesn't exist, or if the value of the property loaded from it doesn't match the environment variable set by Jenkins, it means the current node is the first to run for that Jenkins build, so it can do the backup. That node then overwrites the shared property file with the current build number.
If the loaded property matches the current build number, the first run has already happened and there is no backup to do.
Implementation hints:
add the build number to the command line:
/usr/bin/ant -v -d -f /utils_repo/build.xml ${target} -propertyfile /tmp/myjob/install.properties -Dexpected.jenkins.build.number=$BUILD_NUMBER
use the Ant-contrib if/then/else tasks: http://ant-contrib.sourceforge.net/tasks/tasks/index.html
write the property file with:
<echo file="/mnt/nfs/shared/jenkins.properties">jenkins.build.number=${expected.jenkins.build.number}</echo>
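Putting those hints together, the guard in build.xml might look something like this (a sketch only: the backup target name is a placeholder, and it assumes the Ant-contrib jar is on Ant's classpath):
<taskdef resource="net/sf/antcontrib/antcontrib.properties"/>
<target name="guarded-backup">
    <!-- load the build number left behind by a previous run, if any -->
    <property file="/mnt/nfs/shared/jenkins.properties"/>
    <!-- fallback when the file or property is missing, so the comparison fails -->
    <property name="jenkins.build.number" value="unset"/>
    <if>
        <not><equals arg1="${jenkins.build.number}" arg2="${expected.jenkins.build.number}"/></not>
        <then>
            <!-- first node to run for this Jenkins build: back up, then claim the build number -->
            <antcall target="backup"/>
            <echo file="/mnt/nfs/shared/jenkins.properties">jenkins.build.number=${expected.jenkins.build.number}</echo>
        </then>
        <else>
            <echo message="Backup already done for build ${expected.jenkins.build.number}, skipping."/>
        </else>
    </if>
</target>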

Related

How to access an SSH key file in a TeamCity job without SSH Upload

I have a job that SSHes into other servers and deploys some configuration with scp, but I cannot find any way to access the SSH key file configured in my TeamCity project in order to execute a shell command in my job, such as "ssh -i ~/.ssh/password", because TeamCity only exposes the job's working directory. So I want to ask: is there any way to access the SSH private key file that I set in the project settings?
Just to note, I cannot use SSH-EXEC and SSH-UPLOAD, as I have a shell script that SSHes into many servers one by one, reading them from a file. It would not be practical to have one separate SSH-Exec job step in the TeamCity project for each server, so I have to access the key file without using the standard SSH-EXEC and SSH-UPLOAD runners.
What have I tried?
I only had one idea: somehow access the SSH key, which is located outside the working directory, by its path (I found this in the documentation):
<TeamCity Data Directory>/config/projects/<project>/pluginData/ssh_keys
The problem is that I cannot simply cd into that path, as the job is not allowed to leave the working directory in which TeamCity executes it. So I could not access the directory where my project's ssh_keys are located.
UPD: Found a solution: use the SSH Agent build feature. With it, the key is loaded into an ssh-agent, and plain ssh commands in a command-line job step can use it.
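For illustration, once the SSH Agent build feature is enabled for the build configuration, a plain command-line step can use ssh/scp directly (servers.txt, the config file, and the target path below are hypothetical):
# the key selected in the SSH Agent build feature is already loaded into ssh-agent
while read -r host; do
    scp myapp.conf "deploy@${host}:/etc/myapp/"
done < servers.txt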

Does azure pipeline 'command line' agent job inherit working directory from the previous job?

My understanding of Azure Pipelines agent jobs was that:
each job is independent, and
each 'command line' job runs in its own context with an independent scope.
But if the working directory of an Azure Pipelines 'command line' task is not set, it seems to default to the working directory of the previous 'command line' agent job.
Before I answer the question, I want to make sure the terminology is clear:
Pipelines are the overall definition of your CI/CD process; they can contain multiple stages.
Stages are phases of your pipeline, like build, test, deploy... They can contain multiple jobs.
Jobs are collections of the tasks/steps needed to implement your process. They contain one or more tasks/steps.
Tasks (or steps) are the actual actions being executed, like "execute this command" or "build that dotnet project"...
The environment is reset between jobs (meaning a new virtual machine will be used, sources pulled again, etc.). Between tasks or steps that belong to the same job, you keep the same environment, and each task "benefits" from the outcome (files changed, environment variables, ...) of the previous ones.
In terms of working directory, tasks all default to the build's default working directory (see the Azure DevOps default variables).
If you set the working directory of one task to something different, it will not impact other tasks.
If you use a Microsoft-hosted agent, each time you run a pipeline you get a fresh virtual machine, and the virtual machine is discarded after one use. Each job may use a different agent, so you should not assume that state from an earlier job is available during subsequent ones.
Below is a simple test of this.
I created two agent jobs in my pipeline, added a command-line task to each, and ran the agent jobs one after the other. In the first command task, I create a .txt file in the $(Agent.BuildDirectory) folder and then read it.
In the second command task, I just change to that folder and try to read the .txt file.
In the end, the second task fails and shows an error message.
If I set the working directory in the first task and not in the second, the two tasks' working directories are different.
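A minimal YAML sketch of that test (assuming a Linux agent; the file name and contents are placeholders):
jobs:
- job: first
  steps:
  - script: |
      echo hello > "$(Agent.BuildDirectory)/state.txt"
      cat "$(Agent.BuildDirectory)/state.txt"        # succeeds: same job, same agent
- job: second
  dependsOn: first
  steps:
  # fails on a Microsoft-hosted agent: new virtual machine, the file no longer exists
  - script: cat "$(Agent.BuildDirectory)/state.txt"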

How to run bat file in jenkins

My Jenkins is installed in C:\Program Files (x86)\Jenkins.
The bat file is located at C:\Users\Admin\workspace\demoWork\run.bat.
When I run this bat file from cmd everything works fine, but when I try executing it from Jenkins as a Windows batch command, as shown in the image, Jenkins reports the error:
Build step 'Execute Windows batch command' marked build as failure
Also, inside the Jenkins folder a workspace folder gets created automatically with the job title as its name. Can you guys please explain this to me in detail?
Tatkal, you can't execute a command like the one in your image;
why don't you simply try
C:\users\admin\workspace\demowork\run.bat
or
call "C:\users\admin\workspace\demowork\run.bat"
"Also inside jenkins folder automatically workspace folder gets created with Job title name. Can you guys please explain me in detail" -
Jenkins automatically creates a folder named after the job title, and saves job data and other build info there; this is how it works. By default, in a Jenkins job you can access your workspace using the $WORKSPACE variable.
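For example, if run.bat sits at the root of the job's workspace (an assumption based on the paths in the question), the batch step body could be just:
REM %WORKSPACE% expands to the job's workspace directory
call "%WORKSPACE%\run.bat"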
You have put very little detail into this, so I'm going by pure guess.
The Execute Windows batch command step is for literally executing code, not executing a file. To execute the file you could use this command:
start cmd.exe /c C:\myprj\mybat.bat
or you could take the contents of the .bat file and rewrite them in that command line.
The way Jenkins works is that it creates its own workspace for each job, essentially to sandbox the environment. It's a testing framework, so it should be used to stage changes to code, which are then pushed to your live (working) environment. People use it to automate other tasks, but this isn't the primary use of Jenkins. If the above doesn't help, let me know more details of the error and I can try to help you with it.
node {
bat 'D:\\gatling-charts-highcharts-bundle-3.0.2\\bin\\gatling.bat'
}

sh file is not running on slave node in jenkins?

Hi, I am fairly new to Jenkins configuration and I am stuck on running an .sh file on a slave node. I have created two jobs: one creates some .sh and .jar files, and the other copies them to all the slave nodes. After the build I need to run the .sh file, which runs locally but not on the master. I am specifying the path, but Jenkins always runs some blank .sh file from the tmp folder.
Whereas in the job config I have given this:
The slave.sh file is present on the remote slave, but Jenkins is not running it. What is the possible cause?
I really do not understand what you are trying to do. You have a very strange way of dividing the work and then executing part of it in a post-build step; there should be very few use cases for post-build steps. Maybe you could just execute the slave.sh script in a normal build step? And maybe execute it directly from the source location, without copying it to another location.
If I'm missing something and it really is necessary to execute slave.sh in a post-build step, please verify that the path to the script is correct. There are several similar but slightly different paths in your question, and I cannot tell whether that is on purpose, but probably not.
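For instance, a normal Execute shell build step on the slave could run the script directly; the workspace location here is only a hypothetical example, since the exact path is unclear from the question:
# make sure the copied script is executable, then run it from the workspace
chmod +x "$WORKSPACE/slave.sh"
"$WORKSPACE/slave.sh"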

Set global environment variables inside Xcode build phase run script

I'm using Jenkins to do continuous integration builds. I have quite a few jobs that share much of the same configuration code, and I'm in the midst of pulling this all out into a common script file that I'd like to run pre- and post-build.
I've been unable to figure out how to set environment variables within that script so that both the Xcode build command and the Jenkins build can see them.
Does anyone know if this is possible?
It is not possible to do exactly what you ask. A process cannot change the environment variables of another process, and the pre, post, and actual build steps run in different processes.
But you can create a script that sets the common environment variables and share that script between all your builds.
The build step would first tell your shell to execute the commands in the script and then call xcodebuild:
# Note the dot in the beginning of the next line. It is not a typo.
. set_environment.sh
xcodebuild myawesomeapp.xcodeproj
The script could look like this:
export VARIABLE1=value1
export VARIABLE2=value2
How exactly your jobs share the script depends on your environment and use case. You can
place the script in some well-known location on the Jenkins host, or
place the script in the version-controlled source tree, if all your jobs share the same repository, or
place the script in a repository of its own and create a Jenkins build that archives the script as a build artifact. All the other jobs would then use the Copy Artifact plugin to get a copy of the script from the artifacts of the script job.
According to Apple's Technical Q&A QA1067, if you create the file /Users/YOU/.MacOSX/environment.plist and populate it with your desired environment variables, then all processes launched by that user will pick up these environment variables. You may need to restart your computer (or just log out and back in) before a newly launched process will pick up the variables.
The article also claims that Xcode will pass these variables to a build phase script. I have not tested it yet, but next time I restart my MacBook I will let you know if it worked.
From http://developer.apple.com/library/mac/#/legacy/mac/library/qa/qa1067/_index.html
Q: How do I set the environment for all processes launched by a specific user?
A: It is actually a fairly simple process to set environment variables for processes launched by a specific user.
There is a special environment file which loginwindow searches for each time a user logs in. The environment file is ~/.MacOSX/environment.plist (be careful, it's case sensitive), where '~' is the home directory of the user we are interested in. You will have to create the .MacOSX directory yourself using Terminal (by typing mkdir .MacOSX). You will also have to create the environment file yourself. The environment file is actually in XML/plist format (make sure to add the .plist extension to the end of the filename or this won't work).
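A minimal environment.plist carrying the two variables from the script above might look like this (a sketch in the standard plist format):
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- each key/string pair becomes an environment variable -->
    <key>VARIABLE1</key>
    <string>value1</string>
    <key>VARIABLE2</key>
    <string>value2</string>
</dict>
</plist>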
