Dynamically changing 'teamcity.build.branch' - teamcity

I want to set the value of 'teamcity.build.branch' dynamically, according to the result of another TeamCity build configuration that is part of the same build pipeline.
Is that even possible? It looks like the value is evaluated and used at the start of the build pipeline.
Use case:
I am executing a TC build configuration that generates a unique number.
In the connected TC build configuration that is part of the same pipeline, I want that number to be used in 'teamcity.build.branch', purely for visualization purposes.
I am already using a service message to overwrite the parameter, but the change is not taken into account. It looks like the value is read at a very early stage of the build process.

See the reference below, which covers build numbers based on Git branch names:
https://octopus.com/blog/teamcity-version-numbers-based-on-branches

You could overwrite the value of the parameter by using a simple script that emits a "set parameter" service message.
By using a dedicated service message in your build script, you can dynamically update build parameters of the build right from a build step (...)
With that approach, here are the steps that you need to perform:
In the first build config, define a custom build parameter and set its value to the unique number you're generating. Do this directly from the script that generates the unique number by writing something like this to STDOUT:
##teamcity[setParameter name='magicNumber' value='1234']
In the dependent build config, you now have access to that parameter. Using a second build script, you can overwrite the teamcity.build.branch with the same mechanism:
##teamcity[setParameter name='teamcity.build.branch' value='the new value']
Note 1: I recommend against overwriting the built-in parameters, because this might have strange side-effects. Rather, define a custom parameter in the second build config and use that for your visualization purposes.
Note 2: In case you decide to ignore Note 1, it may be necessary to overwrite the build parameters by setting the dependency property as outlined in the docs in section "Overriding Dependencies Properties":
##teamcity[setParameter name='reverse.dep.*.teamcity.build.branch' value='the new value']
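Following the recommendation in Note 1, a build step in the dependent configuration could republish the upstream number as a custom parameter used purely for display. A minimal sketch, where 'GeneratorConfig' (the upstream configuration's ID) and 'displayBranch' are placeholder names:

#!/usr/bin/env bash
# Read the number generated upstream (TeamCity exposes it as %dep.<btId>.<param>%)
# and republish it as a custom parameter of this build, for visualization only.
echo "##teamcity[setParameter name='displayBranch' value='build-%dep.GeneratorConfig.magicNumber%']"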

Related

How to use computed properties in github actions

I am trying to rebuild my CI/CD within the new GitHub Actions YAML format; the issue is that I can't seem to use computed values as arguments in a step.
I have tried the following
- name: Download Cache
  uses: ./.github/actions/cache
  with:
    entrypoint: restore_cache
    args: --bucket=gs://[bucket secret] --key=node-modules-cache-$(checksum package.json)-node-12.7.0
However "$(checksum package.json)" is not valid as part of an argument.
Please not this has nothing to do with if the command checksum exists, it does exist within the container.
I'm trying to copy this kind of setup in google cloud build
- name: gcr.io/$PROJECT_ID/restore_cache
  id: restore_cache_node
  args:
    - '--bucket=gs://${_CACHE_BUCKET}'
    - '--key=node-modules-cache-$(checksum package.json)-node-${_NODE_VERSION}'
I expected to be able to use computed arguments in a similar way to other CI/CD solutions.
Is there a way to do this that I am missing? Maybe being able to use 'run:' within a Docker container to run some commands.
The only solution I'm aware of at the moment is to compute the value in a previous step so that you can use it in later steps.
See this answer for a method using set-output. This is the method I would recommend for passing computed values between workflow steps.
Github Actions, how to share a calculated value between job steps?
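A minimal sketch of that approach, assuming the checksum command from the question is available on the runner (the step and output names are illustrative):

- name: Compute cache key
  id: cache_key
  run: echo "::set-output name=key::node-modules-cache-$(checksum package.json)-node-12.7.0"
- name: Download Cache
  uses: ./.github/actions/cache
  with:
    entrypoint: restore_cache
    args: --bucket=gs://[bucket secret] --key=${{ steps.cache_key.outputs.key }}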
Alternatively, you can create environment variables. Computed environment variables can also be used in later steps.
How do I set an env var with a bash expression in GitHub Actions?
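A sketch of the environment-variable variant, under the same assumption about checksum:

- name: Compute cache key
  run: echo "CACHE_KEY=node-modules-cache-$(checksum package.json)-node-12.7.0" >> $GITHUB_ENV
- name: Download Cache
  uses: ./.github/actions/cache
  with:
    entrypoint: restore_cache
    args: --bucket=gs://[bucket secret] --key=${{ env.CACHE_KEY }}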

Tagging Jmeter Test Cases

I am looking for a way to tag JMeter test cases.
We are using JMeter for functional testing, so we have a lot of test cases, and not every test case is run for every application configuration.
So, based on the configuration, we need to tag our test cases and run the matching test set from the command line.
(Something like we can do in TestNG and other frameworks, where you tag test cases and provide the tag at run time so that only test cases with that tag are executed.)
If there is no tagging available, then I feel I will need to create multiple test sets, one per configuration, and run them accordingly.
In most cases the tests overlap between these test sets, which will result in duplication and require quite a lot of maintenance.
Please suggest if you see any solution to this problem.
One solution that I can think of is explained below:
- Declare a property for every test case
- Process that property to decide whether to execute the test case. Use an If Controller to check whether that property is passed or not
- Control that property via the command line or via the GUI. If you want to run only some of them, pass those properties only.
A practical example is shown below.
The test plan will look like this:
Test case steps go inside an If Controller, and the If Controller decides whether to run that test case or not, depending on what you pass in the property.
Property declaration for all your test cases:
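A sketch of these declarations, as User Defined Variables that fall back to OFF when the property isn't supplied:

TC1 = ${__P(TC1,OFF)}
TC2 = ${__P(TC2,OFF)}
TC3 = ${__P(TC3,OFF)}

To run a case from the GUI, just change its default from OFF to ON.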
I have designed this in such a way that you can execute it via the GUI as well as via the command line.
If Controller logic
${__BeanShell("${TC1}"=="ON",)}
Execution via command line
jmeter -n -t <>.jmx -JTC1=ON -JTC3=ON -j sample.log
Here I am running TC1 and TC3, depending upon your requirement you can pass whichever scenario you need to execute.

Is it possible to set rootMetricsPrefix with a command line parameter?

My JMeter test receives a parameter to specify the test environment, like PROD or DEV.
Results from both test environments will be sent to the same GraphiteHost, but I need to separate the results of each environment by using rootMetricsPrefix.
For example, results from PROD will use prefix global.myapp.performance.prod. while the results from DEV will use prefix global.myapp.performance.dev..
So I set the rootMetricsPrefix in my Backend Listener as global.myapp.performance.${__groovy($__P(env).toLowerCase())}..
Unfortunately, it doesn't work: the data in Graphite doesn't contain the environment name.
Can anyone tell me how to solve this?
You can use the newer __changeCase function to lower-case your value:
${__changeCase(${__P(env)},LOWER,)}
It will read the property and then lower-case its value.
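With that, the full rootMetricsPrefix in the Backend Listener becomes the line below, and a run such as the following (the .jmx file name is illustrative) reports under global.myapp.performance.prod.:

global.myapp.performance.${__changeCase(${__P(env)},LOWER,)}.
jmeter -n -t plan.jmx -Jenv=PROD -j run.log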
There is an error in your expression; it should be:
global.myapp.performance.${__groovy("${__P(env)}".toLowerCase())}
This will also work:
global.myapp.performance.${__groovy(props.get("env").toLowerCase())}
But for performance, it is better to use the solution provided by #user7294900.

Jenkins Build Flow Plugin having trouble passing parameters between builds

I've got a set of parametrized builds in Jenkins; to build one, I have to click 'Build Now' and then enter a value for the parameter (in this case called GIT_TAG_NAME). I'd like to trigger a set of these parametrized builds that all use the same parameter, without typing it multiple times.
I'm trying to get this working with the Build Flow Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Build+Flow+Plugin) by making a master build flow that triggers all the other builds, but I'm not understanding the plugin syntax, or maybe this just isn't possible.
My DSL looks like:
out.println "-------------------------"
out.println 'Building all OTA builds at tag: '
out.println params["GIT_TAG_NAME"]
out.println "-------------------------"
build( "SomeOTA-Build-1", param1: params["GIT_TAG_NAME"] )
build( "SomeOTA-Build-2", param1: params["GIT_TAG_NAME"] )
The print statement prints the parameter correctly, but the child builds don't seem to get the parameter passed in to them.
Try using the Parameterized Trigger Plugin -
Set up Job-A with a parameter GIT_TAG_NAME - this is your "front-end"
Set up Job-B1, Job-B2 and Job-B3 with the same parameter GIT_TAG_NAME - those do the actual work
Set Job-A to "Trigger parameterized build on other projects" (in Post-build Actions) and pass "Current build parameters" to the triggered jobs (you need one trigger per derived job, either with the same or different conditions)
Now, running Job-A will trigger the other jobs while passing them the value of GIT_TAG_NAME.
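If you prefer to pass only selected values instead of all current parameters, the same trigger also accepts a 'Predefined parameters' block; a minimal sketch:

GIT_TAG_NAME=$GIT_TAG_NAME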
EDIT:
There is a Plugin from TIKAL that uses a different approach:
Multijob Plugin tries to squeeze all job-steps into one big job
(have not tried it, so cannot comment on this approach).
If taking the first approach, you would probably want to take a look at the Join Plugin -
this plugin allows a job to be run after all the immediate downstream jobs have ended.

Jenkins - passing variables between jobs?

I have two jobs in jenkins, both of which need the same parameter.
How can I run the first job with a parameter so that when it triggers the second job, the same parameter is used?
You can use the Parameterized Trigger Plugin, which will let you pass parameters from one task to another.
You also need to add the parameter you passed from upstream to the downstream job.
1. Post-build Actions > select "Trigger parameterized build on other projects"
2. Enter the environment variable with its value. The value can also be a Jenkins build parameter.
Detailed steps can be seen here:
https://itisatechiesworld.wordpress.com/jenkins-related-articles/jenkins-configuration/jenkins-passing-a-parameter-from-one-job-to-another/
Hope it's helpful :)
The accepted answer here does not work for my use case. I needed to be able to dynamically create parameters in one job and pass them into another. As Mark McKenna mentions, there is seemingly no way to export a variable from a shell build step to the post-build actions.
I achieved a workaround using the Parameterized Trigger Plugin by writing the values to a file and using that file as the parameters to import via 'Add post-build action' -> 'Trigger parameterized build...', then selecting 'Add Parameters' -> 'Parameters from properties file'.
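A sketch of that workaround (the file name and variable are illustrative):

# Execute-shell build step: compute a value and persist it
echo "MAGIC_NUMBER=$(date +%s)" > build.properties
# Then, under 'Trigger parameterized build...' -> 'Add Parameters' ->
# 'Parameters from properties file', point the trigger at build.properties.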
I think the answer above needs some updating:
I was trying to create a dynamic directory to store my upstream build artifacts, so I wanted to pass my upstream job's build number to the downstream job. I tried the above steps but couldn't make it work. Here is how it worked:
I copied the artifacts from my current job using the Copy Artifact plugin.
In the post-build action of the upstream job I added the variable like "SOURCE_BUILD_NUMBER=${BUILD_NUMBER}" and configured it to trigger the downstream job.
Everything worked except that my downstream job was not able to get $SOURCE_BUILD_NUMBER to create the directory.
So I found out that to use this variable I have to define the same variable in the downstream job as a parameter variable, like in the picture below:
This is because the new version of Jenkins requires you to define the variable in the downstream job as well. I hope it's helpful.
(for fellow googlers)
If you are building a serious pipeline with the Build Flow Plugin, you can pass parameters between jobs with the DSL like this:
Supposing an available string parameter "CVS_TAG", in order to pass it to other jobs:
build("pipeline_begin", CVS_TAG: params['CVS_TAG'])
parallel (
// will be scheduled in parallel.
{ build("pipeline_static_analysis", CVS_TAG: params['CVS_TAG']) },
{ build("pipeline_nonreg", CVS_TAG: params['CVS_TAG']) }
)
// will be triggered after previous jobs complete
build("pipeline_end", CVS_TAG: params['CVS_TAG'])
Hint for displaying available variables / params :
// output values
out.println '------------------------------------'
out.println 'Triggered Parameters Map:'
out.println params
out.println '------------------------------------'
out.println 'Build Object Properties:'
build.properties.each { out.println "$it.key -> $it.value" }
out.println '------------------------------------'
Just to add my answer in addition to Nigel Kirby's, as I can't comment yet:
In order to pass a dynamically created parameter, you can also export the variable in an 'Execute shell' tile and then pass it through 'Trigger parameterized build on other projects' => 'Predefined parameters' => 'YOUR_VAR=$YOUR_VAR'. My team uses this feature to pass the npm package version from the build job to the deployment jobs.
UPDATE: the above only works for Jenkins-injected parameters; a parameter created from a shell still needs to use the same method, e.g. echo YOUR_VAR=${YOUR_VAR} > variable.properties and pass that file downstream.
I faced the same issue when I had to pass a POM version to a downstream Rundeck job.
What I did was use parameter injection via a properties file, as follows:
1) Create properties in a properties file via shell (a sketch follows below):
Build actions:
- Execute a shell script
- Inject environment variables
E.g.: properties definition
2) Pass the defined properties to the downstream job:
Post-build actions:
- Trigger parameterized build on other projects
- Add parameters: Current build parameters
- Add parameters: Predefined parameters
E.g.: properties sending
3) It was then possible to use $POM_VERSION as such in the downstream Rundeck job.
/!\ Jenkins Version : 1.636
/!\ For some reason when creating the triggered build, it was necessary to add the option 'Current build parameters' to pass the properties.
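A sketch of step 1 above; the Maven invocation is illustrative (-DforceStdout needs maven-help-plugin 3.1.0+) and the file name is arbitrary:

# Execute-shell build step: extract the version from the POM
POM_VERSION=$(mvn -q help:evaluate -Dexpression=project.version -DforceStdout)
echo "POM_VERSION=${POM_VERSION}" > version.properties
# 'Inject environment variables' then reads version.properties, and a
# predefined parameter POM_VERSION=$POM_VERSION is sent downstream.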
Reading through the answers, I don't see another option that I like so will offer it as well. I love the parameterization of jobs, but it doesn't always scale well. If you have jobs which are not directly downstream of the first job but farther down the pipeline, you don't really want to parameterize every job in the pipeline so as to be able to pass the parameters all the way through. Or if you have a large number of parameters used by a variety of other jobs (especially those not necessarily tied to one parent or master job), again parameterization doesn't work.
In these cases, I favor outputting the values to a properties file and then injecting that in whatever job I need using the EnvInject plugin. This can be done dynamically, which is another way to solve the issue from another answer above where parameterized jobs were still used. This solution scales very well in many scenarios.
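A sketch of that pattern (the file name and values are illustrative):

# In the producing job (Execute shell):
cat > pipeline.properties <<EOF
APP_VERSION=1.4.2
DEPLOY_ENV=staging
EOF
# Archive or copy pipeline.properties to the consuming job, then use EnvInject's
# 'Inject environment variables' with 'Properties File Path' = pipeline.properties.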
This can be done via a Groovy function:
Upstream Jenkinsfile - the param CREDENTIALS_ID is passed downstream:
pipeline {
    agent any
    stages {
        stage('Trigger downstream') {
            steps {
                build job: "my_downstream_job_name",
                    parameters: [string(name: 'CREDENTIALS_ID', value: 'other_credentials_id')]
            }
        }
    }
}
Downstream Jenkinsfile - if the param CREDENTIALS_ID is not passed from upstream, the function returns a default value:
def getCredentialsId() {
    if (params.CREDENTIALS_ID) {
        return params.CREDENTIALS_ID;
    } else {
        return "default_credentials_id";
    }
}
pipeline {
    agent any
    environment {
        TEST_PASSWORD = credentials("${getCredentialsId()}")
    }
    stages {
        stage('use') { steps { echo 'TEST_PASSWORD is available here' } }
    }
}
You can use the Hudson Groovy builder to do this.
First job in the pipeline:
Second job in the pipeline:
I figured it out!
With almost 2 hours' worth of trial and error, I figured it out.
This WORKS and is what you do to pass variables to a remote job:
def handle = triggerRemoteJob(remoteJenkinsName: 'remoteJenkins', job: 'RemoteJob', parameters: "param1=${env.PARAM1}\nparam2=${env.param2}")
Use \n to separate two parameters, with no spaces.
As opposed to
parameters: '''someparams'''
we use
parameters: "someparams"
The " ... " (double quotes, not two single quotes) is what gets us the values of the desired variables.
The ''' ... ''' or ' ... ' (three single quotes or plain single quotes) will not get us those values.
All parameters here are defined in the environment{} block at the start of the pipeline and are modified in stages > steps > script wherever necessary.
I also tested and found that when you use " ... " you cannot use something like ''' ... "..." ''' or "... '..'..." or any combination of it.
The catch here is that when you are using "..." in the parameters section, you cannot pass a literal string; for example, this WILL NOT WORK:
def handle = triggerRemoteJob(remoteJenkinsName: 'remoteJenkins', job: 'RemoteJob', parameters: "param1=${env.PARAM1}\nparam2='param2'")
If you want to pass something like the one above, you will need to set an environment variable param2='param2' and then use ${env.param2} in the parameters section of the remote trigger step.
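Putting that together, a sketch (the job names and values are illustrative):

pipeline {
    agent any
    environment {
        PARAM1 = 'value1'   // illustrative value
        param2 = 'param2'   // the literal moves into the environment block
    }
    stages {
        stage('Trigger remote') {
            steps {
                script {
                    def handle = triggerRemoteJob(remoteJenkinsName: 'remoteJenkins', job: 'RemoteJob', parameters: "param1=${env.PARAM1}\nparam2=${env.param2}")
                }
            }
        }
    }
}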
You can also make a job write into a properties file somewhere and have another job read it. One way to do that is to inject variables via the EnvInject plugin.
