Is it possible to set Jenkins pipeline node value as parameter? - jenkins-pipeline

I want to set node value as a Jenkins Scripted Pipeline parameter. Something like this:
node('${Node}') {
    stage('Clone') {
        checkout scm
    }
}
The Node parameter is specified as a Choice Parameter.
node(${Node})
Gives an error: java.lang.NoSuchMethodError: No such DSL method '$' found among steps
node("${Node}")
Gives an error: There are no nodes with the label ‘class hudson.model.Node’
node('${Node}')
Gives an error: There are no nodes with the label ‘${Node}’
Is it possible at all?

Try setting the parameter to a variable in your scripted pipeline, like so:
def node_to_run_on = "${params.Node}"
node("${node_to_run_on}") {
    ...
    ...
}
Or just:
node("${params.Node}") {
    ...
    ...
}
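If the Choice parameter is already defined in the job configuration, params.Node is all you need. For completeness, here is a minimal sketch of declaring the parameter from the scripted pipeline itself via the properties step; the node labels below are placeholders, and on older Jenkins versions choices may need to be a newline-separated string instead of a list:
properties([
    parameters([
        // 'linux' and 'windows' are example labels only
        choice(name: 'Node', choices: ['linux', 'windows'], description: 'Label of the node to run on')
    ])
])
node(params.Node) {
    stage('Clone') {
        checkout scm
    }
}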

Related

How to mock variables from a groovy file

I have a Groovy file created under "vars" in a Jenkins shared library.
A few variables are defined inside call().
I want to mock those variables in a Groovy test file.
The variables are defined in sonarGradleProject.groovy in vars:
#!/usr/bin/env groovy
import static com.example.jenkins.Constants.*
def call(Map params = [:]) {
    def sonarCredId = params.sonarCredentialId ?: SONAR_CREDENTIAL_ID_DEFAULT;
    def sonarUrl = params.sonarUrl ?: SONAR_URL;
    def profile = params.profile ?: 'example';
    withCredentials([string(credentialsId: sonarCredId, variable: 'SONAR_TOKEN')]) {
        sh "./gradlew --no-daemon jacocoTestReport sonarqube -P${profile} -Dsonar.host.url=${sonarUrl} -Dsonar.login=$SONAR_TOKEN -Dsonar.verbose=true"
    }
}
Test file looks like this:
import com.example.jenkins.testing.JenkinsPipelineSpecification
class SonarGradleProjectSpec extends JenkinsPipelineSpecification {
    def "expected minor value"() {
        setup:
        def sonarGradleProject = loadPipelineScriptForTest("vars/sonarGradleProject.groovy")
        explicitlyMockPipelineStep('withCredentials')
        when:
        def verType = sonarGradleProject(profile: 'foo')
        then:
        1 * getPipelineMock("sh")("./gradlew --no-daemon jacocoTestReport sonarqube -P${profile} -Dsonar.host.url=${sonarUrl} -Dsonar.login=$SONAR_TOKEN -Dsonar.verbose=true")
        expect:
        'foo' == profile
    }
}
On executing the test case, I get this error:
java.lang.IllegalStateException: There is no pipeline variable mock for [profile].
1. Is the name correct?
2. Is it a GlobalVariable extension point? If so, does the getName() method return [profile]?
3. Is that variable normally defined by Jenkins? If so, you may need to define it by hand in your Spec.
4. Does that variable come from a plugin? If so, is that plugin listed as a dependency in your pom.xml?
5. If not, you may need to call explicitlyMockPipelineVariable("profile") during your test setup.
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at com.example.jenkins.testing.JenkinsPipelineSpecification.addPipelineMocksToObjects_closure1$_closure15(JenkinsPipelineSpecification.groovy:755)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at SonarGradleProjectSpec.expected minor value(sonarGradleProjectSpec.groovy:12)
Caused by: groovy.lang.MissingPropertyException: No such property: (intercepted on instance [SonarGradleProjectSpec#3dfb1626] during test [SonarGradleProjectSpec#3dfb1626]) profile for class: SonarGradleProjectSpec
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at com.example.jenkins.testing.JenkinsPipelineSpecification.addPipelineMocksToObjects_closure1$_closure15(JenkinsPipelineSpecification.groovy:754)
... 3 more
I want to use those variables in the test.
How do I mock them?
I've run into similar issues before. The issue for me was that I was trying to verify the interactions of a mock using string interpolation. So for the one entry in the then section...
1 * getPipelineMock("sh")("./gradlew --no-daemon jacocoTestReport sonarqube -P${profile} -Dsonar.host.url=${sonarUrl} -Dsonar.login=$SONAR_TOKEN -Dsonar.verbose=true")
I would define the profile and sonarUrl variables in the setup section (e.g., def profile = 'test123') and reference them when verifying the mocked sh step interaction. I believe you would also want to adjust the assignment of default values to use the null-safe ?. operator when accessing key-value pairs in the params map, so def sonarUrl = params.sonarUrl ?: SONAR_URL; would become def sonarUrl = params?.sonarUrl ?: SONAR_URL;
You may have already addressed this and omitted it from the question, but I figured I'd share this information too. Please ignore the following if you've already handled it. :)
The three environment variables referenced in the script (SONAR_CREDENTIAL_ID_DEFAULT, SONAR_URL, and SONAR_TOKEN) need to be injected into the script's binding before calling sonarGradleProject. That should be sufficient to test the default values when the corresponding key-value pairs are not provided in the params map.
setup:
def sonarGradleProject = loadPipelineScriptForTest("vars/sonarGradleProject.groovy")
explicitlyMockPipelineStep('withCredentials')
// Inject environment variables.
// First argument is the variable name and second argument is the variable value.
sonarGradleProject.getBinding().setVariable('SONAR_CREDENTIAL_ID_DEFAULT', 'test123')
sonarGradleProject.getBinding().setVariable('SONAR_URL', 'test123')
sonarGradleProject.getBinding().setVariable('SONAR_TOKEN', 'test123')
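Putting that together, the spec might look roughly like this. This is only a sketch under the assumptions above: the profile and sonarUrl values are made up, and because SONAR_TOKEN is injected into the binding as 'test123', the interpolated sh command is expected to contain that literal value:
def "expected minor value"() {
    setup:
    def sonarGradleProject = loadPipelineScriptForTest("vars/sonarGradleProject.groovy")
    explicitlyMockPipelineStep('withCredentials')
    sonarGradleProject.getBinding().setVariable('SONAR_CREDENTIAL_ID_DEFAULT', 'test123')
    sonarGradleProject.getBinding().setVariable('SONAR_URL', 'test123')
    sonarGradleProject.getBinding().setVariable('SONAR_TOKEN', 'test123')
    // Local test values, referenced when verifying the mocked sh interaction.
    def profile = 'foo'
    def sonarUrl = 'https://sonar.example.com'  // placeholder URL
    when:
    sonarGradleProject(profile: profile, sonarUrl: sonarUrl)
    then:
    1 * getPipelineMock("sh")("./gradlew --no-daemon jacocoTestReport sonarqube -P${profile} -Dsonar.host.url=${sonarUrl} -Dsonar.login=test123 -Dsonar.verbose=true")
}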

Terraform EC2 NIC private_ips build list from custom module outputs

I have a custom child module that is building various AWS resources for unique EC2 instances, which are loaded from JSON definition files. In the root module, I need to concatenate an output property from each of the child modules to apply secondary private IPv4 addresses to a network interface resource. The network interface will have 2 static IPv4 addresses, plus an IPv4 address from EACH of the child modules that are built.
Here is my folder structure:
root/
|_ main.tf
instances/
|_ instance1.json
|_ instance2.json
|_ ...
modules/instance/
|_ main.tf
|_ outputs.tf
The root main.tf file will load all of the JSON files into custom child modules using the for_each argument like so:
locals {
  json_files = fileset("./instances/", "*.json")
  json_data  = [for f in local.json_files : jsondecode(file("./instances/${f}"))]
}

module "instance" {
  for_each     = { for k, v in local.json_data : k => v }
  source       = "../modules/instance"
  server_name  = each.value.server_name
  firewall_vip = each.value.firewall_vip
  ...
}
There is a string output attribute I'm trying to grab from the child modules to then apply as a list to an aws_network_interface resource private_ips property.
The string output attribute is a virtual IP used for special routing through a firewall to the backend instances.
Example of the output attribute in the child module outputs.tf file:
output "firewall_vip" {
description = "The virtual IP to pass through firewall"
value = "10.0.0.10"
}
Side note: The "firewall_vip" output property is ALSO defined within the JSON files for an input variable to the child module... So is there an easier way to pull the property straight from the JSON files instead of relying on the child module outputs?
Within the root module main.tf file, I am trying to concatenate a list of all secondary IPs to apply to the NIC using a splat expression (not sure if this is the right approach):
resource "aws_network_interface" "firewall" {
  subnet_id   = <subnet ID>
  private_ips = concat(["10.0.0.4", "10.0.0.5"], module.instance[*].firewall_vip)
}
I receive an error saying:
Error: Incorrect attribute value type
module.instance is a map of object, known only after apply
Inappropriate value for attribute "private_ips": element 2: string required.
I have also tried to use a for expression to achieve this, like so:
resource "aws_network_interface" "firewall" {
  private_ips = concat(["10.0.0.4", "10.0.0.5"], [for k, v in module.instance[*] : v if k == "firewall_vip"])
  ...
}
I do not receive any errors with this method, but it also does not pick up any of the "firewall_vip" outputs from the child modules to append to the list.
Am I going about this the wrong way? Any suggestions would be very helpful, as I'm still a Terraform newbie.
I realize I was over-complicating this, and I could just use the locals{} block to pull the JSON attributes without having to rely on the child module outputs...
In the root main.tf file:
locals {
  json_data     = [for f in fileset("./instances/", "*.json") : jsondecode(file("./instances/${f}"))]
  firewall_vips = local.json_data[*].firewall_vip
}

resource "aws_network_interface" "firewall" {
  private_ips = concat(["10.0.0.4", "10.0.0.5"], local.firewall_vips)
  ...
}
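If you do want to stick with the module outputs instead, one option (a sketch, not verified against this exact configuration) is to turn the map that for_each produces into a list before splatting, since the [*] splat only works on lists, not on maps of module instances:
resource "aws_network_interface" "firewall" {
  subnet_id   = "<subnet ID>"
  # values() converts the for_each map of module objects into a list, which the splat can traverse
  private_ips = concat(["10.0.0.4", "10.0.0.5"], values(module.instance)[*].firewall_vip)
}
An equivalent for expression would be [for m in module.instance : m.firewall_vip].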

Console Output in pipeline: Jenkins

I have created a complex pipeline. In each stage I call a job. I want to see the console output for each job in a stage in Jenkins. How can I get it?
The object returned from a build step can be used to query the log like this:
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                echo 'Building anotherJob and getting the log'
                script {
                    def bRun = build 'anotherJob'
                    echo 'last 100 lines of anotherJob'
                    for (String line : bRun.getRawBuild().getLog(100)) {
                        echo line
                    }
                }
            }
        }
    }
}
The object returned from the build step is a RunWrapper object. The getRawBuild() call returns a Run object; judging from that class, there may be other options besides reading the log line by line. For this to work you need to either disable the pipeline sandbox or get script approvals for these methods:
method hudson.model.Run getLog int
method org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper getRawBuild
If you are doing this for many builds, it would be worth putting some code in a pipeline shared library to do what you need, or defining a function in the pipeline, as sketched below.
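For example, a small helper along these lines could live in the pipeline script or in a shared library. This is only a sketch; the job name and line count are placeholders, and the same script approvals mentioned above still apply:
// Build a downstream job and echo the last `lines` lines of its console log.
def buildAndPrintLog(String jobName, int lines = 100) {
    def run = build job: jobName
    for (String line : run.getRawBuild().getLog(lines)) {
        echo line
    }
    return run
}
A stage could then simply call buildAndPrintLog('anotherJob') from a script block.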

How to configure lazily a Gradle task?

I'm trying to configure the following custom task:
task antecedeRelease(type: AntecedeReleaseTask) {
    antecedeWithVersion = project.'antecede-with-version'
    antecedeToVersion = project.'antecede-to-version'
}
The problem is that the properties antecede-with-version and antecede-to-version are to be set through the command line with a -P option. If they're not set and antecedeRelease isn't being called, that shouldn't be a cause for an error:
$ ./gradlew tasks
org.gradle.api.GradleScriptException: A problem occurred evaluating project ...
Caused by: groovy.lang.MissingPropertyException: Could not find property 'antecede-with-version' on project ...
I could conditionally define the antecedeRelease task such that it's defined only if those properties are defined but I'd like to keep the build.gradle file as clean as possible.
If you need the antecedeRelease task to be configured "lazily", that is, at the end of the configuration phase or at the beginning of the execution phase, your best bet is to use doFirst:
task antecedeRelease(type: AntecedeReleaseTask) {
    doFirst {
        antecedeWithVersion = project.'antecede-with-version'
        antecedeToVersion = project.'antecede-to-version'
    }
}
One option might be to use Groovy's elvis operator like so:
task antecedeRelease(type: AntecedeReleaseTask) {
    antecedeWithVersion = project.ext.get('antecede-with-version') ?: 'unused'
    antecedeToVersion = project.ext.get('antecede-to-version') ?: 'unused'
}
If this still fails, you can consider checking project.ext.has('property') when setting the value.
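A sketch of that guarded variant, using the project-level hasProperty/property methods (which also see properties passed with -P):
task antecedeRelease(type: AntecedeReleaseTask) {
    // Fall back to a placeholder value when the -P property is not supplied
    antecedeWithVersion = project.hasProperty('antecede-with-version') ? project.property('antecede-with-version') : 'unused'
    antecedeToVersion = project.hasProperty('antecede-to-version') ? project.property('antecede-to-version') : 'unused'
}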

The nested copy task of svnant doesn't work in Gradle

I am a newbie to both Gradle and Groovy, and I am trying to set up a tag in my Subversion repository. Below is my Gradle script:
task svnrev {
    // use ant to retrieve revision.
    ant.taskdef(resource: 'org/tigris/subversion/svnant/svnantlib.xml') {
        classpath {
            fileset(dir: 'lib/DEV/svnant', includes: '*.jar')
        }
    }
    ant.svn(javahl: 'false', svnkit: 'true', username: "${_svn_user}", password: "${_svn_password}", failonerror: 'false') {
        ant.info(target: "${_svn_source_url}", propPrefix: 'svninfo')
    }
    // retrieve property of ant project and assign it to a task's property, refer to:
    // http://gradle.1045684.n5.nabble.com/can-t-find-or-extract-properties-from-svnant-info-function-in-gradle-td3335388.html
    ext.lastRev = ant.getProject().properties['svninfo.lastRev']
    // retrieve property of gradle project
    //getProject().properties['buildFile']
}
task svntag << {
    ant.svn(javahl: 'false', svnkit: 'true', username: "${_svn_user}", password: "${_svn_password}", failonerror: 'false') {
        copy(srcurl: "${_svn_source_url}", desturl="${_svn_tag_url}", message="Create tag: ${_svn_tag_url}")
    }
}
The task 'svnrev' works normally; however, when I run 'gradle svntag', I constantly get this error message:
* What went wrong:
A problem occurred evaluating root project 'AFM-IGPE-v2.0.0'.
> Could not find method copy() for arguments [{srcurl=svn://192.168.2.9/IGPE/trunk_dev}, svn://192.168.2.9/IGPE/tag/AFM, Create tag: svn://192.168.2.9/IGPE/tag/AFM] on root project 'AFM-IGPE-v2.0.0'.
I also tried
ant.copy(srcurl: "${_svn_source_url}", desturl="${_svn_tag_url}", message="Create tag: ${_svn_tag_url}")
And this time a different error message was shown:
* What went wrong:
A problem occurred evaluating root project 'AFM-IGPE-v2.0.0'.
> No signature of method: org.gradle.api.internal.project.DefaultAntBuilder.copy() is applicable for argument types: (java.util.LinkedHashMap, org.codehaus.groovy.runtime.GStringImpl, org.codehaus.groovy.runtime.GStringImpl) values: [[srcurl:svn://192.168.2.9/IGPE/trunk_dev], ...]
Possible solutions: any(), notify(), wait(), grep(), every(), find()
In fact, I just did a straightforward translation of my Ant build.xml to Gradle, and the Ant build.xml works well. I have googled for a while, but found no results. Please help, and thanks in advance for your kind help.
At first sight, I can spot two problems:
It has to be task svnrev << {, not task svnrev {.
Groovy named parameters are written with a :, not a =. (The latter instead assigns a default value to a positional parameter.) That's probably why you get the error for ant.copy (you mix and match between : and =).
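Applying the second point, the svntag task would look something like this (an untested sketch, keeping your original property names):
task svntag << {
    ant.svn(javahl: 'false', svnkit: 'true', username: "${_svn_user}", password: "${_svn_password}", failonerror: 'false') {
        // all named parameters use ':' rather than '='
        copy(srcurl: "${_svn_source_url}", desturl: "${_svn_tag_url}", message: "Create tag: ${_svn_tag_url}")
    }
}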
