How to access the value of an axis in a label? - jenkins-pipeline

...
matrix {
    axes {
        axis {
            name 'FOO'
            values 'foo1', 'foo2'
        }
    ...
    stages {
        stage('doIt') {
            agent {
                label '???'
            }
    ...
I would like to build a label expression that accepts win or mac combined with one of the values of FOO. How can I combine the value of the axis with the other strings to form a meaningful label?

You can access the axis variable by its name, FOO. The only thing to keep in mind is to use it inside a double-quoted string, so the value is interpolated correctly.
pipeline {
    agent none
    stages {
        stage('Matrix example') {
            matrix {
                agent any
                axes {
                    axis {
                        name 'FOO'
                        values 'bar1', 'bar2', 'bar3'
                    }
                }
                stages {
                    stage('Test') {
                        agent {
                            label "${FOO}"
                        }
                        steps {
                            // ...
                        }
                    }
                }
            }
        }
    }
}
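If you need to combine the axis value with other strings, interpolation works the same way inside a larger label expression. A minimal sketch, assuming your agents carry both an OS label (e.g. mac) and a label matching the FOO value, joined with the && label operator:
agent {
    // runs on an agent that has both the 'mac' label and the current FOO value as a label
    label "mac && ${FOO}"
}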

Related

Does Pimcore provide a way to export Hotspots and Markers additional data in DataHub GraphQL?

Asset Hotspots have 'add Data' in the context menu.
We can add related objects / fields / documents / etc. there.
How can this data be exported in GraphQL? I see only the '__typename' property.
Sample query:
{
  getCategoryListing(defaultLanguage: "en") {
    totalCount
    edges {
      node {
        categoryimage {
          image {
            fullpath
          }
          hotspots {
            name
            top
            height
            left
            width
            data {
              __typename
            }
          }
          marker {
            name
            top
            left
            data {
              __typename
            }
          }
        }
      }
    }
  }
}
Yes, this functionality is provided by GraphQL itself.
Check that you have granted read rights to the object class you are trying to access; otherwise you will see 'object: null'.
Learn which object types you get by querying __typename.
When you know the type name and have access to that object, just query its data.
Sample:
... on property_text {
  type
  name
  text
}
or
... on object_product {
  sku
}
See the documentation here: https://pimcore.com/docs/data-hub/current/GraphQL/Query/Query_Samples/Sample_Element_Properties.html
Sample query to get additional data from Pimcore Hotspots and Markers on Hotspotimage:
{
  getCategoryListing(defaultLanguage: "en") {
    totalCount
    edges {
      node {
        id
        categoryimage {
          image {
            fullpath
          }
          hotspots {
            name
            top
            height
            left
            width
            data {
              __typename
              ... on property_text {
                type
                name
                text
              }
              ... on property_object {
                type
                name
                object {
                  __typename
                  ... on object_product {
                    sku
                  }
                  ... on element {
                    __typename
                  }
                }
              }
            }
          }
          marker {
            name
            top
            left
            data {
              __typename
              ... on property_text {
                type
                name
                text
              }
              ... on property_object {
                type
                name
                object {
                  __typename
                  ... on object_product {
                    sku
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}

What is the correct way to write a condition using when in a Jenkinsfile?

I am trying to write a when statement in a single stage block in a Jenkinsfile. I have tried writing it as below; I know it's not the correct way. It's a declarative pipeline script, and the pipeline expects only a single when block per stage. How can I combine both of my when blocks into a single when?
stages {
    stage('Approve Dev Deployment') {
        agent { label 'docker-kitchensink-slave' }
        when {
            anyOf {
                expression {
                    return (env.GIT_BRANCH.equals('master') || env.GIT_BRANCH.startsWith('hotfix-'))
                }
            }
        }
        when {
            expression {
                input message: 'Deploy test?'
                return true
            }
            beforeAgent true
        }
        steps {
            approveDeployment()
        }
    }
}
Write a function with the conditions outside the pipeline scope and use it as the condition. Note that a bare method call is not a valid when condition in declarative syntax, so wrap it in expression:
def checkcondition() {
    your_condition
}
stages {
    stage('Approve Dev Deployment') {
        agent { label 'docker-kitchensink-slave' }
        when { expression { checkcondition() } }
        steps {
            approveDeployment()
        }
    }
}
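Alternatively, the two when blocks from the question can be merged into a single one by nesting the conditions under allOf. A sketch reusing the exact conditions from the question (keeping the asker's input-inside-expression approach as-is):
when {
    beforeAgent true
    allOf {
        anyOf {
            expression {
                return (env.GIT_BRANCH.equals('master') || env.GIT_BRANCH.startsWith('hotfix-'))
            }
        }
        expression {
            // prompts for confirmation before entering the stage
            input message: 'Deploy test?'
            return true
        }
    }
}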

Jenkins pipelines with parallel and different containers

So I am already running Jenkins pipelines with parallel stages based on the example from: Is it possible to create parallel Jenkins Declarative Pipeline stages in a loop?
I want to run each job in a different isolated container, and the agent label should be the same for all of them. I tried a few options; all of them ended up with errors. I think I need to use both declarative and scripted pipelines, but I'm not sure how.
Things I tried:
def generateTerraformStage(env) {
    return {
        agent { label 'local_terraform' }
        stage("stage: Terraform ${TERRAFORM_ACTION} ${env}") {
            echo "${env}"
            sleep 30
        }
    }
}
stage('parallel stages') {
    agent { label 'local_terraform' }
    steps {
        script {
            parallel parallelStagesMapEnvironment
        }
    }
}
One of the errors I got during testing:
"java.lang.NoSuchMethodError: No such DSL method 'agent' found among steps" and "java.lang.IllegalArgumentException: Expected named arguments but got org.jenkinsci.plugins.workflow.cps.CpsClosure2#560f3533"
Dynamic parallel stages can be created only by using a Scripted Pipeline. The Declarative Pipeline API (agent, options, when, etc.) is not available there.
I don't see any information that you really need dynamic stages (e.g. stages based on a value returned by a third-party service), so I prepared two solutions:
dynamic parallel stages - stages are generated at runtime from input data
static parallel stages - you know all stages up front (the when block can be used to skip the ones that are not needed, e.g. based on passed-in parameters)
pipeline {
    // ...
    stages {
        stage('dynamic parallel stages') {
            steps {
                script {
                    // params.ENVS == 'envA,envB,envC'
                    def values = params.ENVS.split(',')
                    def stages = [:]
                    for (def value in values) {
                        stages[value] = generateTerraformStage(value)
                    }
                    parallel stages
                }
            }
        }
        stage('static parallel stages') {
            parallel {
                stage('envA') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envA') }
                    }
                    steps {
                        terraformStageLogic 'envA'
                    }
                }
                stage('envB') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envB') }
                    }
                    steps {
                        terraformStageLogic 'envB'
                    }
                }
                stage('envC') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envC') }
                    }
                    steps {
                        terraformStageLogic 'envC'
                    }
                }
                // ...
            }
        }
    }
}
Closure<Void> generateTerraformStage(env) {
    return {
        node('local_terraform') {
            stage("stage: Terraform ${TERRAFORM_ACTION} ${env}") {
                echo "${env}"
                sleep 30
            }
        }
    }
}
void terraformStageLogic(env) {
    echo "${env}"
    sleep 30
}
When you don't use the workspace in the stage responsible for generating or executing other stages (dynamic parallel stages and static parallel stages), you don't need to allocate any node to it; doing so would waste resources.
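A sketch of that idea, based on the code above: with agent none at the pipeline level, the orchestrating stage occupies no executor, while each generated closure allocates its own node:
pipeline {
    agent none // the orchestrating stage itself needs no workspace
    stages {
        stage('dynamic parallel stages') {
            steps {
                script {
                    def stages = [:]
                    for (def value in params.ENVS.split(',')) {
                        stages[value] = generateTerraformStage(value)
                    }
                    // each generated closure calls node('local_terraform') itself
                    parallel stages
                }
            }
        }
    }
}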

How to write a dynamic declarative pipeline that contains sequential jobs inside parallel jobs

I'm trying to write declarative pipeline code that accepts a map and creates a pipeline. I am able to achieve sequential stages or parallel stages, but I'm facing problems making a pipeline that contains sequential stages inside parallel stages.
The input data would be a Map. Each list in the map should run in parallel, and the items inside the list corresponding to each key should run sequentially.
Example data: [1:[11,12], 2:[21,22], 3:[31,32]]
The output should match the attached image (one parallel branch per key, with that key's items running sequentially). Could someone give me some ideas?
Below is the code I have tried.
def stageData = [1:[11,12], 2:[21,22], 3:[31,32]]
def getDeployStages1(stageData) {
    Map deployStages = [:]
    stageData.each { key, stgValue ->
        List stgs = []
        stgValue.each { value ->
            deployStages.put("${value}", {
                echo "${value}"
            })
        }
    }
    return deployStages
}
def getDeployStages2(stageData) {
    Map deployStages = [:]
    stageData.each { key, stgValue ->
        List stgs = []
        stgValue.each { value ->
            stgs.add(stage("${value}") {
                echo "${value}"
            })
        }
        deployStages.put("${key}", stgs)
    }
    return deployStages
}
pipeline {
    agent any
    stages {
        stage('deploy1') {
            steps {
                script {
                    parallel getDeployStages1(stageData)
                }
            }
        }
        stage('deploy2') {
            steps {
                script {
                    parallel getDeployStages2(stageData)
                }
            }
        }
    }
}
According to this documentation, you can nest stages in the following way:
pipeline {
    agent none
    stages {
        stage("build and deploy on Windows and Linux") {
            parallel {
                stage("windows") {
                    agent {
                        label "windows"
                    }
                    stages {
                        stage("build") {
                            steps {
                                bat "run-build.bat"
                            }
                        }
                        stage("deploy") {
                            when {
                                branch "master"
                            }
                            steps {
                                bat "run-deploy.bat"
                            }
                        }
                    }
                }
                stage("linux") {
                    agent {
                        label "linux"
                    }
                    stages {
                        stage("build") {
                            steps {
                                sh "./run-build.sh"
                            }
                        }
                        stage("deploy") {
                            when {
                                branch "master"
                            }
                            steps {
                                sh "./run-deploy.sh"
                            }
                        }
                    }
                }
            }
        }
    }
}
This results in a flow where the windows and linux stages run in parallel, and each runs its build and deploy stages sequentially. To apply this to your case, you can simplify your functions to return just the elements that need to run sequentially (just the values).
pipeline {
    agent any
    stages {
        stage('parallel') {
            parallel {
                stage('deploy1') {
                    steps {
                        script {
                            // assumes getDeployStages1 was simplified to return the list of values for this branch
                            def list = getDeployStages1(stageData)
                            for (int i = 0; i < list.size(); i++) {
                                stage("${list[i]}") {
                                    echo "${list[i]}"
                                }
                            }
                        }
                    }
                }
                stage('deploy2') {
                    steps {
                        script {
                            // similar
                        }
                    }
                }
            }
        }
    }
}
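If the whole structure must be generated from the map, a fully scripted pipeline may be simpler. A minimal sketch assuming the map from the question, where each key becomes a parallel branch and its values become sequential stages inside that branch:
def stageData = [1: [11, 12], 2: [21, 22], 3: [31, 32]]

def branches = [:]
stageData.each { key, values ->
    // one parallel branch per map key
    branches["branch-${key}"] = {
        // this key's values run as sequential stages inside the branch
        values.each { value ->
            stage("${value}") {
                echo "${value}"
            }
        }
    }
}

parallel branches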

How to query Prismic slices and return data from each slice

I'm trying to use Gatsby's /___graphql debugger, and the README file for gatsby-source-prismic says you can return slices. So below I'm returning the slice with the name PrismicProductBodySteps.
{
  allPrismicHomePage {
    edges {
      node {
        data {
          seo_title
          body {
            __typename
            ... on PrismicProductBodySteps {
            }
          }
        }
      }
    }
  }
}
Can someone explain to me what ... on PrismicProductBodySteps means?
In a Gatsby component I've seen this as an example.
body {
  ... on PrismicProductsBodySteps {
    ...ProductStepsFragment
  }
}
Can anyone explain to me what ...ProductStepsFragment means?
PrismicProductBodySteps would be a custom node type name representing a dynamic series of content blocks. That custom node type name is coming from a Prismic data model; yours will likely be different.
According to the gatsby-source-prismic documentation, using custom node type names requires you to figure out what they are first:
The easiest way to get the type of nodes is to use the /___graphql debugger and run the below query (adjust the document type and field name).
{
  allPrismicPage {
    edges {
      node {
        id
        data {
          body {
            __typename
          }
        }
      }
    }
  }
}
Once you have your custom node type names, you can use an inline fragment to pull the data specific to each type. Again, this depends on how the slices are defined in your data model, but it would look something like this:
{
  allPrismicHomePage {
    edges {
      node {
        data {
          seo_title
          body {
            __typename
            ... on PrismicYourContentBlockOne {
              text {
                html
              }
            }
            ... on PrismicYourContentBlockTwo {
              text {
                html
              }
            }
            ... on PrismicYourContentBlockThree {
              text {
                html
              }
            }
          }
        }
      }
    }
  }
}
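As for ...ProductStepsFragment: that is the spread of a named GraphQL fragment, i.e. a reusable selection set defined once and referenced by name. A minimal sketch of what such a definition could look like (the fields inside are placeholders; the real ones depend on how the Steps slice is modeled):
fragment ProductStepsFragment on PrismicProductsBodySteps {
  text {
    html
  }
}
In Gatsby, fragments defined in any graphql tagged template are collected globally, which is why a component can spread ...ProductStepsFragment without defining it inline.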
