Using an Azure DevOps pipeline triggered by a webhook, I cannot get the payload content - yaml

I'm trying to get the payload content coming from the webhook that triggers the pipeline when a work item is updated.
I have a PowerShell task to try to read the content. For example, when a work item is updated, I want to get the System.AreaPath that is in the work item and available in the payload.
trigger: none

pool:
  name: SYNCCHR

resources:
  webhooks:
    - webhook: GETPAYLOAD
      connection: GETPAYLOAD_CON

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "${{ parameters.GETPAYLOAD.resource.workItemId }}"
      Write-Host "${{ parameters.GETPAYLOAD.resource.revision.rev }}"
      Write-Host '${{ parameters.GETPAYLOAD.resource.revision.fields.System.AreaPath }}'
I can't find a way to get the System.AreaPath value. parameters.GETPAYLOAD is a YAML expression based on the built-in parameters collection that Azure DevOps exposes to the pipeline.
The fact that the property name System.AreaPath contains a dot does not help with the YAML syntax!
The payload contains JSON of this kind:
payload content
I tried many different syntaxes, triggering the pipeline by updating a work item, but I cannot obtain the content of the properties under the revision or fields nodes.
The goal is to be able to obtain any property of the payload in the PowerShell task.

I finally found the way to obtain the System.AreaPath data.
The syntax is:
for the workitem.updated event:
  "${{ parameters.GETPAYLOAD.resource.revision.fields['System.AreaPath'] }}"
for the workitem.created event:
  "${{ parameters.GETPAYLOAD.resource.fields['System.AreaPath'] }}"

Related

Ansible, counting occurrences of a word in a string

I am fairly new to Ansible, and I have been googling about this particular issue described below:
- name: GETTING OUTPUT AND STORING IT INTO A VARIABLE
  connection: network_cli
  cli_command:
    command: show configuration interface ge-0/0/0 | display set | match unit
  register: A
The task above runs the command show configuration interface ge-0/0/0 on a Juniper router; the output will contain a number of occurrences of the keyword unit. This output is then stored in the variable A.
I want to count how many times the keyword unit appears in the output and store that number in a variable COUNT. How can I do that? I just need an example.
Thanks and have a good weekend!!
If you have this task:
- name: get output and store
  connection: network_cli
  cli_command:
    command: show configuration interface ge-0/0/0 | display set | match unit
  register: show_config_result
Then you use a subsequent set_fact task to store the value you want in a variable:
- name: store unit count in unit_count variable
  set_fact:
    unit_count: "{{ (show_config_result.stdout_lines | length) - 1 }}"

Ansible-lint Throwing error "ERROR! 'raw' is not a valid attribute for a Play"

I have a YAML file with just a couple of tasks that I include in another YAML file.
The playbook runs fine, but when I run ansible-lint against the YAML file containing the tasks, it throws the error
ERROR! 'raw' is not a valid attribute for a Play.
- name: Clusters Info
  raw: "show-clusters-info cluster-id={{ item }}"
  register: Clusters_Info
  ignore_errors: true

- name: Show XMS Info
  raw: "show-xms"
  register: show_xms_info
  ignore_errors: true
A playbook is a list of plays. Your example above is only a list of tasks (which, I guess, is included in your playbook later on).
From the ansible-lint README:
Usage: ansible-lint [options] [playbook.yml [playbook2 ...]]|roledirectory
So if you pass a file name directly to ansible-lint, it will try to analyze it as a playbook. The error you get is therefore expected. Either pass a playbook (which includes your task file) or a role directory (defaulting to the current directory if empty) to analyze a playbook or a role.
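For example, a minimal wrapper playbook you could point ansible-lint at (the file names and host pattern here are assumptions for illustration):

# site.yml - wrapper so ansible-lint sees a real play
- hosts: all
  gather_facts: false
  tasks:
    - import_tasks: cluster_tasks.yml   # the task file from the question, name assumed

Running ansible-lint site.yml then analyzes the included tasks in the context of a play.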

Ansible loop over registered results [duplicate]

I have a simple playbook that is supposed to display the status of my services, and I want to view the output from the machine to see whether each status is active or not, so I used a debug print, like so:
- name: name_of_services
  shell: systemctl status {{ item }}
  with_items:
    - service1
    - service2
  register: out

- debug: var=item.stdout_lines
  with_items: out.results
When I execute this I get a lot of info I don't want, plus the item.stdout_lines info that I do want at the end of it.
How can I view the output of my command in a cleaner way?
For modules called in a loop (i.e. with_items), including debug, the value of item at each iteration is shown. I don't know of a way to turn this off. If you want to reduce your output, you can try switching to the msg parameter of the debug module, which takes a Jinja-templated string. You could do something like this, obviously adjusting the regex to match the systemctl output.
- name: show values
  debug:
    msg: "{{ item.stdout_lines | map('regex_replace', '^(.*)\\.service.*Active: (.*)$', '\\1 \\2') | list }}"
  with_items: out.results
If you don't want to use the regex_replace filter, you can consider writing your own filter plugin to format the data the way you like it.
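As a lighter alternative without a regex, a sketch that just pulls out the Active: line for each service (the exact systemctl output format is an assumption; item.item is the original loop item stored in each registered result):

- name: show only the Active line for each service
  debug:
    msg: "{{ item.item }}: {{ item.stdout_lines | select('search', 'Active:') | list }}"
  with_items: "{{ out.results }}"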
In general ansible playbooks aren't a great place to display status information gathered through register vars, facts, etc. The playbook output is more geared toward task status.

Ansible looping when a variable has no value

I have created a variable to disable sites in nginx within my main set of tasks. Since this is a one-time task, once domain1.com is disabled I can comment the entire line out. When I do, I receive this error:
{"failed": true, "msg": "the field 'args' has an invalid value, which appears to include a variable that is undefined. The error was: 'None' has no attribute 'domain'"}
What can I do to modify my task so it only runs when there are domains listed in the variable?
nginx_sites_disabled:
  #- domain: "domain1.com"

- name: Disable sites
  file:
    path: /etc/nginx/sites-enabled/{{ item.domain }}
    state: absent
  with_items: "{{ nginx_sites_disabled }}"
  notify:
    - Reload nginx
Apply the default filter within your task:

- name: Disable sites
  file:
    path: /etc/nginx/sites-enabled/{{ item.domain }}
    state: absent
  with_items: "{{ nginx_sites_disabled | default([]) }}"
  notify:
    - Reload nginx
Related answer: Ansible: apply when to complete loop
You don't need to comment out the lines once the work is done.
Like most Ansible modules, the file module is idempotent: if the desired state is absent and the file isn't there, it won't do anything.
Just leave the nginx_sites_disabled list unchanged.
By the way, if you still need nginx_sites_disabled to be an empty list, you need to write this:
---
nginx_sites_disabled: []
Otherwise, nginx_sites_disabled will be equal to None. That's why you get this error.
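In other words, the variable from the question can simply stay defined as it was (a sketch; re-running the task just confirms the symlink is already absent):

nginx_sites_disabled:
  - domain: "domain1.com"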

Ansible use task return variable in variable templates

I am trying to craft a list of environment variables to use in tasks, where the path may differ slightly on each host due to version differences.
For example, /some/common/path/v_123/rest/of/path
I created a list of these variables in a variables file that gets imported via roles.
roles/somerole/vars/main.yml contains the following:
somename:
  somevar: 'coolvar'
  env:
    SOME_LIB_PATH: /some/common/path/{{ unique_part.stdout }}/rest/of/path
I then have a task that runs something like this
- name: Get unique path part
  shell: 'ls /some/common/path/'
  register: unique_part
  tags: workflow

- name: Perform some actions that need some paths
  shell: 'binary argument argument'
  environment: somename.env
But I get some Ansible errors about variables not being defined.
Alternatively, I tried to predefine unique_part.stdout, hoping that register would overwrite the predefined variable, but then I got other Ansible errors (failure to template).
Is there another way to craft these variables based on command output?
You can also use facts:
http://docs.ansible.com/set_fact_module.html
# Prepare unique variables
- hosts: myservers
  tasks:
    - name: Get unique path part
      shell: 'ls /some/common/path/'
      register: unique_part
      tags: workflow

    - name: Add as fact for each host
      set_fact:
        library_path: "{{ unique_part.stdout }}"

# launch roles that use those unique variables
- hosts: myservers
  roles:
    - somerole
This way you can dynamically add variables to your hosts before using them.
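The role's vars file can then reference the fact instead of the raw register result (a sketch; the variable and path names follow the question and the answer above):

# roles/somerole/vars/main.yml
somename:
  somevar: 'coolvar'
  env:
    SOME_LIB_PATH: "/some/common/path/{{ library_path }}/rest/of/path"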
The vars file gets evaluated when it is read by Ansible. Your only chance would be to include a placeholder which you then later replace yourself, like this:
somename:
  somevar: 'coolvar'
  env:
    SOME_LIB_PATH: '/some/common/path/[[ unique_part.stdout ]]/rest/of/path'
And then later in your playbook you can replace that placeholder:
- name: Perform some actions that need some paths
  shell: 'binary argument argument'
  environment: '{{ somename.env | replace("[[ unique_part.stdout ]]", unique_part.stdout) }}'
