Ansible pass multiple lines in input variable file - ansible

I am running an Ansible playbook, taking variables from an external file:
ansible-playbook -v /path/export.yml --extra-vars '@input.json'
Currently the file has only one line, like below:
{ out_file: exp_app_12.xml, control_file: export_control.xml}
Now I want to put multiple entries in the input.json file, like below:
{ out_file: exp_app_12.xml, control_file: export_control1.xml}
{ out_file: exp_app_13.xml, control_file: export_control2.xml}
{ out_file: exp_app_14.xml, control_file: export_control3.xml}
But it's not working. How can I achieve this?

You should pass the JSON to --extra-vars in the proper format, like this:
ansible-playbook arcade.yml --extra-vars '{"pacman":"mrs","ghosts":["inky","pinky","clyde","sue"]}'
I think your JSON file is not in the correct format. Note that --extra-vars expects a dictionary at the top level, so wrap the list of entries under a key, for example:
{"exports": [
{"out_file": "exp_app_12.xml","control_file": "export_control1.xml"},
{"out_file": "exp_app_13.xml","control_file": "export_control2.xml"},
{"out_file": "exp_app_14.xml","control_file": "export_control3.xml"}
]}
Furthermore, see the Ansible documentation on defining variables at runtime for more details.
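With the entries under a top-level key (--extra-vars must be a dictionary), the playbook can iterate over the list with a loop. A minimal sketch; the key name exports and the debug task are illustrative, not taken from the original question:

```yaml
# export.yml -- sketch of consuming an "exports" list passed via --extra-vars
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Show one export job per list entry
      debug:
        msg: "out_file={{ item.out_file }} control_file={{ item.control_file }}"
      loop: "{{ exports }}"
```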

Related

How can I source Terraform HCL variables in bash?

I have Terraform variables defined like
variable "location" {
type = string
default = "eastus"
description = "Desired Azure Region"
}
variable "resource_group" {
type = string
default = "my-rg"
description = "Desired Azure Resource Group Name"
}
and potentially / partially overwritten in terraform.tfvars file
location = "westeurope"
and then defined variables as outputs e.g. a file outputs.tf:
output "resource_group" {
value = var.resource_group
}
output "location" {
value = var.location
}
How can I "source" the effective variable values in a bash script to work with these values?
One way is to read the Terraform output values as JSON and then use a utility like jq to convert and source them as variables:
source <(terraform output --json | jq -r 'keys[] as $k | "\($k|ascii_upcase)=\(.[$k] | .value)"')
Note that outputs are only available after executing terraform apply (or a terraform refresh); a plain terraform plan does not populate them.
If jq is not available or not desired, sed can be used to convert the HCL-style Terraform output into variable assignments with upper-case names:
source <(terraform output | sed -r 's/^([a-z_]+)\s+=\s+(.*)$/\U\1=\L\2/')
or using -chdir argument if Terraform templates / modules are in another folder:
source <(terraform -chdir=$TARGET_INFRA_FOLDER output | sed -r 's/^([a-z_]+)\s+=\s+(.*)$/\U\1=\L\2/')
Then these variables are available in the bash script:
LOCATION="westeurope"
RESOURCE_GROUP="my-rg"
and can be addressed as $LOCATION and $RESOURCE_GROUP.
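A quick way to see what the sed transform produces is to feed it a sample of terraform output-style text (the values here are illustrative):

```shell
# sample HCL-style output piped through the transform shown above
printf 'location = "westeurope"\nresource_group = "my-rg"\n' |
  sed -r 's/^([a-z_]+)\s+=\s+(.*)$/\U\1=\L\2/'
# -> LOCATION="westeurope"
# -> RESOURCE_GROUP="my-rg"
```

Sourcing those lines makes $LOCATION and $RESOURCE_GROUP available; the shell consumes the surrounding quotes when the assignments are evaluated.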

Jenkinsfile with ansibleplaybook using multiple inventory files

When I need to include multiple inventory files with ansible-playbook on the CLI, I usually use:
ansible-playbook -i inventory/dev-hosts -i inventory/staging-hosts playbooks/run.yml
Now I want to do the same, but in a Jenkinsfile. I'm using the following code:
stage('1') {
steps {
ansiColor('xterm') {
ansiblePlaybook([
colorized: true,
inventory: inventory/dev-hosts , inventory/staging-hosts ,
playbook: playbooks/run.yml
])
}
}
}
But this is not working, and I can't find information on how to do this in a Jenkinsfile. Any idea how to solve it?
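One possible workaround, sketched below under the assumption that the Jenkins Ansible plugin's ansiblePlaybook step is in use: its inventory parameter takes a single string, so the additional inventory can be passed as a raw -i flag through the step's extras parameter (parameter names per the plugin; verify against your installed version):

```groovy
stage('1') {
    steps {
        ansiColor('xterm') {
            ansiblePlaybook([
                colorized: true,
                // inventory accepts one path; extra -i flags can go in `extras`
                inventory: 'inventory/dev-hosts',
                extras: '-i inventory/staging-hosts',
                playbook: 'playbooks/run.yml'
            ])
        }
    }
}
```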

Escaping Dollar sign in Jenkins credentials

I have test$001 as the value of a Jenkins secret text credential. Later, in the pipeline script, I access that value and write it to a YAML file as shown below, which is used as a K8s ConfigMap.
The problem is the dollar sign in the value.
environment {
TEST_CRED=credentials('TEST_CRED')
}
script.sh
cat << EOF > test.yaml
...
data:
TEST: ${TEST_CRED}
EOF
Expected: test$001
Printed: test$$001 (Note extra dollar sign being inserted automatically)
I tried every way I could find to escape the dollar sign; nothing worked:
TEST_01: '${TEST_CRED}'
TEST_02: ${TEST_CRED}
TEST_03: '$${TEST_CRED}'
TEST_04: $${TEST_CRED}
TEST_05: "$${TEST_CRED}"
TEST_08: $TEST_CRED
When storing the value in Jenkins secret text credentials, escape the dollar sign: test$001 should actually be stored as test\$001.
Following works for me:
pipeline {
agent any
environment {
MYTEST_CRED=credentials('TEST_CRED')
}
stages {
stage('Special Char') {
steps {
sh """
cat << EOF > test.yaml
Name: test-config
Namespace: default
data:
TEST: ${MYTEST_CRED}
EOF
"""
}
}
}
}
Here is an example of what happens when I pass an unescaped string to the Jenkins job via parameters, and things do not go my way.
// Original and expected value. Works fine with pure groovy
echo env.SECRET_VALUE
test#U$3r
// But this variable in shell is getting messed up
// sh("\$ENV") and sh('$ENV') use the value of the shell environment variable
sh("echo \$SECRET_VALUE")
test#U$$3r
sh('echo $SECRET_VALUE')
test#U$$3r
// sh("$ENV") and sh("${ENV}") use the value of the Groovy variable passed to the shell
sh("echo $SECRET_VALUE")
test#Ur
sh("echo ${SECRET_VALUE}")
test#Ur
Let's try to fix it
env.ESCAPED_SECRET_VALUE = env.SECRET_VALUE.replaceAll(/(!|"|#|#|\$|%|&|\\/|\(|\)|=|\?)/, /\\$0/)
// groovy variable is becoming a bit broken
echo env.ESCAPED_SECRET_VALUE
test\#U\$3r
// shell env variable is still broken
sh("echo \$ESCAPED_SECRET_VALUE")
test\#U\$$3r
sh('echo $ESCAPED_SECRET_VALUE')
test\#U\$$3r
// But, if we will pass groovy env variable to the shell - it looks good
sh("echo $ESCAPED_SECRET_VALUE")
test#U$3r
sh("echo ${ESCAPED_SECRET_VALUE}")
test#U$3r
If you are using the command directly in sh(script: ""), then just pass the escaped Groovy variable. If you need to invoke a shell script file, pass the value of the escaped Groovy variable into it as an input argument.
Example:
sh("./my_super_script.sh $ESCAPED_SECRET_VALUE")
# my_super_script.sh
#!/bin/bash
SECRET_VALUE="$1"      # quote to preserve special characters
echo "$SECRET_VALUE"
I did a setup as per your requirement and got the desired results.
The setup consisted of the following steps (screenshots omitted):
Setup Jenkins secret text credential
Setup Binding in the Jenkins job
Configuring the build to create the test.yaml
Content of test.yaml
$ cat test.yaml
...
data:
TEST: test$001

Dynamic Inventory script output vs JSON file

I'm writing a dynamic inventory script which queries Docker containers. It outputs JSON, which I can save to a file and use, but I get parse errors from Ansible when I try to use the script directly.
[root@297b1ca0cfa4 /]# docker-dynamic-inventory > inv.json
[root@297b1ca0cfa4 /]# cat inv.json
{"all": {"hosts": {"inv_clyde_1": null, "inv_blinky_1": null, "inv_inky_1": null, "inv_pinky_1": null, "admiring_chandrasekhar": null}, "_meta": {"hostvars": {}}, "vars": {"ansible_connection": "docker"}}}
[root@297b1ca0cfa4 /]# ansible all -i inv.json -m ping
inv_clyde_1 | FAILED! => {
"failed": true,
"msg": "docker command not found in PATH"
}
Note that I don't care if the ping fails; getting that far means that my inventory works, and Ansible is successfully interpreting the JSON and the inventory it represents. Now compare this with using the script directly:
[root@297b1ca0cfa4 /]# ansible all -i /usr/bin/docker-dynamic-inventory -m ping
[WARNING]: * Failed to parse /usr/bin/docker-dynamic-inventory with script plugin:
You defined a group 'all' with bad data for the host list:
{u'hosts': {u'inv_clyde_1': None, u'inv_inky_1': None,
u'admiring_chandrasekhar': None, u'inv_pinky_1': None, u'inv_blinky_1': None},
u'_meta': {u'hostvars': {}}, u'vars': {u'ansible_connection': u'docker'}}
Ansible's docs on Inventory show it using a dictionary and null values to represent hosts, which is why I do that here.
Apart from the fact that Ansible prints the dict it read in from JSON, I'm not seeing what's different/wrong here. Why does the stored JSON output work where the script won't?
It turns out that all is a special group, but only when interpreted by the script plugin. In a static inventory, all can be a dictionary of keys with null values, but when coming from a script, the hosts value for all must be a list of strings.
{"all":
{"hosts": ["admiring_chandrasekhar", "inv_inky_1", "inv_pinky_1",
"inv_blinky_1", "inv_clyde_1"],
"_meta": {"hostvars": {}},
"vars": {"ansible_connection": "docker"}}}
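A script emitting the list form could be sketched like this. The host names are hardcoded from the question purely for illustration; a real script would query Docker for running containers, and the structure mirrors the corrected JSON above:

```python
#!/usr/bin/env python
# Sketch of a dynamic inventory script that emits "hosts" as a list of
# strings (the form the script plugin requires). Host names are hardcoded
# for illustration; a real script would query Docker.
import json

def build_inventory():
    return {
        "all": {
            "hosts": ["admiring_chandrasekhar", "inv_inky_1", "inv_pinky_1",
                      "inv_blinky_1", "inv_clyde_1"],  # a list, not a dict
            "_meta": {"hostvars": {}},
            "vars": {"ansible_connection": "docker"},
        }
    }

if __name__ == "__main__":
    print(json.dumps(build_inventory()))
```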

executing shell script and using its output as input to next gradle task

I am using Gradle for build and release, so my Gradle script executes a shell script. The shell script outputs an IP address, which has to be provided as input to my next Gradle SSH task. I am able to get the output and print it on the console, but I am not able to use this output as input to the next task.
remotes {
web01 {
def ip = exec {
commandLine './returnid.sh'
}
println ip --> I am able to see the ip address on the console
role 'webServers'
host = ip --> I tried referring to it as $ip and '$ip'; both result in a syntax error
user = 'ubuntu'
password = 'ubuntu'
}
}
task checkWebServers1 << {
ssh.run {
session(remotes.web01) {
execute 'mkdir -p /home/ubuntu/abc3'
}
}
}
but it results in this error:
What went wrong:
Execution failed for task ':checkWebServers1'.
java.net.UnknownHostException: {exitValue=0, failure=null}
Can anyone please help me use the output variable with the proper syntax, or provide some hints? Thanks in advance.
The reason it's not working is that the exec call returns an ExecResult (see its JavaDoc description), not the text output of the execution.
If you need the text output, you have to specify the standardOutput property of the exec task. This can be done like so:
remotes {
web01 {
def ip = new ByteArrayOutputStream()
exec {
commandLine './returnid.sh'
standardOutput = ip
}
println ip
role 'webServers'
host = ip.toString().split("\n")[2].trim()
user = 'ubuntu'
password = 'ubuntu'
}
}
Just note that the captured ip value may contain multiline output, so it has to be parsed to get the correct value. On my Windows machine this could be done as:
ip.toString().split("\n")[2].trim()
Here it takes the third line (index 2) of the output and trims the line terminator.
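If returnid.sh prints nothing but the address on a single line, the parsing step reduces to a trim. A sketch under that assumption:

```groovy
// sketch: assumes ./returnid.sh emits exactly one line containing the IP
def ip = new ByteArrayOutputStream()
exec {
    commandLine './returnid.sh'
    standardOutput = ip
}
def address = ip.toString().trim()  // strip the trailing newline
```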
