I am trying to read the contents of a file, store them in a variable, and then insert them into another file if they don't already exist.
So, how I'm attempting to go about this is as follows:
# Create a variable that represents the path to the file that you want to read from
ssh_public_key_file: '../../jenkins_master/files/{{ hostvars[inventory_hostname]["environment"] }}/id_rsa.pub'
# Create a variable that represents the contents of this file:
ssh_public_key: "{{ lookup('file', '{{ ssh_public_key_file }}') }}"
I then use these variables in my Ansible playbook as follows:
- name: Install SSH authorized key
  lineinfile: create=yes dest=~/.ssh/authorized_keys line=" {{ ssh_public_key }}" mode=0644
However, when I try and run the playbook, I get the following error message:
could not locate file in lookup: {{ ssh_public_key_file }}
Can anyone recommend a solution or suggest what I may have done wrong?
Thanks,
Seán
You have to change the line to:
# Create a variable that represents the contents of this file:
ssh_public_key: "{{ lookup('file', ssh_public_key_file) }}"
If you need to concatenate variables and strings you can do it like this:
# Example with two variables
ssh_public_key: "{{ lookup('file', var_1+var_2) }}"
# Example with string and variable
ssh_public_key: "{{ lookup('file', '~/config/'+var_1) }}"
First I would make sure that your ssh_public_key_file variable is set up properly. If you add a task like the following what does it show?
- name: display variable
  debug: var=ssh_public_key_file
If the output looks something like this, then the variable isn't defined properly (e.g. the "environment" fact doesn't exist for the host):
ok: [localhost] => {
"ssh_public_key_file": "../../jenkins_master/files/{{ hostvars[inventory_hostname][\"environment\"] }}/id_rsa.pub"
}
However if everything is defined properly then your output should show the variables replaced with their correct values:
ok: [localhost] => {
"ssh_public_key_file": "../../jenkins_master/files/foo/id_rsa.pub"
}
Once you've verified that then I would do the same thing with your ssh_public_key variable. Just output its value using the debug module. It should display as the contents of the public key file.
One other thing I would strongly suggest is to avoid using lineinfile altogether. Since you're working with SSH keys I would recommend you use the authorized_key module instead. It's a much cleaner way of managing authorized_keys files.
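For example, a minimal sketch with authorized_key (untested; ansible_user_id is just one way to pick the target account, adjust as needed):
- name: Install SSH authorized key
  authorized_key:
    user: "{{ ansible_user_id }}"   # assumption: install the key for the connecting user
    key: "{{ ssh_public_key }}"
    state: present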
Related
I want to set a playbook-level environment variable, but only after I execute a couple of tasks. I have found that I can define a playbook-level environment variable before the definition of any tasks, or define task-level environment variables. But I haven't found how to set up an environment variable that can be used by all tasks following a particular task.
- name: server properties
  hosts: kafka_broker
  vars:
    ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"
    ansible_host_key_checking: false
    date: "{{ lookup('pipe', 'date +%Y%m%d-%H%M%S') }}"
    copy_to_dest: "/export/home/kafusr/kafka/secrets"
    server_props_loc: "/etc/kafka"
    secrets_props_loc: "{{ server_props_loc }}/secrets"
  environment:
    CONFLUENT_SECURITY_MASTER_KEY: "{{ extract_key2 }}"
  tasks:
    - name: Create a directory if it does not exist
      file:
        path: "{{ copy_to_dest }}"
        state: directory
        mode: '0755'
    - name: Find files from "{{ server_props_loc }}"
      find:
        paths: /etc/kafka/
        patterns: "server.properties*"
        # ... the rest of the task
      register: etc_kafka_server_props
    - name: Find files from "{{ secrets_props_loc }}"
      find:
        paths: /etc/kafka/secrets
        patterns: "*"
        # ... the rest of the task
      register: etc_kafka_secrets_props
    - name: Copy the files
      copy:
        src: "{{ item.path }}"
        dest: "{{ copy_to_dest }}"
        remote_src: yes
      loop: "{{ etc_kafka_server_props.files + etc_kafka_secrets_props.files }}"
    - name: set masterkey content value
      set_fact:
        contents: "{{ lookup('file', '/export/home/kafusr/kafka/secrets/masterkey.txt') }}"
        extract_key2: "{{ contents.split('\n').2.split('|').2|trim }}"
I want to set CONFLUENT_SECURITY_MASTER_KEY after the set_fact task.
Is it possible to set a playbook-level environment variable after defining some tasks?
Thank you
UPDATE
Initially, when I was executing the playbook as originally defined, I was getting the error
fatal: [kafkaserver1]: FAILED! => {"msg": "The field 'environment' has an invalid value,
which includes an undefined variable. The error was: 'extract_key2' is undefined"}
which was expected, as the variable extract_key2 was not set before the files were copied to the desired directory.
After @Zeitounator's suggestion, when I added a default to the environment variable's definition,
CONFLUENT_SECURITY_MASTER_KEY: "{{ extract_key2 | default('') }}"
I now get a different error:
TASK [set masterkey content value] ******************** fatal: [kafkaserver1]: FAILED! =>
{"msg": "The task includes an option with an undefined variable. The error was: 'contents' is undefined\n\n
The error appears to be in '/export/home/kafuser/tmp/so-71538207-question.yml': line 43, column 7, but may\n
be elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n
- name: set masterkey content value\n ^ here\n"}
I am getting this on all 3 brokers in the console, and I have checked that the file exists.
I did a cat on that file, copying the path from the error to make sure there was no typo, and the contents of the file are displayed on the console.
Update 2
I am trying to figure out how to use slurp to get the info, with the same approach as @Zeitounator's example using lookup.
This is what I am trying. The current definition is, of course, erroneous; I just wanted to show what I am trying to do. But can it be done with slurp, and am I on the right path?
environment:
  CONFLUENT_SECURITY_MASTER_KEY: >-
    {{
      (
        ((slurp: src: /export/home/z8tpush/kafka/secrets/masterkey.txt)['content'] | b64decode ).split('\n').2.split('|').2|trim
      )
    }}
@Zeitounator - Will you be able to point me to an example where a slurp or fetch module is used to set up an environment variable, and where the value gets updated after the tasks that create the file are executed, similar to what you have shown with the lookup plugin? I would really appreciate it.
Note:
Ultimately, I want to use Ansible to create a new Kafka user using Confluent's CLI commands (using the shell or command module), verify it in my directory, and once satisfied, encrypt the security.properties file using the masterkey and copy it to the appropriate location where Confluent is installed.
As already mentioned, you can
- set environment variables globally with Ansible Configuration Settings
- set the remote environment in a task
Regarding your question
I haven't found how to set up an environment variable that can be used by all tasks following a particular task.
You can set the environment at block level, a logical group of tasks, too.
Setting the remote environment: "When you set a value with environment: at the play or block level, it is available only to tasks within the play or block that are executed by the same user."
This means you would need to define a block for the next tasks
- name: Block of next task(s)
  block:
    - name: Next task
      ...
  environment:
    CONFLUENT_SECURITY_MASTER_KEY: "{{ extract_key2 }}"
Regarding your question
Is it possible to set a playbook-level environment variable after defining some tasks?
No, not at that level in that run, as the playbook is already running.
Another option might be to distribute the tasks in question into another role, playbook, or task file and include_* it.
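One rough way to achieve that (an untested sketch, not from the original answer) is to split the playbook into two plays: facts set with set_fact in the first play survive into the second play for the same hosts, so the second play can set the environment at play level. Note that lookup('file', ...) reads the file on the controller, as in the question's own set_fact:
- name: Prepare the masterkey
  hosts: kafka_broker
  tasks:
    # ... the tasks that create /export/home/kafusr/kafka/secrets/masterkey.txt ...
    - name: set masterkey content value
      set_fact:
        # single expression, so no intermediate 'contents' fact is needed
        extract_key2: "{{ lookup('file', '/export/home/kafusr/kafka/secrets/masterkey.txt').split('\n').2.split('|').2 | trim }}"

- name: Tasks that need the key
  hosts: kafka_broker
  environment:
    CONFLUENT_SECURITY_MASTER_KEY: "{{ extract_key2 }}"
  tasks:
    - name: Example task that sees the variable
      shell: echo "$CONFLUENT_SECURITY_MASTER_KEY"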
You cannot set_fact a var that depends on another var set in the same set_fact task. Moreover, there is absolutely no need for set_fact here, as long as your relevant tasks can live with an empty environment var until it is fully defined. The following environment declaration (untested) should work and return the key for every task running after your file exists.
environment:
  CONFLUENT_SECURITY_MASTER_KEY: >-
    {{
      (
        (
          lookup('file', '/export/home/kafusr/kafka/secrets/masterkey.txt', errors='ignore')
          | default('')
        ).split('\n').2
        | default('')
      ).split('|').2
      | default('')
      | trim
    }}
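If you would rather read the file on the remote host with slurp (the question in Update 2), keep in mind that slurp is a module, not something you can call inside a Jinja expression. A rough, untested sketch would be to register the slurped content once the file exists and attach the environment to a block of the subsequent tasks:
- name: Read the masterkey from the remote host
  slurp:
    src: /export/home/kafusr/kafka/secrets/masterkey.txt
  register: masterkey_raw

- name: Tasks that need the key
  block:
    - name: Example task that sees the variable
      shell: echo "$CONFLUENT_SECURITY_MASTER_KEY"
  environment:
    CONFLUENT_SECURITY_MASTER_KEY: "{{ (masterkey_raw.content | b64decode).split('\n').2.split('|').2 | trim }}"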
I have an Ansible playbook that has created a file (/tmp/values.txt) on each server with the following content:
1.1.1.1
2.2.2.2
3.3.3.3
4.4.4.4
5.5.5.5
And I have a conf file named etcd.conf at /tmp/etcd.conf, and I need to substitute the value of each line from /tmp/values.txt into it, so /tmp/etcd.conf will look like this:
SELF_IP=1.1.1.1
NODE_1_IP=2.2.2.2
NODE_2_IP=3.3.3.3
NODE_3_IP=4.4.4.4
NODE_4_IP=5.5.5.5
Is there a way I can do this? I used the following lookup method, but it only works on the control node:
- name: Create etcd.conf
  template:
    src: etcd.conf.j2
    dest: /tmp/etcd.conf
  vars:
    my_values: "{{ lookup('file', '/tmp/values.txt').split('\n') }}"
    my_vars: [SELF_IP, NODE_1_IP, NODE_2_IP, NODE_3_IP, NODE_4_IP]
    my_dict: "{{ dict(my_vars|zip(my_values)) }}"
You can use the slurp: task to grab the content for each machine, or use command: cat /tmp/values.txt and then examine the stdout_lines of its register:-ed variable to achieve that same .split("\n") behavior (I believe you'll still need to use that .split call after | b64decode if you use the slurp approach)
What will likely require some massaging is that you'll need to identify the group (which may very well include all) of inventory hostnames for which that command will produce content, and then grab the hostvar out of them to get all values
Conceptually, similar to [ hostvars[hv].tmp_values.stdout_lines for hv in groups[the_group] ] | join("\n"), but it'll be tedious to write out, so I'd only want to do that if you aren't able to get it to work on your own
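For example, a rough, untested sketch of the slurp variant, reading each host's own /tmp/values.txt and keeping the my_dict construction from the question:
- name: Read values.txt from the remote host
  slurp:
    src: /tmp/values.txt
  register: values_raw

- name: Create etcd.conf
  template:
    src: etcd.conf.j2
    dest: /tmp/etcd.conf
  vars:
    my_values: "{{ (values_raw.content | b64decode).splitlines() }}"
    my_vars: [SELF_IP, NODE_1_IP, NODE_2_IP, NODE_3_IP, NODE_4_IP]
    my_dict: "{{ dict(my_vars | zip(my_values)) }}"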
I have an Ansible playbook to which I am passing one variable from the command line. I am trying to append a Windows folder path to it. One way I have found is to put the path in another variable and then join the two variables. I would like to know if it is possible to avoid that extra variable and put the path directly, like this:
"{{ variable2 }} \build\dist\package\ui.msi"
variable1 has value "d:\install"
var_build_file_name is entered by the user
variable2 is formed by combining variable1 and var_build_number.
This is the actual content in the playbook which works:
vars:
  installerFolder: "{{ UploadFolder }}{{ var_build_file_name | regex_replace('(\\.zip)','\') }}"
  packagePath: '\build\dist\package\UI.msi'

- name: Install new version.
  debug:
    msg: "{{ installerFolder }}{{ packagePath }}"
And this is the playbook command:
ansible-playbook Install.yml -i ../inventory/hosts.ini --extra-vars "target=servername var_build_file_name=16.3.0.zip UploadFolder=D:\Install\\"
The output I am getting is:
"msg": "D:\\Install\\16.3.0\\build\\dist\\package\\UI.msi"
In the above output, how come two backslashes are showing instead of one?
Is it possible to do
msg: "{{ installerFolder }} '\build\dist\package\UI.msi' "
I have tried many combinations, but the backslashes are not getting properly escaped for the above. If it is not possible, can someone explain the reason?
Thanks.
"msg": "D:\\Install\\16.3.0\\build\\dist\\package\\UI.msi"
It is the output of the debug module, which is a JSON string, so every \ is escaped.
The actual value of the msg here is D:\Install\16.3.0\build\dist\package\UI.msi, as you expect it.
And you can definitely use this syntax: msg: '{{ installerFolder }}\build\dist\package\UI.msi'
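For example, a minimal sketch (untested; installerFolder as defined in the question):
- name: Install new version.
  debug:
    msg: '{{ installerFolder }}\build\dist\package\UI.msi'
Single-quoted YAML scalars do not treat backslashes as escape characters, so the path comes through literally.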
I'm trying to overwrite a previously defined variable my_var (when it's set to LATEST) by loading a value from a file (containing, say, NEWVALUE).
- name: Load from file
  vars:
    my_var: "{{ lookup('file', '~/file.txt') }}"
    my_var2: "{{ lookup('file', '~/file.txt') }}"
  debug: msg="my_var is {{ my_var }} my_var2 is {{ my_var2 }}"
  when: "{{ my_var=='LATEST' }}"
This prints
ok: [host] => {
    "msg": "my_var is LATEST my_var2 is NEWVALUE"
}
So I feel I've verified that I'm loading the value correctly, but for some reason I can't store the result of the lookup in a previously set variable. Disabling the when clause doesn't seem to make any difference.
Should I be able to do this? As an alternative I'm going to use a third variable and just set it to either the pre-existing value or the value from the file, but this seems like an unnecessary step to me.
Ansible version 2.1.0.0 b.t.w.
The vars you defined in your example are only available to the single debug task.
As you mentioned in the comment, you figured this out and used set_fact instead. And yes, this won't work if you have passed the same variable as an extra var, as extra vars have the highest precedence. There is no way to override a variable you passed as an extra var.
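A minimal sketch of that set_fact approach (untested; the condition and file path are taken from the question):
- name: Override my_var from file when it is LATEST
  set_fact:
    my_var: "{{ lookup('file', '~/file.txt') }}"
  when: my_var == 'LATEST'

- name: Show the result
  debug:
    msg: "my_var is {{ my_var }}"
As noted above, this still cannot override a value passed as an extra var.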
I'm trying to transfer public keys to Linux servers using Ansible and its authorized_key module. I thought I would use a lookup to read the content of a file and combine it with with_items.
- name: ensure deployment keys are in authorized keys
  authorized_key: user={{ sshaccess_user }} key="{{ lookup('file', '{{ item }}') }}"
  with_items: sshaccess_keys
And sshaccess_keys is defined:
sshaccess_keys:
  - ~/.ssh/id_rsa.pub
Obviously, I would like to append more than one. The error I get is
fatal: [testbox] => could not locate file in lookup: {{ item }}
It seems to treat the literal string {{ item }} as the filename instead of using the value of item?
When using with_items inside a lookup you want to reference the item variable directly rather than wrapping it in another set of moustaches. Change the lookup to key="{{ lookup('file', item) }}" and it looks like this should work.
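A sketch of the corrected task (untested; the key=value parameter style is kept from the question):
- name: ensure deployment keys are in authorized keys
  authorized_key: user={{ sshaccess_user }} key="{{ lookup('file', item) }}"
  with_items: "{{ sshaccess_keys }}"
You can then append additional public key paths to the sshaccess_keys list.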