I know how to use the acl module in Ansible. It works like a charm, but not exactly the way I want.
I have a log_dir variable with the exact path to the log files. My goal is to set an ACL on the files and on their parent directories, up to a base directory.
For example:
Log file: /some/highly/fancy/secured/file
Log path: /some/highly/fancy/secured
Now I want an ACL up to /some but not to (for example):
/some/otherDirectory or /some/highly/fancy/A/file
Do you know how to handle this?
Feels super hacky, but something like this would work. I do hope there's a more elegant solution though.
vars:
  file: /some/highly/fancy/secured/file
tasks:
  - acl:
      path: "/{{ file.split('/')[1:index+2] | join('/') }}"
      # <snip>
    loop: "{{ file.split('/')[1:] }}"
    loop_control:
      index_var: index
The basic idea is to split the file path into a list to figure out how many times to loop. Inside the loop, split the file path into a list again, slice it from the base folder up to the loop index, and join it back into a path. We skip the first entry in the list because it is blank (the path starts with a slash), so the index has to be adjusted in the slice.
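For the example path above, the slice grows by one segment per iteration, so the loop applies the ACL to exactly these paths:

/some
/some/highly
/some/highly/fancy
/some/highly/fancy/secured
/some/highly/fancy/secured/file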
I currently pass parameters to a script that an Ansible Tower job template/role calls. To make it more user friendly, I have decided to use a survey for this. The script takes filenames as parameters.
The script accepts the parameters in this format
'file1.txt','file2.txt','file3.txt'
However, here is the problem
There could be one file, or two, or three, up to possibly five.
I have thought about a solution, and I think the best design is a comma-delimited list of files coming from the Ansible survey, for example file1, file2, file3.
How can I split that list and loop over it to copy the files one by one when more than one file is provided, and then build a variable that adds single quotes and commas to the file list? For example, the survey values file1.txt, file2.txt, file3.txt would be transformed into a variable containing:
'file1.txt','file2.txt','file3.txt'
The other issue is this:
The Ansible role copies the given file into a directory. I know the split function can be used to split a comma-separated list, but how can I then copy the files into a folder in a loop? The example below only works for a single file.
EDIT:
I have looked at the split function and combined it with a loop, but I get an error when I run it: "Template error while templating string".
---
- name: Set file name
  set_fact:
    file1: "file1.txt"
    file_list: "file1.txt"

- name: Set working directory
  set_fact:
    standard_path: "{{ ansible_user_dir }}\\execution"
    content_file: "{{ standard_path }}\\{{ file1 }}"

- name: Copy file to working directory
  win_copy:
    src: "file1.txt"
    dest: "{{ content_file }}"

- name: Set parameters for script
  set_fact:
    params: "-filenames '{{ content_file }}'"

- name: Run a loop to copy the files
  win_copy:
    src: "{{ item }}"
    dest: "{{standard_path \\ item }}"
  with_items: "{{ file_list.split(',') }}"
The section of the code that was throwing the exception has now been fixed.
Jinja2 requires the variables to be in separate {{ }} expressions, as seen below.
- name: Run a loop to copy the files
  win_copy:
    src: "{{ item }}"
    dest: "{{ standard_path }}\\{{ item }}"
  with_items: "{{ file_list.split(',') }}"
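For the remaining part of the question, turning the survey value into the 'file1.txt','file2.txt','file3.txt' string the script expects, a filter chain can do the quoting and joining in one go. A minimal sketch (the trim/regex_replace chain is my own suggestion, not from the original post):

- name: Build the quoted filename string for the script
  set_fact:
    params: >-
      -filenames {{ file_list.split(',')
                    | map('trim')
                    | map('regex_replace', '^(.*)$', "'\g<1>'")
                    | join(',') }}

Each element is trimmed, wrapped in single quotes, and the results are joined with commas, so file1.txt, file2.txt becomes 'file1.txt','file2.txt'.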
In my Ansible Playbook I'd like to have a task that removes old files and folders from the application's directory. The twist to this otherwise simple task is that a few files or folders need to remain. Imagine something like this:
/opt/application
- /config
  - *.properties
  - special.yml
- /logs
- /bin
- /var
  - /data
- /templates
Let's assume I'd like to keep /logs completely, /var/data and from /config I want to keep special.yml.
(I cannot provide exact code at the moment because I left work frustrated by this and, after cooling down, I am now writing up this question at home)
My idea was to have two lists of exclusions, one holding the folders and one the files. Then I'd use the find module to first get the folders in the application's directory into a variable, and do the same for the remaining files into another variable. Afterwards I wanted to remove every folder and file that is not in the exclusion lists using the file module.
(Pseudo-YAML, because I'm not yet fluent enough in Ansible to whip up a properly structured example; it should be close enough though)
file:
  path: "{{ item.path }}"
  state: absent
with_items: "{{ found_files_list.files }}"
when: well, that is the big question
What I can't figure out is how to properly construct the when clause. Is it even possible like this?
Strictly speaking, when is a task-level keyword rather than part of the file module, so it can be combined with it; the tricky part is the condition itself.
But you can probably achieve what you need as follows:
- name: Find all files in /opt/application, excluding logs, data, and config
  find:
    paths: /opt/application
    excludes: 'logs,data,config'
  register: files_to_delete

- name: Ansible remove file glob
  file:
    path: "{{ item.path }}"
    state: absent
  with_items: "{{ files_to_delete.files }}"
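One detail worth flagging: find returns only files by default, so the top-level directories themselves (/bin, /templates, ...) would not appear in the results. If the directories should be removed as well, adding file_type: any (a standard find parameter) covers both; a sketch:

- name: Find all files and directories in /opt/application, excluding logs, data, and config
  find:
    paths: /opt/application
    excludes: 'logs,data,config'
    file_type: any   # return directories as well as files
  register: files_to_delete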
I hope this is what you need.
First use the find module like you said to get a total list of all files and directories. Register to a variable like all_objects.
- name: Get list of all files and directories recursively
  find:
    paths: /opt/application/
    recurse: yes
    file_type: any   # directories too, not just files
  register: all_objects
Then manually make a list of things you want to keep.
vars:
  keep_these:
    # full paths, since find reports absolute paths
    - /opt/application/logs
    - /opt/application/var/data
    - /opt/application/config/special.yml
Then this task should delete everything except things in your list:
- name: Delete all files and directories except exclusions
  file:
    path: "{{ item.path }}"
    state: absent   # state: absent removes directories recursively on its own
  with_items: "{{ all_objects.files }}"
  when: item.path not in keep_these
I think this general strategy should work... the only thing I'm not sure about is the exact nesting hierarchy of the registered variable from the find module. You might have to play around with the debug module to get it exactly right.
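One gap in the exact-match when above: find lists the files inside /opt/application/logs as individual results, and those paths are not literally in keep_these, so they would still be deleted; conversely, removing /opt/application/var outright would take /var/data with it. A rough containment test handles both directions; a sketch (assuming keep_these holds full paths as above; the in and search tests need a reasonably recent Jinja2/Ansible):

- name: Delete everything that is not under a kept path
  file:
    path: "{{ item.path }}"
    state: absent
  loop: "{{ all_objects.files }}"
  # skip anything below a kept path, and skip ancestors of kept paths
  # (deleting /opt/application/var would take /var/data with it)
  when: >-
    keep_these | select('in', item.path) | list | length == 0
    and keep_these | select('search', item.path) | list | length == 0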
I'm writing an Ansible role where I have some templates that must be present multiple times with different names in a single destination directory. In order not to have to handle each of these files separately I would need to be able to apply templating or some other form of placeholder substitution also to their names. To give a concrete example, I might have a file named
{{ Client }}DataSourceContext.xml
which I need to change into, say,
AcmeDataSourceContext.xml
I have many files of this kind that have to be installed in different directories, but all copies of a single file go to the same directory. If I didn't need to change their names or duplicate them I could handle a whole bunch of such files with something like
- name: Process a whole subtree of templates
  template:
    src: "{{ item.src }}"
    dest: "/path/to/{{ item.path }}"
  with_filetree: ../templates/my-templates/
  when: item.state == 'file'
I guess what I'd like is a magic consider_filenames_as_templates toggle that turned on filename preprocessing. Is there any way to approximate this behaviour?
Pretty much anywhere you can put a literal value in Ansible, you can instead substitute the value of a variable. So for example, you could do something like this:
- template:
    src: sometemplate.xml
    dest: "/path/to/{{ item }}DataSourceContext.xml"
  loop:
    - client1
    - client2
This would end up creating the templated files /path/to/client1DataSourceContext.xml and /path/to/client2DataSourceContext.xml.
Update 1
For the question you've posed in your update:
I guess what I'd like is a magic consider_filenames_as_templates toggle that turned on filename preprocessing. Is there any way to approximate this behaviour?
It seems like you could just do something like:
- name: Process a whole subtree of templates
  template:
    src: "{{ item.src }}"
    dest: "/path/to/{{ item.path.replace('__client__', client_name) }}"
  with_filetree: ../templates/my-templates/
  when: item.state == 'file'
That is, replace the string __client__ in your filenames with the value of the client_name variable.
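Concretely, that means naming the source files with a placeholder token instead of Jinja braces (braces in filenames are awkward on most filesystems). A hypothetical layout:

templates/my-templates/
  conf/__client__DataSourceContext.xml

With client_name: Acme, the task above would render that file to /path/to/conf/AcmeDataSourceContext.xml.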
I need to create a single file with the contents of a single fact in Ansible. I'm currently doing something like this:
- template: src=templates/git_commit.j2 dest=/path/to/REVISION
My template file looks like this:
{{ git_commit }}
Obviously, it'd make a lot more sense to just do something like this:
- inline_template: content={{ git_commit }} dest=/path/to/REVISION
Puppet offers something similar. Is there a way to do this in Ansible?
Another option to the lineinfile module (as given by udondan's answer) would be to use the copy module and specify the content rather than a source local to the Ansible host.
An example task would look something like:
- name: Copy commit ref to file
  copy:
    content: "{{ git_commit }}"
    dest: /path/to/REVISION
I personally prefer this to lineinfile: to my mind, lineinfile is for making slight changes to files that are already there, whereas copy is for making sure a file is in place and looks exactly like you want it to. It also has the benefit of coping with multi-line content.
In reality, though, I'd be tempted to make this a template task and just have the template file be:
"{{ git_commit }}"
Which gets created by this task:
- name: Copy commit ref to file
  template:
    src: path/to/template
    dest: /path/to/REVISION
It's cleaner and it's using modules for exactly what they are meant for.
Yes, in that simple case it is possible with the lineinfile module.
- lineinfile:
    dest: /path/to/REVISION
    line: "{{ git_commit }}"
    regexp: ".*"
    create: yes
The lineinfile module is usually used to ensure that a specific line is contained in a file. The create: yes option creates the file if it does not exist. The regexp: ".*" option makes sure the line is replaced rather than appended when git_commit changes; by default the module would simply add the new content to the file instead of replacing the previous content.
This only works because you have a single line in the file. If you had more lines, this would obviously not work with this module.
This issue seems to be resolved. However, if the template uses more than one variable, e.g. a JSON file, it is possible to use the copy module with the content parameter and a template lookup, e.g.:
# playbook.yml
---
- name: deploy inline template
  copy:
    content: '{{ lookup("template", "inlinetemplate.yml.j2") }}'
    dest: /var/tmp/inlinetempl.yml
# inlinetemplate.yml.j2
---
- name: {{ somevar }}
  abc: def
If you need to insert the template into an existing file, you can do so with the lineinfile module.
- name: Insert jinja2 template to the file
  lineinfile:
    path: /path/file.conf
    insertafter: "after this line"
    line: "{{ lookup('template', 'template.conf.j2') }}"
I'm fairly new to Ansible and I'm trying to create a role that copies a file to a remote server. The local file can have a different name every time I'm running the playbook, but it needs to be copied to the same name remotely, something like this:
- name: copy file
  copy:
    src: '*.txt'
    dest: /path/to/fixedname.txt
Ansible doesn't allow wildcards there, so while the tasks were still in the main playbook I could do:
- name: find the filename
  connection: local
  shell: "ls -1 files/*.txt"
  register: myfile

- name: copy file
  copy:
    src: "files/{{ item }}"
    dest: /path/to/fixedname.txt
  with_items: "{{ myfile.stdout_lines }}"
However, when I moved the tasks to a role, the first task didn't work anymore, because the relative path is resolved relative to the role, while the playbook executes from the directory that contains 'roles'. I could hardcode the path to the role's files dir, but is there a more elegant way?
It looks like you need a task that looks up information locally and then uses that information as input to the copy module.
There are two ways to get local information:
1. Use local_action:. That's shorthand for running the task against 127.0.0.1; more info found here. (This is what you've been using.)
2. Use a lookup. This is a plugin system specifically designed for getting information locally. More info here.
In your case, I would go for the second method, using lookup. You could set it up like this example:
vars:
  local_file_name: "{{ lookup('pipe', 'ls -1 files/*.txt') }}"
tasks:
  - name: copy file
    copy: src="{{ local_file_name }}" dest=/path/to/fixedname.txt
Or, more directly:
tasks:
  - name: copy file
    copy: src="{{ lookup('pipe', 'ls -1 files/*.txt') }}" dest=/path/to/fixedname.txt
With regards to paths:
The lookup plugin is run from the context of the task (playbook vs. role). This means it will behave differently depending on where it's used.
In the setup above, the tasks are run directly from a playbook, so the working dir will be:
/path/to/project -- this is the folder where your playbook is.
If you were to add the task to a role, the working dir would be:
/path/to/project/roles/role_name/tasks
In addition, the file and pipe plugins run from within the role/files folder if it exists:
/path/to/project/roles/role_name/files -- this means your command is ls -1 *.txt
Caveat:
The plugin is called every time you access the variable. This means you cannot trust debugging the variable in your playbook and then rely on it having the same value when used later in a role!
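If you need the value to stay stable for the rest of the play, one option (a sketch, not the only way) is to pin it once with set_fact, which evaluates the lookup a single time and stores the literal result:

- name: Evaluate the lookup once and freeze the result as a fact
  set_fact:
    local_file_name: "{{ lookup('pipe', 'ls -1 files/*.txt') }}"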
I do wonder, though, about the use-case for a file that resides inside a project's Ansible folders but whose name is not known in advance. Where does such a file come from? Isn't it possible to add a layer between the generation of the file and its use in Ansible... or to have a fixed local path as a variable? Just curious ;)
Just wanted to throw in an additional answer... I have the same problem as you: I build an Ansible bundle on the fly and copy artifacts (RPMs) into a role's files folder, and my RPMs have versions in the filenames.
When I run the Ansible play, I want it to install all RPMs, regardless of their filenames.
I solved this by using the with_fileglob mechanism in ansible:
- name: Copy RPMs
  copy: src="{{ item }}" dest="{{ rpm_cache }}"
  with_fileglob: "*.rpm"
  register: rpm_files

- name: Install RPMs
  yum: name={{ item }} state=present
  with_items: "{{ rpm_files.results | map(attribute='dest') | list }}"
I find it a little bit cleaner than the lookup mechanism.
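One reason this works so neatly from inside a role: with_fileglob searches the role's files/ directory first, so a bare "*.rpm" pattern picks up the bundled artifacts without any path juggling.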