Using win_environment, it is possible to add or remove environment variables on a Windows host.
But to modify variables that are already there, win_environment does not seem useful, as you can't read the old value in order to modify and update a variable. Right?
EDIT: Since Ansible 2.3, the win_path module does all the heavy lifting for you. Just give it a list of items that should be present in the path and it'll make sure they're present and in the relative order you specified.
(if you're still using an ancient version of Ansible, the following is still the way to go)
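For example, a minimal sketch of a win_path task (the directory shown is just a placeholder):

- name: Ensure C:\newpath is on the machine PATH
  win_path:
    elements:
      - 'C:\newpath'
    state: present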
To get this to work sanely, you'll want to combine a regex_replace and a search filter so that you only make the change if the value you want isn't already there. For instance (this is for Ansible 1.9):
- raw: echo %PATH%
  register: path_out

- win_environment:
    name: path
    value: "{{ path_out.stdout | regex_replace('[\r\n]*', '') + ';C:\\\\newpath' }}"
    state: present
    level: machine
  when: not (path_out.stdout | search("(?i)c:\\\\newpath"))
This is a lot harder than it should be. I've got half a mind to hack up a win_path module for 2.0 to make it easier...
For 2.0, raw runs under PowerShell, so you'd want Get-Item env:PATH instead.
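For instance, under 2.0 the first task above would become (a minimal sketch, using the $ENV:PATH form the later answers here also use):

- raw: $ENV:PATH
  register: path_out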
I just spent some hours fighting with Ansible, Jinja2, and JSON backslash hell and finally found a generic solution for this, i.e. one that lets you add ANY directory to the system path and won't add the same path twice. I adapted Devis' solution but made both the SETX command and the when: clause accept (the same) {{ item }}, so it could be parameterized. Here's what I came up with.
Save this as extend-path.yml:
---
- name: Get current machine PATH.
  raw: $ENV:PATH
  register: path_out

- name: "Add {{ item }} to PATH."
  raw: SETX /M PATH "$ENV:PATH;{{ item }}"
  when: "not (path_out.stdout | urlencode | search( '{{ item | urlencode }}' ) )"
  changed_when: true
And then, for example, in your playbook.yml:
---
tasks:
  - name: Add tools to PATH.
    include: extend-path.yml
    with_items:
      - C:\bin
      - C:\Program Files (x86)\CMake\bin
      - C:\Program Files\git\cmd
(As you see, I actually lost the backslash war and decided to bypass it entirely by using urlencode.)
Try this with Ansible 2.0:
- name: Get actual PATH
  raw: $ENV:PATH
  register: path_out
  tags: path

- name: Add Notepad++ to PATH
  raw: SETX /M PATH "$ENV:PATH;C:\Program Files (x86)\Notepad++"
  when: path_out.stdout.find('Notepad') == -1
  tags: path
Here is an example that adds msbuild to the machine path; you could add more items if needed. It's important to retrieve only the machine path before modifying the machine path. If you just call $ENV:PATH, you get the machine path combined with the user path, and using that to set the machine path would copy all your user path entries into the machine path, which I'm assuming is not what you want.
- name: Get System PATH
  raw: '[Environment]::GetEnvironmentVariables("Machine").Path'
  register: path_out

- name: Modify System PATH
  raw: SETX /M PATH "$([Environment]::GetEnvironmentVariables("Machine").Path | Out-String);{{ item }}"
  when: path_out.stdout.find(item) == -1
  with_items:
    - 'C:\Program Files (x86)\MSBuild\14.0\Bin'
You can use PowerShell to add a string to the Path. The code below adds a given path to the PATH variable while ensuring PATH isn't modified if the given path is already present in it.
$env = [Environment]::GetEnvironmentVariable('path','machine') -split ';'
$msdeploypath = 'C:\Program Files\IIS\Microsoft Web Deploy'
if ($env -notcontains $msdeploypath) {
    $env += $msdeploypath
    [Environment]::SetEnvironmentVariable('path', ($env -join ';'), 'machine')
    Write-Host "changed"
}
In Ansible 2 you can also use the raw module for this, as it runs PowerShell:
- name: Set Path
  raw: $env = [Environment]::GetEnvironmentVariable('path','machine') -split ';' ; $msdeploypath = 'C:\Program Files\IIS\Microsoft Web Deploy' ; if ($env -notcontains $msdeploypath) { $env += $msdeploypath ; [Environment]::SetEnvironmentVariable('path', ($env -join ';'), 'machine') ; Write-Host "changed" }
  register: pathchange
  changed_when: pathchange.stdout.find('changed') != -1
Casey's solution is pretty close. The only problem is that [Environment]::SetEnvironmentVariable adds a newline at the end of the PATH, so when you append to it, all your new values end up on another line and the PATH no longer works. Here's what I did, and it works pretty well.
I just needed to add a split on newlines; then the system PATH variable gets set correctly.
It's a combination of Casey's solution and Chris Hillery's, in a file called extend-path.yml:
---
- name: Get current machine PATH.
  raw: "$([Environment]::GetEnvironmentVariables(\"Machine\").Path -split '\r\n')"
  register: path_out

- name: Print Out PATH
  debug:
    msg: "PATH: {{ path_out }}"

- name: "Add {{ item }} to PATH."
  raw: SETX /M PATH "$($([Environment]::GetEnvironmentVariables("Machine").Path -split '\r\n'));{{ item }}"
  when: path_out.stdout.find(item) == -1
  changed_when: true
Then to call it, in your playbook:
- name: Update system PATH
  include: tasks/win_system_path.yml
  with_items:
    - C:\Program Files\Git\bin
How to delete the oldest directory with Ansible?
Suppose I have the following tree structure:
Parent Directory
-Dir2020-05-20
-Dir2020-05-21
-Dir2020-05-22
-Dir2020-05-23
Now, every time the Ansible playbook is run, it should delete the oldest directory. For example, it should delete Dir2020-05-20 on its first run, if we take its creation date to be 2020-05-20.
The age attribute of the find module does not seem helpful, as I run this playbook at random times and I want to keep only a limited number of these directories.
Just assign dir_path to the path of your parent directory where all these directories are present:
---
- hosts: localhost
  vars:
    dir_path: "/home/harshit/ansible/test/"  # parent directory path, make sure it ends with a slash
  tasks:
    - name: find oldest directory
      shell:
        cmd: "ls -tdr */ | head -n 1"
        chdir: "{{ dir_path }}"
      register: dir_name_to_delete

    - name: "delete oldest directory: {{ dir_path }}{{ dir_name_to_delete.stdout }}"
      file:
        state: absent
        path: "{{ dir_path }}{{ dir_name_to_delete.stdout }}"
Considering that a recommended practice is to avoid the shell and command modules wherever possible, I suggest a pure Ansible solution for this case:
- name: Get directory list
  find:
    paths: "{{ target_directory }}"
    file_type: directory
  register: found_dirs

- name: Get the oldest dir
  set_fact:
    oldest_dir: "{{ found_dirs.files | sort(attribute='mtime') | first }}"

- name: Delete oldest dir
  file:
    state: absent
    path: "{{ oldest_dir.path }}"
  when:
    - found_dirs.files | count > 3
There are two ways to know how many files were found with the find module: either using its matched return value, as in when: found_dirs.matched > 3, or using the count filter. I prefer the latter because I use this filter in a lot of other cases, so it's just a habit.
For your reference, Ansible has a whole bunch of useful filters (e.g. I used count and sort here, but there are dozens of them). One does not need to remember the filter names, of course; just keep in mind that they exist and might be useful in many cases.
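For reference, the same guard written with the matched return value instead of the count filter would be:

- name: Delete oldest dir
  file:
    state: absent
    path: "{{ oldest_dir.path }}"
  when:
    - found_dirs.matched > 3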
One of my Ansible roles allows users to execute a set of tests against a target host, leaving minimal impact, removing all traces of the test.
One of the surprisingly difficult things to determine is the absolute path of a file on the host running the playbook. Resolving files for copy seems to be really complicated, since Ansible works through a priority list of locations when trying to resolve a file.
As part of my debugging information emitted when tests fail, I want to specify the absolute path of the file on the local host and the absolute path of the file on the remote host. The remote host is easy enough to query, but I can't seem to find a way to determine where the source file is coming from on the local host.
Here's my task:
- copy: src={{ goss_file }} dest=/tmp/goss/ mode=0644
goss_file is specified by the user as a role variable. What I'd like to do is determine the full absolute path of goss_file on the local machine running Ansible against remote hosts.
Is there a way to do this? It would significantly help in the debugging I've been doing.
Here is a snippet that I've used to find the template files (in this case) in our goss testing solution:
- name: save goss files list
  set_fact:
    goss_files: "{{ lookup('fileglob', '../roles/' ~ role_name ~ '/templates/*.j2', wantlist=True) | sort }}"

- name: print
  debug:
    msg:
      - "{{ goss_files }}"

- name: Copy goss test from template to remote
  template:
    src: "{{ item }}"
    dest: /tmp/{{ item | basename | regex_replace('\.j2','.yml') }}
    mode: '0644'
  with_items:
    - "{{ goss_files }}"
I also had to create an empty files folder in the "root" of ansible.
You can see the full solution here: Using goss and Ansible templates for System Testing.
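As for the original question of locating the source file on the control machine, one option (a sketch, not part of the solution above) is the first_found lookup, since lookups always execute on the control host and walk a search path similar to the one copy uses (the role's files/ dir, the playbook dir, etc.):

- name: Show where goss_file resolves on the control machine
  debug:
    msg: "{{ lookup('first_found', goss_file) }}"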
I have a list of files in an Ansible playbook and I want to perform the following tasks on them:
Copy each file from my control machine to the managed node.
If and only if the current file has changed, run mycmd on the managed node with the filename as an argument (e.g. mycmd --file myfile).
I only want to run mycmd on files which have changed because it involves some expensive (relative to the other tasks) API calls.
I know how to copy files from a list (the copy module + with_items), but what I can't work out is how to run mycmd on each file that has been copied, but only if that file has changed. notify doesn't seem appropriate because the full command varies between files.
What's the best way to achieve this result, assuming that it's possible in Ansible?
Register copy's result and iterate over it:
- copy:
    src: "{{ item }}"
    dest: /tmp/
  with_items:
    - fff1.txt
    - fff2.txt
  register: copy_res

- command: echo {{ item.dest }}
  with_items: "{{ copy_res.results | select('changed') | list }}"
  loop_control:
    label: "{{ item.item }}"
I used select('changed') to reduce the list to only the changed items, and defined label to make the output more human-readable.
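Adapted to the question's own command (a sketch; mycmd --file is the question's placeholder), the second task would look like:

- command: mycmd --file {{ item.dest }}
  with_items: "{{ copy_res.results | select('changed') | list }}"
  loop_control:
    label: "{{ item.item }}"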
I have the following inside an Ansible 2.3.0.0 playbook:
- name: Disable SSL2, SSL3, RC4. Activate TLS
  win_regedit:
    path: 'HKLM:\System\CurrentControlSet\Control\SecurityProviders\SCHANNEL\{{ item.path }}'
    name: "{{ item.name }}"
    data: "{{ item.data }}"
    type: dword
  with_items:
    # more items working correctly
    - { path: "Ciphers\\RC4 128/128", name: 'Enabled', data: 0 }
    - { path: "Ciphers\\RC4 40/128", name: 'Enabled', data: 0 }
    - { path: "Ciphers\\RC4 56/128", name: 'Enabled', data: 0 }
I've tried every combination of quotes and slashes I could think of to escape the /, and it still either throws a syntax error or treats the last 128 as another folder of the registry path rather than as part of the key name.
Is there any way to make Ansible take that 128/128 literally and not as part of a path?
Sorry, but you are out of luck with win_regedit and forward slashes.
win_regedit uses PowerShell and Get-ItemProperty and friends under the hood.
And PowerShell treats the forward slash character as a level separator, whether you escape it or not.
You can google for some ways to overcome this in PowerShell (example1, example2).
But with the win_regedit Ansible module you can't use those tricks.
So either write your own PowerShell script with the tricks from the above articles and run it with the script module, or prepare a registry template and use the win_regmerge module (it uses reg.exe under the hood) to import the required settings.
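For the win_regmerge route, a minimal sketch (C:\temp\rc4-ciphers.reg is a hypothetical .reg file you would prepare and copy over first):

- name: Import RC4 cipher keys from a prepared registry template
  win_regmerge:
    path: C:\temp\rc4-ciphers.reg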
Thanks to #KonstantinSuvorov I've found a workaround that, although ugly, works. Perform this step to create the registry keys directly with PowerShell before the win_regedit:
- win_shell: |
    $path = New-Item -Path 'HKLM:\System\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers'
    $key = (Get-Item HKLM:\).OpenSubKey("System\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers", $true)
    $key.CreateSubKey('RC4 128/128')
    $key.CreateSubKey('RC4 40/128')
    $key.CreateSubKey('RC4 56/128')
    $key.Close()
As part of most of the Ansible playbooks I need to install Node and Mongo from internally hosted tarballs. Sudo privileges and internet access are not available. All of the Ansible runs happen against localhost.
One of the problems of this setup is that after untarring node/mongo, they need to be added to PATH or subsequent roles/tasks won't be able to rely on them. Unfortunately, I don't seem to be able to find a way to amend PATH within an Ansible playbook run.
I've tried using shell and command tasks to export PATH and source .bashrc; neither of those seems to help. Is there a way to use my Node installation within the same playbook? A yum task seems to do the trick, but it's not available to me now.
Have you tried using 'environment'?
You can get your local PATH into a variable
environment:
  PATH: "{{ lookup('env', 'PATH') }}"
or you can set the PATH
environment:
  PATH: "{{ node_path }}:{{ mongo_path }}:{{ lookup('env', 'PATH') }}"
The above assumes you can register the path to mongo & Node as vars, and make them available to later plays.
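A minimal sketch of that idea (the unpack locations below are assumptions, not known paths):

- set_fact:
    node_path: /opt/node/bin
    mongo_path: /opt/mongo/bin

- name: Check that node is now resolvable
  command: node --version
  environment:
    PATH: "{{ node_path }}:{{ mongo_path }}:{{ lookup('env', 'PATH') }}"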
Info on using environment & PATH locally and remotely is here:
https://serverfault.com/questions/577188/how-can-i-prepend-to-path-while-running-ansibles-pip-module
- hosts: localhost
  gather_facts: False
  vars:
    path1: "{{ lookup('env', 'PATH') }}"
  tasks:
    - shell: echo $PATH
      environment:
        PATH: 'mypath2'
      register: path2

    - shell: echo $PATH
      environment:
        PATH: 'mypath3'
      register: path3

    - shell: echo $PATH
      environment:
        PATH: "{{ path1 }}"
      register: path4

    - debug: msg={{path1}}
    - debug: msg={{path2}}
    - debug: msg={{path3}}
    - debug: msg={{path4}}
    - debug: msg={{lookup('env', 'PATH')}}