is there a way to do:
export PATH=$PATH:/new/path/to/bin
in Ansible, if possible without using shell or command.
I've tried this
- name: Add another bin dir to system-wide $PATH.
  copy:
    dest: /etc/profile.d/custom-path.sh
    content: 'PATH=$PATH:{{ my_custom_path_var }}'
That I got from:
https://www.jeffgeerling.com/comment/reply/2799
But it doesn't work: PATH ends up as
$PATH:/new/path/to/bin
which breaks the system's PATH.
Thanks!
Using shell or command would be:
- name: Add pm2 to PATH
  shell: echo "PATH=$PATH:/new/path/to/bin" > /etc/environment
  become: true
But I'd still prefer an option that doesn't use shell/command.
Solution without using the shell module
Instead of
content: 'PATH=$PATH:{{ my_custom_path_var }}'
use
content: 'export PATH=$PATH:{{ my_custom_path_var }}'
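Putting it together, the task from the question becomes (a minimal sketch; my_custom_path_var is assumed to be defined elsewhere, as in the question):

- name: Add another bin dir to system-wide $PATH.
  copy:
    dest: /etc/profile.d/custom-path.sh
    content: 'export PATH=$PATH:{{ my_custom_path_var }}'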
Related
I'm relatively new to ansible, so apologies if this question misses something.
My goal is to add a line to the ~/.bashrc file with ansible. I think the best way to do that is with the ansible.builtin.lineinfile module.
Unfortunately, when I run the module it appears to run properly on the target host machine and reports 'changed' on the first run (and 'ok' on subsequent runs), but no changes are actually made to the ~/.bashrc file.
Appreciate any help in figuring out what changes would be needed to create the desired outcome.
---
- hosts: setup
  become: true
  vars_files:
    - /etc/ansible/vars.yml
  tasks:
    - name: Test lineinfile
      ansible.builtin.lineinfile:
        path: ~/.bashrc
        line: "test lineinfile"
Changed path: ~/.bashrc to path: .bashrc and it worked.
Multiple lines could be done this way:
- name: Add an environment variable to the remote user's shell.
  lineinfile:
    dest: "~/.bashrc"
    line: |
      Line 1
      Line 2
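If the goal is to manage several lines as a single unit, the ansible.builtin.blockinfile module may be a better fit than lineinfile; a minimal sketch with placeholder content:

- name: Add a block of lines to the remote user's shell config
  ansible.builtin.blockinfile:
    path: ~/.bashrc
    block: |
      Line 1
      Line 2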
I am using ansible copy module to copy a file. The linter tells me "File permissions unset or incorrect" and I don't understand the problem.
Here is the ansible task
- name: Ensure some.txt is there
  copy:
    src: some.txt
    dest: "{{ some_path }}/some.txt"
    force: false
    mode: '644'
Where some.txt is a file that exists in the ansible/files directory.
I have also tried with mode: 0644 but no luck.
My ansible version:
$ ansible-lint --version
ansible-lint 5.0.7 using ansible 2.10.8
Restarting the editor fixed it, but that's not really an answer.
Any ideas?
risky-file-permissions
File permissions unset or incorrect
Missing or unsupported mode parameter can cause unexpected file permissions based on version of Ansible being used. Be explicit, like mode: 0644 to avoid hitting this rule. Special preserve value is accepted only by copy, template modules. See https://github.com/ansible/ansible/issues/71200
Reference: https://ansible-lint.readthedocs.io/en/latest/default_rules.html
If you're using a module that requires the mode parameter, set the mode and the error will vanish.
You just need to specify mode: 0644 in your playbook and it will be all good.
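For reference, the task from the question with an explicit mode might look like this; quoting the value ('0644') keeps YAML from reinterpreting the leading-zero octal literal:

- name: Ensure some.txt is there
  copy:
    src: some.txt
    dest: "{{ some_path }}/some.txt"
    force: false
    mode: '0644'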
I'm trying to run a python script from an ansible script. I would think this would be an easy thing to do, but I can't figure it out. I've got a project structure like this:
playbook-folder
  roles
    stagecode
      files
        mypythonscript.py
      tasks
        main.yml
  release.yml
I'm trying to run mypythonscript.py within a task in main.yml (which is a role used in release.yml). Here's the task:
- name: run my script!
  command: ./roles/stagecode/files/mypythonscript.py
  args:
    chdir: /dir/to/be/run/in
  delegate_to: 127.0.0.1
  run_once: true
I've also tried ../files/mypythonscript.py. I thought the path for ansible would be relative to the playbook, but I guess not?
I also tried debugging to figure out where I am in the middle of the script, but no luck there either.
- name: figure out where we are
  stat: path=.
  delegate_to: 127.0.0.1
  run_once: true
  register: righthere

- name: print where we are
  debug: msg="{{ righthere.stat.path }}"
  delegate_to: 127.0.0.1
  run_once: true
That just prints out ".". So helpful ...
Try using the script module; it works for me.
my main.yml
---
- name: execute install script
  script: get-pip.py
and the get-pip.py file should be in the files directory of the same role
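In other words, with a role layout like the sketch below (the role name is just an example), the script module picks get-pip.py up from the role's files directory without any path prefix:

roles
  myrole
    files
      get-pip.py
    tasks
      main.yml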
If you want to be able to use a relative path to your script rather than an absolute path then you might be better using the role_path magic variable to find the path to the role and work from there.
With the structure you are using in the question the following should work:
- name: run my script!
  command: ./mypythonscript.py
  args:
    chdir: "{{ role_path }}/files"
  delegate_to: 127.0.0.1
  run_once: true
An alternative, straightforward solution:
Let's say you have already built your virtual env under ./env1 and used pip3 to install the needed Python modules.
Now write playbook task like:
- name: Run a script using an executable in a system path
  script: ./test.py
  args:
    executable: ./env1/bin/python
  register: python_result

- name: Get stdout or stderr from the output
  debug:
    var: python_result.stdout
If you want to execute the inline script without having a separate script file (for example, as molecule test) you can write something like this:
- name: Test database connection
  ansible.builtin.command: |
    python3 -c
    "
    import psycopg2;
    psycopg2.connect(
        host='127.0.0.1',
        dbname='db',
        user='user',
        password='password'
    );
    "
You can even insert Ansible variables in this string.
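For example, templating a value into the inline script might look like this (db_host is a hypothetical variable, not part of the original example):

- name: Print a templated value from inline Python
  ansible.builtin.command: |
    python3 -c
    "
    print('connecting to {{ db_host }}');
    "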
I'm trying to turn these lines into something I can put in an ansible playbook:
# Install Prezto files
shopt -s extglob
shopt -s nullglob
files=( "${ZDOTDIR:-$HOME}"/.zprezto/runcoms/!(README.md) )
for rcfile in "${files[@]}"; do
  [[ -f $rcfile ]] && ln -s "$rcfile" "${ZDOTDIR:-$HOME}/.${rcfile##*/}"
done
So far I've got the following:
- name: Link Prezto files
  file: src={{ item }} dest=~ state=link
  with_fileglob:
    - ~/.zprezto/runcoms/z*
I know it isn't the same, but it would select the same files: except with_fileglob looks on the machine running Ansible, and I want it to look on the remote machine.
Is there any way to do this, or should I just use a shell script?
A clean Ansible way of purging unwanted files matching a glob is:
- name: List all tmp files
  find:
    paths: /tmp/foo
    patterns: "*.tmp"
  register: tmp_glob

- name: Cleanup tmp files
  file:
    path: "{{ item.path }}"
    state: absent
  with_items:
    - "{{ tmp_glob.files }}"
Bruce P's solution works, but it requires an additional file and gets a little messy. Below is a pure Ansible solution.
The first task grabs a list of filenames and stores it in files_to_copy. The second task appends each filename to the path you provide and creates symlinks.
- name: grab file list
  shell: ls /path/to/src
  register: files_to_copy

- name: create symbolic links
  file:
    src: "/path/to/src/{{ item }}"
    dest: "/path/to/dest/{{ item }}"
    state: link
  with_items: "{{ files_to_copy.stdout_lines }}"
The file module does indeed look on the server where ansible is running for files when using with_fileglob, etc. Since you want to work with files that exist solely on the remote machine then you could do a couple things. One approach would be to copy over a shell script in one task then invoke it in the next task. You could even use the fact that the file was copied as a way to only run the script if it didn't already exist:
- name: Copy link script
  copy:
    src: /path/to/foo.sh
    dest: /target/path/to/foo.sh
    mode: 0755
  register: copied_script

- name: Invoke link script
  command: /target/path/to/foo.sh
  when: copied_script.changed
Another approach would be to create an entire command line that does what you want and invoke it using the shell module:
- name: Generate links
  shell: find ~/.zprezto/runcoms/z* -exec ln -s {} ~ \;
You can use with_lines to accomplish this:
- name: Link Prezto files
  file: src={{ item }} dest=~ state=link
  with_lines: ls ~/.zprezto/runcoms/z*
I have a playbook to install PythonBrew. In order to do this, I have to modify the shell environment. Because shell steps in Ansible are not persistent, I have to prepend export PYTHONBREW_ROOT=${pythonbrew.root}; source ${pythonbrew.root}/etc/bashrc; to the beginning of each of my PythonBrew-related commands:
- name: Install python binary
  shell: export PYTHONBREW_ROOT=${pythonbrew.root}; source ${pythonbrew.root}/etc/bashrc; pythonbrew install ${python.version}
    executable=/bin/bash

- name: Switch to python version
  shell: export PYTHONBREW_ROOT=${pythonbrew.root}; source ${pythonbrew.root}/etc/bashrc; pythonbrew switch ${python.version}
    executable=/bin/bash
I'd like to eliminate that redundancy. On the Ansible discussion group, I was referred to the environment keyword. I've looked at the examples in the documentation and it's not clicking for me. To me, the environment keyword doesn't look much different from any other variable.
I've looked for other examples but have only been able to find this very simple example.
Can someone demonstrate how the environment keyword functions in Ansible, preferably with the code sample I've provided above?
Not sure if it fits your need, but this is how I see it:
- hosts: all
  vars:
    env:
      PYTHONBREW_ROOT: "{{ pythonbrew.root }}"
  tasks:
    - name: Install python binary
      shell: pythonbrew install {{ python.version }} executable=/bin/bash
      environment: "{{ env }}"

    - name: Switch to python version
      shell: pythonbrew switch {{ python.version }} executable=/bin/bash
      environment: "{{ env }}"
It simply sets a variable named env and reuses it as the environment in both of your shell commands. This way your shell commands will have the PYTHONBREW_ROOT path set.
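The environment keyword can also be set directly on a single task, without an intermediate variable; a minimal sketch reusing the variable names from the question:

- name: Install python binary
  shell: pythonbrew install {{ python.version }} executable=/bin/bash
  environment:
    PYTHONBREW_ROOT: "{{ pythonbrew.root }}"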
I have a very similar issue; I'd like to have Ansible do its stuff inside a Python virtualenv (after it's made sure it's set up for me, of course).
Here's one way I've done the environment preconditions so far; essentially I have had to add (and optionally remove) lines to .bashrc:
tasks:
  - name: "Enable virtualenv in .bashrc"
    lineinfile: dest=.bashrc
                line="source {{ PROJECT_HOME }}/venv/bin/activate"

  # Put tasks that rely on this environmental precondition here (?)

  - name: "Disable virtualenv in .bashrc"
    lineinfile: dest=.bashrc
                line="source {{ PROJECT_HOME }}/venv/bin/activate"
                state=absent
I don't know if I'm "Doing It Wrong", but until I figure it out or someone comes along to tell me how to do it better, I suppose this will work.