Ansible declare host from variable

So I'm running an Ansible playbook which creates a server (using Terraform) and saves the IP address of the server into a variable. I'd like to execute another task against that IP address. How do I declare the new host?
I've tried:
- hosts: "{{ remotehost }}"
tasks:
- name: test
lineinfile:
path: /etc/environment
line: test1234
I run the playbook with: ansible-playbook variable.yaml --extra-vars='playbook=ip-address'

If you just want to execute a single task you can use delegate_to
For example:
tasks:
  - name: another host execute
    command: ls -ltr
    delegate_to: "{{ remotehost }}"
The control machine must already have a working SSH connection to the new host.
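If you need the new server to be the target of a whole play rather than a single delegated task, another option is to register it at runtime with the add_host module. A rough sketch, assuming the IP is passed in as remotehost and SSH access already works; the group name new_servers is made up for the example:

- hosts: localhost
  gather_facts: no
  tasks:
    - name: Add the freshly created server to the in-memory inventory
      add_host:
        name: "{{ remotehost }}"
        groups: new_servers

# A second play can now target the new group
- hosts: new_servers
  tasks:
    - name: test
      lineinfile:
        path: /etc/environment
        line: test1234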

Related

Variable is defined but still getting undefined error

I am trying to write a playbook that completes some of its tasks on the machine the playbook is running on. I know I can use local_action for this, but for now I am just testing whether the playbook works; eventually I will need delegate_to. I have defined the variable and I am using delegate_to: 'variable name', but I am getting this error: fatal: [target node]: FAILED! => {"msg": "'variablename' is undefined"}. Below is my playbook:
- name: Create certs
  gather_facts: true
  vars:
    master: "{{ nameofhost }}"
  tasks:
    - name: Run command
      shell: Command to run
      delegate_to: "{{ master }}"
You need to target your play at a hosts group from your inventory:
- name: Create certs
  gather_facts: true
  hosts: target_hosts
  vars:
    master: "{{ nameofhost }}"
  tasks:
    - name: Run command
      shell: Command to run
      delegate_to: "{{ master }}"
Your inventory file may look like this:
[target_hosts]
master ansible_host=your_master_dns_or_ip
Ansible can then target that inventory group, and delegate_to reduces the task scope to the master host. Or you can simply use localhost as the target.
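If the tasks only need to run on the control machine itself, a minimal sketch of the localhost variant of the play above would be:

- name: Create certs
  hosts: localhost
  connection: local
  gather_facts: true
  tasks:
    - name: Run command
      shell: Command to run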

How can I get the hostname and put it in a cfg file

How can I edit a remote cfg file? In my case I must replace the "hostname" with the name of the remote machine.
This has to be automated, because afterwards I will deploy it on 300+ servers.
I must be able to get the remote hostname and put it in the cfg file with Ansible.
Thanks.
############# file for config host ############
---
- hosts: computer_user
  remote_user: toto
  tasks:
    - name: "config zabbix agent"
      lineinfile:
        path: /etc/zabbix.cfg
        regexp: '(.*)hostname_local(.*)'
        line: '%hostname%'
########### file_cfg_on_computer_user #########
hostname_local: here_i_want_put_the_hostname_of_my_computer_user_with_a_like_%hostname%
I'm not entirely sure what you really want, but if you want the hostname of the system your playbook runs against, you have two possibilities:
You can use the value of inventory_hostname: the name of the host as configured in Ansible's inventory file.
Or you can use the value of the Ansible fact ansible_hostname: this one is discovered during the fact-gathering phase.
You can find more info about host variables and facts here.
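Not from the original answers, but here is a rough sketch of the fact-based option applied to the playbook above. It relies on fact gathering being left enabled (the default), and keeping the hostname_local key in the replacement line is an assumption about the desired config format:

---
- hosts: computer_user
  remote_user: toto
  tasks:
    - name: "config zabbix agent"
      # ansible_hostname is a gathered fact: the hostname reported by the remote machine
      lineinfile:
        path: /etc/zabbix.cfg
        regexp: '(.*)hostname_local(.*)'
        line: 'hostname_local: {{ ansible_hostname }}'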
Ansible defines many special variables at runtime.
You can use {{ inventory_hostname }}, which returns the inventory name for the 'current' host being iterated over in the play.
Or you can execute a remote command and use its result in a subsequent task:
---
- hosts: computer_user
  remote_user: toto
  tasks:
    - name: "get hostname from remote host"
      shell: hostname
      register: result_hostname

    - name: "config zabbix agent"
      lineinfile:
        path: /etc/zabbix.cfg
        regexp: '(.*)hostname_local(.*)'
        line: '{{ result_hostname.stdout }}'

Ansible Hostname as variable in ansible_user

Need help with Ansible. In our company we use the following method to SSH to a server:
If the IP of the server is 172.16.1.8, the username is "empname~id~serverIP", e.g. john~1234~172.16.1.8. So the following SSH command is used:
> ssh john~1234~172.16.1.8@172.16.1.8 -i key.pem
Basically the username contains the hostname as a variable.
Now our inventory just has IPs, under the group web.
> cat inventory.txt
[web]
172.16.1.8
172.16.x.y
172.16.y.z
My playbook YAML has ansible_user set as follows.
> cat uptime.yml
- hosts: web
  vars:
    ansible_ssh_port: xxxx
    ansible_user: john~1234~{{inventory_hostname}}
  tasks:
    - name: Run uptime command
      shell: uptime
However, when I use the following ansible-playbook command, it fails with an incorrect-username error.
> ansible-playbook -v uptime.yml -i inventory.txt --private-key=key.pem
Please help me find the correct way to set ansible_user in the playbook with the hostname as a variable inside it.
You can define ansible_user in group_vars/web.yml
group_vars/web.yml:
---
ansible_ssh_port: xxxx
ansible_user: "john~1234~{{ inventory_hostname }}"
Using a group var helped -
ansible_ssh_port: xxxx
ansible_user: "john~1234~{{ inventory_hostname }}"

Ansible how to compare output of multiple hosts within the same task

I have an Ansible playbook that has a task to output the list of installed Jenkins plugins for each server.
Here is the hosts file:
[masters]
server1
server2
server3
server4
server5
server6
Here is the task that prints out the list of plugins installed on each of the Jenkins servers:
- name: Obtaining a list of Jenkins Plugins
  jenkins_script:
    script: 'println(Jenkins.instance.pluginManager.plugins)'
    url: "http://{{ inventory_hostname }}.usa.com:8080/"
    user: 'admin'
    password: 'password'
What I want to do next is do a comparison with all of the installed plugins across all of the servers -- to ensure that all of the servers are running the same plugins.
I don't necessarily want to force an update -- that could break things -- just inform the user that they are running a different version of a plugin than the rest of the servers.
I am fairly new to ansible, will gladly accept any suggestions on how to accomplish this.
This is a bit ugly, but should work:
- hosts: masters
  tasks:
    - jenkins_script:
        script: 'println(Jenkins.instance.pluginManager.plugins)'
        url: "http://{{ inventory_hostname }}.usa.com:8080/"
        user: 'admin'
        password: 'password'
      register: call_result

    - copy:
        content: '{{ call_result.output }}'
        dest: '/tmp/{{ inventory_hostname }}'
      delegate_to: 127.0.0.1

    - shell: 'diff /tmp/{{ groups.masters[0] }} /tmp/{{ inventory_hostname }}'
      delegate_to: 127.0.0.1
      register: diff_result
      failed_when: false

    - debug:
        var: diff_result.stdout_lines
      when: diff_result.stdout_lines | length != 0
This will save the result of the jenkins_script module onto the calling host (where you are running ansible-playbook), into /tmp/{{hostname}}. Afterwards it will run a normal diff against the first server's result and each of the others', and then print out if there are any differences.
It's a bit ugly, as it:
Uses /tmp on the calling host to store some temporary data
Does not clean up after itself
Uses the diff shell command for something that might be doable with some clever use of Jinja (see the sketch after this answer)
Ansible 2.3 will have the tempfile module, which you could use to clean up /tmp.
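Not part of the original answer, but here is a rough sketch of the Jinja-based variant mentioned above: it compares the registered results in memory via hostvars instead of writing temp files, using the first host in the masters group as the reference:

- hosts: masters
  tasks:
    - jenkins_script:
        script: 'println(Jenkins.instance.pluginManager.plugins)'
        url: "http://{{ inventory_hostname }}.usa.com:8080/"
        user: 'admin'
        password: 'password'
      register: call_result

    # Every host in the play registers call_result, so hostvars can be used to
    # compare each host's output against the first server's output
    - name: Report hosts whose plugin list differs from the first server
      debug:
        msg: "Plugin list differs from {{ groups['masters'][0] }}"
      when: call_result.output != hostvars[groups['masters'][0]].call_result.output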

Ansible: Using Inventory file in shell command

Playbook below. I'm trying to replace test@ip: with a way to pull, from my inventory file, the IP of a group I've created.
- hosts: firewall
  gather_facts: no
  tasks:
    - name: run shell script
      raw: 'sh /home/test/firewall.sh'

- hosts: localhost
  gather_facts: no
  tasks:
    - name: Copy File to Local Machine
      shell: 'scp test@ip:/home/test/test.test /Users/dest/Desktop'
You need to change your task like this:
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Copy File to Local Machine
      shell: 'scp test@{{ item }}:/home/test/test.test /Users/dest/Desktop'
      with_items: "{{ groups['your_group_name'] }}"
If you want to run against all the hosts in the inventory, you can use:
with_items: "{{ groups['all'] }}"
Hope that will help you.
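As an aside, a sketch of an alternative that avoids shelling out to scp entirely: the fetch module copies a file from every targeted host back to the control machine (the paths are taken from the question; with flat: no each copy lands under a per-host subdirectory of dest):

- hosts: firewall
  gather_facts: no
  tasks:
    - name: Pull the file back to the local machine
      fetch:
        src: /home/test/test.test
        dest: /Users/dest/Desktop/
        flat: no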
