Download report content with Ansible from Cisco DNA Center

I have an inventory report which is generated every day on our Cisco DNA Center. I would like to download this report with ansible to my ansible control node.
My playbook so far is:
---
- hosts: DNA
  vars_files:
    - /root/ansible/credentials.yaml
  gather_facts: no
  tasks:
    - name: Get all Reports
      cisco.dnac.reports_executions_info:
        dnac_host: "{{ dnac_host }}"
        dnac_username: "{{ dnac_username }}"
        dnac_password: "{{ dnac_password }}"
        dnac_verify: "{{ dnac_verify }}"
        dnac_port: "{{ dnac_port }}"
        dnac_version: "{{ dnac_version }}"
        dnac_debug: "{{ dnac_debug }}"
        reportId: a13236797-7a85-4774-98bd-552b41a3s5v7
        headers:
          custom: text/csv
        saveFile: true
        dirPath: /root/ansible/outputs
      register: result

    - name: debug
      debug:
        msg: "{{ result.dnac_response }}"
The documentation for this module says the following:
Returns report content.
Save the response to a file by converting the response data as a blob and setting the file format available from content-disposition response header.
https://github.com/cisco-en-programmability/dnacenter-ansible/blob/main/plugins/modules/reports_executions_info.py
Is this a misunderstanding from my side? I just want to transfer the generated report from the DNAC machine to my ansible machine.

The executionId was missing; that's why the content wasn't downloaded. There were no errors, which is why it was confusing to me.
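For reference, a minimal sketch of what the corrected task could look like, assuming the module accepts an executionId parameter alongside reportId, as the fix above suggests (the execution ID shown here is a placeholder; it would come from the list of executions for the report):

```
- name: Download the report content for one execution
  cisco.dnac.reports_executions_info:
    dnac_host: "{{ dnac_host }}"
    dnac_username: "{{ dnac_username }}"
    dnac_password: "{{ dnac_password }}"
    dnac_verify: "{{ dnac_verify }}"
    reportId: a13236797-7a85-4774-98bd-552b41a3s5v7
    executionId: 00000000-0000-0000-0000-000000000000   # placeholder execution ID
    saveFile: true
    dirPath: /root/ansible/outputs
  register: result
```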

Related

ERROR! 'copy' is not a valid attribute for a Play

I am trying to make an Ansible playbook that connects to the server via SSH and sends a file.
Most of my Google searches yield no concrete results.
-
  become: true
  hosts: all
  remote_user: artur
  tasks: ~
-
  copy:
    dest: /home/artur/grep_error.py
    group: UnixUsers
    mode: 420
    owner: artur
    src: /Users/artur/Desktop/sublime/projects/scripts/grep_error.py
  name: "example copying file with owner and permissions"
I expect to copy the file over to the ssh server.
Take Y minutes to learn YAML. Pay particular attention to the fact that indentation and new lines are syntactically significant.
Install yamllint and validate your YAML files. It will save you a lot of precious time.
Install ansible-lint and validate your files again. This one will go over the Ansible-specific syntax and watch for good practices.
Read the docs about playbooks and make sure you respect the syntax (i.e. understand the errors you get from the validators above).
Now that I have given you some references, here is a correction of your playbook:
---
- name: My first play to copy files
  become: true
  hosts: all
  remote_user: artur
  tasks:
    - name: Example copying file with owner and permissions
      copy:
        src: /Users/artur/Desktop/sublime/projects/scripts/grep_error.py
        dest: /home/artur/grep_error.py
        owner: artur
        group: UnixUsers
        mode: 0420
    - name: I'm just a dummy task to show you a play can go on
      debug:
        msg: I'm a dummy task
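A side note that is not part of the original answer: Ansible's file modules interpret mode as an octal value, and the docs recommend either keeping a leading zero or quoting the mode as a string, so the YAML parser does not turn it into an unintended decimal number. A sketch of the same task with a quoted mode ('0644' is only an example permission, not necessarily what was intended here):

```
- name: Example copying file with a quoted octal mode
  copy:
    src: /Users/artur/Desktop/sublime/projects/scripts/grep_error.py
    dest: /home/artur/grep_error.py
    owner: artur
    group: UnixUsers
    mode: '0644'   # quoted string, so it is unambiguously octal
```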

What is the junos_install_config replacement module?

When using the junos_install_config module from the Juniper.junos role for Ansible in a playbook such as:
---
- name: Send Set Files to Different Devices
  hosts: all
  roles:
    - Juniper.junos
  connection: local
  gather_facts: no
  tasks:
    - name: "Install vMX1 File"
      junos_install_config:
        host: "{{ inventory_hostname }}"
        file: "/home/ubuntu/resources/vMX1.set"
        overwrite: false
Running the playbook returns the following deprecation warning:
[DEPRECATION WARNING]: junos_install_config is kept for backwards compatibility but usage is discouraged. The module documentation details page may explain more about this rationale.. This feature will be
removed in a future release. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
However, reading the documentation about the module, I can't seem to find what has superseded it. Could anyone let me know which module can now be used in later versions to send and install ".set" files to a Junos device?
You could try the juniper_junos_config module to push or retrieve configuration.
tasks:
  - name: Load configuration from a local file and commit
    juniper_junos_config:
      load: "merge"
      src: "build_conf/{{ inventory_hostname }}/junos.conf"
Take a look at the documentation for more details.
https://www.juniper.net/documentation/en_US/junos-ansible/topics/topic-map/junos-ansible-configuration-loading-committing.html#task-configuration-load-file
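Since the question is specifically about ".set" files, here is a minimal sketch of what that could look like, assuming the module's load parameter accepts "set" (the file path is the one from the question):

```
tasks:
  - name: Load a set-format configuration file and commit
    juniper_junos_config:
      load: "set"
      src: "/home/ubuntu/resources/vMX1.set"
```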

Ansible - Unable to run certain JUNOS modules

I'm trying to run the Ansible modules junos_cli and junos_rollback and I get the following error:
ERROR! no action detected in task. This often indicates a misspelled module name, or incorrect module path.
The error appears to have been in '/home/quake/network-ansible/roles/junos-rollback/tasks/main.yml': line 2, column 3, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
---
- name: I've made a huge mistake
^ here
This is the role in question:
---
- name: I've made a huge mistake
  junos_rollback:
    host={{ inventory_hostname }}
    user=ansible
    comment={{ comment }}
    confirm={{ confirm }}
    rollback={{ rollback }}
    logfile={{ playbook_dir }}/library/logs/rollback.log
    diffs_file={{ playbook_dir }}/configs/{{ inventory_hostname }}
Here is the Juniper page:
http://junos-ansible-modules.readthedocs.io/en/1.3.1/junos_rollback.html
Their example's syntax is a little odd. host uses a colon while the rest uses = signs. I've tried mixing both and only using one or the other. I keep getting errors.
I also confirmed that my junos-eznc version is higher than 1.2.2 (I have 2.0.1)
I've been able to use junos_cli before; I don't know if a version mismatch happened. In the official Ansible documentation there is no mention of junos_cli or junos_rollback. Perhaps they're not supported anymore?
http://docs.ansible.com/ansible/list_of_network_modules.html#junos
Thanks,
junos_cli & junos_rollback are part of Galaxy and not core modules. You can find them at
https://galaxy.ansible.com/Juniper/junos/
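To pull the role from Galaxy onto the control node, a minimal sketch using a role requirements file (the file name is the conventional one, not something from the question):

```
# requirements.yml -- install with: ansible-galaxy install -r requirements.yml
- src: Juniper.junos
```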
Is the content posted here the whole content of your playbook? If yes, you need to define other items in your playbook too, such as roles and connection: local. For example,
refer to https://github.com/Juniper/ansible-junos-stdlib#example-playbook
```
---
- name: rollback example
  hosts: all
  roles:
    - Juniper.junos
  connection: local
  gather_facts: no
  tasks:
    - name: I've made a huge mistake
      junos_rollback:
        host={{ inventory_hostname }}
        ----
        ----
```
Where have you saved the content of the juniper.junos modules? Can you post the content of your playbook and the output of the tree command so we can see your file structure? That could help.
I had a similar problem where Ansible was not finding my modules; what I did was copy the juniper.junos folder to my roles folder and then add a tasks folder within it to execute the main.yaml from there.
Something like this:
/Users/macuared/Ansible_projects/roles/Juniper.junos/tasks
---
- name: "TEST 1 - Gather Facts"
  junos_get_facts:
    host: "{{ inventory_hostname }}"
    user: "uuuuu"
    passwd: "yyyyyy"
    savedir: "/Users/macuared/Ansible_projects/Ouput/Facts"
  ignore_errors: True
  register: junos

- name: Checking Device Version
  debug: msg="{{ junos.facts.serialnumber }}"
Additionally, I would add quotes ("") around the string values in your YAML. Something like this:
---
- name: I've made a huge mistake
  junos_rollback:
    host="{{ inventory_hostname }}"
    user=ansible
    comment="{{ comment }}"
    confirm={{ confirm }}
    rollback={{ rollback }}
    logfile="{{ playbook_dir }}/library/logs/rollback.log"
    diffs_file="{{ playbook_dir }}/configs/{{ inventory_hostname }}"
Regarding "I've tried mixing both and only using one or the other. I keep getting errors.": I've used just the colon and mine works fine, even though the documentation suggests = signs. See the junos_get_facts example above.

How to read a JSON file using Ansible

I have a JSON file in the same directory where my Ansible script is. Following is the content of the JSON file:
{ "resources":[
{"name":"package1", "downloadURL":"path-to-file1" },
{"name":"package2", "downloadURL": "path-to-file2"}
]
}
I am trying to download these packages using get_url. Following is my approach:
---
- hosts: localhost
  vars:
    package_dir: "/var/opt/"
    version_file: "{{ lookup('file','/home/shasha/devOps/tests/packageFile.json') }}"
  tasks:
    - name: Printing the file.
      debug: msg="{{ version_file }}"
    - name: Downloading the packages.
      get_url: url="{{ item.downloadURL }}" dest="{{ package_dir }}" mode=0777
      with_items: version_file.resources
The first task prints the content of the file correctly, but in the second task I am getting the following error:
[DEPRECATION WARNING]: Skipping task due to undefined attribute, in the future this
will be a fatal error.. This feature will be removed in a future release. Deprecation
warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
You have to add a from_json jinja2 filter after the lookup:
version_file: "{{ lookup('file','/home/shasha/devOps/tests/packageFile.json') | from_json }}"
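Putting it together, a minimal sketch of the corrected play (the paths are the ones from the question); note that the with_items expression is also wrapped in Jinja2 braces, since bare variable references there are deprecated:

```
---
- hosts: localhost
  vars:
    package_dir: "/var/opt/"
    version_file: "{{ lookup('file', '/home/shasha/devOps/tests/packageFile.json') | from_json }}"
  tasks:
    - name: Downloading the packages.
      get_url:
        url: "{{ item.downloadURL }}"
        dest: "{{ package_dir }}"
        mode: '0777'
      with_items: "{{ version_file.resources }}"
```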
In case you need to read JSON-formatted text and store it as a variable, it can also be handled by include_vars.
- hosts: localhost
  tasks:
    - include_vars:
        file: variable-file.json
        name: variable
    - debug: var=variable
For future visitors: if you are looking to read a remote JSON file, this won't work, as Ansible lookups are executed on the local (control) machine. You should use a module like slurp instead.
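A minimal sketch of that approach, assuming the JSON file lives at /tmp/packageFile.json on the remote hosts (the host group and path are placeholders):

```
- hosts: remote_servers
  tasks:
    - name: Read the JSON file from the remote host
      slurp:
        src: /tmp/packageFile.json
      register: raw_json

    - name: Decode and parse it into a variable
      set_fact:
        version_file: "{{ raw_json.content | b64decode | from_json }}"

    - name: Show the parsed resources
      debug:
        var: version_file.resources
```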

Running a task on a single host always with Ansible?

I am writing a task to download a database dump from a specific location. It will always be run on the same host.
So I am including the task as follows in the main playbook:
tasks:
  - include: tasks/dl-db.yml
The content of the task is:
---
- name: Fetch the Database
  fetch: src=/home/ubuntu/mydb.sql.gz dest=/tmp/mydb.sql.bz fail_on_missing=yes
But I want it to fetch from a single specific host, not all hosts.
Is a task the right approach for this?
If all you need is for the task to run once rather than on every host, you can use run_once like so:
---
- name: Fetch the Database
  run_once: true
  fetch: src=/home/ubuntu/mydb.sql.gz dest=/tmp/mydb.sql.bz fail_on_missing=yes
This will then be run only from the first host that the task would otherwise run on. You can further restrict this with delegate_to if you want to target a specific host:
---
- name: Fetch the Database
  run_once: true
  delegate_to: node1
  fetch: src=/home/ubuntu/mydb.sql.gz dest=/tmp/mydb.sql.bz fail_on_missing=yes
