Error from the Ansible VMware vMotion module - need to move VMs using Ansible from primary vCenter to DR site for an emergency move

I am attempting to migrate VMs from one vCenter to another in the environment using Ansible.
Both vCenters are running vSphere Client version 7.0.3.01100.
ansible-helper is a RHEL 8 system with Python 3 and the required pyvmomi and vsphere-automation-sdk-python packages.
Note: the VMware snapshot and power-off tasks work; the play only fails at the vMotion stage.
The user attempting the vMotion is the same AD-controlled account on both vCenters and has the same privileges.
Here is the playbook used to attempt the move:
---
- name: Migrate Systems to Subterra (Fail-over to DR Site)
  hosts: subterra
  gather_facts: no
  vars_files:
    - environments/vars/vault.yml
  vars:
    vcenter_hostname: "vcenter1.company.com"
    vcenter_username: "DOMAIN\\ansible_user"
    dst_vcenter: "vcenter2.company.com"
    datacenter_name: "DC1"
    datastore: "SAN_Location1"
  tasks:
    - name: Gather one specific VM
      community.vmware.vmware_vm_info:
        hostname: "{{ vcenter_hostname }}"
        username: "{{ vcenter_username }}"
        password: "{{ vcenter_password }}"
        vm_name: "{{ inventory_hostname }}"
        validate_certs: false
      delegate_to: ansible-helper
      register: vminfo

    - name: List the VM's Information
      ansible.builtin.debug:
        var: vminfo

    - name: print keys of cluster data
      ansible.builtin.debug:
        msg: "{{ vminfo.virtual_machines | list }}"

    - name: List the VM's Folder Information
      ansible.builtin.debug:
        var: vminfo.virtual_machines[0].folder

    - name: Testing printing the json_query
      ansible.builtin.debug:
        msg: "{{ vminfo | json_query('virtual_machines[0].folder') }}"

    - name: Listing Datastore name
      ansible.builtin.debug:
        msg: "{{ vminfo | json_query('virtual_machines[0].datastore_url.name') }}"

    - name: Define VM Folder
      ansible.builtin.set_fact:
        vm_folder_loc: "{{ vminfo | json_query('virtual_machines[0].folder') }}"

    - name: Define Data Store from vm output
      ansible.builtin.set_fact:
        dest_data_store: "{{ vminfo | json_query('virtual_machines[0].datastore_url.name') }}"

    - name: List the VM's Folder Information
      ansible.builtin.debug:
        var: vm_folder_loc

    - name: Create a snapshot prior to migration
      community.vmware.vmware_guest_snapshot:
        hostname: "{{ vcenter_hostname }}"
        username: "{{ vcenter_username }}"
        password: "{{ vcenter_password }}"
        datacenter: "{{ datacenter_name }}"
        folder: "{{ vm_folder_loc }}"
        name: "{{ inventory_hostname }}"
        state: present
        memory_dump: false
        validate_certs: false
        snapshot_name: Snapshot_Prior_to_Migration
        description: "Snapshot of system prior to migration to vcenter2"
      delegate_to: ansible-helper

    - name: PowerOff VM
      community.vmware.vmware_guest:
        hostname: "{{ vcenter_hostname }}"
        username: "{{ vcenter_username }}"
        password: "{{ vcenter_password }}"
        name: "{{ inventory_hostname }}"
        validate_certs: false
        state: poweredoff
      delegate_to: ansible-helper

    - name: Perform vMotion of virtual machine to Subterra
      community.vmware.vmware_vmotion:
        hostname: "{{ vcenter_hostname }}"
        username: "{{ vcenter_username }}"
        password: "{{ vcenter_password }}"
        vm_name: "{{ inventory_hostname }}"
        destination_datastore: "{{ dest_data_store }}"
        destination_datacenter: "vcenter2"
        destination_resourcepool: "Linux"
        validate_certs: false
      delegate_to: ansible-helper

    - name: PowerOn VM
      community.vmware.vmware_guest:
        hostname: "{{ dst_vcenter }}"
        username: "{{ vcenter_username }}"
        password: "{{ vcenter_password }}"
        name: "{{ inventory_hostname }}"
        validate_certs: false
        state: poweredon
      delegate_to: ansible-helper
The playbook works until it gets to the community.vmware.vmware_vmotion module. Then I get the following error:
{
"exception": "Traceback (most recent call last):\r\n File \"/home/devops/.ansible/tmp/ansible-tmp-1676650446.81-32325-16791288075867/AnsiballZ_vmware_vmotion.py\", line 102, in <module>\r\n _ansiballz_main()\r\n File \"/home/devops/.ansible/tmp/ansible-tmp-1676650446.81-32325-16791288075867/AnsiballZ_vmware_vmotion.py\", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File \"/home/devops/.ansible/tmp/ansible-tmp-1676650446.81-32325-16791288075867/AnsiballZ_vmware_vmotion.py\", line 40, in invoke_module\r\n runpy.run_module(mod_name='ansible_collections.community.vmware.plugins.modules.vmware_vmotion', init_globals=None, run_name='__main__', alter_sys=True)\r\n File \"/usr/lib64/python3.6/runpy.py\", line 205, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File \"/usr/lib64/python3.6/runpy.py\", line 96, in _run_module_code\r\n mod_name, mod_spec, pkg_name, script_name)\r\n File \"/usr/lib64/python3.6/runpy.py\", line 85, in _run_code\r\n exec(code, run_globals)\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/modules/vmware_vmotion.py\", line 549, in <module>\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/modules/vmware_vmotion.py\", line 545, in main\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/modules/vmware_vmotion.py\", line 243, in __init__\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/module_utils/vmware.py\", line 249, in find_datastore_by_name\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/module_utils/vmware.py\", line 209, in find_object_by_name\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/module_utils/vmware.py\", line 796, in get_all_objs\r\n File \"/usr/local/lib/python3.6/site-packages/pyVmomi/VmomiSupport.py\", line 706, in <lambda>\r\n self.f(*(self.args + (obj,) + args), **kwargs)\r\n File \"/usr/local/lib/python3.6/site-packages/pyVmomi/VmomiSupport.py\", line 511, in _InvokeMethod\r\n list(map(CheckField, info.params, args))\r\n File \"/usr/local/lib/python3.6/site-packages/pyVmomi/VmomiSupport.py\", line 1098, in CheckField\r\n % (info.name, info.type.__name__, valType.__name__))\r\nTypeError: For \"container\" expected type vim.ManagedEntity, but got str\r\n",
"_ansible_no_log": false,
"_ansible_delegated_vars": {
"ansible_port": null,
"ansible_host": "192.168.28.12",
"ansible_user": "devops"
},
"module_stderr": "OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /var/lib/awx/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 58: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 32265\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\nShared connection to 192.168.28.12 closed.\r\n",
"changed": false,
"module_stdout": "Traceback (most recent call last):\r\n File \"/home/devops/.ansible/tmp/ansible-tmp-1676650446.81-32325-16791288075867/AnsiballZ_vmware_vmotion.py\", line 102, in <module>\r\n _ansiballz_main()\r\n File \"/home/devops/.ansible/tmp/ansible-tmp-1676650446.81-32325-16791288075867/AnsiballZ_vmware_vmotion.py\", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File \"/home/devops/.ansible/tmp/ansible-tmp-1676650446.81-32325-16791288075867/AnsiballZ_vmware_vmotion.py\", line 40, in invoke_module\r\n runpy.run_module(mod_name='ansible_collections.community.vmware.plugins.modules.vmware_vmotion', init_globals=None, run_name='__main__', alter_sys=True)\r\n File \"/usr/lib64/python3.6/runpy.py\", line 205, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File \"/usr/lib64/python3.6/runpy.py\", line 96, in _run_module_code\r\n mod_name, mod_spec, pkg_name, script_name)\r\n File \"/usr/lib64/python3.6/runpy.py\", line 85, in _run_code\r\n exec(code, run_globals)\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/modules/vmware_vmotion.py\", line 549, in <module>\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/modules/vmware_vmotion.py\", line 545, in main\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/modules/vmware_vmotion.py\", line 243, in __init__\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/module_utils/vmware.py\", line 249, in find_datastore_by_name\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/module_utils/vmware.py\", line 209, in find_object_by_name\r\n File \"/tmp/ansible_community.vmware.vmware_vmotion_payload_eyr21noz/ansible_community.vmware.vmware_vmotion_payload.zip/ansible_collections/community/vmware/plugins/module_utils/vmware.py\", line 796, in get_all_objs\r\n File \"/usr/local/lib/python3.6/site-packages/pyVmomi/VmomiSupport.py\", line 706, in <lambda>\r\n self.f(*(self.args + (obj,) + args), **kwargs)\r\n File \"/usr/local/lib/python3.6/site-packages/pyVmomi/VmomiSupport.py\", line 511, in _InvokeMethod\r\n list(map(CheckField, info.params, args))\r\n File \"/usr/local/lib/python3.6/site-packages/pyVmomi/VmomiSupport.py\", line 1098, in CheckField\r\n % (info.name, info.type.__name__, valType.__name__))\r\nTypeError: For \"container\" expected type vim.ManagedEntity, but got str\r\n",
"rc": 1,
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error"
}
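Reading the bottom of the traceback: find_datastore_by_name() hands its datacenter argument to get_all_objs() as the search container, and pyVmomi rejects it because it received a plain string where a vim.ManagedEntity object was expected. That is consistent with destination_datacenter being set to "vcenter2", which is a vCenter hostname rather than the name of a datacenter object, so the lookup never resolves to a managed entity. A minimal sketch of the vMotion task under the assumption (hypothetical name) that the datacenter in vcenter2 is called DC2; it may also be worth checking whether a newer release of community.vmware handles this parameter differently:
- name: Perform vMotion of virtual machine to Subterra
  community.vmware.vmware_vmotion:
    hostname: "{{ vcenter_hostname }}"
    username: "{{ vcenter_username }}"
    password: "{{ vcenter_password }}"
    vm_name: "{{ inventory_hostname }}"
    destination_datastore: "{{ dest_data_store }}"
    destination_datacenter: "DC2"    # hypothetical: the datacenter NAME in vcenter2, not the vCenter hostname
    destination_resourcepool: "Linux"
    validate_certs: false
  delegate_to: ansible-helper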
Here is my ansible.cfg:
[defaults]
inventory = ./environments/test/hosts
ANSIBLE_SSH_ARGS = -C -o ControlMaster=auto -o ControlPersist=30m
ANSIBLE_PIPELINING = True
retry_files_enabled = False
host_key_checking = False
allow_world_readable_tmpfiles = True
remote_user = devops
private_key_file = ../keys/private_key_rsa
role_path = ./roles/
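As an aside, likely unrelated to the vMotion failure: ANSIBLE_SSH_ARGS and ANSIBLE_PIPELINING are environment-variable names, not ansible.cfg keys, so Ansible does not pick them up from this file. The ini equivalents live in the [ssh_connection] section; a sketch:
[ssh_connection]
ssh_args = -C -o ControlMaster=auto -o ControlPersist=30m
pipelining = True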

Related

Loading environment variables from YAML file (using include_vars) in Ansible

I am trying to load variables from a YAML file to be used as environment variables in a later task of the same playbook. This is what I am doing:
Template host vars in YAML format:
- name: builtin | template | template host variables file
  ansible.builtin.template:
    src: templates/django/hostvars.j2
    dest: "host_vars/{{ inventory_hostname_short }}.yml"
    mode: 0640
    output_encoding: "utf-8"
  delegate_to: localhost
This is producing the following file:
---
# Host variables to be set as environment variables in tasks that need it
POSTGRES_PASSWORD: "<password>"
POSTGRES_USER: "dbuser"
POSTGRES_DB: "dbname"
POSTGRES_HOST: "dbhost"
POSTGRES_PORT: 5432
POSTGRES_SSLMODE: "verify-full"
POSTGRES_SSLCA: "/etc/ssl/certs/ISRG_Root_X1.pem"
POSTGRES_APPNAME: "myproject"
DJANGO_SITE_NAME: "mysite"
DJANGO_SITE_PASSWORD: "mypassword"
DJANGO_SITE_USER: "myuser"
DJANGO_SITE_ID: 2
DJANGO_SECRET_KEY: "<very-long-and-random-secret>"
[..]
Use include_vars to load the vars into the playbook:
- name: builtin | include_vars | load host vars
  ansible.builtin.include_vars:
    file: "host_vars/{{ inventory_hostname_short }}.yml"
    name: envvars
Later in the playbook, I check that the variables are there:
- name: builtin | debug | print variable 'envvars'
  ansible.builtin.debug:
    var: envvars
This is (apparently) working as intended, and a list of KEY: value pairs is printed, such as:
TASK [builtin | debug | print variable 'envvars'] ***************************************************************
ok: [django1.mydomain.com] => {
    "envvars": {
        "DJANGO_DEBUG": 0,
        "DJANGO_LOGS_DIR": "/opt/django/logs",
        "DJANGO_MEDIA_BASE": "/opt/django/media",
        "DJANGO_SECRET_KEY": "<very-long-and-random-secret>",
        [..]
Use django_manage to update the database schema:
- name: community.general | django_manage | update database schema
  community.general.django_manage:
    command: migrate
    settings: myproject.settings
    project_path: "/opt/django/src"
    virtualenv: "/opt/django/venv"
  become: true
  become_user: django
  become_method: su
  environment: "{{ envvars }}"
This, unfortunately, is failing. Django complains that it cannot find the SECRET_KEY environment variable, which it should build from one of the variables in the list above (specifically, DJANGO_SECRET_KEY).
Incidentally, if I run the following task, nothing is printed out:
- name: print environment variables
  ansible.builtin.command: env
  become: true
  become_user: django
  become_method: su
  environment: "{{ envvars }}"
And I don't understand why. I've tried to debug using -vvv, and the variables are being sent by Ansible through the SSH connection (at least so it seems).
Any hints on what bit I am doing wrong?
EDIT 1
I've changed the tasks list file where I use django_manage into the following code:
- name: builtin | shell | capture DJANGO_ environment variables
  debugger: on_failed
  ansible.builtin.shell:
    cmd: "env | grep DJANGO_"
  register: out
  environment: "{{ envvars }}"
  become: true
  become_user: django
  become_method: su

- name: builtin | debug | print content of out.stdout
  ansible.builtin.debug:
    var: out.stdout

- name: builtin | debug | print variable 'envvars'
  ansible.builtin.debug:
    var: envvars

- name: community.general | django_manage | populate the static subdirectory
  community.general.django_manage:
    command: collectstatic
    clear: yes
    project_path: "/opt/django/src"
    virtualenv: "/opt/django/venv"
  become: true
  become_user: django
  become_method: su
  environment: "{{ envvars }}"
The second and third tasks both print the values of the variables (one from the env | grep DJANGO_ command run via shell, the other from the envvars variable that is passed via the environment: directive).
Here is the error from the last task:
TASK [builtin | shell | capture DJANGO_ environment variables] *************************************************************************************************************
changed: [django1.domain.com]
TASK [builtin | debug | print content of out.stdout] ***********************************************************************************************************************
ok: [django1.domain.com] => {
    "out.stdout": "DJANGO_SITE_USER=mysite\nDJANGO_MEDIA_BASE=/opt/django/media\nDJANGO_SITE_NAME=mysite\nDJANGO_SITE_ID=2\nDJANGO_SECRET_KEY=<very-secret-key>\nDJANGO_LOGS_DIR=/opt/django/logs\nDJANGO_SETTINGS_MODULE=myproject.settings.production\nDJANGO_DEBUG=0\nDJANGO_STATIC_BASE=/opt/django/static\nDJANGO_SITE_PASSWORD=mypassword\nDJANGO_SITE_VERSION=57a2f3c168d86243f03809e5d02a0f50a8fa892e"
}
TASK [builtin | debug | print variable 'envvars'] **************************************************************************************************************************
ok: [django1.domain.com] => {
    "envvars": {
        "DJANGO_DEBUG": 0,
        "DJANGO_LOGS_DIR": "/opt/django/logs",
        "DJANGO_MEDIA_BASE": "/opt/django/media",
        "DJANGO_SECRET_KEY": "<very-secret-key>",
        "DJANGO_SETTINGS_MODULE": "myproject.settings.production",
        "DJANGO_SITE_ID": 2,
        "DJANGO_SITE_NAME": "mysite",
        "DJANGO_SITE_PASSWORD": "mypassword",
        "DJANGO_SITE_USER": "myuser",
        [..]
    }
}
TASK [community.general | django_manage | populate the static subdirectory] ************************************************************************************************
fatal: [django1.domain.com]: FAILED! => {"changed": false, "cmd": ["./manage.py", "collectstatic", "--noinput", "--clear"], "msg": "\n:stderr: Traceback (most recent call last):\n File \"/opt/django/venv/lib/python3.9/site-packages/django/core/management/__init__.py\", line 204, in fetch_command\n app_name = commands[subcommand]\nKeyError: 'collectstatic'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/opt/django/src/./manage.py\", line 22, in <module>\n execute_from_command_line(sys.argv)\n File \"/opt/django/venv/lib/python3.9/site-packages/django/core/management/__init__.py\", line 381, in execute_from_command_line\n utility.execute()\n File \"/opt/django/venv/lib/python3.9/site-packages/django/core/management/__init__.py\", line 375, in execute\n self.fetch_command(subcommand).run_from_argv(self.argv)\n File \"/opt/django/venv/lib/python3.9/site-packages/django/core/management/__init__.py\", line 211, in fetch_command\n settings.INSTALLED_APPS\n File \"/opt/django/venv/lib/python3.9/site-packages/django/conf/__init__.py\", line 57, in __getattr__\n self._setup(name)\n File \"/opt/django/venv/lib/python3.9/site-packages/django/conf/__init__.py\", line 44, in _setup\n self._wrapped = Settings(settings_module)\n File \"/opt/django/venv/lib/python3.9/site-packages/django/conf/__init__.py\", line 107, in __init__\n mod = importlib.import_module(self.SETTINGS_MODULE)\n File \"/usr/lib/python3.9/importlib/__init__.py\", line 127, in import_module\n return _bootstrap._gcd_import(name[level:], package, level)\n File \"<frozen importlib._bootstrap>\", line 1030, in _gcd_import\n File \"<frozen importlib._bootstrap>\", line 1007, in _find_and_load\n File \"<frozen importlib._bootstrap>\", line 986, in _find_and_load_unlocked\n File \"<frozen importlib._bootstrap>\", line 680, in _load_unlocked\n File \"<frozen importlib._bootstrap_external>\", line 790, in exec_module\n File \"<frozen importlib._bootstrap>\", line 228, in _call_with_frames_removed\n File \"/opt/django/src/black_pearl/settings/production.py\", line 3, in <module>\n from black_pearl.settings.common import *\n File \"/opt/django/src/black_pearl/settings/common.py\", line 301, in <module>\n path_app = import_module(app).__path__\n File \"/usr/lib/python3.9/importlib/__init__.py\", line 127, in import_module\n return _bootstrap._gcd_import(name[level:], package, level)\nModuleNotFoundError: No module named 'None'\n", "path": "/opt/django/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "syspath": ["/tmp/ansible_community.general.django_manage_payload_l53eeo4g/ansible_community.general.django_manage_payload.zip", "/usr/lib/python39.zip", "/usr/lib/python3.9", "/usr/lib/python3.9/lib-dynload", "/usr/local/lib/python3.9/dist-packages", "/usr/lib/python3/dist-packages"]}
PLAY RECAP *****************************************************************************************************************************************************************
django1.domain.com : ok=14 changed=1 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
Thanks in advance.
Q: "Nothing is printed out."
A: Register the output if you want to see it. For example:
- hosts: localhost
  tasks:
    - command: echo $DJANGO_DEBUG
      register: out
      environment:
        DJANGO_DEBUG: 0
    - debug:
        var: out.stdout
gives (abridged)
out.stdout: '0'
This should work for you. Add complexity to your code step by step and isolate the problem. For example, the playbook below should display the environment at the remote host:
- hosts: test_11
  vars:
    env:
      DJANGO_DEBUG: 0
      DJANGO_SITE_NAME: mysite
      DJANGO_SITE_PASSWORD: mypassword
      DJANGO_SITE_USER: myuser
      DJANGO_SITE_ID: 2
  tasks:
    - shell: env | grep DJANGO_
      register: out
      environment: "{{ env }}"
    - debug:
        var: out.stdout
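Continuing that step-by-step isolation, one more sketch worth trying (paths are taken from the question; showmigrations is chosen only because it forces Django to load its settings without changing anything) is to run manage.py through the exact same become/environment combination that django_manage uses. If this fails the same way, the problem is in the environment or settings module; if it succeeds, the problem is specific to the django_manage task:
- name: run manage.py directly with the same environment
  ansible.builtin.command:
    cmd: /opt/django/venv/bin/python manage.py showmigrations
    chdir: /opt/django/src
  register: manage_out
  environment: "{{ envvars }}"
  become: true
  become_user: django
  become_method: su

- name: show the result
  ansible.builtin.debug:
    var: manage_out.stdout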

Ansible valueFrom aws secrets manager

I need to set environment variables for a container in AWS Fargate.
The values for those vars are in AWS Secrets Manager; the secret ARN is arn:aws:secretsmanager:eu-west-1:909628726468:secret:secret.automation-user-KBSm8J, and it stores two key/value secrets, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
In CloudFormation the following worked perfectly:
ContainerDefinitions:
  - Name: "prowler"
    Image: !Ref Image
    Environment:
      - Name: AWS_ACCESS_KEY_ID
        Value: '{{resolve:secretsmanager:secret.automation-user:SecretString:AWS_ACCESS_KEY_ID}}'
I have to do the same with Ansible (v2.9.15) and the community.aws.ecs_taskdefinition module.
Based on "official" example I have the following snippet:
- name: Create task definition
  ecs_taskdefinition:
    family: "{{ task_definition_name }}"
    aws_access_key: "{{ aws_access_key }}"
    aws_secret_key: "{{ aws_secret_key }}"
    region: "{{ aws_region }}"
    execution_role_arn: "{{ execution_role_arn }}"
    containers:
      - name: prowler
        essential: true
        image: "{{ image }}"
        environment:
          - name: "AWS_ACCESS_KEY_ID"
            valueFrom: "arn:aws:secretsmanager:eu-west-1:909628726468:secret:secret.automation-user-KBSm8J/AWS_ACCESS_KEY_ID"
...but it doesn't work:
TASK [ansible-role-prowler-deploy : Create task definition] ********************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: KeyError: 'value'
fatal: [localhost]: FAILED! => {"changed": false, "module_stderr": "Traceback (most recent call last):\n File \"/root/.ansible/tmp/ansible-tmp-1607197370.8633459-17-102108854554553/AnsiballZ_ecs_taskdefinition.py\", line 102, in <module>\n _ansiballz_main()\n File \"/root/.ansible/tmp/ansible-tmp-1607197370.8633459-17-102108854554553/AnsiballZ_ecs_taskdefinition.py\", line 94, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File \"/root/.ansible/tmp/ansible-tmp-1607197370.8633459-17-102108854554553/AnsiballZ_ecs_taskdefinition.py\", line 40, in invoke_module\n runpy.run_module(mod_name='ansible.modules.cloud.amazon.ecs_taskdefinition', init_globals=None, run_name='__main__', alter_sys=True)\n File \"/usr/lib/python3.6/runpy.py\", line 205, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File \"/usr/lib/python3.6/runpy.py\", line 96, in _run_module_code\n mod_name, mod_spec, pkg_name, script_name)\n File \"/usr/lib/python3.6/runpy.py\", line 85, in _run_code\n exec(code, run_globals)\n File \"/tmp/ansible_ecs_taskdefinition_payload_5roid3ob/ansible_ecs_taskdefinition_payload.zip/ansible/modules/cloud/amazon/ecs_taskdefinition.py\", line 520, in <module>\n File \"/tmp/ansible_ecs_taskdefinition_payload_5roid3ob/ansible_ecs_taskdefinition_payload.zip/ansible/modules/cloud/amazon/ecs_taskdefinition.py\", line 357, in main\nKeyError: 'value'\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
I tried a few variations of that syntax, but no luck.
It turned out that the secrets section should have been used: the module reads each environment entry's value key (hence the KeyError: 'value' above), while valueFrom references belong under secrets:
- name: Create ECS task definition
  ecs_taskdefinition:
    aws_access_key: "{{ aws_access_key }}"
    aws_secret_key: "{{ aws_secret_key }}"
    region: "{{ aws_region }}"
    family: "{{ task_definition_name }}"
    execution_role_arn: "{{ execution_role_arn }}"
    containers:
      - name: prowler
        essential: true
        image: "{{ image }}"
        repositoryCredentials:
          credentialsParameter: "{{ artifactory_creds_arn }}"
        logConfiguration:
          logDriver: awslogs
          options:
            "awslogs-group": "{{ log_group_name }}"
            "awslogs-region": "{{ aws_region }}"
            "awslogs-stream-prefix": "ecs"
        secrets:
          - name: "AWS_ACCESS_KEY_ID"
            valueFrom: "{{ aws_ak_arn }}"
          - name: "AWS_SECRET_ACCESS_KEY"
            valueFrom: "{{ aws_sk_arn }}"
        environment:
          - name: "AWS_ACCOUNT_ID"
            value: "{{ aws_id }}"

Ansible pamd: module failure

I have this playbook as part of a role that makes some changes to PAM modules:
---
- name: "{{ BANNER }} - SET MODE"
  copy:
    remote_src: True
    src: "{{ LOGIN_DEF }}"
    dest: "{{ LOGIN_DEF_BCK }}_RH7-021_{{ CK_ORA }}"

- replace:
    path: "{{ LOGIN_DEF }}"
    regexp: '{{ item.src }}'
    replace: '{{ item.dst }}'
  with_items:
    - { src: '(.*FAIL_DELAY.*)', dst: '#\1' }

- lineinfile:
    path: "{{ LOGIN_DEF }}"
    line: 'FAIL_DELAY 10'

- replace:
    path: "{{ PASSWORDAUTH }}"
    regexp: '{{ item.src }}'
    replace: '{{ item.dst }}'
  with_items:
    - { src: '^auth .* pam_faildelay.so', dst: '' }

- pamd:
    name: password-auth
    type: auth
    control: sufficient
    module_path: 'pam_unix.so'
    new_type: auth
    new_control: optional
    new_module_path: 'pam_faildelay.so'
    module_arguments:
    state: after

- replace:
    path: "{{ SYSTEMAUTH }}"
    regexp: '{{ item.src }}'
    replace: '{{ item.dst }}'
  with_items:
    - { src: '^auth .* pam_faildelay.so', dst: '' }

- pamd:
    name: system-auth
    type: auth
    control: sufficient
    module_path: 'pam_unix.so'
    new_type: auth
    new_control: optional
    new_module_path: 'pam_faildelay.so'
    module_arguments:
    state: after

- debug: msg="{{ MSG_SET }}"
When I run it, I get this error:
TASK [RH7-021 : pamd] ***********************************************************************************************************************************************
fatal: [10.13.203.165]: FAILED! => {"changed": false, "module_stderr": "", "module_stdout": "\r\nTraceback (most recent call last):\r\n
File \"/home/ccansible/.ansible/tmp/ansible-tmp-1561986679.75-245340126875212/AnsiballZ_pamd.py\", line 113, in <module>\r\n _ansiballz_main()\r\n
File \"/home/ccansible/.ansible/tmp/ansible-tmp-1561986679.75-245340126875212/AnsiballZ_pamd.py\", line 105, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n
File \"/home/ccansible/.ansible/tmp/ansible-tmp-1561986679.75-245340126875212/AnsiballZ_pamd.py\", line 48, in invoke_module\r\n imp.load_module('__main__', mod, module, MOD_DESC)\r\n
File \"/tmp/ansible_pamd_payload_NpycuP/__main__.py\", line 880, in <module>\r\n File \"/tmp/ansible_pamd_payload_NpycuP/__main__.py\", line 816, in main\r\n File \"/tmp/ansible_pamd_payload_NpycuP/__main__.py\", line 458, in __init__\r\n
File \"/tmp/ansible_pamd_payload_NpycuP/__main__.py\", line 371, in rule_from_string\r\n
AttributeError: 'NoneType' object has no attribute 'group'\r\n",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
to retry, use: --limit #/home/PG005856/HARDENING/main.retry
I can't figure out what's wrong; moreover, I used the same method in another playbook and it worked fine.
The control node has this Ansible version:
ansible 2.7.6
config file = /home/PG005856/HARDENING/ansible.cfg
configured module search path = [u'/home/PG005856/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /ansible/lib/python2.7/site-packages/ansible
executable location = /ansible/bin/ansible
python version = 2.7.5 (default, Feb 20 2018, 09:19:12) [GCC 4.8.5 20150623 (Red Hat 4.8.5-28)]
The target server is:
Linux rh7-test-ansible 3.10.0-693.17.1.el7.x86_64 #1 SMP Sun Jan 14 10:36:03 EST 2018 x86_64 x86_64 x86_64 GNU/Linux
cat /etc/redhat-release
Red Hat Enterprise Linux Server release 7.4 (Maipo)
I have read that this was a bug, but I assumed it had been resolved by Ansible 2.7.
I don't know what else to do. I could achieve the same result with a few lines of sed via the shell module, but I'd like to use the pamd module.
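For what it's worth, a minimal sketch of the sed-style fallback mentioned above, using lineinfile instead of pamd. The path and the exact rule text are assumptions reconstructed from the playbook; adjust both to match the real file:
- lineinfile:
    # insert the pam_faildelay.so rule directly after the pam_unix.so auth rule
    path: /etc/pam.d/system-auth
    line: 'auth        optional      pam_faildelay.so'
    insertafter: '^auth\s+sufficient\s+pam_unix\.so'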

How can I merge lists with ansible 2.6.3

In the past I used Ansible 2.3.1.0 and now want to use Ansible 2.6.3.
I want to install some packages with apt, so I have two source lists that I want to merge into one. In Ansible 2.3.1.0 I just used the following:
apt_packages:
  - "{{ packages_list1 + packages_list2 }}"
And I am getting the following error:
Traceback (most recent call last):\r\n File \"/tmp/ansible_k0tmag/ansible_module_apt.py\", line 1128, in <module>\r\n main()\r\n File \"/tmp/ansible_k0tmag/ansible_module_apt.py\", line 1106, in main\r\n
allow_unauthenticated=allow_unauthenticated\r\n File \"/tmp/ansible_k0tmag/ansible_module_apt.py\", line 521, in install\r\n pkgspec = expand_pkgspec_from_fnmatches(m, pkgspec, cache)\r\n File \"/tmp/ansible_k0tmag/ansible_module_apt.py\", line 439, in expand_pkgspec_from_fnmatches\r\n
pkgname_pattern, version = package_split(pkgspec_pattern)\r\n File \"/tmp/ansible_k0tmag/ansible_module_apt.py\", line 312, in package_split\r\n
parts = pkgspec.split('=', 1)\r\nAttributeError: 'list' object has no attribute 'split'\r\n", "msg": "MODULE FAILURE", "rc": 1}
Content of the role:
- apt:
    state: present
    name: "{{ apt_packages }}"
    force: yes
  when: apt_packages is defined
With apt_packages: "{{ packages_list1 + packages_list2 }}" (without the -) it will work: the leading dash makes apt_packages a one-element list whose single element is itself a list, and the apt module then fails trying to split() that inner list as if it were a package name.
Working sample:
- hosts: localhost
  gather_facts: no
  vars:
    pkgs1:
      - perl
      - python
    pkgs2:
      - vim
      - tar
    packages: "{{ pkgs1 + pkgs2 }}"
  tasks:
    - apt:
        name: "{{ packages }}"
        state: present
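To make the difference visible, a small debug task (a sketch reusing the variables from the sample above) prints the flat list that apt accepts next to the nested shape that triggered the 'list' object has no attribute 'split' error:
- debug:
    msg:
      flat: "{{ pkgs1 + pkgs2 }}"       # ['perl', 'python', 'vim', 'tar']
      nested: ["{{ pkgs1 + pkgs2 }}"]   # [['perl', 'python', 'vim', 'tar']] - one element, itself a list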

os_user (and other os modules) Unable to establish connection error

This same Ansible playbook/role works fine with my OpenStack Liberty deployment, but with my newer Pike OpenStack (deployed using OAD) I am having an issue running any of the os_* modules. Example below:
ansible version: 2.3.2.0
cloud.yml on openstack utility container:
# Ansible managed
clouds:
  default:
    auth:
      auth_url: http://172.29.236.10:5000/v3
      project_name: admin
      tenant_name: admin
      username: admin
      password: admin
      user_domain_name: Default
      project_domain_name: Default
    region_name: RegionOne
    interface: internal
    identity_api_version: "3"
role/task:
---
- name: Install random password generator package
  apt: name={{item}} state=present
  with_items:
    - apg

- name: Random generate passwords
  command: apg -n {{ pass_cnt }} -M NCL -q
  register: passwdss

- name: Create users
  os_user:
    cloud: "{{CLOUD_NAME}}"
    state: present
    name: "{{ item.0 }}"
    password: "{{ item.1 }}"
    domain: default
  with_together:
    - "{{userid}}"
    - "{{passwdss.stdout_lines}}"

- name: Create user environments
  os_project:
    cloud: "{{CLOUD_NAME}}"
    state: present
    name: "{{ item }}"
    description: "{{ item }}"
    domain_id: default
    enabled: True
  with_items: "{{tenantid}}"

- name: Assign user to specified role in designated environment
  os_user_role:
    cloud: "{{CLOUD_NAME}}"
    user: "{{ item.0 }}"
    role: "{{ urole }}"
    project: "{{ item.1 }}"
  with_together:
    - "{{userid}}"
    - "{{tenantid}}"

- name: User password assignment
  debug: msg="User {{ item.0 }} was added to {{ item.2 }} project, with the assigned password of {{ item.1 }}"
  with_together:
    - userid
    - passwdss.stdout_lines
The initial password-generator and password-creation tasks complete without issue, but once os_user runs I get the following error:
debug2: Received exit status from master 0\r\nShared connection to 172.29.239.130 closed.\r\n",
"module_stdout": "\r\nTraceback (most recent call last):\r\n File \"/tmp/ansible_BXmCwn/ansible_module_os_user.py\", line 284, in <module>\r\n main()\r\n File \"/tmp/ansible_BXmCwn/ansible_module_os_user.py\", line 220, in main\r\n user = cloud.get_user(name)\r\n File \"/usr/local/lib/python2.7/dist-packages/shade/openstackcloud.py\", line 1016, in get_user\r\n return _utils._get_entity(self.search_users, name_or_id, filters)\r\n File \"/usr/local/lib/python2.7/dist-packages/shade/_utils.py\", line 220, in _get_entity\r\n entities = func(name_or_id, filters, **kwargs)\r\n File \"/usr/local/lib/python2.7/dist-packages/shade/openstackcloud.py\", line 999, in search_users\r\n users = self.list_users()\r\n File \"/usr/local/lib/python2.7/dist-packages/shade/openstackcloud.py\", line 981, in list_users\r\n data = self._identity_client.get('/users')\r\n File \"/usr/local/lib/python2.7/dist-packages/shade/openstackcloud.py\", line 419, in _identity_client\r\n 'identity', version_required=True, versions=['3', '2'])\r\n File \"/usr/local/lib/python2.7/dist-packages/shade/openstackcloud.py\", line 496, in _discover_endpoint\r\n self.keystone_session, base_url)\r\n File \"/usr/local/lib/python2.7/dist-packages/positional/__init__.py\", line 108, in inner\r\n return wrapped(*args, **kwargs)\r\n File \"/usr/local/lib/python2.7/dist-packages/keystoneauth1/identity/base.py\", line 428, in get_discovery\r\n authenticated=authenticated)\r\n File \"/usr/local/lib/python2.7/dist-packages/positional/__init__.py\", line 108, in inner\r\n return wrapped(*args, **kwargs)\r\n File \"/usr/local/lib/python2.7/dist-packages/keystoneauth1/discover.py\", line 1164, in get_discovery\r\n disc = Discover(session, url, authenticated=authenticated)\r\n File \"/usr/local/lib/python2.7/dist-packages/positional/__init__.py\", line 108, in inner\r\n return wrapped(*args, **kwargs)\r\n File \"/usr/local/lib/python2.7/dist-packages/keystoneauth1/discover.py\", line 402, in __init__\r\n authenticated=authenticated)\r\n File \"/usr/local/lib/python2.7/dist-packages/positional/__init__.py\", line 108, in inner\r\n return wrapped(*args, **kwargs)\r\n File \"/usr/local/lib/python2.7/dist-packages/keystoneauth1/discover.py\", line 101, in get_version_data\r\n resp = session.get(url, headers=headers, authenticated=authenticated)\r\n File \"/usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py\", line 845, in get\r\n return self.request(url, 'GET', **kwargs)\r\n File \"/usr/local/lib/python2.7/dist-packages/positional/__init__.py\", line 108, in inner\r\n return wrapped(*args, **kwargs)\r\n File \"/usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py\", line 703, in request\r\n resp = send(**kwargs)\r\n File \"/usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py\", line 777, in _send_request\r\n raise exceptions.ConnectFailure(msg)\r\nkeystoneauth1.exceptions.connection.ConnectFailure: Unable to establish connection to http://10.0.11.100:5000: ('Connection aborted.', BadStatusLine(\"''\",))\r\n",
"msg": "MODULE FAILURE",
"rc": 0
}
I've tested, and 10.0.11.100:5000 is available from the utility container. I'm not sure what would cause the Unable to establish connection to http://10.0.11.100:5000: ('Connection aborted.', BadStatusLine(\"''\",)) error.
I am also surprised that it's trying to connect to my external VIP on :5000 instead of the internal auth_url defined in the cloud.yml file (172.29.236.10:5000).
Any ideas for what to look for would be extremely welcome.
Unless told otherwise, the os_* modules appear to grab and use the public endpoints. Providing the OpenStack modules with the parameter/value endpoint_type: internal resolved the issue in my OAD OpenStack deployment.
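A sketch of what that looks like on the os_user task from the role above. (Hedged: in the os_* modules this option is documented as interface, with endpoint_type as an alias, and behavior depends on the Ansible and shade versions in use.)
- name: Create users
  os_user:
    cloud: "{{ CLOUD_NAME }}"
    endpoint_type: internal    # alias of 'interface'; selects the internal endpoint from the catalog
    state: present
    name: "{{ item.0 }}"
    password: "{{ item.1 }}"
    domain: default
  with_together:
    - "{{ userid }}"
    - "{{ passwdss.stdout_lines }}"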
