Run ssh commands on VxWorks with Ansible

I have some old equipment that I manage and it is running VxWorks. I am able to ssh into it and run commands, but the commands and/or the prompts are not standard. I would like to use Ansible to automate some of the tasks that I do. I am not sure what module to use for this.
Is there a way to just ssh into a box and start running commands with Ansible for non Linux boxes?
Is there a way to download files via scp/sftp with Ansible for non Linux boxes?
How can I get the raw output? The commands I run are generally show commands, and I need to see the output of the commands.

Is there a way to just ssh into a box and start running commands with Ansible for non Linux boxes?
That's what Ansible's raw module is for: it's a minimal wrapper for ssh <somehost> <somecommand>.
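For example, a minimal sketch, assuming the device is reachable in your inventory as vxworks1 and understands a show version command (both are placeholders):

- hosts: vxworks1
  gather_facts: false
  tasks:
    - name: run a device command over plain ssh
      raw: show version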
Is there a way to download files via scp/sftp with Ansible for non Linux boxes?
If the remote system supports scp or sftp, your Ansible playbooks can just run the appropriate scp/sftp command on your local system. E.g.,
- hosts: localhost
  tasks:
    - name: copy a file from the remote system
      command: scp myserver:somefile.txt .
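If you need several files, the same idea works with a loop; a sketch with hypothetical file names:

- hosts: localhost
  tasks:
    - name: copy several files from the remote system
      command: scp myserver:{{ item }} .
      loop:
        - somefile.txt
        - otherfile.txt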
How can I get the raw output? The commands I run are generally show commands, and I need to see the output of the commands.
When you run a command in a playbook, you can register the result, and for raw tasks that registered variable will have stdout and stderr attributes containing the output from the command. For example:
- hosts: myserver
  gather_facts: false
  tasks:
    - name: demonstrate the raw module
      raw: date
      register: date
    - debug:
        var: date.stdout
The output from that playbook will include:
TASK [demonstrate the raw module] ************************************************************************************************************************************************************
changed: [myserver]
TASK [debug] *********************************************************************************************************************************************************************************
ok: [myserver] => {
"date.stdout": "Fri Dec 18 09:09:34 PM EST 2020\r\n"
}
The gather_facts: false part is critical, because that prevents Ansible from trying to implicitly run the setup module on your target host.
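Since you are mostly running show commands, the registered variable's stdout_lines attribute is often easier to read than stdout. A sketch along the same lines, where show version is a placeholder for whatever your device accepts:

- hosts: myserver
  gather_facts: false
  tasks:
    - name: run a show command and keep its output
      raw: show version
      register: show_out
    - debug:
        var: show_out.stdout_lines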

Related

Is there a way to run shell script which prompts for input values from Ansible playbook?

I have an Ansible playbook which calls an existing shell script. When triggered standalone, the shell script prompts for some user input. I want the same functionality from the Ansible playbook as well (call the shell script, with prompting, from the Ansible playbook).
I tried the shell/command/raw options in the Ansible playbook (with no luck).
- hosts: localhost
  gather_facts: false
  become: true
  become_user: oracle
  become_flags: 'content-ansible'
  pre_tasks:
    - include_vars: vars.yml
  tasks:
    - name: Do Create Users....
      shell: cd "{{v_dir}}" && yes | sh script.sh
Ansible does not give you access to interactive commands.
You have to duplicate the user interaction: first ask for the input with prompts, and second feed the values to your interactive program with expect.
But this is not the Ansible way of life, because it is not reproducible. The main reason to use Ansible is to create idempotent jobs, which do the same thing every time. If you ask for user input, the job depends on the input, and this means it may do different things each time it is called.
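If you really must keep the interactive script as-is, here is a hedged sketch of the two pieces described above: vars_prompt asks the operator for the value, and the expect module (which needs pexpect installed on the target) feeds it to the script. The prompt regex and variable names are assumptions, with v_dir taken from your vars.yml:

- hosts: localhost
  gather_facts: false
  vars_prompt:
    - name: answer
      prompt: "Value to pass to script.sh"
      private: false
  tasks:
    - name: run the interactive script and answer its prompt
      expect:
        chdir: "{{ v_dir }}"
        command: sh script.sh
        responses:
          'Enter value': "{{ answer }}"   # adjust the regex to the real prompt text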

How to execute a dynamic "top" command on a remote server with Ansible

I have to execute a "top" command on a remote server with Ansible Playbook.
But when I run the playbook, the task does not complete successfully.
Playbook:
---
- name: CPU load
  hosts: all
  become: yes
  gather_facts: false
  tasks:
    - name: CPU load
      command: top
      register: cpu_result
    - debug:
        var: cpu_result.changed
P.S. "mpstat" command works right (with cpu.result.stdout_lines)
top by default runs in interactive mode and periodically updates the values displayed in terminal. You cannot get this functionality using Ansible, if this is what you meant by "dynamic".
Instead you can run it (see the sketch after this list):
- in batch mode (top -b -n 1) in the GNU version of top, or
- in logging mode (top -l 1) on other Unix flavours.
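For example, a sketch of the batch-mode variant, registered and printed; it assumes GNU top on the target:

---
- name: CPU load snapshot
  hosts: all
  become: yes
  gather_facts: false
  tasks:
    - name: one snapshot of top in batch mode
      command: top -b -n 1
      register: cpu_result
    - debug:
        var: cpu_result.stdout_lines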

Running a command in an ansible-playbook from the ansible host using variables from the current ansible process

Having hit a brick wall with troubleshooting why one shell script is hanging when I'm trying to run it via Ansible on the remote host, I've discovered that if I run it in an ssh session from the ansible host it executes successfully.
I now want to build that into a playbook as follows:
- name: Run script
  local_action: shell ssh $TARGET "/home/ansibler/script.sh"
I just need to know how to access the $TARGET that this playbook is running on from the selected/limited inventory so I can concatenate it into that local_action.
Is there an easy way to access that?
Try with ansible_host:
- name: Run script
  local_action: 'shell ssh {{ ansible_host }} "/home/ansibler/script.sh"'
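An equivalent form that some find easier to read uses delegate_to instead of local_action; this is just a sketch of the same idea:

- name: Run script
  delegate_to: localhost
  shell: ssh {{ ansible_host }} "/home/ansibler/script.sh"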

Missing become password in ansible playbook

I am trying to create a playbook for deployment with a simple scenario: log in to the server and clone/update an open GitHub repo.
All access parameters are written in ~/.ssh/config.
Here are my files:
hosts
[staging]
staging
deploy.yml
- hosts: staging
  tasks:
    - name: Update code
      git: repo=https://github.com/travis-ci-examples/php.git dest=hello_ansible
When I try to run ansible-playbook -s deploy.yml -i hosts, it outputs an error like this:
GATHERING FACTS ***************************************************************
fatal: [staging] => Missing become password
TASK: [Update code] ***********************************************************
FATAL: no hosts matched or all hosts have already failed -- aborting
I have tried to add sudo: False and become: False, but it does not seem to have any effect. I assume this operation should not request a sudo password, as I am trying to work with files in the ssh user's home directory.
I am sorry if my question is a bit lame, but I do not have much experience with Ansible.
It is asking for the sudo password because you are using the -s option. It seems like you do not want to use sudo for this task, so try running the command without -s:
ansible-playbook deploy.yml -i hosts
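If something else (for example a group_vars setting) keeps turning privilege escalation on, you can also switch it off explicitly in the play; a sketch based on the playbook above:

- hosts: staging
  become: false
  tasks:
    - name: Update code
      git:
        repo: https://github.com/travis-ci-examples/php.git
        dest: hello_ansible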

Using ansible to launch a long running process on a remote host

I'm new to Ansible. I'm trying to start a process on a remote host using a very simple Ansible Playbook.
Here is what my playbook looks like:
- hosts: somehost
  gather_facts: no
  user: ubuntu
  tasks:
    - name: change directory and run jetty server
      shell: cd /home/ubuntu/code; nohup ./run.sh
      async: 45
run.sh calls a java server process with a few parameters.
My understanding was that by using async, my process on the remote machine would continue to run even after the playbook had completed (which should happen after around 45 seconds).
However, as soon as my playbook exits, the process started by run.sh on the remote host terminates as well.
Can anyone explain what's going on and what I am missing here?
Thanks.
I have an Ansible playbook to deploy my Play application. I use the shell's command substitution to achieve this, and it does the trick for me. I think this is because command substitution spawns a new sub-shell instance to execute the command.
- hosts: somehost
  gather_facts: no
  user: ubuntu
  tasks:
    - name: change directory and run jetty server
      shell: dummy=$(nohup ./run.sh &) chdir=/home/ubuntu/code
Give async a longer timeout, say six months or a year or even more, and add poll: 0; this should be fine.
Or convert this process to an init script and use the service module.
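Putting those two hints together, a sketch of the fire-and-forget pattern; the timeout value is arbitrary:

- hosts: somehost
  gather_facts: no
  user: ubuntu
  tasks:
    - name: run jetty server without waiting for it
      shell: ./run.sh
      args:
        chdir: /home/ubuntu/code
      async: 2592000   # generous upper bound, roughly 30 days in seconds
      poll: 0          # fire and forget; Ansible does not wait for completion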
I'd concur. Since it's long running, I'd call it a service and run it like so. Just create an init.d script, push that out with a 'copy' then run the service.
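A hedged sketch of that approach; the init script and service name are placeholders:

- hosts: somehost
  become: yes
  gather_facts: no
  tasks:
    - name: install the init script
      copy:
        src: files/jetty-init.sh   # hypothetical local init script
        dest: /etc/init.d/jetty
        mode: '0755'
    - name: start the service
      service:
        name: jetty
        state: started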
