Currently, I'm using a shell script to download the files from an FTP server. Ansible executes my script and then continues with the other automated jobs.
Please let me know the best way to do this in an Ansible playbook using "get_url" instead of "shell". The following syntax works only for downloading a single file, but my requirement is to download multiple files and directories.
I appreciate your help.
- name: FTP Download
get_url: url=ftp://username:password@ftp.server.com/2016/03/value/myfile dest=/home/user/03/myfile1
register: get_url_result
According to the get_url documentation, and as far as I know, get_url does not support recursive download.
One possibility, as @helloV suggested, is to loop through a list with with_items. But this would require a static list of files, or obtaining that list somehow, probably with wget.
Consequently, you could simply use wget -m directly to recursively download all files in one task, as sketched below. See How to recursively download a folder via FTP on Linux.
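A minimal sketch of such a task, assuming wget is installed on the managed host; the URL and destination reuse the placeholders from the question:
- name: Recursively mirror the FTP directory with wget
  command: wget -m ftp://username:password@ftp.server.com/2016/03/value/ -P /home/user/03/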
Use a list for the urls (or a dict of url and dest pairs) and then loop through it using with_items. Below is an example for a few URLs and destination files.
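A minimal sketch of that loop, reusing the credentials and paths from the question; the second file name is invented for illustration:
- name: FTP Download
  get_url:
    url: "{{ item.url }}"
    dest: "{{ item.dest }}"
  with_items:
    - { url: "ftp://username:password@ftp.server.com/2016/03/value/myfile", dest: "/home/user/03/myfile1" }
    - { url: "ftp://username:password@ftp.server.com/2016/03/value/myfile2", dest: "/home/user/03/myfile2" }
  register: get_url_result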
I am investigating an issue with Ansible and want to find out whether a registered variable is saved in the Ansible tmp folder, because I suspect that this temporary directory is being removed by the async_wrapper module during execution (the environment is based on Ansible 2.2, and there is a known issue with the async_wrapper module).
Therefore I would like to know what kinds of items are expected to be saved in the Ansible tmp folder, such as .ansible/tmp/ansible-tmp-xxx..., during execution of a task. Then at least it would be possible to make some further estimates.
Use:
export ANSIBLE_KEEP_REMOTE_FILES=1
This will retain the files that Ansible copies to .ansible/tmp/ansible-tmp-xxx... and runs on the destination host.
Set the env variable before running the playbook with -vvv, for example as shown below. This will output the paths used to store the scripts on the destination host.
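A sketch of the invocation, assuming a playbook called site.yml (the name is a placeholder):
ANSIBLE_KEEP_REMOTE_FILES=1 ansible-playbook -vvv site.yml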
After the playbook has completed, SSH onto the destination host and take a look at the files.
The files are most likely on a path like:
/home/user/.ansible/tmp/..../modulename
The easiest way to view/test them is to explode them and then execute them.
python /home/user/.ansible/tmp/..../modulename explode
This will create a subdirectory containing the module, arguments and ansible wrapper.
python /home/user/.ansible/tmp/..../modulename execute
This will run the exploded files.
You will be able to see from this exactly what is being saved where. It's also possible to edit the module and test to see what changes are made to the /tmp folder.
I am writing an Ansible role that installs and updates some specific enterprise software. I would like to compare the installed version (if it is installed) to the one I am trying to install, for various reasons, but mainly to be able to verify that installation is necessary and allowed before actually executing the installer. Both the installer package and the installation contain an INI file that holds component versions as options (component_name=version).
What is the proper way in Ansible to read some option(s) from an INI file on a remote node? As far as I understand:
The ini_file module is meant for modifying the target file, which is not what I want to do.
The ini lookup is meant for files on the controller, not on remote nodes.
I can see two possibilities here:
Use the fetch module to get the file from the remote node to the controller machine, then use the ini lookup.
Use the command or shell module, parse the INI file using grep/sed/awk, and register the output.
The first option seems unnecessarily clumsy (although I do realize I may be thinking about it in the wrong way). The second one seems a bit clumsy from another point of view (yet another INI-file parsing method), but I may be wrong here too. Right now I am leaning towards the latter, but I can't help thinking that there must be an easier and more elegant way.
Seems like a use case for facts.d.
Write a shell or Python script that inspects those INI files and dumps the required fields as a JSON object to stdout.
Place this script into /etc/ansible/facts.d/custom_soft.fact and make it executable; a sketch follows below.
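A minimal sketch of such a .fact script, assuming the INI file lives at /etc/custom_soft/config.ini and contains a line like component_ver=5 (the path and key name are placeholders):
#!/bin/sh
# /etc/ansible/facts.d/custom_soft.fact: print the required fields as a JSON object to stdout
ver=$(awk -F= '$1 == "component_ver" {print $2}' /etc/custom_soft/config.ini)
echo "{\"component_ver\": \"$ver\"}"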
Then you can use these facts as follows:
- shell: install_custom_soft.sh
  when: ansible_local.custom_soft.component_ver | int > 4
If your INI files are very simple, you may do the job even without a script; just make a link like this:
ln -s /etc/custom_soft/config.ini /etc/ansible/facts.d/custom_soft.fact
and all config.ini keys will be available to Ansible via the ansible_local.custom_soft variable.
P.S. Despite the name "local facts", this should be done on the remote machine.
There are multiple folders with subfolders and image files on the FTP server. The -R option is disabled. I need to dump the recursive directory listing, with the path names, into a text file. The logic I have so far is: traverse into each folder, check whether the entry name contains a '.' to decide whether it is a file or a folder, and if it is a folder, go in and check for subfolders or files and list them. Since I cannot use -R, I have to use a function to traverse each folder.
#!/bin/sh
ftp_host='1.1.1.1'
userName='uName'
ftp -in <<EOF
open $ftp_host
user $userName
recurList() {
path=`pwd`
level=()
for entry in `ls`
do
`cwd`
close
bye
EOF
I am stuck with the argument for the for loop!
Sorry to see you didn't get any replies yet. I think the reason may be that Bash isn't a good way to solve this problem, since it requires interacting with the FTP client, i.e. sending commands and reading responses. Bash is no good at that sort of thing. So there is no easy answer other than "don't use Bash".
I suggest you look at two other tools.
Firstly, you may be able to get the information you want using http://curlftpfs.sourceforge.net/. If you mount the FTP server using curlftpfs, then you can use the find command to dump the directory structure. This is the easiest option... if it works!
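A minimal sketch of that approach, assuming curlftpfs and FUSE are installed; the host, credentials and mount point are placeholders taken from the question:
mkdir -p /mnt/ftp
curlftpfs ftp://uName:password@1.1.1.1 /mnt/ftp
find /mnt/ftp > ftp_listing.txt    # dump the full recursive directory listing to a text file
fusermount -u /mnt/ftp             # unmount when done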
Alternatively, you could write a program using Python with the ftplib module: https://docs.python.org/2/library/ftplib.html. The module allows you to interact with the FTP server through API calls.
I have to download all log files from a virtual directory within a site. Access to the virtual directory is forbidden, but the files themselves are accessible.
I have manually entered the file names to download:
dir="Mar"
for ((i=1;i<100;i++)); do
wget http://sz.dsyn.com/2014/$dir/log_$i.txt
done
The problem is that the script is not generic, and most of the time I need to find out how many files there are and tweak the for loop. Is there a way to get wget to fetch all the files without my having to specify the exact count?
Note:
If I use the browser to view http://sz.dsyn.com/2014/$dir, I get 403 Forbidden, so I can't pull all the files via a browser tool/extension.
First of all, check this similar question. If this is not what you are looking for, you need to generate a file of URLs and feed it to wget, e.g.:
wget --input-file=http://sz.dsyn.com/2014/$dir/filelist.txt
wget will have the same problem your browser has: it cannot read the directory listing. Just pull files until the first failure, then quit, as sketched below.
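A minimal sketch of that approach, reusing the URL pattern from the question:
# keep fetching log_1.txt, log_2.txt, ... until the first download fails
dir="Mar"
i=1
while wget -q "http://sz.dsyn.com/2014/$dir/log_$i.txt"; do
    i=$((i+1))
done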
I have two Unix servers between which I need to FTP some files.
The directory structure is almost the same, with a slight difference, like:
server a                   server b
miabc/v11_0/a/b/c/*.c      miabc/v75_0/a/b/c/
miabc/v11_0/xy/*.h         miabc/v11_0/xy/
There are many modules:
miabc
mfabc
The directory structure inside them is the same on both servers, except for the 11_0 vs 75_0 part, and the directory structure inside different modules is different.
How can I FTP all the files in all modules into the corresponding module on the second server (server b) using a scripting language such as awk, Perl, shell, or ksh?
I'd say if you want to go with Perl, you have to use Net::FTP.
Once, I needed a script that diffs a directory/file structure on an FTP server against a corresponding directory/file structure on a local hard disk, which led me to write this script. I don't know if it is efficient or elegant, but you might find one or two ideas in it.
hth / Rene
Note that you need to use the correct path of the directory where you want to send the files.
You can create a small script with PHP.
PHP provides good FTP functions, so you can easily FTP your files with it. But before that, check your FTP settings on the IIS server or in FileZilla.
I have used the following code (in PHP) for sending files over FTP:
$conn_id = ftp_connect($FTP_HOST) or die("Couldn't connect to ".$FTP_HOST);
$login_result = ftp_login($conn_id, $FTP_USER, $FTP_PW);
ftp_fput($conn_id, $from, $files, $mode); // this is the function that puts a file on the FTP server
This code is just for reference; go through the PHP manual before using it.
I'd use a combination of Expect, lftp and a recursive function to walk the directory structure.
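A minimal sketch of the lftp part, assuming password authentication; the host, credentials and paths are placeholders:
# push the local v11_0 tree into the remote v75_0 directory
lftp -u username,password -e "mirror -R miabc/v11_0/ miabc/v75_0/; quit" serverb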
If the file system supports symlinking or hard linking, I would use a simple wget to mirror the FTP server. For one of them, while you're wgetting, just hack the directory v11_0 to point to 75_0; wget won't know the difference.
server a:
go to /project/servera
wget the whole thing. (this should place them all in /project/servera/miabc/v11_0)
server b:
go to /project/serverb
create a directory /project/serverb/miabc/75_0, link it to /project/servera/v11_0:
ln -s /project/serverb/miabc/75_0 /project/servera/v11_0
wget server b; the link will be followed, so when wget tries to cwd into 75_0 it will find itself in /project/servera/v11_0.
Don't make the project harder than it needs to be: read the docs on wget and ln. If wget doesn't follow symbolic links, file a bug report, and use a hard link if your FS supports it.
It sounds like you really want rsync instead. I'd try to avoid any programming in solving this problem.
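A minimal sketch, assuming SSH access between the two servers; the host name and paths are placeholders based on the question:
# push the local v11_0 tree into v75_0 on server b
rsync -av /project/servera/miabc/v11_0/ user@serverb:/project/serverb/miabc/v75_0/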
I suggest you log in on either server first and go to the appropriate path, miabc/v75_0/a/b/c/. From there you need to sftp to the other server:
sftp user@servername
Go to the appropriate path from which the files need to be transferred.
Run the command mget *.
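A minimal sketch of that session, assuming you are on server b pulling from server a (host names and paths are placeholders):
cd /path/to/miabc/v75_0/a/b/c/      # destination directory on server b
sftp user@servera
sftp> cd miabc/v11_0/a/b/c/
sftp> mget *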