Downloading FTP files in Q (KDB) - ftp

Does anybody know if there is any way to download files from an FTP server directly from Q (kdb)? I know that it's possible to use HTTP, but I haven't seen any examples of using FTP. It seems the only way is to write a wrapper around something like curl, but maybe that has already been done? Any thoughts?

Why not either:
Write a script to fetch the file, then start the q processing.
Use a system command to call any Linux/DOS commands you want, then use the kdb key command to check that the files exist as expected.

Use a system call to curl without a file destination -- its default destination is stdout, so the file contents will be returned to q as the return value of system:
data:system"curl ftp://wherever/whatever"

On Linux, you can simply run any curl command or other system command from q. For example, I used the following:
system "curl --proxy my_proxy_details ftp://ftp.microsoft.com/developr/visual_c/README.TXT -o README.txt"
-> the -o option gives a name to the downloaded file.
Similarly, you can run other curl commands or other system commands to get FTP files into q.
This site has good curl examples:
http://www.cyberciti.biz/faq/curl-download-file-example-under-linux-unix/

Related

How to get file count over ftp server

As we all know, the "wc -l" command is used to get the number of lines in a file, but I am not able to use the same command over an FTP server.
Please help me out with this.
You can't run arbitrary shell commands over FTP; you can only run the commands the FTP server accepts.
I'm not aware of any common FTP commands to get the line count of a file. You may want to use SSH instead if possible. Otherwise, you would have to download the file to get the line count.
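For example, a minimal sketch of both alternatives, assuming SSH access is available and using placeholder host and path names:
ssh user@ftphost "wc -l < /path/to/file.txt"   # run the count remotely over SSH
curl -s "ftp://user:password@ftphost/path/to/file.txt" | wc -l   # or download over FTP and count locally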

How can I send an HTTPS request from a file?

Let's assume I have a file request.txt that looks like:
GET / HTTP/1.0
Some_header: value
text=blah
I tried:
cat request.txt | openssl s_client -connect server.com:443
Unfortunately it didn't work and I need to manually copy & paste the file contents. How can I do it within a script?
cat is not suited to downloading remote files; it's best used for files local to the file system running the script. To download a remote file, there are other commands that handle this better.
If your environment has wget installed, you can download the file by URL. That would look like:
wget https://server.com/request.txt
If your environment has curl installed, you can download the file by URL. That would look like:
curl -O https://server.com/request.txt
Please note that if you want to store the response in a variable for further processing, you can do that as well with a bit more work.
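For example, a small sketch of that (same placeholder URL as above):
response=$(curl -sS https://server.com/request.txt)   # capture the response body in a shell variable
printf '%s\n' "$response" | wc -l   # then process it further, e.g. count its lines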
Also worth noting: if you really must use cat to fetch a remote file it's possible, but it typically requires ssh, and I'm not a fan of that method since it needs SSH access to a file that is already publicly available over HTTP/S. I can't think of a practical reason to go about it this way, but for the sake of completeness I wanted to mention that it can be done, though it probably shouldn't be.
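A hedged sketch of that ssh-based variant, with placeholder host and path names:
ssh user@server.com cat /remote/path/request.txt > request.txt   # stream the remote file over SSH into a local copy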

copy file using an URL from command line

I have a batch script that collects some data and uploads it to other servers, using xcopy in a Windows 7 command line. I want the script to also collect some files that are on SharePoint, so I need to get them using a URL, and I need to log in.
xcopy can't do the job, but are there other programs that can?
Theoretically, you can bend cURL to download a file from a SharePoint site. If the site is publicly available, it's all very simple. If not, you'll have to authenticate first, and that might be a problem.
wget for windows maybe? http://gnuwin32.sourceforge.net/packages/wget.htm
The login part can be done using cURL, supplying the user name and password as POST arguments. You can supply POST args using the -d or --data flag. Once you are logged in (and have the required permissions), you can fetch the required file and then simply transfer it using xcopy, as you are already doing for the local files.
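A rough sketch of that flow, assuming a simple form-based login (the URLs, form field names, and cookie handling are placeholders; many SharePoint sites use NTLM or other Windows authentication instead):
curl -c cookies.txt -d "username=me&password=secret" https://sharepoint.example.com/login
curl -b cookies.txt -O https://sharepoint.example.com/sites/team/docs/report.xlsx
The first call saves the session cookie to cookies.txt; the second reuses it to download the file, which you can then hand to xcopy as before.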

How to resume an ftp download at any point? (shell script, wget option)?

I want to download a huge file from an ftp server in chunks of 50-100MB each. At each point, I want to be able to set the "starting" point and the length of the chunk I want. I won't have the "previous" chunks saved locally (i.e. I can't ask the program to "resume" the download).
What is the best way of going about that? I use wget mostly, but would something else be better?
I'm really interested in a pre-built/in-built function rather than using a library for this purpose... Since wget/ftp (also, I think) allow resumption of downloads, I don't see why that would be a problem... (I can't figure it out from all the options, though!)
I don't want to keep the entire huge file on my end, just process it in chunks... FYI all - I'm having a look at "continue FTP download after reconnect", which seems interesting.
Use wget with the -c option. Extracted from the man page:
-c / --continue
Continue getting a partially-downloaded file. This is useful when you want to finish up a download started by a previous instance of Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget will assume that it is the first portion of the remote file, and will ask the server to continue the retrieval from an offset equal to the length of the local file.
For those who'd like to use command-line curl, here goes:
curl -u user:passwd -C - -o <partial_downloaded_file> ftp://<ftp_path>
(leave out -u user:passwd for anonymous access)
I'd recommend interfacing with libcurl from the language of your choice.

How can I ftp multiple files?

I have two Unix servers between which I need to FTP some files.
The directory structure is almost the same, except for a slight difference, like:
server a                  server b
miabc/v11_0/a/b/c/*.c     miabc/v75_0/a/b/c/
miabc/v11_0/xy/*.h        miabc/v11_0/xy/
There are many modules:
miabc
mfabc
The directory structure inside them is the same on both servers except for the 11_0 vs. 75_0, and the directory structure inside different modules is different.
How can I FTP all the files in every module to the corresponding module on the second server (server b), using any scripting language like awk, Perl, shell, or ksh with FTP?
I'd say if you want to go with Perl, you have to use Net::FTP.
Once, I needed a script that diffs a directory/file structure on an FTP server against a corresponding directory/file structure on a local hard disk, which led me to write this script. I don't know if it is efficient or elegant, but you might find one or another idea in it.
hth / Rene
Note that you need to use the correct path of the directory where you want to send the files.
You can create a small script with PHP; PHP provides good FTP functions, so you can easily FTP your files with it. Before that, though, check your FTP settings on the IIS server or FileZilla.
I have used the following PHP code for sending files over FTP:
$conn_id = ftp_connect($FTP_HOST) or die("Couldn't connect to ".$FTP_HOST);
$login_result = ftp_login($conn_id, $FTP_USER, $FTP_PW);
$fp = fopen($local_file, 'rb');                        // open the local file to upload
ftp_fput($conn_id, $remote_file, $fp, FTP_BINARY);     // this is the function that puts the file on the FTP server
ftp_close($conn_id);
This code is just for reference; go through the PHP manual before using it.
I'd use a combination of Expect, lftp and a recursive function to walk the directory structure.
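For the lftp part, a minimal sketch (host, credentials, and paths are placeholders; lftp's mirror -R recursively uploads a local tree, so an explicit recursive walk may not even be needed):
lftp -u user,password -e "mirror -R /local/miabc/v11_0 miabc/v75_0; quit" serverb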
If the file system supports symlinking or hardlinking, I would use a simple wget to mirror the FTP server. On one of them, when you're wgetting, just hack the directory v11_0 to point to 75_0; wget won't know the difference.
server a:
go to /project/servera
wget the whole thing (this should place them all in /project/servera/miabc/v11_0).
server b:
go to /project/serverb
create a directory /project/serverb/miabc/75_0, link it to /project/servera/v11_0:
ln -s /project/serverb/miabc/75_0 /project/servera/v11_0
wget server b; the link will be followed, and when wget tries to cwd into 75_0 it will find itself in /project/servera/v11_0.
Don't make the project harder than it needs to be: read the docs on wget, and ln. If wget doesn't follow symbolic links, file a bug report, and use a hard link if your FS supports it.
It sounds like you really want rsync instead. I'd try to avoid any programming in solving this problem.
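A minimal rsync sketch, assuming rsync and SSH access are available on both servers (user, host, and paths are placeholders):
rsync -avz miabc/v11_0/ user@serverb:miabc/v75_0/   # recursively copy the module, preserving permissions and timestamps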
I suggest you log in to either of the servers first and go to the appropriate path, miabc/v75_0/a/b/c/. From there you need to sftp to the other server.
sftp user@servername
Go to the appropriate path from which the files need to be transferred.
Run the command mget *
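Putting those steps together as a non-interactive sketch (host, user, and paths here are placeholders, and batch mode assumes key-based authentication):
sftp -b - user@servera <<'EOF'   # -b - reads the commands below from stdin
cd miabc/v11_0/a/b/c
get *
EOF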
