As we all know, the "wc -l" command is used to get the number of lines in a file, but I am not able to use the same command over an FTP server.
Please help me out with this.
You can't run arbitrary shell commands over FTP; you can only run the commands the FTP server accepts.
I'm not aware of any common FTP commands to get the line count of a file. You may want to use SSH instead if possible. Otherwise, you would have to download the file to get the line count.
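If downloading is acceptable, one minimal sketch uses curl, which streams the file to stdout by default, so you can count the lines without keeping a copy on disk (the host, credentials, and path here are placeholders):
# Stream the remote file and count its lines; nothing is saved locally.
curl -s ftp://user:password@ftp.example.com/path/to/file.txt | wc -l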
I was trying to download multiple files from our FTP server using this script:
mget cd\dir_here\subdir_here\sample*.txt
but it didn't work, so I tried changing to forward slashes:
mget cd/dir_here/subdir_here/sample*.txt
a message
Type set to A
appeared. What does that mean?
Type set to A
This means that you've told the FTP server you will be transferring by ASCII.
You almost never actually want ASCII; it is better to make a habit of using binary for all transfers. (ASCII breaks the chain of custody: ASCII transfers can modify file contents, such as changing newline characters, can mangle Unicode text, and are completely unsuited to binary file contents, while the speed benefits are trivial.)
You also want to make sure your FTP script issues its commands separately; it looks like you've put a cd together with an mget here...?
Example FTP Script:
user USERNAME PASSWORD
binary
lcd C:\Local\File\System\Folder
cd /root/dir_here/subdir_here
mget sample*.txt
quit
This would go in a script file, which is passed to the ftp command when you call it from the command line or from a batch file, e.g.:
ftp -n -i -v "-s:C:\Local\Path\To\FTP\Script.txt" SERVER_IP
Here -n suppresses automatic login (so the script's user command can supply the credentials), -i turns off per-file prompting during mget, and -s: specifies the script file.
There are multiple folders with subfolders and image files on the FTP server, and the server's -R (recursive listing) option is disabled. I need to dump the recursive directory listing, with path names, to a text file. The logic I have so far: traverse into each folder, checking each entry name for a '.' to decide whether it is a file or a folder; if it is a folder, go in and check for subfolders or files and list them. Since I cannot use -R, I have to write a function that traverses each folder.
#!/bin/sh
ftp_host='1.1.1.1'
userName='uName'
ftp -in <<EOF
open $ftp_host
user $userName
recurList() {
path=`pwd`
level=()
for entry in `ls`
do
`cwd`
close
bye
EOF
I am stuck with the argument for the for loop!
Sorry to see you didn't get any replies yet. I think the reason may be that Bash isn't a good way to solve this problem, since it requires interacting with the FTP client, i.e. sending commands and reading responses. Bash is no good at that sort of thing. So there is no easy answer other than "don't use Bash".
I suggest you look at two other tools.
Firstly, you may be able to get the information you want using curlftpfs (http://curlftpfs.sourceforge.net/). If you mount the FTP server with curlftpfs, you can then use the find command to dump the directory structure. This is the easiest option... if it works!
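A rough sketch of that flow, assuming curlftpfs and FUSE are installed; the host, credentials, and mount point are placeholders:
mkdir -p /mnt/ftp
curlftpfs ftp://user:password@ftp.example.com /mnt/ftp
find /mnt/ftp > listing.txt    # recursive listing, one full path per line
fusermount -u /mnt/ftp         # unmount when done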
Alternatively, you could write a program in Python using the ftplib module (https://docs.python.org/2/library/ftplib.html), which lets you interact with the FTP server through API calls.
Does anybody know whether there is any way to download files from an FTP server directly from q (kdb+)? I know it's possible to use HTTP, but I haven't seen any examples that use FTP. It seems the only way is to write a wrapper around something like curl, but maybe that has already been done? Any thoughts?
Why not one of the following:
Write a script to fetch the file, then start the q processing.
Use a system command to call whatever Linux/DOS commands you want, then use kdb's key function to check that the files exist as expected.
Use a system call to curl without a file destination; curl's default destination is stdout, so the file contents will be returned to q as the return value of system:
data:system"curl ftp://wherever/whatever"
For Linux, you can simply run any curl or other system command from q. For example, I used the following:
system "curl --proxy my_proxy_details ftp://ftp.microsoft.com/developr/visual_c/README.TXT -o README.txt"
The -o option names the downloaded file.
Similarly, you can run other curl or system commands to fetch FTP files in q.
This site has good curl examples:
http://www.cyberciti.biz/faq/curl-download-file-example-under-linux-unix/
FTP Server - xxx.xx.xxx.xxx (Mainframe)
Unix
$> ftp xxx.xx.xxx.xxx
$> get filename
Problem
In our utility, we want to get the row count of a file.
It can be done by:
Getting the file onto a UNIX path using FTP.
Applying wc -l to that file.
But we have a few issues with this technique:
Space (the file size is > 100 GB).
It is resource- and time-consuming.
Is there an easy solution to get the row count of a file using FTP?
USS on the mainframe has the wc utility installed. Assuming you have to run your utility on a distributed system, you could use FTP to submit a JCL job that uses BPXBATCH to run wc against the USS file you're interested in. You could pipe the wc output to a file and then retrieve that file for use in your utility.
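A rough sketch of that flow, assuming the z/OS FTP server's JES interface is enabled. The host, credentials, and file names are placeholders, and wcjob.jcl is a hypothetical job whose BPXBATCH step runs wc -l against the USS file and writes the count to /u/userid/count.txt:
# Submit the job; with SITE FILETYPE=JES, put sends the JCL to JES for execution.
ftp -n mainframe.example.com <<'EOF'
user USERID PASSWORD
quote SITE FILETYPE=JES
put wcjob.jcl
quit
EOF
# Once the job has finished, fetch the small result file over normal FTP.
ftp -n mainframe.example.com <<'EOF'
user USERID PASSWORD
get /u/userid/count.txt count.txt
quit
EOF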
Suggestion:
Create a script or program that periodically checks the number of lines and writes it to a file, e.g. number_of_line.txt.
FTP the number_of_line.txt file and extract the information; see the sketch below.
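A minimal sketch of both halves, with hypothetical paths and credentials. The first part runs on the server (e.g. from cron); note that wc -l < file prints only the number, without the file name:
# Server side: refresh the count file periodically.
wc -l < /data/bigfile.txt > /data/number_of_line.txt
# Client side: fetch only the tiny count file instead of the 100 GB file.
ftp -n ftp.example.com <<'EOF'
user USERID PASSWORD
get /data/number_of_line.txt number_of_line.txt
quit
EOF
rows=$(cat number_of_line.txt)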
I have a list of SWF files that I want to download to my PC. Is there a quick way to do this from my server, e.g. using wget or something similar? The list has one URL per line:
http://super.xx-cdn.com/730_silvxxxen/v/v.swf
http://super.xx-cdn.com/730_sixxxxheen/73xxxxversheen.swf
http://super.xx-cdn.com/730_rxxxd/v/v.swf
There are thousands of lines.
If you use SSH via PuTTY to access your server, you could easily use WinSCP alongside PuTTY; otherwise you could also use pscp.
If you do not have PuTTY installed, get it and set up an SSH connection to your server.
Another easy way to download them is simply to get an FTP client and download them over FTP.
You can use a simple sh script, if I understand your question correctly:
#!/bin/sh
# Read one URL per line from urls.txt and fetch each with wget.
while IFS= read -r line
do
    wget "$line"   # quote the variable so special characters in URLs survive
done < "urls.txt"
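Alternatively, wget can read a list of URLs straight from a file, which avoids the loop entirely:
wget -i urls.txt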