FTP Server - xxx.xx.xxx.xxx (Mainframe)
Unix
$> ftp xxx.xx.xxx.xxx
$> get filename
Problem
In the utility, we want to get the row count of a file.
It can be done by:
Get the file onto a UNIX path using FTP.
Apply wc -l on that file.
But we have a few issues with the above technique:
Space issue (file size > 100 GB).
Resource- and time-consuming.
Is there any easy solution to get the row count of a file using FTP?
USS on the mainframe has the wc utility installed. Assuming you have to run your utility on a distributed system, you could use FTP to submit a JCL job that uses BPXBATCH to run wc on the USS file you're interested in. You could pipe the wc output to a file and then retrieve that file for use in your utility.
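A minimal sketch of that approach, assuming a hypothetical USS path (/u/myuser), job card, and host; real accounting fields, classes, and paths will differ. The script only generates the JCL and the FTP command file for submission through the JES interface:

```shell
#!/bin/sh
# Sketch only: build a BPXBATCH job that runs wc -l on the USS file and
# writes the count to a small USS file. Job card and paths are assumptions.
cat > wcjob.jcl <<'EOF'
//WCJOB   JOB (ACCT),'LINE COUNT',CLASS=A,MSGCLASS=X
//COUNT   EXEC PGM=BPXBATCH,PARM='SH wc -l /u/myuser/big > /u/myuser/cnt'
EOF

# FTP command file: switch the server to JES mode and submit the job.
cat > submit.ftp <<'EOF'
quote site filetype=jes
put wcjob.jcl
bye
EOF

# Then (not run here): ftp -s:submit.ftp xxx.xx.xxx.xxx
# Once the job finishes, fetch /u/myuser/cnt with a normal FTP get.
cat wcjob.jcl
```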
Suggestion:
Create a script or program that periodically checks the number of lines and writes it to a file, e.g. number_of_line.txt.
FTP the file number_of_line.txt and extract the information.
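A sketch of such a job, assuming hypothetical file names; a real setup would run this from cron against the actual data file instead of the demo data created here:

```shell
#!/bin/sh
# Record the line count of a large file into a small companion file that
# clients can FTP instead of the 100 GB original. Names are assumptions.
DATA=bigfile.dat
COUNT=number_of_line.txt

printf 'row1\nrow2\nrow3\n' > "$DATA"   # demo data; a real job already has the file

# tr strips the padding some wc implementations print before the number
wc -l < "$DATA" | tr -d ' ' > "$COUNT"
cat "$COUNT"
```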
Related
As we all know, the wc -l command is used to get the number of lines in a file, but I wasn't able to use the same command over an FTP server.
Please help me out with this.
You can't run arbitrary shell commands over FTP; you can only run the commands the FTP server accepts.
I'm not aware of any common FTP commands to get the line count of a file. You may want to use SSH instead if possible. Otherwise, you would have to download the file to get the line count.
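With SSH the count is computed server-side, so nothing large is transferred. The host and path below are placeholders; the underlying command is plain wc -l, demonstrated here on a local demo file:

```shell
#!/bin/sh
# Remote form (placeholder host/path, not run here):
#   ssh user@remotehost "wc -l < /data/bigfile.dat"
# Only the resulting count travels over the wire, not the file itself.

# Same command locally, on a demo file:
printf 'a\nb\nc\nd\n' > sample.txt
wc -l < sample.txt
```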
I'm trying to copy a zip file located on a server using an SSH2 library.
The way I'm about to do it is using the less command and writing the output down on the client side:
less -r -L -f zipfile
But the output file is bigger than the original.
I know this is not good practice, but I have to.
So how can I handle this to get my zip file onto the client machine?
Is less a mandatory command for doing that?
You can simply use scp to achieve that by providing the user and host, then the directory where the file should be copied from the server onto the local host, as in the example below:
scp your_username@remotehost.edu:foobar.txt /some/local/directory
Is there a way to loop through the files on an FTP server from MS-DOS? Thanks.
The technique I'm using is to have two FTP scripts. One I use to capture the output of a DIR command. Having captured the DIR, I parse that for the names of files to be downloaded. Then I use the second script with information from the first to download the files.
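A sketch of the parsing half of that technique, shown here as a Unix shell one-liner rather than a batch file. The Unix-style listing format is an assumption; real FTP servers vary, and some return DOS-style listings that need different field positions:

```shell
#!/bin/sh
# Given a captured DIR listing, extract the file names (last field) and
# emit one "get" command per file for the second FTP script.
cat > dirlist.txt <<'EOF'
-rw-r--r--   1 user group  1024 Jan 01 12:00 data1.csv
-rw-r--r--   1 user group  2048 Jan 02 12:00 data2.csv
EOF

awk '{print "get " $NF}' dirlist.txt > get_cmds.ftp
cat get_cmds.ftp
```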
I have a list of files inside a TXT file that I need to upload to my FTP. Is there any Windows Bat file or Linux shell script that will process it?
cat ftp_filelist | xargs --max-lines=1 ncftpput -u user -p pass ftp_host /remote/path
You can use the wput command.
The syntax is somewhat like this:
wput -i [name of the file.txt]
Go through this link:
http://wput.sourceforge.net/wput.1.html
It works on Linux. With this, it will upload all the URLs given in the text file to your FTP server one by one.
You may want to check out Jason Faulkner's batch script, about which he wrote here.
I want to automate the task of uploading a file to the FTP site "uploads.google.com".
How can I achieve this?
Please see here for an example.
One of the examples is depicted as follows:
Prepare a file (say ftp_cmd.txt) with all the FTP commands to upload the files to your specific site,
as below:
binary
cd
mput file.*
bye
Now create a batch file with the following content:
ftp -i -v -s:
ex: ftp -i -v -s:ftp_cmd.txt uploads.google.com
Now, when you execute this batch file, it will put all files matching file.* into the specified directory.
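Putting the steps above together as a Unix shell sketch: generate the command file, then invoke the ftp client against it. The directory name "incoming" is an assumption (the original example leaves the cd target blank), and the ftp call itself is left commented out since it needs a live server:

```shell
#!/bin/sh
# Generate the FTP command file described above. "incoming" is a
# hypothetical remote directory; substitute your own.
cat > ftp_cmd.txt <<'EOF'
binary
cd incoming
mput file.*
bye
EOF

# Not run here: ftp -i -v -s:ftp_cmd.txt uploads.google.com
cat ftp_cmd.txt
```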