I am using mget to retrieve files from a remote server to a local directory on Windows.
lcd C:\E920_1\autopkg\saveE1logafterDir\serverlog
mget /slot/ems2576/appmgr/jdedwards/e920/6210/log/jde_*.log
Now I wish to add an additional step: out of this list, retrieve only the files that contain the phrase "PACKAGE BUILD".
How do I accomplish it?
It's not possible. The FTP protocol has no API for finding files by their contents.
See also Search Within Files On Remote FTP Site.
So any implementation will have to download all the log files and search their contents locally.
In a batch file, you can use the findstr command for that:
Batch file to search a keyword in all files of a directory
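A minimal batch sketch (the local directory comes from your question; the log mask and the keep-only-matches step are assumptions, adjust to your setup):

rem Print the names of downloaded logs that contain the phrase (/m = file names only)
findstr /m /c:"PACKAGE BUILD" C:\E920_1\autopkg\saveE1logafterDir\serverlog\jde_*.log

rem Or delete the logs that do NOT contain the phrase, keeping only the matches
for %%f in (C:\E920_1\autopkg\saveE1logafterDir\serverlog\jde_*.log) do (
    findstr /m /c:"PACKAGE BUILD" "%%f" >nul || del "%%f"
)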
You may have a different way of accessing the server files. For example, if you have (SSH) shell access, you can search the files directly on the server. But that's a completely different topic.
I need to recursively find *.log files at C:\ and send them to my server using WinSCP. I've experimented with put, but it can only send files from a given directory. After that I tried using cmd's dir to get the list of required files and then send them using WinSCP, but I can't both open a connection AND send files: the cmd prompt changes to winscp> after I open the connection from cmd.
I'd appreciate any help.
Use the -filemask switch of the put command to upload only files matching a mask:
put -filemask=*.log C:\ /remote/path/
If you want to avoid "uploading" folders that contain no *.log files:
put -filemask=*.log -rawtransfersettings ExcludeEmptyDirectories=1 C:\ /remote/path/
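And if the problem is that you cannot script the open and the put together from cmd, you can pass both to winscp.com on a single command line. A sketch (host, credentials, and paths are placeholders; -hostkey=* blindly accepts any host key, so replace it with your server's fingerprint in anything real):

winscp.com /ini=nul /command ^
    "open sftp://user:password@example.com/ -hostkey=*" ^
    "put -filemask=*.log C:\ /remote/path/" ^
    "exit"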
I am transferring files through PSFTP to a 3rd-party server using batch files. While transferring, due to buffering issues, files end up broken or not fully transferred.
As a remedy, the 3rd party requested that we suffix each file name with '.new' before starting the transfer and remove the '.new' once the file has been transferred fully and successfully.
Please let me know the batch script commands to implement the above. Let me know if you need additional info.
To rename a file, use the mv command (or its ren alias):
put c:\local\path\file /remote/path/file.new
mv /remote/path/file.new /remote/path/file
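In a batch file, you can feed these commands to psftp via its -b switch. A sketch only (host, account, and paths are placeholders):

echo put c:\local\path\file /remote/path/file.new> psftp.txt
echo mv /remote/path/file.new /remote/path/file>> psftp.txt
echo quit>> psftp.txt
psftp user@example.com -b psftp.txt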
Though if you are transferring multiple files using a wildcard, this won't help you.
A relatively simple solution for multiple files is using a temporary upload folder. After the upload finishes, you can move all files at once to the target folder:
mput c:\local\path\* /temp/path
mv /temp/path/* /remote/path
For a similar discussion, see also SFTP file lock mechanism.
If you need the solution with the extension, you can use WinSCP, as it can automatically use a temporary file name for the upload. Though it uses the .filepart extension, not .new.
put -resumesupport=on c:\local\path\* /remote/path/
See WinSCP article on Uploading to temporary file name for more details.
The article also shows a (way more complicated) solution using the WinSCP .NET assembly that allows you to use even the .new extension.
If you choose to switch to WinSCP, there's a guide for converting a psftp script to WinSCP.
(I'm the author of WinSCP)
There are multiple folders with subfolders and image files on the FTP server, and -R is disabled. I need to dump the recursive directory listing, with path names, to a text file. My logic so far: traverse into each folder and check whether an entry name contains a '.' to decide if it is a file or a folder; if it's a folder, go in and check for subfolders or files and list them. Since I cannot use -R, I have to write a function to traverse each folder.
#!/bin/sh
ftp_host='1.1.1.1'
userName='uName'
ftp -in <<EOF
open $ftp_host
user $userName
recurList() {
    path=`pwd`
    level=()
    for entry in `ls`
    do
        `cwd`
close
bye
EOF
I am stuck with the argument for the for loop!
Sorry to see you didn't get any replies yet. I think the reason may be that Bash isn't a good way to solve this problem, since it requires interacting with the FTP client, i.e. sending commands and reading responses. Bash is no good at that sort of thing. So there is no easy answer other than "don't use Bash".
I suggest you look at two other tools.
Firstly, you may be able to get the information you want using http://curlftpfs.sourceforge.net/. If you mount the FTP server using curlftpfs, then you can use the find command to dump the directory structure. This is the easiest option... if it works!
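A sketch of that approach (host, credentials, and mount point are placeholders):

mkdir /mnt/ftp
curlftpfs ftp://user:password@ftp.example.com/ /mnt/ftp
find /mnt/ftp > listing.txt
fusermount -u /mnt/ftp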
Alternatively, you could write a program using Python with the ftplib module: https://docs.python.org/2/library/ftplib.html. The module allows you to interact with the FTP server through API calls.
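For example, a rough ftplib sketch (host and credentials are placeholders; the file-versus-directory test simply attempts cwd, since plain FTP has no portable way to ask):

#!/usr/bin/env python
from ftplib import FTP, error_perm

def recur_list(ftp, path, out):
    # NLST may return bare names or full paths, depending on the server
    for entry in ftp.nlst(path):
        if entry.rsplit('/', 1)[-1] in ('.', '..'):
            continue
        name = entry if entry.startswith('/') else path.rstrip('/') + '/' + entry
        try:
            ftp.cwd(name)               # succeeds only for a directory...
            recur_list(ftp, name, out)  # ...so descend into it
        except error_perm:
            out.write(name + '\n')      # a plain file: record its full path

ftp = FTP('ftp.example.com')
ftp.login('user', 'password')
with open('listing.txt', 'w') as out:
    recur_list(ftp, '/', out)
ftp.quit()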
Using FTP commands, I want to upload a large file once and then copy that file to many directories on the remote FTP server. All the copy commands seem to relate to copying from local to remote or the other way around.
Is there an FTP command to copy remote to remote?
Are you trying to move the file? If yes, you can do it using the rename command. As for copying, I guess you still have to do it the get/put way, via the local machine.
The move should be something like this:
rename /oldpath/file2move.txt /newpath/file2move.txt
Basically, you just rename the path in front of the name of the file that you wish to move.
As far as I know, there is no such command available in the FTP protocol. There are extensions to the SFTP protocol to do this (and, having SSH access, you can issue cp commands), but SFTP is not FTP.
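For illustration, with SSH shell access the copy can be as simple as (host and paths are placeholders):

ssh user@example.com "cp /remote/path/file.txt /remote/path/copy/"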
I'm trying to find a way to see if a file exists on an FTP site via DOS. I tried a get command on the file, hoping that if it didn't exist, it wouldn't be downloaded to my local directory. However, it seems that it still is, as an empty file. That doesn't work for me, because the file I'm looking for is itself just an empty trigger file, so I can't tell the difference.
I would like to dump a listing (ls) of the FTP directory to a text file on my local drive, so I try
ls > listing.txt
It creates the listing.txt file locally but it's always empty even though there are files on the ftp site.
What are my options with this?
I have used dir > listing.txt and ls > listing.txt and every time listing.txt is empty even though there are files in the directories I'm running those commands on.
Sorry if I didn't make this clear, but I'm trying to get the listing for an automated process and not simply for my visual when manually doing this.
Unless you're on FreeDOS, you're probably not using DOS. Perhaps you're using ftp.exe in the Windows console? If that's the case, don't use a normal file redirect. Instead, use the syntax for ls in the standard Windows ftp client:
ls [RemoteDirectory] [LocalFile]
So you can run ls . listing.txt to get a list of files in the current remote directory. The listing.txt file will appear in your user directory, e.g. C:\Users\user.
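To automate the check from a batch file, a sketch like this may work (host, credentials, and the trigger file name are placeholders):

rem Build an ftp.exe script; -n suppresses auto-login so the user command works
(
    echo open ftp.example.com
    echo user myuser mypassword
    echo ls . listing.txt
    echo bye
) > ftpcmds.txt
ftp -n -s:ftpcmds.txt

rem Search the local listing for the trigger file
findstr /c:"trigger.flg" listing.txt >nul && echo File exists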