I'm trying to find a way to see if a file exists on an FTP site via DOS. I tried a get command on the file, hoping that if it didn't exist it wouldn't be downloaded to my local directory. However, it seems that it still is, but as an empty file. This doesn't work for me because the file I'm looking for is just an empty trigger file, so I can't tell the difference.
I would like to dump a directory listing (ls) of the FTP directory to a text file on my local drive, so I tried:
ls > listing.txt
It creates the listing.txt file locally but it's always empty even though there are files on the ftp site.
What are my options with this?
I have used dir > listing.txt and ls > listing.txt and every time listing.txt is empty even though there are files in the directories I'm running those commands on.
Sorry if I didn't make this clear, but I'm trying to get the listing for an automated process and not simply for my visual when manually doing this.
Unless you're on FreeDOS, you're probably not using DOS. Perhaps you're using ftp.exe in the Windows console? If that's the case, don't use a normal file redirect. Instead, check the syntax for ls in the standard Windows FTP client:
ls [RemoteDirectory] [LocalFile]
So you can do a ls . listing.txt to get a list of files in the current remote directory. The listing.txt file will appear in your user directory, e.g. c:\Users\user.
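Since you need the listing for an automated process, you can put the FTP commands in a script file and feed it to ftp.exe with the -s option. A minimal sketch, assuming a script file named listftp.txt with a placeholder host and credentials:
open ftp.example.com
user myUser myPassword
ls . listing.txt
bye
Then run it non-interactively with:
ftp -i -n -s:listftp.txt
The -n flag suppresses auto-login so the user command in the script is used; listing.txt is written to ftp.exe's local working directory (change it with lcd inside the script if needed).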
I tried the ls command (and multiple others), but it's not working. How do I list the files in the directory and navigate around?
I'm attaching a file to explain what I mean:
I have tried this in my Cloud Shell and am able to list my files and navigate to the directories present. The ls command is used to list files and directories in Linux and Unix. You can try ls -a, which lists files and directories including hidden files that begin with a ".".
If you want to display files and directories that are in the VM, then you need to SSH into the respective VM instance and run ls -a there; it will display the files and directories along with hidden files. There is also the possibility that there are simply no directories within your Cloud Shell. Please refer to the images attached.
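For example, from Cloud Shell you can connect to a VM and list its files like this (the instance name and zone are placeholders for your own):
gcloud compute ssh my-instance --zone=us-central1-a
ls -a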
There are multiple folders with subfolders and image files on the FTP server, and the -R (recursive listing) option is disabled. I need to dump the recursive directory listing, with path names, to a text file. The logic I have so far: traverse into each folder and check each entry's name; if it contains a '.', treat it as a file, otherwise treat it as a folder, go in, check for subfolders or files, and list them. Since I cannot use -R, I have to write a function that traverses each folder.
#!/bin/sh
ftp_host='1.1.1.1'
userName='uName'
ftp -in <<EOF
open $ftp_host
user $userName
recurList() {
    path=`pwd`
    level=()
    for entry in `ls`
    do
        `cwd`
close
bye
EOF
I am stuck with the argument for the for loop!
Sorry to see you didn't get any replies yet. I think the reason may be that Bash isn't a good way to solve this problem, since it requires interacting with the FTP client, i.e. sending commands and reading responses. Bash is no good at that sort of thing. So there is no easy answer other than "don't use Bash".
I suggest you look at two other tools.
Firstly, you may be able to get the information you want using http://curlftpfs.sourceforge.net/. If you mount the FTP server using curlftpfs, then you can use the find command to dump the directory structure. This is the easiest option... if it works!
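A minimal sketch of that approach, assuming curlftpfs and FUSE are installed, with a placeholder mount point and credentials:
mkdir -p /mnt/ftp
curlftpfs 'ftp://user:password@1.1.1.1' /mnt/ftp   # mount the FTP server as a local filesystem
find /mnt/ftp > listing.txt                        # recursive listing, one full path per line
fusermount -u /mnt/ftp                             # unmount when done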
Alternatively, you could write a program using Python with the ftplib module: https://docs.python.org/2/library/ftplib.html. The module allows you to interact with the FTP server through API calls.
I am using ssh to work on a remote server; however, when I try to download a file using scp in this format:
scp name@website.com:somefile.zip ~/Desktop
It asks me for my password, and shows this:
somefile.zip 100% 6491 6.3KB/s 00:00
However, this file never appears on my desktop. Any help?
I think that you are logging into the remote machine using ssh and then running the command on the remote machine. You should actually run scp from your local machine, without logging into your remote server first.
You need to specify the file path
scp name@website.com:/path/to/somefile.zip ~/Desktop
~/Desktop should actually be a directory, not a file. I suggest that you do the following:
Remove the ~/Desktop file with rm ~/Desktop (or move it with mv if you want to keep its contents).
Create the directory with mkdir ~/Desktop.
Try again to scp the zip file, as in the combined sketch below.
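Putting the three steps together (a sketch using the same placeholder names as above):
rm ~/Desktop                  # remove the stray file (or mv it somewhere first to keep its contents)
mkdir ~/Desktop               # recreate Desktop as a directory
scp name@website.com:/path/to/somefile.zip ~/Desktop/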
BTW, when I need to copy files into directories, I usually put a slash after the directory to avoid such problems (in case I make a mistake), e.g. scp server:file ~/Desktop/; if the directory doesn't exist, I get an error instead of unwanted file creation.
You are doing this from a command line, and that command line has a working directory (on your local machine); this is the directory your file will be downloaded to. The final argument in your command is only the name you want the file to have. So first, change directory to where you want the file to land. I'm doing this from Git Bash on a Windows machine, so it looks like this:
cd C:\Users\myUserName\Downloads
Now that I have my working directory where I want the file to go:
scp -i 'c:\Users\myUserName\.ssh\AWSkeyfile.pem' ec2-user@xx.xxx.xxx.xxx:/home/ec2-user/IwantThisFile.tar IgotThisFile.tar
Or, in your case (that is, with the VERY strong password you must be using):
cd ~/Desktop
scp name@website.com:/path/to/somefile.zip somefile.zip
I'm trying to use rsync to transfer files from my main PC to my server. Once the files are transferred to my server, I want to be able to move the files around on my PC and not have rsync send them again when I rerun it.
I think I can do this by having rsync write out a log file with the names of the files it transfers. Then reference that same file as the exclude list.
I'm having trouble getting the format of the log file to be readable as an exclude list. It needs to only print out the file or folder names.
Here is the current command I'm running.
rsync -avz --exclude-from=Desktop/file.txt --log-file=Desktop/file.txt --log-file-format=%i Desktop/Source Desktop/Destination
What do I need to do to make the log file only output the name of the files or folders?
You could grab the list with find . > log.txt before running rsync, as in the sketch below.
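A minimal sketch of that idea, using the paths from your command and a placeholder list file named exclude.txt; note that the log file and the exclude list should be separate files, otherwise the log output feeds back into the excludes:
cd ~/Desktop
rsync -avz Source Destination            # initial transfer
find Source -type f > exclude.txt        # record what was sent, as paths like Source/sub/file
Then on later runs:
rsync -avz --exclude-from=exclude.txt Source Destination
rsync matches the slash-containing patterns in exclude.txt against each file's path relative to the transfer root, so everything recorded on the first run is skipped.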
I used SSH to connect to a server and navigate to the folder where I want to store some files from my Mac. I think what I need to do is use SCP to do the copy but I'm not sure exactly about the terminology in the command parameters. And so far everything I've tried gets some sort of "not found" error.
Before logging on to the server the prompt is :
Apples-MacBook-Pro-2:~ neiltayl$
After logging in and navigating to the folder I want to store things in it is :
[neiltayl@cs136 Tracer]$
I need to copy several files from the Tracer folder on my local computer to the Tracer folder on cs136 and cannot fathom the correct parts of the respective FROM and TO parts of SCP to make it work.
This is the nearest I got so far;
Apples-MacBook-Pro-2:~ neiltayl$ ls
Applications Downloads Music Tracer
Desktop Library Pictures c151
Documents Movies Public dwhelper
Apples-MacBook-Pro-2:~ neiltayl$ scp ./Tracer/*.* neiltayl@cs136.cs.iusb.edu:Tracer
neiltayl@cs136.cs.iusb.edu's password:
./Tracer/*.*: No such file or directory
The scp command is:
$ scp File1 username@something:DIRNAME
Here File1 is the file that you are sending over to the other computer, and DIRNAME is the path to the directory where you want the file to be stored.
In your case the command would be:
scp -r Tracer neiltayl@cs136:New_Tracer
Here Tracer is the folder that contains all the files that you want to copy.
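As an aside, the "No such file or directory" error most likely comes from the *.* glob: it only matches names containing a dot, and when nothing matches, the shell passes the literal pattern to scp, which then fails. If the Tracer directory already exists on the server and you only want the files inside it, a plain glob should work (a sketch, using the host name from your session):
scp ./Tracer/* neiltayl@cs136.cs.iusb.edu:Tracer/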