File created but no data uploaded via FTP in shell - shell

I am using the command
ftp -n -s:C:\FTP_cmd.txt ftp.madrecha.com
The FTP_cmd.txt file contains:
user
myName@domain.com
Pa$$Word
Put C:\AccessDocumentation.pptx
quit
The file is getting created on the server, but its size is 0 bytes. There is no data in the file. I tried uploading the same file with the same user using FileZilla. That was successful, and the file was created at 352 KB.
Is there an issue in the command, or is this a server-side issue?
PS: I tried running it from cmd (on Windows) and also from PowerShell (on Windows), but both resulted in the same issue.
Thanks in advance.
UPDATE: Attaching screenshot of the command run.

I don't have the reputation to comment at the moment, so I'm writing my guesses as an answer.
I think the "put" command has to be lowercase.
Additionally, you should check the file permissions: you may have write access to the FTP server but no permission to read the file you want to copy to the server.
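If neither of those helps, it may be worth adding a binary line to the command file: the Windows command-line ftp client defaults to ASCII transfer mode, which can corrupt binary files such as .pptx (this is a guess too, with the lowercase put applied as suggested above):
user
myName@domain.com
Pa$$Word
binary
put C:\AccessDocumentation.pptx
quit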

Related

FTP - automatically rename/move file on FTP server when it's downloaded

I need to make a script for an FTP server that will detect when a file is downloaded and then rename, move, or delete that file to prevent it from being re-downloaded. Is there a way to do this by storing a script on the FTP server that is always watching for when a file is downloaded and then executes that process? I assume it could be done with a bash script, but I don't know enough about them to know whether one can be constantly running and checking for downloads.
Thanks!
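One way to approach this is to watch the FTP server's transfer log rather than the files themselves. A minimal sketch, assuming a vsftpd server with xferlog-style transfer logging enabled; the log and directory paths are assumptions:
#!/bin/bash
# Watch the FTP transfer log and move each file out of the served
# directory once a completed download of it is recorded.
LOG=/var/log/xferlog          # assumed xferlog location
DONE_DIR=/srv/ftp/downloaded  # assumed holding area for served files

tail -F "$LOG" | while read -r line; do
    # In xferlog format, field 12 is the direction ("o" = outgoing
    # download) and the final field is "c" for a completed transfer.
    if echo "$line" | grep -q ' o .* c$'; then
        file=$(echo "$line" | awk '{print $9}')  # field 9 is the file path
        [ -f "$file" ] && mv "$file" "$DONE_DIR/"
    fi
done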

Transfer file in Secure Shell

I use Secure Shell as a Chrome extension (https://chrome.google.com/webstore/detail/secure-shell/pnhechapfaindjhompbnflcldabbghjo?hl=da)
Now I have finished some programming and I need the file on my computer.
How can I transfer the file to my computer? Can I send it by email or something?
I have tried yanking all the lines in vim, but I still don't get it copied to my Windows clipboard.
One entertaining (and moderately ridiculous) approach would be sprunge.
Run this on the remote machine:
cat myFile | curl -F 'sprunge=<-' http://sprunge.us
And then visit the URL it prints on your local machine. :D
I presume that you are using Windows and trying to download your file from a Linux-like OS.
You can use MobaXterm, which comes with file transfer features:
http://mobaxterm.mobatek.net
On a CLI you can use "scp" to download and upload; see the sketch below.
Another option is FileZilla using the SFTP protocol.
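For the scp route, a minimal sketch (the host and paths are placeholders):
# download: copy the remote file into the current local directory
scp me@myserver:/home/me/myFile .
# upload: copy a local file back to the remote home directory
scp myFile me@myserver:/home/me/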

503 RNFR command not understood

I'm using a (cheap branded) local media station as an FTP server, and I'm using FileZilla to transfer files to it.
When I try to move or rename a file located on the media station, I get:
Command: RNFR [filename]
Response: 503 Command not understood.
I don't know whether this is because of an old or broken FTP server version (the device is more than 5 years old, and I think there are no updates available).
Is there an alternative to perform FTP rename or move commands?
If you have telnet or SSH access to the machine, you could do the renaming there. If not, you might try the FTP SITE command with "mv from-name to-name", but I doubt that the server will support this if it does not even support the standard FTP way of renaming files.
Apart from that, the only alternative is probably to download the file, remove it on the server, and upload it again with a different name.
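If you want to try the SITE route from a command-line ftp client, quote passes a raw command through to the server; this is a sketch only, since SITE subcommands are entirely server-specific and "SITE mv" may not be understood at all:
quote SITE mv from-name to-name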

Copy files from authenticated Windows server to Unix server

I have a set of zip files that need to be copied from an authenticated Windows server to a Unix server, which is authenticated too.
I have tried using Pentaho but without success. Is there any other way this copy can be done, such as using scripts?
Thanks in advance.
Assuming your server supports SSH:
PuTTY comes with a utility called pscp which works the same as scp.
To copy a file you would typically do this:
pscp myfile.zip me@myserver:/my_directory/.
There is also WinSCP if you want something with more of a GUI.
Use the scp command. For more detail, visit http://www.garron.me/en/linux/scp-linux-mac-command-windows-copy-files-over-ssh.html
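Since the question mentions a set of zip files, a one-line pscp sketch for copying them all (host and path are placeholders; -pw supplies the password non-interactively, and pscp accepts wildcards):
rem copy every zip in the current folder to the Unix server
pscp -pw myPassword *.zip me@myserver:/my_directory/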

copy file using a URL from command line

I have a batch script that collects some data and uploads it to other servers using xcopy in a Windows 7 command line. I want that script to also collect some files that are on SharePoint, so I need to get them using a URL, and I need to log in first.
xcopy can't do the job, but are there other programs that can?
Theoretically, you can bend cURL to download a file from a SharePoint site. If the site is publicly available, it's all very simple. If not, you'll have to authenticate first, and this might be a problem.
wget for Windows, maybe? http://gnuwin32.sourceforge.net/packages/wget.htm
The login part can be done using cURL, supplying the user name and password as POST arguments. You can supply POST args using the -d or --data flag. Once you are logged in (and have the required permission), you can fetch the required file and then transfer it using xcopy as you are already doing for the local files.
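A rough sketch of that flow (the login URL and form field names are made up for illustration; many SharePoint setups use NTLM instead, in which case curl's --ntlm -u DOMAIN\user:password is the relevant option):
rem log in and store the session cookie (form fields are hypothetical)
curl -c cookies.txt -d "username=me&password=secret" http://sharepoint.example.com/login
rem reuse the stored cookie to download the file
curl -b cookies.txt -o report.xlsx http://sharepoint.example.com/docs/report.xlsx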
