Here's the problem:
First step: transfer the *.gz file to the remote host using FTP, with the code below:
open $IP
user nfc nfc123
bin
passive
cd /nfc/APPBAK
put $FULLNAME $DESTFILE
cd $DESTDIR
tar -zxvf $local_filename
quit
FTPIT
Second step
tar -zxvf $local_filename
but it says:
"?Invalid command. "
Should I change the mode of the *.gz file first? Any help will be appreciated.
You are trying to run the tar command inside FTP, as far as I can see, rather than in the shell after you've transferred the file with FTP. It is confusing because some shell commands, like cd, seem to work in FTP too, but FTP's cd actually changes the directory on the remote machine (you need lcd to change the directory on the local machine).
Put simply, tar isn't a valid FTP command, which is why you get the ?Invalid command error.
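For illustration, a minimal sketch of how the two steps could be split, assuming the commands are being fed to the ftp client as a here-document (which the FTPIT terminator suggests) and that the nfc account also has ssh access to the remote host so the extraction can run there:

# Step 1: upload the archive; tar has been removed from the ftp session
ftp -n <<FTPIT
open $IP
user nfc nfc123
bin
passive
cd /nfc/APPBAK
put $FULLNAME $DESTFILE
quit
FTPIT

# Step 2: extract on the remote host; ftp cannot run tar, so use ssh instead
ssh nfc@$IP "cd $DESTDIR && tar -zxvf $local_filename"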
Try this one:
tar -xvf $local_filename
Please make sure that the file has the right permissions.
Related
The Windows tar command works in cmder.
tar -zxvf D:\backup\a.tar.gz
But when I add it to a .bat file, it doesn't work. I tried these versions:
call tar -zxvf D:\backup\a.tar.gz
tar -zxvf D:\backup\a.tar.gz
call tar -zxvf ./a.tar.gz
tar -zxvf ./a.tar.gz
None of them worked, and I get this error:
'tar' is not recognized as an internal or external command,
operable program or batch file.
Call tar from its directory, e.g. if tar is in the same folder as your batch file:
"%~dp0tar" -zxvf "D:\backup\a.tar.gz"
Otherwise, I suggest you use the full path to tar, in quotes:
"C:\users\yourname\Desktop\tar" -zxvf "D:\backup\a.tar.gz"
I had the same problem. My issue was caused because I created a variable called path in my batch file, which overwrote the Windows PATH variable that contains the directory where tar is stored. Maybe you did the same thing.
This worked for me. First set the current path to the directory where the zip file is located, then run tar on the file.
cd /d "F:/some/path/to/zip" & tar -zf <file_name>.zip
Use relative paths, or if you need absolute paths, use forward slashes / instead of backslashes \ (e.g. D:/backup/a.tar.gz).
It only takes a second to try, and it is simply the solution that worked for me after many attempts.
I am using ssh to work on a remote server; however, when I try to download a file using scp in this format:
scp name@website.com:somefile.zip ~/Desktop
It asks me for my password, and shows this:
somefile.zip 100% 6491 6.3KB/s 00:00
However, the file never appears on my desktop. Any help would be appreciated.
I think that you are logging into the remote machine using ssh and then running the command on the remote machine. You should actually be running the command without logging into your remote server first.
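For illustration, the difference looks like this, using the host and file from the question:

# This runs scp on the remote server, so the copy never reaches your local Desktop:
ssh name@website.com
scp name@website.com:somefile.zip ~/Desktop

# Run scp from a terminal on your local machine instead:
scp name@website.com:somefile.zip ~/Desktop/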
You need to specify the file path
scp name@website.com:/path/to/somefile.zip ~/Desktop
~/Desktop should actually be a directory, not a file. I suggest that you do the following:
Remove the ~/Desktop file with rm ~/Desktop (or move it with mv if you want to keep its contents).
Create the directory with mkdir ~/Desktop.
Try again to scp the zip file.
BTW, when I need to copy files into directories, I usually put a slash after the directory to avoid such problems (in case I make a mistake), e.g. scp server:file ~/Desktop/; if the directory doesn't exist, I get an error instead of unwanted file creation.
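Put together, the recovery might look like this (a sketch; adjust the remote path to wherever somefile.zip actually lives):

rm ~/Desktop                                            # remove the file that was created by mistake
mkdir ~/Desktop                                         # recreate Desktop as a directory
scp name@website.com:/path/to/somefile.zip ~/Desktop/   # trailing slash: fails early if Desktop is not a directory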
You are doing this from a command line, and you have a working directory for that command line (on your local machine); this is the directory your file will be downloaded to. The final argument in your command is only what you want the name of the file to be. So, first, change directory to where you want the file to land. I'm doing this from git bash on a Windows machine, so it looks like this:
cd C:\Users\myUserName\Downloads
Now that I have my working directory where I want the file to go:
scp -i 'c:\Users\myUserName\.ssh\AWSkeyfile.pem' ec2-user@xx.xxx.xxx.xxx:/home/ec2-user/IwantThisFile.tar IgotThisFile.tar
Or, in your case (that is, with the VERY strong password you must be using):
cd ~/Desktop
scp name@website.com:/path/to/somefile.zip somefile.zip
I have a set of files in my FTP folder, and I only have FTP access. I want to rename the files with extension .txt to .done.
Ex:
1.txt, 2.txt, 3.txt
to
1.done, 2.done, 3.done
Only the rename command works in this FTP client. I am expecting something like
rename *.txt *.done
to rename them all in a single command.
In short: You can't.
FTP is very basic and does not support mass renaming. You can either write a small script for it, or download some helper software, such as the one here.
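As a sketch of the scripting route, you can generate one rename command per file and feed them all to the standard command-line ftp client in a single session (HOST, USER, PASS and the remote path are placeholders, and the file list is the example from the question; in practice you would list the directory first):

{
  echo "open HOST"
  echo "user USER PASS"
  echo "cd /path/to/folder"
  for f in 1.txt 2.txt 3.txt; do
    echo "rename $f ${f%.txt}.done"   # FTP's rename takes the old and new name
  done
  echo "quit"
} | ftp -n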
Hello to all,
Even though the question is quite old, I think it could be useful for others to read my suggestion.
I found a great and easy solution combining curlftpfs, "an FTP filesystem based on cURL and FUSE" as they define it, with rename, the Linux/Unix multi-rename tool.
I tested it on Linux Mint 17 (and I think it should work in other Debian-based distributions).
install curlftpfs
sudo apt-get install curlftpfs
create the mount folder
sudo mkdir /mnt/ftp_remote_root
mount the remote FTP server on that folder
sudo curlftpfs -o allow_other -o user="USERWITH#CHARACTERTOO:PASSWORDTOACCESSUSER" ftp://my_ftp_server.com /mnt/ftp_remote_root/
jump into the desired remote FTP folder
cd /mnt/ftp_remote_root/path/to/folder
rename the files as you need (-v shows the new names, -n shows which files would be affected without renaming them; omit -n to actually rename the files)
sudo rename -v -n 's/match.regexp/replace.regexp/' *.file.to.change
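For the specific .txt to .done case in the question, the call would be something like this (run inside the mounted folder; keep -n first as a dry run):

sudo rename -v -n 's/\.txt$/.done/' *.txt   # dry run: shows 1.txt becoming 1.done, etc.
sudo rename -v 's/\.txt$/.done/' *.txt      # drop -n to actually rename the files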
It could take a few seconds because it works over the network.
I think it is really powerful and easy to use.
Let me know if you find any problems.
Bye
Lorenzo
Try something like this. The following example moves/renames files on the FTP server:
for f in $(lftp -u 'username,password' -e 'set ssl:verify-certificate no; ls /TEST/src/*.csv; quit' ftp.acme.com | awk '{print $9;}'); do
  lftp -u 'username,password' -e "set ssl:verify-certificate no; mv /TEST/src/$f /TEST/dst/$f; quit" ftp.acme.com
done
note: use .netrc to store username and password.
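For reference, a ~/.netrc entry for that host would look something like this (the file should be readable only by you, e.g. chmod 600 ~/.netrc):

machine ftp.acme.com
login username
password password

With that in place, the credentials can typically be picked up automatically instead of being passed with -u on the command line.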
Use the following command:
ren *.txt *.done
I am trying to run a script from /var/www/backups/scripts, and when I try to tell it to zip up a file I get the error below.
I can confirm that /var/www is the home dir and that the commands work when run manually through PuTTY, but just not from the script.
I'm using the code below to run the zip:
#!/bin/bash
unset PATH
#USER VARS
HOMEDIR=~/
BACKUP_TARG_DIR=~/sites/backups/auto
BACKUP_TEMP_NAME=tempBackupFile.tar
BACKUP_TARG_FILE=/var/www/back
DATE=`/bin/date '+%Y-%m-%d'`
echo `/bin/pwd`;
tar -zcvf test.rar /var/www/backups/scripts/tryThis
#cd /var/www
#scp "tempBackupFile.tar" 217.41.51.14:~/testfile.rar;
#tar -zcvf $BACKUP_TEMP_NAME $BACKUP_TARG_FILE;
echo "SITE-"$DATE;
Below is the output I get:
/var/www/backups/scripts
./autoBackup.bash: line 18: tar: No such file or directory
SITE-2011-09-05
Anyone have any ideas? This is killing me; all I can think of is that it's something to do with where the bash script is being run from.
Why do you unset PATH? No wonder bash cannot execute tar.
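A sketch of the two straightforward fixes (the /bin/tar path is an assumption; check it with which tar in an interactive shell first):

# Option 1: instead of unset PATH, set PATH explicitly to the directories you trust
PATH=/bin:/usr/bin

# Option 2: call tar by its absolute path, as the script already does for date and pwd
/bin/tar -zcvf test.rar /var/www/backups/scripts/tryThis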
Check your /etc/ssh/sshd_config to make sure that you don't have a chroot directory set. If you do, you will need to create a bin directory in the chroot directory and either copy or link the necessary binaries into that directory.
Or you could always comment that line out in the config.
Either way, restart sshd and test.
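A rough sketch of that setup (the /var/chroot path is an assumption; use whatever ChrootDirectory is set to in your sshd_config):

mkdir -p /var/chroot/bin
cp /bin/tar /var/chroot/bin/    # or hard-link it; symlinks pointing outside the chroot will not resolve
ldd /bin/tar                    # tar is dynamically linked, so copy the listed libraries into the chroot too
service sshd restart            # restart sshd afterwards (the service may be named ssh on some distros)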
Is it possible to compress a folder and create a .zip on my server through a command in the terminal via FTP? Is there an archive command? Thanks.
Welcome to Stack Overflow.
What I believe you want to do is ssh onto your server and use the tar command.
tar -cf archive.tar contents/
Takes everything from contents/ and puts it into archive.tar
You can find more information here.
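If you specifically need a .zip rather than a .tar, and the zip utility is installed on the server (worth checking with which zip), the equivalent would be:

zip -r archive.zip contents/    # -r recurses into the contents/ folder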