wget+ftp: How to download a remote directory without recreating the remote tree

I'm trying to download a single directory from a remote ftp server. I'm using this command:
wget -nH -r -N -l inf --ask-password ftp://ftp.server.com/some/remote/dir/xyz -P dirName
I'd like the remote xyz directory to be copied and called dirName. A local directory called dirName is created, but it contains some/remote/dir/xyz, which is not what I wanted.

After a careful reading of the man page I found the --cut-dirs option which cuts parent directories from the local storage. That does what I need it to do!
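For reference, a sketch of the adjusted command using the path from the question: -nH already suppresses the host directory, and --cut-dirs=3 strips the three leading components some/remote/dir, so files land under dirName/xyz instead of dirName/some/remote/dir/xyz (use --cut-dirs=4 to drop xyz as well):
wget -nH --cut-dirs=3 -r -N -l inf --ask-password ftp://ftp.server.com/some/remote/dir/xyz -P dirName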

Related

Using (scp | rsync) to pull specific files while creating folder structure at same time?

I have a big project hosted on a server with specific files that I want to copy to my local machine in the same folder structure, but only the specific files I want. My current command to find these files is (run from inside the project on the server):
find ./ -type f -name '*_out.csv' ! -path './*/doc/*' 2>/dev/null
Which produces a list like this (truncated for brevity):
./validation/Riso_AN8/analysis/Riso_AN8_out.csv
./validation/FUMEXII_Regate/analysis/Regate_smeared_out.csv
./validation/FUMEXII_Regate/analysis/Regate_discrete_out.csv
./validation/IFA_432/analysis/rod3/IFA_432_rod3_out.csv
./validation/IFA_432/analysis/rod1/IFA_432_rod1_out.csv
./validation/IFA_432/analysis/rod2/IFA_432_rod2_out.csv
./validation/LOCA_REBEKA_cladding_burst_tests/analysis/rebeka_2d_06MPa/rebeka_singlerod_2d_06MPa_out.csv
./validation/LOCA_REBEKA_cladding_burst_tests/analysis/rebeka_2d_06MPa/rebeka_singlerod_2d_06MPa_tm_out.csv
./validation/LOCA_REBEKA_cladding_burst_tests/analysis/rebeka_2d_08MPa/rebeka_singlerod_2d_08MPa_tm_out.csv
I would like to use scp or rsync to pull these files to my local machine, creating the folder structure with nothing else in it. What would be the best way to go about this? I have a ton of files, so I don't really want to create the folder structure beforehand. I also can't pull the entire project from the server, because it's huge and the system admins will get mad at me.
Is there a way to pull these files while simultaneously creating the folder structure on my local machine?
I would encourage rsync, and I would run find on the server via ssh in a process substitution. You can then feed a while loop that reads each filename found on the server, extracts its directory path, and creates that path on the local machine below the current directory using mkdir -p (with validation). Then you can call rsync -uav to pull each file from the server into the correct directory. For example, you could do something similar to:
#!/bin/bash

server=${1:-yourserver}       ## your server name
basedir=${2:-/path/to/files}  ## the base directory to run find on the server

while read -r line; do                  ## read one filename at a time from the find output
    line="${line#./}"                   ## strip the leading ./ from the relative path
    dname="${line%/*}"                  ## separate the directory name
    mkdir -p "$dname" || {              ## create/validate the directory locally
        printf "error: unable to create '%s'.\n" "$dname" >&2
        continue
    }
    rsync -uav "$server:$basedir/$line" "$dname"   ## rsync file to the correct directory
done < <(ssh "$server" "cd '$basedir' && find . -type f -name '*_out.csv' ! -path './*/doc/*' 2>/dev/null")
Then just call the script on the local machine, providing the server name as the first argument and the base directory where the files are located on the server as the second. Make sure you first change to the directory on the local machine under which you want the remote directory structure created. This presumes your find call (executed on the server by ssh from the local machine) returns the list of files you wish to copy to your local machine.
This is not nearly as efficient as a single rsync call, but when find returns files nested several directory levels deep, those parent directories would not otherwise exist on the local machine, so you must ensure each path is created (the mkdir -p step above) before calling rsync on the remote file.
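For example, assuming the script above is saved as pull_csvs.sh (a hypothetical name) and made executable, an invocation might look like:
cd ~/projects/local_copy          # the local root to build the tree under (hypothetical path)
./pull_csvs.sh myserver /home/me/big_project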
I think you can do it with rsync's --include and --exclude options. Note that filter rules are applied in order and the first match wins, so you exclude the doc directories first, include every other directory so rsync can descend, include the files you want, and exclude everything else; --prune-empty-dirs keeps rsync from creating directories that end up with no matching files:
rsync --recursive --prune-empty-dirs --exclude 'doc/' --include '*/' --include '*_out.csv' --exclude '*' server:path/to/remote/dir path/to/local/dir

How to scp multiple files from remote to local in a folder other than ~/?

I'm trying to make a bash/expect script that takes input like user, host, password, and file names, and then copies the files from remote to local. From what I've read so far, scp'ing multiple files from remote to local works just fine when you're assuming the files are coming from ~/, e.g.:
scp -r user@host:"file1 file2 file3" .
would copy file1, file2, and file3 from ~/ into the current working directory. But I need to be able to pass in another directory as an argument, so my bash script looks like this (but doesn't work, as I'll explain):
eval spawn scp -oStrictHostKeyChecking=no -oCheckHostIP=no -r $user@$host:$dir/"$file1 $file2 $file3" ~/Downloads
This doesn't work past the first file: the shell raises a "No such file or directory" error for every file after the first, which I assume means the script only applies $dir to the first file and then falls back to ~/, where of course it can't find the files. I've looked everywhere for an answer on this but can't find it, and it would be super tedious to do this one file at a time.
Assuming your remote login shell understands brace expansion, this should work:
scp $user@$host:$dir/"{$file1,$file2,$file3}" ~/Downloads
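For example, with hypothetical values dir=/var/log and files a.log, b.log, c.log, the quoting keeps the local shell from touching the pattern, so the remote shell receives it intact and expands it into three separate paths:
scp user@host:/var/log/"{a.log,b.log,c.log}" ~/Downloads
# the remote shell expands /var/log/{a.log,b.log,c.log} into
# /var/log/a.log /var/log/b.log /var/log/c.log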
If you want to download multiple files matching a specific pattern, for example all zip files, you can do the following:
scp -r user@host:/path/to/files/"*.zip" /your/local/path

tar -zxvf cannot unzip file

Here's the problem.
First step
Transfer the *.gz file to the remote host using ftp, with the code below:
open $IP
user nfc nfc123
bin
passive
cd /nfc/APPBAK
put $FULLNAME $DESTFILE
cd $DESTDIR
tar -zxvf $local_filename
quit
FTPIT
Second step
tar -zxvf $local_filename
but it says:
"?Invalid command. "
Should I change the mode of the *.gz file first? Any help will be appreciated.
You are trying to run the tar command inside FTP, as far as I can see, rather than in the shell after you've fetched the file with FTP. It is confusing since some shell commands, like cd, seem to work in FTP too, but the cd command actually attempts to change directory on the remote machine (you need lcd to change directory on the local machine).
Put simply, tar isn't a valid FTP command, which is why you get the ?Invalid command error.
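A minimal sketch of the corrected flow, assuming your script wraps the FTP commands in a here-document ending at FTPIT (as the original suggests) and that you also have ssh access to the host. The here-document only transfers the file; the extraction then runs in a real shell on the remote machine:
ftp -n <<FTPIT
open $IP
user nfc nfc123
bin
passive
cd /nfc/APPBAK
put $FULLNAME $DESTFILE
quit
FTPIT

# tar must run in a shell, not inside ftp
ssh nfc@$IP "cd $DESTDIR && tar -zxvf $DESTFILE"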
Try this:
tar -xvf $local_filename
Please make sure the file has the right permissions.

SCP says file has downloaded, but the file does not appear

I am using ssh to work on a remote server, however when I try to download a file using scp in this format:
scp name@website.com:somefile.zip ~/Desktop
It asks me for my password, and shows this:
somefile.zip 100% 6491 6.3KB/s 00:00
however, this file never appears on my desktop. Any help?
I think you are logging into the remote machine with ssh and then running the command on the remote machine. You should instead run the command on your local machine, without logging into the remote server first.
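That is, from your local prompt (reusing the paths from the question):
# run this on your LOCAL machine, not inside the ssh session
scp name@website.com:somefile.zip ~/Desktop/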
You need to specify the file path:
scp name@website.com:/path/to/somefile.zip ~/Desktop
~/Desktop should actually be a directory, not a file. I suggest that you do the following:
Remove the ~/Desktop file with rm ~/Desktop (or move it with mv if you want to keep its contents).
Create the directory with mkdir ~/Desktop.
Try again to scp the zip file.
BTW, when I need to copy files into directories, I usually put a slash after the directory to avoid such problems (in case I make a mistake), e.g. scp server:file ~/Desktop/; if the directory doesn't exist, I get an error instead of unwanted file creation.
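A quick illustration of that habit, using the paths from this question:
scp name@website.com:somefile.zip ~/Desktop/   # trailing slash: errors out if ~/Desktop is not a directory
scp name@website.com:somefile.zip ~/Desktop    # no slash: silently creates a file named Desktop if the directory is missing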
You are doing this from a command line, and that command line has a working directory (on your local machine); this is the directory your file will be downloaded to. The final argument in your command is only what you want the name of the file to be. So, first, change directory to where you want the file to land. I'm doing this from Git Bash on a Windows machine, so it looks like this:
cd C:\Users\myUserName\Downloads
Now that I have my working directory where I want the file to go:
scp -i 'c:\Users\myUserName\.ssh\AWSkeyfile.pem' ec2-user@xx.xxx.xxx.xxx:/home/ec2-user/IwantThisFile.tar IgotThisFile.tar
Or, in your case (with the VERY strong password you must be using):
cd ~/Desktop
scp name@website.com:/path/to/somefile.zip somefile.zip

How to determine the full local Mac filename and remote filename to use with SCP

I used SSH to connect to a server and navigate to the folder where I want to store some files from my Mac. I think what I need to do is use SCP to do the copy but I'm not sure exactly about the terminology in the command parameters. And so far everything I've tried gets some sort of "not found" error.
Before logging on to the server the prompt is:
Apples-MacBook-Pro-2:~ neiltayl$
After logging in and navigating to the folder I want to store things in, it is:
[neiltayl#cs136 Tracer]$
I need to copy several files from the Tracer folder on my local computer to the Tracer folder on cs136, and I cannot fathom the correct FROM and TO parts of the scp command to make it work.
This is the nearest I got so far:
Apples-MacBook-Pro-2:~ neiltayl$ ls
Applications Downloads Music Tracer
Desktop Library Pictures c151
Documents Movies Public dwhelper
Apples-MacBook-Pro-2:~ neiltayl$ scp ./Tracer/*.* neiltayl@cs136.cs.iusb.edu:Tracer
neiltayl@cs136.cs.iusb.edu's password:
./Tracer/*.*: No such file or directory
The scp command is:
$ scp File1 username@something:DIRNAME
Here File1 is the file that you are sending over to the other computer.
DIRNAME is the path to the directory where you want the file to be stored.
In your case the command would be:
scp -r Tracer neiltayl@cs136:New_Tracer
Here Tracer is the folder that contains all the files that you want to copy.
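As an aside, the ./Tracer/*.*: No such file or directory error in the question most likely occurs because *.* only matches names containing a dot; when nothing matches, the shell passes the pattern through literally and scp fails on it. A plain * glob avoids that (a sketch reusing the names from the question):
scp ./Tracer/* neiltayl@cs136.cs.iusb.edu:Tracer/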
