how to use SCP or SFTP in the script file for file transfer - bash

This script runs hello_world.py with the source$n.vtu files as input data. At the end it produces some .png files, which I would like to transfer from the remote machine (a cluster or supercomputer) to my local computer.
Can anyone tell me how to do this with SCP or SFTP? Thanks!
low=0
high=9
mult=2
for i in $(eval echo {$low..$((high/mult))}); do
n=$(printf '%06d' $((i*mult)))
./pvpython hello_world.py source$n.vtu
done

You add a line
scp /path/to/file.png user@10.1.1.1:/destinationpath/
where 10.1.1.1 is your local machine.
I'd advise setting up a separate account for user, and generating a key pair so the cluster can transfer the files to your local machine without a password.
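A minimal sketch of that passwordless setup, assuming OpenSSH on both ends (the key path, user name, and IP are placeholders):

```shell
# On the cluster (the machine that will run scp), generate a key pair.
# An empty passphrase (-N "") lets the script run unattended:
ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519

# Install the public key on the transfer account of your local machine:
ssh-copy-id -i ~/.ssh/id_ed25519.pub user@10.1.1.1

# Now the loop can push each rendered image without prompting:
scp -i ~/.ssh/id_ed25519 /path/to/file.png user@10.1.1.1:/destinationpath/
```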


How to stop an actively running for loop

This might be a silly question. I wanted to copy thousands of files from a remote server to my local machine using scp. I ran the below command directly on the command line of the remote server (after logging in with ssh)
for file in $(ls <SOURCE_DIR>)
do scp $file <LOCAL_ID>@<LOCAL_IP_ADDRESS>:<TARGET_DIR>
done
But after running this I realized that <TARGET_DIR> is owned by root, so each scp asks for a password, and when I enter it, it throws a Permission denied error. This repeats on every loop cycle. Is there any way to get out of the for loop without pressing Ctrl + C thousands of times (once per file in <SOURCE_DIR>)? Both the server and my local machine run Ubuntu 18.04.
Bash itself will not let you access a remote host, but you can use SSH:
Step 1: On your local PC, generate a key to allow password-free authentication later
$ ssh-keygen
It will ask you for a passphrase. If you want your bash script to be fully non-interactive, you can opt not to use one.
Step 2: Copy your public key to the remote host and paste it into .ssh/authorized_keys
Step 3: Use scp to copy files:
$ scp -r -i .ssh/sshfile username@host:/sourcedirectory targetdirectory
The above command copies all the files and folders from the source directory on the server.

How can I transfer image files from local machine to remote server using plink

Usually we use SCP or PSCP to transfer files between a local machine and a remote machine. But I need to know if there's a way to transfer image/text files between machines using PLINK.
Any help will be appreciated.
To post a file from local machine to remote machine, following command works.
plink ubuntu@111.111.01.xyz -pw password < "D:\\CSV\\001.jpg" "cat > /home/001.jpg"
This will transfer 001.jpg from D:\\CSV\\001.jpg on the local machine to the /home/ directory of the remote machine.
Background: I did not have permission to transfer file to remote server using PSCP. I could use plink, and it worked.
This will transfer a.txt from the machine you execute the command on to the home folder of the machine you're connecting to.
plink username@10.20.30.40 -pw password < C:\Users\username\Desktop\a.txt "cat > ~/a.txt"
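The same trick works in the other direction, since plink simply wires the remote command's stdin and stdout to your local console: redirect stdout instead of stdin to pull a file down. Credentials and paths here are placeholders, and for binary files PSCP/PSFTP remains the safer tool since console redirection can mangle data on some setups:

```shell
plink username@10.20.30.40 -pw password "cat ~/a.txt" > C:\Users\username\Desktop\a.txt
```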

Drop a file in the root directory with bash?

I have around 30 servers that have a single server's SSH keys in authorized_keys.
I want to write a program which connects one by one to these boxes and does two things.
[1] srm's various directories with the -R flag
[2] leaves a txt document in the root directory
I know this should be possible with bash, but I don't have the experience to write something like this.
Can anyone help me out? The servers run sshd on the standard port.
A loop + an ssh user@host -p port command can get it done:
#!/bin/bash
while read -r server; do
    # execute commands on the remote server
    ssh user@"$server" ifconfig
    # copy a local file over
    scp /path/to/file user@"$server":/path/on/remote_server/
done < servers.txt
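Applied to the two tasks in the question, a minimal sketch might look like the following. servers.txt, the user name, the directory paths, and the note file are all assumptions, and srm must already be installed on each box:

```shell
#!/bin/bash
# For each host listed in servers.txt:
#  [1] securely remove some directories with srm -R
#  [2] drop a txt note in the root directory
# Assumes key-based authentication is already set up for the target user.
while read -r server; do
    ssh "root@$server" "srm -R /path/to/dir1 /path/to/dir2"
    scp note.txt "root@$server:/root/note.txt"
done < servers.txt
```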

Copy file with rsync or scp over multiple level or hops of SSH

I need to transfer around 4.2 GB of files from my local computer to a server B. However to ssh into server B, I need to ssh into server A.
Currently I'm copying files from my local computer to server A and then from server A to server B.
So the flow goes like this (the first command is run on my local computer, the second on server A):
rsync -avz --del ~/Desktop/abc/ <my-user-name>@<server-A>:~/abc
rsync -avz --del ~/abc/ <my-user-name>@<server-B>:~/abc
This is slow and copies the 4.2 GB of data twice instead of once!
Can I transfer files with rsync from my local computer directly to server B?
You can always use ssh with a ProxyCommand, which lets you transfer files transparently. With this config (~/.ssh/config):
Host <server-A>
  User <user-A>

Host <server-B>
  User <user-B>
  ProxyCommand ssh <server-A> -W %h:%p
You can call your rsync:
rsync -avz --del ~/Desktop/abc/ <server-B>:~/abc
The data will be only "routed" over the middle host.
What you want is to use port-forwarding to forward the ssh/rsync port (generally port 22) from server B to alternate ports on server A so when you call rsync -e "ssh -p altport" serverA:/sourcedir /destdir, you are actually invoking rsync from serverB.
There are many good howtos available on StackExchange and other sites. For example:
How to forward a port from one machine to another?
or
How To Forward Ports through a Linux Gateway with Iptables
will get you started. Using port-forwarding, you are essentially using serverA as a pass-through host so you will only have to transfer your 4.2G once.
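As a concrete sketch of that pass-through setup (port 2222 and the host/user names are placeholders):

```shell
# Terminal 1: forward local port 2222 through serverA to serverB's SSH port
ssh -L 2222:serverB:22 user@serverA

# Terminal 2: rsync to "localhost:2222", which actually lands on serverB
rsync -avz -e "ssh -p 2222" ~/Desktop/abc/ user@localhost:~/abc
```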
Yes, you can copy files (and even folders) directly, without making any intermediate copy on the contact/login server, which is by default the machine known to the outside world and the one you contact to get access to a specific local network.
Below is a simple demonstration using scp without any unnecessary complications. On the local machine, simply do the following:
$ scp -r -o ProxyCommand="ssh -W %h:%p your_username@contact-server.de" your_username@machine_name:/file/path/on/this/machine ~/destination/path/to/save/the/copied/folder
The -r option instructs scp to copy the contents of the entire folder.
your_username need not be the same on both machines.
You will be asked for your password on both machines for authentication.
The command above assumes that the typical way to access the machine named "machine_name" is via the contact server.
Note:
The above command also works for transferring data from a source remote machine (say s) to a target remote machine (say t). In that scenario, first ssh into the source machine s and navigate to the path where the data resides. After that you can treat that remote machine as the local/source machine and use the same scp command above for copying folders.
For copying individual files, just remove the -r option and provide the path to the specific file you want to copy.
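On newer OpenSSH releases (scp gained the flag in 8.0), the same hop can be expressed with -J (ProxyJump) instead of spelling out a ProxyCommand; the host names and paths below are placeholders:

```shell
scp -r -J your_username@contact-server.de \
    your_username@machine_name:/file/path/on/this/machine \
    ~/destination/path/to/save/the/copied/folder
```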

How do I copy a folder from remote to local using scp?

How do I copy a folder from remote to local host using scp?
I use ssh to log in to my server.
Then, I would like to copy the remote folder foo to local /home/user/Desktop.
How do I achieve this?
scp -r user#your.server.example.com:/path/to/foo /home/user/Desktop/
By not including the trailing '/' at the end of foo, you will copy the directory itself (including contents), rather than only the contents of the directory.
From man scp (See online manual)
-r Recursively copy entire directories
To use the full power of scp, go through the following steps:
Public key authorisation
Create SSH aliases
Then, for example if you have this ~/.ssh/config:
Host test
  User testuser
  HostName test-site.example
  Port 22022

Host prod
  User produser
  HostName production-site.example
  Port 22022
you'll save yourself from password entry and simplify scp syntax like this:
scp -r prod:/path/foo /home/user/Desktop # copy to local
scp -r prod:/path/foo test:/tmp # copy from remote prod to remote test
Moreover, you will be able to use remote path-completion:
scp test:/var/log/ # press tab twice
Display all 151 possibilities? (y or n)
To enable remote bash-completion you need a bash shell on both the <source> and <target> hosts, and a properly working bash-completion package. For more information see related questions:
How to enable autocompletion for remote paths when using scp?
SCP filename tab completion
To copy all from Local Location to Remote Location (Upload)
scp -r /path/from/local username@hostname:/path/to/remote
To copy all from Remote Location to Local Location (Download)
scp -r username@hostname:/path/from/remote /path/to/local
Custom Port where xxxx is the custom port number
scp -r -P xxxx username@hostname:/path/from/remote /path/to/local
Copy to the current directory from Remote to Local
scp -r username@hostname:/path/from/remote .
Help:
-r Recursively copy all directories and files
Always use the full path from /; get the full location/path with pwd
scp will overwrite existing files without warning
hostname can be a hostname or an IP address
if a custom port is needed (besides the default port 22) use -P PortNumber
. (dot) means the current working directory, so download/copy from the server and paste right here
Note: Sometimes a custom port will not work if it is not allowed in the firewall, so make sure the custom port is allowed in the firewall for both incoming and outgoing connections
What I always use is:
scp -r username@IP:/path/to/server/source/folder/ .
. (dot): it means the current folder, so copy from the server and paste right here.
IP: can be an IP address like 125.55.41.31 or a hostname like ns1.mysite.example.
It's better to first compress the catalog on the remote server:
tar czfP backup.tar.gz /path/to/catalog
Secondly, download from remote:
scp user#your.server.example.com:/path/to/backup.tar.gz .
At the end, extract the files:
tar -xzvf backup.tar.gz
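If you'd rather not leave a temporary archive on the server, you can stream the tar output straight over SSH and extract it locally in one step (host and path are placeholders):

```shell
# Compress on the server, stream over ssh, extract locally - no temp file:
ssh user@your.server.example.com "tar czf - /path/to/catalog" | tar xzf -
```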
A typical scenario:
scp -r -P port username@ip:/path-to-folder .
explained with a sample:
scp -r -P 27000 abc@10.70.12.12:/tmp/hotel_dump .
where,
port = 27000
username = "abc", the remote server username
path-to-folder = /tmp/hotel_dump
. = current local directory
And if you have a huge number of files to download from the remote location and don't care much about security, try changing the scp cipher from the old default (3DES) to something faster such as blowfish, which can reduce copying time drastically:
scp -c blowfish -r user@your.server.example.com:/path/to/foo /home/user/Desktop/
Note that this tip applies to older OpenSSH releases; modern versions default to fast AES ciphers and have removed blowfish entirely.
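On a current OpenSSH build (where blowfish is no longer available), a sketch of the same idea, picking a fast modern cipher; host and paths are placeholders:

```shell
# List the ciphers your client actually supports:
ssh -Q cipher

# AES-GCM is usually among the fastest on CPUs with AES-NI:
scp -c aes128-gcm@openssh.com -r user@your.server.example.com:/path/to/foo /home/user/Desktop/
```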
Go to Files in your Unity toolbar.
Press Ctrl + l and type sftp://here_goes_your_user_name@192.168.10.123
where 192.168.10.123 is the host you want to connect to.
In case you run into "Too many authentication failures", specify the exact SSH key whose public half you have added to your server's authorized_keys:
scp -r -i /path/to/local/key user#remote.tld:/path/to/folder /your/local/target/dir
The question was how to copy a folder from remote to local with scp command.
$ scp -r userRemote#remoteIp:/path/remoteDir /path/localDir
But here is a better way to do it, with sftp. SSH File Transfer Protocol (also Secure File Transfer Protocol, or SFTP) is a network protocol that provides file access, file transfer, and file management over any reliable data stream (Wikipedia).
$ sftp user_remote#remote_ip
sftp> cd /path/to/remoteDir
sftp> get -r remoteDir
Fetching /path/to/remoteDir to localDir 100% 398 0.4KB/s 00:00
For help about sftp command just type help or ?.
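For scripted, non-interactive transfers, sftp can also read its commands from a batch file via -b (the file names here are assumptions); it exits non-zero if any command fails, which makes it script-friendly:

```shell
# commands.sftp contains the commands to run, one per line:
#   cd /path/to/remoteDir
#   get -r remoteDir
sftp -b commands.sftp user_remote@remote_ip
```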
I don't know why, but I had to put the local folder (.) before the remote server directive to make it work:
scp -r . root@888.888.888.888:/usr/share/nginx/www/example.org/
For Windows OS, we used this command.
pscp -r -P 22 username@IP:/path/to/Downloads ./
The premise of the question is incorrect: the idea is that, once logged in via ssh, you could move files from the logged-in machine back to the client. However, scp is not aware of and cannot reuse that ssh connection; it makes its own connections. So the simple solution is to open a new terminal window on the local workstation and run scp there to transfer files from the remote server to the local machine, e.g.:
scp -i key user@remote:/remote-dir/remote-file /local-dir/local-file
