How to tar a directory on a different server and copy it to the server I am currently working on - bash

I want to tar a directory on a different server and move that tar to the server I am currently working on. I am writing a shell script for this and have used the following command:
ssh hostname " tar -cvj site1.tar.gz -C /www/logs/test " > site2.tar.gz
I have also tried writing it to a directory on my current server, but could not get that to work either. The command I tried:
ssh hostname " tar -cvj site1.tar.gz -C /www/logs/test " > /www/logs/automate
I tried this and think it will not work, so is there any other way to achieve this? Please help. Thanks.

With GNU tar:
ssh hostname "tar -C /www/logs/test -cjf - ." > your.tar.bz2
-f - writes the archive to stdout; ssh passes it back to the local machine, where the redirection saves it as your.tar.bz2.
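If you would rather end up with the extracted directory locally instead of an archive file, you can pipe the stream straight into a local tar (a minimal sketch, reusing the same hostname and path placeholders from the question):
ssh hostname "tar -C /www/logs/test -cjf - ." | tar -xjf -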

Related

Syntax error when calling variable in bash

Here is my code:
#!/bin/bash
id=$(sshpass -p password ssh -tt username@ipaddress -p PORT "grep --include=\*.cr -rlw '/usr/local/bin/' -e '$1' | cut -c16-")
echo $id
sshpass -p password rsync -avHPe 'ssh -p PORT' username@ipaddress:/usr/local/bin/"$id" /usr/local/bin/
id echoes correctly, but I get an rsync error when trying to use the variable.
If I manually populate and run rsync, the command works, so I'm not sure what is going on.
Rsync gives me the following output on error.
rsync: link_stat "/usr/local/bin/match.cr\#015" failed: No such file or directory (2)
It seems to be grabbing extra characters? Any help is appreciated :)
Looks like your file contains Windows-specific "CR LF" line endings (\#015 is the octal escape for the carriage return). You need to convert these to Unix-style "LF" endings. You can use a tool like dos2unix or Notepad++.
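If you only need the fix inside the script itself, you can also strip the carriage returns from the captured variable directly; a sketch reusing the question's command, with tr doing the cleanup:
id=$(sshpass -p password ssh -tt username@ipaddress -p PORT "grep --include=\*.cr -rlw '/usr/local/bin/' -e '$1' | cut -c16-" | tr -d '\r')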

Gunzip a file on remote server without copying

I have a file named abc.tar.gz on server1 and want to extract it on server2 over SSH, without first copying it to server2.
I tried this, but it doesn't work:
gunzip -c abc.tar.gz "ssh user@server2" | tar -xvf -
You are mixing things up. Try to understand what you are copying (and maybe also read this answer).
Your pipeline needs a few steps:
1. read and decompress the file on the server where it lives: gunzip -c abc.tar.gz
2. send the resulting stream to the other machine: | ssh user@server2
3. and have ssh execute a program on server2 that unpacks it: tar -xvf -
so: gunzip -c abc.tar.gz | ssh user@server2 tar -xvf -
If server2 is a capable machine (not an old embedded device), it is probably better to just use cat on server1 and do the gunzip on server2: less traffic to send, so probably also faster.
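A sketch of that variant (the compressed bytes travel over the wire, and tar's -z flag decompresses them on server2):
cat abc.tar.gz | ssh user@server2 "tar -xzvf -"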
Please try to understand this before copying and executing it on your machine. There are man pages for all of these commands.

How to use a source file while executing a script on a remote machine over SSH

I am running a bash script on a remote machine over SSH.
ssh -T $DBHOST2 'bash -s' < $DIR/script.sh <arguments>
Within the script I use a source file that defines the functions used in script.sh.
DIR=`dirname $0` # to get the location where the script is located
echo "Directory is $DIR"
. $DIR/source.bashrc # source file
But since the source file is not present in the remote machine it results in an error.
Directory is .
./source.bashrc: No such file or directory
I can always define the functions inline in the main script rather than using a source file, but I was wondering whether there is any way to use a separate source file.
Edit: Neither the source file nor the script is located on the remote machine.
Here are two ways to do this - both requiring only one ssh session.
Option 1: Use tar to copy your scripts to the server
tar cf - $DIR/script.sh $DIR/source.bashrc | ssh $DBHOST2 "tar xf -; bash $DIR/script.sh <arguments>"
This 'copies' your scripts to $DBHOST2 and executes them there.
Option 2: Use bashpp to include all code in one script
If copying files onto $DBHOST2 is not an option, use bashpp.
Replace your . calls with #include and then run it through bashpp:
bashpp $DIR/script.sh | ssh $DBHOST2 bash -s
Alternatively, concatenate the source file and the script and feed both to the remote shell in one go:
ssh -T $DBHOST2 'bash -s' <<< "$(cat source_file $DIR/script.sh)"
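If script.sh expects arguments, bash -s accepts them after --; a variant of the same idea (arg1 and arg2 stand in for your actual arguments):
ssh -T $DBHOST2 'bash -s -- arg1 arg2' <<< "$(cat source_file $DIR/script.sh)"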
The following achieves what I am trying to do.
1. Copy the source file to the remote machine:
scp $DIR/source.bashrc $DBHOST2:./
2. Execute the local script with arguments on the remote machine via SSH:
ssh $DBHOST2 "bash -s" -- < $DIR/script.sh <arguments>
3. Copy the remote logfile logfile.log to the local file dbhost2.log, and remove the source file and logfile from the remote machine:
ssh $DBHOST2 "cat logfile.log; rm -f source.bashrc logfile.log" > dbhost2.log

Bash script for gathering info from multiple servers [duplicate]

I've only got a little question for you.
I have made a little shell script that allows me to connect to a server and gather certain files and compress them to another location on another server, which works fine.
It is something in the vein of:
#!/bin/bash
ssh -T user@server1
mkdir /tmp/logs
cd /tmp/logs
tar -czvf ./log1.tgz /somefolder/docs
tar -czvf ./log2.tgz /somefolder/images
tar -czvf ./log3.tgz /somefolder/video
cd ..
tar -czvf logs_all.tgz /tmp/logs
What I would really like to do is:
Log in with the root password when connecting via ssh
Run the commands
Log out
Log in to the next server
Repeat until all logs have been compiled.
Also, it is not essential but, if I can display the progress (as a bar perhaps) then that would be cool!!
If anyone can help that would be awesome.
I am in between n00b and novice so please be gentle with me!!
ssh can take a command as an argument to run on the remote machine:
ssh -T user@server1 "tar -czf - /somefolder/anotherfolder"
This will run the tar command on the remote machine, writing tar's output to stdout, which ssh passes back to the local machine. So you can save it locally wherever you like (there's no need for that /tmp/logs/ on the remote machine):
ssh -T user@server1 "tar -czf - /somefolder/anotherfolder" > /path/on/local/machine/log1.tgz
If you just want to collect them on the remote server (no wish to transfer them to the local machine), just do the straightforward version:
ssh -T user@server1 "mkdir /tmp/logs/"
ssh -T user@server1 "tar -cvzf /tmp/logs/log1.tgz /somefolder/anotherfolder"
ssh -T user@server1 "tar -cvzf /tmp/logs/log2.tgz /somefolder/anotherfolder"
…
ssh -T user@server1 "tar -czvf /tmp/logs_all.tgz /tmp/logs"
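Each of those lines opens a separate ssh connection; the same sequence can also be chained into a single session (a sketch with the same placeholder paths):
ssh -T user@server1 "mkdir -p /tmp/logs && tar -czf /tmp/logs/log1.tgz /somefolder/anotherfolder && tar -czf /tmp/logs/log2.tgz /somefolder/anotherfolder && tar -czf /tmp/logs_all.tgz /tmp/logs"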
You could send a tar command that writes a compressed archive to standard out and save it locally:
ssh user@server1 'tar -C /somefolder -czvf - anotherfolder' > server1.tgz
ssh user@server2 'tar -C /somefolder -czvf - anotherfolder' > server2.tgz
...
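To repeat this across several machines, a plain loop over the host list is enough (a minimal sketch; the host names are placeholders):
for host in server1 server2 server3; do
    ssh user@$host 'tar -C /somefolder -czf - anotherfolder' > "$host.tgz"
done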

Bash scp several files password issue

I am trying to copy several files from a remote server to a local drive in Bash using scp.
Here's the relevant part of the code:
scp -r -q $USR@$IP:/home/file1.txt $PWD
scp -r -q $USR@$IP:/home/file2.txt $PWD
scp -r -q $USR@$IP:/root/file3.txt $PWD
However, the problem is that EVERY time that it wants to copy a file, it keeps asking for the password of the server, which is the same. I want it to ask only once and then copy all my files.
And please, do not suggest rsync nor making a key authentication file since I do not want to do that.
Are there any other ways...?
Any help would be appreciated
You can use an expect script or sshpass.
sshpass -p 'password' scp ...
#!/usr/bin/expect -f
spawn scp ...
expect "password:"
send "ur_password\r"
expect eof
A disadvantage is that your password is now stored in plaintext.
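To avoid hard-coding it, you can read the password once interactively and reuse it for each copy (a sketch assuming sshpass is available, using the question's paths):
read -r -s -p "Password: " PASS; echo
sshpass -p "$PASS" scp -q "$USR@$IP:/home/file1.txt" "$PWD"
sshpass -p "$PASS" scp -q "$USR@$IP:/home/file2.txt" "$PWD"
sshpass -p "$PASS" scp -q "$USR@$IP:/root/file3.txt" "$PWD"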
I'm assuming that if you can scp files from the remote server, you can also ssh in and create a tarball of the remote files.
The -r flag is recursive, for copying entire directories, but you're listing distinct files in your command, so -r is superfluous.
Try this from the bash shell on the remote system:
$ mkdir /home/file_mover
$ cp /home/file1.txt /home/file_mover/
$ cp /home/file2.txt /home/file_mover/
$ cp /root/file3.txt /home/file_mover/
$ tar -cvf /home/myTarball.tar /home/file_mover/
Then, back on your local system, fetch the tarball in a single transfer:
$ scp -q $USR@$IP:/home/myTarball.tar $PWD
Well, in this particular case, you can write...
scp -q $USR@$IP:/home/file[1-3].txt $PWD
