SCP file copy and overwrite existing file - bash

I have a Bash script where I'm issuing an SCP command to copy a text file to a remote server. I want subsequent uploads of the same file name to overwrite the existing remote file. However, the copy does not replace the file; instead it generates copies with a time/date stamp appended.
Code:
expect <<EOF
set timeout -1
spawn scp stats.txt user@192.168.0.188:/stats.txt
expect "*word:*"
send "user\r"
expect eof
EOF
How can I fix this?

Update: the issue turned out to be a somewhat hidden setting on the SolarWinds server that allows or disallows overwrite/renaming. Thanks everyone for contributing.

Related

Transferring large files using SFTP using linux bash scripts

I am intending to send a huge file (1+ GB) over to the remote side using SFTP. It seems to work fine in interactive mode (when I run sftp user@xx.xx.xx.xx, enter the password manually, and then key in the put command), but when I run it from a shell script it always times out.
I have set the ClientAlive keep-alive settings in /etc/ssh/sshd_config on both client and server, but the timeout still occurs.
Below is the Linux script code:
sshpass -p "password" sftp user@xx.xx.xx.xx << END
put <local file path> <remote file path>
exit
END
The transfer takes about 10 minutes in interactive mode; when run from the script, the file is incomplete based on its size.
Update: a current transfer in interactive mode shows the small files going through, but the big file stalls halfway through the transfer.
I prefer lftp for such things:
lftp -u user,passwd domain.tld -e "put /path/file; quit"
lftp can handle sftp too:
open sftp://username:password@server.address.com

Expect Script to SCP a File to Remote System

I am a newbie at scripting and am simply trying to use scp within a script to move a file to a remote system. I keep encountering errors in my code, or nothing happens and the file does not get copied.
I have attempted multiple scripts, but I feel I'm just not quite getting the language. In the code I've included, I am trying to scp the test.txt file to the remote system. I've also tried including a send "scp test.txt ${user}@XXXXXXXX.com" line as well.
#!/usr/bin/expect
set user "XXXXXX"
set password "XXXXXX"
log_file XXXX.txt
spawn /usr/bin/scp -f test.txt ${user}@XXXXXXXXXXX.com
expect "*assword"
send "${password}\n"
interact
I believe the file should be copied to the remote server but when I attempt to display it with ls -l I get nothing.

Script to upload to sftp is not working

I have 2 Linux boxes and I am trying to upload files from one machine to the other using sftp. I have put all the commands I use in the terminal into a shell script, like below.
#!/bin/bash
cd /home/tests/sftptest
sftp user1@192.168.0.1
cd sftp/sftptest
put test.txt
bye
But this is not working: it gives me an error saying the directory does not exist, and the terminal stays at the sftp> prompt, which means bye is not executed. How can I fix this?
I suggest using a here-document:
#!/bin/bash
cd /home/tests/sftptest
sftp user1@192.168.0.1 << EOF
cd sftp/sftptest
put test.txt
EOF
When you run the sftp command, it connects and waits for you to type commands; it effectively starts its own "subshell".
The other commands in your script would execute only once the sftp command finishes, and they would execute as local shell commands, so in particular put would fail as a non-existent command.
You have to provide the sftp commands directly to sftp command.
One way to do that is using an input redirection, e.g. the "here document" that the answer by @cyrus already shows:
sftp username@host << EOF
sftp_command_1
sftp_command_2
EOF
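One detail worth checking locally: in bash, `<<` (here-document) and `<<<` (here-string) behave very differently. A quick sketch with cat standing in for sftp:

```shell
# << feeds every line up to the EOF marker as stdin:
cat << 'EOF'
cd sftp/sftptest
put test.txt
EOF
# <<< is a bash here-string: it feeds the single word that follows,
# so this prints the literal text "EOF" rather than reading a block:
cat <<< EOF
```

Only the here-document form delivers the command lines on stdin; the here-string would hand sftp the word EOF and nothing else.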
Another way is to use an external sftp command file via the -b option:
sftp -b sftp.txt username@host
Where the sftp.txt file contains:
sftp_command_1
sftp_command_2
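A minimal, locally runnable sketch of that batch-file approach, using the commands from the question above (the sftp call itself is commented out, since it needs a live server):

```shell
# Build the batch file with the same commands you would type interactively.
batch=$(mktemp)
printf 'cd sftp/sftptest\nput test.txt\n' > "$batch"
cat "$batch"   # shows the two sftp commands, one per line
# sftp -b "$batch" user1@192.168.0.1   # would run both over one connection
rm -f "$batch"
```

With -b, sftp also aborts on the first failing command, which makes scripted transfers easier to debug than a piped here-document.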

While loop executes only once when using rsync

I've created a bash script to migrate sites and databases from one server to another. The algorithm:
Parse .pgpass file to create individual dumps for all the specified Postgres db's.
Upload said dumps to another server via rsync.
Upload a bunch of folders related to each db to the other server, also via rsync.
Since databases and folders have the same name, the script can predict the location of the folders if it knows the db name. The problem I'm facing is that the loop is only executing once (only the first line of .pgpass is being completed).
This is my script, to be run in the source server:
#!/bin/bash
# Read each line of the input file, parsing fields separated by colons (:)
while IFS=: read host port db user pswd ; do
# Create the dump. No need to enter the password as we're using .pgpass
pg_dump -U $user -h $host -f "$db.sql" $db
# Create a dir in the destination server to copy the files into
ssh user@destination.server mkdir -p webapps/$db/static/media
# Copy the dump to the destination server
rsync -azhr $db.sql user@destination:/home/user
# Copy the website files and folders to the destination server
rsync -azhr --exclude "*.thumbnails*" webapps/$db/static/media/ user@destination.server:/home/user/webapps/$db/static/media
# At this point I expect the script to continue with the next line, but it exits after the first one
done < $1
This is .pgpass, the file to parse:
localhost:*:db_name1:db_user1:db_pass1
localhost:*:db_name3:db_user2:db_pass2
localhost:*:db_name3:db_user3:db_pass3
# Many more...
And this is how I'm calling it:
./my_script.sh .pgpass
At this point everything works: the first dump is created, and it is transferred to the destination server along with the related files and folders. The problem is that the script finishes there and won't parse the remaining lines of .pgpass. If I comment out all the rsync lines (so the script only creates the dumps), it works correctly, executing once for each line in the file. How can I get the script to not exit after executing rsync?
BTW, I'm using key based ssh auth to connect the servers, so the script is completely prompt-less.
Let's ask shellcheck:
$ shellcheck yourscript
In yourscript line 4:
while IFS=: read host port db user pswd ; do
^-- SC2095: ssh may swallow stdin, preventing this loop from working properly.
In yourscript line 8:
ssh user@destination.server mkdir -p webapps/$db/static/media
^-- SC2095: Add < /dev/null to prevent ssh from swallowing stdin.
And there you go.
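The swallowed-stdin effect is easy to reproduce locally. In this sketch, cat plays the role of ssh: it inherits the loop's redirected stdin and eats the remaining lines unless it is given /dev/null instead:

```shell
printf 'a\nb\nc\n' > input.txt

broken=0
while IFS= read -r line; do
  broken=$((broken + 1))
  cat > /dev/null               # like ssh: consumes the rest of input.txt
done < input.txt

fixed=0
while IFS= read -r line; do
  fixed=$((fixed + 1))
  cat > /dev/null < /dev/null   # the SC2095 fix: redirect stdin away
done < input.txt

echo "broken: $broken iteration(s), fixed: $fixed iterations"
# → broken: 1 iteration(s), fixed: 3 iterations
rm -f input.txt
```

ssh also has an -n flag that redirects its stdin from /dev/null for exactly this purpose.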

How to cd inside an SFTP connection established from a shell script

In my script I create an SFTP connection.
I read a directory value from the user earlier, and once the SFTP connection is established, I try to cd to the directory I got from the user.
But it's not working, probably because once the connection is established the prompt is inside the remote server.
In this case, how do I make it work?
I also faced this problem and was able to find the solution; it is right there in the man page of sftp. The man page gives the format for invoking sftp like this:
sftp [options] [user@]host[:dir[/]]
Actually, two formats are given there, but this is the one I wanted to use, and it worked.
So, what do you do? You simply supply the user@host as shown, then, without any space, a : followed by the path you want to change to on the remote server, and that's all. Here is a practical example:
sftp user@host:/path/
If your script does, as you state somewhere in this page,
sftp $user@$host cd $directory
and then tries to do something else, like:
sftp $user@$host FOO
That command FOO will not be executed in the same directory $directory since you're executing a new command, which will create a new connection to the SFTP server.
What you can do is use the "batchfile" option of sftp, i.e. construct a file which contains all the commands you'd like sftp to do over one connection, for example:
$ cat commands.txt
cd foo/bar
put foo.tgz
lcd /tmp/
get foo.tgz
Then, you will be able to tell sftp to execute those commands in one connection, by executing:
sftp -b commands.txt $user@$host
So, I propose your solution to be:
With user's input, create a temporary text file which contains all the commands to be executed over one SFTP connection, then
Execute sftp using that temporary text file as "batch file"
Your script would do something like:
echo "Directory in which to go:"
read directory
temp=$(mktemp /tmp/FOOXXX)
: > "$temp"
echo "cd $directory" >> "$temp"
# other commands
sftp -b "$temp" "$user@$host"
rm "$temp"
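A local dry run of that proposal, with a sample directory value standing in for the user's input (the sftp call is commented out, since it needs a live server):

```shell
directory="uploads/incoming"    # stand-in for the value read from the user
temp=$(mktemp)
echo "cd $directory" >> "$temp"
echo "put test.txt" >> "$temp"  # example follow-up command
cat "$temp"                     # shows the generated batch file
# sftp -b "$temp" "$user@$host" # would replay it over one connection
rm "$temp"
```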
If you are trying to change the directory on the local machine, use lcd instead.
In what way is it not working? To change directories on the remote server, you use the "cd" command. To change directories on the local server, you use the "lcd" command. Read "man sftp".
