Prevent expansion of `~` - bash

I have a script which syncs a few files with a remote host. The commands I want to issue are of the form
rsync -avz ~/.alias user@y:~/.alias
My script looks like this:
files=(~/.alias ~/.vimrc)
for file in "${files[@]}"; do
rsync -avz "${file}" "user@server:${file}"
done
But the ~ always gets expanded and in fact I invoke the command
rsync -avz /home/user/.alias user@server:/home/user/.alias
instead of the one above. But the path to the home directory is not necessarily the same locally as it is on the server. I could use e.g. sed to replace that part, but it gets extremely tedious to do this for several servers with different paths. Is there a way to use ~ without it being expanded while the script runs, such that rsync still understands that ~ means the home directory?

files=(~/.alias ~/.vimrc)
The paths are already expanded at this point. If you don't want that, escape them or quote them.
files=('~/.alias' \~/.vimrc)
Of course, then you can't use them, because you prevented the shell from expanding '~':
~/.alias: No such file or directory
You can expand the tilde later in the command using eval (always try to avoid eval though!) or a simple substitution:
for file in "${files[@]}"; do
rsync -avz "${file/#\~/$HOME}" "user@server:${file}"
done
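A minimal local sketch of that substitution (the paths are illustrative; $HOME is whatever your local home directory happens to be):

```shell
# Keep '~' literal by quoting it, then expand it locally only where needed.
files=('~/.alias' '~/.vimrc')

for file in "${files[@]}"; do
    # ${file/#\~/$HOME} replaces '~' only when it is the first character
    local_path="${file/#\~/$HOME}"
    printf 'local: %s  remote: %s\n' "$local_path" "$file"
done
```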

You don't need to loop, you can just do:
rsync -avz ~/.alias/* 'user@y:~/.alias/'
EDIT: You can do:
files=(.alias .vimrc)
for file in "${files[@]}"; do
rsync -avz ~/"${file}" 'user@server:~/'"${file}"
done

Related

How to build script bash with SFTP connection to pull files

I'm implementing a bash agent script to pull files from a remote server via SFTP. The script must:
connect over SFTP
list the files
loop over the files found
get each file and copy it to the agent side
delete each file after it has been copied
The script is as follows:
#!/bin/bash
SFTP_CONNECTION="sftp -oIdentityFile=/home/account_xxx/.ssh/service_ssh user@host"
DEST_DATA=/tmp/test/data/
# GET list file by ls command ###############
$SFTP_CONNECTION
$LIST_FILES_DATA_OSM1 = $("ls fromvan/test/data/test_1")
echo $LIST_FILES_DATA_OSM1
for file in "${LIST_FILES_DATA_OSM1[@]}"
do
$SFTP_CONNECTION get $file $DEST_DATA
$SFTP_CONNECTION rm $file
done
I tried the script, but it seems that the connection and the command execution (ls) happen in separate sessions.
How can I make the commands run sequentially as described above?
(Screenshots omitted: "Invalid find command"; SSH seemingly not available; the rsync output when fetching the files.)
Thanks
First of all, I would recommend the following syntax changes:
#!/bin/bash
sftp_connection() {
sftp -oIdentityFile=/home/account_xxx/.ssh/service_ssh user@host "$@";
}
Dest_Data=/tmp/test/data/
# GET list file by ls command ###############
sftp_connection
List_Files_D_OSM1=$("ls fromvan/test/data/test_1")
echo "$List_Files_D_OSM1"
for file in "${List_Files_D_OSM1[@]}"
do
sftp_connection get "$file" "$Dest_Data"
sftp_connection rm "$file"
done
Quoting $file and $List_Files_D_OSM1 to prevent globbing and word splitting.
Assignments can't start with a $; otherwise bash will try to execute List_Files_D_OSM1 and complain with command not found
No white space in assignments like List_Files_D_OSM1 = $("ls fromvan/test/data/test_1")
You can use ShellCheck to catch this kind of error.
Having said that, it is in general not a good idea to use ls in such way.
What you can use instead is something like find. For example:
find . -type d -exec echo '{}' \;
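A safer pattern for iterating over filenames is a null-delimited find loop, which survives spaces and newlines in names (the directory name below is illustrative):

```shell
# find -print0 plus read -d '' keeps each filename intact, unlike ls output.
dir=fromvan/test/data/test_1
find "$dir" -type f -print0 |
while IFS= read -r -d '' file; do
    printf 'found: %s\n' "$file"
done
```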
Use a different client. lftp supports sftp as a transport, and has a subcommand for mirroring which will do the work of listing the remote directory and iterating over files for you.
Assuming your ~/.ssh/config contains an entry like:
Host myhost
IdentityFile /home/account_xxx/.ssh/service_ssh
...you can run:
lftp -e 'mirror -R fromvan/test/data/test_1 /tmp/test/data' sftp://user@myhost
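If you want to stay with plain sftp, its batch mode (-b) runs several commands sequentially over a single connection, which addresses the original problem directly. A sketch, reusing the key and paths from the question (all illustrative):

```shell
# One sftp session executing commands in order, read from a batch file.
batch=$(mktemp)
cat > "$batch" <<'EOF'
get fromvan/test/data/test_1/file1 /tmp/test/data/
rm fromvan/test/data/test_1/file1
bye
EOF
sftp -oIdentityFile=/home/account_xxx/.ssh/service_ssh -b "$batch" user@host
rm -f "$batch"
```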

How to rename all files over SSH

I am trying to rename all files in a remote directory over SSH or SFTP. The rename should convert the file into a date extension, for example .txt into .txt.2016-05-25.
I have the following command to loop each .txt file and try to rename, but am getting an error:
ssh $user@$server "for FILENAME in $srcFolder/*.txt; do mv $FILENAME $FILENAME.$DATE; done"
The error I am getting is:
mv: missing destination file operand after `.20160525_1336'
I have also tried this over SFTP with no such luck. Any help would be appreciated!
You need to escape (or single-quote) the $ of variables in the remote shell. It's also recommended to quote variables that represent file paths:
ssh $user@$server "for FILENAME in '$srcFolder'/*.txt; do mv \"\$FILENAME\" \"\$FILENAME.$DATE\"; done"
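You can check the quoting locally by letting bash -c stand in for the remote shell; the folder and date below are hypothetical:

```shell
# The outer double quotes let $srcFolder and $DATE expand locally, while
# \$FILENAME survives to be expanded by the inner (stand-in remote) shell.
srcFolder=$(mktemp -d)
DATE=20160525_1336
touch "$srcFolder/a.txt" "$srcFolder/b.txt"
bash -c "for FILENAME in '$srcFolder'/*.txt; do mv \"\$FILENAME\" \"\$FILENAME.$DATE\"; done"
ls "$srcFolder"
```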
Try this:
By using rename (perl tool):
ssh user@host /bin/sh <<<$'
rename \047use POSIX;s/$/strftime(".%F",localtime())/e\047 "'"$srcFolder\"/*.txt"
To prepare/validate your command line, replace ssh...bin/sh by cat:
cat <<<$'
rename \047use POSIX;s/$/strftime(".%F",localtime())/e\047 "'"$srcFolder\"/*.txt"
will render something like:
rename 'use POSIX;s/$/strftime(".%F",localtime())/e' "/tmp/test dir"/*.txt
And you could locally try (ensuring $srcFolder contains a path to a local test folder):
/bin/sh <<<$'
rename \047use POSIX;s/$/strftime(".%F",localtime())/e\047 "'"$srcFolder\"/*.txt"
Copy of your own syntax:
ssh $user@$server /bin/sh <<<'for FILENAME in "'"$srcFolder"'"/*.txt; do
mv "$FILENAME" "$FILENAME.'$DATE'";
done'
Again, you could locally test your inline script:
sh <<<'for FILENAME in "'"$srcFolder"'"/*.txt; do
mv "$FILENAME" "$FILENAME.'$DATE'";
done'
or preview by replacing sh by cat.
When sending variables over SSH, you need to be careful about which variables are local and which are remote. Remote variables must be escaped; otherwise they will be interpreted locally rather than remotely as you intended. Other characters, such as backticks, also need to be escaped. The example below should point you in the right direction:
Incorrect
user@host1:/home:> ssh user@host2 "var=`hostname`; echo \$var"
host1
Correct
user@host1:/home:> ssh user@host2 "var=\`hostname\`; echo \$var"
host2
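You can see the difference without any remote host by printing the string ssh would receive (hostname here is just an example of command substitution):

```shell
# Unescaped backticks run hostname before the string is even built;
# escaped backticks are passed through literally for the remote shell.
printf '%s\n' "var=`hostname`; echo \$var"     # already expanded locally
printf '%s\n' "var=\`hostname\`; echo \$var"   # left for the remote side
```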

Error handling in shell script

I have the following shell script, which throws the error below if no file exists in the folder. How do I handle this so that the script doesn't stop executing?
Error:
mv: cannot stat `*': No such file or directory
Script:
for file in *
do
fl=$(basename "$file")
flname="${fl%%.*}"
gunzip "$file"
mv "$flname" "$flname-NxDWeb2"
tar -zcf "$flname-NxDWeb2".tar.gz "$flname-NxDWeb2"
rm "$flname-NxDWeb2"
done;
If the shell is bash, you can allow * to expand to the null string: shopt -s nullglob before your loop.
BTW, you might want to explicitly specify the uncompressed filename to produce, in case your logic doesn't completely agree with gunzip's (which it probably won't, if there's more than one dot in the name, or the file ends with .tgz or .taz):
gunzip -c "$file" >"$flname"
(you will need to remove the original yourself in this case, though)
You can avoid the need to move, too:
flname="${fl%%.*}-NxDWeb2"
And you might want to use trap to ensure your temporaries are cleaned up in the failure case (possible make your temporaries in $TMPDIR, too).
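Putting those suggestions together, a sketch of the loop with nullglob, an explicit gunzip output name, and a trap for cleanup (the -NxDWeb2 suffix comes from the question; the *.gz pattern is an assumption):

```shell
#!/bin/bash
shopt -s nullglob                    # '*.gz' expands to nothing if no match
for file in *.gz; do
    fl=$(basename "$file")
    flname="${fl%%.*}-NxDWeb2"
    trap 'rm -f "$flname"' EXIT      # remove the temporary on any exit
    gunzip -c "$file" > "$flname"    # explicit output name, keeps original
    tar -zcf "$flname.tar.gz" "$flname"
    rm -f "$flname"
    trap - EXIT
done
```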

Using wildcard in bash loop array

I've got a collection of hundreds of directories, ordered alphabetically and with differently named files inside, which I want to copy to another location using rsync.
I don't want to go over all the directories manually; instead I want to use rsync's --include option or a bash loop to go over the directories.
So far I've tried the bash script below, but without success.
for dir in {A..Z}; do
echo "$dir";
rsync --progress --include $dir'*' --exclude '*' -rt -e ssh username@192.168.1.123:/source/directory/ ~/target/directory/
done;
Does anyone know what would be the correct way to go over the directories using rsync's --include option?
Update:
The bash script above was more to try out the loop to go over my directories and see what comes out. The command I actually wanted to use was this one:
for dir in /*; do
rsync --progress --include $dir'*' --exclude '*' --bwlimit=2000 -rt -e ssh username@192.168.1.123:/source/directory/ ~/target/directory/
done;
I know bash can do something like {A..Z}, but this doesn't seem to get me the result I want. I already copied half of the alphabet of directories so I was trying {F..Z} as an array.
Update
I've come up with the following script to run from my source directories location.
#!/bin/bash
time=$(date +"%Y-%m-%d %H:%M:%S") # For time indication
dir=/source/directory/[P-Z]* # Glob matching directories whose names start with "P" through "Z"
printf "[$time] Transferring: /source/directory/\"$dir\"\n"
rsync -trP -e 'ssh -p 123' --bwlimit=2000 $dir username@192.168.1.123:/target/directory
This will transfer all directories from the source directory with names starting with character "P" to "Z" over ssh using port 123.
This works for me in a shell script. I'm sure there are better ways to do this in a single line command, but this one I just came up with to help myself out.
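The [P-Z]* pattern can be tried out locally; note it is a glob (expanded when used unquoted), not an array, and character ranges are locale-dependent. The directory names below are made up:

```shell
# Glob ranges: [P-Z]* matches names starting with P..Z (in the C locale).
tmp=$(mktemp -d)
mkdir "$tmp/Alpha" "$tmp/Papa" "$tmp/Zulu"
printf '%s\n' "$tmp"/[P-Z]*
```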
Sounds like you want recursive rsync. I'd go with:
rsync -r / --restOfYourRsyncArgs
That walks over every file/folder/subfolder in / (could be A LOT, consider excludes and/or a different target path) and uploads/downloads. Set excludes for files and folders you don't want sent.

Pass url to a bash script for use in scp

I'm writing a cron to backup some stuffs on a server.
Basically I'm sending specific files from a local directory using scp.
I'm using a public key to avoid authentication.
For reusability I'm passing the local directory and the server url by arguments to my bash script.
How I set my parameters:
#!/bin/bash
DIR="$1"
URL="$2"
FILES="$DIR*.ext"
My problem is about formatting the url.
Without formatting
How I send files to the server:
#!/bin/bash
for F in $FILES
do
scp $F $URL;
if ssh $URL stat $(basename "$F")
then
rm $F
else
echo "Fails to copy $F to $URL"
fi
done
If I try to copy at user's home on the server I do:
$ ~/backup /path/to/local/folder/ user@server.com:
If I try to copy at a specific directory on the server I do:
$ ~/backup /path/to/local/folder/ user@server.com:/path/to/remote/folder/
In all cases it gives me the well known error (and my custom echo):
ssh: Could not resolve hostname user@server.com: nodename nor [...]
Can't upload /path/to/local/folder/file.ext to user@server.com
And it works anyway (the file is copied). But that's not a solution, because as scp (seemingly) fails, the file is never deleted.
With formatting
I tried sending files using this method:
#!/bin/bash
for F in $FILES
do
scp $F "$URL:"
done
I no longer get an error, and it works for copying at user's home directory then deleting the local file:
$ ~/backup /path/to/local/folder/ user@server.com
But, of course, sending to a specific directory doesn't work at all.
Finally
So I think that my first method is more appropriate, but how can I get rid of that error?
Your mistake is that you can scp to user@server.com: but not ssh to it: you need to remove the trailing : character (and any path after it). You can do that easily with bash parameter expansion:
ssh "${URL%:*}" stat "$(basename "$F")"
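A quick local check of that expansion (the destination string is illustrative; ${URL%%:*} cuts at the first colon, which is safer if the remote path itself contains one):

```shell
URL='user@server.com:/path/to/remote/folder/'
host="${URL%%:*}"   # drop the first ':' and everything after it
printf '%s\n' "$host"
```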
RECOMMENDATIONS
"USE MORE QUOTES!" They are vital. Also, learn the difference between ' and " and `. See http://mywiki.wooledge.org/Quotes and http://wiki.bash-hackers.org/syntax/words
If you have spaces in filenames, your code will break. Better to use while IFS= read -r line; do #stuff with $line; done < file.txt
See bash parameter expansion
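A minimal example of that read loop, which keeps filenames containing spaces intact (the file list is made up):

```shell
# Each input line becomes exactly one iteration, spaces and all.
list=$(mktemp)
printf '%s\n' 'a.txt' 'file with spaces.txt' > "$list"
while IFS= read -r line; do
    printf 'got: [%s]\n' "$line"
done < "$list"
rm -f "$list"
```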
