Mimic rsync command behaviour in shell script - shell

I'm attempting to write my first short backup script. I'm using "plain" sh. When I issue the following command on the command line:
rsync /home/username/.* /home/username/backups
everything works well, as expected (i.e. only my dotfiles and dot-directories are backed up).
But when I run a shell script with this line in it, rsync behaves strangely and ends up repeatedly and recursively backing up everything along with my dotfiles. So I end up with a backup like this:
/backups/dotfiles/backups/dotfiles/backups/dotfiles/ etc etc etc.
How do I use the "*" in this shell script to make rsync behave as expected?
Here is the whole script:
#!/bin/sh
#Backup dotfiles
echo 'Backing up dotfiles...'
rsync -avzu --delete --progress --exclude='.wine' --max-size='100M' /home/slowmo/.* /home/slowmo/backups/dotfiles/
#Compress into single file
echo 'Compressing dotfiles...'
tar cvzf /home/slowmo/backups/dotfiles.tar.gz /home/slowmo/backups/dotfiles/

I'm not sure if this is exactly the same, but I just ran across a similar issue.
I saw different results when trying to back up my hidden files from my home directory using both:
rsync from the shell - worked as expected; all dot files were backed up when running rsync .* /media/backup/dotfiles
rsync from a bash script - the same command didn't work as expected: it backed up the dot files, but also every other folder in the home directory.
On some investigation and reflection it turns out my shell (zsh) expands wildcards differently from bash, and the difference is crucial in this particular case. zsh does not expand .* to include . (the current directory) and .. (the parent directory), whereas bash does, so in the bash script rsync was given not just the hidden files but also the directory itself and its parent.
I've changed my script to instead call:
rsync .[^.]* /media/backup/dotfiles
which solves my problem - but do note that this pattern won't match regular files that start with .., like ..tmp
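The difference is easy to see with a quick experiment. The snippet below (plain POSIX sh, throwaway directory) shows what each pattern expands to; .[!.]* is the portable spelling of .[^.]*, and adding ..?* catches the names starting with two dots that it misses:

```shell
# Demo of dotfile glob patterns in a throwaway directory.
# .[!.]* skips "." and ".." but also skips names starting with "..";
# the extra pattern ..?* then catches those.
cd "$(mktemp -d)"
touch .profile ..tmp regularfile
echo .[!.]*        # .profile
echo .[!.]* ..?*   # .profile ..tmp
```

Neither pattern can ever match . or .., so rsync no longer recurses into the parent directory.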
Some places to look for more information (not always in the context of rsync) include:
https://unix.stackexchange.com/a/476265/180385
https://www.cyberciti.biz/faq/linux-unix-appleosx-bsd-rsync-copy-hidden-dot-files/
https://unix.stackexchange.com/a/90075/180385
https://unix.stackexchange.com/a/685367/180385

Related

failed to delete multiple folders inside one folder except 3 folders using rm

I am trying to delete all the folders inside a folder except 2-3 of them using rm, but the command doesn't work and fails with the error below.
I tried using an escape character, but it still will not delete the folders.
Any solutions?
EDIT 1
Using double quotes after the parentheses is not working.
EDIT 2
Using shopt is also not working.
Note 1: The extglob shell option needs to be set first. I had the same issue running my script until I added shopt -s extglob to my .sh file. The link below is a reference from Unix Stack Exchange.
https://unix.stackexchange.com/questions/153862/remove-all-files-directories-except-for-one-file
Full Working Script: test.sh
shopt -s extglob;
cd Test;
rm -rfi !("atest3"|"atest2")
Note 2: If working in Ansible, the default shell is sh rather than bash. Because of this, you may need to call the script with bash, or use the solutions found in this answer, to get it working. The script below works in bash; in zsh it may require a setopt command similar to extglob.
To call the script from a plain sh terminal, run it with bash (e.g. bash test.sh).
The following command will remove (rm), by force (f), recursively (r), and interactively (i), all files or folders except (the ! bang operator) those whose names ('arg1'|'arg2'|...) are listed as pipe-separated (|) arguments.
If a nested file or folder is passed as an argument but its parent is not, that file or folder will still be deleted:
rm -rfi !('atest2'|'atest3')
I am leaving the below commands as examples from an earlier answer I wrote.
This command will create multiple folders:
mkdir {test1,atest2,atest3,atest4}
And this command will remove the folders:
rm -rfi {test1,atest2,atest3,atest4}
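Putting the pieces together, here is a self-contained sketch of the whole flow (a temporary directory stands in for the Test folder above, so it can run anywhere; the pattern lives in a script run with bash, since extglob must be enabled before the !() pattern is parsed and sh won't recognize it):

```shell
# End-to-end sketch: create the four test folders in a temp directory,
# then delete everything except atest2 and atest3 using extglob.
dir=$(mktemp -d)
mkdir "$dir"/test1 "$dir"/atest2 "$dir"/atest3 "$dir"/atest4
script=$(mktemp)
cat > "$script" <<'EOF'
shopt -s extglob
cd "$1"
rm -rf !("atest2"|"atest3")
EOF
bash "$script" "$dir"
ls "$dir"    # only atest2 and atest3 remain
```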

Shell Command For File To Delete Itself

I want to make a file that runs a script, then deletes itself. I know that its root would most likely be "~/Library/Downloads/filename.app". How would I go about having it self destruct? I'm working in script editor.
I'm not sure I understand correctly, as a shell script would traditionally have a .sh suffix (if any) rather than .app, and I'm not familiar with anything I'd call a "script editor", but here is my solution anyway.
If you are in bash environment, you can make use of the BASH_SOURCE array. Provided that you didn't change the current working directory, you can directly call
rm "${BASH_SOURCE[0]}"
(or just rm "$BASH_SOURCE").
If you use cd or the script grows larger, it is advisable to save the fully resolved path to the script at the beginning and remove that file at the end (not somewhere in the middle, since running bash scripts are NOT independent of their source files*), like so:
#!/bin/bash
self=$(realpath "${BASH_SOURCE[0]}")
#
# code so ugly I want to delete it when I'm done
#
rm "$self"
*Edit shell script while it's running
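A quick way to convince yourself this works (a throwaway script under a hypothetical /tmp path; assumes realpath is available, as on GNU coreutils):

```shell
# Sketch: generate a throwaway script that deletes itself via BASH_SOURCE,
# run it, and confirm the file is gone afterwards.
script=$(mktemp /tmp/selfdestruct.XXXXXX)
cat > "$script" <<'EOF'
#!/bin/bash
self=$(realpath "${BASH_SOURCE[0]}")
echo "running from $self"
rm "$self"
EOF
chmod +x "$script"
"$script"
[ -e "$script" ] || echo "script deleted itself"
```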

Why can't my shell function change directory from piped input?

I have made a cli which computes the location of a certain directory, and I would really like to have the script change into it. My first approach was ./myscript | cd which I've learned doesn't work as 1. cd doesn't take piped arguments, and 2. the script won't be allowed to change the directory of the parent shell.
I learned that there are workarounds, and I tried creating a function in .bash_profile:
function cdir(){
    DIR=""
    while read LINE; do
        DIR=$LINE
    done
    echo $DIR
    cd $DIR
}
Now running ./myscript | cdir the correct directory is output, however, the directory is still not changed.
Using command substitution works: cd $(./myscript), but I'd really prefer to be able to write this using pipe. Do you have any idea how I can get this to work, or at least explain why it isn't possible?
cd changes the current working directory, which is an attribute of a process. When you use a pipeline, the shell runs each part of it in a subshell, so the effect of cd (inside your function) is lost as soon as that subshell finishes.
cd -- "$(./myscript)"
is the right way of doing it. Here, cd runs in your current shell.
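If you want to keep the convenience of a cdir-style function while using command substitution under the hood, something like this in .bash_profile works; ./myscript here stands for the asker's hypothetical path-computing command:

```shell
# A function wrapper around command substitution: the cd runs in the
# current shell, not in a pipeline subshell. "./myscript" is assumed
# to print the target directory on stdout.
cdir() {
  dir=$(./myscript) || return   # abort if myscript fails
  cd -- "$dir"
}
```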
See also:
Why doesn't "cd" work in a bash shell script?
chdir() doesn't change directory after exiting to shell on Unix & Linux Stack Exchange

Why does this script work in the current directory but fail when placed in the path?

I wish to replace my failing memory with a very small shell script.
#!/bin/sh
if ! [ –a $1.sav ]; then
    mv $1 $1.sav
    cp $1.sav $1
fi
nano $1
is intended to save the original version of a script. If the original has been preserved before, it skips the move-and-copy-back (and I use move-and-copy-back to preserve the original timestamp).
This works as intended if, after I make it executable with chmod I launch it from within the directory where I am editing, e.g. with
./safe.sh filename
However, when I move it into /usr/bin and then I try to run it in a different directory (without the leading ./) it fails with:
-bash: /usr/bin/safe.sh: /bin/sh: bad interpreter: Text file busy
My question is, when I move this script into the path (verified by echo $PATH) why does it then fail?
D'oh? Inquiring minds want to know how to make this work.
The . command is not normally used to run standalone scripts, and that seems to be what is confusing you. . is more typically used interactively to add new bindings to your environment (e.g. defining shell functions). It is also used to similar effect within scripts (e.g. to load a script "library").
Once you mark the script executable (per the comments on your question), you should be able to run it equally well from the current directory (e.g. ./safe.sh filename) or from wherever it is in the path (e.g. safe.sh filename).
You may want to remove .sh from the name, to fit with the usual conventions of command names.
BTW: I note that the test in your script uses an en-dash ("–a") rather than an ASCII hyphen, which [ will not recognize as an operator; you probably want [ -e "$1.sav" ].
The error bad interpreter: Text file busy occurs if the script is open for write (see this SE question and this SF question). Make sure you don't have it open (e.g. in a editor) when attempting to run it.
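For reference, a cleaned-up sketch of the script itself (the backup logic is the asker's; the changes are the ASCII -e test in place of the en-dash and quoted expansions):

```shell
#!/bin/sh
# safe.sh: on first edit, keep the original as $1.sav (mv preserves its
# timestamp), then make a fresh working copy and edit it.
if [ ! -e "$1.sav" ]; then
    mv -- "$1" "$1.sav"    # original, timestamp preserved
    cp -- "$1.sav" "$1"    # working copy to edit
fi
nano "$1"
```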

Understanding script language

I'm a newbie to scripting languages trying to learn bash programming.
I have very basic question. Suppose I want to create three folders like $HOME/folder/
with two child folders folder1 and folder2.
If I execute command in shell like
mkdir -p $HOME/folder/{folder1,folder2}
folder will be created along with child folder.
If the same thing is executed through a script I'm not able to get the expected result. If sample.sh contains
#!/bin/sh
mkdir -p $HOME/folder/{folder1,folder2}
and I execute sh ./sample.sh, the first folder is created, but inside it a single directory literally named {folder1,folder2} appears; the separate child folders are not created.
My questions are:
Why does the script behave differently from the same command typed at the terminal?
How do I make it work?
On most systems where this fails, /bin/sh is not bash at all but a smaller, more strictly POSIX shell (dash, for example), and brace expansion, which is absent from POSIX, is not recognized by it. You have several options:
Run your script using bash ./sample.sh. This ignores the hashbang and explicitly uses bash to run the script.
Change the hashbang to read #!/bin/bash, which allows you to run the script by itself (assuming you set its execute bit with chmod +x sample.sh).
Note that running it as sh ./sample.sh would still fail, since the hashbang is only used when running the file itself as the executable.
Don't use brace expansion in your script. You could still use a loop as a slightly longer way of avoiding duplicated code:
for d in folder1 folder2; do
mkdir -p "$HOME/folder/$d"
done
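The first option can be checked with a self-contained one-liner (a temporary directory stands in for $HOME here):

```shell
# Run the same mkdir line explicitly with bash: brace expansion happens
# and both child folders are created. (mktemp -d stands in for $HOME.)
demo=$(mktemp -d)
bash -c 'mkdir -p "$1"/folder/{folder1,folder2}' _ "$demo"
ls "$demo/folder"    # folder1 and folder2
```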
Brace expansion doesn't happen in sh.
In sh:
$ echo {1,2}
produces
{1,2}
In bash:
$ echo {1,2}
produces
1 2
Execute your script using bash instead of using sh and you should see expected results.
This is probably happening because, while your tags indicate you think you are using Bash, you may not be. This is because of the very first line:
#/bin/sh
That says "use the system default shell", which may not be bash. Try this instead:
#!/usr/bin/env bash
Oh, and note that you were missing the ! after #. I'm not sure if that's just a copy-paste error here, but you need the !.
