How to temporarily back up the current folder? - bash

I would like to do the following:
Backup the current folder
Run some test files
Restore the backup (delete all changes from previous commands)
Delete backup
Unfortunately I do not have Git available. Otherwise I would do
git add .
git commit -m "backup"
# run commands
git checkout .

The simplest possible way would be to just create a copy in the parent directory or other convenient location.
You could create aliases like these to make it easier:
alias bak-cur-dir='(DIR="${PWD##*/}" && cd .. && cp -r "$DIR" "$DIR".bak)'
alias res-cur-dir='(DIR="${PWD##*/}" && cd .. && rm -rf "$DIR" && mv "$DIR".bak "$DIR") && cd .. && cd -'
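A self-contained walk-through of the same backup/restore cycle (directory and file names here are illustrative; the aliases above just package the backup and restore steps):

```shell
#!/bin/bash
# Sketch of the backup/restore cycle in a scratch directory.
set -e
cd "$(mktemp -d)"
mkdir project && cd project
echo original > file.txt

# Backup: copy the current directory next to itself
DIR="${PWD##*/}"
(cd .. && cp -r "$DIR" "$DIR".bak)

echo clobbered > file.txt      # simulate a destructive test run

# Restore: replace the directory with the backup, then re-enter it
cd ..
rm -rf "$DIR" && mv "$DIR".bak "$DIR"
cd "$DIR"
cat file.txt                   # prints "original"
```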

In such a case I create a directory ${TMP:-/tmp}/$$ for this purpose. You can change the location later simply by defining TMP; when TMP is unset, it falls back to something reasonable.
tmpdir=${TMP:-/tmp}/$$
mkdir -p "$tmpdir"
cp -r . "$tmpdir"
.... # Do your processing and set
.... # the variable keep_backup to
.... # your liking.
# Remove unneeded backup, when done.
if ((keep_backup == 0))
then
rm -r "$tmpdir"
else
echo "You will find a backup in $tmpdir"
fi
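The restore step is the mirror image: copy the snapshot back over the working directory. A self-contained sketch (scratch paths are illustrative; the trailing /. on the cp source also brings back dotfiles, though files created after the snapshot are not deleted):

```shell
#!/bin/bash
# Sketch: snapshot to ${TMP:-/tmp}/$$ and restore from it.
set -e
tmpdir=${TMP:-/tmp}/$$
scratch=$(mktemp -d)            # stands in for the real working dir
cd "$scratch"
echo original > data.txt

mkdir -p "$tmpdir"
cp -r . "$tmpdir"               # take the snapshot

echo broken > data.txt          # simulate a test run changing files

cp -r "$tmpdir"/. .             # restore (overwrites changed files)
rm -r "$tmpdir"                 # backup no longer needed
cat data.txt                    # prints "original"
```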

cd /working/dir
tar cfz ../backup.tar.gz .
# do your thing
#
# and then some
rm -rf -- *   # clears subdirectories too; note that dotfiles are left behind
tar xf ../backup.tar.gz
rm ../backup.tar.gz
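A self-contained run of the same round trip (scratch paths are illustrative; since the cleanup uses a bare glob, stray hidden files from the test run would survive the restore):

```shell
#!/bin/bash
# Demo of the tar snapshot / restore round trip.
set -e
scratch=$(mktemp -d)
mkdir "$scratch/work" && cd "$scratch/work"
echo original > notes.txt

tar cfz ../backup.tar.gz .     # snapshot the working directory

echo broken > notes.txt        # simulate destructive test commands
touch extra.txt

rm -rf -- *                    # clear visible files (dotfiles survive)
tar xf ../backup.tar.gz        # GNU tar auto-detects the gzip compression
rm ../backup.tar.gz
cat notes.txt                  # prints "original"; extra.txt is gone
```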

Related

bash script to create folder from filename and move two kinds of file into it

I have a set of banners in .jpg and .psd formats. I need to create a folder for each pair and move the files into it.
example:
Banner-A.jpg
Banner-A.psd
Banner-B.jpg
Banner-B.psd
Banner-C.jpg
Banner-C.psd
Create folder and move them:
Banner-A/Banner-A.jpg Banner-A.psd
Banner-B/Banner-B.jpg Banner-B.psd
Banner-C/Banner-C.jpg Banner-C.psd
I managed to find a script that works for the first part, but I can't get the .psd files to move as well.
for f in "$@"; do
cd "$f"
for file in *.jpg; do
folder=$(basename "$file" ".jpg")
mkdir -p "$folder" && mv "$file" "$folder"
done
done
Change your mv command to use the * wildcard as follows:
for file in *.jpg; do
folder=$(basename "$file" ".jpg")
mkdir -p "${folder}" && mv "${folder}".* "${folder}"
done
Make sure the .* is outside the quotes and it should work.
Example script:
#!/bin/bash
set -uo pipefail
doWork() {
dir="${1}"
cd "${dir}" || return
for file in *.jpg; do
folder=$(basename "$file" ".jpg")
mkdir -p "${folder}" && mv "${folder}".* "${folder}"
done
}
doWork "$@"
Example data directory: (before executing script)
$ ls data
Banner-A.jpg Banner-A.psd Banner-B.jpg Banner-B.psd Banner-C.jpg Banner-C.psd
Run script:
./script.sh ./data
data directory after script:
$ ls data
Banner-A Banner-B Banner-C
data directory subdirectories:
$ ls data/Banner-*
data/Banner-A:
Banner-A.jpg Banner-A.psd
data/Banner-B:
Banner-B.jpg Banner-B.psd
data/Banner-C:
Banner-C.jpg Banner-C.psd
I'd probably just use
for f in Banner*.jpg Banner*.psd; do
mkdir -p "${f%.???}/"
mv "$f" "${f%.???}/"
done
It will execute the mkdir -p for the *.psd files after the *.jpg files have already created the folder, but it covers the odd case where one of the two files exists but not the other.
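The `${f%.???}` expansion strips the final dot plus three characters, so both extensions collapse to the same folder name; a quick check:

```shell
#!/bin/bash
# Quick check of the ${f%.???} suffix stripping used above.
for f in Banner-A.jpg Banner-A.psd Banner-B.psd; do
    echo "$f -> ${f%.???}"
done
# Banner-A.jpg -> Banner-A
# Banner-A.psd -> Banner-A
# Banner-B.psd -> Banner-B
```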

Bash doesn't create all files and directories with loop

I'm trying to create a directory containing a list of directories, each with a list of files.
Could you explain me, why this script doesn't work properly?
createapp() {
local folders=('graphql' 'migrations' 'models' 'tests')
local files=('__init__.py' 'admin.py' 'apps.py' 'views.py')
cd backend/apps
mkdir $COMMAND2
cd $COMMAND2
for folder in $folders
do mkdir $folder && cd $folder
for file in $files
do touch $file && cd ..
done
done
}
It creates a graphql directory, and an __init__.py file in it, but that's all.
There are a few problems:
You aren't iterating over the contents of the array, only the first element.
You are executing cd .. too soon; you want to do that after the loop that creates each file.
for folder in "${folders[@]}"; do
mkdir -p "$folder" &&
cd "$folder" &&
for file in "${files[@]}"; do
touch "$file"
done &&
cd ..
done
There are two ways to simplify this. If you keep the cd command, you only need one call to touch:
for folder in "${folders[@]}"; do
mkdir -p "$folder" && cd "$folder" && touch "${files[@]}" && cd ..
done
Or you can get rid of the cd command, and pass a longer path to touch:
for folder in "${folders[@]}"; do
mkdir -p "$folder" &&
for file in "${files[@]}"; do
touch "$folder/$file"
done
done
or even
for folder in "${folders[@]}"; do
mkdir -p "$folder" &&
touch "${files[@]/#/$folder/}"
done
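The substitution used in that last variant prepends `$folder/` to every array element: the `#` anchors an empty match at the start of each element, and the replacement text is inserted there. A quick way to see it (arrays as in the question):

```shell
#!/bin/bash
# Demonstrate the prefix-prepending array substitution.
files=('__init__.py' 'admin.py' 'apps.py' 'views.py')
folder=graphql

# '#' anchors the (empty) match at the start of each element,
# so the substitution prepends "$folder/" to every file name.
printf '%s\n' "${files[@]/#/$folder/}"
# graphql/__init__.py
# graphql/admin.py
# graphql/apps.py
# graphql/views.py
```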
If you want to get fancy, you can do all of this with zero loops and a combination of bash brace and array expansions:
#!/bin/bash
createapp() {
local folders=('graphql' 'migrations' 'models' 'tests')
local files=('__init__.py' 'admin.py' 'apps.py' 'views.py')
local IFS=,   # keep the IFS change scoped to the function
eval mkdir -p "{${folders[*]}}"
eval touch "{${folders[*]}}/{${files[*]}}"
}
Note that the use of eval can be dangerous, but it's fairly well contained in this implementation, since the two arrays are local and not built from user input. If this were zsh, you could combine brace and array expansion without the need for eval.
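To see what those eval lines actually run: with IFS set to a comma, ${folders[*]} joins the array elements with commas, producing a brace-expansion pattern. A sketch using the same arrays:

```shell
#!/bin/bash
# What the eval lines expand to, using the arrays from createapp.
folders=('graphql' 'migrations' 'models' 'tests')
files=('__init__.py' 'admin.py' 'apps.py' 'views.py')
IFS=,

echo "mkdir -p {${folders[*]}}"
# mkdir -p {graphql,migrations,models,tests}
echo "touch {${folders[*]}}/{${files[*]}}"
# touch {graphql,migrations,models,tests}/{__init__.py,admin.py,apps.py,views.py}
# eval then performs the brace expansion, creating every combination.
```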

How to make this command work from parent directory zip files/test.zip $(tar tf files/test.gz)

I've this command:
zip files/test.zip $(tar tf files/test.gz)
But it doesn't work, because everything listed by $(tar tf files/test.gz) is inside files/, so zip can't find it.
It works perfectly if I change directory to files and exec this one:
zip test.zip $(tar tf test.gz)
But I need to make it work from parent directory.
My full command is:
tar xzf files/test.tar.gz && zip files/test.zip $(tar tf files/test.tar.gz) && rm -r -- $(tar tf files/test.tar.gz)
From Is there any command on Linux to convert .tar.gz files to .zip?
Thank you
I would write a little script for that. If you have many tgz files to handle, you can wrap it in a loop.
#!/bin/bash
TMP=/a/tmp/directory
TARGET=/dir/to/your/files
mkdir -p "$TMP"
cd "$TMP"
tar -xzf "$TARGET/your.tar.gz" && zip -r "$TARGET/your.zip" .  # -r recurses into subdirectories
rm -rf "$TMP"

Ruby chmod works, but not for one directory called "js/"

I've been putting together a Ruby script that deploys a git repository on my webserver (running gitolite) with a post-receive hook.
After checking out the files I try to chmod the directories first, and the files after like this:
FileUtils.chmod_R(0755, Dir.glob("#{deploy_to_dir}/**/*/"))
FileUtils.chmod_R(0644, Dir.glob("#{deploy_to_dir}/**/*"))
The first command works for all directories but one: js/. It just doesn't set +x on this directory, while at the same time setting +r.
Here's what happens:
Before: dr-------- js/
Skript does chmod 755 on js/
After: drw-r--r-- js/
Expected: drwxr-xr-x js/
I checked the attributes with lsattr. It shows only -----------------e- ./js/, which is nothing special. Is there anything else that could be wrong?
Changing it in bash directly works fine. What does Ruby do to this single directory?
Try reversing the order:
FileUtils.chmod_R(0644, Dir.glob("#{deploy_to_dir}/**/*"))
FileUtils.chmod_R(0755, Dir.glob("#{deploy_to_dir}/**/*/"))
Otherwise every file and directory is matched by the 0644 chmod, which undoes the execute bit on your directories.
In the end the problem was that js/ or en/ were the first directories in the glob. It's now a bash script, and it works.
#!/bin/bash
# post-receive
# 1. Read STDIN (Format: "from_commit to_commit branch_name")
read from to branch
if [[ $branch =~ master$ ]] ; then
deploy_to_dir='/var/www/virtual/whnr/vectoflow'
GIT_WORK_TREE="$deploy_to_dir" git checkout -f master
elif [[ $branch =~ development$ ]] ; then
deploy_to_dir='/var/www/virtual/whnr/vectotest'
GIT_WORK_TREE="$deploy_to_dir" git checkout -f development
else
echo "Received branch $branch, not deploying."
exit 0
fi
# 3. chmod +r whole deploy_to_dir
find "$deploy_to_dir" -type d -print0 | xargs -0 chmod 755
echo "DEPLOY: Changed Permissions on all directories 755"
find "$deploy_to_dir" -type f -print0 | xargs -0 chmod 644
echo "DEPLOY: Changed Permissions on all files 644"
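As a side note (not part of the original answer), the two find/xargs passes can be collapsed into a single chmod using the symbolic X bit, which grants execute only on directories and on files that already carry an execute bit:

```shell
#!/bin/bash
# Demo of a single-pass chmod with the symbolic X bit
# (paths here are illustrative scratch directories).
set -e
deploy_to_dir=$(mktemp -d)/site
mkdir -p "$deploy_to_dir/js"
touch "$deploy_to_dir/js/app.js"

# u=rwX,go=rX -> directories become 755, plain files 644
chmod -R u=rwX,go=rX "$deploy_to_dir"

stat -c '%a %n' "$deploy_to_dir/js" "$deploy_to_dir/js/app.js"
# prints 755 for the directory and 644 for the file
```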

untar filename.tar.gz to directory "filename"

I would like to untar an archive, e.g. "tar123.tar.gz", to directory "/myunzip/tar123/" using a shell command.
tar -xf tar123.tar.gz will extract the files, but into the directory I'm currently working in.
If the filename were "tar233.tar.gz", I would want it extracted to "/myunzip/tar233/", so the destination directory would be based on the filename.
Does anyone know if the tar command can do this?
tar -xzvf filename.tar.gz -C destination_directory
With Bash and GNU tar:
file=tar123.tar.gz
dir=/myunzip/${file%.tar.gz}
mkdir -p "$dir"
tar -C "$dir" -xzf "$file"
You can change directory before extracting with the -C flag, but the directory has to exist already. (If you create file-specific directories, I strongly recommend against calling them foo.tar.gz - the extension implies that it's an archive file but it's actually a directory. That will lead to confusion.)
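Wrapped in a loop, the same `${file%.tar.gz}` suffix stripping gives each archive its own directory (a self-contained sketch in a scratch directory; archive and file names are illustrative):

```shell
#!/bin/bash
# Sketch: extract each *.tar.gz into a directory named after it.
set -e
cd "$(mktemp -d)"
mkdir payload && echo hi > payload/readme.txt
tar -czf tar123.tar.gz -C payload .
rm -r payload

for file in *.tar.gz; do
    dir=${file%.tar.gz}        # tar123.tar.gz -> tar123
    mkdir -p "$dir"
    tar -C "$dir" -xzf "$file"
done
ls tar123                      # readme.txt
```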
Try
file=tar123.tar.gz
dir=/myunzip/$(basename "$file" .tar.gz) # matter of taste and custom here
[ -d "$dir" ] && { echo "$dir already exists" >&2; exit 1; }
mkdir "$dir" && ( gzip -dc "$file" | ( cd "$dir" && tar xf - ) )
If you're using GNU tar you can also give it an option -C "$dir" which will cause it to change to the directory before extracting. But the code above should work even with a Bronze Age tar.
Disclaimer: none of the code above has been tested.
