Extracting file with folders then using bash completion - bash

I don't work much with bash scripting and was trying to do something like this:
#!/bin/bash
wget https://johnvansickle.com/ffmpeg/builds/ffmpeg-git-i686-static.tar.xz
tar xvf ffmpeg-git-i686-static.tar.xz
cd ./ffmpeg-git-20210501-i686-static/
cp ./ffmpeg /etc/bin
Is there a variable, or some other way, to determine what the extracted folder is called during script execution? For example, at the command line I would rely on bash completion and type cd ./ffmpeg<Tab> (since I know the name starts with ffmpeg).
Make sense?

There is no way to know the directory structure beforehand, but you can use a wildcard in the copy command:
cp ./*/ffmpeg /etc/bin
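If you do want the directory name in a variable, you can read it out of the archive listing before unpacking. A minimal sketch, assuming the tarball contains a single top-level directory (true for these static builds):
#!/bin/bash
archive=ffmpeg-git-i686-static.tar.xz
# The first entry in the listing is the top-level directory; strip everything after the first slash
dir=$(tar tf "$archive" | head -n 1 | cut -d/ -f1)
tar xf "$archive"
cp "./$dir/ffmpeg" /usr/local/bin/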
That said, I don't recommend installing a tarball-extracted executable into /etc/bin; /etc is for configuration files, and /etc/bin is not a standard binaries directory.
I'd put a symbolic link inside /usr/local/bin/ instead:
#!/usr/bin/env sh
__opwd="$PWD"
trap 'cd "$__opwd"' EXIT ABRT INT
wget https://johnvansickle.com/ffmpeg/builds/ffmpeg-git-i686-static.tar.xz || exit 1
# Create the directory the unpacked archive will be installed into
mkdir -p '/opt/ffmpeg-git-i686-static' || exit 1
# Unpack the archive
cd '/opt/ffmpeg-git-i686-static' || exit 1
tar xf "$__opwd/ffmpeg-git-i686-static.tar.xz" || exit 1
# Symlink the unpacked ffmpeg command into /usr/local/bin/
ln -sf /opt/ffmpeg-git-i686-static/*/ffmpeg /usr/local/bin/
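After the script runs, you can confirm the link is picked up from the PATH:
command -v ffmpeg    # should print /usr/local/bin/ffmpeg
ffmpeg -version      # confirms the static binary actually runs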

Related

bash shell generates a link that was not specified

I wrote a simple bash script (in /homedir) to run an executable and then move the outputs to /workdir. I also made a soft link to /workdir, named work, in /homedir so I can switch easily between the folders.
All steps work well, except that an unspecified soft link named 'grids' is created in /workdir, pointing to itself. I can't delete it without losing all the outputs as well.
How can this happen?
#!/bin/bash
cd ..
expname=`basename "$PWD"`
echo 'experiment name: '$expname
homedir=/home/b/b380963/icon_foehn/$expname/grids/
workdir=/work/bb1096/b380963/icon_foehn/$expname/grids/
if [ ! -d ${workdir} ]; then
    mkdir -p ${workdir}
fi
cd $homedir
ln -s ${workdir} work
cd /home/b/b380963/nwp/dwd_icon_tools_v2/icontools/
./icongridgen --nml $homedir/gridgen_MCH_july.nml
mv ICON_1E_* $workdir/
mv base_grid* $workdir/
It's quite easy to see in your code:
workdir=/work/bb1096/b380963/icon_foehn/$expname/grids/
...
ln -s ${workdir} work
The ln -s command is what creates the symlink. The subtlety is that on every run after the first, work already exists as a symlink to a directory, so ln -s ${workdir} work descends into it and creates a new link inside the target directory instead: it is named grids (the basename of $workdir) and points back at the directory itself.
If you don't want that symlink at all, you can comment that line out (don't delete it: in case you change your mind, it's easier to uncomment it).
You can solve your issue by using:
ln -sTf ...
Here -T treats the destination as a normal file, so ln never descends into an existing link, and -f removes the existing destination beforehand.
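A sketch of the two behaviours, using throwaway paths:
mkdir -p /tmp/demo/grids
cd /tmp/demo
ln -s /tmp/demo/grids work     # first run: creates the 'work' symlink
ln -s /tmp/demo/grids work     # second run: 'work' exists, so ln descends into it
                               # and creates /tmp/demo/grids/grids -> itself
ln -sTf /tmp/demo/grids work   # -T never descends, -f replaces 'work' itself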

Creating multiple directories with shell script results in odd directory structure

I wrote a simple bash script to create some directories, like so:
#!/bin/sh
PROJECT_DIR=$(cd "$(dirname "$0")" && pwd)
cd ${PROJECT_DIR} || exit 1
mkdir -p Website/{static/{cs,js},templates/{html,xhtml}}
However, after I run the script (./script.sh), the directories are created with literal brace characters in their names.
There is no syntax error in my script, and when I try the mkdir command directly in the terminal, the directories are created correctly.
Why does the script behave this way when run?
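The likely cause: brace expansion is a bash feature, not part of POSIX sh, so under the #!/bin/sh shebang (often dash) the braces reach mkdir literally and produce directories such as {static. Pointing the shebang at bash makes the script match the interactive behaviour:
#!/bin/bash
# bash performs the brace expansion, creating Website/static/cs,
# Website/static/js, Website/templates/html and Website/templates/xhtml
PROJECT_DIR=$(cd "$(dirname "$0")" && pwd)
cd "${PROJECT_DIR}" || exit 1
mkdir -p Website/{static/{cs,js},templates/{html,xhtml}}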

how to unzip a file using unzip command?

I have a script which creates a folder named "data", downloads .zip files with wget, and moves them from the current directory into "data". After that I want to unzip those files. unzip filename.zip works when I use it interactively on the command line, but I don't know why it's not working in the script.
Here is the script:
#!/bin/bash
mkdir data
wget http://187.191.75.115/gobmx/salud/datos_abiertos/datos_abiertos_covid19.zip && mv datos_abiertos_covid19.zip data && unzip datos_abiertos_covid19.zip
wget http://187.191.75.115/gobmx/salud/datos_abiertos/diccionario_datos_covid19.zip && mv diccionario_datos_covid19.zip data && unzip diccionario_datos_covid19.zip
datos_abiertos_covid19.zip and diccionario_datos_covid19.zip are the files I want to unzip once they are in my folder "data". I would really appreciate if someone can help me. Thanks in advance!
It fails because unzip foo.zip assumes foo.zip is in the current directory, but you just moved it to the subdirectory data. Interactively, you probably cd data first, which is why it works there.
To make it work in your script, have the script cd data as well:
#!/bin/bash
mkdir data
cd data || exit 1
wget http://187.191.75.115/gobmx/salud/datos_abiertos/datos_abiertos_covid19.zip && unzip datos_abiertos_covid19.zip
That way, the file is downloaded directly to the data directory so no mv is necessary, and the unzip command works as expected.
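Alternatively, both tools can be pointed at a target directory without changing into it. A sketch using wget's -P (directory prefix) and unzip's -d options:
#!/bin/bash
mkdir -p data
wget -P data http://187.191.75.115/gobmx/salud/datos_abiertos/datos_abiertos_covid19.zip \
    && unzip data/datos_abiertos_covid19.zip -d data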
My approach:
#!/bin/bash
set -e # Exit if any command fails
mkdir data
pushd ./data >/dev/null
for i in 'datos_abiertos_covid19.zip' 'diccionario_datos_covid19.zip'; do
# Don't unzip (or exit) if 'wget' fails, don't exit if 'unzip' fails
wget "http://187.191.75.115/gobmx/salud/datos_abiertos/$i" -O "./$i" || continue
unzip "./$i" || true
done
popd >/dev/null
The file names don't need to be quoted in this case, but I did so anyway to emphasise that you can (and should) quote when necessary.
You could of course use variables for the file list, URL, download directory, etc. if you wanted to build a more general script for downloading zip files.
I know the question is tagged bash, but it's worth mentioning that pushd and popd are not defined by POSIX; you can change them to cd ./data and cd .. for more portability. Obviously wget is not POSIX either, but it is very common.
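For reference, the portable variant that note describes, with pushd/popd swapped for plain cd:
#!/bin/sh
set -e # Exit if any command fails
mkdir data
cd ./data
for i in 'datos_abiertos_covid19.zip' 'diccionario_datos_covid19.zip'; do
    # Don't unzip (or exit) if 'wget' fails, don't exit if 'unzip' fails
    wget "http://187.191.75.115/gobmx/salud/datos_abiertos/$i" -O "./$i" || continue
    unzip "./$i" || true
done
cd ..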

Extract all tar.bz2 files to a directory

I have many tar.bz2 files in a directory, and would like to extract them to another directory.
Here is my bash script:
for i in *.tar.bz2 do;
sudo tar -xvjf $i.tar.bz2 -C ~/myfiles/
done
It doesn't work. How can I make it work? Thanks!
Your variable $i already contains the entire file name (the glob *.tar.bz2 matches the full name, extension included), so inside the for loop you shouldn't append the extension again.
Try:
for i in *.tar.bz2; do
sudo tar -xvjf "$i" -C ~/myfiles/
done
You also had the ; misplaced: it belongs after the word list, before do (for i in *.tar.bz2; do), not after do.
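One more caveat not raised above: tar -C fails if the target directory doesn't exist yet, so it's safest to create it first:
mkdir -p ~/myfiles
for i in *.tar.bz2; do
    sudo tar -xvjf "$i" -C ~/myfiles/
done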

Including a chunk of code in a shell script

I have a number of shell scripts that all look like this:
#!/bin/bash
cd ~/Dropbox/cms_sites/examplesite/media
sass -C --style compressed --update css:css
cd ~/Dropbox/cms_sites/examplesite
rm -f ./cache/*.html
rm -fr ./media/.sass-cache/
rm -fr ./admin/media/.sass-cache/
rsync -auvzhL . username@host:/home/username/remote_folder
(I know the use of cd seems weird, but these scripts have evolved!)
Now, all these scripts differ slightly: they have different usernames, hosts, and local and remote folder names. I want an inexperienced user to be able to run them without arguments (so he can drag and drop them into a terminal without issue).
What I'd like to do is something like:
#!/bin/bash
cd ~/Dropbox/cms_sites/examplesite/media
sass -C --style compressed --update css:css
cd ~/Dropbox/cms_sites/examplesite
include ~/scripts/common.sh
rsync -auvzhL . username@host:/home/username/remote_folder
then have a file in common.sh that looks like:
rm -f ./cache/*.html
rm -fr ./media/.sass-cache/
rm -fr ./admin/media/.sass-cache/
so that I can easily change sections of the code in lots of scripts at once.
Is this possible, or is there a better way to do this without using arguments and having one script?
Use the source command; it's bash's version of 'include'. It reads and executes the file in the current shell, so anything common.sh does (variable assignments, cd, and so on) takes effect in the calling script.
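For example, the script from the question becomes (only the include line changes):
#!/bin/bash
cd ~/Dropbox/cms_sites/examplesite/media
sass -C --style compressed --update css:css
cd ~/Dropbox/cms_sites/examplesite
source ~/scripts/common.sh   # POSIX spelling: . ~/scripts/common.sh
rsync -auvzhL . username@host:/home/username/remote_folder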
No need for an "include" if the script is executable:
~/scripts/common.sh
If the script is not executable, or does not have an appropriate shebang line, you'll need to specify the interpreter:
bash ~/scripts/common.sh
Note that either way the file runs in a child process, so unlike source it cannot change the parent script's working directory or variables; for a snippet that only runs rm commands, as here, that makes no difference.