Extract all tar.bz2 files to a directory - bash

I have many tar.bz2 files in a directory, and would like to extract them to another directory.
Here is my bash script:
for i in *.tar.bz2 do;
sudo tar -xvjf $i.tar.bz2 -C ~/myfiles/
done
It doesn't work. How can I make it work? Thanks!

Your variable $i already contains the entire file name (it is what the glob pattern *.tar.bz2 matched), so inside your for loop you don't need to append the extension again.
Try:
for i in *.tar.bz2; do
sudo tar -xvjf "$i" -C ~/myfiles/
done
You also have the ; misplaced: it belongs after the glob (for i in *.tar.bz2; do), not after do.
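If you also want the loop to cope gracefully with a directory that contains no archives, a minimal variation (this sketch assumes bash, and that ~/myfiles may not exist yet) would be:
shopt -s nullglob                 # make the glob expand to nothing when there are no matches
mkdir -p ~/myfiles                # create the target directory if it is missing
for i in *.tar.bz2; do
sudo tar -xvjf "$i" -C ~/myfiles/
done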

Related

How to unzip a file using the unzip command?

I have a script which creates a folder named "data". It then downloads files using wget and moves them (.zip format) from the current directory into the "data" folder. After that I want to unzip these files. I'm using unzip filename.zip, and it works when I run it on the command line, but I don't know why it's not working in the script.
Here is the script:
#!/bin/bash
mkdir data
wget http://187.191.75.115/gobmx/salud/datos_abiertos/datos_abiertos_covid19.zip && mv datos_abiertos_covid19.zip data && unzip datos_abiertos_covid19.zip
wget http://187.191.75.115/gobmx/salud/datos_abiertos/diccionario_datos_covid19.zip && mv diccionario_datos_covid19.zip data && unzip diccionario_datos_covid19.zip
datos_abiertos_covid19.zip and diccionario_datos_covid19.zip are the files I want to unzip once they are in my folder "data". I would really appreciate it if someone could help me. Thanks in advance!
It fails because unzip foo.zip assumes foo.zip is in the current directory, but you just moved it to a subdirectory data. Interactively, you probably cd data first and that's why it works.
To make it work in your script, just have your script cd data as well:
#!/bin/bash
mkdir data
cd data || exit 1
wget http://187.191.75.115/gobmx/salud/datos_abiertos/datos_abiertos_covid19.zip && unzip datos_abiertos_covid19.zip
That way, the file is downloaded directly to the data directory so no mv is necessary, and the unzip command works as expected.
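The second archive from your script can be handled the same way, with another line of the same shape:
wget http://187.191.75.115/gobmx/salud/datos_abiertos/diccionario_datos_covid19.zip && unzip diccionario_datos_covid19.zip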
My approach:
#!/bin/bash
set -e # Exit if any command fails
mkdir data
pushd ./data >/dev/null
for i in 'datos_abiertos_covid19.zip' 'diccionario_datos_covid19.zip'; do
# Don't unzip (or exit) if 'wget' fails, don't exit if 'unzip' fails
wget "http://187.191.75.115/gobmx/salud/datos_abiertos/$i" -O "./$i" || continue
unzip "./$i" || true
done
popd >/dev/null
The file names don't need to be quoted in this case, but I did so anyway to emphasise that you can (and should) quote them when necessary.
You could of course use variables for the file list, URL, download dir, etc. if you wanted to build a more general script for downloading zip files.
I know it's marked bash, but worth mentioning: pushd and popd are not defined in POSIX; you can change them to cd ./data and cd .. for more portability. Obviously wget is not POSIX either, but it is very common (see this thread for interesting info on that topic).
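For reference, a sketch of that more portable variant (same URLs and behaviour, just plain cd instead of pushd/popd) might look like:
#!/bin/sh
set -e # Exit if any command fails
mkdir data
cd data
for i in 'datos_abiertos_covid19.zip' 'diccionario_datos_covid19.zip'; do
# Don't unzip (or exit) if 'wget' fails, don't exit if 'unzip' fails
wget "http://187.191.75.115/gobmx/salud/datos_abiertos/$i" -O "./$i" || continue
unzip "./$i" || true
done
cd ..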

Bash If then that reads a list in a file condition

Here is the condition:
I have a file with all packages installed.
I have a folder with all kinds of other packages, but they include all of the ones in the list, plus more.
I need a bash script that will read the file, check the folder for packages that don't exist in the list, and remove them (they are not needed), while keeping the packages that are on the list in that folder.
Or perhaps the script should read the folder, and if packages in the folder aren't on the list, rm -f that package (or those packages).
I am familiar with writing if/then conditional statements; I just don't know how to turn the items in the list into a variable (or variables) in a loop.
thanks!
I would move the packages on the list to a new folder, delete the original folder, and move the temporary folder back:
DIR=directory-name
mkdir "$DIR-tmp"
while read -r pkgname; do   # -r stops backslashes in package names from being interpreted
if [[ -f "$DIR/$pkgname" ]]; then
mv "$DIR/$pkgname" "$DIR-tmp"
fi
done < package-list.txt
# Confirm $DIR-tmp has the files you want first!
rm -rf "$DIR"
mv "$DIR-tmp" "$DIR"
I think you want something like this:
for file in $(ls folder) ; do
grep -F "$file" install-list-file >/dev/null || \
echo "folder/$file"
done > rm-list
vi rm-list # view file to ensure correct
rm $(<rm-list)
There are ways to make this faster (using parameter substitution to avoid fork/execs), but I recommend avoiding fancy shell stuff [${file##*/}] until you've got the basics down. Also, this basically just translates the description into a script and is not intended to be much more than a guide on how to approach the problem.
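A slightly more defensive variant of the same idea, assuming the list file contains one bare package file name per line, avoids parsing ls output and matches names exactly:
for file in folder/*; do
name=$(basename "$file")                       # file name without the folder/ prefix
grep -Fxq "$name" install-list-file || echo "$file"
done > rm-list
vi rm-list   # review the list before deleting anything
rm $(<rm-list)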

Loop to unzip from one directory to individual directories

I am trying to design a loop that implements a lot of single elements I have seen before and the combination is throwing me off. Basically I have a structure like this:
/bin/script.sh
/conf/patches/1.zip
/conf/patches/2.zip
/conf/patches/3.zip
/sharedir
I want a loop that will go through however many patches I have in /conf/patches and unzip each patch into a separate directory in /sharedir. Each directory should be named after the file.
What I was trying so far was:
for file in '../conf/patches/*.zip'
do
unzip "${file%%.zip}" -d /sharedir/$file
done
As you can see...there is definitely something I am missing in this combination.
Try this:
for file in /conf/patches/*.zip
do
f="${file##*/}"
mkdir -p "/sharedir/${f%.zip}"
unzip -d "/sharedir/${f%.zip}" "${file}"
done
Remove the quotes from the glob pattern, otherwise it is not expanded:
for file in ../conf/patches/*.zip
do
unzip "${file%%.zip}" -d /sharedir/
done
EDIT: You can try
for f in ../conf/patches/*.zip; do
g="${f##*/}"                              # keep only the file name, e.g. 1.zip
unzip -d "/sharedir/${g%.zip}" "$f"
done

Bash shell: how to add a name

I am trying to rename some zip files in bash with an _orig suffix but I seem to be missing something. Any suggestions?
My goal:
move files to an orig directory
rename original files with a "_orig" in the name
The code I've tried to write:
mv -v $PICKUP/*.zip $ORIGINALS
for origfile in $(ls $ORIGINALS/*.zip);do
echo "Adding _orig to zip file"
echo
added=$(basename $origfile '_orig').zip
mv -v $ORIGINALS/$origfile.zip $ORIGINALS/$added.zip
done
Sorry still kinda new at this.
Using (p)rename:
cd <ZIP DIR>
mkdir -p orig
rename 's#(.*?)\.zip#orig/$1_orig.zip#' *.zip
rename is http://search.cpan.org/~pederst/rename/ (default on many distros). For example, foo.zip in the current directory would become orig/foo_orig.zip.
Also, take care never to use
for i in $(ls $ORIGINALS/*.zip); do
but use globs instead:
for i in "$ORIGINALS"/*.zip; do
See http://porkmail.org/era/unix/award.html#ls.
I know you've got a solution already, but just for posterity, this simplified version of your own shell script should also work for the case you seem to be describing:
mkdir -p "$ORIGINALS"
for file in "$PICKUP"/*.zip; do
mv -v "$file" "$ORIGINALS/${file%.zip}_orig.zip"
done
This makes use of "Parameter Expansion" in bash (you can look that up in bash's man page). The initial mkdir -p simply ensures that the target directory exists. The quotes around $PICKUP and $ORIGINALS are intended to make it safe to include special characters like spaces and newlines in the directory names.
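As a concrete illustration of that expansion (the directory names here are hypothetical, not from your setup):
PICKUP=/srv/pickup                               # example values only
ORIGINALS=/srv/orig
file=/srv/pickup/report.zip
base="${file##*/}"                               # -> report.zip
echo "$ORIGINALS/${base%.zip}_orig.zip"          # -> /srv/orig/report_orig.zip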
While prename is a powerful solution to many problems, it's certainly not the only hammer in the toolbox.

bash - For every file in a directory, copy it into another directory, only if it doesn't exist there already

Thank you very much in advance for helping!
I have a directory with some html files
$ ls template/content/html
devel.html
idex.html
devel_iphone.html
devel_ipad.html
I'd like to write a bash function to copy every file in that folder into a new location (introduction/files/), ONLY if a file with the same name doesn't exist already there.
This is what I have so far:
orig_html="template/content/html";
dest_html="introduction/files/";
function add_html {
for f in $orig_html"/*";
do
if [ ! -f SAME_FILE_IN_$dest_html_DIRECTORY ];
then
cp $f $dest_html;
fi
done
}
The capital letters are where I was stuck.
Thank you very much.
Would the -n option be enough for your needs?
-n, --no-clobber
do not overwrite an existing file (overrides a previous -i option)
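A minimal sketch using that option with the variables from your script (assuming a cp that supports -n, as GNU coreutils does) would be:
cp -n "$orig_html"/* "$dest_html"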
use rsync like this:
rsync -c -avz --delete $orig_html $dest_html
which keeps $dest_html identical to $orig_html, based on file checksums. (Note that --delete also removes files from the destination that are not in the source; if you only want to avoid copying files that already exist there, rsync's --ignore-existing option is closer to that.)
Do you need a bash script? cp supports the -r (recursive) option, and the -u (update) option. From the man page:
-u, --update
copy only when the SOURCE file is newer than the destination
file or when the destination file is missing
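Applied to the paths in your question, that might look like the line below (assuming GNU cp; note that -u will also overwrite destination files that are older than the source, which is slightly broader than copying only missing files):
cp -ru template/content/html/. introduction/files/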
Your $f variable contains the full path, and because the /* is inside quotes the glob is never expanded at all. Strip the directory part before checking the destination:
Try doing something like:
for ff in $orig_html/*
do
thisFile="${ff##*/}"                    # file name without the path
if [ ! -f "${dest_html}/$thisFile" ]; then
cp "$ff" "${dest_html}"
fi
done
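Putting that together with the variables from your question, the whole function might look like this (a sketch, not tested against your directory tree):
orig_html="template/content/html"
dest_html="introduction/files"
function add_html {
for f in "$orig_html"/*; do
if [ ! -f "$dest_html/${f##*/}" ]; then   # copy only if no file with that name exists yet
cp "$f" "$dest_html/"
fi
done
}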
