/tmp/-> ls ab*
ls: ab*: No such file or directory
/tmp/-> tar -cvf ab.tar abc*
tar: abc*: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors
/tmp/->
/tmp/-> ls ab*
ab.tar
/tmp/-> tar -tvf ab.tar
/tmp/->
As can be seen, there are no files matching the pattern abc*, yet an output file named ab.tar got created with no content. Is there a switch/option that can be passed to the tar command so that no output file is created when there are no input files?
I’m fond of using a for-as-if construct for such cases:
for x in abc*; do
# exit the loop if no file matching abc* exists
test -e "$x" || break
# by now we know at least one exists (first loop iteration)
tar -cvf ab.tar abc*
# and since we now did the deed already… exit the “loop”
break
done
The body of the “loop” is run through exactly once, but the shell does the globbing for us. (I normally use continue in the place of the first break, but that’s probably not needed.)
Alternatively, you can use the shell to expand the glob into the positional parameters…
set -- abc*
test -e "$1" && tar -cvf ab.tar abc*
If your script runs under set -e, use if test …; then tar …; fi instead, otherwise it will abort when no file exists.
All these variants work in plain sh as well.
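Under set -e, that guarded form might look like this (a minimal sketch, reusing the same glob):
set -e
set -- abc*
# The if keeps a failed test from aborting the script under set -e;
# tar runs only when the glob matched at least one real file.
if test -e "$1"; then
    tar -cvf ab.tar abc*
fi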
There is a way to get the shell to do it:
#!/bin/sh
# safetar -- execute tar safely
bash -O failglob -c 'tar cvf ab.tar abc*'
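failglob is a bash feature (hence the inner bash invocation): when a glob matches nothing, the shell raises an error and the command is never run at all. A quick interactive demonstration:
shopt -s failglob
tar cvf ab.tar abc*   # bash reports "no match: abc*" and tar never starts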
Is there a switch/option that can be passed to the tar command so that no output file is created when there are no input files?
GNU tar does not have such an option.
Here are two alternatives. You need to study them and figure out what would work for you, as they're a bit of a hack.
You could do something like:
Tar, test, remove when empty
tar -cvf ab.tar abc* ||
tar tf ab.tar | read ||
rm ab.tar
Explanation:
If tar -cvf ... fails, get the contents with tar tf ....
If the read fails, the archive was empty, and it's safe to remove it.
Or you could try:
Test, then tar
ls abc* | read && tar -cvf ab.tar abc*
This would not create the empty tar file in the first place.
I wanted to write a bash script that will unpack .tar.gz archives and for each result file it will set an additional attribute with the name of the original archive. Just to know what the origin is of the unpacked file.
I tried to store the files inside it in an array and then loop over them.
for archive in "$1"*.tar.gz; do
if [ -f "${archive}" ]
then
readarray -t fileNames < <(tar tzf "$archive")
for file in "${fileNames}"; do
echo "${file}"
tar xvzf "${archive}" -C "$1" --no-wildcards "${file}" &&
attr -s package -V "${archive}" "${file}"
done
fi
done
The result is that only one file is extracted and no extra attribute is set. (The cause: "${fileNames}" expands to only the first array element; looping over all of them would need "${fileNames[@]}".)
#! /bin/bash
for archive in "$1"*.tar.gz; do
if [ -f "${archive}" ] ; then
# Unpack the archive into subfolder $1
tar xvf "$archive" -C "$1"
# Assign attributes
tar tf "$archive" | (cd "$1" && xargs -t -L1 attr -s package -V "$archive" )
fi
done
Notes:
The script unpacks each archive with a single tar invocation. This is more efficient than unpacking one file at a time, and it avoids issues with unpacking folders, which would lead to unnecessary repeated work.
The script uses attr. It would be better to use setfattr, if the target file system supports it, to set attributes on multiple files with a few calls (using xargs with multiple files per command); see the sketch after these notes.
It is not clear what the structure of the output folder should be. From the question, it looks as if all archives are to be unpacked into the same folder "$1". The solution above assumes that this is the intended behavior, and that the archives contain distinct file names. If each archive is to be placed into a different subfolder, the script becomes easier and more efficient to implement.
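As a rough sketch of the setfattr variant mentioned in the notes (user.package is an assumed attribute name; extended-attribute support on the target file system and GNU xargs are assumed):
# Same attribute assignment as above, but via setfattr;
# xargs can pass many files to a single setfattr call.
tar tf "$archive" | (cd "$1" && xargs -d '\n' setfattr -n user.package -v "$archive")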
I'm trying to exclude the current directory from the tarball without excluding its contents, because when I extract it with the -k flag I get an exit status of 1 and a message
./: Already exists
tar: Error exit delayed from previous errors.
How do I do this? I've tried the --exclude flag but that excludes the contents also (rightly so). I'm trying to code this for both the OSX/BSD and GNU versions of tar.
Test case:
# Setup
mkdir /tmp/stackoverflow
cd /tmp/stackoverflow
mkdir dir
touch dir/file
# Create
tar cCf dir dir.tar .
# List contents
tar tf dir.tar
gives
./
./file
showing that the current directory ./ is in the tar. This would be fine, but when I do the following:
mkdir dir2
tar xkfC dir.tar dir2
due to the -k flag, I get an exit code of 1 and the message
./: Already exists
tar: Error exit delayed from previous errors.
To exclude the current directory you can create your archive this way:
tar cf /path/to/dir.tar ./*
Use ./* instead of .; this will not match the current directory (.), so it won't be included in the archive. (Note that ./* also skips hidden files unless dotglob is set.)
This does the trick:
GLOBIGNORE=".:.."
cd dir
tar cf ../dir.tar *
The extra cd replaces the use of the -C flag, so that we can use a glob. GLOBIGNORE excludes . and .. from the match, and setting it also turns on shopt -s dotglob implicitly, so hidden files are included. Using * instead of ./* means that the tar file listing doesn't show ./file, but instead lists it as file. The entire listing of the tar is:
file
I am writing a bash script that pulls files from another server to the current directory. The issue is that I get a lot of files and I only need ~3 of them; however all 3 might not be there.
For example, the server call returns everything:
server call --> file1.txt file2.txt file3.xls file4.json .... (etc)
Then compress files with tar:
tar zcf needed_files.tgz file4.json file23.doc *.txt
But file4.json was not there, so I would expect tar to compress file23.doc and all .txt files but the script fails with:
tar: file4.json: Cannot stat: No such file or directory
I have tried other combinations of tar commands like czvf but no luck.
tar should actually archive the existing files despite the "Cannot stat: No such file or directory" errors; it just exits with a non-zero status.
Anyway, you could also use nullglob in combination with extglob @() to get only the existing files:
shopt -s extglob nullglob
files=( "fileA"@() "fileB"@() *.txt )
(( ${#files[@]} )) && tar zcf needed_files.tgz -- "${files[@]}"
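The trick: appending @() (an extglob that matches the empty string) turns each literal filename into a glob, so nullglob removes it entirely when the file doesn't exist. A small demo, assuming only a.txt is present in the directory:
shopt -s extglob nullglob
files=( "missing.json"@() *.txt )
printf '%s\n' "${files[@]}"   # prints only a.txt; missing.json vanished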
Try an extended glob.
shopt -s extglob # set extended globbing on
if echo file[1234].+(txt|xls|json) | grep -vq '\['
then tar cvzf needed_files.tgz file[1234].+(txt|xls|json)
else echo No matching files for extglob 'file[1234].+(txt|xls|json)'
fi
If matching files exist, it will list them.
If not, it will literally echo back the pattern.
grepping out the pattern metacharacters tells you whether there are any files in the set. If they do exist, use the same glob to provide the files to tar, and it will receive exactly the set of matching files. If they don't, the condition test lets you skip it.
Of course, it breaks if you make files with [ in the names, etc...
Or, you could do it in a loop....
for f in file[1234].+(txt|xls|json)
do  if [[ -e "$f" ]]
    then [[ -e needed_files.tar ]] && c=r || c=c
         tar ${c}vf needed_files.tar "$f"
    fi
done
Not perfect, but might suit your tastes better.
Neither is a great solution, but one of them ought to get you rolling.
tar zcf needed_files.tgz $(ls -d file4.json file23.doc *.txt 2>/dev/null)
Note that the inner command prints only the files that exist:
ls -d file4.json file23.doc *.txt 2>/dev/null
You can also use the --ignore-failed-read option, but it will ignore other read errors as well.
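For completeness, that variant would look like the following; as noted, it should let tar exit successfully despite the missing file, but it masks genuine read errors too (GNU tar only):
tar zcf needed_files.tgz --ignore-failed-read file4.json file23.doc *.txt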
Often after unzipping a file I end up with a directory containing nothing but another directory (e.g., mkdir foo; cd foo; tar xzf ~/bar.tgz may produce nothing but a bar directory in foo). I wanted to write a script to collapse that down to a single directory, but if there are dot files in the nested directory it complicates things a bit.
Here's a naive implementation:
mv -i $1/* $1/.* .
rmdir $1
The only problem here is that it'll also try to move . and .. and ask overwrite ./.? (y/n [n]). I can get around this by checking each file in turn:
IFS=$'\n'
for file in $1/* $1/.*; do
if [ "$file" != "$1/." ] && [ "$file" != "$1/.." ]; then
mv -i $file .
fi
done
rmdir $1
But this seems like an inelegant workaround. I tried a cleaner method using find:
for file in $(find $1); do
mv -i $file .
done
rmdir $1
But find $1 will also give $1 as a result, which gives an error of mv: bar and ./bar are identical.
While the second method seems to work, is there a better way to achieve this?
Turn on the dotglob shell option, which allows your pattern to match files beginning with a dot (.):
shopt -s dotglob
mv -i "$1"/* .
rmdir "$1"
First, consider that many tar implementations provide a --strip-components option that allows you to strip off that first path component. Not sure whether there is a single top-level directory?
tar -tf yourball.tar | awk -F/ '!s[$1]++{print $1}'
will show you all the first-level contents. If there is only that one directory, then
tar --strip-components=1 -xf yourball.tar
will extract the contents of that directory into the current directory.
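Putting the two pieces together, a hedged sketch that strips the top-level directory only when the archive has exactly one first-level entry:
top=$(tar -tf yourball.tar | awk -F/ '!s[$1]++{print $1}')
if [ "$(printf '%s\n' "$top" | wc -l)" -eq 1 ]; then
    tar --strip-components=1 -xf yourball.tar
else
    tar -xf yourball.tar
fi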
So that's how you can avoid the problem altogether. But it's also a solution to your immediate problem. If you have already extracted the files, so that you have
foo/bar/stuff
foo/bar/.otherstuff
you can do
tar -cf- foo | tar --strip-components=2 -C final_destination -xf-
The --strip-components feature is not part of the POSIX specification for tar, but it is on both the common GNU and OSX/BSD implementations.
I would like to untar an archive e.g. "tar123.tar.gz" to directory /myunzip/tar123/" using a shell command.
tar -xf tar123.tar.gz will decompress the files, but into the directory I'm currently working in.
If the filename were "tar233.tar.gz" I would want it decompressed to "/myunzip/tar233/", so the destination directory would be based on the filename.
Does anyone know if the tar command can do this?
tar -xzvf filename.tar.gz -C destination_directory
With Bash and GNU tar:
file=tar123.tar.gz
dir=/myunzip/${file%.tar.gz}
mkdir -p "$dir"
tar -C "$dir" -xzf "$file"
You can change directory before extracting with the -C flag, but the directory has to exist already. (If you create file-specific directories, I strongly recommend against calling them foo.tar.gz - the extension implies that it's an archive file but it's actually a directory. That will lead to confusion.)
Try
file=tar123.tar.gz
dir=/myunzip/$(basename "$file" .tar.gz) # matter of taste and custom here
[ -d "$dir" ] && { echo "$dir already exists" >&2; exit 1; }
mkdir "$dir" && ( gzip -dc "$file" | ( cd "$dir" && tar xf - ) )
If you're using GNU tar you can also give it an option -C "$dir" which will cause it to change to the directory before extracting. But the code above should work even with a Bronze Age tar.
Disclaimer: none of the code above has been tested.