I'm trying to exclude the current directory entry from the tarball without excluding its contents, because when I extract it using the -k flag I get an exit status of 1 and the message
./: Already exists
tar: Error exit delayed from previous errors.
How do I do this? I've tried the --exclude flag but that excludes the contents also (rightly so). I'm trying to code this for both the OSX/BSD and GNU versions of tar.
Test case:
# Setup
mkdir /tmp/stackoverflow
cd /tmp/stackoverflow
mkdir dir
touch dir/file
# Create
tar cCf dir dir.tar .
# List contents
tar tf dir.tar
gives
./
./file
showing that the current directory ./ is in the tar. This would be fine, but when I do the following:
mkdir dir2
tar xkfC dir.tar dir2
due to the -k flag, I get an exit code of 1 and the message
./: Already exists
tar: Error exit delayed from previous errors.
To exclude the current directory you can create your archive this way:
tar cf /path/to/dir.tar ./*
Use ./* instead of .: the glob will not match the current directory entry (.), so it will not be included in the archive. (Note that ./* also skips hidden files unless dotglob is set.)
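A quick sketch of the difference, using a throwaway directory (GNU tar behaviour shown in the comments):

```shell
# Build a one-file directory and archive it both ways.
demo=$(mktemp -d)
mkdir "$demo/dir"
touch "$demo/dir/file"
cd "$demo/dir"

tar cf ../dot.tar .      # `.` archives the directory entry `./` itself
tar cf ../glob.tar ./*   # the glob expands to ./file only

tar tf ../dot.tar        # lists ./ and ./file
tar tf ../glob.tar       # lists only ./file
```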
This does the trick:
GLOBIGNORE=".:.."
cd dir
tar cf ../dir.tar *
The extra cd replaces the use of the -C flag, so that we can use a glob. Setting GLOBIGNORE makes the glob ignore the current directory, and it also implicitly enables shopt -s dotglob, so hidden files are included. Using * instead of ./* means the tar file listing doesn't show ./file, but instead lists it as file. The entire listing of the tar is:
file
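Setting GLOBIGNORE really does switch on dotglob behaviour, which you can check in a throwaway directory (filenames here are made up):

```shell
# A non-null GLOBIGNORE makes * match dotfiles (and always skips . and ..).
demo=$(mktemp -d)
cd "$demo"
touch .hidden visible
GLOBIGNORE=".:.."
echo *    # .hidden visible
```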
Related
I have a bash script I'm trying to write
I have 2 base directories:
./tmp/serve/
./src/
I want to go through all the directories in ./tmp and copy the *.html files into the same folder path in ./src
i.e
if I have an HTML file at ./tmp/serve/app/components/help/help.html -->
copy it to ./src/app/components/help/ and recursively do this for all subdirectories in ./tmp/
NOTE: the folder structures should already exist, so I just need to copy the files. If a folder doesn't exist, it would be nice if the script created it for me (not essential), but with Git I can track these folders and manually handle any loose HTML files.
I got as far as
echo $(find . -name "*.html")\n
But I'm not sure how to actually extract the file path with pwd and do what I need; maybe it's not a one-liner and needs to be done with some vars.
something like
for i in $(find ./tmp/ -name "*.html")
do
cp -r "$i" ./src/app/components/help/
done
going so far to create the directories would take some more time for me.
I'll try to do it on my own and see if I come up with something
but for argument's sake, if you do run pwd and get a response, the pseudo-code for that:
pwd
get response
if that directory does not exist in src create that directory
copy all the original directories contents into the new folder at /src/$newfolder
(possibly running two for loops, one to check the directory tree, and then one to go through each original directory, copying all the html files)
You can use process substitution to loop over the output of your find command, create the destination directory(ies), and then copy the file(s):
#!/bin/bash
# accept first parameters to script as src_dir and dest values or
# simply use default values if no parameter(s) passed
src_dir=${1:-/tmp/serve}
dest=${2:-src}

while read -r orig_path ; do
    # To replace the first occurrence of a pattern with a given string,
    # use ${parameter/pattern/string}
    dest_path="${orig_path/tmp\/serve/${dest}}"
    # Use dirname to remove the filename from the destination path
    # and create the destination directory.
    dest_dir=$(dirname "${dest_path}")
    mkdir -p "${dest_dir}"
    cp "${orig_path}" "${dest_path}"
done < <(find "${src_dir}" -name '*.html')
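The ${parameter/pattern/string} substitution used above can be tried on its own (the path is just an example):

```shell
# Replace the first occurrence of "tmp/serve" with "src" in a path.
orig_path="./tmp/serve/app/components/help/help.html"
dest_path="${orig_path/tmp\/serve/src}"
echo "$dest_path"    # ./src/app/components/help/help.html
```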
This script copies .html files from the src directory to the des directory (creating subdirectories as needed): find the files, strip the src directory name, and copy them into the destination directory.
#!/bin/bash
for i in $(find src/ -name "*.html")
do
    # strip the leading src/ from the path
    file=${i#src/}
    # --parents recreates the directory structure under des/
    (cd src && cp --parents "$file" ../des)
done
Not sure if you must use bash constructs or not, but here is a GNU tar solution (if you use GNU tar), which IMHO is the best way to handle this situation because all the metadata for the files (permissions, etc.) are preserved:
find ./tmp/serve -name '*.html' -type f -print0 | tar --null -T - -c | tar -x -v -C ./src --strip-components=3
This finds all the .html files (-type f) in the ./tmp/serve directory and prints them nul-terminated (-print0). These filenames are sent via stdin to tar as nul-terminated literals (--null) for inclusion (-T -), creating (-c) an archive that is piped to a second tar instance. The second tar extracts (-x) the archive, optionally printing its contents along the way (-v), changing directory to the destination (-C ./src) before commencing and stripping (--strip-components=3) the ./tmp/serve/ prefix from the files. (You could also cd ./tmp/serve beforehand, use find . instead, and change -C to ../../src, in which case --strip-components=1 would suffice.)
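Here is the same pipeline run against a scratch copy of the question's layout (GNU tar and find assumed; -v omitted, and -f - spelled out explicitly for both tar instances):

```shell
# Recreate the directory layout, then find | tar | tar into ./src.
demo=$(mktemp -d)
cd "$demo"
mkdir -p tmp/serve/app/components/help src
touch tmp/serve/app/components/help/help.html

find ./tmp/serve -name '*.html' -type f -print0 \
  | tar --null -T - -cf - \
  | tar -xf - -C ./src --strip-components=3

# ./ + tmp + serve = 3 stripped components, so the file lands here:
ls src/app/components/help
```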
I have a folder foo containing 3 files.
How do I create a tarball, using the terminal, that contains only the three files rather than foo/ with those three files inside?
Thanks for your help.
With bash and GNU tar:
shopt -s dotglob
cd /path/to/foo && tar -cvf /tmp/file.tar -- *
From man bash:
dotglob: If set, bash includes filenames beginning with a `.' in the results of pathname expansion.
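A throwaway demonstration of the option (filenames invented):

```shell
demo=$(mktemp -d)
cd "$demo"
touch .hidden visible

shopt -u dotglob
echo *    # visible
shopt -s dotglob
echo *    # .hidden visible
```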
I have a bash script that is copying some files, but it doesn't seem to be working properly. A side note is that there are no matching files in the source directory. But the point of the script is to copy files if there are files to copy.
A basic snippet of what I'm trying to do:
source_loc=/u01
target_loc=/u02
/usr/bin/cp "$source_loc"/dir/*file* "$target_loc"/dir/
Results in
Usage: cp [-fhipHILPU][-d|-e] [-r|-R] [-E{force|ignore|warn}] [--] src target
or: cp [-fhipHILPU] [-d|-e] [-r|-R] [-E{force|ignore|warn}] [--] src1 ... srcN directory
If I add set -x to my script, I get this...
+ /usr/bin/cp /u02/dir/
Usage: cp [-fhipHILPU][-d|-e] [-r|-R] [-E{force|ignore|warn}] [--] src target
or: cp [-fhipHILPU] [-d|-e] [-r|-R] [-E{force|ignore|warn}] [--] src1 ... srcN directory
+ set +x
The extra peculiar thing about this is that if I re-run the script without changing anything, I get this as my output instead:
cp: /u01/dir/*file*: No such file or directory
Now I haven't tested this script with matching files in the source (I will be very shortly) but I want to make sure I'm not missing something. I don't care about getting an error, I just want to be sure I get the correct error (i.e. no such file or directory).
Any insight would be appreciated.
You can use find as suggested by @elliotfrisch:
find "$source_loc/dir" -maxdepth 1 -type f -name "*file*" -exec cp {} "$target_loc/dir" \;
Alternatively, in Bash, you can capture the glob results into an array and invoke cp when the array is not empty:
shopt -s nullglob # glob expands to nothing if there are no matching files
files=("$source_loc/dir/"*file*)
((${#files[@]} > 0)) && cp "${files[@]}" "$target_loc"/dir/
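With nullglob set, an unmatched glob disappears entirely, which is what makes the array-length test reliable. For example, against an empty scratch directory:

```shell
shopt -s nullglob
demo=$(mktemp -d)          # empty, so nothing matches *file*
files=("$demo"/*file*)
echo "${#files[@]}"        # 0 -- the cp never runs
```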
/tmp/-> ls ab*
/tmp/-> ls: ab*: No such file or directory
/tmp/-> tar -cvf ab.tar abc*
tar: abc*: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors
/tmp/->
/tmp/-> ls ab*
ab.tar
/tmp/-> tar -tvf ab.tar
/tmp/->
As can be seen, there are no files matching the pattern abc*; however, an output file named ab.tar was created with no content. Is there a switch/option that can be passed to the tar command so that no output file is created when there are no input files?
I’m fond of using a for-as-if construct for such cases:
for x in abc*; do
# exit the loop if no file matching abc* exists
test -e "$x" || break
# by now we know at least one exists (first loop iteration)
tar -cvf ab.tar abc*
# and since we now did the deed already… exit the “loop”
break
done
The body of the “loop” is run through exactly once, but the shell does the globbing for us. (I normally use continue in the place of the first break, but that’s probably not needed.)
Alternatively, you can have the shell expand the glob into the positional parameters…
set -- abc*
test -e "$1" && tar -cvf ab.tar abc*
If your script runs under set -e, use if test …; then tar …; fi instead, otherwise it will abort when no file exists.
All these variants work in plain sh as well.
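For example, run in an empty scratch directory, the set -- variant leaves the unexpanded pattern in $1, test -e fails, and no archive is created (the if form is used here so the sketch also behaves under set -e):

```shell
demo=$(mktemp -d)
cd "$demo"
set -- abc*
echo "$1"    # abc* -- the literal, unexpanded pattern
if test -e "$1"; then tar -cvf ab.tar abc*; fi
ls "$demo"   # empty: no ab.tar was created
```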
There is a way to get the shell to do it:
#!/bin/sh
# safetar -- execute tar safely
bash -O failglob -c 'tar cvf ab.tar abc*'
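Run in an empty scratch directory, the failglob shell refuses to execute tar at all, so no empty archive appears:

```shell
demo=$(mktemp -d)
cd "$demo"
bash -O failglob -c 'tar cvf ab.tar abc*' 2>/dev/null
echo "exit status: $?"   # non-zero: the glob failed, tar never ran
ls "$demo"               # empty: no ab.tar
```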
Is there a switch/option that can be passed to the tar command so that no output file is created when there are no input files?
GNU tar does not have such an option.
Here are two alternatives. You need to study them and figure out what would work for you, as they're a bit of a hack.
You could do something like:
Tar, test, remove when empty
tar -cvf ab.tar abc* ||
tar tf ab.tar | read ||
rm ab.tar
Explanation:
If tar -cvf ... fails, list the archive's contents with tar tf ....
If the read fails, the archive was empty, and it's safe to remove it.
Or you could try:
Test, then tar
ls abc* | read && tar -cvf ab.tar abc*
This would not create the empty tar file in the first place.
I'm using the command cp ./* "backup_$timestamp" in a bash script to backup all files in directory into a backup folder in a subdirectory. This works fine, but the script keeps outputting warning messages:
cp: omitting directory `./backup_1364935268'
How do I tell cp to shut up without silencing any other warnings that I might want to know about?
The solution that works for me is the following:
find . -maxdepth 1 -type f -exec cp {} backup_1364935268/ \;
It copies all files (including those starting with a dot) from the current directory, does not touch directories, and does not complain about them.
You probably want to use cp -r in that script. That copies the source recursively, including directories; the directories get copied and the messages disappear.
If you don't want to copy directories you can do the following:
redirect stderr to stdout using 2>&1
pipe the output to grep -v
script 2>&1 | grep -v 'omitting directory'
quote from grep man page:
-v, --invert-match
Invert the sense of matching, to select non-matching lines.
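Putting the pieces together in a scratch directory (file and directory names invented):

```shell
demo=$(mktemp -d)
cd "$demo"
mkdir backup subdir
touch file1
# stderr is merged into stdout, then the directory warnings are filtered out
cp ./* backup/ 2>&1 | grep -v 'omitting directory'
ls backup    # file1 -- the directories were skipped, silently
```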
When copying a directory, make sure you use -R
cp -R source source_duplicate_copy_name
-R, -r, --recursive copy directories recursively
--reflink[=WHEN] control clone/CoW copies. See below
--remove-destination remove each existing destination file before
attempting to open it (contrast with --force)
--sparse=WHEN control creation of sparse files. See below
--strip-trailing-slashes remove any trailing slashes from each SOURCE