Extract tar.gz file - bash

I have my tar file under:
/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz
With the tar command:
tar -tf "/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz"
The command shows me:
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/exit_codes/
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/exit_codes/code_FUNC
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/exit_codes/code_SCRI
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/login/
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/login/check_appprivilege.php
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/login/check_login.php
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/login/privilege.php
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/scripte/
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/scripte/Lysto BackUp/
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/scripte/Lysto BackUp/sys
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/scripte/Lysto BackUp/sys_func
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/SSH_ERROR
My plan, or rather my wish, is to handle it like this:
IFS=$'\n'
for PATHS in $(tar -tPf "/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz")
do
    SED=$(echo "$PATHS" | sed 's/.*\///')
    if [[ -n "$SED" ]]
    then
        tar -C "${target_archiv}" -xvf "/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz" "$PATHS"
        #echo JA
        echo "$PATHS"
    fi
done
unset IFS
I only want one file from the tar, stored in a different directory....
but this command with the -C doesn't work... it extracts all the files of the tar....
My question is: is it possible to extract only one file from the tar without cd-ing to the directory??
Another question: is it possible to extract only the files of the tar, without the folders? This is maybe the better way, but I don't know how...?
And no, I cannot tar the files without their paths, I need them...
so this is no way for me...
I hope for help here :)

If your ultimate goal is to extract files without the full path, you can use a sed-like expression to rename the files while they are extracted, using the --xform option:
tar -C "${target_archiv}" -xvf "/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz" --xform='s,^.*/,,'
The 's,^.*/,,' expression asks to substitute (s), matching from the beginning of the filename (^) everything (.*) up to the last slash (/), and replacing it with nothing. In other words, it removes the directory structure from the filenames.
If you want to get rid of the empty folders that have been extracted, you may call this command after extracting:
find "${target_archiv}" -mindepth 1 -maxdepth 1 -type d -exec rmdir {} \;
Keep in mind it will remove all the (empty) subfolders of "${target_archiv}", even the ones that were already here before extracting the tarball. However, because rmdir will not remove directories that contain files, it will be mostly harmless to the subdirectories you had.
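To answer the first question as well (one file, no cd): -C plus an explicit member name should do it. A minimal sketch, assuming GNU tar and that the member name is passed exactly as tar -tPf printed it (this archive stores absolute names, so -P is needed for the match; --xform then flattens the name so the file lands in the -C directory):
archive="/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz"
member="/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/SSH_ERROR"
tar -C "${target_archiv}" -xvPf "$archive" --xform='s,^.*/,,' "$member"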

Related

How to write a bash script to copy files from one base to another base location

I have a bash script I'm trying to write
I have 2 base directories:
./tmp/serve/
./src/
I want to go through all the directories in ./tmp and copy the *.html files into the same folder path in ./src
i.e
if I have an html file in ./tmp/serve/app/components/help/help.html -->
copy it to ./src/app/components/help/ and recursively do this for all subdirectories in ./tmp/
NOTE: the folder structures should already exist, so I just need to copy the files. If a folder doesn't exist then hopefully the script could create it for me (not what I want, but with GIT I can track these folders and manually handle those loose html files).
I got as far as
echo $(find . -name "*.html")\n
But I'm not sure how to actually extract the file path with pwd and do what I need to; maybe it's not a one-liner and needs to be done with some vars.
something like
for i in `echo $(find /tmp/ -name "*.html")\n
do
cp -r $i /src/app/components/help/
done
Going so far as to create the directories would take some more time for me.
I'll try to do it on my own and see if I come up with something,
but for argument's sake, if you do run pwd and get a response, the pseudocode for that is:
pwd
get response
if that directory does not exist in src create that directory
copy all the original directories contents into the new folder at /src/$newfolder
(possibly running two for loops, one to check the directory tree, and then one to go through each original directory, copying all the html files)
You can use process substitution to loop over the output from your find command, create the destination directory(ies), and then copy the file(s):
#!/bin/bash
# accept first parameters to script as src_dir and dest values or
# simply use default values if no parameter(s) passed
src_dir=${1:-/tmp/serve}
dest=${2:-src}
while read -r orig_path ; do
    # To replace the first occurrence of a pattern with a given string,
    # use ${parameter/pattern/string}
    dest_path="${orig_path/tmp\/serve/${dest}}"
    # Use dirname to remove the filename from the destination path
    # and create the destination directory.
    dest_dir=$(dirname "${dest_path}")
    mkdir -p "${dest_dir}"
    cp "${orig_path}" "${dest_path}"
done < <(find "${src_dir}" -name '*.html')
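Saved as, say, copy_html.sh (a hypothetical name) and made executable, usage would look like:
./copy_html.sh ./tmp/serve ./src
or with no arguments to fall back to the defaults baked into the script.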
This script copies .html files from the src directory to the des directory (creating the subdirectories if they do not exist).
Find the files, then remove the src directory name and copy them into the destination directory.
#!/bin/bash
for i in $(find src/ -name '*.html')
do
    file=${i#src/}                      # strip the leading src/ prefix
    mkdir -p "des/$(dirname "$file")"   # create the subdirectory if needed
    cp "$i" "des/$file"
done
Not sure if you must use bash constructs or not, but here is a GNU tar solution (if you use GNU tar), which IMHO is the best way to handle this situation because all the metadata for the files (permissions, etc.) are preserved:
find ./tmp/serve -name '*.html' -type f -print0 | tar --null -T - -c | tar -x -v -C ./src --strip-components=3
This finds all the .html files (-type f) in the ./tmp/serve directory and prints them nul-terminated (-print0). These filenames are sent via stdin to tar as nul-terminated literals (--null) for inclusion (-T -), creating (-c) an archive that is piped to a second tar instance, which extracts it (-x), printing the contents along the way (optional: -v), changing directory to the destination (-C ./src) before commencing, and stripping (--strip-components=3) the ./tmp/serve/ prefix from the files. (You could also cd ./tmp/serve beforehand, use find . instead, and change -C to ../../src.)
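Spelling out that last variant (a sketch; with find . the stored names are already relative, so --strip-components is no longer needed):
cd ./tmp/serve
find . -name '*.html' -type f -print0 | tar --null -T - -c | tar -x -v -C ../../src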

How to exclude a list of files and folders while using tar? [duplicate]

Is there a simple shell command/script that supports excluding certain files/folders from being archived?
I have a directory that need to be archived with a sub directory that has a number of very large files I do not need to backup.
Not quite solutions:
The tar --exclude=PATTERN command matches the given pattern and excludes those files, but I need specific files & folders to be ignored (full file path), otherwise valid files might be excluded.
I could also use the find command to create a list of files and exclude the ones I don't want to archive and pass the list to tar, but that only works for a small number of files. I have tens of thousands.
I'm beginning to think the only solution is to create a file with a list of files/folders to be excluded, then use rsync with --exclude-from=file to copy all the files to a tmp directory, and then use tar to archive that directory.
Can anybody think of a better/more efficient solution?
EDIT: Charles Ma's solution works well. The big gotcha is that the --exclude='./folder' MUST be at the beginning of the tar command. Full command (cd first, so backup is relative to that directory):
cd /folder_to_backup
tar --exclude='./folder' --exclude='./upload/folder2' -zcvf /backup/filename.tgz .
You can have multiple exclude options for tar so
$ tar --exclude='./folder' --exclude='./upload/folder2' -zcvf /backup/filename.tgz .
etc will work. Make sure to put --exclude before the source and destination items.
You can exclude directories with --exclude for tar.
If you want to archive everything except /usr you can use:
tar -zcvf /all.tgz / --exclude=/usr
In your case perhaps something like
tar -zcvf archive.tgz arc_dir --exclude=dir/ignore_this_dir
Possible options to exclude files/directories from backup using tar:
Exclude files using multiple patterns
tar -czf backup.tar.gz --exclude=PATTERN1 --exclude=PATTERN2 ... /path/to/backup
Exclude files using an exclude file filled with a list of patterns
tar -czf backup.tar.gz -X /path/to/exclude.txt /path/to/backup
Exclude files using tags by placing a tag file in any directory that should be skipped
tar -czf backup.tar.gz --exclude-tag-all=exclude.tag /path/to/backup
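For the tag approach, the tag file's contents don't matter; its presence is what marks a directory to skip. A quick sketch (cache/ is a hypothetical directory here):
touch /path/to/backup/cache/exclude.tag
tar -czf backup.tar.gz --exclude-tag-all=exclude.tag /path/to/backup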
Old question with many answers, but I found that none were quite clear enough for me, so I would like to add my attempt.
If you have the following structure
/home/ftp/mysite/
with following file/folders
/home/ftp/mysite/file1
/home/ftp/mysite/file2
/home/ftp/mysite/file3
/home/ftp/mysite/folder1
/home/ftp/mysite/folder2
/home/ftp/mysite/folder3
So, you want to make a tar file that contains everything inside /home/ftp/mysite (to move the site to a new server), but file3 is just junk, and everything in folder3 is also not needed, so we will skip those two.
We use the format
tar -czvf <name of tar file> <what to tar> <any excludes>
where c = create, z = gzip compression, and v = verbose (you can see the files as they are entered, useful to make sure none of the files you exclude are being added), and f = file.
So, my command would look like this:
cd /home/ftp/
tar -czvf mysite.tar.gz mysite --exclude='file3' --exclude='folder3'
Note that the excluded files/folders are relative to the root of your tar (I have tried a full path here relative to /, but I cannot make that work).
Hope this will help someone (and me next time I google it).
You can use standard "ant notation" to exclude directories with relative paths.
This works for me and excludes any .git or node_module directories:
tar -cvf myFile.tar --exclude=**/.git/* --exclude=**/node_modules/* -T /data/txt/myInputFile.txt 2> /data/txt/myTarLogFile.txt
myInputFile.txt contains:
/dev2/java
/dev2/javascript
This exclude pattern handles filename suffixes like png or mp3 as well as directory names like .git and node_modules (the shell's brace expansion turns the braces into one --exclude option per pattern):
tar --exclude={*.png,*.mp3,*.wav,.git,node_modules} -Jcf ${target_tarball} ${source_dirname}
I've experienced that, at least with the Cygwin version of tar I'm using ("CYGWIN_NT-5.1 1.7.17(0.262/5/3) 2012-10-19 14:39 i686 Cygwin" on a Windows XP Home Edition SP3 machine), the order of options is important.
While this construction worked for me:
tar cfvz target.tgz --exclude='<dir1>' --exclude='<dir2>' target_dir
that one didn't work:
tar cfvz --exclude='<dir1>' --exclude='<dir2>' target.tgz target_dir
This is odd, since tar --help reveals the following:
tar [OPTION...] [FILE]
So the second command should also work, but apparently that's not the case...
I found this somewhere else so I won't take credit, but it worked better than any of the solutions above for my mac specific issues (even though this is closed):
tar zc --exclude __MACOSX --exclude .DS_Store -f <archive> <source(s)>
After reading all these good answers for different versions and having solved the problem for myself, I think there are very small details that are very important, rare in general GNU/Linux use, that aren't stressed enough and deserve more than comments.
So I'm not going to try to answer the question for every case; instead, I'll try to note where to look when things don't work.
IT IS VERY IMPORTANT TO NOTICE:
THE ORDER OF THE OPTIONS MATTERS: it is not the same to put --exclude before as after the file and directories to back up. This is unexpected, at least to me, because in my experience with GNU/Linux commands, the order of options usually doesn't matter.
Different tar versions expect these options in different order: for instance, @Andrew's answer indicates that in GNU tar v1.26 and 1.28 the excludes come last, whereas in my case, with GNU tar 1.29, it's the other way around.
TRAILING SLASHES MATTER: at least in GNU tar 1.29, there shouldn't be any.
In my case, for GNU tar 1.29 on Debian stretch, the command that worked was
tar --exclude="/home/user/.config/chromium" --exclude="/home/user/.cache" -cf file.tar /dir1/ /home/ /dir3/
The quotes didn't matter, it worked with or without them.
I hope this will be useful to someone.
If you are trying to exclude Version Control System (VCS) files, tar already supports two interesting options about it! :)
Option : --exclude-vcs
This option excludes files and directories used by following version control systems: CVS, RCS, SCCS, SVN, Arch, Bazaar, Mercurial, and Darcs.
As of version 1.32, the following files are excluded:
CVS/, and everything under it
RCS/, and everything under it
SCCS/, and everything under it
.git/, and everything under it
.gitignore
.gitmodules
.gitattributes
.cvsignore
.svn/, and everything under it
.arch-ids/, and everything under it
{arch}/, and everything under it
=RELEASE-ID
=meta-update
=update
.bzr
.bzrignore
.bzrtags
.hg
.hgignore
.hgtags
_darcs
Option : --exclude-vcs-ignores
When archiving directories that are under some version control system (VCS), it is often convenient to read exclusion patterns from this VCS' ignore files (e.g. .cvsignore, .gitignore, etc.). This option provides that possibility.
Before archiving a directory, tar checks whether it contains any of the following files: .cvsignore, .gitignore, .bzrignore, or .hgignore. If so, it reads ignore patterns from these files.
The patterns are treated much as the corresponding VCS would treat them, i.e.:
.cvsignore
Contains shell-style globbing patterns that apply only to the directory where this file resides. No comments are allowed in the file. Empty lines are ignored.
.gitignore
Contains shell-style globbing patterns. Applies to the directory where the .gitignore file is located and all its subdirectories.
Any line beginning with a # is a comment. Backslash escapes the comment character.
.bzrignore
Contains shell globbing patterns and regular expressions (if prefixed with RE:). Patterns affect the directory and all its subdirectories.
Any line beginning with a # is a comment.
.hgignore
Contains POSIX regular expressions. The line "syntax: glob" switches to shell globbing patterns. The line "syntax: regexp" switches back. Comments begin with a #. Patterns affect the directory and all its subdirectories.
Example
tar -czv --exclude-vcs --exclude-vcs-ignores -f path/to/my-tar-file.tar.gz path/to/my/project/
I'd like to show another option I used to get the same result as the answers above provide. I had a similar case where I wanted to back up Android Studio projects all together in a tar file to upload to MediaFire. Using the du command to find the large files, I found that I didn't need some directories like:
build, linux and .dart_tool
Using the first answer by Charles_ma, I modified it a little bit to be able to run the command from the parent directory of my Android directory.
tar --exclude='*/build' --exclude='*/linux' --exclude='*/.dart_tool' -zcvf androidProjects.tar Android/
It worked like a charm.
For Mac OSX I had to do
tar -zcv --exclude='folder' -f theOutputTarFile.tar folderToTar
Note the -f after the --exclude=
For those who have issues with it, some versions of tar only work properly without the './' prefix in the exclude value.
$ tar --version
tar (GNU tar) 1.27.1
Command syntax that works:
tar -czvf ../allfiles-butsome.tar.gz * --exclude=acme/foo
These will not work:
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude=./acme/foo
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude='./acme/foo'
$ tar --exclude=./acme/foo -czvf ../allfiles-butsome.tar.gz *
$ tar --exclude='./acme/foo' -czvf ../allfiles-butsome.tar.gz *
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude=/full/path/acme/foo
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude='/full/path/acme/foo'
$ tar --exclude=/full/path/acme/foo -czvf ../allfiles-butsome.tar.gz *
$ tar --exclude='/full/path/acme/foo' -czvf ../allfiles-butsome.tar.gz *
(This makes sense once you notice that with * as the source, the member names are stored as acme/foo, with no ./ prefix and no absolute path, and exclude patterns are matched against the stored names.)
I agree the --exclude flag is the right approach.
$ tar --exclude='./folder_or_file' --exclude='file_pattern' --exclude='fileA'
A word of warning for a side effect that I did not find immediately obvious:
The exclusion of 'fileA' in this example will search for 'fileA' RECURSIVELY!
Example: A directory with a single subdirectory containing a file of the same name (data.txt)
data.txt
config.txt
--+dirA
  | data.txt
  | config.docx
If using --exclude='data.txt' the archive will not contain EITHER data.txt file. This can cause unexpected results if archiving third party libraries, such as a node_modules directory.
To avoid this issue make sure to give the entire path, like --exclude='./dirA/data.txt'
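A quick way to see the difference with the layout above (both commands are illustrative):
tar -czf /tmp/all.tar.gz --exclude='data.txt' .        # skips both data.txt files
tar -czf /tmp/all.tar.gz --exclude='./dirA/data.txt' . # skips only the nested one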
After reading this thread, I did a little testing on RHEL 5 and here are my results for tarring up the abc directory:
This will exclude the directories error and logs and all files under the directories:
tar cvpzf abc.tgz abc/ --exclude='abc/error' --exclude='abc/logs'
Adding a wildcard after the excluded directory will exclude the files but preserve the directories:
tar cvpzf abc.tgz abc/ --exclude='abc/error/*' --exclude='abc/logs/*'
To avoid possible 'xargs: Argument list too long' errors due to the use of find ... | xargs ... when processing tens of thousands of files, you can pipe the output of find directly to tar using find ... -print0 | tar --null ....
# archive a given directory, but exclude various files & directories
# specified by their full file paths
find "$(pwd -P)" -type d \( -path '/path/to/dir1' -or -path '/path/to/dir2' \) -prune \
-or -not \( -path '/path/to/file1' -or -path '/path/to/file2' \) -print0 |
gnutar --null --no-recursion -czf archive.tar.gz --files-from -
#bsdtar --null -n -czf archive.tar.gz -T -
You can also use one of the "--exclude-tag" options depending on your needs:
--exclude-tag=FILE
--exclude-tag-all=FILE
--exclude-tag-under=FILE
The variants differ in how much they keep: --exclude-tag=FILE skips a tagged directory's contents but still archives the directory entry and the tag file itself; --exclude-tag-under=FILE skips everything under the tagged directory, including the tag file; --exclude-tag-all=FILE omits the tagged directory entirely.
Use the find command in conjunction with the tar append (-r) option. This way you can add files to an existing tar in a single step, instead of a two pass solution (create list of files, create tar).
find /dir/dir -prune ... -o etc etc.... -exec tar rvf ~/tarfile.tar {} \;
You can use cpio(1) to create tar files. cpio takes the files to archive on stdin, so if you've already figured out the find command you want to use to select the files the archive, pipe it into cpio to create the tar file:
find ... | cpio -o -H ustar | gzip -c > archive.tar.gz
With GNU tar v1.26, the --exclude needs to come after the archive file and backup directory arguments, should have no leading or trailing slashes, and prefers no quotes (single or double). So, relative to the PARENT directory to be backed up, it's:
tar cvfz /path_to/mytar.tgz ./dir_to_backup --exclude=some_path/to_exclude
tar -cvzf archive.tar.gz source_folder -X /home/folder/excludes.txt
-X indicates a file which contains a list of filenames which must be excluded from the backup. For Instance, you can specify *~ in this file to not include any filenames ending with ~ in the backup.
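For example, excludes.txt might look like this (one pattern per line; these entries are purely illustrative):
*~
node_modules
./cache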
Success cases:
1) If you give a full path for the backup, the excludes must also use full paths:
tar -zcvf /opt/ABC/BKP_27032020/backup_27032020.tar.gz --exclude='/opt/ABC/csv/' --exclude='/opt/ABC/log/' /opt/ABC
2) If you give a relative path for the backup, the excludes must also use relative paths:
tar -zcvf backup_27032020.tar.gz --exclude='ABC/csv/' --exclude='ABC/log/' ABC
Failure case:
If you give a relative path for the backup directory but full paths to ignore, it won't work:
tar -zcvf /opt/ABC/BKP_27032020/backup_27032020.tar.gz --exclude='/opt/ABC/csv/' --exclude='/opt/ABC/log/' ABC
Note: mentioning the excludes before or after the backup directory is fine.
It seems to be impossible to exclude directories with absolute paths.
As soon as ANY of the paths is absolute (source and/or exclude), the exclude will not work. That's my experience after trying all possible combinations.
Check it out
tar cvpzf zip_folder.tgz . --exclude=./public --exclude=./tmp --exclude=./log --exclude=fileName
I want to have a fresh front-end version (the angular folder) on localhost. Also, the git folder is huge in my case, and I want to exclude it. I need to download the archive from the server and unpack it in order to run the application.
Compress the angular folder from /var/lib/tomcat7/webapps into /tmp with the name angular.23.12.19.tar.gz.
Command:
tar --exclude='.git' -zcvf /tmp/angular.23.12.19.tar.gz /var/lib/tomcat7/webapps/angular/
Your best bet is to use find with tar, via xargs (to handle the large number of arguments). For example:
find / -print0 | xargs -0 tar cjf tarfile.tar.bz2
(One caveat: if the file list is long enough that xargs splits it across several tar invocations, each invocation overwrites the archive written by the previous one; piping find -print0 into tar --null -T -, as shown in an earlier answer, avoids that.)
Possibly redundant answer, but since I found it useful, here it is:
While root on FreeBSD (i.e. using csh) I wanted to copy my whole root filesystem to /mnt, but without /usr and (obviously) /mnt. This is what worked (I am at /):
tar --exclude ./usr --exclude ./mnt --create --file - . | (cd /mnt && tar xvf -)
My whole point is that it was necessary (by putting the ./) to specify to tar that the excluded directories were part of the greater directory being copied.
My €0.02
I had no luck getting tar to exclude a 5 Gigabyte subdirectory a few levels deep. In the end, I just used the Unix zip command, which was a lot easier for me.
So for this particular example from the original post
(tar --exclude='./folder' --exclude='./upload/folder2' -zcvf /backup/filename.tgz . )
The equivalent would be:
zip -r /backup/filename.zip . -x upload/folder/**\* upload/folder2/**\*
(NOTE: Here is the post I originally used that helped me https://superuser.com/questions/312301/unix-zip-directory-but-excluded-specific-subdirectories-and-everything-within-t)
The following bash script should do the trick. It uses the answer given here by Marcus Sundman.
#!/bin/bash
echo -n "Please enter the name of the tar file you wish to create (without extension): "
read nam
echo -n "Please enter the path to the directories to tar: "
read pathin
excludes=`find $pathin -iname "*.CC" -exec echo "--exclude \'{}\'" \;|xargs`
echo tar -czvf $nam.tar.gz $excludes $pathin
This will print out the command you need and you can just copy and paste it back in. There is probably a more elegant way to provide it directly to the command line.
Just change *.CC for any other common extension, file name or regex you want to exclude and this should still work.
EDIT
Just to add a little explanation: find generates a list of files matching the chosen pattern (in this case *.CC). This list is passed via xargs to the echo command, which prints --exclude 'one entry from the list'. The backslashes (\) are escape characters for the ' marks.
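As for a more elegant way: you can feed the excludes to tar directly instead of copy-pasting. A sketch, assuming GNU find and filenames without embedded newlines:
mapfile -t excludes < <(find "$pathin" -iname '*.CC' -printf '--exclude=%p\n')
tar -czvf "$nam.tar.gz" "${excludes[@]}" "$pathin"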

Loop through and unzip directories and then unzip items in subdirectories

I have a folder designed in the following way:
-parentDirectory
---folder1.zip
----item1
-----item1.zip
-----item2.zip
-----item3.zip
---folder2.zip
----item1
-----item1.zip
-----item2.zip
-----item3.zip
---folder3.zip
----item1
-----item1.zip
-----item2.zip
-----item3.zip
I would like to write a bash script that will loop through and unzip the folders and then go into each subdirectory of those folders and unzip the files and name those files a certain way.
I have tried the following
cd parentDirectory
find ./ -name \*.zip -exec unzip {} \;
count=1
for fname in *
do
unzip
mv $fname $attempt{count}.cpp
count=$(($count + 1))
done
I thought the first two lines would go into the parentDirectory folder and unzip all zips in that folder and then the for loop would handle the unzipping and renaming. But instead, it unzipped everything it could and placed it in the parentDirectory. I would like to maintain the same directory structure I have.
Any help would be appreciated
excerpt from man unzip
[-d exdir]
An optional directory to which to extract files. By default, all files and subdirectories are recreated in the current directory; the -d option allows extraction in an arbitrary directory (always assuming one has permission to write to the directory).
It's doing exactly what you told it, and what would happen if you had done the same on the command line. Just tell it where to extract, since you want it to extract there.
see Ubuntu bash script: how to split path by last slash? for an example of splitting the path out of fname.
putting it all together, your command executed in the parentDirectory is
find ./ -name \*.zip -exec unzip {} \;
But you want unzip to extract to the directory where it found the file. I was going to just use backticks on dirname {} but I can't get it to work right, as it either executes on the "{}" literal before find, or never executes.
The easiest workaround was to write my own script for unzip which does it in place.
> cat unzip_in_place
unzip "$1" -d "$(dirname "$1")"
> find . -name "*.zip" -exec ./unzip_in_place {} \;
You could probably alias unzip to do that automatically, but that is unwise in case you ever use other tools that expect unzip to work as documented.
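If you'd rather not keep a helper script around, find can spawn a shell per archive to the same effect; a sketch of the same idea:
find . -name '*.zip' -exec sh -c 'unzip -o "$1" -d "$(dirname "$1")"' _ {} \;
(The -o flag, assumed here, overwrites existing files without prompting; drop it if that's not what you want.)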

Archive old files to different files

In the past I have used the following command to archive old files into one file:
find . -mtime -1 | xargs tar -cvzf archive.tar
Now suppose we have 20 directories. I need to make a script that goes into each directory and archives every file into a separate archive that has the same name as the original file.
So suppose I have a directory named /Home/basic/
and this directory has the following files:
first_file.txt
second_file.txt
third_file.txt
Now after I am done running the script I need output as follows:
first_file_05112014.tar
second_file_05112014.tar
third_file_05112014.tar
Use:
find . -type f -mtime -1 | xargs -I file tar -cvzf file.tar.gz file
I added .gz to indicate that it's gzipped as well.
From man xargs:
-I replace-str
Replace occurrences of replace-str in the initial-arguments with names
read from standard input. Also, unquoted blanks do not terminate input
items; instead the separator is the newline character. Implies -x and -L 1.
The find command will produce a list of filepaths. -L 1 means that each whole line will serve as input to the command.
-I file will assign the filepath to file and then each occurrence of file in the tar command line will be replaced by its value, that is, the filepath.
So, for ex, if find produces a filepath ./somedir/abc.txt, the corresponding tar command will look like:
tar -czvf ./somedir/abc.txt.tar.gz ./somedir/abc.txt
which is what is desired. And this will happen for each filepath.
What about this shell script?
#!/bin/sh
mkdir /tmp/junk # easy for me to clean up!
for p in `find . -mtime -1 -type f`
do
    dir=`dirname "$p"`
    file=`basename "$p"`
    tar cvf "/tmp/junk/${file}.tar" "$p"
done
It uses the basename command to extract the name of the file and the dirname command to extract the name of the directory. I don't actually use the directory but I left it in there in case you might find it handy.
I put all the tar files in one place so I could delete them easily, but you could easily substitute "$dir" for /tmp/junk if you wanted each archive in the same directory as the original file.
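Note that neither script adds the _05112014 date suffix from the desired output. A minimal tweak to the tar line above would be (assuming a DDMMYYYY stamp and stripping the original extension, both my reading of the example):
stamp=`date +%d%m%Y`
tar cvf "/tmp/junk/${file%.*}_${stamp}.tar" "$p"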

Unix tar: do not preserve full pathnames

When I try to compress files and directories with tar using absolute paths, the absolute path is preserved in the resulting compressed file. I need to use absolute paths to tell tar where the folder I wish to compress is located, but I only want it to compress that folder – not the whole path.
For example, tar -cvzf test.tar.gz /home/path/test – where I want to compress the folder test. However, what I actually end up compressing is /home/path/test. Is there anything that can be done to avoid this? I have tried playing with the -C operand to no avail.
This is ugly... but it works...
I had this same problem, but with multiple folders; I just wanted to flatten all the files out. You can use the option --transform to pass a sed expression, and... it works as expected.
This is the expression:
's/.*\///g' (deletes everything up to and including the last '/')
This is the final command:
tar --transform 's/.*\///g' -zcvf tarballName.tgz */*/*.info
Use -C to specify the directory from which the files look like you want, and then specify the files as seen from that directory:
tar -cvzf test.tar.gz -C /home/path test
multi-directory example
tar cvzf my.tar.gz example.zip -C dir1 files_under_dir1 -C dir2 files_under_dir2
The files under dir1 and dir2 are stored without their directory paths.
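You can confirm that the prefixes are gone by listing the resulting archive:
tar -tzf my.tar.gz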
tar can perform transformations on the filenames on the way in and out of the archive. Consider this example that stores a bunch of files in a flat tarfile:
in the root ~/ directory
find logger -name \*.sh | tar -cvf test.tar -T - --xform='s|^.*/||' --show-transformed
The -T - option tells tar to read a list of files from stdin; the --xform='s|^.*/||' applies the sed expression to all the filenames after they are read and before they are stored. --show-transformed is just a nicety to show you the file names after they are transformed; the default is to show the names as they are read.
There are no doubt other ways besides using find to specify files to archive. For instance, if you have globstar set in bash (shopt -s globstar), you can use ** patterns to wildcard any number of directories, shortening the previous to this:
tar -cvf test.tar --xform='s|^.*/||' --show-transformed logger/**/*.sh
You’ll have to judge what is best for your situation given the files you’re after.
find -type f | tar --transform 's/.*\///g' -zcvf comp.tar.gz -T -
Where find -type f finds all the files in the directory tree and using tar with --transform compresses them without including the folder structure. This is very useful if you want to compress only the files that are the result of a certain search or the files of a specific type like:
find -type f -name "*.txt" | tar --transform 's/.*\///g' -zcvf comp.tar.gz -T -
Unlike the other answers, you don't have to include */*/* specifying the depth of the directory. find handles that for you.
