How can I untar all tar files in one command using PuTTY?
I tried the following, but it's not untarring (all the files start with alcatelS*):
tar -xfv alcatelS*.tar
It is not working: I don't get any errors, and it is not untarring.
Thank you,
-xfv is wrong, since v is then treated as the archive filename instead. Also, tar can't accept multiple archives to extract at once. Perhaps -M could be used, but it was a little stubborn when I tried it, and it would be difficult to pass it multiple arguments produced by pathname expansion, i.e. you have to write tar -xvM -f file1.tar -f file2.tar.
Do this instead:
for F in alcatelS*.tar; do
    tar -xvf "$F"
done
Or on one line: (EDIT: sorry, that -is- a "one"-liner, but I find it is not technically a real one-liner, just a condensed one, so I shouldn't have referred to it as a one-liner. Avoid the wrong convention.)
for F in alcatelS*.tar; do tar -xvf "$F"; done
You can use the following command to extract all tar.gz files in a directory on Unix:
find . -name 'alcatelS*.tar.gz' -exec tar -xvf {} \;
The following is my favorite way to untar multiple tar files:
ls *tar.gz | xargs -n1 tar xvf
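If the archive names can contain spaces or other special characters, parsing ls output breaks down. A safer null-delimited sketch, assuming GNU xargs for the -0 option:
printf '%s\0' *.tar.gz | xargs -0 -n1 tar -xvf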
It can be done in one line (the -i flag tells tar to ignore the zeroed end-of-archive blocks between the concatenated archives):
cat *.tar | tar -xvf - -i
I need to do some simple compression of multiple files into a .tar in bash.
Conditions:
My archive must include only files with extension .exe.
Unfortunately, when I try this:
find ./myDir -name "*.exe" | tar -cf archive -T -
the file names are stored changed, like ./file1.exe. How can I compress this without changing the file names?
One suggestion:
find "$PWD/myDir" -name "*.exe" | tar -cf archive -T -
This will not do anything interesting, though.
A better suggestion is to feed the tar command the results of the find command:
tar czf your_compressed_file.tar.gz $(find ./myDir -name "*.exe")
Notice that tar cf creates an uncompressed archive, while tar czf creates a compressed one.
You can also add the -v option to see which files are being compressed:
tar czvf your_compressed_file.tar.gz $(find ./myDir -name "*.exe")
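Note that the $(...) expansion splits on whitespace, so this breaks if any .exe path contains a space. A null-delimited sketch, assuming GNU find and GNU tar, avoids that:
find ./myDir -name "*.exe" -print0 | tar --null -czvf your_compressed_file.tar.gz -T -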
I have a directory with more than 1,000,000 .json files, and I used the following command to build j.tar.gz from only the json files (without including the /Library/WebServer/a/a/e/j/ path):
cd /Library/WebServer/a/a/e/j && tar -zcvf j.tar.gz *.json
This error happened: ...Argument list too long. Can you suggest a better command to accomplish this task? Thanks.
An initial caveat: tar is not a standards-defined tool (the POSIX archiver is pax), so its behavior can vary between platforms without any minimal guaranteed baseline. Your mileage may vary.
Since this is flagged for bash, you can use <() -- a process substitution -- to generate a filename which, when read, will emit a subprocess's output without the need for a temporary file. (This will typically be implemented as either a /dev/fd name if your operating system supports them, or a named pipe otherwise).
If you only want the cd to apply to the tar command, you can do that as follows, putting it in a subshell and using exec to have the subshell replace itself with the tar command, avoiding the fork penalty that a subshell otherwise creates:
dir=/Library/WebServer/a/a/e/j
(cd "$dir" && exec tar --null -zcvf j.tar.gz -T <(printf '%s\0' *.json) )
Alternately, if your tar supports it, you can use --include to tell tar itself to filter the names:
tar -C "$dir" --include='*.json' -cvzf "$dir/j.tar.gz" .
Points of note:
printf '%s\0' *.json is immune from this because printf is a shell builtin; the glob results aren't placed in an execv-family syscall's arguments, so ARG_MAX doesn't apply (see the comparison below).
Using --null on tar and '%s\0' on printf (or -print0 if you were generating your list of names with find) prevents a maliciously-generated name with a literal newline from being able to inject arbitrary names into your stream. Think about what happens if someone runs mkdir -p $'hello/\n/etc/passwd\n.json' -- you don't want /etc/passwd going into your tarball.
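To illustrate the first point, compare an external command with a builtin (a sketch; whether the first line actually fails depends on your ARG_MAX and how many files match):
/bin/echo *.json      # external command: the expanded glob goes into execve(), so ARG_MAX applies
printf '%s\n' *.json  # shell builtin: no execve() involved, so ARG_MAX does not apply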
Try:
find . -type f -name "*.json" > ./include_file && tar -zcvf j.tar.gz --files-from ./include_file
NOTE: This was tested successfully on CentOS/RedHat 6.7.
There is a limit set by your system. You can check it with:
$ getconf ARG_MAX
Mine returns:
131072
Alternatively, you can create a file list for tar and use the -T, --files-from=FILE option to read names from it, instead of globbing, which is what hits the max-args limit.
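For example, a minimal sketch (filelist.txt is just an illustrative name):
find . -maxdepth 1 -name '*.json' > filelist.txt
tar -czvf j.tar.gz -T filelist.txt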
How about something like:
> cd /Library/WebServer/a/a/e/j
> find . -maxdepth 1 -name '*.json' | xargs tar -czvf j.tar.gz --add-file
It does not require a temporary file and does not expand *.json in the shell, which would fail.
Checked on Ubuntu; I haven't got a Mac at hand.
I need to secure copy (scp) a directory, with its substructure preserved, from the UNIX command line. The subdirectories have identically named files that I WANT, and a bunch of other stuff that I don't. Here is what the structure looks like:
directorytocopy
    subdir1
        1.wanted
        2.wanted
        ...
        1.unwanted
        2.notwanted
    subdir2
        1.wanted
        2.wanted
        ...
        1.unwanted
        2.notwanted
    ..
I just want the .wanted files, preserving the directory structure. I realize that it is possible to write a shell script (I am using bash) to do this. Is it possible to do this in a less brute-force way? I cannot copy the whole thing and then delete the unwanted files, because I do not have enough space.
Adrian has the best idea: use rsync. You can also use tar to bundle the wanted files:
cd directorytocopy
shopt -s nullglob globstar
tar -cf - **/*.wanted | ssh destination 'cd dirToPaste && tar -xvf -'
Here, tar's -f option is used with the filename - so that the archive is written to stdout on one side and read from stdin on the other.
This is untested, and may fail because the archive may not contain the actual subdirectories that hold the "wanted" files.
Assuming GNU tar on the source machine, and assuming that filenames of the wanted files won't contain newlines and they are short enough to fit the tar headers:
find /some/directory -type f -name '*.wanted' | \
tar cf - --files-from - | \
ssh user@host 'cd /some/other/dir && tar xvpf -'
rsync with an --exclude/--include list, following @Adrian Frühwirth's suggestion, would be a way to do this.
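As a hedged sketch, that rsync invocation could look something like this (the destination host and path are placeholders):
rsync -av --prune-empty-dirs --include='*/' --include='*.wanted' --exclude='*' directorytocopy/ user@host:/dest/dir/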
I have a folder I am turning into a tar file for transfer over the network. It looks like this:
/foo/bar/file1
/foo/file2
/foo/baz/bin/file3
I want to extract it with no directory information, so that the extracted contents look like this:
file1
file2
file3
How does one accomplish this?
The GNU tar manual elaborates on --transform; the expression 's,.*/,,' is a sed-style substitution that strips everything up to and including the last slash:
tar --show-transformed-names --transform 's,.*/,,' -tvf myarchive.tar
tar --transform 's,.*/,,' -xf myarchive.tar
What about the Unix philosophy of stringing together tools?
Use tar to extract as usual, then
find foo -type f | xargs -I '{}' mv '{}' .
rm -rf foo # optional, nuke the empty dirs.
or variations thereof with find ... -print0 and xargs -0, as sketched below.
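For instance, the null-delimited form could look like this (assuming GNU find and xargs):
find foo -type f -print0 | xargs -0 -I '{}' mv '{}' .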
I'm extracting a tar file within a shell script, and it is creating a folder with a name like:
supportsuite-5-20-3/
The "supportsuite" part of the name is always the same, but I don't know what the remaining characters are in advance. I'm unpacking the tar file in a script and I need the name of the resulting folder from within that script. What is the best way to put that directory name into a shell variable within the script after the tar has been extracted?
I'm script-challenged, so thanks in advance for any help you can provide.
If you're assuming that all of the files in the tar file are in the same directory, you could do something like:
DIRNAME=$(tar -tvf "$TARFILE" | head -1 | sed -e 's:^.* \([^/]*\)/.*$:\1:')
Note that it will probably misbehave if your filenames have spaces in them. If you care about this you'll need to adjust the regex in the sed command.
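A variant that avoids parsing the verbose listing entirely, assuming every member sits under a single top-level directory:
DIRNAME=$(tar -tf "$TARFILE" | head -1 | cut -d/ -f1)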
This will extract it and keep the folder name intact:
find . -type f -name "*.tar" -exec tar -xvf '{}' \;
It's a little bit of a hack, but this might help (written for bash):
tar zxf supportsuite-5-20-3.tgz
NEWDIR=$(cd supportsuite-* && pwd)
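If you want just the directory name rather than the absolute path, wrapping that in basename is one option:
NEWDIR=$(basename "$(cd supportsuite-* && pwd)")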