How to read a directory name within a bash script - shell

I'm extracting a tar file within a shell script and it is creating a folder with a name like:
supportsuite-5-20-3/
The "supportsuite" part of the name is always the same, but I don't know what the remaining characters are in advance. I'm unpacking the tar file in a script and I need the name of the resulting folder from within that script. What is the best way to put that directory name into a shell variable within the script after the tar has been extracted?
I'm script challenged, so thanks in advance for any help you can provide.

If you're assuming that all of the files in the tar file are in the same directory, you could do something like:
DIRNAME=$(tar -tvf "$TARFILE" | head -1 | sed -e 's:^.* \([^/]*\)/.*$:\1:')
Note that it will probably misbehave if your filenames have spaces in them. If you care about this you'll need to adjust the regex in the sed command.
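A variant that skips the verbose listing entirely, in case it helps (a sketch assuming GNU tar, a gzip-compressed archive, and that the first listed entry sits inside the top-level directory):
# -t lists member names only, so the top-level folder is everything
# before the first slash of the first entry
DIRNAME=$(tar -tzf "$TARFILE" | head -1 | cut -d/ -f1)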

This will extract it and keep the folder name intact.
find . -type f -name "*.tar" -exec tar -xvf '{}' \;

It's a little bit of a hack, but this might help (written for bash):
tar zxf supportsuite-5-20-3.tgz
NEWDIR=$(cd supportsuite-* && pwd)
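If you want just the directory name rather than its absolute path, the glob can be captured directly (a sketch, assuming exactly one supportsuite-* directory exists after extraction):
tar zxf supportsuite-5-20-3.tgz
dirs=( supportsuite-*/ )    # bash array holding the single match
NEWDIR=${dirs[0]%/}         # strip the trailing slash, e.g. supportsuite-5-20-3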

Related

Loop through and unzip directories and then unzip items in subdirectories

I have a folder designed in the following way:
-parentDirectory
---folder1.zip
----item1
-----item1.zip
-----item2.zip
-----item3.zip
---folder2.zip
----item1
-----item1.zip
-----item2.zip
-----item3.zip
---folder3.zip
----item1
-----item1.zip
-----item2.zip
-----item3.zip
I would like to write a bash script that will loop through and unzip the folders and then go into each subdirectory of those folders and unzip the files and name those files a certain way.
I have tried the following
cd parentDirectory
find ./ -name \*.zip -exec unzip {} \;
count=1
for fname in *
do
unzip
mv $fname $attempt{count}.cpp
count=$(($count + 1))
done
I thought the first two lines would go into the parentDirectory folder and unzip all zips in that folder and then the for loop would handle the unzipping and renaming. But instead, it unzipped everything it could and placed it in the parentDirectory. I would like to maintain the same directory structure I have.
Any help would be appreciated
excerpt from man unzip
[-d exdir]
An optional directory to which to extract files. By default, all files and subdirectories are recreated in the current directory; the -d option allows extraction in an arbitrary directory (always assuming one has permission to write to the directory).
It's doing exactly what you told it, and what would happen if you had done the same on the command line. Just tell it where to extract, since you want it to extract there.
See Ubuntu bash script: how to split path by last slash? for an example of splitting the path out of fname.
Putting it all together, your command executed in the parentDirectory is
find ./ -name \*.zip -exec unzip {} \;
But you want unzip to extract to the directory where it found the file. I was going to just use backticks on dirname {} but I can't get it to work right, as it either executes on the "{}" literal before find, or never executes.
The easiest workaround was to write my own script for unzip which does it in place.
> cat unzip_in_place
unzip "$1" -d "$(dirname "$1")"
> find . -name "*.zip" -exec ./unzip_in_place {} \;
You could probably alias unzip to do that automatically, but that is unwise in case you ever use other tools that expect unzip to work as documented.
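If you would rather not keep a helper script around, the same thing can be done inline by letting find start a small shell for each match (a sketch; where available, find's -execdir is another option, since it runs the command from the directory containing each match):
find . -name "*.zip" -exec sh -c 'unzip "$1" -d "$(dirname "$1")"' _ {} \;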

Bash Extract tar.gz file

I have my tar file under:
/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz
With the tar command:
tar -tf "/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz"
The command shows me:
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/exit_codes/
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/exit_codes/code_FUNC
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/exit_codes/code_SCRI
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/login/
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/login/check_appprivilege.php
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/login/check_login.php
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/login/privilege.php
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/scripte/
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/scripte/Lysto BackUp/
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/scripte/Lysto BackUp/sys
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/webapp/scripte/Lysto BackUp/sys_func
/volume1/02_public/3rd_Party_Apps/SPK_SCRIPTS/SynoDSApps/SSH_ERROR
My plan, or rather my wish, is to handle it like this:
IFS=$'\n'
for PATHS in $(tar -tPf "/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz")
do
SED=$(echo "$PATHS" | sed 's/.*\///')
if [[ -n "$SED" ]]
then
tar -C "${target_archiv}" -xvf "/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz" "$PATHS"
#echo JA
echo "$PATHS"
fi
done
unset IFS
I only want one file from the tar, and I want to store it in a different directory...
but this command with -C doesn't work... it extracts all the files from the tar...
My question is: is it possible to extract only one file from the tar without cd'ing to the directory?
Another question: is it possible to extract only the files from the tar, without the folder structure? That would maybe be the better way, but I don't know how...
And no, I cannot create the tar without the paths; I need them...
so that is no option for me...
I hope for help here :)
If your ultimate goal is to extract files without the full path, you can use a SED-like expression to rename the files while they are extracted, using the --xform option:
tar -C "${target_archiv}" -xvf "/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz" --xform='s,^.*/,,'
The 's,^.*/,,' expression asks to substitute (s): starting from the beginning of the member name (^), match everything (.*) up to the last slash (/), and replace it with nothing. In other words, it removes the directory structure from the filenames.
If you want to get rid of the empty folders that have been extracted, you may call this command after extracting:
find "${target_archiv}" -mindepth 1 -maxdepth 1 -type d -exec rmdir {} \;
Keep in mind it will remove all the (empty) subfolders of "${target_archiv}", even the ones that were already here before extracting the tarball. However, because rmdir will not remove directories that contain files, it will be mostly harmless to the subdirectories you had.
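As for the other part of the question: yes, a single member can be pulled out of the archive without cd'ing anywhere, by naming it on the tar command line; combined with --xform it lands flat in the target directory. A sketch assuming GNU tar, using the SSH_ERROR entry from the listing above as the example member (--wildcards with --wildcards-match-slash avoids having to type the full stored path):
tar -C "${target_archiv}" \
    -xvf "/volume1/#appstore/SynoDSApps/archiv/DE/2018_08_18__Lysto BackUp.tar.gz" \
    --xform='s,^.*/,,' \
    --wildcards --wildcards-match-slash '*/SSH_ERROR'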

Terminal command to create a .tar.gz files from 1,000,000 .json files (without including any directory)

I have a directory with plus 1,000,000 .json files and used the following command to build a j.tar.gz only from json files (without including the /Library/WebServer/a/a/e/j/ path):
cd /Library/WebServer/a/a/e/j && tar -zcvf j.tar.gz *.json
This error happened: ...Argument list too long. Would you suggest a better command to accomplish this task? Thanks.
An initial caveat: tar is not a standards-defined tool (the POSIX archiver is pax), so its behavior can vary between platforms without any minimal guaranteed baseline. Your mileage may vary.
Since this is flagged for bash, you can use <() -- a process substitution -- to generate a filename which, when read, will emit a subprocess's output without the need for a temporary file. (This will typically be implemented as either a /dev/fd name if your operating system supports them, or a named pipe otherwise).
If you only want the cd to apply to the tar command, you can do that as follows, putting it in a subshell and using exec to have the subshell replace itself with the tar command, avoiding the fork penalty that a subshell otherwise creates:
dir=/Library/WebServer/a/a/e/j
(cd "$dir" && exec tar --null -zcvf j.tar.gz -T <(printf '%s\0' *.json) )
Alternately, if your tar supports it, you can use --include to tell tar itself to filter the names:
tar -C "$dir" --include='*.json' -cvzf "$dir/j.tar.gz" .
Points of note:
printf '%s\n' *.json is immune from this because printf is a shell builtin; thus, the glob results aren't put in an execv-family syscall's arguments, so ARG_MAX doesn't apply.
Using --null on tar's -T and '%s\0' on printf (or -print0 if you were generating your list of names with find) prevents a maliciously-generated name with a literal newline from being able to inject arbitrary names into your stream. Think about what happens if someone runs mkdir -p $'hello/\n/etc/passwd\n.json' -- you don't want /etc/passwd going into your tarball.
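The same idea also works as a plain pipeline, without process substitution, by streaming NUL-delimited names from find straight into tar (a sketch assuming GNU find and GNU tar):
cd /Library/WebServer/a/a/e/j &&
  find . -maxdepth 1 -type f -name '*.json' -print0 |
  tar --null -czvf j.tar.gz -T -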
Try:
find . -type f -name "*.json" > ./include_file && tar -zcvf j.tar.gz --files-from ./include_file
NOTE: This was tested successfully on CentOS/RedHat 6.7.
There is a limit set by your system. You can check
$ getconf ARG_MAX
mine returns
131072
Alternatively, you can create a file list for tar and use the -T / --files-from FILE option to read the names from that file instead of globbing, which is what hits the max-args limit.
How about something like:
> cd /Library/WebServer/a/a/e/j
> find . -maxdepth 1 -name '*.json' | xargs tar -czvf j.tar.gz --add-file
It does not require a temporary file and does not expand *.json in the shell, which would fail.
Checked on Ubuntu; I haven't got a Mac at hand.

How to untar all tar files in current directory using Putty

How can I untar all tar files in one command using Putty?
I tried the following, but it's not un-tarring (all files start with alcatelS*):
tar -xfv alcatelS*.tar
It is not working; I get no errors and it is not un-tarring.
Thank you,
-xfv is wrong, since whatever follows f is treated as the archive file, so here v is taken as the filename. Also, tar can't accept multiple archives to extract at once. Perhaps -M could be used, but it was a little stubborn when I tried it, and it would be awkward to pass multiple arguments coming from pathname expansion, i.e. you would have to do tar -xvM -f file1.tar -f file2.tar.
Do this instead:
for F in alcatelS*.tar; do
tar -xvf "$F"
done
Or condensed onto one line (EDIT: strictly speaking this isn't a real one-liner, just the same loop written compactly, so I shouldn't have called it one):
for F in alcatelS*.tar; do tar -xvf "$F"; done
You can use the following command to extract all tar.gz files in a directory on Unix:
find . -name 'alcatelS*.tar.gz' -exec tar -xvf {} \;
Following is my favorite way to untar multiple tar files:
ls *tar.gz | xargs -n1 tar xvf
Can be done in one line:
cat *.tar | tar -xvf - -i
The -i (--ignore-zeros) flag tells tar to read past the end-of-archive blocks that sit between the concatenated archives.

Archive old files to different files

In the past I have used the following command to archive old files into one file:
find . -mtime -1 | xargs tar -cvzf archive.tar
Now suppose we have 20 directories. I need to make a script that goes into each directory and archives every file into a separate archive with the same name as the original file.
So suppose I have a directory named /Home/basic/ containing the following files:
first_file.txt
second_file.txt
third_file.txt
Now after I am done running the script I need output as follows:
first_file_05112014.tar
second_file_05112014.tar
third_file_05112014.tar
Use:
find . -type f -mtime -1 | xargs -I file tar -cvzf file.tar.gz file
I added .gz to indicate it's gzipped as well.
From man xargs:
-I replace-str
Replace occurrences of replace-str in the initial-arguments with names
read from standard input. Also, unquoted blanks do not terminate input
items; instead the separator is the newline character. Implies -x and -L 1.
The find command will produce a list of filepaths. -L 1 means that each whole line will serve as input to the command.
-I file will assign the filepath to file and then each occurrence of file in the tar command line will be replaced by its value, that is, the filepath.
So, for ex, if find produces a filepath ./somedir/abc.txt, the corresponding tar command will look like:
tar -czvf ./somedir/abc.txt.tar.gz ./somedir/abc.txt
which is what is desired. And this will happen for each filepath.
What about this shell script?
#!/bin/sh
mkdir /tmp/junk #Easy for me to clean up!
find . -mtime -1 -type f | while read -r p
do
dir=`dirname "$p"`
file=`basename "$p"`
tar cvf "/tmp/junk/${file}.tar" "$p"
done
It uses the basename command to extract the name of the file and the dirname command to extract the name of the directory. I don't actually use the directory but I left it in there in case you might find it handy.
I put all the tar files in one place so I could delete them easily, but you could easily substitute $p for $file if you wanted each archive in the same directory as its original file.
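Neither answer adds the date stamp shown in the desired output names; if that matters, a hypothetical variant could build the name explicitly (a sketch assuming GNU find; the %d%m%Y format is guessed from the example names):
stamp=$(date +%d%m%Y)                     # e.g. 05112014
find . -type f -mtime -1 ! -name '*.tar' -exec sh -c '
  f=$1; stamp=$2
  tar -cvf "${f%.*}_${stamp}.tar" "$f"    # first_file.txt -> first_file_05112014.tar
' _ {} "$stamp" \;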
