I have created some .tar archives that for the most part contain text files. Is it possible to programmatically edit a text file from the command line (bash and related tools) without fully unpacking the tar?
Context: these .tars were created by a script and I realized I made a mistake. I'm looking for the most efficient and simple solution to edit a part of a single line of a text file.
You could use tar --delete and tar --update to replace a file inside a tar.
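For example, a minimal sketch of that approach, assuming GNU tar and an uncompressed archive; archive.tar, dir1/test_txt and the sed expression are placeholders:
# sketch only: assumes GNU tar; archive, member and sed expression are placeholders
tar -xf archive.tar dir1/test_txt           # extract just the one member
sed -i 's/old/new/' dir1/test_txt           # edit it in place
tar --delete -f archive.tar dir1/test_txt   # remove the old copy from the archive
tar --update -f archive.tar dir1/test_txt   # add the edited copy back
Note that --delete only works on archives stored in regular (seekable) files, not on tapes, and the archive must not be compressed.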
I think that you won't be able to do it without decompressing. You could do a loop that decompresses only your text file, edits it with sed, and compresses it again. Otherwise, I think it is impossible...
Maybe something like this:
xzcat blah | sed 's/../../' | xz > blah.new
within a loop that runs through all the files.
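A rough sketch of that loop idea, assuming xz-compressed text files in the current directory; the glob and the sed expression are placeholders:
# sketch only: placeholder glob and sed expression, assumes xz-compressed files
for f in *.xz; do
    xzcat "$f" | sed 's/old/new/' | xz > "$f.tmp" && mv "$f.tmp" "$f"
done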
Thanks for the inspiration Grammy, here is a minimal example in case someone comes looking for the same.
Create some data in a folder:
cd /tmp;
mkdir dir1;
echo "foo" >> ./dir1/test_txt;
echo "bar" >> ./dir1/test_txt;
Pack in a tar:
tar -cf test_tar.tar dir1;
Delete the original directory; we will extract only the file we need from the tar and edit it.
rm -rf dir1;
And unpack only the relevant file from the tar archive and edit it:
WARNING:
tar only updates the archived file if its timestamp is newer than the copy already in the archive!
This is why there is a sleep command here (to change the timestamp).
sleep 1
tar -xf test_tar.tar dir1/test_txt
sed -i 's/foo/baar/' dir1/test_txt ;
Update the relevant file in the archive:
tar -uf test_tar.tar dir1/test_txt
Check if this worked:
tar -xf test_tar.tar dir1
cat ./dir1/test_txt
[...]$
baar
bar
Related
I need to secure copy (scp) to remotely copy a directory with its sub structure preserved from the UNIX command line. The sub directories have identically named files that I WANT and a bunch of other stuff that I don't. Here is what the structure looks like:
directorytocopy
    subdir1
        1.wanted
        2.wanted
        ...
        1.unwanted
        2.notwanted
    subdir2
        1.wanted
        2.wanted
        ...
        1.unwanted
        2.notwanted
    ...
I just want the .wanted files preserving the directory structure. I realize that it is possible to write a shell (I am using bash) script to do this. Is it possible to do this in a less brute force way? I cannot copy the whole thing and delete the unwanted files because I do not have enough space.
Adrian has the best idea to use rsync. You can also use tar to bundle the wanted files:
cd directorytocopy
shopt -s nullglob globstar
tar -cf - **/*.wanted | ssh destination 'cd dirToPaste && tar -xvf -'
Here, tar's -f option is given the filename - so that stdin/stdout is used as the archive file.
This is untested, and may fail because the archive may not contain the actual subdirectories that hold the "wanted" files.
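If that turns out to be a problem, a hedged variant (still assuming GNU tar and the shell options set above; destination and dirToPaste are placeholders) is to name the directory entries explicitly and tell tar not to recurse into them:
# sketch only: assumes GNU tar and shopt -s nullglob globstar as above
tar -cf - --no-recursion **/ **/*.wanted | ssh destination 'cd dirToPaste && tar -xvf -'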
Assuming GNU tar on the source machine, and assuming that filenames of the wanted files won't contain newlines and they are short enough to fit the tar headers:
find /some/directory -type f -name '*.wanted' | \
tar cf - --files-from - | \
ssh user@host 'cd /some/other/dir && tar xvpf -'
rsync with an --exclude/--include list, following @Adrian Frühwirth's suggestion, would be a way to do this.
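For example, a sketch of that rsync approach; the host and destination path are placeholders, and this assumes rsync is available on both ends:
# sketch only: host and destination path are placeholders
rsync -av --include='*/' --include='*.wanted' --exclude='*' --prune-empty-dirs \
    directorytocopy/ user@host:/path/to/destination/
The --include='*/' rule lets rsync descend into every subdirectory, --include='*.wanted' keeps the wanted files, --exclude='*' filters out everything else, and --prune-empty-dirs skips directories that would end up empty.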
How can I untar all tar files in one command using PuTTY?
I tried the following but it's not un-tarring (all files start with alcatelS*):
tar -xfv alcatelS*.tar
It is not working; I don't get any errors and it is not un-tarring.
Thank you,
-xfv is wrong since v is then treated as the archive filename instead. Also, tar can't accept multiple archive files to extract at once. Perhaps -M could be used, but it was a little stubborn when I tried it. It would also be difficult to pass multiple arguments produced by pathname expansion, i.e. you would have to write tar -xvM -f file1.tar -f file2.tar.
Do this instead:
for F in alcatelS*.tar; do
tar -xvf "$F"
done
Or on one line: (EDIT: Sorry, that -is- a "one"-liner, but I find it's not technically a real one-liner, just a condensed one, so I shouldn't have referred to it as a one-liner.)
for F in alcatelS*.tar; do tar -xvf "$F"; done
You can use the following command to extract all tar.gz files in a directory on Unix:
find . -name 'alcatelS*.tar.gz' -exec tar -xvf {} \;
The following is my favorite way to untar multiple tar files:
ls *tar.gz | xargs -n1 tar xvf
Can be done in one line:
cat *.tar | tar -xvf - -i
I have a file named 2014-03-19_cis_digital.tar.gz in a source directory. I will have to first gunzip the file, then untar it and move the untarred files to another directory.
Can anyone help me write the shell script commands?
Change your working directory first, then untar/ungzip:
cd "$TARGET_DIR"
tar xzf "$PATH_TO_FILE"
You don't need to gunzip separately. You can do everything in one command:
tar -xzf /source/dir/2014-03-19_cis_digital.tar.gz -C /target/dir
I would like to ask if there is a way to search for a file inside a .tar.gz file without extracting it? If there is, is there a way to search for that file by date?
My OS is AIX.
Thanks!
tar can be instructed to preserve atimes on files it archives, but not all tars do this, and I am unfortunately not familiar with AIX-specific tar in this case. What you need to know is whether tar was invoked with --atime-preserve (AIX tar may not support this; be sure to check), and when you call an extraction you must use the -p flag. So, you'd have something like this:
tar zxpf file.tar.gz the/file/you/want.txt
You will likely find that Unix (cf. Linux) tar won't support the -j and -z options, so you would have to use:
gzip -dc file.tar.gz | tar xf - the/file/you/want.txt
to run the command from a pipe. In this case, you would need to know the name of the file you want extracted, which you can get from:
tar tf file.tar.gz
using compression as required. Obviously you can tack on a | grep foo if you are looking for a file named foo.
I do not think it is possible to extract a file from tar based upon the modification date of the file in the tarball – at least I was not able to find support for such in the documentation. Remember, tar is just the tape archiver and is not meant to do such fancy things. :-)
Lastly, you can do this:
tar xvf file.tar `tar tf file.tar | grep foo`
if you want to pull out all the files matching 'foo' from file.tar (compression above yada yada). I do not suggest running that command on an actual tape drive!
$ tar tzf archive.tar.gz | grep "search"
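Since a verbose listing (tar tvf) shows each member's modification time, one rough way to handle the "search by date" part is to filter that listing; the date string below is a placeholder and the exact listing format may differ with AIX tar:
# sketch only: date string is a placeholder; listing format may vary on AIX
gzip -dc archive.tar.gz | tar tvf - | grep '2014-03-19'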
I have a tar archive which contains several text files. I would like to write a script to display (stdout) the content of a file without extracting it to the current directory.
Actually I would like to do the same as:
tar xf myArchive.tar folder/someFile.txt
cat folder/someFile.txt
rm -R folder
but without the rm...
I tried this way but it didn't work:
tar tf myArchive.tar folder/someFile.txt | cat
Thanks
Use x to extract and f to read from an archive file. Then also add the -O option to direct the extracted files to standard output.
tar xf myArchive.tar folder/someFile.txt -O
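If the archive were gzip-compressed, the same approach should work by adding z (assuming GNU tar; the archive and member names are placeholders):
# sketch only: assumes GNU tar; archive and member names are placeholders
tar xzf myArchive.tar.gz folder/someFile.txt -O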