I have downloaded the ImageNet ILSVRC2012 image data from this site: http://www.image-net.org/challenges/LSVRC/2012/nonpub-downloads and saved it to a hard disk. I am working on OS X.
When I run tar -xvf ILSVRC2012_img_train.tar, I get this error:
x n01729977.tar: Truncated tar archive
tar: Error exit delayed from previous errors.
Does anyone have any idea how to fix it?
I tried downloading that synset separately through the website, but when I re-ran tar -xvf ILSVRC2012_img_train.tar, I had the same issue.
Thank you
Thank you, Prune. Indeed, my problem was linked to the download; I don't know why the file was not fully downloaded. The actual file is 140 GB.
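For anyone hitting the same error, checking the downloaded size (and the checksum, if the site publishes one) before extracting will catch a truncated download; a quick check on OS X:
ls -lh ILSVRC2012_img_train.tar   # should be the full ~140 GB, not something smaller
md5 ILSVRC2012_img_train.tar      # compare against the published MD5, if any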
Edit
After seeing 4ae1e1's suggestion, I decided to completely rework my post.
I am having a very strange problem with tar not archiving/extracting files with short, all uppercase names. I am using an embedded PC running Debian Sarge and a YAFFS2 file system.
I have been trying to create a zipped tar archive that contains a file with an all-uppercase name that's three letters long (AAA, for example). After creating a text file named AAA in my working directory, I use the following command to create the archive:
tar cvzf "/home/Update.tgz" ./*
./AAA
As you can see, tar shows the file being added to the archive just fine and tar returns 0 indicating success. When I try to extract the archive to the folder /temp with the following:
tar zxf "/home/Update.tgz" -C "/temp"
the file AAA is not extracted. tar again returns 0 indicating the extraction was successful. Running ls shows that the file AAA doesn't exist in /temp.
What is really strange is that if I rename the original file AAA to Aaa and recreate the archive, the file can be extracted successfully.
After finding this error, I have tried many different filenames, and every name that is all uppercase and three characters or fewer shows the same behavior. Has anyone seen a similar issue? I would really appreciate any light you could shed on this.
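For reference, listing the archive directly would show whether AAA is missing from the tarball itself or is only being lost at extraction time:
tar tzvf /home/Update.tgz   # AAA should appear in this listing if it was actually stored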
Thanks!
On a cluster I zipped a large (61GB, 9.2GB when zipped) directory.
zip -r zzDirectory Directory
I then scp'd zzDirectory.zip to my personal computer.
scp -r name@host.com:/path/to/zzDirectory.zip path/in/my/computer/zzDirectory.zip
And finally I unzipped it. I tried to unzip it from bash, but it failed:
warning [zzDirectory.zip]: 5544449626 extra bytes at beginning or within zipfile
(attempting to process anyway)
error [zzDirectory.zip]: start of central directory not found;
zipfile corrupt.
(please check that you have transferred or created the zipfile in the
appropriate BINARY mode and that you have compiled UnZip properly)
So I double-clicked the icon in the Finder and the system started to unzip zzDirectory.zip. However, some files are missing, and it looks like (I am not 100% sure yet) some newline characters (\n) are missing as well. unzip used to work fine on my computer before.
In order to investigate where the problem comes from, I unzipped zzDirectory.zip on the cluster and everything seems to work fine (no missing files).
I repeated the transfer and unzipped again, but the problem persists. Note that the transfers are made over the internet. My OS is Mac OS X Yosemite 10.10.2.
How can I solve this issue? I would prefer not to transfer the data unzipped because of bandwidth issues. Do you think I should try tar instead, or should I use specific options with the unzip command line?
On OS X you could try:
ditto -x -k the_over4gb.zip /path/to/dir/where/want/unzip
e.g.:
ditto -x -k zzDirectory.zip .
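Given the "extra bytes at beginning or within zipfile" warning, it is also worth ruling out a corrupted transfer by comparing checksums on both ends (a sketch, assuming md5sum is available on the cluster):
md5sum /path/to/zzDirectory.zip           # on the cluster
md5 path/in/my/computer/zzDirectory.zip   # on the Mac; OS X ships md5 rather than md5sum
If the two hashes match, the transfer is fine and the stock unzip is most likely just struggling with a Zip64 (over 4 GB) archive, which is what ditto works around.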
I am writing to an HP LTO4 tape drive, but after writing a big file (on the order of 30 GB) I am not able to write anything after that. I get
tar: directory checksum error
Does anyone have any idea what could be wrong?
I am using the command
tar -rvfE /dev/rmt/0 <file.gz>
Need help!
Problem resolved by using the -i flag.
So now I am using the command
tar -rvfEi /dev/rmt/0 <file>
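For reference, my reading of these options on Solaris tar (which the /dev/rmt/0 device path suggests) is:
# r - append files to the end of the archive on tape
# v - verbose
# f - use the next argument (/dev/rmt/0) as the archive/tape device
# E - write extended headers, required for files larger than 8 GB
# i - ignore directory checksum errors instead of aborting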
:)
I would like to compress a directory.
tar -cvzf mydir.tar.gz mydir
but this retains symlinks so that it is not portable to a new system.
How can I convert symlinks?
I have tried
tar -cvzfh
since man tar says
-h, --dereference
don’t dump symlinks; dump the files they point to
but this results in an error
tar: Error exit delayed from previous errors
and creates a file called "zh"
My files are on a RHEL server.
Your tar.gz file name must follow immediately after the -f flag, so merely reordering the flags so that f comes last should work:
tar -cvzhf mydir.tar.gz mydir
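To confirm the links were actually dereferenced, listing the archive is a quick check (with GNU tar, symlink entries start with l in the verbose listing):
tar -tvzf mydir.tar.gz | grep '^l'   # prints nothing if every symlink was replaced by its target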
Hey, I have a bash script running a wget command to get a directory:
wget -r -nH --cut-dirs=5 "ftp://$USER:$PASS@stumpyinc.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"
And what it's supposed to do is download a folder structure that looks like this:
$theme_root/Library/Themes/$theme_name.theme/Icons
For some reason, it won't download any folder that's inside the $theme_name.theme folder. There's also a UIImages folder in there that's not showing up, although the files in that folder are being downloaded. Does anyone notice anything that I might have done wrong? Thanks in advance!
EDIT
If you add --level=inf it works perfectly!
Wget's default recursive retrieval depth is 5 levels, as per the wget manual. If the files you are trying to get are deeper than that relative to your starting point, it will not descend to them. You can try giving a larger --level option or, as in your edit, --level=inf.
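For example, the original command with the depth limit lifted (same variables as in the question):
wget -r --level=inf -nH --cut-dirs=5 "ftp://$USER:$PASS@stumpyinc.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"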