I am successfully able to create and boot a little OS generated from buildroot on my embedded system.
Buildroot generates a cpio.
I extract it (a working cpio that boots with no problem) with
cpio -iv < ../rootfs.cpio
in a directory. After that I repack it with
find . | cpio --quiet -o -H newc > ../rootfs.cpio
which is the command used by buildroot (in cpio.mk:31),
but at this point the cpio no longer works: if I build my kernel with that file, I get a kernel panic.
File permissions seem the same. With Beyond Compare I saw that the repacked file has some additional zero bytes at the end, just before the init script. I've tried to remove them, but then the system freezes; maybe there is a CRC check somewhere.
Any ideas?
The point is that I'd like to start from Buildroot, add my own files to that rootfs, and then embed it into my kernel.
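For reference, a minimal version of the round trip would look something like this (a sketch; both steps are assumed to run as root, or under fakeroot, so that file ownership and device nodes like /dev/console survive):
mkdir rootfs && cd rootfs
sudo cpio -idv < ../rootfs.cpio
# ...add or modify files here...
sudo sh -c 'find . | cpio --quiet -o -H newc > ../rootfs.cpio'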
I would like to decode HEVC encoded files to YUV files.
Is there any simple way to do this yet? An executable would be nice but I would make do with source code that is easily compilable.
It's as simple as this (the guide assumes Linux; tweak it to your needs):
Clone the official reference codec. (The official-official source is an SVN repository at https://hevc.hhi.fraunhofer.de/svn/svn_HEVCSoftware/trunk/, but BBC provides a read-only Git mirror that is kept in sync with the SVN, which is much easier to work with IMHO.)
git clone git://hevc.kw.bbc.co.uk/git/jctvc-hm.git
To create the executables:
cd jctvc-hm/build/linux && make -f makefile
Binaries are now placed in
jctvc-hm/bin
Now, to decode an HEVC-encoded binary file into YCbCr, do
./TAppDecoderStatic -b encoded_file.bin -o reconstructed.yuv
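Since the raw YUV output has no header, a quick way to sanity-check it is to play it back with ffplay, supplying the frame size and pixel format yourself (the values below are only examples; use whatever the clip was actually encoded with):
ffplay -f rawvideo -video_size 1920x1080 -pixel_format yuv420p reconstructed.yuv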
If you are not on a Linux system, just go to the build folder and you will hopefully find something you can use for your system:
$ cd jctvc-hm/build && ls
HM_vc10.sln HM_vc8.sln HM_vc9.sln linux/ vc10/ vc8/ vc9/
Follow the instructions in https://hevc.hhi.fraunhofer.de/svn/svn_HEVCSoftware/branches/HM-9.2-dev/doc/software-manual.pdf; the source code can be downloaded from https://hevc.hhi.fraunhofer.de/svn/svn_HEVCSoftware/trunk/ using any Subversion client.
You can build it on both Windows and Linux. After you have built the software, you can run the executables as instructed in the software manual.
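For example, with the command-line Subversion client the checkout would be (the local directory name HM is arbitrary):
svn checkout https://hevc.hhi.fraunhofer.de/svn/svn_HEVCSoftware/trunk/ HM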
Alternatively, you can use libde265 as a much faster decoder.
Get the latest version from its GitHub releases page.
Configure with ./configure --disable-sherlock265
Compile: make
Generate the YUV file with
./dec265/dec265 hevc-file.bin -o output.yuv -t4
The -t4 option enables multi-threaded decoding with 4 threads. You can also do more things like feed it NAL-unit streams, dump the headers, display the video directly, or check the SEI hashes.
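Putting the steps together, the whole sequence is roughly the following (a sketch; replace <version> with the release you actually downloaded, assumed here to come from https://github.com/strukturag/libde265/releases):
tar xzf libde265-<version>.tar.gz
cd libde265-<version>
./configure --disable-sherlock265
make
./dec265/dec265 hevc-file.bin -o output.yuv -t4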
You can download the ffmpeg Windows build (a standalone exe) and simply decode the HEVC bitstream:
ffmpeg.exe -i xxx.bin out.yuv
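If the source is, say, 10-bit and you specifically want plain 8-bit 4:2:0 output, you can additionally force the pixel format (optional):
ffmpeg.exe -i xxx.bin -pix_fmt yuv420p out.yuv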
I have an Amazon EC2 instance running CentOS. Unfortunately I don't have a GUI. I tried setting up X11 forwarding, but apparently it works differently with Ubuntu than it does with CentOS. But that's not the point. I downloaded a pretty large .gz file (8.7 GB) and extracted it using the following command:
gzip -d [filename] &
It took nearly an hour to decompress, and using ls -l I could see that the uncompressed directory was going to be nearly 30 GB. Anyway, the process finished, and when I ls again the directory is nowhere to be found. I tried ls -a as well, but still nothing. Any thoughts on this?
This sounds like gzip silently failing when it runs out of space. How large is the EBS volume / local disk you're decompressing onto? (Run df -h and figure out which device you're decompressing onto.)
Additionally, you could try running gzip in verbose mode to catch any errors it might not otherwise show. I don't have a CentOS machine handy, but you might be able to use gzip -l [filename] to figure out whether the uncompressed file is too big for the target filesystem.
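For example (note that for inputs whose uncompressed size exceeds 4 GB, gzip -l reports that size modulo 2^32, so treat it as a rough check only):
df -h .              # free space on the filesystem you're decompressing onto
gzip -l filename.gz  # compressed vs. uncompressed size and ratio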
I create shadow copies of big directories using cpio (find . | cpio -pdm destination)
After upgrading to Mountain Lion, cpio now warns me about every file that it didn't copy, e.g. cpio: ./some-file: File on disk is not older; skipping.
I could redirect stderr; however, I do want to know about real errors, like the destination being full.
cpio --quiet does not help.
Ideas?
It means the files at your destination are more recent than the ones in your source.
This may be caused by an incorrect clock on your machine.
If you are sure you want to overwrite them, you can use cpio's -u switch:
from https://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man1/cpio.1.html :
-u (i and p modes) Unconditionally overwrite existing files. Ordinarily, an older file will not overwrite a newer file on disk.
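Applied to the command from the question, that would be:
find . | cpio -pdmu destination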
I have a series of files named filename.part0.tar, filename.part1.tar, … filename.part8.tar.
I guess tar can create multiple volumes when archiving, but I can't seem to find a way to unarchive them on Windows. I've tried to untar them using 7zip (GUI & command line), WinRAR, tar114 (which doesn't run on 64-bit Windows), WinZip, and ZenTar (a little utility I found).
All programs run through the part0 file, extracting 3 RAR files, then quit with an error. None of the other part files are recognized as .tar, .rar, .zip, or .gz.
I've tried concatenating them using the DOS copy command, but that doesn't work, possibly because part0 through part6 and part8 are each 100 MB, while part7 is 53 MB and is therefore likely the last part. I've tried several different logical orders for the files in the concatenation, but no joy.
Other than installing Linux, finding a live distro, or tracking down the guy who left these files for me, how can I untar these files?
Install 7-zip. Right click on the first tar. In the context menu, go to "7zip -> Extract Here".
Works like a charm, no command-line kung-fu needed:)
EDIT:
I only now noticed that you mention already having tried 7zip. It might have balked if you tried to "open" the tar by going "Open with" -> 7zip: its command line for opening files is a little unorthodox, so you have to associate the file type via 7zip instead of via the file-association system built into Windows. If you use right-click -> "7-Zip" -> "Extract Here", though, that should work. I tested the solution myself (albeit on a 32-bit Windows box; I don't have a 64-bit one available).
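If you do want the command line after all, the equivalent of "Extract Here" with the bundled 7z.exe, run from the folder containing all the parts, should be:
7z x filename.part0.tar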
1) Download gzip for Windows from http://www.gzip.org/ and unpack it.
2) gzip -c filename.part0.tar > foo.gz
gzip -c filename.part1.tar >> foo.gz
...
gzip -c filename.part8.tar >> foo.gz
3) unpack foo.gz
worked for me
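To spell out step 3 (assuming GNU gzip; gzip -dc simply concatenates the decompressed streams back into one tar, which you can then open with tar or 7-Zip):
gzip -dc foo.gz > filename.tar
tar xf filename.tar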
As above, I had the same issue and ran into this old thread. For me it was a severe case of RTFM when installing a Siebel VM. These instructions were straight from the manual:
cat \
OVM_EL5U3_X86_ORACLE11G_SIEBEL811ENU_SIA21111_PVM.tgz.1of3 \
OVM_EL5U3_X86_ORACLE11G_SIEBEL811ENU_SIA21111_PVM.tgz.2of3 \
OVM_EL5U3_X86_ORACLE11G_SIEBEL811ENU_SIA21111_PVM.tgz.3of3 \
| tar xzf -
Worked for me!
The tar -M switch should do it for you on Windows (I'm using tar.exe).
tar --help says:
-M, --multi-volume create/list/extract multi-volume archive
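For extraction that would look something like this (a sketch assuming GNU tar and that the parts were written in numeric order; list every volume with its own -f):
tar -x -M -f filename.part0.tar -f filename.part1.tar -f filename.part2.tar -f filename.part3.tar -f filename.part4.tar -f filename.part5.tar -f filename.part6.tar -f filename.part7.tar -f filename.part8.tar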
I found this thread because I had the same problem with these files. Yes, the same exact files you have. Here's the correct order: 042358617 (i.e. start with part0, then part4, etc.)
Concatenate in that order and you'll get a tarball you can unarchive. (I'm not on Windows, so I can't advise on what app to use.) Note that of the 19 items contained therein, 3 are zip files that some unarchive utilities will report as being corrupted. Other apps will allow you to extract 99% of their contents. Again, I'm not on Windows, so you'll have to experiment for yourself.
Enjoy! ;)
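On Windows, that concatenation would look something like this (the output name is arbitrary):
copy /b filename.part0.tar+filename.part4.tar+filename.part2.tar+filename.part3.tar+filename.part5.tar+filename.part8.tar+filename.part6.tar+filename.part1.tar+filename.part7.tar combined.tar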
This works well for me with multi-volume tar archives (numbered .tar.1, .tar.2, and so on) and even lets you --list or --get specific folders or files from them:
#!/bin/bash
TAR=/usr/bin/tar
ARCHIVE=bkup-01Jun
RPATH=home/user
RDEST=restore/
EXCLUDE=.*
mkdir -p $RDEST
# -F runs the quoted snippet at each volume change; it writes the next volume's
# file name to the descriptor tar provides in $TAR_FD (volume number in $TAR_VOLUME)
$TAR vf $ARCHIVE.tar.1 -F 'echo '$ARCHIVE'.tar.${TAR_VOLUME} >&${TAR_FD}' -C $RDEST --get $RPATH --exclude "$EXCLUDE"
Copy to a script file, then just change the parameters:
TAR=location of tar binary
ARCHIVE=Archive base name (without .tar.multivolumenumber)
RPATH=path to restore (leave empty for full restore)
RDEST=restore destination folder (relative or absolute path)
EXCLUDE=files to exclude (with pattern matching)
The interesting thing for me is that you really DON'T use the -M option here: -F (--info-script) already implies multi-volume handling, and -M on its own would only prompt you interactively (insert next volume, etc.).
Hello, perhaps this will help.
I had the same problems...
A backup of my web site, made automatically on CentOS at 4 am, creates multiple files in multi-volume tar format (saveblabla.tar, saveblabla.tar1.tar, saveblabla.tar2.tar, etc.).
After downloading these files to my PC (Windows), I couldn't extract them with either the Windows command line or 7zip (unknown error).
I first did a binary copy to reassemble the tar files (as mentioned above in this thread):
copy /b file1+file2+file3 destination
After that, 7zip worked!!! Thanks for your help.