I'm using command-line 7-Zip to zip up the contents of a folder (in Windows), like this:
7za a myzip.zip * -tzip -r
I've discovered that running exactly the same command line twice will produce two different ZIP files: they have the same size, but a binary compare (i.e. fc /b file1.zip file2.zip) shows they are different. To complicate matters, if you make the two zips in rapid succession they are the same, but if you make them on different days or a few hours apart they are not.
I presume that there's a date/time stamp in the ZIP file somewhere but I can't find anything on the 7zip site to confirm that.
Assuming I'm right does anyone know how to suppress the date/time? Or is something else causing the binaries to be different?
7-Zip has the switch -m with the parameter tc, which defaults to on if it is not specified on the command line.
With -mtc=on, all three timestamps of a file stored on an NTFS partition are stored in the archive:
the last modification time,
the creation time, and also
the last access time.
See the page titled "-m (Set compression Method) switch" in the 7-Zip help.
The last access time of the files is most likely responsible for the differences between the archive files.
Append -mtc=off to avoid storing the NTFS timestamps in the archive file.
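For example, applied to the command from the question (same folder and archive name), that would be:
7za a myzip.zip * -tzip -r -mtc=off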
Related
I have an issue with moving a large number of files and folders from an old Mac OS X server (10.9.4) to Windows Server 2016.
I already tried using robocopy, but I always get an error: "File creation error - The file or directory is not a reparse point."
The same thing happens when I use xcopy.
I also tried copying files from the Mac to Windows after mounting a share on the Mac and using scp, but there were also errors and not all the files were moved.
Does anyone know a way I can copy the files and preserve their creation and modification dates?
rsync works fine now, but I needed to give full permissions over the network to prevent any issues.
I generally use tar.
tar, like rsync and lsyncd, preserves modification times, etc.
For tar you would do the following (a command sketch follows the list):
tar up all the files just like a backup
gzip the tar ball
copy the gzipped tar ball to Windows
Extract the files using WinRAR. The file modification times will be preserved.
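A minimal sketch of those steps on the Mac side (the folder and archive names are placeholders):
tar -cf backup.tar /path/to/folder    # 1. tar up all the files
gzip backup.tar                       # 2. gzip the tarball, producing backup.tar.gz
# 3. copy backup.tar.gz to the Windows server (e.g. over the mounted share)
# 4. extract it there with WinRAR; file modification times are preserved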
I am trying to code a script to automatically process some of our daily ftp files.
I have already scripted the download from the source FTP using WinSCP, called from a .bat file, and would ideally like to call the rest from within the same bat. The scripting language does not matter, as long as I can run/call it from the original batch file.
I need to extract the date from the filename and unzip the contents into corresponding folders. The source file is delivered automatically every day via FTP, and the filename is:
SOFL_CLAIM_TC201702270720000075.zip
The date portion of the name (20170227) is the part I would like to extract.
The .zip contains two types of content: multiple PDFs and a .dat file.
For the supplied date of 20170227, the pdfs need to get extracted to a folder following the format:
\%root%\FNOIs\2017\02-Feb\02-27-2017
At the same time, the .dat file needs to get extracted to multiple folders following the format:
\%root%\Claim Add\2017 Claim Add\02-2017
\%root2%\vendorFTP\VendorFolder
After extracting, I need to move the source zip to
\%root%\Claim Add\2017 Claim Add\02-2017
What is the best way of accomplishing all of this?
I am assuming it would be the for /f batch command, but I am new to batch coding and cannot figure out how to start it from scratch.
I also have 7zip installed, but do not understand how to use the command-line options.
You have asked for a lot in one question, and not shown any code or demonstrated effort on your part.
For the first part, once you have the filename in a variable:
set FILENAME=SOFL_CLAIM_TC201702270720000075.zip
You can get the date part with:
echo %FILENAME:~13,-14%
The syntax :~13,-14 means "skip the first 13 characters and drop the last 14 characters." That should leave you with just the date.
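A minimal sketch that carries on from there and splits the date into the pieces used by the target folders (%root% and the folder layout are taken from the question; the variable names here are arbitrary, and the 02-Feb style folder would additionally need a month-name lookup):
set "FILENAME=SOFL_CLAIM_TC201702270720000075.zip"
set "FILEDATE=%FILENAME:~13,-14%"
rem split the 8-digit date into year, month and day
set "YYYY=%FILEDATE:~0,4%"
set "MM=%FILEDATE:~4,2%"
set "DD=%FILEDATE:~6,2%"
rem build one of the target paths from the question (%root% assumed to be set elsewhere)
set "TARGET=%root%\Claim Add\%YYYY% Claim Add\%MM%-%YYYY%"
echo %TARGET%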
When you integrate that into your script, show your code.
I need a way to select several files with check-boxes and drag them all onto a batch-file icon. The first step in the script would be to compress them into a single zip file before proceeding to the next step. If possible, it would also be useful to end up with each file in a separate zip of its own for storage.
I am not sure how to write the for commands to gather all the selected files into a single zip file in a script. The Windows "Send to compressed (zipped) folder" function works perfectly if I select all the files and then copy and paste them into a Windows zipped folder, but I don't know how to access that from within a batch file.
Drag and drop multiple files
By default, if you drag a file onto a batch file, it is the same as passing the file path as a parameter. Inside the batch file this can be accessed using %1. Dragging multiple files has a similar effect and is like calling test.bat file1.txt file2.txt file3.txt. Each "parameter" can be accessed using the subsequent variables %1, %2, %3, etc., up to %9.
Zipping files
First you will need a utility that supports command line operations, such as 7-zip. Once you have the appropriate executable in your path, you will need to review the documentation on how to zip files from the command line. Instead of using the paths and filenames, you will use the variables mentioned above.
Here is some pseudo code:
zipfiles output.zip %1 %2 %3 %4 %5
Notes
With the numbered parameters you can only reference the first 9 files (%1 through %9); to handle more you would need shift or %* (see the sketch below).
You may need to confirm that each variable exists. If you try to include %9 in the zip command but you only dropped 8 files, you may get an error.
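A minimal sketch of such a batch file using 7-Zip (the archive names are arbitrary and 7z.exe is assumed to be on the PATH); it uses %* so it is not limited to 9 files:
@echo off
rem add every file dropped onto this batch file to a single zip
7z a combined.zip %*
rem optionally also create one zip per file for storage
for %%F in (%*) do 7z a "%%~nF.zip" "%%~F"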
I am trying to create a script for a cron job. I have a folder of around 8 GB containing thousands of files. I am trying to create a bash script which first tars the folder and then transfers the tarred file to an FTP server.
But I am not sure what happens if, while tar is tarring the folder, some other process is accessing or writing to the files inside it.
It is fine for me if the tarred file does not contain changes that were made while tar was running.
Please suggest the proper way. Thanks.
tar will happily tar "whatever it can", but you will probably have some surprises when untarring, as tar also records the size of each file before tarring it.
A very unpleasant surprise would be: if the file was truncated while being archived, tar will "fill" it with NUL characters to match its recorded size, which can have very unpleasant side effects. In some cases tar will say nothing when untarring and silently add as many NUL characters as it needs to match the size (in fact, on Unix it doesn't even need to do that: the OS does it, see "sparse files"). In other cases, if the truncation occurred during the tarring of the file, tar will complain about an unexpected end of file when untarring (as it expected XXX bytes but read fewer than that), but will still say that the file should be XXX bytes (and Unix OSes will then create it as a sparse file, with NUL characters magically appended at the end to match the expected size when you read it).
(To see the NUL characters, an easy way is to less thefile, or cat -v thefile | more on a very old Unix, and look for any ^@.)
On the contrary, if files are only appended to (logs, etc.), then the side effect is less problematic: you will only miss some bits of them (which you say you're OK with), and you won't get that unpleasant "fill with NUL characters" side effect. tar may complain when untarring the file, but it will untar it.
I think tar fails (and so does not create the archive) when an archived file is modified during archiving. As Etan said, the solution depends on what you finally want in the tarball.
To avoid a tar failure, you can simply copy the folder elsewhere before calling tar. But in this case you cannot be confident in the consistency of the backed-up directory: the copy is NOT an atomic operation, so some files will be up to date while other files will be outdated. Whether that is a severe issue or not depends on your situation.
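A minimal sketch of that copy-then-tar approach as a cron-friendly bash script (all paths are placeholders, and the FTP upload step is only indicated in a comment):
#!/bin/bash
set -e
SRC=/path/to/folder            # folder to back up (placeholder)
WORK=$(mktemp -d)              # temporary working copy

# copy first, so tar reads a snapshot rather than the live folder
cp -a "$SRC" "$WORK/"

# tar/gzip the copy; changes to the live folder no longer affect this step
tar -czf /tmp/backup.tar.gz -C "$WORK" "$(basename "$SRC")"

rm -rf "$WORK"
# ...then upload /tmp/backup.tar.gz to the FTP server (e.g. with curl or lftp)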
If you can, I suggest you configure how these files are created. For example: "only recent files are appended to; files older than 1 day are never changed". In that case you can easily back up only the old files and the backup will be consistent.
More generally, you have to accept losing the latest data AND not being consistent (each file is backed up at a different time), or you have to act at a different level. I suggest:
Configure the software that produces the data so that a consistent backup is possible,
Or use OS/virtualization features. For example, it is possible to take a consistent snapshot of the storage on some virtual storage systems...
I'm currently trying to find a way to concatenate several files, typically all the files within a directory (including subdirectories), into a single stream for further processing.
TAR looks like an obvious candidate, except that it is not at all standard on Windows, and unfortunately all the versions I could find (mostly variations of GNU tar) are much too big (several hundred KB once DLL dependencies are included). I need something much smaller.
Apparently, the standard COPY command could do the trick. For example the following command works:
COPY /B sourcefile1+sourcefile2 destinationfile
However, there are still two problems: I don't know how to write the result to stdout (for a pipe), and even more importantly, how do I achieve the reverse operation?
I need a small utility to do this concatenation job, either as C source code, a standard Windows command, or a distributable binary. It doesn't need to respect the TAR format (although that wouldn't be a bad thing), and obviously the concatenation must be reversible.
I suggest using 7-Zip. It has a portable version, can compress very well (or just store without compression), recurses into subdirectories, and can write its output to a single stream (stdout).
It has "-so" (write data to stdout) switch. For example,
7z x archive.gz -so > Doc.txt
decompresses the archive.gz archive to the output stream and then redirects that stream to the file Doc.txt.
7z a -tzip -so -r src\*.cpp src\*.h > archive.zip
compresses all the *.cpp and *.h files in the src directory and all its subdirectories to the 7-Zip standard output stream and writes that stream to the archive.zip file (remove "> archive.zip" and intercept the output with your program).
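As a further hedged sketch of intercepting and reversing such a stream, the tar container can be written to stdout with -so and read back from stdin with -si by 7-Zip; "dummy", "your_program.exe" and the output directory are placeholders:
7z a dummy -ttar -so -r src\* | your_program.exe
your_program.exe | 7z x -si -ttar -oC:\restored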
Why don't you use ZIP (disable compression if you want)? It's very standard, and support comes built into Windows. See Creating a ZIP file on Windows (XP/2003) in C/C++
Pure concatenation isn't reversible, because you can't know where to split it again. So you should use a directory of chunk sizes, such as exists in the ZIP and TAR formats.
Well, Shelwien's almost solved the issue.
The tar version he proposes is "lean enough" (~120 KB) and does not require external DLL dependencies.
http://downloads.sourceforge.net/project/unxutils/unxutils/current/UnxUtils.zip
Unfortunately, it also has some problems of its own, such as no support for Unicode characters, interpreted escape sequences (so a directory name starting with t triggers a \t, which is treated as a tab), and a potential problem with the pipe implementation under Windows XP (although that last one could come from the other program).
So that's a dead end.
A solution is still to be found...
[Edit] Shelwien has since provided a solution by creating "shar", a tar replacement that is much smaller and much more efficient, without the limitations described above. This solves the issue.