Code:
$timestamp = (Get-Date).ToString('yyyy-MM-dd')
$originalSource = Get-ChildItem "D:\output\csv\*.csv", "D:\output\csv\Billing\*.csv" | Where-Object {($_.LastWriteTime -ge [datetime]::today)}
$source = $originalSource
$target = "D:\output\csv\bin\$timestamp.7z"
$housekeepZipFile = "D:\output\csv\bin\*"
####Using 7z to zip
if (-not (test-path "D:\bin\7-Zip\7z.exe")) {throw "D:\bin\7-Zip\7z.exe needed"}
set-alias sz "D:\bin\7-Zip\7z.exe"
sz a -mx=0 -mhe=on -m0=lzma2 $target $source
I have tried the PowerShell above. When I drop files into the csv and Billing folders, it creates the archive file that I want.
When no file with today's date has been dropped into the csv and Billing folders, 7-Zip pulls in seemingly random Windows files and creates an archive from those instead.
Question:
How can I make it create an archive only if there is a file with the current date, or create an archive (with an empty folder) if there is no file with the current date? P.S.: I tried using Where-Object to filter on LastWriteTime, but it doesn't seem to be enough.
You just have to add an if check after calling Get-ChildItem:
$timestamp = (Get-Date).ToString('yyyy-MM-dd')
$originalSource = Get-ChildItem "D:\output\csv\*.csv", "D:\output\csv\Billing\*.csv" | Where-Object {($_.LastWriteTime -ge [datetime]::today)}
if($originalSource) {
    $source = $originalSource
    $target = "D:\output\csv\bin\$timestamp.7z"
    $housekeepZipFile = "D:\output\csv\bin\*"
    ####Using 7z to zip
    if (-not (test-path "D:\bin\7-Zip\7z.exe")) {throw "D:\bin\7-Zip\7z.exe needed"}
    set-alias sz "D:\bin\7-Zip\7z.exe"
    sz a -mx=0 -mhe=on -m0=lzma2 $target $source
}
So the archive will only be created if matching files are found.
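The question also asks for an archive (with an empty folder) when there is no file with the current date. The code above simply skips that case; one possible sketch for it (untested, and it assumes 7-Zip will happily archive a lone empty directory) is:
$target = "D:\output\csv\bin\$timestamp.7z"
if (-not (Test-Path "D:\bin\7-Zip\7z.exe")) {throw "D:\bin\7-Zip\7z.exe needed"}
Set-Alias sz "D:\bin\7-Zip\7z.exe"
if ($originalSource) {
    sz a -mx=0 -mhe=on -m0=lzma2 $target $originalSource
}
else {
    # No file with today's date: archive a temporary empty folder instead,
    # so that $timestamp.7z is still produced.
    $emptyDir = New-Item -ItemType Directory -Force -Path (Join-Path $env:TEMP "empty-$timestamp")
    sz a -mx=0 -mhe=on -m0=lzma2 $target $emptyDir.FullName
    Remove-Item $emptyDir -Force
}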
I just want to know how to convert a folder of images into a CBZ file that my e-book reader can actually read (I checked, and it can handle this format).
Ideally, I would like to do the conversion without having to install anything extra. Just a script.
For those in a hurry: I already answered my own question below... just sharing it.
Go see the UPDATE part.
Assuming your OS is Windows, we can do it with Batch or PowerShell.
The process is quite easy once we understand that a CBZ file IS a ZIP file with images in it. So we will just:
zip with 7-Zip, because for some reason the files created with WinRAR didn't work on my e-book reader (they didn't even show up in the library);
change the extension from .zip to .cbz.
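Conceptually, the whole conversion boils down to something like this (a minimal sketch; the folder and file names are placeholders, and 7-Zip is assumed to be at its default install path):
# Zip the images of one folder, then rename the result to .cbz
& "$env:ProgramFiles\7-Zip\7z.exe" a "My first folder.zip" ".\My first folder\*"
Rename-Item -Path "My first folder.zip" -NewName "My first folder.cbz"
The scripts below just automate this for every folder next to the script.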
I'm only going to show the PowerShell one because the .bat script had known issues.
Architecture
The architecture should be:
My first folder
    first image
    second image
My second folder
    first image
    second image
PowerShell
Here's the code from my "#_ImagesFolderToCBZ.ps1"
Clear-Host
# INPUT - Script folder path
$i = Split-Path -Parent -Path $MyInvocation.MyCommand.Definition
# 7Zip path
$7zipPath = "$env:ProgramFiles\7-Zip\7z.exe"
# A list of the path of the zipped files
$listOfZippedFiles = New-Object Collections.Generic.List[String]
$listOfZippedNames = New-Object Collections.Generic.List[String]
# Ask the user if we delete the folders after their conversion
$isSdel = Read-Host -Prompt "Do you want to delete the folders after conversion [Y/N]: "
# Get the files inside the INPUT path and forEach children
Get-ChildItem "$i" | ForEach-Object {
# Get the full path of the file
$pathFolder = $_.FullName
# If the path here is a folder
if (Test-Path -Path "$pathFolder" -PathType Container) {
# Set the alias
Set-Alias 7z $7zipPath
# If the user asked for deletion of folders
if ("Y" -eq $isSdel.ToUpper()) {
# Zip the content of the folder
7z a "$pathFolder.zip" "$pathFolder\*" -sdel | FIND "ing archive"
}
else {
# Zip the content of the folder
7z a "$pathFolder.zip" "$pathFolder\*" | FIND "ing archive"
}
# Add the file name into the list
$listOfZippedFiles.Add("$pathFolder.zip")
$listOfZippedNames.Add("$_.zip")
}
# If the user asked for deletion of folders
if ("Y" -eq $isSdel) {
# Remove the now blank folders
if( $_.psiscontainer -eq $true){
if((gci $_.FullName) -eq $null){
$_.FullName | Remove-Item -Force
}
}
}
}
# For each zipped file
foreach ($file in $listOfZippedFiles) {
# Change the extension to CBZ
$dest = [System.IO.Path]::ChangeExtension("$file",".cbz")
Move-Item -Path "$file" -Destination $dest -Force
}
# Write for the user
Write-Host "`r`n`r`nConverted:"
# Displaying the converted files by their names
foreach ($file in $listOfZippedNames) {
$newName = [System.IO.Path]::ChangeExtension("$file",".cbz")
Write-Host "-- $newName"
}
# Blank line
Write-Host ""
# Pause to let us see the result
Pause
Output
As we can see, the CBZ files are successfully created, AND without the side effect I had with my batch script, where ZIP files already sitting in the script's root folder were also renamed to CBZ ones.
I also added the choice to automatically delete the converted folders OR not.
Obviously, there's room for improvement (especially in how the folders are deleted). I'll gladly take any advice.
UPDATE
I updated my script and it's much better: fewer instructions, and a list (in the console) that updates itself as each folder is actually converted. So no more: 1) zip all folders, then 2) rename their extensions.
The flow is more logical, and it shows a nice real-time progress view.
Here's the updated code:
Clear-Host
# ROOT - Script folder path
$root = Split-Path -Parent -Path $MyInvocation.MyCommand.Definition
# 7Zip path
$7zipPath = "$env:ProgramFiles\7-Zip\7z.exe"
# Test if 7zip is installed
if (-not (Test-Path -Path $7zipPath -PathType Leaf)) {
throw "7 zip file '$7zipPath' not found"
}
# Ask the user if we delete the folders after their conversion
$isSdel = Read-Host -Prompt "Do you want to delete the folders after conversion [Y/N]: "
# Write for the user
Write-Host "`r`nConverted:"
# Get the files inside the INPUT path and forEach children
Get-ChildItem "$root" | ForEach-Object {
# Get the full path of the file
$pathFolder = $_.FullName
# If the path here is a folder
if (Test-Path -Path "$pathFolder" -PathType Container) {
# If the user asked for deletion of folders
if ("Y" -eq $isSdel.ToUpper()) {
# Zip the content of the folder while deleting the files zipped
& $7zipPath a "$pathFolder.zip" "$pathFolder\*" -sdel > $null
# Remove the now blank folder
if( $_.psiscontainer -eq $true){
if((gci $_.FullName) -eq $null){
$_.FullName | Remove-Item -Force
}
}
}
else {
# Zip the content of the folder
& $7zipPath a "$pathFolder.zip" "$pathFolder\*" > $null
}
# Change the extension to CBZ
$newName = [System.IO.Path]::ChangeExtension("$pathFolder.zip",".cbz")
Move-Item -Path "$pathFolder.zip" -Destination $newName -Force
# Tells the user this one is finished converting
Write-Host "--" -ForegroundColor DarkGray -NoNewline
Write-Host " $_.cbz"
}
}
# Tells the user it's finished
Write-Host "`r`nFinished`r`n" -ForegroundColor Green
# Pause to let us see the result
Pause
UPDATE 2
I made a GitHub project for this one. The URL is here:
https://github.com/PonyLucky/CBZ-Manga-Creator.
I have several zip files that contain multiple file types. The ONLY ones I am interested in are the .txt files. I need to extract just the .txt files and place them in a folder of their own, ignoring all the other file types in the zip files.
All the zip files are in the same folder.
Example
-foo.zip
--1.aaa
--2.bbb
--3.ccc
--4.txt
-foo2.zip
--5.aaa
--6.bbb
--7.ccc
--8.txt
I want 4.txt and 8.txt extracted to another folder. I can't for the life of me figure it out, and I have spent ages looking, googling and trying. I even managed to delete the zips once in a while :-)
Thanks in advance
Use the ZipArchive type to programmatically inspect the archive before extracting:
Add-Type -AssemblyName System.IO.Compression
$destination = "C:\destination\folder"
# Locate zip file
$zipFile = Get-Item C:\path\to\file.zip
# Open a read-only file stream
$zipFileStream = $zipFile.OpenRead()
# Instantiate ZipArchive
$zipArchive = [System.IO.Compression.ZipArchive]::new($zipFileStream)
# Iterate over all entries and pick the ones you like
foreach($entry in $zipArchive.Entries){
    if($entry.Name -like '*.txt'){
        # Create new file on disk, open writable stream
        $targetFileStream = $(
            New-Item -Path $destination -Name $entry.Name -ItemType File
        ).OpenWrite()
        # Open stream to the compressed entry, copy the decompressed bytes to the new file stream
        $entryStream = $entry.Open()
        $entryStream.CopyTo($targetFileStream)
        # Clean up
        $targetFileStream,$entryStream |ForEach-Object Dispose
    }
}
# Clean up
$zipArchive,$zipFileStream |ForEach-Object Dispose
Repeat for each zip file.
Note that the code above has very minimal error handling and is to be read as an example.
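If you prefer to let the framework handle the stream copying, here is a hedged sketch of the same idea that loops over every .zip in a folder and uses the ZipFileExtensions ExtractToFile helper instead (the source and destination paths are assumptions):
# Extract only the *.txt entries from every .zip in a folder
Add-Type -AssemblyName System.IO.Compression
Add-Type -AssemblyName System.IO.Compression.FileSystem
$destination = "C:\destination\folder"
foreach ($zip in Get-ChildItem -Path "C:\path\to\zips" -Filter *.zip) {
    $archive = [System.IO.Compression.ZipFile]::OpenRead($zip.FullName)
    try {
        foreach ($entry in $archive.Entries | Where-Object Name -like '*.txt') {
            # Third argument $true = overwrite an existing file with the same name
            [System.IO.Compression.ZipFileExtensions]::ExtractToFile($entry, (Join-Path $destination $entry.Name), $true)
        }
    }
    finally {
        $archive.Dispose()
    }
}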
Try this:
Set-Location "Extraction path"
#("full path of foo.zip","full path of foo2.zip") | ForEach {
& "Full path of 7z.exe" x '-i!*.txt' $_.FullName
}
Sets the location to the path where the files will be extracted.
Passes an array of zip file paths to the loop.
Executes the 7z command to extract only the .txt files.
Here is one approach:
Go through each .zip file in a folder.
Extract archive into separate folder.
Extract .txt file from folder.
Copy files into destination folder containing all .txt files. This will overwrite files if they already exist in the destination folder.
Cleanup extracted folders once finished.
Demo:
function Copy-ZipArchiveFiles {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [ValidateScript({
            if (-not(Test-Path $_ -PathType Container))
            {
                throw "The source path $_ does not exist. Please enter a valid source path."
            }
            else
            {
                $true
            }
        })]
        [string]$Path,

        [Parameter(Mandatory=$true)]
        [ValidateScript({
            if ([string]::IsNullOrEmpty($_.Trim()))
            {
                throw "The Destination path is null or empty. Please enter a valid destination path."
            }
            else
            {
                $true
            }
        })]
        [string]$Destination,

        [Parameter(Mandatory=$false)]
        [AllowNull()]
        [AllowEmptyString()]
        [AllowEmptyCollection()]
        [string[]]$Include
    )

    # Create destination folder if it doesn't already exist
    if (-not(Test-Path -Path $Destination -PathType Container))
    {
        try
        {
            New-Item -Path $Destination -ItemType Directory -ErrorAction Stop
        }
        catch
        {
            throw "The destination path $Destination is invalid. Please enter a valid destination path."
        }
    }

    # Go through each .zip file
    foreach ($zipFile in Get-ChildItem -Path $Path -Filter *.zip)
    {
        # Get folder name from zip file w/o .zip at the end (-LeafBase requires PowerShell 6+)
        $zipFolder = Split-Path $zipFile -LeafBase
        # Get full folder path
        $folderPath = Join-Path -Path $Path -ChildPath $zipFolder
        # Expand .zip file into folder if it doesn't exist
        if (-not(Test-Path -Path $folderPath -PathType Container))
        {
            Expand-Archive -Path $zipFile.FullName -DestinationPath $folderPath
        }
        # Copy files into destination folder
        foreach ($file in Get-ChildItem $folderPath -Include $Include -Recurse)
        {
            Copy-Item -Path $file.FullName -Destination $Destination
        }
        # Delete extracted folders
        Remove-Item -Path $folderPath -Recurse -Force
    }
}
Usage:
Copy-ZipArchiveFiles `
-Path "C:\path\to\zip\files" `
-Destination "C:\path\to\text\files" `
-Include "*.txt"
Note: You could also use this for multiple extension types by passing -Include *.txt, *.pdf. I went a bit overboard on the parameter error checking, but I believe in writing robust code. It's a good habit to get into when writing your own cmdlets anyway :)
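For example (hypothetical paths):
Copy-ZipArchiveFiles `
-Path "C:\path\to\zip\files" `
-Destination "C:\path\to\extracted\files" `
-Include "*.txt", "*.pdf"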
I have a main folder containing other folders with .mkv files in them. These folders have a certain format.
Folder = NameSerie.SxEy.Randomstuf
Item in folder = NameSerie.SxEy.Randomstuf.mkv
Where x is the season number and y is the episode number.
What I want to do is automatically create a folder if the NameSerie folder doesn't already exist, and put the .mkv files in this folder.
So, if we have folders named NameSerie.SxEy.Randomstuf, we check whether the folder NameSerie exists; if not, we create it. Then we enter the folder NameSerie.SxEy.Randomstuf and copy the NameSerie.SxEy.Randomstuf.mkv file into the NameSerie folder.
The file name also needs to change from NameSerie.SxEy.Ra.n[dom}stuf.mkv to NameSerie.SxEy.mkv, but I can't seem to figure out how to remove the random stuff after the NameSerie.SxEy. part.
This is the code that I have managed to create, but I'm still stuck. I can create a folder if one does not exist, but this only works if the .mkv file is not inside a folder.
$Location = "\\<ip>\Share\Media\Series"
#rename files
Get-ChildItem $Location | Rename-Item -NewName {$_.Name.Replace(" [480p]","") }
#make folder for serie if it does not exist
Get-ChildItem "$Location\*.mkv" |
Foreach-Object {
    $FullName = $_.Name
    $pos = $FullName.IndexOf(" - ")
    $Name = $FullName.Substring(0, $pos)
    Write-Host $_.FullName
    $TARGETDIR = "$Location\$Name"
    if( -Not (Test-Path -Path $TARGETDIR ) )
    {
        New-Item -ItemType directory -Path $TARGETDIR
    }
    Move-Item -Path $_.FullName -Destination $TARGETDIR
}
You could use the parameter -Recurse when you list files from the path like this:
Get-ChildItem "$Location\*.mkv" -Recurse
instead of:
Get-ChildItem "$Location\*.mkv"
With the -Recurse parameter, the script lists the files in the given directory and then does the same recursively for every subdirectory, so the .mkv files inside other folders no longer stay hidden from it.
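For the renaming part of the question (stripping the random part after NameSerie.SxEy), a hedged sketch that builds on -Recurse could look like this; the regex assumes names of the form NameSerie.SxEy.Whatever.mkv:
Get-ChildItem "$Location\*.mkv" -Recurse | Foreach-Object {
    # Keep only the "NameSerie.SxEy" prefix; everything after it is dropped
    if ($_.BaseName -match '^(?<serie>.+?)\.S\d+E\d+') {
        $TARGETDIR = Join-Path $Location $Matches['serie']
        if (-not (Test-Path -Path $TARGETDIR)) {
            New-Item -ItemType Directory -Path $TARGETDIR | Out-Null
        }
        $newName = ($_.BaseName -replace '^(.+?\.S\d+E\d+).*', '$1') + $_.Extension
        Move-Item -LiteralPath $_.FullName -Destination (Join-Path $TARGETDIR $newName)
    }
}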
I wrote the below PowerShell script to compress logs older than 30 days:
$LastWrite=(get-date).AddDays(-30).ToString("MM/dd/yyyy")
Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
Now, I am unable to find a compression command in PowerShell with which I can compress (zip/tar) the server.log* files older than 30 days.
I'm expecting a single command that I can append to the command above with a pipe.
You can use the Compress-Archive cmdlet to zip files if you have PowerShell version 5 or above:
$LastWrite = (get-date).AddDays(-30)
$Files = Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
ForEach ($File in $Files) {
$File | Compress-Archive -DestinationPath "$($File.fullname).zip"
}
If you have an older version of Powershell you can use ZipFileExtensions' CreateEntryFromFile method, but there are a lot of considerations if you want a robust script that runs unattended.
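As a point of reference, here is a minimal, hedged sketch of that approach (paths and entry names are placeholders, with none of the safeguards discussed below); the full, more defensive script follows further down:
Add-Type -AssemblyName System.IO.Compression
Add-Type -AssemblyName System.IO.Compression.FileSystem
# Open (or create) the archive in Update mode and add a single file to it
$zip = [System.IO.Compression.ZipFile]::Open("C:\logs\archive.zip", [System.IO.Compression.ZipArchiveMode]::Update)
try {
    [void][System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($zip, "C:\logs\server.log.1", "server.log.1", [System.IO.Compression.CompressionLevel]::Optimal)
}
finally {
    $zip.Dispose()   # Dispose() finalizes the zip so it can be read later
}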
In months of testing a script developed for this purpose, I encountered some issues that have made this small problem more complicated:
Will any of the files be locked? CreateEntryFromFile may fail if so.
Did you know that you can have multiple copies of the same file in a Zip archive? It's harder to extract them because you can't put them in the same folder. My script checks the file path and the archived file time stamp (+/- 2 seconds due to the lost date precision in Zip format) to determine if it's been already archived, and doesn't create a duplicate.
Are the files created in a time zone with Daylight Savings? Zip format doesn't preserve that attribute, and may lose or gain an hour when uncompressed.
Do you want to delete the original if it was successfully archived?
If unsuccessful due to a locked/missing file or very long path, should the process continue?
Will any error leave you with an unusable zip file? You need to Dispose() the archive to finalize it.
How many archives do you want to keep? I prefer one per run-month, adding new entries to an existing zip.
Do you want to preserve the relative path? Doing so will partially eliminate the problem of duplicates inside the zip file.
Mark Wragg's script should work if you don't care about these issues and you have Powershell 5, but it creates a zip for every log, which may not be what you want.
Here's the current version of the script - in case GitHub ever becomes unavailable:
#Sends $FileSpecs files to a zip archive if they match $Filter - deleting the original if $DeleteAfterArchiving is true.
#Files that have already been archived will be ignored.
param (
[string] $ParentFolder = "$PSScriptRoot", #Files will be stored in the zip with path relative to this folder
[string[]] $FileSpecs = @("*.log","*.txt","*.svclog","*.log.*"),
$Filter = { $_.LastWriteTime -lt (Get-Date).AddDays(-7)}, #a Where-Object function - default = older than 7 days
[string] $ZipPath = "$PSScriptRoot\archive-$(get-date -f yyyy-MM).zip", #create one archive per run-month - it may contain older files
[System.IO.Compression.CompressionLevel]$CompressionLevel = [System.IO.Compression.CompressionLevel]::Optimal,
[switch] $DeleteAfterArchiving = $true,
[switch] $Verbose = $true,
[switch] $Recurse = $true
)
@( 'System.IO.Compression','System.IO.Compression.FileSystem') | % { [void][System.Reflection.Assembly]::LoadWithPartialName($_) }
Push-Location $ParentFolder #change to the folder so we can get relative path
$FileList = (Get-ChildItem $FileSpecs -File -Recurse:$Recurse | Where-Object $Filter) #CreateEntryFromFile raises UnauthorizedAccessException if item is a directory
$totalcount = $FileList.Count
$countdown = $totalcount
$skipped = @()
Try{
$WriteArchive = [IO.Compression.ZipFile]::Open( $ZipPath, [System.IO.Compression.ZipArchiveMode]::Update)
ForEach ($File in $FileList){
Write-Progress -Activity "Archiving files" -Status "Archiving file $($totalcount - $countdown) of $totalcount : $($File.Name)" -PercentComplete (($totalcount - $countdown)/$totalcount * 100)
$ArchivedFile = $null
$RelativePath = (Resolve-Path -LiteralPath "$($File.FullName)" -Relative) -replace '^.\\'
$AlreadyArchivedFile = ($WriteArchive.Entries | Where-Object {#zip will store multiple copies of the exact same file - prevent this by checking if already archived.
(($_.FullName -eq $RelativePath) -and ($_.Length -eq $File.Length) ) -and
([math]::Abs(($_.LastWriteTime.UtcDateTime - $File.LastWriteTimeUtc).Seconds) -le 2) #ZipFileExtensions timestamps are only precise within 2 seconds.
})
If($AlreadyArchivedFile -eq $null){
If($Verbose){Write-Host "Archiving $RelativePath $($File.LastWriteTimeUtc.ToString("yyyyMMdd-HHmmss")) $($File.Length)" }
Try{
$ArchivedFile = [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($WriteArchive, $File.FullName, $RelativePath, $CompressionLevel)
}Catch{
Write-Warning "$($File.FullName) could not be archived. `n $($_.Exception.Message)"
$skipped += [psobject]@{Path=$file.FullName; Reason=$_.Exception.Message}
}
If($File.LastWriteTime.IsDaylightSavingTime() -and $ArchivedFile){#HACK: fix for buggy date - adds an hour inside archive when the zipped file was created during PDT (files created during PST are not affected). Not sure how to introduce DST attribute to file date in the archive.
$entry = $WriteArchive.GetEntry($RelativePath)
$entry.LastWriteTime = ($File.LastWriteTime.ToLocalTime() - (New-TimeSpan -Hours 1)) #TODO: This is better, but maybe not fully correct. Does it work in all time zones?
}
}Else{#Write-Warning "$($File.FullName) is already archived$(If($DeleteAfterArchiving){' and will be deleted.'}Else{'. No action taken.'})"
Write-Warning "$($File.FullName) is already archived - No action taken."
$skipped += [psobject]@{Path=$file.FullName; Reason="Already archived"}
}
If((($ArchivedFile -ne $null) -and ($ArchivedFile.FullName -eq $RelativePath)) -and $DeleteAfterArchiving) { #delete original if it's been successfully archived.
Try {
Remove-Item $File.FullName -Verbose:$Verbose
}Catch{
Write-Warning "$($File.FullName) could not be deleted. `n $($_.Exception.Message)"
}
}
$countdown = $countdown -1
}
}Catch [Exception]{
Write-Error $_.Exception
}Finally{
$WriteArchive.Dispose() #close the zip file so it can be read later
Write-Host "Sent $($totalcount - $countdown - $($skipped.Count)) of $totalcount files to archive: $ZipPath"
$skipped | Format-Table -Autosize -Wrap
}
Pop-Location
Here's a command line that will compress all server.log* files older than 30 days under the current folder:
.\ArchiveOldLogs.ps1 -FileSpecs @("server.log*") -Filter { $_.LastWriteTime -lt (Get-Date).AddDays(-30)}
I have a PowerShell script that looks for log files, compresses them and creates a new .7z file; however, it currently has trouble with folders whose names contain dates, e.g. application_logs_2016-07-14.
1: The script works fine for folders with plain character names; however, if the containing folder has a date in its name, e.g. application_logs_2016-07-14, nothing is archived.
2: I need to zip log files that are older than 5 days; dump.log.341.log, dump.log.342.log and dump.log.343.log should become dump.log.341.zip, dump.log.342.zip and dump.log.343.zip.
Here is the current code; if any PowerShell gurus could advise, I'd be very happy. Thanks in advance.
# See if 7 zip is installed
if (-not (test-path "$env:ProgramFiles\7-Zip\7z.exe")) {throw "$env:ProgramFiles\7-Zip\7z.exe needed"}
set-alias sz "$env:ProgramFiles\7-Zip\7z.exe"
# Define log location and target directory
$Logfile = "D:\stuff\Software\dump files\Archive.log"
$newdirectory = "D:\stuff\compressedlogs"
# Write to log file - start of archive
$now = Get-date -Format "yyyy-MM-dd HH:mm:ss"
Add-Content $Logfile -value "Archive started $now"
# Import what we want to back up from a list file
$List = Import-CSV "D:\stuff\Software\dump files\Compressionlist.txt"
ForEach ($Entry in $List){
    $filepath = $($Entry.FilePath)
    $Extension = $($Entry.Extension)
    $Include = $($Entry.Include)
    $Horizon = $($Entry.Horizon)
    $Ext2Chg = $($Entry.Ext2Chg)
    # Extract List of Files to process
    $log = Get-ChildItem -Recurse -Path $filepath -Filter $Extension -Include $Include | Where-Object {$_.lastwriteTime -lt (((Get-Date).AddDays($Horizon)).date)}
    # Archive each file found
    ForEach ($file in $log) {
        if ($file -ne $null) {
            $name = $file.name
            $newdate = $file.CreationTime | Get-Date -f "yyyy-MM-dd"
            $newname = $file.BaseName + "___" + $newdate + $file.Extension
            $directory = $file.DirectoryName
            $zipfile = $newname.replace($Ext2Chg,".7z")
            sz a -t7z "$newdirectory\$zipfile" "$directory\$name"
            $now = Get-date -Format "yyyy-MM-dd HH:mm:ss"
            Add-Content $Logfile -value "File $directory\$name archived to folder\new filename $newdirectory\$newname at $now"
            Remove-Item $file
        }
    }
}
# Write to log file - end of archive
$now = Get-date -Format "yyyy-MM-dd HH:mm:ss"
Add-Content $Logfile -value "Archive completed $now"
# End
The script looks at a txt document to find what to archive
COMPRESSION LIST
Filepath,Extension,Include,Horizon,Ext2Chg
E:\APPLICATION\DUMP_logs_*,*.log,APP.dumps-currentlog.messages*,-5,.log
===============================================================
Example folder structure:
D:\application\server_log
(which contains a log, e.g. server_log_2016-07-14_00-00-00_0001.log); this archives fine.
D:\application\application_log_2016-07-14
(which contains a log, e.g. APP.dumps-currentlog.messages.log); this will NOT archive.
Hope that makes sense.