PowerShell Find and Replace Script - Windows

The pattern match is returning $True, so I would think the test should proceed to run the replace, but instead it just sits there and never responds. What is wrong with my test that it is not replacing the last 3 lines with "NOW in the PRIMARY or SECONDARY role"?
The service.log file contains the following lines:
NOW in the PRIMARY or SECONDARY role
End of Job Run
NOW in the PRIMARY or SECONDARY role
End of Job Run
NOW in the PRIMARY or SECONDARY role
End of Job Run
NOW in the PRIMARY or SECONDARY role
End of Job Run
not in the PRIMARY or SECONDARY role
End of Job Run
not in the PRIMARY or SECONDARY role
End of Job Run
not in the PRIMARY or SECONDARY role
$Path = "D:\Program Files\test"
$File = "service.log"
$result = Get-ChildItem $Path -recurse | Where-Object { $_.Name -match $File }
$a = Get-Content $result.FullName -Tail 10 -Wait | Select-String -Pattern "not in the PRIMARY or SECONDARY role" -Quiet
if ($a -eq $True) {
    ((Get-Content $result.FullName -Raw) -Replace 'not in the PRIMARY or SECONDARY role', 'NOW in the PRIMARY or SECONDARY role') | Set-Content -Path "$Path\$File"
}
Working now -
$Directory = "D:\Program Files\test"
$Source = "service.log"
$result = Get-ChildItem $Directory -recurse -filter $Source
if (( Get-Content $result.FullName -Tail 10 | Select-string -Pattern "not in the PRIMARY or SECONDARY role" -Quiet ) ) {
((Get-Content -path $Directory\$Source -Raw) -Replace 'not in the PRIMARY or SECONDARY role', 'NOW in the PRIMARY or SECONDARY role') | Set-Content -Path $Directory\$Source
}

The reason it just sits there is the -Wait parameter. According to the documentation for Get-Content:
Keeps the file open after all existing lines have been output. While waiting, Get-Content checks the file once each second and outputs new lines if present. You can interrupt Wait by pressing CTRL+C. Waiting also ends if the file gets deleted, in which case a non-terminating error is reported.
Remove that and the script will complete as expected.
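-Wait is useful when you actually do want to follow a log live; a quick sketch:
Get-Content $result.FullName -Tail 10 -Wait | Select-String -Pattern "not in the PRIMARY or SECONDARY role"   # streams matches as new lines are appended, until CTRL+C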
Aside from that, you should filter files with the Get-ChildItem cmdlet itself rather than getting all files and then filtering with a Where-Object statement. It is much faster and good practice.
$Path = "D:\Program Files\test"
$File = "service.log"
$result = Get-ChildItem $Path -recurse -filter $File

Related

Pattern as an input in PowerShell

I am trying to write a script that compresses and deletes folders that are 'n' sublevels deep.
For example, the script below does the job for 3 sublevels.
$path = Read-Host "Enter the path"
$directory = $path +"\*\*\*"
Add-Type -AssemblyName System.IO.Compression.FileSystem
$folders = Get-ChildItem $directory -recurse | Where-Object {$_.PSIsContainer -eq $true} | Select-object -ExpandProperty FullName
foreach ($folder in $folders) {
    $archive = $folder + '.zip'
    Write-Verbose "Archiving $archive"
    # The final $True keeps the folder itself as the root entry inside the archive
    [System.IO.Compression.ZipFile]::CreateFromDirectory($folder, $archive, 'Optimal', $True)
    Remove-Item $folder -recurse -force -Verbose
}
The script is working fine. My question is: how do I take the sublevel count as an input value as well?
In the above script I give the path as an input; likewise, I wish to input the sublevel.
For example: Enter the level: 3 (this should build the pattern \*\*\*) or 4 (\*\*\*\*).
Any help?
PowerShell allows you to replicate strings with its * operator:
PS> $numLevels = 3; $path = 'C:\path\to'; $path + ('\*' * $numLevels)
C:\path\to\*\*\*
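Applied to the script above, the level can be read in and the wildcard path built from it; a sketch, keeping the original variable names:
$path = Read-Host "Enter the path"
[int]$numLevels = Read-Host "Enter the level"
$directory = $path + ('\*' * $numLevels)
Add-Type -AssemblyName System.IO.Compression.FileSystem
$folders = Get-ChildItem $directory -recurse | Where-Object {$_.PSIsContainer -eq $true} | Select-Object -ExpandProperty FullName
foreach ($folder in $folders) {
    $archive = $folder + '.zip'
    Write-Verbose "Archiving $archive"
    [System.IO.Compression.ZipFile]::CreateFromDirectory($folder, $archive, 'Optimal', $True)
    Remove-Item $folder -recurse -force -Verbose
}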

Copy the latest modified log files from one server to another server (the servers are in different domains)

I am trying to write a script that copies the latest modified log file from one server to another server (the servers are in different domains); while copying, it should check for the credentials and then execute.
Please let me know if the script is correct or what corrections need to be made.
$sourcePath = 'sourcepath'
$destPath = 'Destinationpath'
$compareDate = (Get-Date).AddDays(-1);
$LastFileCaptured = Get-ChildItem -Path $sourcePath |
where {$_.Extension.EndsWith('.log') -and $_.LastWriteTime -gt $compareDate } |
Sort LastAccessTime -Descending |
select -First 1 |
select -ExcludeProperty Name, LastAccessTime
Write-Host $LastFileCaptured.Name
$LastFileCaptured.LastAccessTime
$LastFileCaptured = Get-ChildItem -Recurse |
Where-Object{$_.LastWriteTime.AddDays(-1) -gt (Get-Date)}
Write-Host $LastFileCaptured
Get-ChildItem $sourcePath -Recurse -Include '*.log' | Where-Object {
$_.LastWriteTime.AddDays(-1).ToString("yyyy/MM/dd") -gt (Get-Date).ToString("yyyy/MM/dd")
} | ForEach-Object {
$destDir = Split-Path ($_.FullName -replace [regex]::Escape($sourcePath), $destPath)
if (!(Test-Path $destDir)) {
New-Item -ItemType directory $destDir | Out-Null
}
Copy-Item $_ -Destination $destDir
}
The "correctness" of your script is determined easily by running it! But, while this isn't a direct answer, I would suggest robocopy for this task.
In particular note these options:
/mon: Monitors the source, and runs again when more than N changes are detected.
/maxage: Specifies the maximum file age (to exclude files older than N days or date).
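A hedged example of what that could look like for this task (paths are placeholders; add /MON:1 if you want it to keep watching the source):
robocopy \\sourceserver\share\logs \\destserver\share\logs *.log /S /MAXAGE:1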

Getting ACL info using PowerShell and Robocopy due to PathTooLongException

I'm trying to get a listing of all permissions on some network folders using PowerShell. Unfortunately I'm encountering the dreaded PathTooLongException so I'm attempting to use Robocopy as a work around. However I'm a complete novice with PowerShell so was hoping for a little help. The easiest command I've come up with is
Get-Childitem "S:\StartingDir" -recurse | Get-Acl | Select-Object path,accestostring | Export-Csv "C:\export.csv"
That works and does what I want except the exception I'm getting. How would I insert Robocopy into this statement to bypass the exception? Any thoughts?
First, create a batch file, such as getShortFilename.bat, with the following lines:
@ECHO OFF
echo %~s1
This will return the short filename of the long filename passed to it. The following script will use that to get the short filename when Get-Acl fails due to a long path. It will then use the short path with cacls to return the permissions.
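# /L lists what would be copied without actually copying; /S recurses into subdirectories.
# /NJH /NJS /NDL /NS /NC suppress the job header, job summary, directory lines, sizes, and file classes, leaving bare file paths.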
$files = robocopy c:\temp NULL /L /S /NJH /NJS /NDL /NS /NC
remove-item "c:\temp\acls.txt" -ErrorAction SilentlyContinue
foreach($file in $files){
$filename = $file.Trim()
# Skip any blank lines
if($filename -eq ""){ continue }
Try{
Get-Acl "$filename" -ErrorAction Stop| Select-Object path, accesstostring | Export-Csv "C:\temp\acls.txt" -Append
}
Catch{
$shortName = &C:\temp\getShortFilename.bat "$filename"
$acls = &cacls $shortName
$acls = $acls -split '[\r\n]'
#create an object to hold the filename and ACLs so that export-csv will work with it
$outObject = new-object PSObject
[string]$aclString = ""   # accumulates the ACL entries for this file
$firstPass = $true
# Loop through the lines of the cacls.exe output
foreach($acl in $acls){
$trimmedAcl = $acl.Trim()
# Skip any blank lines
if($trimmedAcl -eq "" ){continue}
#The first entry has the filename and an ACL, so requires extra processing
if($firstPass -eq $true){
$firstPass = $false
# Add the long filename to $outObject
$outObject | add-member -MemberType NoteProperty -name "path" -Value "$filename"
# Add the first ACL to $aclString
$firstSpace = $trimmedAcl.IndexOf(" ") + 1
$aclString = $trimmedAcl.Substring($firstSpace, $trimmedAcl.Length - $firstSpace)
} else {
$aclString += " :: $trimmedAcl"
}
}
$outObject | add-member -MemberType NoteProperty -name "accesstostring" -Value "$aclString"
$outObject | Export-Csv "C:\temp\acls.txt" -Append
}
}
Notes:
The string of ACLs created by Get-Acl is different from the one created by cacls, so that may or may not be an issue for you.
If you want the ACL string to be in the same format for all files, you could just use cacls on all files, not just the ones with long filenames. It wouldn't be very difficult to modify this script accordingly.
You may want to add extra error checking. Get-Acl could of course fail for any number of reasons, and you may or may not want to run the catch block if it fails for some reason other than the path being too long.
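For instance, the catch could be narrowed so the cacls fallback only runs for long-path failures; a sketch, assuming Get-Acl surfaces System.IO.PathTooLongException in your environment (adjust the type to whatever is actually thrown):
Try{
    Get-Acl "$filename" -ErrorAction Stop | Select-Object path, accesstostring | Export-Csv "C:\temp\acls.txt" -Append
}
Catch [System.IO.PathTooLongException]{
    # run the short-filename/cacls fallback shown above
}
Catch{
    Throw   # let any other failure surface rather than silently falling back
}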

What's wrong with my PowerShell script?

I don't understand the error message I'm receiving when I try to run my PowerShell script. The purpose is to copy a .bat file into the main Windows 7 startup folder on a series of machines.
And here is the script I am running:
$ServerList = Get-Content "C:\ServersList.txt" #Change this to location of servers list
$SourceFileLocation = "C:\firefox_issue.bat" #For example: D:\FoldertoCopy\ or D:\file.txt
$Destination = "C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Startup" #Example: C$\temp
foreach ($_ in $ServerList)
{Copy-Item $SourceFileLocation -Destination \\$_\$Destination -Recurse -PassThru}
Write-Host "Press any key to continue ..."
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
Write-Host
Write-Host "A"
Write-Host "B"
Write-Host "C"
Because your location is getting set to:
\\SERVERNAME\C:\ProgramData...
and it should be:
\\SERVERNAME\C$\ProgramData...
Your destination needs to be:
$Destination = 'C$\ProgramData\Microsoft\Windows\Start Menu\Programs\Startup'
And your loop should be:
foreach($server in $serverList) {
Copy-Item $SourceFileLocation -Destination "\\$server\$Destination" -Recurse
}
You should probably avoid explicitly using $_ as a variable name, as $_ is a special variable for accessing the current object in the pipeline.
Did you read the comment behind the $Destination line?
This is a UNC path.
\\server1\c:\programdata\ is not a valid UNC-path. Try:
$Destination = "C$\ProgramData\Microsoft\Windows\Start Menu\Programs\Startup"
Also, $_ is a reserved variable for pipeline input, so you need to change it, like:
foreach ($server in $ServerList)
{Copy-Item $SourceFileLocation -Destination \\$server\$Destination -Recurse -PassThru}
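Putting both corrections together, a sketch of the whole script:
$ServerList = Get-Content "C:\ServersList.txt"
$SourceFileLocation = "C:\firefox_issue.bat"
$Destination = 'C$\ProgramData\Microsoft\Windows\Start Menu\Programs\Startup'
foreach ($server in $ServerList) {
    Copy-Item $SourceFileLocation -Destination "\\$server\$Destination" -PassThru
}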

Compare a log file of file paths to a directory structure and remove files not in log file

I have a file transfer/sync job that is copying files from the main network into a totally secure network using a custom protocol (i.e. no SMB). The problem is that, because I can't look back to see what files exist, the destination is filling up, as the copy doesn't remove any files it hasn't touched (like robocopy /MIR does).
Initially I wrote a script that:
1. Opens the log file and grabs the file paths out (this is quite quick and painless)
2. Does a Get-ChildItem on the destination folder (now using dir /s /b as it's way faster than gci)
3. Compares the two, and then removes the differences.
The problem is that there are more jobs that require this clean-up, but the log files are 100MB and the folders contain 600,000 files, so it's taking ages and using tons of memory. I actually have yet to see one finish. I'd really like some ideas on how to make this faster (memory/CPU use doesn't bother me too much, but speed is essential).
$destinationMatch = "//server/fileshare/folder/"
The log file contains some headers and footers and then 600,000 lines like this one:
"//server/fileshare/folder/dummy/deep/tags/20140826/more_stuff/Deeper/2012-07-02_2_0.dat_v2" 33296B 0B completed
Here's the script:
[CmdletBinding(SupportsShouldProcess=$True)]
param(
[Parameter(Mandatory=$True)]
[String]$logName,
[Parameter(Mandatory=$True)]
[String]$destinationMatch
)
$logPath = [string]("C:\Logs\" + $logName)
$manifestFile = gci -Path $logPath | where {$_.name -match "manifest"} | sort creationtime -descending | select Name -first 1
$manifestFileName = [string]$manifestFile.name
$manifestFullPath = $logPath + "\" + $manifestFileName
$copiedList = @()
(gc $manifestFullPath -ReadCount 0) | where {$_.trim() -match $DestinationMatch} | % {
if ( $_ -cmatch '(?<=")[^"]*(?=")' ){
$copiedList += ($matches[0]).replace("/","\")
}
}
$dest = $destinationMatch.replace("/","\")
$actualPathString = (gci -Path $dest -Recurse | select fullname).fullname
Compare-Object -ReferenceObject $copiedList -DifferenceObject $actualPathString -PassThru | % {
$leaf = Split-Path $_ -leaf
if ($leaf.contains(".")){
$fsoData = gci -Path $_
if (!($fsoData.PSIsContainer)){
Remove-Item $_ -Force
}
}
}
$actualDirectory | where {$_.PSIsContainer -and @(gci -LiteralPath $_.FullName -Recurse -WarningAction SilentlyContinue -ErrorAction SilentlyContinue | where {!$_.PSIsContainer}).Length -eq 0} | remove-item -Recurse -Force
Ok, so let's assume that your file copy preserves the last modified date/time stamp. If you really need to pull a directory listing, and compare it against a log, I think you're doing a decent job of it. The biggest slow down is obviously going to be pulling your directory listing. I'll address that shortly. For right now I would propose the following modification of your code:
[CmdletBinding(SupportsShouldProcess=$True)]
param(
[Parameter(Mandatory=$True)]
[String]$logName,
[Parameter(Mandatory=$True)]
[String]$destinationMatch
)
$logPath = [string]("C:\Logs\" + $logName)
$manifestFile = gci -Path $logPath | where {$_.name -match "manifest"} | sort creationtime -descending | select -first 1
$RegExPattern = [regex]::escape($DestinationMatch)
$FilteredManifest = gc $manifestfile.FullName | where {$_ -match "`"($RegexPattern[^`"]*)`""} |%{$matches[1] -replace '/','\'}
$dest = $destinationMatch.replace("/","\")
$DestFileList = gci -Path $dest -Recurse | select Fullname,Attributes
$DestFileList | Where{$FilteredManifest -notcontains $_.FullName -and $_.Attributes -notmatch "Directory"} | ForEach{Remove-Item -LiteralPath $_.FullName -Force}
$DestFileList | Where{$FilteredManifest -notcontains $_.FullName -and $_.Attributes -match "Directory" -and (gci -LiteralPath $_.FullName -Recurse -WarningAction SilentlyContinue -ErrorAction SilentlyContinue).Length -eq 0} | ForEach{Remove-Item -LiteralPath $_.FullName -Recurse -Force}
This stops you from duplicating efforts. There's no need to get your manifest file, and then assign different variables to different properties of the file object, just reference them directly. Then later when you pull your directory listing of the drive (the slow part here), keep the full name and attributes of the files/folders. That way you can easily filter against Attributes to see what's a directory and what not, so we can deal with files first, then clean up directories later after the files are cleaned up.
That script should be a bit more streamlined version of yours. Now, about pulling that directory listing... Here's the deal, using Get-ChildItem is going to be slower than some alternatives (such as dir /s /b) but it stops you from having to duplicate efforts by later checking what's a file, and what's a directory. I suppose if the actual files/folders that you are concerned with are a small percentage of the total, then the double work may actually be worth the time and effort to pull the list with something like dir /s /b, and then parse against the log, and only pull folder/file info for the specific items you need to address.
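If you do go that route, a minimal sketch of pulling the listing with dir (assumes the same $dest as in the script above):
# /b = bare full paths, /s = recurse; /a-d = files only, /ad = directories only
$DestFileList = cmd /c dir "$dest" /s /b /a-d
$DestDirList = cmd /c dir "$dest" /s /b /ad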
