Windows PowerShell has been out for quite a long time now. Compared to the good old Windows shell, it's much more powerful.
Are there any scripts you use to speed up and simplify your everyday work as a developer? If you can do magic with PowerShell, please share it with us!
Update
Not really a script, but the PowerShell Community Extensions are also very useful. The package contains a lot of new cmdlets and PowerShell modifications.
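If you want to try them, something along these lines should get you started (a sketch; it assumes the extensions are published as the Pscx module on the PowerShell Gallery):
Install-Module -Name Pscx -Scope CurrentUser   # module name assumed
Import-Module Pscx
Get-Command -Module Pscx                       # list the cmdlets the package adds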
I put together a bunch of scripts to work with Subversion at the command line. Most of them just use the --xml option to put various information in object form. Here are a couple of examples:
function Get-SvnStatus( [string[]] $Path = ".",
[string] $Filter = "^(?!unversioned|normal|external)",
[switch] $NoFormat )
{
# powershell chokes on "wc-status" and doesn't like two definitions of "item"
[xml]$status = ( ( Invoke-Expression "svn status $( $Path -join ',' ) --xml" ) -replace "wc-status", "svnstatus" ) `
-replace "item=", "itemstatus="
$statusObjects = $status.status.target | Foreach-Object { $_.entry } | Where-Object {
$_.svnstatus.itemstatus -match $Filter
} | Foreach-Object {
$_ | Select-Object @{ Name = "Status"; Expression = { $_.svnstatus.itemstatus } },
@{ Name = "Path"; Expression = { Join-Path ( Get-Location ) $_.path } }
} | Sort-Object Status, Path
if ( $NoFormat )
{
$statusObjects
}
else
{
$statusObjects | Format-Table -AutoSize
}
}
function Get-SvnLog( [string] $Path = ".",
[int] $Revision,
[int] $Limit = -1,
[switch] $Verbose,
[switch] $NoFormat )
{
$revisionString = ""
$limitString = ""
$verboseString = ""
if ( $Revision )
{
$revisionString = "--revision $Revision"
}
if ( $Limit -ne -1 )
{
$limitString = "--limit $Limit"
}
if ( $Verbose )
{
$verboseString = "--verbose"
}
[xml]$log = Invoke-Expression "svn log $( $path -join ',' ) --xml $revisionString $limitString $verboseString"
$logObjects = $log.log.logentry | Foreach-Object {
$logEntry = $_
$logEntry | Select-Object `
#{ Name = "Revision"; Expression = { [int]$logEntry.revision } },
#{ Name = "Author"; Expression = { $logEntry.author } },
#{ Name = "Date";
Expression = {
if ( $NoFormat )
{
[datetime]$logEntry.date
}
else
{
"{0:dd/MM/yyyy hh:mm:ss}" -f [datetime]$logEntry.date
}
} },
#{ Name = "Message"; Expression = { $logEntry.msg } } |
Foreach-Object {
# add the changed path information if the $Verbose parameter has been specified
if ( $Verbose )
{
$_ | Select-Object Revision, Author, Date, Message,
#{ Name = "ChangedPaths";
Expression = {
$paths = $logEntry.paths.path | Foreach-Object {
$_ | Select-Object `
#{ Name = "Change";
Expression = {
switch ( $_.action )
{
"A" { "added" }
"D" { "deleted" }
"M" { "modified" }
"R" { "replaced" }
default { $_.action }
}
} },
#{ Name = "Path"; Expression = { $_."#text" } }
}
if ( $NoFormat )
{
$paths
}
else
{
( $paths | Sort-Object Change | Format-Table -AutoSize | Out-String ).Trim()
}
}
}
}
else
{
$_
}
}
}
if ( $NoFormat )
{
$logObjects
}
else
{
$logObjects | Format-List
}
}
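With both functions loaded, typical calls look something like this (purely illustrative invocations of the functions above):
# raw objects (no Format-Table) so the result can be piped or filtered further
Get-SvnStatus -NoFormat | Where-Object { $_.Status -eq "modified" }
# the last 5 log entries, including the changed paths
Get-SvnLog -Limit 5 -Verbose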
I have these aliased to svns and svnl, respectively. I talk about a few others here.
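To get the same shortcuts, something like this in your profile will do (a small sketch; the alias names are the ones mentioned above):
Set-Alias -Name svns -Value Get-SvnStatus
Set-Alias -Name svnl -Value Get-SvnLog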
I use this one all the time because Windows Explorer's search for file contents never works for me:
Get-ChildItem -Recurse -Filter *.extension |
Select-String -List somestring |
Format-Table filename,linenumber -AutoSize
Just replace "extension" with the file extension of the file type you're interested in (or remove the -Filter parameter entirely) and replace "somestring" with the text you want to find in the file.
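For instance, to find every C# file under the current directory that mentions TODO (the extension and search string here are purely illustrative):
Get-ChildItem -Recurse -Filter *.cs |
Select-String -List TODO |
Format-Table Filename, LineNumber -AutoSize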
It's not a script, but in general it's helpful to learn when you can short-cut parameters, both by name and position.
By name, PowerShell just needs enough to narrow it down to one. For example, gci -r works but gci -f might be either -filter or -force.
Values specified without a parameter label are applied positionally. So if you want to specify -filter you could either do this:
gci -r -fil *.cs
Or provide . positionally as -path so you can also specify -filter positionally:
gci -r . *.cs
Any time you see something with proper capitalization, it's an indication I've used TAB completion. You should learn which things PS will complete for you -- it's quite good in V2.
Any time you see aliases in lowercase, it's something I typed from memory. You should memorize it too.
# grep example - find all using statements
dir -r -fil *cs | ss using
# advanced version
dir -fil *cs -r | ss '^using[^\(]+' | gpv line | sort -unique
# figure out how to query for drive free space (emphasis on "figure out" -- I can never remember things like this)
gcm *drive*
help Get-PSDrive -full
Get-PSDrive | gm
# now use it
Get-PSDrive | ? { $_.free -gt 1gb }
# pretend mscorlib.dll is an assembly you're developing and want to do some ad-hoc testing on
$system = [system.reflection.assembly]::LoadFile("c:\blah\...\mscorlib.dll")
$system | gm
$types = $system.GetTypes()
$types | gm
$types | ? { $_.ispublic -and $_.basetype -eq [system.object] } | sort name
$sbType = $types | ? { $_.name -eq "StringBuilder" }
# now that we've loaded the assembly, we could have also done:
# $sbType = [system.text.stringbuilder]
# but we may not have known it was in the Text namespace
$sb = new-object $sbType.FullName
$sb | gm
$sb.Append("asdf")
$sb.Append("jkl;")
$sb.ToString()
I am attempting to extract the date last modified from the files in a Windows directory. Here is my basic script:
Function Get-FolderItem {
[cmdletbinding(DefaultParameterSetName='Filter')]
Param (
[parameter(Position=0,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
[Alias('FullName')]
[string[]]$Path = $PWD,
[parameter(ParameterSetName='Filter')]
[string[]]$Filter = '*.*',
[parameter(ParameterSetName='Exclude')]
[string[]]$ExcludeFile,
[parameter()]
[int]$MaxAge,
[parameter()]
[int]$MinAge
)
Begin {
$params = New-Object System.Collections.Arraylist
$params.AddRange(@("/L","/E","/NJH","/NDL","/BYTES","/FP","/NC","/XJ","/R:0","/W:0","T:W","/TS","/UNILOG:c:\temp\test.txt"))
#$params.AddRange(@("/L","/S","/NJH","/BYTES","/FP","/NC","/NDL","/TS","/XJ","/R:0","/W:0"))
If ($PSBoundParameters['MaxAge']) {
$params.Add("/MaxAge:$MaxAge") | Out-Null
}
If ($PSBoundParameters['MinAge']) {
$params.Add("/MinAge:$MinAge") | Out-Null
}
}
Process {
ForEach ($item in $Path) {
Try {
$item = (Resolve-Path -LiteralPath $item -ErrorAction Stop).ProviderPath
If (-Not (Test-Path -LiteralPath $item -Type Container -ErrorAction Stop)) {
Write-Warning ("{0} is not a directory and will be skipped" -f $item)
Return
}
If ($PSBoundParameters['ExcludeFile']) {
$Script = "robocopy `"$item`" NULL $Filter $params /XF $($ExcludeFile -join ',')"
} Else {
$Script = "robocopy `"$item`" NULL $Filter $params"
}
Write-Verbose ("Scanning {0}" -f $item)
Invoke-Expression $Script | Out-Null
get-content "c:\temp\test.txt" | ForEach {
Try {
If ($_.Trim() -match "^(?<Children>\d+)\s(?<FullName>.*)") {
$object = New-Object PSObject -Property @{
FullName = $matches.FullName
Extension = $matches.fullname -replace '.*\.(.*)','$1'
FullPathLength = [int] $matches.FullName.Length
FileHash = Get-FileHash -LiteralPath "\\?\$($matches.FullName)" |Select -Expand Hash
Created = ([System.IO.FileInfo] $matches.FullName).creationtime
LastWriteTime = ([System.IO.FileInfo] $matches.FullName).LastWriteTime
Characters = (Get-Content -LiteralPath "\\?\$($matches.FullName)" | Measure-Object -ignorewhitespace -Character).Characters
Owner = (Get-ACL $matches.Fullname).Owner
}
$object.pstypenames.insert(0,'System.IO.RobocopyDirectoryInfo')
Write-Output $object
} Else {
Write-Verbose ("Not matched: {0}" -f $_)
}
} Catch {
Write-Warning ("{0}" -f $_.Exception.Message)
Return
}
}
} Catch {
Write-Warning ("{0}" -f $_.Exception.Message)
Return
}
}
}
}
$a = Get-FolderItem "C:\TargetDirectory\Folder" | Export-Csv -Path C:\Temp\output.csv -Encoding Unicode
The script extracts the date last modified for file paths shorter than 260 characters. It returns a nonsense date of 1600-12-31 4:00:00 PM for paths longer than 260 characters. Here is the line that is not working:
LastWriteTime = ([System.IO.FileInfo] $matches.FullName).LastWriteTime
My first attempt to solve this problem was to find a command that began with Get- because such commands were useful in extracting filehashes, filepaths, character counts and owner names of files longer than 260 characters. For example:
Owner = (Get-ACL $matches.Fullname).Owner
Characters = (Get-Content -LiteralPath "\\?\$($matches.FullName)" | Measure-Object -IgnoreWhiteSpace -Character).Characters
FileHash = Get-FileHash -LiteralPath "\\?\$($matches.FullName)" |Select -Expand Hash
Get-Date however seemed to be about getting the current date.
In my second attempt, I went back to Boe Prox's original blogpost on this script and noticed that his script had two components that were missing from mine:
a robocopy switch /TS
Date = [datetime]$matches.Date
I added these to my script; however, doing so returned an error: WARNING: Cannot convert null to type "System.DateTime". I rechecked the file in the directory, and it clearly has a date.
I reexamined the documentation on Get-Date and tried
Date = Get-Date -Format o | ForEach-Object { $matches -replace ":", "." }
However, this returned WARNING: Cannot convert value "2018/03/05 18:06:54 C:TargetDirectory\Folder\Temp.csv to type "System.IO.FileInfo". Error: " Illegal characters in path."
(N.B. In other posts, people have suggested changing the server settings to permit the existence of files longer than 260 characters. This is not an option for me because I do not have access to the servers.)
Once you hit 260 characters in the path, you hit the old Windows MAX_PATH limitation. In order to get around that, you have to prepend your path with \\?\.
In your code above, you do that for Characters and FileHash, but you don't do it when retrieving LastWriteTime. For example, changing the paths to this will work:
Created = ([System.IO.FileInfo] "\\?\$($matches.FullName)").creationtime
LastWriteTime = ([System.IO.FileInfo] "\\?\$($matches.FullName)").LastWriteTime
The alternative way is to use the Get-ChildItem cmdlet along with \\?\ prepended to the path to retrieve most of the fields you want without having to query it multiple times:
get-content "c:\temp\test.txt" | ForEach {
Try {
If ($_.Trim() -match "^(?<Children>\d+)\s(?<FullName>.*)") {
$file = Get-ChildItem "\\?\$($matches.FullName)"
$object = New-Object PSObject -Property @{
FullName = $file.FullName
Extension = $file.Extension
FullPathLength = $file.FullName.Length
FileHash = Get-FileHash -LiteralPath "\\?\$($matches.FullName)" |Select -Expand Hash
Created = $file.CreationTime
LastWriteTime = $file.LastWriteTime
Characters = (Get-Content -LiteralPath "\\?\$($matches.FullName)" | Measure-Object -ignorewhitespace -Character).Characters
Owner = (Get-ACL $matches.Fullname).Owner
}
$object.pstypenames.insert(0,'System.IO.RobocopyDirectoryInfo')
Write-Output $object
} Else {
Write-Verbose ("Not matched: {0}" -f $_)
}
} Catch {
Write-Warning ("{0}" -f $_.Exception.Message)
Return
}
}
I have a question:
I need to verify 3 registry keys on 20 PCs and export the result to a CSV file.
I used this command:
Get-ItemProperty -Path hklm:"\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon\" -Name "keyname" | Export-csv -path "csvpath"
and received all the values for that key, but I don't need to see PSPath, PSParentPath, PSChildName, PSDrive, PSProvider.
Now I was thinking of making a script with variables to simplify it, but at this point I would also like it to tell me if the key was not found, and ideally I want to be able to run it from the DC against all machines (about 20).
this could be a starting point
$key1 = name key 1
$key2 = name key 2
$key3 = name key 3
$hostname= hostname
$regkey = get-itemproperty -path etc....
and now I'm trying to work out the verification loop and how to export everything to CSV.
Thanks
To verify the key existence, use Test-Path.
Computer names and Key names as arrays of strings.
I have no experience with remoting; I think you'll be using Invoke-Command, but this should give you an idea of looping and getting all the non-PS properties:
$Computers = @'
Computer1
Computer2
Computer3
'@ -split '\n'
$KeyNames = @'
KeyName1
KeyName2
KeyName3
'@ -split '\n'
$( ForEach ( $Computer in $Computers ) {
ForEach ( $KeyName in $KeyNames ) {
If ( Test-Path $KeyName )
{
$AllProps = ($key = Get-Item $KeyName).Property
(Get-ItemProperty $key).PSObject.Properties | where Name -in $AllProps | select Name, Value
# << Create output >>
}
Else
{
"${Computer}: $KeyName not found."
}
}
} ) | Export-Csv "\\Path\to\CsvFile"
To probe multiple computers for 3 registry properties and output the result in a CSV file, you can use Invoke-Command like below:
$computers = 'pc01','pc02','pc03' # etc. the 20 computers you want to probe
$propertynames = 'property1','property2','property3' # you may use wildcards here
# loop over the computers
$result = foreach ($computer in $computers) {
if (!(Test-Connection -ComputerName $computer -Count 1 -Quiet)) {
Write-Warning "Computer '$computer' is not responding"
continue # skip this computer and proceed with the next
}
Invoke-Command -ComputerName $computer -ScriptBlock {
$regPath = "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"
# create a temporary Hashtable to store the items
$hash = [ordered]@{}
# loop over the properties
foreach ($prop in $using:propertynames) {
$entry = Get-ItemProperty -Path $regPath -Name $prop -ErrorAction SilentlyContinue
if ($entry) {
$hash['ComputerName'] = $using:computer
$entry = $entry | Select-Object * -ExcludeProperty PS*
# use a loop in case you have used wildcards for the property names
foreach ($item in $entry.PsObject.Properties) {
$hash[$item.Name] = $item.Value
}
}
else {
Write-Warning "Could not find property '$prop'"
}
}
if ($hash.Count) {
# output the hash converted to PSObject
[PsCustomObject]$hash
}
}
}
# remove the properties added by Invoke-Command
$result = $result | Select-Object * -ExcludeProperty PS*,RunspaceId
# output to gridview
$result | Out-GridView
# output to CSV file
$result | Export-Csv -Path 'X:\Path\To\TheResults.csv' -NoTypeInformation
I've been messing with this PowerShell script (I installed PowerShell on my macOS machine), and I also modified the code a bit in the first line.
I am not getting any errors; just nothing happens.
$folder = “/Users/mbp/Desktop/nier_unpacked_2_extracted“
$files = gci -recurse $folder | where { ! $_.PSIsContainer }
$fileContents = $files | foreach { gc -encoding utf8 $_.fullname }
$lines = $fileContents | foreach { if ($_ -match "^JP: (.*)$") { $matches[1] } }
$chars = $lines | foreach { $_.ToCharArray() }
$groups = $chars | group-object
$totals = $groups | sort-object -desc -property count
Basically, it should output Japanese text characters and how often they show up.
This is the original code(before modification):
$folder = "F:\nier_unpacked_2_extracted"
$files = gci -recurse $folder | where { ! $_.PSIsContainer }
$fileContents = $files | foreach { gc -encoding utf8 $_.fullname }
$lines = $fileContents | foreach { if ($_ -match "^JP: (.*)$") { $matches[1] } }
$chars = $lines | foreach { $_.ToCharArray() }
$groups = $chars | group-object
$totals = $groups | sort-object -desc -property count
Here is the link to the resource i got the code from if that helps: https://dev.to/nyctef/extracting-game-text-from-nier-automata-1gm0
I'm not sure why nothing is returned, unfortunately.
In PowerShell (as in most other programming languages), $totals = ... means that the result of the expression on the right side is assigned to the variable ($totals) on the left side.
To display the contents of the variable ($totals), you might use Write-Output $totals, Write-Host $totals, $totals | Out-Default, or a number of other output cmdlets.
Anyway, in PowerShell it is generally not necessary to use an output cmdlet at all, because anything you don't assign or pipe elsewhere is displayed by default. For example, just type the variable name and press Enter:
$totals
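So, to actually see the tally from the script above, you could finish with something like this (a sketch; Count and Name are the properties Group-Object emits):
# show the 20 most frequent characters
$totals | Select-Object -First 20 Count, Name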
I have a file with below lines
c:\scripts\oltp\db1\scripts\scripts1.sql
c:\scripts\oltp\db1\tables\scripts1.sql
c:\scripts\oltp\db1\storedprocedures\scripts1.sql
c:\scripts\oltp\db1\functions\scripts1.sql
c:\scripts\oltp\db1\tables\scripts2.sql
c:\scripts\oltp\db1\storedprocedures\scripts2.sql
I am looking for a PowerShell script that can sort based on the list below:
tables, storedprocedures, views, scripts, everything else
My expected output is
c:\scripts\oltp\db1\tables\scripts1.sql
c:\scripts\oltp\db1\tables\scripts2.sql
c:\scripts\oltp\db1\storedprocedures\scripts1.sql
c:\scripts\oltp\db1\storedprocedures\scripts2.sql
c:\scripts\oltp\db1\scripts\scripts1.sql
c:\scripts\oltp\db1\functions\scripts1.sql
Sort-Object -Property can take anonymous calculated properties as its argument. Put a switch inside the expression based on your sorting criteria:
function Test-CustomSort {
$ScriptNames = @(
'c:\scripts\oltp\db1\scripts\scripts1.sql'
'c:\scripts\oltp\db1\tables\scripts1.sql'
'c:\scripts\oltp\db1\storedprocedures\scripts1.sql'
'c:\scripts\oltp\db1\functions\scripts1.sql'
'c:\scripts\oltp\db1\tables\scripts2.sql'
'c:\scripts\oltp\db1\storedprocedures\scripts2.sql'
)
$ScriptNames | Sort-Object @{Expression={
switch(Split-Path -Parent $_ | Split-Path -Leaf){
"tables" { 1 }
"storedprocedures" { 2 }
"views" { 3 }
"scripts" { 4 }
default { 5 }
}
}}
}
Produces:
PS C:\> Test-CustomSort
c:\scripts\oltp\db1\tables\scripts1.sql
c:\scripts\oltp\db1\tables\scripts2.sql
c:\scripts\oltp\db1\storedprocedures\scripts1.sql
c:\scripts\oltp\db1\storedprocedures\scripts2.sql
c:\scripts\oltp\db1\scripts\scripts1.sql
c:\scripts\oltp\db1\functions\scripts1.sql
You could group the paths by directory, put them into a hashtable, then output the hashtable in the desired order.
$categories = 'tables', 'storedprocedures', 'views', 'scripts', 'functions'
$ht = @{}
Get-Content 'C:\path\to\your.txt' |
Group-Object { Split-Path -Parent $_ | Split-Path -Leaf } |
ForEach-Object { $ht[$_.Name] = $_.Group }
$categories | ForEach-Object {
$ht[$_] | Sort-Object
}
If the path prefix is always the same (c:\scripts\oltp\db1\):
(gc .\list.txt) | %{$_ -replace 'c:\\scripts\\oltp\\db1\\', ''} | Sort
Output
functions\scripts1.sql
scripts\scripts1.sql
storedprocedures\scripts1.sql
storedprocedures\scripts2.sql
tables\scripts1.sql
tables\scripts2.sql
I have a file containing a large number of occurrences of the string Guid="GUID HERE" (where GUID HERE is a unique GUID at each occurrence) and I want to replace every existing GUID with a new unique GUID.
This is on a Windows development machine, so I can generate unique GUIDs with uuidgen.exe (which produces a GUID on stdout every time it is run). I have sed and such available (but no awk oddly enough).
I am basically trying to figure out if it is possible (and if so, how) to use the output of a command-line program as the replacement text in a sed substitution expression so that I can make this replacement with a minimum of effort on my part. I don't need to use sed -- if there's another way to do it, such as some crazy vim-fu or some other program, that would work as well -- but I'd prefer solutions that utilize a minimal set of *nix programs since I'm not really on *nix machines.
To be clear, if I have a file like this:
etc etc Guid="A" etc etc Guid="B"
I would like it to become this:
etc etc Guid="C" etc etc Guid="D"
where A, B, C, D are actual GUIDs, of course.
(for example, I have seen xargs used for things similar to this, but it's not available on the machines I need this to run on, either. I could install it if it's really the only way, although I'd rather not)
I rewrote the C# solution in PowerShell. I figured it would be easier for you to run a PowerShell script than to compile a C# exe.
Steps for using this:
Download/install powershell
Save the code below somewhere, named GuidSwap.ps1
Modify the $filename and $outputFilename variables to suit your needs
Run powershell -noexit c:\location\to\guidswap.ps1
## GuidSwap.ps1
##
## Reads a file, finds any GUIDs in the file, and swaps them for a NewGUID
##
$filename = "d:\test.txt"
$outputFilename = "d:\test_new.txt"
$text = [string]::join([environment]::newline, (get-content -path $filename))
$sbNew = new-object system.text.stringBuilder
$pattern = "[a-fA-F0-9]{8}-([a-fA-F0-9]{4}-){3}[a-fA-F0-9]{12}"
$lastStart = 0
$null = ([regex]::matches($text, $pattern) | %{
$sbNew.Append($text.Substring($lastStart, $_.Index - $lastStart))
$guid = [system.guid]::newguid()
$sbNew.Append($guid)
$lastStart = $_.Index + $_.Length
})
$null = $sbNew.Append($text.Substring($lastStart))
$sbNew.ToString() | out-file -encoding ASCII $outputFilename
Write-Output "Done"
I was looking for a way to replace all GUIDs in a Visual Studio solution, so I took the answer to this StackOverflow question (GuidSwap.ps1) and extended it so that the script keeps track of GUIDs that are referenced across multiple files. An example is shown in the header below.
<#
.Synopsis
Replace all GUIDs in specified files under a root folder, carefully keeping track
of how GUIDs are referenced in different files (e.g. Visual Studio solution).
Loosely based on GuidSwap.ps1:
http://stackoverflow.com/questions/2201740/replacing-all-guids-in-a-file-with-new-guids-from-the-command-line
.NOTES
Version: 1.0
Author: Joe Zamora (blog.idmware.com)
Creation Date: 2016-03-01
Purpose/Change: Initial script development
.EXAMPLE
.\ReplaceGuids.ps1 "C:\Code\IDMware" -FileNamePatterns @("*.sln","*.csproj","*.cs") -Verbose -WhatIf
#>
# Add common parameters to the script.
[CmdletBinding()]
param(
$RootFolder
,$LogFolder='.'
,[String[]]$FileNamePatterns
,[switch]$WhatIf
)
$global:WhatIf = $WhatIf.IsPresent
# Change directory to the location of this script.
$scriptpath = $MyInvocation.MyCommand.Path
$dir = Split-Path $scriptpath
cd $dir
$ScriptName = $MyInvocation.MyCommand.Name
If(!($RootFolder))
{
Write-Host @"
Usage: $ScriptName -RootFolder <RootFolder> [Options]
Options:
-LogFolder <LogFolder> Defaults to location of script.
-FileNamePatterns @(*.ext1, *.ext2, ...) Defaults to all files (*).
-WhatIf Test run without replacements.
-Verbose Standard Powershell flags.
-Debug
"#
Exit
}
if ($LogFolder -and !(Test-Path "$LogFolder" -PathType Container))
{
Write-Host "No such folder: '$LogFolder'"
Exit
}
<#
.Synopsis
This code snippet gets all the files in $Path that contain the specified pattern.
Based on this sample:
http://www.adminarsenal.com/admin-arsenal-blog/powershell-searching-through-files-for-matching-strings/
#>
function Enumerate-FilesContainingPattern {
[CmdletBinding()]
param(
$Path=(throw 'Path cannot be empty.')
,$Pattern=(throw 'Pattern cannot be empty.')
,[String[]]$FileNamePatterns=$null
)
$PathArray = @()
if (!$FileNamePatterns) {
$FileNamePatterns = @("*")
}
ForEach ($FileNamePattern in $FileNamePatterns) {
Get-ChildItem $Path -Recurse -Filter $FileNamePattern |
Where-Object { $_.Attributes -ne "Directory"} |
ForEach-Object {
If (Get-Content $_.FullName | Select-String -Pattern $Pattern) {
$PathArray += $_.FullName
}
}
}
$PathArray
} <# function Enumerate-FilesContainingPattern #>
# Timestamps and performance.
$stopWatch = [System.Diagnostics.Stopwatch]::StartNew()
$startTime = Get-Date
Write-Verbose @"
--- SCRIPT BEGIN $ScriptName $startTime ---
"#
# Begin by finding all files under the root folder that contain a GUID pattern.
$GuidRegexPattern = "[a-fA-F0-9]{8}-([a-fA-F0-9]{4}-){3}[a-fA-F0-9]{12}"
$FileList = Enumerate-FilesContainingPattern $RootFolder $GuidRegexPattern $FileNamePatterns
$LogFilePrefix = "{0}-{1}" -f $ScriptName, $startTime.ToString("yyyy-MM-dd_HH-mm-ss")
$FileListLogFile = Join-Path $LogFolder "$LogFilePrefix-FileList.txt"
$FileList | ForEach-Object {$_ | Out-File $FileListLogFile -Append}
Write-Host "File list log file:`r`n$FileListLogFile"
cat $FileListLogFile | %{Write-Verbose $_}
# Next, do a read-only loop over the files and build a mapping table of old to new GUIDs.
$guidMap = @{}
foreach ($filePath in $FileList)
{
$text = [string]::join([environment]::newline, (get-content -path $filePath))
Foreach ($match in [regex]::matches($text, $GuidRegexPattern)) {
$oldGuid = $match.Value.ToUpper()
if (!$guidMap.ContainsKey($oldGuid)) {
$newGuid = [System.Guid]::newguid().ToString().ToUpper()
$guidMap[$oldGuid] = $newGuid
}
}
}
$GuidMapLogFile = Join-Path $LogFolder "$LogFilePrefix-GuidMap.csv"
"OldGuid,NewGuid" | Out-File $GuidMapLogFile
$guidMap.Keys | % { "$_,$($guidMap[$_])" | Out-File $GuidMapLogFile -Append }
Write-Host "GUID map log file:`r`n$GuidMapLogFile"
cat $GuidMapLogFile | %{Write-Verbose $_}
# Finally, do the search-and-replace.
foreach ($filePath in $FileList) {
Write-Verbose "Processing $filePath"
$newText = New-Object System.Text.StringBuilder
cat $filePath | % {
$original = $_
$new = $_
$isMatch = $false
$matches = [regex]::Matches($new, $GuidRegexPattern)
foreach ($match in $matches) {
$isMatch = $true
$new = $new -ireplace $match.Value, $guidMap[$match.Value.ToString().ToUpper()]
}
$newText.AppendLine($new) | Out-Null
if ($isMatch) {
$msg = "Old: $original`r`nNew: $new"
if ($global:WhatIf) {
Write-Host "What if:`r`n$msg"
} else {
Write-Verbose "`r`n$msg"
}
}
}
if (!$global:WhatIf) {
$newText.ToString() | Set-Content $filePath
}
}
# Timestamps and performance.
$endTime = Get-Date
Write-Verbose @"
--- SCRIPT END $ScriptName $endTime ---
Total elapsed: $($stopWatch.Elapsed)
"#
Would you be open to compiling a C# console app to do this? I whipped this up real quick. It takes a filename as a command line argument, finds anything that looks like a GUID, replaces it with a new GUID, and writes the new contents of the file.
Take a look:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Text.RegularExpressions;
namespace GUIDSwap
{
class Program
{
static int Main(string[] args)
{
try
{
if (args.Length == 0) throw new ApplicationException("No filename specified");
string filename = args[0];
filename = filename.TrimStart(new char[] { '"' }).TrimEnd(new char[] { '"' });
if (!File.Exists(filename)) throw new ApplicationException("File not found");
StreamReader sr = new StreamReader(filename);
string text = sr.ReadToEnd();
sr.Close();
StringBuilder sbNew = new StringBuilder();
string pattern = "[a-fA-F0-9]{8}-([a-fA-F0-9]{4}-){3}[a-fA-F0-9]{12}";
int lastStart = 0;
foreach (Match m in Regex.Matches(text, pattern))
{
sbNew.Append(text.Substring(lastStart, m.Index - lastStart));
sbNew.Append(Guid.NewGuid().ToString());
lastStart = m.Index + m.Length;
}
sbNew.Append(text.Substring(lastStart));
StreamWriter sw = new StreamWriter(filename, false);
sw.Write(sbNew.ToString());
sw.Flush();
sw.Close();
return 0;
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
return 1;
}
}
}
}
you can just capture the uid into a variable first, then do the sed?
@echo off
setlocal enabledelayedexpansion
for /f %%x in ('uuidgen.exe') do (
set uid=%%x
)
sed -e "s/Guid=\"\(.*\)\"/Guid=\"!uid!\"/g" file
I really like the solution by BigJoe714. I took it one step further, finding all files with a specific extension and replacing all GUIDs in each of them.
using System;
using System.IO;
using System.Linq;
using System.Text;
using System.Text.RegularExpressions;
namespace AllGuidSwap
{
class Program
{
static void Main(string[] args)
{
try
{
if (args.Length == 0) throw new ApplicationException("No filename specified");
string directory = args[0]; //Path
string extensionToFind = args[1]; //Extension to find
if (!Directory.Exists(directory)) throw new ApplicationException("directory not found");
var allFiles = Directory.GetFiles(directory).Where(a => a.EndsWith(extensionToFind));
foreach (var filename in allFiles)
{
if (!File.Exists(filename)) throw new ApplicationException("File not found");
var sr = new StreamReader(filename);
var text = sr.ReadToEnd();
sr.Close();
var sbNew = new StringBuilder();
var pattern = "[a-fA-F0-9]{8}-([a-fA-F0-9]{4}-){3}[a-fA-F0-9]{12}";
var lastStart = 0;
foreach (Match m in Regex.Matches(text, pattern))
{
sbNew.Append(text.Substring(lastStart, m.Index - lastStart));
sbNew.Append(Guid.NewGuid().ToString().ToUpperInvariant());
lastStart = m.Index + m.Length;
}
sbNew.Append(text.Substring(lastStart));
var sw = new StreamWriter(filename, false);
sw.Write(sbNew.ToString());
sw.Flush();
sw.Close();
}
Console.WriteLine("Successful");
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
Console.ReadKey();
}
}
}