I have been trying to create an output file that writes multiple EXECUTE statements, each taking a parameter from an array. So far I am getting jumbled, duplicated output. How can I get one EXECUTE command on each line? Here's what I have.
$myArray = @(1,2,3)
foreach ($element in $myArray) {
    $myobj = "EXECUTE [masterdb].[dbo].[update_rows] @row_num=" +"'"+$element+"'"+","+ "@status = 'Fail'"
    $myprocedure += $myobj
    $myobj = $null
}
Out-File -FilePath $path -InputObject $myprocedure -Width 50 -Force
$myprocedure is never initialized as an array, so it becomes a string that you simply keep appending text to. Either add a line break at the end of each EXECUTE line:
$myobj = "EXECUTE [masterdb].[dbo].[update_rows] #row_num=" +"'"+$element+"'"+","+ "#status = 'Fail'" + [System.Environment]::NewLine
Or create an empty array called $myprocedure first:
$myArray = @(1,2,3)
$myprocedure = @()
$path = "test.txt"
foreach ($element in $myArray) {
    $myobj = "EXECUTE [masterdb].[dbo].[update_rows] @row_num=" +"'"+$element+"'"+","+ "@status = 'Fail'"
    $myprocedure += $myobj
    $myobj = $null
}
Out-File -FilePath $path -InputObject $myprocedure -Width 50 -Force
Or append 3 times to the file:
$myArray = @(1,2,3)
$path = "test.txt"
# Remove-Item $path first if necessary
foreach ($element in $myArray) {
    "EXECUTE [masterdb].[dbo].[update_rows] @row_num=" +"'"+$element+"'"+","+ "@status = 'Fail'" | Out-File -FilePath $path -Width 50 -Force -Append
}
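As an aside (not part of the original answer), a shorter equivalent is to pipe the generated strings straight to Set-Content, which writes each incoming string on its own line:

$myArray = @(1,2,3)
$path = "test.txt"
# each iteration emits one string; Set-Content writes them one per line
$myArray | ForEach-Object {
    "EXECUTE [masterdb].[dbo].[update_rows] @row_num='$_',@status = 'Fail'"
} | Set-Content -Path $path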
I am attempting to extract the date last modified from the files in a Windows directory. Here is my basic script:
Function Get-FolderItem {
[cmdletbinding(DefaultParameterSetName='Filter')]
Param (
[parameter(Position=0,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
[Alias('FullName')]
[string[]]$Path = $PWD,
[parameter(ParameterSetName='Filter')]
[string[]]$Filter = '*.*',
[parameter(ParameterSetName='Exclude')]
[string[]]$ExcludeFile,
[parameter()]
[int]$MaxAge,
[parameter()]
[int]$MinAge
)
Begin {
$params = New-Object System.Collections.Arraylist
$params.AddRange(@("/L","/E","/NJH","/NDL","/BYTES","/FP","/NC","/XJ","/R:0","/W:0","T:W","/TS","/UNILOG:c:\temp\test.txt"))
#$params.AddRange(@("/L","/S","/NJH","/BYTES","/FP","/NC","/NDL","/TS","/XJ","/R:0","/W:0"))
If ($PSBoundParameters['MaxAge']) {
$params.Add("/MaxAge:$MaxAge") | Out-Null
}
If ($PSBoundParameters['MinAge']) {
$params.Add("/MinAge:$MinAge") | Out-Null
}
}
Process {
ForEach ($item in $Path) {
Try {
$item = (Resolve-Path -LiteralPath $item -ErrorAction Stop).ProviderPath
If (-Not (Test-Path -LiteralPath $item -Type Container -ErrorAction Stop)) {
Write-Warning ("{0} is not a directory and will be skipped" -f $item)
Return
}
If ($PSBoundParameters['ExcludeFile']) {
$Script = "robocopy `"$item`" NULL $Filter $params /XF $($ExcludeFile -join ',')"
} Else {
$Script = "robocopy `"$item`" NULL $Filter $params"
}
Write-Verbose ("Scanning {0}" -f $item)
Invoke-Expression $Script | Out-Null
get-content "c:\temp\test.txt" | ForEach {
Try {
If ($_.Trim() -match "^(?<Children>\d+)\s(?<FullName>.*)") {
$object = New-Object PSObject -Property @{
FullName = $matches.FullName
Extension = $matches.fullname -replace '.*\.(.*)','$1'
FullPathLength = [int] $matches.FullName.Length
FileHash = Get-FileHash -LiteralPath "\\?\$($matches.FullName)" |Select -Expand Hash
Created = ([System.IO.FileInfo] $matches.FullName).creationtime
LastWriteTime = ([System.IO.FileInfo] $matches.FullName).LastWriteTime
Characters = (Get-Content -LiteralPath "\\?\$($matches.FullName)" | Measure-Object -ignorewhitespace -Character).Characters
Owner = (Get-ACL $matches.Fullname).Owner
}
$object.pstypenames.insert(0,'System.IO.RobocopyDirectoryInfo')
Write-Output $object
} Else {
Write-Verbose ("Not matched: {0}" -f $_)
}
} Catch {
Write-Warning ("{0}" -f $_.Exception.Message)
Return
}
}
} Catch {
Write-Warning ("{0}" -f $_.Exception.Message)
Return
}
}
}
}
$a = Get-FolderItem "C:\TargetDirectory\Folder" | Export-Csv -Path C:\Temp\output.csv -Encoding Unicode
The script extracts the last-modified date for file paths shorter than 260 characters, but returns a nonsense date of 1600-12-31 4:00:00 PM for paths longer than 260 characters. Here is the line that is not working:
LastWriteTime = ([System.IO.FileInfo] $matches.FullName).LastWriteTime
My first attempt to solve this problem was to find a command beginning with Get-, because such commands were useful for extracting file hashes, file paths, character counts and owner names for paths longer than 260 characters. For example:
Owner = (Get-ACL $matches.Fullname).Owner
Characters = (Get-Content -LiteralPath "\\?\$($matches.FullName)" | Measure-Object -IgnoreWhiteSpace -Character).Characters
FileHash = Get-FileHash -LiteralPath "\\?\$($matches.FullName)" |Select -Expand Hash
Get-Date however seemed to be about getting the current date.
In my second attempt, I went back to Boe Prox's original blogpost on this script and noticed that his script had two components that were missing from mine:
a robocopy switch /TS
Date = [datetime]$matches.Date
I added these to my script; however, doing so returned an error: WARNING: Cannot convert null to type "System.DateTime". I rechecked the file in the directory, and it clearly has a date.
I reexamined the documentation on Get-Date and tried
Date = Get-Date -Format o | ForEach-Object { $matches -replace ":", "." }
However, this returned WARNING: Cannot convert value "2018/03/05 18:06:54 C:TargetDirectory\Folder\Temp.csv to type "System.IO.FileInfo". Error: " Illegal characters in path."
(N.B. In other posts, people have suggested changing the server settings to permit the existence of files longer than 260 characters. This is not an option for me because I do not have access to the servers.)
Once you hit 260 characters in the path, you hit the old Windows MAX_PATH limitation. In order to get around that, you have to prepend your path with \\?\.
In your code above, you do that for Characters and FileHash, but you don't do it when retrieving LastWriteTime. Changing the path like this will work:
Created = ([System.IO.FileInfo] "\\?\$($matches.FullName)").creationtime
LastWriteTime = ([System.IO.FileInfo] "\\?\$($matches.FullName)").LastWriteTime
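A quick way to see the difference on a single over-long path; a sketch, where $longPath is a hypothetical placeholder for one of the paths from the robocopy log:

$longPath = 'C:\TargetDirectory\Folder\SomeVeryLongSubFolder\SomeVeryLongFileName.csv'   # hypothetical placeholder
([System.IO.FileInfo] $longPath).LastWriteTime          # yields 1600-12-31 4:00:00 PM once the path exceeds MAX_PATH
([System.IO.FileInfo] "\\?\$longPath").LastWriteTime    # yields the real last-modified date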
The alternative way is to use the Get-ChildItem cmdlet along with \\?\ prepended to the path to retrieve most of the fields you want without having to query it multiple times:
get-content "c:\temp\test.txt" | ForEach {
Try {
If ($_.Trim() -match "^(?<Children>\d+)\s(?<FullName>.*)") {
$file = Get-ChildItem "\\?\$($matches.FullName)"
$object = New-Object PSObject -Property #{
FullName = $file.FullName
Extension = $file.Extension
FullPathLength = $file.FullName.Length
FileHash = Get-FileHash -LiteralPath "\\?\$($matches.FullName)" |Select -Expand Hash
Created = $file.CreationTime
LastWriteTime = $file.LastWriteTime
Characters = (Get-Content -LiteralPath "\\?\$($matches.FullName)" | Measure-Object -ignorewhitespace -Character).Characters
Owner = (Get-ACL $matches.Fullname).Owner
}
$object.pstypenames.insert(0,'System.IO.RobocopyDirectoryInfo')
Write-Output $object
} Else {
Write-Verbose ("Not matched: {0}" -f $_)
}
} Catch {
Write-Warning ("{0}" -f $_.Exception.Message)
Return
}
}
I have a question:
I need to verify 3 registry keys on 20 PCs and export the result to a CSV file.
I used this line
Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon\" -Name "keyname" | Export-Csv -Path "csvpath"
and I receive all the values for that key, but I don't need to see PSPath, PSParentPath, PSChildName, PSDrive, PSProvider.
Now I was thinking of making a script with variables to simplify it, but at this point I would also like it to tell me when a key is not found, and ideally I want to be able to run it from the DC against all the machines (about 20).
This could be a starting point:
$key1 = name of key 1
$key2 = name of key 2
$key3 = name of key 3
$hostname = hostname
$regkey = Get-ItemProperty -Path etc....
and now I'm trying to work out how to implement the verification loop and export everything to CSV.
Thanks
To verify that a key exists, use Test-Path.
Put the computer names and key names into arrays of strings.
I have no experience with remoting (I think you'll be using Invoke-Command), but this should give you an idea of the looping and of getting all the non-PS properties:
$Computers = @'
Computer1
Computer2
Computer3
'@ -split '\n'
$KeyNames = @'
KeyName1
KeyName2
KeyName3
'@ -split '\n'
$report = ForEach ( $Computer in $Computers ) {
    ForEach ( $KeyName in $KeyNames ) {
        If ( Test-Path $KeyName )
        {
            $AllProps = ($key = Get-Item $KeyName).Property
            (Get-ItemProperty $key).PSObject.Properties | Where-Object Name -in $AllProps | Select-Object Name, Value
            << Create output >>
        }
        Else
        {
            "${Computer}: $KeyName not found."
        }
    }
}
$report | Export-Csv "\\Path\to\CsvFile"
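For what it's worth, the << Create output >> placeholder could emit one small object per property so that Export-Csv has something structured to write; a sketch, where the ComputerName/KeyName/Name/Value column names are my own assumption:

(Get-ItemProperty $key).PSObject.Properties |
    Where-Object Name -in $AllProps |
    ForEach-Object {
        [PSCustomObject]@{
            ComputerName = $Computer
            KeyName      = $KeyName
            Name         = $_.Name
            Value        = $_.Value
        }
    }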
To probe multiple computers for 3 registry properties and output the result in a CSV file, you can use Invoke-Command like below:
$computers = 'pc01','pc02','pc03' # etc. the 20 computers you want to probe
$propertynames = 'property1','property2','property3' # you may use wildcards here
# loop over the computers
$result = foreach ($computer in $computers) {
if (!(Test-Connection -ComputerName $computer -Count 1 -Quiet)) {
Write-Warning "Computer '$computer' is not responding"
continue # skip this computer and proceed with the next
}
Invoke-Command -ComputerName $computer -ScriptBlock {
$regPath = "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"
# create a temporary Hashtable to store the items
$hash = [ordered]@{}
# loop over the properties
foreach ($prop in $using:propertynames) {
$entry = Get-ItemProperty -Path $regPath -Name $prop -ErrorAction SilentlyContinue
if ($entry) {
$hash['ComputerName'] = $using:computer
$entry = $entry | Select-Object * -ExcludeProperty PS*
# use a loop in case you have used wildcards for the property names
foreach ($item in $entry.PsObject.Properties) {
$hash[$item.Name] = $item.Value
}
}
else {
Write-Warning "Could not find property '$prop'"
}
}
if ($hash.Count) {
# output the hash converted to PSObject
[PsCustomObject]$hash
}
}
}
# remove the properties added by Invoke-Command
$result = $result | Select-Object * -ExcludeProperty PS*,RunspaceId
# output to gridview
$result | Out-GridView
# output to CSV file
$result | Export-Csv -Path 'X:\Path\To\TheResults.csv' -NoTypeInformation
So I have a parser that goes through two different logs, both .csv files, and checks for certain lines based on the regex patterns I have chosen.
It grabs the IDNumber from the beginning of the file name (1234-randomfile.csv), reads the file's contents into a variable ($Validate), then, based on the regex matches, assigns matching lines to certain variables ($Scriptdone, $Updatedone, $Failed) and checks which of them are present.
I am trying to make it so that the output is not line for line, since the files I parse through share the same IDNumbers. So for example:
Output Currently:
1234 Script Completed
1234 Update Completed
How I want output:
1234 Script Completed Update Completed
Anyways, Thanks for all the assistance!
function Get-MR4RES {
[CmdletBinding()]
param (
[Parameter(Position = 0,
Mandatory = $True)]
[ValidateNotNullorEmpty()]
[ValidateScript( {Test-Path -Path $_ -PathType 'Any'})]
[String]
$Files,
[Parameter(Position = 1,
Mandatory = $false)]
[String]
$CSVPath) # End Param
begin {
# Setting Global Variables
$Scriptcompletedsuccess = '.+Script\scompleted\ssuccessfully.+' # 3:44:15 End function called, Script completed successfully at 3:44:15 on Tue 07/03/2018
$Updatecomplete = '\w+\s+\:\s\[\d+\:\d+\:\d+\]\s+\w+\scomplete' # STATUS : [03:43:07] Update complete
$FailedValidaton = '.+check\sfail.+'
$Fail1 = 'Validation Failed'
$Fail2 = 'Failed'
$Good1 = 'Script completed'
$Good2 = 'Update completed'
$array = @('IDNumber, Results')
$counter = 0
$FileList = (Get-ChildItem -Path $Files -File -Filter "*.log").FullName
$Done = ''
} # End begin
process {
# Do the following code in all the files in the filelist
foreach ($File in $fileList) {
# Test files variables to ensure is directory to ensure progress bar will be operational and needed
if ((Get-Item $Files) -is [System.IO.DirectoryInfo]) {
# Counts once per each file variable in filelist variable
$counter++
# Progress bar indicates the name of the current file and calculates percent based on current count verses total files in $filelist
Write-Progress -Activity 'Analyzing Files' -CurrentOperation $File -PercentComplete (($counter / $FileList.count) * 100)
}
# Calculates ID number based on filename, file name is -filtered in beginning to only contain properly named files
$IDNumber = [System.IO.Path]::GetFileName("$File").split('-')[0]
# Puts file into Variable to be IF Else
$Validate = Get-Content -Path $File
$Scriptdone = $Validate | Where-Object {$_ -match $Scriptcompletedsuccess}
$Updatedone = $Validate | where-object {$_ -match $Updatecomplete}
$Failed = $Validate | Where-Object {$_ -match $FailedValidaton}
# Check if the file HAS a FAILED validation
if($Failed){
# Creates an array of the data from each file that failed
$array += -join ("$IDNumber",', ',"$Fail1")
}
Elseif($Scriptdone){
$Done = $Good1
# Creates an array of the data from each file that script completed
$array += -join ("$IDNumber",', ',"$Done")
} # if the parser found "Update complete"
Elseif($Updatedone){
$Done = $Good2
# Creates an array of the data from each file that update is done
$array += -join ("$IDNumber",', ',"$Done")
} # End of Successful
Else{
# Creates an array of the data from each file that failed
$array += -join ("$IDNumber",', ',"$Fail2")
}
} # End of foreach
} # End process section
End {
# If CSVPath is used in get-command
if ($PSBoundParameters.ContainsKey('CSVPath')) {
# Pipe the array data to a CSV
Add-Content -Path $CSVPath -Value $array -Encoding ascii
}
# If no CSVPath is used in get-command
else {
# Out-put to console
Write-Output $array
} # End of else
} # End of the End
} # End of function
If you want to append a new message to existing output, you have to tell PowerShell which entry it should add the new info to. As manipulating strings is not very intuitive in my opinion, I'd suggest using objects for that.
First you have to define the data structure:
# Before the ForEach
$array = @()
$properties = @{'ID'="";
                'Results'=""}

# In the ForEach
$object = New-Object -TypeName PSObject -Prop $properties
$object.ID = $IDNumber
Next, in your ifs you can set the value (this could also be done using Switch, as suggested by @LotPings, but let's leave it as it is for simplicity):
$object.Results = $Done   # or $Fail1 or $Fail2
Then you should first check whether an entry with that ID already exists; if it does, append the new result, and if not, just add a new element to the array. Something like this should work:
$line = $array | Where-Object ID -eq $object.id
if ($line) {
$line.Results += " $($object.Results)"
}
else {
$array += $object
}
Of course this will also require changing the way you output your data (for example by using Export-Csv):
$array | Export-Csv $CSVPath -Append -NoTypeInformation
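If you prefer the plain layout shown in the question (1234 Script Completed Update Completed) over a CSV, the same array can be flattened back into strings at the end; a small usage sketch:

$array | ForEach-Object { "$($_.ID) $($_.Results)" }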
I would like the output of this PowerShell script to be sent to a file; currently it outputs to the screen. I have tried adding various pipe-to-file commands, but that populates the file with numbers and the original content from the screen.
$strFilter = "(&(objectCategory=Group)(|(groupType=2)(groupType=4)(groupType=8)))"
$objDomain = New-Object System.DirectoryServices.DirectoryEntry
$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
$objSearcher.SearchRoot = $objDomain
$objSearcher.PageSize = 1000
$objSearcher.Filter = $strFilter
$objSearcher.SearchScope = "Subtree"
$objSearcher.PropertiesToLoad.Add("cn") | Out-Null
$objSearcher.PropertiesToLoad.Add("member") | Out-Null
$colResults = $objSearcher.FindAll()
foreach ($objResult in $colResults){
$objItem = $objResult.Properties;
Write-Output $objItem.cn
foreach ($objMember in $objItem.member) {
Write-Output " $objMember"
}
}
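One way to capture exactly those lines in a file instead of on the screen, sketched on the assumption that the rest of the script stays as it is (the output path C:\temp\groups.txt is a placeholder of mine), is to let the loop emit the strings and pipe the whole block to Out-File:

$colResults = $objSearcher.FindAll()
& {
    foreach ($objResult in $colResults) {
        $objItem = $objResult.Properties
        $objItem.cn
        foreach ($objMember in $objItem.member) {
            "    $objMember"
        }
    }
} | Out-File -FilePath 'C:\temp\groups.txt'   # placeholder path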
I have a script set up to enter a foreach loop every time a file is created. Once in the loop it will move the file to another folder, and if the same file has had to be moved 3 times it will move the file to a different folder and remove its record from the hash table.
My issue is that when I run the script, it does not do anything that I write inside the foreach loop; it only runs script that I write above it. Can someone please advise?
$folder = 'C:\Users\jnwankwo\Documents\IUR Test\r' # Enter the root path you want to monitor.
$Failedfolder = 'C:\Users\jnwankwo\Documents\IUR Test\r'
$filter = '*.*' # You can enter a wildcard filter here.
$Files = @{}
$Counter = 1
# In the following line, you can change 'IncludeSubdirectories to $true if required.
$fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{IncludeSubdirectories = $false;NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
ForEach ($file in $folder)
{
$fName = $file.Name
if (-not $Files.ContainsKey($fName))
{
$Files.Add($fName,$Counter)
}
if (($Files.ContainsKey($fName)) -and ($Files.Item($fName) -lt 3))
{
Move-Item 'C:\Users\jnwankwo\Documents\IUR Test\r\*.txt' 'C:\Users\jnwankwo\Documents\IUR Test' -force
$Files.Set_Item($fName,$Counter++)
}
ElseIf (($Files.ContainsKey($fName)) -and ($Files.Item($fName) -eq 3))
{
$Files.clear()
Move-Item 'C:\Users\jnwankwo\Documents\Failed\' $Failedfolder -force
}
}
}
# To stop the monitoring, run the following commands:
# Unregister-Event FileCreated
I have found one thing in your code.
Change ForEach ($file in $folder) to ForEach ($file in (gci $folder))
Here you go, you will have to change the folders back though :)
$folder = 'C:\temp\test' # Enter the root path you want to monitor.
$filter = '*.*' # You can enter a wildcard filter here.
# In the following line, you can change 'IncludeSubdirectories to $true if required.
$fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{IncludeSubdirectories = $false;NotifyFilter = [IO.NotifyFilters]'FileName,LastWrite'}
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
$folder = 'C:\temp\test' # Enter the root path you want to monitor.
$Failedfolder = 'C:\temp\test3'
$Files = @{}
$Counter = 1
ForEach ($file in (gci $folder))
{
$fName = $file.Name
if (-not $Files.ContainsKey($fName))
{
$Files.Add($fName,$Counter)
}
if (($Files.ContainsKey($fName)) -and ($Files.Item($fName) -lt 3))
{
Move-Item $file.Fullname 'C:\Users\jnwankwo\Documents\IUR Test' -force
$Files.Item($fName) = $Counter++
}
ElseIf (($Files.ContainsKey($fName)) -and ($Files.Item($fName) -eq 3))
{
$Files.clear()
Move-Item $file.Fullname $Failedfolder -force
}
}
}
Addition:
To store your Hashtable to a file and re-import it on the next run you can use the following code:
#store hashtable to file
$Files.GetEnumerator() | % { "$($_.Name)=$($_.Value)" } | Out-File files_ht.txt
#to import again
$Files = Get-Content files_ht.txt | Convertfrom-StringData
This should let you keep the hashtable data persistent between runs.
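One small caveat (my addition, not part of the answer above): ConvertFrom-StringData returns a hashtable whose values are strings, so after re-importing the file the counter should be cast back to a number before it is compared or incremented, for example [int]$Files[$fName] -lt 3 rather than $Files.Item($fName) -lt 3.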