I am trying to get the color palette of an image. I have tried various methods, and I am now using the following code in PowerShell, but I cannot get the correct result:
$filename = "C:\Users\schoo\Desktop\bb.jpg"
$BitMap = [System.Drawing.Bitmap]::FromFile((Get-Item $filename).fullname)
Foreach($y in (1..($BitMap.Height-1))){
Foreach($x in (1..($BitMap.Width-1))){
$Pixel = $BitMap.GetPixel($X,$Y)
$BackGround = $Pixel.Name
}
$R = $Pixel | select -ExpandProperty R
$G = $Pixel | select -ExpandProperty G
$B = $Pixel | select -ExpandProperty B
$A = $Pixel | select -ExpandProperty A
$allClr = "$R" + "." + "$G" + "." + "$B" + "." + "$A"
$allClr
}
The result gives me more than a thousand RGB codes.
I assume that by "color palette" you mean the swathe of distinct colours that appear in the image.
A simple (and quite fast) way to select only a distinct subset of a collection is to use a hashtable.
$filename = 'C:\Users\schoo\Desktop\bb.jpg'
$BitMap = [System.Drawing.Bitmap]::FromFile((Resolve-Path $filename).ProviderPath)
# A hashtable to keep track of the colors we've encountered
$table = @{}
foreach($h in 1..$BitMap.Height){
    foreach($w in 1..$BitMap.Width) {
        # Assign a value to the current Color key
        $table[$BitMap.GetPixel($w - 1,$h - 1)] = $true
    }
}
# The hashtable keys are our palette
$palette = $table.Keys
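If you then want the palette in the same "R.G.B.A" text form that your original loop produced, the keys can be projected afterwards (a minimal sketch, assuming $palette was built as above):
$palette | ForEach-Object { '{0}.{1}.{2}.{3}' -f $_.R, $_.G, $_.B, $_.A }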
I am generating a list of Windows workstation computer names by reading Active Directory, and I need to find the highest number so that I can assign the next available number to a new device. I am not having any success with this - how can it be done? As you can see from the list of names, I also have missing numbers in the sequence that, ideally, I would like to fill in with new devices as well.
The code I am using to get the list from AD is below.
((Get-ADComputer -Filter {operatingsystem -notlike "*server*" -and Name -like $NamingConvention -and enabled -eq "true"} -Credential $credential -server $ADServerIP).Name)
List of device names
PC01
PC28
PC29
PC30
PC31
PC32
PC33
PC34
PC35
PC36
PC37
PC38
PC40
PC41
PC42
PC43
PC44
PC45
PC46
PC47
PC27
PC48
PC26
PC24
PC179
PC18
PC180
PC181
PC182
PC183
PC184
PC185
PC186
PC187
PC188
PC189
PC19
PC190
PC191
PC192
PC21
PC22
PC23
PC25
PC178
PC49
PC51
PC77
PC78
PC79
PC80
PC81
PC83
PC84
PC85
PC87
PC88
PC89
PC90
PC91
PC92
PC93
PC94
PC95
PC96
PC97
PC76
PC50
PC75
PC72
PC52
PC53
PC54
PC55
PC56
PC57
PC59
PC60
PC61
PC62
PC63
PC64
PC65
PC66
PC67
PC68
PC69
PC70
PC71
PC73
PC98
PC177
PC175
PC115
PC116
PC117
PC118
PC119
PC12
PC120
PC121
PC122
PC123
PC124
PC125
PC126
PC127
PC128
PC129
PC13
PC130
PC131
PC114
PC132
PC113
PC111
PC02
PC03
PC04
PC06
PC08
PC09
PC10
PC100
PC101
PC102
PC103
PC104
PC105
PC106
PC107
PC108
PC109
PC11
PC110
PC112
PC176
PC133
PC135
PC158
PC159
PC16
PC160
PC161
PC162
PC163
PC164
PC165
PC166
PC167
PC168
PC169
PC17
PC170
PC171
PC172
PC173
PC174
PC157
PC134
PC156
PC154
PC136
PC137
PC138
PC139
PC14
PC140
PC141
PC142
PC143
PC144
PC145
PC146
PC147
PC148
PC149
PC150
PC151
PC152
PC153
PC155
PC99
Sort the PC names by their numeric values and select the last one:
$lastPC = (Get-ADComputer -Filter {operatingsystem -notlike "*server*" -and Name -like $NamingConvention -and enabled -eq "true"} -Credential $credential -server $ADServerIP).Name |
Sort-Object { [int]($_ -replace '\D+')} | Select-Object -Last 1
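From there, the next free name could be derived as well (a small sketch, assuming the same naming scheme and two-digit zero padding as PC01):
$nextNumber = [int]($lastPC -replace '\D+') + 1
'PC{0:d2}' -f $nextNumber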
Here's a solution that will give you the highest number ($dataMax), the missing numbers ($dataMissing), and the next number to use ($dataNext). The next number to use will be either the first missing number or, if there are no missing numbers, the highest number + 1.
# load the computers list
$data = ((Get-ADComputer -Filter {operatingsystem -notlike "*server*" -and Name -like $NamingConvention -and enabled -eq "true"} -Credential $credential -server $ADServerIP).Name)
# create an array by splitting the data text using the "space" character as a delimiter
$data = $data.Split(" ")
# remove all the alpha characters ("PC"), leaving only the number values so it can be sorted easier
$dataCleaned = $data -replace "[^0-9]" , '' | sort { [int]$_ }
# after sorting the data, [-1] represents the last element in the array which will be the highest number
[int]$dataMax = $dataCleaned[-1]
# create a number range that represents all the numbers from 1 to the highest number
$range = 1..$dataMax | foreach-object { '{0:d2}' -f $_ }
# compare the created range against the numbers actually in the computer array to find the missing numbers
$dataMissing = @(compare $range $dataCleaned -PassThru)
# if there's a missing value, [0] represents the first element in the array of missing numbers
if ($dataMissing)
{
    $dataNext = $dataMissing[0]
}
# if there are no missing values, the next value is the max value + 1
else
{
    $dataMissing = "none"
    $dataNext = $dataMax + 1
}
Write-Host "The highest number is:"('{0:d2}' -f $dataMax)
Write-Host "The missing numbers are: $dataMissing"
Write-Host "The next number to use is:" ('{0:d2}' -f $dataNext)
Assuming your list is exactly as it appears to be, this is one way to do it:
$List = 'PC01 PC28 PC29 PC30 PC31 PC32 PC33 PC34 PC35 PC36 PC37 PC38 PC40 PC41 PC42 PC43 PC44 PC45 PC46 PC47 PC27 PC48 PC26 PC24 PC179 PC18 PC180 PC181 PC182 PC183 PC184 PC185 PC186 PC187 PC188 PC189 PC19 PC190 PC191 PC192 PC21 PC22 PC23 PC25 PC178 PC49 PC51 PC77 PC78 PC79 PC80 PC81 PC83 PC84 PC85 PC87 PC88 PC89 PC90 PC91 PC92 PC93 PC94 PC95 PC96 PC97 PC76 PC50 PC75 PC72 PC52 PC53 PC54 PC55 PC56 PC57 PC59 PC60 PC61 PC62 PC63 PC64 PC65 PC66 PC67 PC68 PC69 PC70 PC71 PC73 PC98 PC177 PC175 PC115 PC116 PC117 PC118 PC119 PC12 PC120 PC121 PC122 PC123 PC124 PC125 PC126 PC127 PC128 PC129 PC13 PC130 PC131 PC114 PC132 PC113 PC111 PC02 PC03 PC04 PC06 PC08 PC09 PC10 PC100 PC101 PC102 PC103 PC104 PC105 PC106 PC107 PC108 PC109 PC11 PC110 PC112 PC176 PC133 PC135 PC158 PC159 PC16 PC160 PC161 PC162 PC163 PC164 PC165 PC166 PC167 PC168 PC169 PC17 PC170 PC171 PC172 PC173 PC174 PC157 PC134 PC156 PC154 PC136 PC137 PC138 PC139 PC14 PC140 PC141 PC142 PC143 PC144 PC145 PC146 PC147 PC148 PC149 PC150 PC151 PC152 PC153 PC155 PC99'
$NextNumber = ($List -split "\s" | ForEach-Object { if ($_ -match 'PC(?<Number>\d+)') { $Matches.Number } } | Measure-Object -Maximum).Maximum + 1
$NextNumber
"PC$NextNumber"
I'm using FromFile to get the image from files, and it throws the following error for the PNGs on the FromFile line:
Exception calling "FromFile" with "1" argument(s): "The given path's
format is not supported."
So I'm trying to convert the BMPs to JPG (see the convert line above FromFile below), but all the examples I see (that seem usable) save the file. I don't want to save the file in the directory; all I need is the image format, so FromFile can use it like in this example. I saw ConvertTo-Jpeg, but I don't think it is a standard PowerShell module, and I don't see how to install it.
I saw this link, but I don't think that would leave the image in the format needed by FromFile.
This is my code:
$imageFile2 = Get-ChildItem -Recurse -Path $ImageFullBasePath -Include @("*.bmp","*.jpg","*.png") | Where-Object {$_.Name -match "$($pictureName)"} #$imageFile | Select-String -Pattern '$($pictureName)' -AllMatches
Write-Host $imageFile2
if($imageFile2.Exists)
{
    if($imageFile2 -Match "png")
    {
        $imageFile2 | .\ConvertTo-Jpeg #I don't think this will work with FromFile below
    }
    $image = [System.Drawing.Image]::FromFile($imageFile2)
}
else {
    Write-Host "$($imageFile2) does not exist"
}
And then I put it in Excel:
$xlsx = $result | Export-Excel -Path $outFilePath -WorksheetName $errCode -Autosize -AutoFilter -FreezeTopRow -BoldTopRow -PassThru # -ClearSheet can't ClearSheet every time or it clears previous data ###left off
$ws = $xlsx.Workbook.Worksheets[$errCode]
$ws.Dimension.Columns #number of columns
$tempRowCount = $ws.Dimension.Rows #number of rows
#only change width of 3rd column
$ws.Column(3).Width
$ws.Column(3).Width = 100
#Change all row heights
for ($row = 2 ;( $row -le $tempRowCount ); $row++)
{
    #Write-Host $($ws.Dimension.Rows)
    #Write-Host $($row)
    $ws.Row($row).Height
    $ws.Row($row).Height = 150
    #place the image in spreadsheet
    #https://github.com/dfinke/ImportExcel/issues/1041 https://github.com/dfinke/ImportExcel/issues/993
    $drawingName = "$($row.PictureID)_Col3_$($row)" #Name_ColumnIndex_RowIndex
    Write-Host $image
    $picture = $ws.Drawings.AddPicture("$drawingName",$image)
    $picture.SetPosition($row - 1, 0, 3 - 1, 0)
    if($ws.Row($row).Height -lt $image.Height * (375/500)) {
        $ws.Row($row).Height = $image.Height * (375/500)
    }
    if($ws.Column(3).Width -lt $image.Width * (17/120)){
        $ws.Column(3).Width = $image.Width * (17/120)
    }
}
Update:
I just wanted to reiterate that FromFile can't be used for a PNG image here, so the step where Hey Scripting Guy loads the image like this doesn't work:
$image = [drawing.image]::FromFile($imageFile2)
I figured out that the $imageFile2 path had two filenames in it; two files must have met the Get-ChildItem/Where-Object match criteria. The images look identical but have similar names, so they will be easy to process. After I split the names, FromFile works OK.
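For anyone hitting the same thing: limiting the Get-ChildItem result to a single file avoids the two-filenames-in-one-path problem up front (a minimal sketch of the same lookup, assuming the first match is the one wanted):
$imageFile2 = Get-ChildItem -Recurse -Path $ImageFullBasePath -Include @("*.bmp","*.jpg","*.png") |
    Where-Object { $_.Name -match $pictureName } |
    Select-Object -First 1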
I'm looking to prepend a folder name to the start of an array of (relative) paths using a foreach statement, but it's not making any changes to the array (and there are no errors either).
Note: This is more for educational purposes than functional, as I have it working using a for loop (which I've commented out), but I'm interested in learning how the foreach statement works.
$myFiles = @(
    "blah1\blah2\file1.txt"
    "blah3\blah4\file2.txt"
    "blah5\blah6\file3.txt"
)
$checkoutFolder = "folder1"
#for ($h = 0; $h -lt $myFiles.Length; $h++) {
#    $myFiles[$h] = $checkoutFolder + "\" + $myFiles[$h]
#}
foreach ($path in $myFiles) {
    $path = $checkoutFolder + "\" + $path
}
$myFiles
I also tried using a buffer variable, e.g.
$buffer = $checkoutFolder + "\" + $path
$path = $buffer
But I get the same result, i.e.:
OUTPUT:
blah1\blah2\file1.txt
blah3\blah4\file2.txt
blah5\blah6\file3.txt
The foreach iteration variable ($path) is a copy of each array element, so assigning to it doesn't change the array itself. I can think of two ways around that:
Create a new array from the modified data of the old array
$myFiles = @(
    "blah1\blah2\file1.txt"
    "blah3\blah4\file2.txt"
    "blah5\blah6\file3.txt"
)
$checkoutFolder = "folder1"
#Create new array $myFilesnew
$myFilesnew = @()
#For each line in old array
foreach ($file in $myFiles)
{
    #Create new row from modified row $file of $myFiles array
    $row = $checkoutFolder+"\"+$file
    #Add row $row to the new array $myFilesnew
    $myFilesnew+=$row
}
$myFilesnew
Modify each row of existing array:
$myFiles = @(
    "blah1\blah2\file1.txt"
    "blah3\blah4\file2.txt"
    "blah5\blah6\file3.txt"
)
$checkoutFolder = "folder1"
$i=0
while($i -lt $myFiles.Count)
{
    #Get row $i ($myFiles[$i]) from the array, insert the modified data, and write it back to row $i of the array
    $myfiles[$i]=$myFiles[$i].Insert(0,$checkoutFolder+"\");
    #Add +1 to $i
    $i++
}
$myFiles
Better to start using the Join-Path cmdlet, to avoid creating paths with backslashes omitted or doubled.
Something like this would do it:
$checkoutFolder = "folder1"
$myFiles = "blah1\blah2\file1.txt", "blah3\blah4\file2.txt", "blah5\blah6\file3.txt" | ForEach-Object {
Join-Path -Path $checkoutFolder -ChildPath $_
}
$myFiles
output:
folder1\blah1\blah2\file1.txt
folder1\blah3\blah4\file2.txt
folder1\blah5\blah6\file3.txt
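As an aside on the omitted/doubled backslashes point: Join-Path normalizes the separator, so a stray trailing backslash on the parent does not produce a doubled one (illustrative):
Join-Path -Path 'folder1\' -ChildPath 'blah1\blah2\file1.txt'
# -> folder1\blah1\blah2\file1.txt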
You can replace the regex beginning-of-string anchor in each string with the folder name. This is a useful idiom for generating computer names too, as sketched after the output below.
'blah1\blah2\file1.txt','blah3\blah4\file2.txt','blah5\blah6\file3.txt' -replace '^','folder1\'
folder1\blah1\blah2\file1.txt
folder1\blah3\blah4\file2.txt
folder1\blah5\blah6\file3.txt
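The same trick can stamp out a series of computer names from a plain number range (a quick illustration, not from the original question):
1..5 -replace '^','PC'
PC1
PC2
PC3
PC4
PC5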
I have this code, which is part of a function that returns a list of SQL rows based on a time range.
The query itself (the first line of the code) is quite fast, but the foreach loop that extracts the relevant data takes a while to complete.
I have around 350,000 rows to iterate over, and while it is bound to take a while, I was wondering if there is any change I could make to speed it up.
$SqlDocmasterTableResuls = $this.SqlConnection.GetSqlData("SELECT DOCNUM, DOCLOC FROM MHGROUP.DOCMASTER WHERE ENTRYWHEN between '" + $this.FromDate + "' and '" + $this.ToDate + "'")
[System.Collections.ArrayList]$ListOfDocuments = [System.Collections.ArrayList]::New()
if ($SqlDocmasterTableResuls.Rows.Count)
{
    foreach ($Row in $SqlDocmasterTableResuls.Rows)
    {
        $DocProperties = @{
            "DOCNUM" = $Row.DOCNUM
            "SOURCE" = $Row.DOCLOC
            "DESTINATION" = $Row.DOCLOC -replace ([regex]::Escape($this.iManSourceFileServerName + ":" + $this.iManSourceFileServerPath.ROOTPATH)),
                ([regex]::Escape($this.iManDestinationFileServerName + ":" + $this.iManDestinationFileServerPath.ROOTPATH))
        }
        $DocObj = New-Object -TypeName PSObject -Property $DocProperties
        $ListOfDocuments.Add($DocObj)
    }
}
return $ListOfDocuments
Avoid appending to an array in a loop. The best way to capture loop data in a variable is to simply collect the loop output in a variable:
$ListOfDocuments = foreach ($Row in $SqlDocmasterTableResuls.Rows) {
    New-Object -Type PSObject -Property @{
        "DOCNUM" = $Row.DOCNUM
        "SOURCE" = $Row.DOCLOC
        "DESTINATION" = $Row.DOCLOC -replace ...
    }
}
You don't need the surrounding if conditional, because if the table doesn't have any rows the loop should skip right over it, leaving you with an empty result.
Since you want to return the list anyway, you don't even need to collect the loop output in a variable. Just leave the output as it is and it will get returned anyway.
Also avoid repeating operations in a loop when their result doesn't change. Calculate the escaped source and destination paths once before the loop:
$srcPath = [regex]::Escape($this.iManSourceFileServerName + ':' + $this.iManSourceFileServerPath.ROOTPATH)
$dstPath = [regex]::Escape($this.iManDestinationFileServerName + ':' + $this.iManDestinationFileServerPath.ROOTPATH)
and use the variables $srcPath and $dstPath inside the loop.
Something like this should do:
$SqlDocmasterTableResuls = $this.SqlConnection.GetSqlData("SELECT ...")
$srcPath = [regex]::Escape($this.iManSourceFileServerName + ':' + $this.iManSourceFileServerPath.ROOTPATH)
$dstPath = [regex]::Escape($this.iManDestinationFileServerName + ':' + $this.iManDestinationFileServerPath.ROOTPATH)
foreach ($Row in $SqlDocmasterTableResuls.Rows) {
    New-Object -Type PSObject -Property @{
        'DOCNUM' = $Row.DOCNUM
        'SOURCE' = $Row.DOCLOC
        'DESTINATION' = $Row.DOCLOC -replace $srcPath, $dstPath
    }
}
return
[edit - per Ansgar Wiechers, the PSCO accelerator is only available with ps3+.]
one other thing that may help is to replace New-Object with [PSCustomObject]. that is usually somewhat faster to use. something like this ...
$DocObj = [PSCustomObject]$DocProperties
another way to use that type accelerator is to do what Ansgar Wiechers did in his code sample, but use the accelerator instead of the cmdlet. like this ...
[PSCustomObject]@{
    'DOCNUM' = $Row.DOCNUM
    'SOURCE' = $Row.DOCLOC
    'DESTINATION' = $Row.DOCLOC -replace $srcPath, $dstPath
}
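a rough way to see the difference on your own box is a quick micro-benchmark with Measure-Command (illustrative only; the property values below are just placeholders and the timings will vary):
$props = @{ 'DOCNUM' = 1; 'SOURCE' = 'a'; 'DESTINATION' = 'b' }
(Measure-Command { foreach ($i in 1..50000) { New-Object -TypeName PSObject -Property $props } }).TotalMilliseconds
(Measure-Command { foreach ($i in 1..50000) { [PSCustomObject]$props } }).TotalMilliseconds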
hope that helps,
lee
OK, so we have a manual process that runs through PL/SQL Developer to run a query and then export it to CSV.
I am trying to automate that process using PowerShell, since we are working in a Windows environment.
I have created two files that seem to be exact duplicates from the automated and manual processes, but they don't behave the same, so I assume I am missing some hidden characters; I can't find them or figure out how to remove them.
The most obvious example of them working differently is opening them in Excel. The manual file opens in Excel automatically putting each field in its own separate column. The automated file instead puts everything into one column.
Can anybody shed some light? I am hoping that resolving this, or at least getting some information, will help with the bigger problem of it not processing correctly.
Thanks.
Example, one column:
"rownum","year","month","batch","facility","transfer_facility","trans_dt","meter","ticket","trans_product","trans","shipper","customer","supplier","broker","origin","destination","quantity"
Example, separate columns:
"","ROWNUM","RPT_YR","RPT_MO","BATCH_NBR","FACILITY_CD","TRANSFER_FACILITY_CD","TRANS_DT","METER_NBR","TKT_NBR","TRANS_PRODUCT_CD","TRANS_CD","SHIPPER_CD","CUSTOMER_NBR","SUPPLIER_NBR","BROKER_CD","ORIGIN_CD","DESTINATION_CD","NET_QTY"
$connectionstring = "Data Source=database;User Id=user;Password=password"
$connection = New-Object System.Data.OracleClient.OracleConnection($connectionstring)
$command = New-Object System.Data.OracleClient.OracleCommand($query, $connection)
$connection.Open()
Write-Host -ForegroundColor Black " Opening Oracle Connection"
Start-Sleep -Seconds 2
#Getting data from oracle
Write-Host
Write-Host -ForegroundColor Black "Getting data from Oracle"
$Oracle_data=$command.ExecuteReader()
Start-Sleep -Seconds 2
if ($Oracle_data.read()){
Write-Host -ForegroundColor Green "Connection Success"
while ($Oracle_data.read()) {
#Variables for recordset
$rownum = $Oracle_data.GetDecimal(0)
$rpt_yr = $Oracle_data.GetDecimal(1)
$rpt_mo = $Oracle_data.GetDecimal(2)
$batch_nbr = $Oracle_data.GetString(3)
$facility_cd = $Oracle_data.GetString(4)
$transfer_facility_cd = $Oracle_data.GetString(5)
$trans_dt = $Oracle_data.GetDateTime(6)
$meter_nbr = $Oracle_data.GetString(7)
$tkt_nbr = $Oracle_data.GetString(8)
$trans_product_cd = $Oracle_data.GetString(9)
$trans_cd = $Oracle_data.GetString(10)
$shipper_cd = $Oracle_data.GetString(11)
$customer_nbr = $Oracle_data.GetString(12)
$supplier_nbr = $Oracle_data.GetString(13)
$broker_cd = $Oracle_data.GetString(14)
$origin_cd = $Oracle_data.GetString(15)
$destination_cd = $Oracle_data.GetString(16)
$net_qty = $Oracle_data.GetDecimal(17)
#Define new file
$filename = "Pipeline" #Get-Date -UFormat "%b%Y"
$filename = $filename + ".csv"
$fileLocation = $newdir + "\" + $filename
$fileExists = Test-Path $fileLocation
#Create object to hold record
$obj = new-object psobject -prop @{
rownum = $rownum
year = $rpt_yr
month = $rpt_mo
batch = $batch_nbr
facility = $facility_cd
transfer_facility = $transfer_facility_cd
trans_dt = $trans_dt
meter = $meter_nbr
ticket = $tkt_nbr
trans_product = $trans_product_cd
trans = $trans_cd
shipper = $shipper_cd
customer = $customer_nbr
supplier = $supplier_nbr
broker = $broker_cd
origin = $origin_cd
destination = $destination_cd
quantity = $net_qty
}
$records += $obj
}
}else {
Write-Host -ForegroundColor Red " Connection Failed"
}
#Write records to file with headers
$records | Select-Object rownum,year,month,batch,facility,transfer_facility,trans_dt,meter,ticket,trans_product,trans,shipper,customer,supplier,broker,origin,destination,quantity |
ConvertTo-Csv |
Select -Skip 1|
Out-File $fileLocation
Why are you skipping the first row (usually the headers)? Also, try using Export-Csv instead:
#Write records to file with headers
$records | Select-Object rownum, year, month, batch, facility, transfer_facility, trans_dt, meter, ticket, trans_product, trans, shipper, customer, supplier, broker, origin, destination, quantity |
Export-Csv $fileLocation -NoTypeInformation
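If Excel still lumps everything into one column after switching to Export-Csv, it's usually a regional-settings issue (some locales expect ; rather than , as the list separator). Export-Csv can write the culture's own separator with -UseCulture, or an explicit one with -Delimiter; for example (a sketch of the same call with that option added):
$records | Select-Object rownum, year, month, batch, facility, transfer_facility, trans_dt, meter, ticket, trans_product, trans, shipper, customer, supplier, broker, origin, destination, quantity |
    Export-Csv $fileLocation -NoTypeInformation -UseCulture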