Improve performance in PowerShell file generation

I have written a basic script that gets system logs from a RESTful API and writes them to a JSON file. It works, but I'm seeing huge RAM usage; I think it's because the script stores the whole payload of system log entries in memory before writing it to the file.
The do loop in the script below handles pagination; the API only returns a certain number of events at a time, so it has to be iterated through.
Can performance be improved here? For instance, could the in-memory payload be written to the file whenever it hits a certain size, and then the loop continue? Or is there anything else that can be done to reduce the system resources used?
Thanks for your help!
#Name: getLogs.ps1
#Purpose: Script for exporting logs to a json file
#variables
$org = "<api url here>"
$token="<api token here>"
$filePath = "D:\Downloads\logs\test2.json"
#format the date and append T00:00:00
$fromTime = Get-Date -Format s
$fromTime = $fromTime.Substring(0,10) + "T00%3A00%3A00Z"
#format the date and append T23:59:59
$toTime = Get-Date -Format s
$toTime = $toTime.Substring(0,10) + "T23%3A59%3A59Z"
### Set $uri as the API URI for use in the loop
$uri = "$org/api/v1/logs?until=$toTime&limit=20&sortOrder=DESCENDING&q=&since=$fromTime"
### Define $allLogs as an empty array
$allLogs = @()
### Use a do/while loop to page through all log events from the Okta API
do
{
    $webrequest = Invoke-WebRequest -Headers @{"Authorization" = "SSWS $token"} -Method Get -Uri $uri
    # the Link header holds the next page's URL between < and >
    $link = $webrequest.Headers.Link.Split("<").Split(">")
    $uri = $link[3]
    $json = $webrequest | ConvertFrom-Json
    $allLogs += $json
    # slow it down to avoid rate limits
    Start-Sleep -Milliseconds 1001
} while ($webrequest.Headers.Link.EndsWith('rel="next"'))
# get results, switch to json, save
$allLogs | ConvertTo-Json | Out-File $filePath

You could try changing:
$json = $webrequest | ConvertFrom-Json
to:
$webrequest | Out-File $filePath -Append
and then doing away with $allLogs entirely.
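A minimal sketch of that streaming approach, assuming the same Link-header pagination and field names as the script above (not verified here); each page's raw JSON is appended to the file as it arrives, so nothing accumulates in memory:
$uri = "$org/api/v1/logs?until=$toTime&limit=20&sortOrder=DESCENDING&q=&since=$fromTime"
do
{
    $webrequest = Invoke-WebRequest -Headers @{"Authorization" = "SSWS $token"} -Method Get -Uri $uri
    # append this page's raw JSON and let it go out of scope; note the file
    # becomes a series of JSON arrays (one per page) rather than one array
    $webrequest.Content | Out-File $filePath -Append
    $link = $webrequest.Headers.Link.Split("<").Split(">")
    $uri = $link[3]
    # slow it down to avoid rate limits
    Start-Sleep -Milliseconds 1001
} while ($webrequest.Headers.Link.EndsWith('rel="next"'))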

There is another post about performance when using the PoSH Invoke-* cmdlets.
As noted in that post, they are just slow in general prior to PoSH v6.
See this post for more on the topic:
Choppy File Download using Powershell on Azure VM
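If the Invoke-* cmdlet overhead itself is the bottleneck on Windows PowerShell 5.x and earlier, dropping to .NET directly is a common workaround; a hedged sketch reusing $token and $uri from the script above:
# Hypothetical alternative: System.Net.WebClient skips some of
# Invoke-WebRequest's response processing on Windows PowerShell 5.x
$wc = New-Object System.Net.WebClient
$wc.Headers.Add("Authorization", "SSWS $token")
$page = $wc.DownloadString($uri)          # raw JSON string for one page
$next = $wc.ResponseHeaders["Link"]       # pagination header is still available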

Related

PowerShell question - webrequest or curl doesn't work with tasks

I've written (partly by myself) a script which gets the IP address with a web request and saves it as a variable for a DynDNS update. If I start the script via ISE or directly with PowerShell, it works as expected. But if I start it via a scheduled task, the web request fails, whether with Invoke-WebRequest or curl. The script starts, but the variable is empty. If I try to save the output of the command to a file, it's also empty, so I don't know why it doesn't work.
Here is the script:
# Get IPv4 and IPv6 from wtfismyip.com
$strIPv6 = Invoke-WebRequest -uri http://[2a01:4f9:4b:4c8f::2]/text | select Content -ExpandProperty Content
$strIPv4 = Invoke-WebRequest -uri http://95.217.228.176/text | select Content -ExpandProperty Content
$strIPs = $strIPv4 + "," + $strIPv6
$strIPs = [string]::join("",($strIPs.Split("`n")))
echo "IPs" >> C:\temp\test.txt
echo $strIPs >> C:\temp\test.txt
$strDYNDNS_URL = "https://dyndns.strato.com/nic/update?hostname=sub.domain.de&myip=$strIPs"
# Debug information
write "IPv4: $strIPv4"
write "IPv6: $strIPv6"
write "Strato URL: $strDYNDNS_URL"
# Strato DynDNS Update
$user = "domainuser"
$pass = "pass"
$pair = "${user}:${pass}"
$bytes = [System.Text.Encoding]::ASCII.GetBytes($pair)
$base64 = [System.Convert]::ToBase64String($bytes)
$basicAuthValue = "Basic $base64"
$headers = @{ Authorization = $basicAuthValue }
Invoke-WebRequest -uri $strDYNDNS_URL -Headers $headers
The scheduled task runs for testing with an admin user with the following settings:
Start with highest privileges
Trigger: every hour
Cmd: powershell.exe -ExecutionPolicy Bypass -Command "C:\Daten\Scripte\StratoDynDNS-Update.ps1"
Does anyone have an idea why it doesn't work? Or is there a better solution?
The goal is to do a DNS update at Strato because my client's ISP doesn't provide static IPs, even with a business internet connection.
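One way to see what the scheduled-task run is actually doing is to wrap the script body in a transcript; a hedged debugging sketch (the log path is a placeholder):
# record everything this run prints, including errors
Start-Transcript -Path "C:\temp\StratoDynDNS-Update.log" -Append
try {
    # ... existing script body from above ...
}
finally {
    Stop-Transcript
}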

PowerShell array not showing spaces (`n) in email when using Send-MailMessage and shows * at the top when using Import-Csv?

I have a script that generates and emails a report based on backups copied and held in our environment. To format this report I build a body array containing the various contents of the report, and to format the data (name and LastWriteTime of backups on the server) I use 'select name, lastwritetime | export-csv' and then Import-Csv at the end into the body array with ConvertTo-Html -Fragment. This formats all the data appropriately:
$body = @()
"" | Out-File C:\files\PrimaryBody.csv -append
# Code here that grabs the backup issues
("The number of backups that were not successfully copied are " + $warningErrorCount + ". ") | Out-File C:\files\PrimaryBody.csv -append
$BackupIssues | Out-File C:\files\PrimaryBody.csv -append
$OutofDateBackups | select name, LastWriteTime | export-csv -path C:\files\OutDatedBackups.csv -append -Force
$body += " "
$body += "`n"
$body+= Import-Csv -Path C:\files\PrimaryBody.csv | ConvertTo-Html -Fragment
$body+= Import-Csv -Path C:\files\OutDatedBackups.csv | ConvertTo-Html -Fragment
The result in the email, when send-mailmessage command is run is the following:
*
The number of backups that were not successfully copied are 13.
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
(I have included X just to cover up data :) )
However, my issue is that the body array will not allow an empty line/space to be shown in the email. I have tried many things, including $body += " ", $body += "`n", and $body += "`r`n"; all it does is either ignore the empty space entirely or literally write the `n in the body of the email. I am not sure why it won't let me add an empty space/line in the email. Secondly, no matter what I do, there is always a * at the top of the report, as shown above, before the Import-Csv data is written. I am not sure why this is and have been unable to remove it. Please help if possible!
Thanks in advance :)
You are using part of the body as HTML, so don't use an array; use <br /> tags as line breaks. Like this:
$body = "First line"
$body += "<br />"
And while sending the mail use the -BodyAsHtml switch parameter.
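A minimal sketch of that, with placeholder addresses and SMTP server (not from the original post):
$body = "The number of backups that were not successfully copied are 13."
$body += "<br /><br />"   # renders as a blank line in the mail
$body += ($OutofDateBackups | Select-Object Name, LastWriteTime | ConvertTo-Html -Fragment) -join "`n"
Send-MailMessage -To "ops@example.com" -From "reports@example.com" -Subject "Backup report" -Body $body -BodyAsHtml -SmtpServer "smtp.example.com"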
You should not use arrays for the Body parameter of Send-MailMessage. If you look at the help for the command, you'll see that the parameter is [[-Body] <string>] and not [[-Body] <string[]>]. Your problem is that you're not passing a string. If you build your parameter as a string and then use the newline/return characters, you'll be fine. So this will work just fine:
$Body = "First line of text"
$Body += "`r`nSecond line of text"
A more efficient way of doing this in PowerShell is to use the .NET StringBuilder class. Here's a link to an article I like on this topic: Powershell: Concatenate strings using StringBuilder. StringBuilder is faster than appending to a string with +=. The syntax is a bit odd, but it grows on you. The same data in StringBuilder would be:
$sb = [System.Text.StringBuilder]::new()
[void]$sb.Append('First line of text')
[void]$sb.AppendLine('Second line of text')
To use the data as a string, you need to use the .ToString() method like this:
$sb.ToString()
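The assembled text can then go straight to Send-MailMessage; a quick sketch with placeholder addresses and server:
Send-MailMessage -To "ops@example.com" -From "reports@example.com" -Subject "Backup report" -Body $sb.ToString() -SmtpServer "smtp.example.com"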

How to download artifacts from CircleCI via the PowerShell command Invoke-RestMethod

I'm trying to get artifacts from CircleCI in PowerShell and I'm getting back an unfamiliar data format.
PowerShell likes to auto-convert an API's JSON response to a PSCustomObject. Normally this is exactly what I want.
Here is an example of my attempts at getting clean data.
Add the necessary .NET assembly:
Add-Type -AssemblyName System.Net.Http
Create the HttpClient object:
$client = New-Object -TypeName System.Net.Http.HttpClient
Get the web content:
$task = $client.GetByteArrayAsync("https://circleci.com/api/v1.1/project/$vcs_type/$username/$project/$build_number/artifacts?circle-token=$CIRCLE_TOKEN")
Wait for the async call to finish:
$task.Wait()
results:
(({
:path "src./file1.txt",
:pretty-path "src/file1.txt",
:node-index 0,
:url "https://15-198716507-gh.circle-artifacts.com/0/src/file1.txt"
} {
:path "src/file2.txt",
:pretty-path "src/file2.txt",
:node-index 0,
:url "https://15-198716507-gh.circle-artifacts.com/0/src/file2.txt"
}…continued
As you can see, this is not JSON or YAML. Let's try the built-in PowerShell tools like Invoke-RestMethod.
Invoke-RESTMethod -uri https://circleci.com/api/v1.1/project/$vcs_type/$username/$project/$build_number/artifacts?circle-token=$CIRCLE_TOKEN -Method GET
Output:
({
:path "src/file1.txt",
:pretty-path "src/file1.txt",
:node-index 0,
:url "https://15-198716507-gh.circle-artifacts.com/0/src/file1.txt"
} {
:path "src/file2.txt",
:pretty-path "src/file2.txt",
:node-index 0,
:url "https://15-198716507-gh.circle-artifacts.com/0/src/file2.txt"
}…continued
Dang, same output. I know from the Invoke-RestMethod documentation that PS sees JSON and auto-converts it to a PS object. Maybe it's converting a data type I'm not familiar with? I found it odd that PowerShell was getting EDN when every attempt outside PowerShell got JSON.
Maybe they should have the API updated to reply to PowerShell requests with JSON by default.
What is wrong with PowerShell not getting JSON data?
It's EDN; I didn't know this until CircleCI answered a question on this topic. So if you are using PowerShell to retrieve artifacts from CircleCI, you definitely want to know this.
You need to pass a header specifying the data type you want returned
(Accept: application/json)
A CircleCI support member let me know that in PowerShell you have to specify an Accept header to receive the data in JSON. No wonder I was getting weird output!
So trying again with the new accept JSON header we have this command below.
Working command to get data in JSON and have it auto-convert to a PSObject.
Invoke-RestMethod -Uri https://circleci.com/api/v1.1/project/$vcs_type/$username/$project/$build_number/artifacts?circle-token=$CIRCLE_TOKEN -Method GET -ContentType 'application/json' -UseBasicParsing -Headers @{"Accept" = "application/json"}
OUTPUT
$response|select path,url
path
----
src.orig/file1.txt
src.orig/file2.txt
url
---
https://15-824975-gh.circle-artifacts.com/0/src.orig/file1.txt
https://15-824975-gh.circle-artifacts.com/0/src.orig/file2.txt
Both of the PS commands Invoke-WebRequest and Invoke-RestMethod will receive data in EDN format if you don't set the Accept header as shown above. Yay, now I can use the data as I see fit to download my artifacts.
Reply from CircleCI that got me the solution.
@burninmedia So what's being sent back is actually a data format called EDN. If you want to return JSON you'll need to pass a header specifying so (Accept: application/json). Thanks!
Here is a simple script I wrote to download all the artifacts. Please be sure you're setting the environment variables.
if ($USERNAME -eq $null) { write-host " please add required variable USERNAME" ;exit }
if ($VCS_TYPE -eq $null) { write-host " please add required variable VCS_TYPE" ;exit}
if ($CIRCLE_TOKEN -eq $null) { write-host " please add required variable CIRCLE_TOKEN" ;exit}
if ($BUILD_NUMBER -eq $null) { write-host " please add required variable BUILD_NUMBER" ;exit}
if ($PROJECT -eq $null) { write-host " please add required variable PROJECT" ;exit}
if ($BASEPATH -eq $null) { write-host " please add required variable BASEPATH" ;exit}
$response = Invoke-RestMethod -Uri https://circleci.com/api/v1.1/project/$VCS_TYPE/$USERNAME/$PROJECT/$BUILD_NUMBER/artifacts?circle-token=$CIRCLE_TOKEN -Method GET -ContentType 'application/json' -UseBasicParsing -Headers @{"Accept" = "application/json"}
ForEach ($i in $response) {
    $PATH = $(Split-Path -Path "$($BASEPATH)\$($i.path)")
    if (-Not (Test-Path $PATH)) {
        write-host "Creating folder: $($PATH)"
        New-Item -ItemType Directory -Force -Path "$($PATH)"
    }
    Write-Host "Saving artifact $($i.pretty_path) to file: $($BASEPATH)\$($i.path)"
    Invoke-RestMethod "$($i.url)?circle-token=$($CIRCLE_TOKEN)" -UseBasicParsing -OutFile "$($BASEPATH)\$($i.path)"
}
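A hypothetical invocation, with placeholder values (the token stays elided):
$USERNAME     = "myorg"
$VCS_TYPE     = "github"
$PROJECT      = "myrepo"
$BUILD_NUMBER = 1234
$BASEPATH     = "C:\artifacts"
$CIRCLE_TOKEN = "<your token>"
# ...then run the script above to mirror the build's artifacts under C:\artifacts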
Bash version
export CIRCLE_TOKEN=':your_token'
echo $(curl -s "https://circleci.com/api/v1.1/project/$vcs_type/$username/$project/$build_number/artifacts?circle-token=$CIRCLE_TOKEN") > ./artifact_json
for ((i = 0 ; i <= $(jq -c '.[].url ' ./artifact_json|wc -l) ; i++));
do
path=$(jq -c ".[$i].path" ./artifact_json|tr -d '"');
url=$(jq -c ".[$i].url" ./artifact_json|tr -d '"');
pathdir=$(dirname "$path")
echo "URL: $url"
echo "path: $path"
echo "Pathdir: $pathdir"
[ -d "$pathdir" ] || mkdir -p "$pathdir" # create the folder if it doesn't exist
wget -O "$path" "$url"
done
rm ./artifact_json

Parsing data from multiple text files into a CSV

I have a directory full of files with content similar to the below. I want to copy everything after //TEST: and before //, the date and time, and the IPO number into a CSV.
IPO 7 604 1148 17 - Psuedo text here doesnt mean anything just filler text, beep, boop.txt
werqwerwqerw
erqwerwqer
2. (test) On 7 July 2017 at 0600Z, wqerwqerwqerwerwqerqwerwqjeroisduhsuf //TEST: 37MGUI2974027//,
sdfajsfjiosauf
sadfu
(test2) On 7 July 2017 at 0600Z, blah blah //TEST: 89MTU34782374//
blah blah text here //TEST: GHO394749374// (this is uneeded)
Now, each file has multiple instances of this data, and there may be hundreds of them.
I want to output it into a CSV similar to this:
89MTU34782374, 3 July 2016 at 0640Z, IPO 7 604 1148 17
I have successfully created that with the following, and I feel like I'm on the right track:
$x = "D:\New folder\"
$s = Get-Content $x
$ipo = [regex]::Match($s,'IPO([^/)]+?) -').Groups[1].Value
$test = [regex]::Matches($s,'//TEST: ([^/)]+?)//').Groups[1].Value
$date = [regex]::Matches($s,' On([^/)]+?),').Groups[1].Value
Write-Host $test"," $date"," IPO $ipo
However, I am having trouble getting it to find and select every instance in the file and print each one onto a new line. I should also note that every text file is formatted the same way as the sample above, which is what the matching relies on.
Not only am I having issues printing each string/variable in the text document onto a new line, I'm also having trouble figuring out how to do it for multiple files.
I have tried the following, but it seems to find the terms it's looking for in the first file only, and spits them out once for every file in the directory:
$files = Get-ChildItem "D:\New folder\*.txt"
$s = Get-Content $files
for ($i=0; $i -lt $files.Count; $i++) {
$ipo = [regex]::Match($s,'IPO([^/)]+?) -').Groups[1].Value
$test = [regex]::Matches($s,'//TEST: ([^/)]+?)//').Groups[1].Value
$date = [regex]::Matches($s,' On([^/)]+?),').Groups[1].Value
Write-Host $test"," $date"," IPO $ipo
}
Does anyone have any ideas on how this could be done?
I did a bad job of explaining this.
Every document has an IPO number.
Every TEST string has a date/time associated with it.
There may be other TEST strings, but they can be ignored; they are unneeded without a date/time. I could easily clean them up if they got included in the output, though.
Every TEST + date/time combo should have the IPO number of the file it came from.
If the date and the //TEST: ...// substring always appear as pairs and in the same order, you should be able to extract both values with a single regular expression. Try something like this:
Get-ChildItem "D:\New folder\*.txt" | ForEach-Object {
    $s = Get-Content $_.FullName
    $ipo = [regex]::Matches($s,'(IPO .+?) -').Groups[1].Value
    [regex]::Matches($s,' On (.+?),[\s\S]*?//TEST: (.+?)//') | ForEach-Object {
        New-Object -Type PSObject -Property @{
            IPO  = $ipo
            Date = $_.Groups[1].Value
            Test = $_.Groups[2].Value
        }
    }
} | Export-Csv 'C:\path\to\output.csv' -NoType
Like so? Most of your code seems to be fine, if I understand your question correctly.
It's the loop that is incorrect: you repeat the same thing for the number of files found, but never actually refer to the individual files. Also, $s = ... should be inside the loop so you get the content of each file.
$files = Get-ChildItem "D:\New folder\*.txt"
foreach ($file in $files) {
    $s = Get-Content $file
    $ipo = [regex]::Match($s,'IPO([^/)]+?) -').Groups[1].Value
    $test = [regex]::Matches($s,'//TEST: ([^/)]+?)//').Groups[1].Value
    $date = [regex]::Matches($s,' On([^/)]+?),').Groups[1].Value
    Write-Host "$test, $date, IPO $ipo"
}

ftp batch file script

Hoping someone can guide me / help me.
The issue: I have 2 servers, one running Ubuntu, which hosts a website for clients to log in and download / view reports. The other is a Windows Server 2012 R2 machine which creates / stores the reports. I need to move the files from the Windows server to the Ubuntu server so clients can view them. The data is large, currently 7 GB and growing at 3 GB a year.
I need a batch file to connect using FTP and then copy the folder to a local folder. However, it only needs to copy those files which have been modified.
I have only ever written one batch file, and I can't seem to find any FTP batch scripts which copy only modified files.
You're my last resort, as I can't seem to find a coder who knows batch script (it's a dying art). I have never used PowerShell, so I would not know where to start there.
Any help or advice please let me know.
Thanks
John
You can do it in PowerShell with WinSCP. Example:
try
{
    # Load WinSCP .NET assembly
    Add-Type -Path "WinSCPnet.dll"
    # Set up session options
    $sessionOptions = New-Object WinSCP.SessionOptions -Property @{
        Protocol = [WinSCP.Protocol]::Sftp
        HostName = "example.com"
        UserName = "user"
        Password = "mypassword"
        SshHostKeyFingerprint = "ssh-rsa 2048 xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx"
    }
    $session = New-Object WinSCP.Session
    try
    {
        # Connect
        $session.Open($sessionOptions)
        # Upload files
        $transferOptions = New-Object WinSCP.TransferOptions
        $transferOptions.TransferMode = [WinSCP.TransferMode]::Binary
        $transferResult = $session.PutFiles("d:\toupload\*", "/home/user/", $False, $transferOptions)
        # Throw on any error
        $transferResult.Check()
        # Print results
        foreach ($transfer in $transferResult.Transfers)
        {
            Write-Host ("Upload of {0} succeeded" -f $transfer.FileName)
        }
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }
    exit 0
}
catch [Exception]
{
    Write-Host ("Error: {0}" -f $_.Exception.Message)
    exit 1
}
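Since the requirement is to copy only files that have changed, WinSCP's SynchronizeDirectories method may be a better fit than PutFiles; a hedged sketch using the same $session as above (folder paths are placeholders):
# push only new or changed files from the local report folder to the server
$syncResult = $session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Remote,  # local -> remote
    "D:\Reports",                          # local folder (placeholder)
    "/var/www/reports",                    # remote folder (placeholder)
    $False)                                # don't delete extra remote files
$syncResult.Check()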
This would be a way to do it in plain PowerShell. It takes files that are older than 31 days and uploads them.
function FTP-Upload {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$true)]
        [string]$Source_File,
        [Parameter(Mandatory=$true)]
        [string]$Target_File,
        [Parameter(Mandatory=$true)]
        [string]$Target_Server,
        [Parameter(Mandatory=$true)]
        [string]$Target_Username,
        [Parameter(Mandatory=$true)]
        [string]$Target_Password
    )
    $FTP = [System.Net.FTPWebRequest]::Create("ftp://$Target_Server/$Target_File")
    $FTP = [System.Net.FTPWebRequest]$FTP
    $FTP.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $FTP.Credentials = New-Object System.Net.NetworkCredential($Target_Username,$Target_Password)
    $FTP.UseBinary = $true
    $FTP.UsePassive = $true
    # read in the file to upload as a byte array
    $content = [System.IO.File]::ReadAllBytes($Source_File)
    $FTP.ContentLength = $content.Length
    # get the request stream, and write the bytes into it
    $rs = $FTP.GetRequestStream()
    $rs.Write($content, 0, $content.Length)
    # be sure to clean up after ourselves
    $rs.Close()
    $rs.Dispose()
}
$Upload_Server = "server.network.tld"
$Upload_Location = "/data/"
$Upload_Username = "ftpuser"
$Upload_Password = "ftppassword"
$Files_To_Upload = Get-ChildItem E:\Path\To\Files -Recurse | Where-Object {($_.CreationTime -le (Get-Date).AddDays(-31)) -and (!$_.PSIsContainer)}
Foreach ($File in $Files_To_Upload) {
    FTP-Upload -Source_File $File.FullName -Target_File ($Upload_Location + $File.Name) -Target_Server $Upload_Server -Target_Username $Upload_Username -Target_Password $Upload_Password
}
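Since the goal was files which have been modified, LastWriteTime may be the better property to filter on than CreationTime; a hedged variant of the selection line (the one-day window is an assumption):
# pick up files changed in the last day instead of files created 31+ days ago
$Files_To_Upload = Get-ChildItem E:\Path\To\Files -Recurse |
    Where-Object {($_.LastWriteTime -gt (Get-Date).AddDays(-1)) -and (!$_.PSIsContainer)}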
