I have a Nagios plugin https://outsideit.net/check-ms-win-updates/ which checks when the last WSUS update was installed successfully. This is based on the string 'LastSuccessTime' in a registry key located here: 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update\Results\Install'
$LastSuccessTimeFolder = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update\Results\Install'
$LastSuccessTimeValue = Get-ItemProperty -Path $LastSuccessTimeFolder -Name LastSuccessTime | Select-Object -ExpandProperty LastSuccessTime
This key does not seem to be available on Windows 10. So how can I get the LastSuccessTime date/time from a Windows 10 PC?
Not the best solution, but parsing C:\Windows\Logs\WindowsUpdate will easily get you a list of the last times that updates were checked for.
Figuring out whether a check was successful would require parsing the logs themselves. How hard that is depends on whether the "Exit code" at the end changes based on success or failure. Since I don't need that for my current purposes, I'll leave it to some future expert to decipher.
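A minimal sketch of that log-listing approach (assuming the default log location; note that on recent Windows 10 builds these are ETL files, which Get-WindowsUpdateLog can render to text):

```powershell
# List the Windows Update log files by last-write time.
# The timestamps give a rough history of when update activity occurred.
Get-ChildItem "$env:SystemRoot\Logs\WindowsUpdate" |
    Sort-Object LastWriteTime -Descending |
    Select-Object Name, LastWriteTime
```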
I'm trying to get all network login history of the past 90 days.
So far, I've got these 2 commands but they still don't give me what I want and I'm asking for help here.
Below gives me only the network login history. One problem is that it returns data for only today and yesterday, although the command doesn't have any date restriction.
Get-WinEvent -ProviderName 'Microsoft-Windows-Security-Auditing' -FilterXPath "*[EventData[Data[@Name='LogonType']='3']]"
Below gives me data going back only a few more days, although it's supposed to cover the past 90 days.
Get-EventLog System -Source Microsoft-Windows-Winlogon -After (Get-Date).AddDays(-90)
What I'm looking for would be something like a combined command of the two commands above. I tried combining those two commands in various ways but couldn't make it work.
Thank you in advance.
You can list two providers in the FilterHashtable. This should work in PowerShell 7; I don't have that auditing turned on to test it. In PowerShell 5 you can only say Data = 3 (or 2?), without naming the field. You can also pipe to Format-Table -GroupBy LogName, although the header still says ProviderName.
Get-WinEvent -FilterHashtable @{ ProviderName = 'Microsoft-Windows-Security-Auditing',
    'Microsoft-Windows-Winlogon'; LogonType = 3; StartTime = (Get-Date).AddDays(-90) }
My Question:
Edit: I'm specifically trying to do this programmatically through PowerShell.
How can I check whether a computer that is not currently joined to the domain (freshly imaged) has an entry in the domain, so that I can delete it prior to re-adding the computer? In other words, how do I ensure that as I image computers their old records are removed and only a single computer object exists for each machine?
I know how to check whether a computer is currently connected to a domain, and I know how to reconcile a trust relationship. Have I made a mistake in not using a script to pull the computer out of the domain as a pre-install task? I've already imaged a number of computers, but I forgot some software and will be re-imaging again, so I'm hoping to get this implemented to remove all duplicate entries on the next go-around.
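For reference, a minimal sketch of the existence check (assuming the RSAT ActiveDirectory module and suitable credentials; the computer name is just an illustration):

```powershell
Import-Module ActiveDirectory

# Hypothetical name of the freshly imaged machine.
$name = 'LAB-PC-042'

# -Filter returns nothing (rather than throwing) when no object matches.
$existing = Get-ADComputer -Filter "Name -eq '$name'"
if ($existing) {
    # Remove the stale record before re-joining the machine to the domain.
    Remove-ADComputer -Identity $existing.DistinguishedName -Confirm:$false
}
```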
Background:
I've recently stepped into a network and domain that has less than 10% documentation available. It was one of those lovely systems that was more duct tape than design, built as-needed using whatever tutorials were available at the time.
In attempting to fix some of the more glaring issues, I've prepared a fresh Windows 10 image.
Now, all the computers are already in the domain, but because of the organizational issues, I'm using a PowerShell script to re-add them and put them into better-defined OUs with names that make more sense. Previously, devices were named after the individuals who used them, but with so many staff changes that system is completely ineffective for tracking devices.
Previous Search Attempts:
I've tried searching for post-image tasks for checking whether a computer already exists before adding it to a domain, but the only results I've been able to find, both through Google and Spiceworks, are those that simply cover adding a computer to or removing it from the domain, plus the ones on repairing the trust relationship.
It may be simply that I'm not aware of the correct phrasing for the question, so if this has been answered before I would be ecstatic for someone to post the link for me.
Thank you for reading!
The solution ended up being to simply add the computers to the domain and then after 90 days I ran a script to purge anything that hadn't reported into Active Directory. I did the same thing with accounts at a 180 day mark.
It meant I had ~10,000 extra items in Active Directory and had a negative impact on speed, but now that everything is properly tracked I won't have to repeat the process next year.
I made certain to back up all items using ldifde in the event that I accidentally deleted a machine that was only used at certain times of the year such as the ISTEP test caching machines. Being able to back them up made me much more willing to bypass reconciling the entries and to simply purge them.
Here is the script in case anyone comes across the same issue:
#I always set the location at the start of a script in case I forget at the end of a previous script.
Set-Location C:\
#ActiveDirectory is the only necessary module for this script.
Import-Module ActiveDirectory
#While this is not 100% secure, I still feel it is better than leaving the credentials in plaintext within the code.
$keyfile = "\\[domain controller]\GPO Deployment Files\Scripts\AES.txt"
[byte[]] $key = ([byte array key])
$secure = Get-Content $keyfile | ConvertTo-SecureString -Key $key
$username = "[domain\admin]"
$cred = new-object -typename System.Management.Automation.PSCredential `
-argumentlist $username, $secure
#Establishes the 90 day cutoff based off the current day.
$datecutoff = (Get-Date).AddDays(-90)
#As I wanted to go district-wide, I did not bother adding an OU based filter.
$computers = Get-ADComputer -Filter {LastLogonDate -lt $datecutoff} -Properties LastLogonDate | Sort-Object LastLogonDate | Select-Object DistinguishedName,Name,LastLogonDate
#This is a test CSV - I ran all code up until this line so I could verify the list with a quick visual skim.
$computers | Add-Content -Path "\\[domain controller]\DeletedADItems\test.csv"
#Stepping through the list
Foreach ($computer in $computers) {
    #I know there are likely more elegant ways to do this, but using the -replace operator was the simplest solution I could come up with at the time.
    $computername = $computer.Name -replace '@{Name=',''
    $computername = $computername -replace '}',''
    $computerDN = $computer.DistinguishedName -replace '@{DistinguishedName=',''
    $computerDN = $computerDN -replace '}',''
    <# I discovered that LDIFDE doesn't always play nicely with PS drives and such, or at least it failed to do so in my case.
    So I back up the files to my local drive and then copy them to the domain controller path that I used with the test.csv file. #>
    $exportFile = "C:\DeletedADItems\$computername.ldf"
    #Prior to deletion, export the object to the export file location.
    ldifde -d "$computerDN" -f "$exportFile"
    #Finally, remove the item with -Confirm:$False so you don't have to confirm hundreds of deletions.
    Remove-ADComputer -Identity "$computerDN" -Confirm:$False
}
It is mostly self-explanatory, but I added some basic comments just in case.
I am trying to automate some regular tasks, and I need some help. Does PowerShell compile like C++, or is it a simple batch file like the old .bat?
Is there an online lint/editor for PowerShell, like jsfiddle?
Main question: I need help automating some of these into a PowerShell script (both interactive and non-interactive modes) and checking whether they succeed.
Change user/admin name: Get-WmiObject Win32_UserAccount -Filter "Name -like 'admin*'" | ForEach-Object { $_.Rename("Dingbats") }
Turn on the lockout threshold at 3 attempts and set the lockout duration to 45 minutes
PS C:\> Set-ADDefaultDomainPasswordPolicy -Identity SS64.com -LockoutDuration 00:40:00 -LockoutObservationWindow 00:20:00 -ComplexityEnabled $true -ReversibleEncryptionEnabled $false -MaxPasswordAge 10.00:00:00
another example
# set lockout threshold value
# how do I **change $mydomain to my server name** or local server automatically??
PS C:\> $mydomain.MaxBadPasswordsAllowed = 0
# set lockout duration value (in seconds)
PS C:\> $mydomain.AutoUnlockInterval = 1000
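On the $mydomain question in the comments: one possible way to bind to the local machine automatically, rather than hard-coding a server name, is the classic WinNT ADSI provider, which (as far as I know) exposes these lockout properties on the machine object. A sketch, with the threshold and duration values from the task above:

```powershell
# Bind to the local machine via the WinNT ADSI provider;
# $env:COMPUTERNAME resolves the machine name automatically.
$mydomain = [ADSI]"WinNT://$env:COMPUTERNAME,computer"

# Account-lockout settings, as in the snippets above.
$mydomain.MaxBadPasswordsAllowed = 3   # lockout threshold (attempts)
$mydomain.AutoUnlockInterval = 2700    # lockout duration in seconds (45 min)
$mydomain.SetInfo()                    # commit the changes
```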
Turn on/enable the Windows Update service so it starts automatically on Windows startup
..
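For the Windows Update service item, a minimal sketch using the service cmdlets (wuauserv is the Windows Update service name) might look like:

```powershell
# Configure the Windows Update service to start automatically at boot,
# then start it immediately.
Set-Service -Name wuauserv -StartupType Automatic
Start-Service -Name wuauserv
```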
Edit 1: I posted some code before; now I have added other snippets as requested. I am still working on figuring out the auto-start of Windows updates. The challenge seems to be that there are many ways to do the same thing in PowerShell. There is an incredible amount of power, and with it the danger of messing up your system. So I am looking for help consolidating these so I can add to and maintain the scripts on my own.
PS is a scripting language, which means it is interpreted, like Python, Ruby, Perl and, yes, CMD.EXE .BAT files. However, there is a huge difference between the capabilities of PowerShell and .BAT files.
Regarding lint, there is the Set-StrictMode cmdlet to diagnose some errors that wouldn't otherwise be called out. However, a scripting language is significantly different from a language like C (to which lint applies). Some of the dangerous things you could do in C, which lint would diagnose, simply can't be done in a scripting language.
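For example, a quick sketch of the kind of mistake Set-StrictMode catches:

```powershell
Set-StrictMode -Version Latest

# Without strict mode, a mistyped variable silently evaluates to $null;
# with strict mode on, the reference below throws a runtime error instead.
try {
    $null = $undefinedVariable
    'no error'
} catch {
    'error caught'
}
```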
As far as your three items go, SO is meant to help folks with coding, but you don't have much code posted, and it isn't clear whether the code you do have works or whether you're having trouble with it.
To get started I suggest googling for the three tasks (or last two if the line of code you have works), but add the word Powershell to your search.
You may also want to look at some tutorials on basic PS script. You could learn basic scripting in an hour or less of searching and reading.
I'm wondering what language you would recommend. I have been talking to some friends and they recommended against CMD.
Basically, I want to check %SystemRoot%\SYSTEM32\SPOOL\PRINTERS and see if there are any files older than 30 seconds or 1 minute (the exact number will be fixed). If there are, delete the files.
If someone could guide me in the best way to program this, that would be great :)
PowerShell is your friend. Part of your description is a little bit unclear, but here is what a script doing what I understand you want would resemble:
dir $env:SystemRoot\SYSTEM32\SPOOL\PRINTERS |
    where { ((Get-Date) - $_.CreationTime).TotalSeconds -gt 30 } |
    Remove-Item -Force
I wrote some screen-scraping code in PowerShell and was surprised that it took around 30 seconds to parse a few HTML tables. I stripped it down to try and figure out where all the time was being spent, and it seems to be in the getElementsByTagName calls.
I've included a script below which on both my home desktop, my work desktop and my home slate, takes around 1-2 seconds for each iteration (full results pasted below). However, other people in the PowerShell community are reporting far shorter times (only several milliseconds for each iteration).
I'm struggling to find any way of narrowing down the problem, and there doesn't seem to be a pattern to the OS/PS/.NET/IE versions.
The desktop I'm currently running it on is a brand new Windows 8 install with only PS3 and .NET 4.5 installed (and all Windows Update patches). No Visual Studio. No PowerShell profile.
$url = "http://www.icy-veins.com/restoration-shaman-wow-pve-healing-gear-loot-best-in-slot"
$response = (iwr $url).ParsedHtml
# Loop through the h2 tags
$response.body.getElementsByTagName("h2") | foreach {
# Get the table that comes after the heading
$slotTable = $_.nextSibling
# Grab the rows from the table, skipping the first row (column headers)
measure-command { $rows = $slotTable.getElementsByTagName("tr") | select -Skip 1 } | select TotalMilliseconds
}
Results from my desktop (the work PC and slate give near identical results):
TotalMilliseconds
-----------------
1575.7633
2371.5566
1073.7552
2307.8844
1779.5518
1063.9977
1588.5112
1372.4927
1248.7245
1718.3555
3283.843
2931.1616
2557.8595
1230.5093
995.2934
However, some people in the Google+ PowerShell community reported results like this:
TotalMilliseconds
-----------------
76.9098
112.6745
56.6522
140.5845
84.9599
48.6669
79.9283
73.4511
94.0683
81.4443
147.809
139.2805
111.4078
56.3881
41.3386
I've tried both PowerShell ISE and a standard console, no difference. For the work being done, these times seem kinda excessive, and judging by the posts in the Google+ community, it can go quicker!
See my comment in: https://connect.microsoft.com/PowerShell/feedback/details/778371/invoke-webrequest-getelementsbytagname-is-incredibly-slow-on-some-machines#tabs
I got the same slowness running the script in 64-bit mode, but when running in 32-bit mode, everything is very fast!
Lee Holmes was able to reproduce the issue, and here is his writeup
"The issue is that he’s piping COM objects into another cmdlet – in this case, Select-Object. When that happens, we attempt to bind parameters by property name. Enumerating property names of a COM object is brutally slow – so we’re spending 86% of our time on two very basic CLR API calls:
(…)
// Get the function description from a COM type
typeinfo.GetFuncDesc(index, out pFuncDesc);
(…)
// Get the function name from a COM function description
typeinfo.GetDocumentation(funcdesc.memid, out strName, out strDoc, out id, out strHelp);
(…)
We might be able to do something smart here with caching.
A workaround is to not pipe into Select-Object, but instead use language features:
# Grab the rows from the table, skipping the first row (column headers)
$allRows = @($slotTable.getElementsByTagName("tr"))
$rows = $allRows[1..$allRows.Count]
"
Did you try disabling progress?
$ProgressPreference = "SilentlyContinue"
In my case this solved serious performance problems with Invoke-WebRequest.
I have noticed the same phenomenon on a new Windows 8 system. I have a 104 MB file stored on a remote web server that takes 2 to 5 minutes to download, depending on server load, bandwidth, etc. I tried through Firefox, IE10, Chrome, cURL, and even set up a test FTP server and tried the Windows FTP command. I consistently get the same results.
However, with Invoke-WebRequest the same file (this is not an exaggeration) takes nearly 3 HOURS to transfer.
$file = 'C:\User\me\Desktop\file.mp4'
$site = 'http://my.site/file.mp4'
Invoke-WebRequest $site -Method Get -OutFile $file
Seems pretty cut-and-dry: the site's not HTTPS, the file's not an executable or anything Windows might see as 'unsafe', and there's no authentication needed. It just takes forever to finish.
I thought my AV's real-time scanning might be the culprit, but disabling it made no discernible difference. Is there maybe some memory allocation issue at work here, similar to how adding the -ReadCount option to Get-Content makes reading large files much faster? I can't find any such option for Invoke-WebRequest.
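As the earlier answer about $ProgressPreference noted, the progress bar is a common cause of slow Invoke-WebRequest downloads, since it is redrawn for every buffer read. A sketch of two workarounds (the URL and file path are placeholders):

```powershell
$site = 'http://example.com/file.mp4'   # placeholder URL
$file = "$env:TEMP\file.mp4"            # placeholder destination

# Workaround 1: suppress the progress bar for this session before
# downloading; for large files this can cut the time dramatically.
$ProgressPreference = 'SilentlyContinue'
Invoke-WebRequest $site -OutFile $file

# Workaround 2: bypass Invoke-WebRequest entirely and stream the file
# with .NET's WebClient, which has no progress rendering overhead.
(New-Object System.Net.WebClient).DownloadFile($site, $file)
```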