How to change the admin name and lockout threshold with PowerShell on Win2k12 R2

I am trying to automate some regular tasks, and I need some help. Does PowerShell compile like C++, or is it a simple batch file like the old .bat?
Is there an online lint/editor for PowerShell, along the lines of JSFiddle?
Main question: I need help turning some of these tasks into a PowerShell script (with both interactive and non-interactive modes) and checking whether they succeed.
Change the user/admin name: Get-WMIObject Win32_UserAccount -Filter "Name -like 'admin*'" | Foreach-Object {$_.Rename("Dingbats")}
Set the lockout threshold to 3 attempts and the lockout duration to 45 minutes
PS C:\> Set-ADDefaultDomainPasswordPolicy -Identity SS64.com -LockoutDuration 00:40:00 -LockoutObservationWindow 00:20:00 -ComplexityEnabled $true -ReversibleEncryptionEnabled $false -MaxPasswordAge 10.00:00:00
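The example above sets the lockout duration but not the threshold itself. A hedged variant closer to the stated goal (3 attempts, 45-minute lockout), assuming the ActiveDirectory module is available and 'SS64.com' stands in for your own domain:

```powershell
# Sketch only: -LockoutThreshold is the number of failed attempts before lockout.
# The observation window must be less than or equal to the lockout duration.
Set-ADDefaultDomainPasswordPolicy -Identity SS64.com `
    -LockoutThreshold 3 `
    -LockoutDuration 00:45:00 `
    -LockoutObservationWindow 00:45:00
```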
another example
# set lockout threshold value
# how do I **change $mydomain to my server name** or local server automatically??
PS C:\> $mydomain.MaxBadPasswordsAllowed = 0
# set lockout duration value (in seconds)
PS C:\> $mydomain.AutoUnlockInterval = 1000
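To answer the inline question: one way (a sketch, not the only way) is to bind $mydomain to the local machine automatically via the WinNT ADSI provider, using the built-in $env:COMPUTERNAME variable instead of a hard-coded name:

```powershell
# Bind to the local machine by name, taken from the environment rather than hard-coded.
$mydomain = [ADSI]"WinNT://$env:COMPUTERNAME"
$mydomain.MaxBadPasswordsAllowed = 3     # lockout threshold: 3 failed attempts
$mydomain.AutoUnlockInterval     = 2700  # lockout duration: 45 minutes, in seconds
$mydomain.SetInfo()                      # commit the changes to the local SAM
```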
Enable the Windows Update service so it starts automatically at Windows startup
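For the third task, a minimal sketch using the built-in service cmdlets (wuauserv is the Windows Update service name):

```powershell
# Set the Windows Update service to start automatically, then start it now.
Set-Service -Name wuauserv -StartupType Automatic
Start-Service -Name wuauserv
(Get-Service -Name wuauserv).StartType   # should now report 'Automatic'
```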
Edit 1: I posted some code before; now I have added other snippets as requested. I am still working out the auto-start of Windows Update. The challenge seems to be that there are many ways to do the same thing in PowerShell. There is an incredible amount of power, and with it the danger of messing up your system. So I am looking for help consolidating, so I can add to and maintain the scripts on my own.

PS is a scripting language, which means it is interpreted, like Python, Ruby, Perl and, yes, CMD.EXE .BAT files. However, there is a huge difference between the capabilities of the two.
Regarding lint, there is the Set-StrictMode command to diagnose some errors that wouldn't otherwise be called out. However, a scripting language is significantly different from a language like C (to which lint is applicable). Some of the dangerous things you could do in C, which lint would diagnose, simply can't be done in a scripting language.
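A quick illustration of what Set-StrictMode catches; here, referencing an unset variable becomes a terminating error instead of silently evaluating to $null:

```powershell
Set-StrictMode -Version Latest
$total = 42
try {
    Write-Output $tota1           # note the typo: digit 1, not letter l
} catch {
    Write-Output "caught: $($_.Exception.Message)"
}
```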
As for your three items: SO is meant to help folks with coding questions, but you don't have much code posted, and it isn't clear whether the code you do have works or whether you're having trouble with it.
To get started, I suggest googling for the three tasks (or the last two, if the line of code you have works), adding the word PowerShell to your search.
You may also want to look at some tutorials on basic PS scripting. You could learn the basics in an hour or less of searching and reading.

Related

Powershell: Get network login history of past 90 days

I'm trying to get all network login history for the past 90 days.
So far I've got these 2 commands, but they still don't give me what I want, so I'm asking for help here.
The command below gives me only the network login history. One problem is that it returns data for only today and yesterday, even though it has no date restriction.
Get-WinEvent -ProviderName 'Microsoft-Windows-Security-Auditing' `
    -FilterXPath "*[EventData[Data[@Name='LogonType']='3']]"
The command below gives me data going back only a few more days, although it's supposed to cover the past 90 days.
Get-EventLog System -Source Microsoft-Windows-Winlogon -After (Get-Date).AddDays(-90)
What I'm looking for would be something like a combined command of the two commands above. I tried combining those two commands in various ways but couldn't make it work.
Thank you in advance.
You can list 2 providers in the FilterHashtable. This should work in PowerShell 7 (I don't have that auditing turned on to test). In PowerShell 5 you can only say Data = 3 (or is it 2?). You can also pipe to Format-Table -GroupBy LogName, although the header still says ProviderName.
get-winevent -FilterHashtable @{ providername = 'Microsoft-Windows-Security-Auditing',
    'Microsoft-Windows-Winlogon'; logontype = 3; starttime = (get-date).AddDays(-90) }
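If you're on Windows PowerShell 5, where named event-data keys like LogonType aren't supported in the hashtable, a slower but workable sketch is to filter afterwards. For event 4624, LogonType is the 9th data field (index 8) - an assumption worth verifying against your own events:

```powershell
Get-WinEvent -FilterHashtable @{
    ProviderName = 'Microsoft-Windows-Security-Auditing'
    Id           = 4624                     # "an account was successfully logged on"
    StartTime    = (Get-Date).AddDays(-90)
} | Where-Object { $_.Properties[8].Value -eq 3 }   # keep only network logons
```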

Windows 10 Update - No LastSuccessTime in Registry

I have a Nagios plugin, https://outsideit.net/check-ms-win-updates/, which checks when the last WSUS update was installed successfully. This is based on a 'LastSuccessTime' string value in a registry key located here: 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update\Results\Install'
$LastSuccessTimeFolder = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update\Results\Install'
$LastSuccessTimeValue = Get-ItemProperty -Path $LastSuccessTimeFolder -Name LastSuccessTime | Select-Object -ExpandProperty LastSuccessTime
This key does not seem to exist on Windows 10. So how can I get the LastSuccessTime date/time from a Windows 10 PC?
Not the best solution, but parsing C:\Windows\Logs\WindowsUpdate will easily get you a list of the last times updates were checked for.
Figuring out whether a run succeeded would require parsing the logs themselves; how hard that is depends on whether the "Exit code" at the end changes between success and failure. Since I don't need that for my current purposes, I'll leave it to some future expert to decipher.
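Another avenue on Windows 10 (hedged - I haven't verified it against every build) is the Windows Update Agent COM object, which exposes the same information without the registry key:

```powershell
# Microsoft.Update.AutoUpdate is the Windows Update Agent's automation object.
$au = New-Object -ComObject 'Microsoft.Update.AutoUpdate'
$au.Results.LastInstallationSuccessDate   # last successful install (UTC DateTime)
$au.Results.LastSearchSuccessDate         # last successful check for updates
```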

Powershell statement to enter debugger

There doesn't appear to be an easy way to halt execution and enter the ISE debugger from a PowerShell script. Currently I do the following:
Set-PSBreakPoint -command BreakIntoDebug | Out-Null # at start of script.
function BreakIntoDebug {} # elsewhere in code.
BreakIntoDebug # wherever I want to go into debugger.
However, this is awkward. At the breakpoint I need to hit F10 twice to see where it was called from, and then use "exit" to continue running the program. Is there a better way? I know someone will tell me this is a bad way to debug, but there are times when it is the best way to find a very rare bug that only appears in a specific code path.
Your way is pretty clever. :-) Take a look at the help on the Set-PSDebug command as another way of tracing/debugging the execution of your script.
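For what it's worth, PowerShell 5.0 added a cmdlet that does exactly this: Wait-Debugger stops execution at the next statement and breaks into whatever debugger is attached (ISE, VS Code, or the console debugger), with the call stack intact - no dummy function or F10 dance needed:

```powershell
function Invoke-Work {
    $x = 1
    Wait-Debugger    # execution halts here; step, or type 'c'/'exit' to continue
    $x + 1
}
```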

Why is this PowerShell code (Invoke-WebRequest / getElementsByTagName) so incredibly slow on my machines, but not others?

I wrote some screen-scraping code in PowerShell and was surprised that it took around 30 seconds to parse a few HTML tables. I stripped it down to try and figure out where all the time was being spent, and it seems to be in the getElementsByTagName calls.
I've included a script below which, on my home desktop, my work desktop and my home slate, takes around 1-2 seconds per iteration (full results pasted below). However, other people in the PowerShell community are reporting far shorter times (tens of milliseconds per iteration).
I'm struggling to find any way of narrowing down the problem, and there doesn't seem to be a pattern to the OS/PS/.NET/IE versions.
The desktop I'm currently running it on is a brand new Windows 8 install with only PS3 and .NET 4.5 installed (and all Windows Update patches). No Visual Studio. No PowerShell profile.
$url = "http://www.icy-veins.com/restoration-shaman-wow-pve-healing-gear-loot-best-in-slot"
$response = (iwr $url).ParsedHtml
# Loop through the h2 tags
$response.body.getElementsByTagName("h2") | foreach {
# Get the table that comes after the heading
$slotTable = $_.nextSibling
# Grab the rows from the table, skipping the first row (column headers)
measure-command { $rows = $slotTable.getElementsByTagName("tr") | select -Skip 1 } | select TotalMilliseconds
}
Results from my desktop (the work PC and slate give near identical results):
TotalMilliseconds
-----------------
1575.7633
2371.5566
1073.7552
2307.8844
1779.5518
1063.9977
1588.5112
1372.4927
1248.7245
1718.3555
3283.843
2931.1616
2557.8595
1230.5093
995.2934
However, some people in the Google+ PowerShell community reported results like this:
TotalMilliseconds
-----------------
76.9098
112.6745
56.6522
140.5845
84.9599
48.6669
79.9283
73.4511
94.0683
81.4443
147.809
139.2805
111.4078
56.3881
41.3386
I've tried both PowerShell ISE and a standard console; no difference. For the work being done, these times seem excessive, and judging by the posts in the Google+ community, it can go quicker!
See my comment in: https://connect.microsoft.com/PowerShell/feedback/details/778371/invoke-webrequest-getelementsbytagname-is-incredibly-slow-on-some-machines#tabs
I got the same slowness running the script as 64-bit, but when running in 32-bit mode everything is very fast!
Lee Holmes was able to reproduce the issue, and here is his writeup:
"The issue is that he’s piping COM objects into another cmdlet – in this case, Select-Object. When that happens, we attempt to bind parameters by property name. Enumerating property names of a COM object is brutally slow – so we’re spending 86% of our time on two very basic CLR API calls:
(…)
// Get the function description from a COM type
typeinfo.GetFuncDesc(index, out pFuncDesc);
(…)
// Get the function name from a COM function description
typeinfo.GetDocumentation(funcdesc.memid, out strName, out strDoc, out id, out strHelp);
(…)
We might be able to do something smart here with caching.
A workaround is to not pipe into Select-Object, but instead use language features:
# Grab the rows from the table, skipping the first row (column headers)
$allRows = @($slotTable.getElementsByTagName("tr"))
$rows = $allRows[1..$allRows.Count]
"
Did you try disabling progress?
$ProgressPreference = "SilentlyContinue"
In my case this solved serious performance problems with Invoke-WebRequest.
I have noticed the same phenomenon on a new Windows 8 system. I have a 104 MB file stored on a remote web server that takes 2 to 5 minutes to download, depending on server load, bandwidth, etc. I tried it through FF, IE10, Chrome, cURL, and even set up a test FTP server and tried the Windows FTP command. I consistently get the same results.
However, with Invoke-WebRequest the same file (this is not an exaggeration) takes nearly 3 HOURS to transfer.
$file = 'C:\User\me\Desktop\file.mp4'
$site = 'http://my.site/file.mp4'
Invoke-WebRequest $site -Method Get -OutFile $file
Seems pretty cut-and-dry -- site's not https, file's not an executable or anything that Windows might see as 'unsafe', and there's no authentication needed. It just takes forever to finish.
I thought my AV's real-time scanning might be the culprit, but disabling it made no discernible difference. Is there maybe some memory allocation at work here, similar to how adding the -ReadCount option to Get-Content makes reading large files much faster? I can't find any such option for Invoke-WebRequest.
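One commonly suggested workaround (a sketch, reusing the paths from the question above): skip Invoke-WebRequest for big binaries and let System.Net.WebClient stream straight to disk, which avoids both the in-memory buffering and the per-read progress updates:

```powershell
$file = 'C:\User\me\Desktop\file.mp4'
$site = 'http://my.site/file.mp4'
$wc = New-Object System.Net.WebClient
try {
    $wc.DownloadFile($site, $file)   # streams the response body directly to $file
} finally {
    $wc.Dispose()
}
```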

Process Management w/ bash/terminal

Quick bash/terminal question -
I work a lot on the command line, but have never really had a good idea of how to manage running processes with it. I am aware of 'ps', but it always gives me an exceedingly long and esoteric list of junk, including something like 30 Google Chrome workers, and I always end up going back to Activity Monitor to get a clean look at what's actually going on.
Can anyone offer a bit of advice on how to manage running processes from the command line? Is there a way to get a clean list of what you've got running? I often use 'killall' on process names I know as a quick way to get rid of something that's freezing up - can I get those names to display in the terminal, rather than the strange long names and numbers that ps shows by default? And can I search for a specific process, or a quick regex of one, like '*ome'?
If anyone has the answers to these three questions, that would be amazingly helpful to many people, I'm sure : )
Thanks!!
Yes, grep is good.
I don't know exactly what you want to achieve, but do you know the top command? It gives you a dynamic view of what's going on.
On Linux you have plenty of commands that should help you get what you want in a script, and piping commands together is one of the basics taught when studying IT.
You can also have a look at the man page for jobs, and I would advise reading some articles on process-management basics. :)
Good luck.
ps -o command
will give you a list of just the process names (more exactly, the commands that invoked the processes). Use grep to search, like this:
ps -o command | grep ".*ome"
There may be scripts out there, but for example, if you're seeing a lot of Chrome processes you're not interested in, something as simple as the following would help:
ps aux | grep -v chrome
Other variations could help show each image only once, so you get one chrome, one vim, etc. (google "show unique rows" with perl, python or sed, for example).
You could also use ps to specify one username, so you filter out system processes, or handle the case where more than one user is logged in to the machine.
ps is quite versatile with its command-line arguments; a little digging turns up a lot of nice tweaks and flags, in combination with other tools such as perl and sed.
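Putting the above together, a few hedged one-liners (BSD/macOS ps syntax; pgrep and pkill ship with macOS and most Linux distros):

```shell
# One line per distinct command name, most frequent first:
ps -axo comm | sort | uniq -c | sort -rn | head

# Find processes by partial name (prints PID and name), then kill the same matches:
pgrep -l ome          # matches e.g. Chrome helpers
pkill ome             # sends SIGTERM to those matches - use with care
```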
