I am trying to learn PowerShell.
$newuser = 'ash', 'bob', 'charlie', 'david'
$newuser | foreach({New-LocalUser -Name $_newuser -Password "123123" -Description "Students"})
What I am trying to do is create a new local account for each name in the list $newuser, with the password and description I have.
I have googled a lot and tried other things, but no luck.
Do I need to create a list using this syntax? $newuser = @(ash, bob, charlie, david)
Also, I am confused about what to put after -Name.
I tried things like -Name $newuser, -Name $__.newuser, -Name $_.($newuser), etc.
I have been looking for any way to reference each item in $newuser when creating each user account.
Any suggestion will be appreciated!!
You're close to getting where you want!
Declaring Arrays: You'll notice that when you try to use $newuser = @(ash, bob, charlie, david) in PowerShell ISE or VSCode, you'll get a syntax error. This is because you need to place quotes around each item. The way you've declared it originally is no different in practice; it is simply a matter of how you want to define arrays. With large arrays that span multiple lines, it can sometimes be easier to keep track of where the array ends when it is wrapped in @().
Understanding your variables: You can test the two array declaration methods by trying both and checking the result with the GetType() method, like so: $newuser.GetType()
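For instance (a quick illustration; exact output can vary slightly by PowerShell version):
$newusers = 'ash', 'bob', 'charlie', 'david'      # comma syntax
$newusers = @('ash', 'bob', 'charlie', 'david')   # @() syntax; same result
$newusers.GetType()                               # both report an Object[] with BaseType System.Array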
Naming Conventions: Try to use plurals for variable names when dealing with something that contains multiple objects, like an array. So in this case, $newuser should be $newusers, and try to make your variables descriptive of what they contain. (You've done that nicely here)
$PSItem and $_: When you iterate over an array, there is a nifty PowerShell variable called $PSItem, also called by alias $_. This variable contains the current object you are iterating over. See the example below for how I might do something like this, but also keep in mind there are multiple ways to accomplish things in PowerShell most of the time.
Passwords: The cmdlet you are using here will not take a plain string as a password; it requires a SecureString. You can force the string to a SecureString in the script, but only do this for practice. A better approach is to read the password from a file that you shred afterwards, or from an external password manager that you call when you need the value. Keeping the password in plain text is never a good idea.
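For example, one simple way to avoid hard-coding the password at all (just a sketch) is to prompt for it as a SecureString:
# Prompt interactively instead of embedding the password in the script
$securePassword = Read-Host -Prompt 'Password for the new accounts' -AsSecureString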
Method 1
$newUsers = 'ash', 'bob', 'charlie', 'david'
$securePassword = ConvertTo-SecureString '123123' -AsPlainText -Force
foreach ($user in $newUsers) {
    New-LocalUser -Name $user -Password $securePassword -Description 'Students'
}
Method 2
$securePassword = ConvertTo-SecureString '123123' -AsPlainText -Force
'ash', 'bob', 'charlie', 'david' | ForEach-Object {New-LocalUser -Name $PSItem -Password $securePassword -Description 'Students'}
This is mostly personal preference (or whatever standard your team uses), but I'd say the first is what I would use in a script, as I find it a bit more readable; the second is something I would do on the fly in a terminal.
Related
I am having a hard time with PowerShell (because I am learning it on the run). I have a huge amount of data and I am trying to find a unique identifier for every folder of data. I wrote a script which just MD5s every folder recursively and saves the hash value for every folder, but as you might have already guessed, it is super slow. So I thought I would hash only the metadata, but I have no idea how to do this in PowerShell. The ideas from the internet are not working and always return the same hash value. Has anyone had a similar problem? Is there a magic PowerShell trick for such a task?
Sorry for lack of precision.
I have a big list of ~20000 folders. Every folder contains unique data: photos, files, etc. I iterated through every folder and computed the hash of every file (I actually used a crypto stream here, so I had one hash per folder's data). This solution is taking ages.
The solution I wanted to adopt was to use the metadata, like the properties returned by this command:
Get-ChildItem -Path $Env:USERPROFILE\Desktop -Force | Select-Object -First 1 | Format-List *
But hashing this always gives me the same value, even when something has changed. I need a way to check that nothing has changed in those files.
First, create an MD5 class that does not create a new instance of System.Security.Cryptography.MD5 every time we create an MD5 from a string.
class MD5 {
    static hidden [System.Security.Cryptography.MD5]$_md5 = [System.Security.Cryptography.MD5]::Create()

    static [string]Create([string]$inputString) {
        return [BitConverter]::ToString([MD5]::_md5.ComputeHash([Text.Encoding]::ASCII.GetBytes($inputString)))
    }
}
Second, figure out a way to use each child item's Name, Length, CreationTimeUtc, and LastWriteTimeUtc to create unique ID text per child in the folder, merge it into a single string, and create an MD5 based on that resulting string.
Get the child objects of a folder.
Select only certain properties, returning the content as a string array.
Join the string array into a single string. No need for joining with newline.
Convert the string into an MD5.
Output the newly created MD5.
$ChildItems = Get-ChildItem -Path $Env:USERPROFILE\Desktop -Force
$SelectProperties = [string[]]($ChildItems | Select-Object -Property Name, Length, CreationTimeUtc, LastWriteTimeUtc)
$JoinedText = $SelectProperties -join ''
$MD5 = [MD5]::Create($JoinedText)
$MD5
Alternatively, join the above lines into one very long command.
$AltMD5 = [MD5]::Create([string[]](Get-ChildItem -Path $Env:USERPROFILE\Desktop -Force | Select-Object -Property Name, Length, CreationTimeUtc, LastWriteTimeUtc) -join '')
$AltMD5
This resulting MD5 should be a unique signature of the folder's contents; not of the folder itself, only of its contents. So, in theory, you could change the name of the folder itself and this MD5 would remain the same.
Not exactly sure how you aim to use this, but be aware that if any file, or sub-folder, in the folder changes, the MD5 for the folder will also change.
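As a usage sketch (reusing the MD5 class and the Desktop path from above), you could store a baseline hash and recompute it later to detect changes:
# Compute a baseline metadata hash for the folder
$Folder = "$Env:USERPROFILE\Desktop"
$Baseline = [MD5]::Create(([string[]](Get-ChildItem -Path $Folder -Force |
    Select-Object -Property Name, Length, CreationTimeUtc, LastWriteTimeUtc)) -join '')

# Later: recompute and compare against the baseline
$Current = [MD5]::Create(([string[]](Get-ChildItem -Path $Folder -Force |
    Select-Object -Property Name, Length, CreationTimeUtc, LastWriteTimeUtc)) -join '')

if ($Current -ne $Baseline) { 'Folder contents changed' } else { 'No change detected' }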
Continuing from my comment.
As per this resource
3rdP tool: http://www.idrix.fr/Root/Samples/DirHash.zip
function Get-FolderHash ($folder)
{
    dir $folder -Recurse | ?{ !$_.psiscontainer } |
        %{ [Byte[]]$contents += [System.IO.File]::ReadAllBytes($_.fullname) }
    $hasher = [System.Security.Cryptography.SHA1]::Create()
    [string]::Join("", $($hasher.ComputeHash($contents) |
        %{ "{0:x2}" -f $_ }))
}
Note, that I've not tested/validated either of the above and will leave that to you.
Lastly, this is not the first time this kind of question has been asked on SO, using the default cmdlet and some .NET, so this could be seen/marked as a duplicate.
$HashString = (Get-ChildItem C:\Temp -Recurse |
    Get-FileHash -Algorithm MD5).Hash |
    Out-String
Get-FileHash -InputStream ([IO.MemoryStream]::new([char[]]$HashString))
Original, faster but less robust, method:
$HashString = Get-ChildItem C:\script\test\TestFolders -Recurse | Out-String
Get-FileHash -InputStream ([IO.MemoryStream]::new([char[]]$HashString))
This could be condensed into one line if wanted, although it starts getting harder to read:
Get-FileHash -InputStream ([IO.MemoryStream]::new([char[]]"$(Get-ChildItem C:\script\test\TestFolders -Recurse|Out-String)"))
Whether it's faster or fast enough for your use case is a different matter. Yet, it does address ensuring you get a different hash based on target folder changes.
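To apply something like this across your folder tree, a rough sketch (the folder root and the property list are my assumptions) would hash only each folder's directory listing rather than the file contents:
# Sketch: one metadata-based hash per top-level folder under an assumed root path
$Root = 'C:\Data'
Get-ChildItem -Path $Root -Directory | ForEach-Object {
    # Hash the listing (names, sizes, timestamps) instead of reading every file
    $listing = Get-ChildItem -Path $_.FullName -Recurse -File |
        Select-Object Name, Length, LastWriteTimeUtc | Out-String
    $hash = Get-FileHash -Algorithm MD5 -InputStream (
        [IO.MemoryStream]::new([Text.Encoding]::UTF8.GetBytes($listing)))
    [PSCustomObject]@{ Folder = $_.FullName; Hash = $hash.Hash }
}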
I've just started using PowerShell and I have a task where I need to be able to have the file path displayed on screen when I enter the file name.
Is there a script that allows me to do the below?
Ex 1: I enter "test.txt" and I get "C:\Program Files...."
Ex 2: I enter a file name "My Documents" and I also get its path.
I have searched online on how to do this but I didn't quite find what I was looking for and all the queries/answers were too complicated for me to understand.
Can anyone help me out, please?
Thanks in advance!
Here is a starter sample for you.
This example searches only within the confines of the paths present in the Path system environment variable. It also only looks for files and does not recurse through these paths.
So anything you could access directly from the command line should be available to you through it.
Now, if you want to search the whole drive, you could replace the $DefaultPaths assignment with Get-ChildItem -Path 'C:\' -Recurse, but doing that each time won't be very efficient.
You could do it and it will work... but it will be slow.
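For reference, a brute-force whole-drive search could look something like this (slow; the file name is just an example):
# Search all of C:\ for a file name and print the folders that contain it
Get-ChildItem -Path 'C:\' -Filter 'test.txt' -File -Recurse -ErrorAction SilentlyContinue |
    Select-Object -ExpandProperty DirectoryName -Unique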
To search the whole drive or whole filesystem, there are alternative methods that might work better. Some options:
Using a database, which you have to build and maintain to index all the files, so that when you search, results are instantaneous and/or very fast.
Parsing the MFT (if using a Windows / NTFS filesystem only) instead of using Get-ChildItem (this is not something natively doable through a simple cmdlet, though).
Relying on third-party software and interfacing with it (for example, Void Tools' Everything search engine already parses the MFT and builds its own database, allowing users to search a Windows NTFS filesystem instantly. It also has its own SDK you can plug into from PowerShell to retrieve what you seek instantly. The caveat is that you need the software installed first for that solution to work.)
Example: Searching through all paths defined in the Path variable
# What you are looking for. Accepts wildcard characters (*)
$Filter = 'notepad.exe'
# Get the System Environment Path variable in an array
$DefaultPaths = $env:Path -split ';'
$Paths =
    Foreach ($P in $DefaultPaths) {
        # Search for files matching the specified filter. Ignore errors (often when a path listed in the Path variable does not actually exist)
        $MatchingFiles = Get-ChildItem -Path $P -Filter $Filter -File -ErrorAction SilentlyContinue
        if ($MatchingFiles.count -gt 0) {
            $MatchingFiles.Directory.FullName
        }
    }
$Paths | out-string | Write-Host -ForegroundColor Cyan
Output for Notepad.exe search using this method.
C:\Windows\system32
C:\Windows
I'm trying to use Azure powershell to pull an SSH key and add it to a VM.
The cmdlet is
Get-AzKeyVaultKey ... -OutFile filename
I'd like to avoid actually writing the key to the disk, but I need it in a variable. Is there any way to provide a variable acting like a file or something so I can go
-OutFile $someVariablePretendingToBeFile
and use that variable please?
The variable that is returned by Get-AzKeyVaultKey is of type PsKeyVaultKey
If I get its Key property and call ToRSA(), I get an RSACryptoServiceProvider.
But I still don't see where to get the public key string from!
It's annoying because -OutFile produces exactly the public key I want.
Thanks
Since Get-AzKeyVaultKey does not provide a way of doing this (that I know of), can you get it to work with a simple:
$key=(Get-AzKeyVaultKey XXX)
to get the result into a variable? Let us know!
Not sure if this would work; it is a variant of the answer above. I can't test it just now.
$PublicKey = Get-AzKeyVaultKey -VaultName $vaultName -KeyName $keyName
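If you only need the public key material in memory, one possibility (an untested sketch, building on the ToRSA() call mentioned in the question) is to export the public RSA parameters instead of writing a file:
# Untested sketch: pull only the public portion of the key into memory
$vaultKey = Get-AzKeyVaultKey -VaultName $vaultName -KeyName $keyName
$rsa = $vaultKey.Key.ToRSA()                    # RSA object, per the question
$publicParams = $rsa.ExportParameters($false)   # $false = public parameters only
# Modulus and Exponent are byte arrays; encode them however your target format requires
$modulusB64  = [Convert]::ToBase64String($publicParams.Modulus)
$exponentB64 = [Convert]::ToBase64String($publicParams.Exponent)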
I am writing a PowerShell module to look for data that each user who has logged onto the computer at some point might have in their directory in HKEY_USERS. My initial thought was to mount HKEY_USERS, find a way to store each user's SID in a string variable, and then loop through all folders like so:
dir HKU\<STRING VARIABLE HOLDING SID>\Software\MyApp\Mydesireddata
Is there a way I can avoid having to loop through SIDs (because I won't know them ahead of time), and extract that file info from each SID on the system while remembering which SID it came from?
EDIT: Here is an example of the key I'm trying to extract from each user's SID using regedit (vncviewer's EulaAccepted)
Use Get-ChildItem to retrieve each user-specific subkey:
$UserHives = Get-ChildItem Registry::HKEY_USERS\ |Where-Object {$_.Name -match '^HKEY_USERS\\S-1-5-21-[\d\-]+$'}
Then loop over each entry and retrieve the desired registry value:
foreach($Hive in $UserHives)
{
    # Construct path from base key
    $Path = Join-Path $Hive.PSPath "SOFTWARE\MyApp\DataKey"

    # Attempt to retrieve Item property
    $Item = Get-ItemProperty -Path $Path -Name ValueName -ErrorAction SilentlyContinue

    # Check if item property was there or not
    if($Item)
    {
        $Item.ValueName
    }
    else
    {
        # doesn't exist
    }
}
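If you also want to remember which SID each value came from (as the question asks), one hedged variation is to emit an object per hive; the key path and value name below are the same placeholders used above:
foreach($Hive in $UserHives)
{
    $Path = Join-Path $Hive.PSPath "SOFTWARE\MyApp\DataKey"
    $Item = Get-ItemProperty -Path $Path -Name ValueName -ErrorAction SilentlyContinue
    if($Item)
    {
        # PSChildName of the hive entry is the SID itself
        [PSCustomObject]@{
            SID   = $Hive.PSChildName
            Value = $Item.ValueName
        }
    }
}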
I tackled this issue a slightly different way, preferring to make use of a conspicuously placed wildcard.
Get-ItemProperty -Path Registry::HKEY_USERS\*\SOFTWARE\TestVNC\viewer\ -Name EulaAccepted |
    Select-Object -Property @{n="SID";e={$_.PSPath.Split('::')[-1].Split('\')[1]}},EulaAccepted
The wildcard will automatically check all available paths and return what you need as well as the SID from the parent path.
As for the username (which is probably more useful than a SID), you didn't specifically ask for it, but I added it in for grins; this should cover local and domain accounts.
mind the line breaks
Get-ItemProperty -Path Registry::HKEY_USERS\*\SOFTWARE\TestVNC\viewer\ -Name EulaAccepted |
    Select-Object -Property @{n="SID";e={$_.PSPath.Split('::')[-1].Split('\')[1]}},EulaAccepted |
        Select-Object -Property @{n="User";e={[System.Security.Principal.SecurityIdentifier]::new($_.SID).`
            Translate([System.Security.Principal.NTAccount]).Value}},SID,EulaAccepted
Getting the username is just ugly; there's likely a cleaner way to get it, but that's what I have in my head. The double select really makes my skin crawl; there's something unpleasant about it. I could do it in one shot, but then it gets so unwieldy you don't even know what you're doing by looking at it.
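For what it's worth, one possible single-pass alternative (an untested sketch) builds the object once per hit instead of chaining two Select-Object calls:
Get-ItemProperty -Path Registry::HKEY_USERS\*\SOFTWARE\TestVNC\viewer\ -Name EulaAccepted |
    ForEach-Object {
        # PSPath looks like ...::HKEY_USERS\<SID>\SOFTWARE\..., so the third backslash-separated segment is the SID
        $sid = $_.PSPath.Split('\')[2]
        [PSCustomObject]@{
            User         = [System.Security.Principal.SecurityIdentifier]::new($sid).Translate([System.Security.Principal.NTAccount]).Value
            SID          = $sid
            EulaAccepted = $_.EulaAccepted
        }
    }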
I've included a screenshot of the registry below, and a screenshot of the screen output from running the few lines.
We have Octopus 2.0.
We stored the service credentials (they are per machine) in Octopus user variables.
I need to create these logins in SQL Server as well.
For example, the service login name "machine1_service1" is stored as the variable name, and the password of that login is stored under the Variable Value column in Octopus.
So far I know that to get any variable value from Octopus we need to provide the exact variable name, but in this case I actually need to get a list of all these variables.
Is there a way to accomplish this?
Yes.
Octopus variables are accessible from a dictionary object that can be enumerated. If you follow a naming convention, you can query the dictionary with PowerShell using something like the following. This would be called from within a custom step, or somewhere you can write your own PowerShell, e.g. a PostDeploy.ps1 script in the NuGet package.
Let's say the variables are defined like this
You can use this PowerShell to get to them and enumerate over them:
# Get service accounts
$serviceAccounts = $OctopusParameters.keys | ? {$_ -like "service-*"}
write-host "Accounts found:" $serviceAccounts.count
foreach($account in $serviceAccounts)
{
    write-host "Account: $account"
    $password = $OctopusParameters[$account]
    write-host "Password: $password"
}
Hope this helps.