PowerShell validation of Read-Host data entry in a while loop with -NoNewline

I have a function which inserts a Y or N menu when called within my PowerShell scripts. It uses a while loop to validate that either a Y or N value is entered. Everything works fine; however, a new line is created each time an invalid value is entered. I could use cls and redisplay everything, but that is not the most ideal solution. Instead, I would like to find a way to redisplay the Read-Host prompt on the same line while clearing any previously entered answer. Here is my existing code:
# Begin function to display yes or no menu
function ynmenu {
    $global:ans = $null
    Write-Host -ForegroundColor Cyan "`n Y. [Yes]"
    Write-Host -ForegroundColor Cyan "N. [No]`n"
    While ($ans -ne "y" -and $ans -ne "n"){
        $global:ans = Read-Host "Please select Y or N"
    }
}
# End function ynmenu
I have a few other dynamically populated menus which leverage this methodology. Finding a solution to this would resolve the issue with those as well.

I don't think there's any simple way to do that.
But for a yes/no response, you can use $PSCmdlet.ShouldContinue($Query, $Caption) instead, as long as the scope you're in (function, script, etc.) defines the attribute [CmdletBinding(SupportsShouldProcess=$true)]. This shows an appropriate yes/no prompt both in the ISE and in the console host and avoids manual input processing.
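For reference, here is a minimal sketch of what that can look like; the function name Confirm-Action and the prompt strings are placeholders, not something from the original question:
# Minimal sketch: a yes/no prompt via $PSCmdlet.ShouldContinue
function Confirm-Action {
    [CmdletBinding(SupportsShouldProcess = $true)]
    param(
        [string]$Query = "Do you want to continue?",
        [string]$Caption = "Please confirm"
    )
    # Returns $true for Yes and $false for No; the host renders the prompt.
    return $PSCmdlet.ShouldContinue($Query, $Caption)
}
if (Confirm-Action) { Write-Host "You answered Yes" } else { Write-Host "You answered No" }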

Related

Catch multiple exceptions with specific error messages leading to one exit command

I'm trying to create a program that prompts the user for their weight in pounds as a float. Once entered, I need to check the entry to ensure that it is not a string and not below the value of 10. It needs to be done in an if statement and not a loop, and I need to be able to end multiple exceptions with one quit() statement instead of a separate quit() for each exception, then continue the program if the user entered input within the correct parameters.
This is what I have so far:
isvalid = float,int
try:
    weight_pounds = float(input("Enter your weight in pounds: ")
    if weight_pounds != isvalid:
        print("Cannot be alphabetical.")
    elif weight_pounds < 10:
        print("Cannot be less than 10.")
    else:
        input("Press enter to exit...")
        quit()
I am still learning the basics of Python and may be missing something simple here, but I've tried many ways to get this to work and can't seem to get it working without either hitting a dead end or a ValueError.

How to Turn on/off Wifi using Keyboard shortcut

I tried using Automator to create a keyboard shortcut that toggles Wi-Fi.
But I get this error message:
Syntax Error
A “{” can’t go after this “)”.
This is the script I ran:
set_wifi_on_or_off() {
    networksetup -getairportpower en${n} | grep ": ${1}";
    if test $? -eq 0;
    then
        echo WiFi interface found: en${n};
        eval "networksetup -setairportpower en${n} ${2}"
        return 0;
    fi
    return 1;
}
for n in $(seq 0 10);
do
    if set_wifi_on_or_off "On" "off"; then break; fi;
    if set_wifi_on_or_off "Off" "on"; then break; fi;
done
please help
The script you've supplied is written in shell script (more specifically, bash or a close relation). It would be more helpful to post a screen shot of your Automator workflow, given that the script itself seems syntactically fine. However, you've tagged this question with applescript, so here's an AppleScript solution:
use framework "CoreWLAN"
on WiFiOn:state
    tell my CWWiFiClient's sharedWiFiClient()'s interface()
        if (state as {boolean, anything}) ¬
            is not in [true, false] ¬
            then return powerOn()
        setPower_error_(state, [])
    end tell
    WiFiOn_(null)
end WiFiOn:
This handler takes a single parameter, state. If the value passed to it is some equivalent of the boolean values true or false (these include true, yes, and 1; and false, no, and 0), then this sets the state of the WiFi to on (true) or off (false). Any other value passed to the handler performs a query on the state of the WiFi, returning true or false to indicate whether the WiFi is currently on (true) or off (false).
Therefore, to perform a toggle, you can first perform a query, then set the state to the boolean negation of the result:
set currentState to WiFiOn_(null)
WiFiOn_(not currentState)
This handler and these two lines above need to be pasted into a single Run AppleScript action. There should be no other actions in the workflow. The workflow should be set to receive No Input in any application.
Save the workflow as a Quick Action (Service), and assign it the shortcut key combo of your choice.

Octopus deploy process: share data between steps

Given a runbook/process with 3 steps
step 1 and step 2 both write some JSON data to an output variable (JSON for an object)
Set-OctopusVariable -name "SharedData" -value ($sharedObject | ConvertTo-Json)
step 2 and step 3 need to read and update the data from the previous steps
$OctopusParameters["Octopus.Action[StepA].Output.SharedData"]
$OctopusParameters["Octopus.Action[StepB].Output.SharedData"]
Assume that any secondary step can use the shared data object from any previous step. It's just an object that is manipulated by multiple steps along the way.
If I choose to skip step 2, then step 3 won't see the step 2 output var value because the read instruction requires the name of the step (StepA or StepB).
Is there a way for the output var read syntax to just get the value from the previous step instead of an explicitly named step? E.g.:
$OctopusParameters["Octopus.Action[previous step alias].Output.SharedData"]
I already tried doing this using the $OctopusParameters dictionary directly.
In one step:
$OctopusParameters["SharedData"] = ($sharedObject | ConvertTo-Json)
Then this in a subsequent step:
$sharedObject = $OctopusParameters["SharedData"] | ConvertFrom-Json
But it doesn't work. The dictionary read returns a null. The raw dictionary assignment isn't persisted between the steps. It only works using the provided Set-OctopusVariable helper or other prescribed methods, but those lock you into knowing the previous step name.
Alternatively, is there a way to store data more "globally" for a process execution for use later, without the need to tie it specifically to the output of another step of the process?
The way I approached this problem is to use the $OctopusParameters dictionary to your advantage. Since it's a dictionary, it has keys you can inspect. If you want the last output variable with a given name, just iterate over the keys and take the last match.
e.g., Suppose you have a deployment process like this:
Step A has code like this:
$sharedObject = [PSCustomObject]@{
    StepName = "Step A";
    Value = "Value from Step A";
    Message = "Step A says Hello!";
};
Set-OctopusVariable -name "SharedData" -value ($sharedObject | ConvertTo-Json)
Whilst Step B has code like this:
$sharedObject = [PSCustomObject]@{
    StepName = "Step B";
    Value = "Value from Step B";
    Message = "Step B says Hello!";
};
Set-OctopusVariable -name "SharedData" -value ($sharedObject | ConvertTo-Json)
Finally, the last step checks for the existence of any output variable ending in SharedData and iterates over each one to print the values to the log.
It then selects the last one, which is the important part. It does this so that, no matter whether Step A or Step B was skipped, it will always get the last step where the variable was set (you can obviously change this logic to suit your requirements):
$MatchingKeys = $OctopusParameters.Keys | Where-Object { $_ -match "^Octopus\.Action.*\.Output.SharedData$" }
Write-Highlight "Found $($MatchingKeys.Count) matching output variables"
foreach($matchingKey in $matchingKeys) {
    $OutputVariableValue = $OctopusParameters[$matchingKey]
    Write-Host "$matchingKey value: $OutputVariableValue"
}
Write-Host "Finding last value..."
$lastKey = $matchingKeys | Select-Object -Last 1
Write-Highlight "Last Match: $($OctopusParameters[$lastKey])"
You can also turn the above into a one-liner:
$JsonSharedData = $($OctopusParameters.Keys | Where-Object { $_ -match "^Octopus\.Action.*\.Output.SharedData$" } | Select-Object -Last 1 | ForEach-Object {$OctopusParameters[$_]})
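If a later step then needs the object rather than the raw JSON, the selected value can be converted back. This is just a small follow-on sketch; it assumes the $JsonSharedData variable populated by the one-liner above:
# Convert the JSON from the last matching output variable back into an object
if ($JsonSharedData) {
    $sharedObject = $JsonSharedData | ConvertFrom-Json
    Write-Host "Shared data came from: $($sharedObject.StepName)"
}
else {
    Write-Host "No SharedData output variable was found in any previous step."
}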
You could wrap it in a conditional depending on whether or not StepB was skipped, e.g.
#{if Octopus.Action[StepB].IsSkipped}
$OctopusParameters["Octopus.Action[StepA].Output.SharedData"]
#{else}
$OctopusParameters["Octopus.Action[StepB].Output.SharedData"]
#{/if}

powershell script with mandatory parameters keeps prompting me to enter values despite values entered in the command line [duplicate]

I'm trying to get my head around PowerShell and write a function as a cmdlet. I found the following code sample in one of the articles, but it doesn't seem to want to work as a cmdlet even though it has the [cmdletbinding()] declaration at the top of the file.
When I try to do something like
1,2,3,4,5 | .\measure-data
it returns an empty response (the function itself works just fine if I invoke it at the bottom of the file and run the file itself).
Here's the code that I am working with, any help will be appreciated :)
Function Measure-Data {
<#
.Synopsis
Calculate the median and range from a collection of numbers
.Description
This command takes a collection of numeric values and calculates the
median and range. The result is written as an object to the pipeline.
.Example
PS C:\> 1,4,7,2 | measure-data
Median Range
------ -----
3 6
.Example
PS C:\> dir c:\scripts\*.ps1 | select -expand Length | measure-data
Median Range
------ -----
1843 178435
#>
[cmdletbinding()]
Param (
[Parameter(Mandatory=$True,ValueFromPipeline=$True)]
[ValidateRange([int64]::MinValue,[int64]::MaxValue)]
[psobject]$InputObject
)
Begin {
#define an array to hold incoming data
Write-Verbose "Defining data array"
$Data=@()
} #close Begin
Process {
#add each incoming value to the $data array
Write-Verbose "Adding $inputobject"
$Data+=$InputObject
} #close process
End {
#take incoming data and sort it
Write-Verbose "Sorting data"
$sorted = $data | Sort-Object
#count how many elements in the array
$count = $data.Count
Write-Verbose "Counted $count elements"
#region calculate median
if ($sorted.count%2) {
<#
if the number of elements is odd, add one to the count
and divide by two to get the middle number. But arrays start
counting at 0, so subtract one
#>
Write-Verbose "processing odd number"
[int]$i = (($sorted.count+1)/2-1)
#get the corresponding element from the sorted array
$median = $sorted[$i]
}
else {
<#
if number of elements is even, find the average
of the two middle numbers
#>
Write-Verbose "processing even number"
$i = $sorted.count/2
#get the lower number
$x = $sorted[$i-1]
#get the upper number
$y = $sorted[-$i]
#average the two numbers to calculate the median
$median = ($x+$y)/2
} #else even
#endregion
#region calculate range
Write-Verbose "Calculating the range"
$range = $sorted[-1] - $sorted[0]
#endregion
#region write result
Write-Verbose "Median = $median"
Write-Verbose "Range = $range"
#define a hash table for the custom object
$hash = @{Median=$median;Range=$Range}
#write result object to pipeline
Write-Verbose "Writing result to the pipeline"
New-Object -TypeName PSobject -Property $hash
#endregion
} #close end
} #close measure-data
This is the article I took the code from:
https://mcpmag.com/articles/2013/10/15/blacksmith-part-4.aspx
Edit: maybe I should add that versions of this code from previous parts of the article worked just fine, but after adding all the things that make it a proper cmdlet, like the help section and verbose lines, it just doesn't want to work, and I believe something is missing. I have a feeling this could be because it was written for PowerShell 3 and I am testing it on Windows 10 with PowerShell 5.x, but honestly I don't even know in which direction to look, which is why I'm asking for help.
There is nothing wrong with the code (apart from possible optimizations), but the way you call it can't work:
1,2,3,4,5 | .\measure-data
When you call a script file that contains a named function, it is expected that "nothing happens". Actually, the script runs, but PowerShell does not know which function it should call (there could be multiple), so it just runs any code outside of functions.
You have two options to fix the problem:
Option 1
Remove the function keyword and the curly braces that belong to it. Keep the [cmdletbinding()] and Param sections.
[cmdletbinding()]
Param (
[Parameter(Mandatory=$True,ValueFromPipeline=$True)]
[ValidateRange([int64]::MinValue,[int64]::MaxValue)]
[psobject]$InputObject
)
Begin {
# ... your code ...
} #close Begin
Process {
# ... your code ...
} #close process
End {
# ... your code ...
}
Now the script itself is the "function" and can be called as such:
1,2,3,4,5 | .\measure-data
Option 2
Turn the script into a module. Basically you just need to save it with a .psm1 extension (there is more to it, but for getting started that will suffice).
In the script where you want to use the function, you have to import the module before you can use its functions. If the module is not installed, you can import it by specifying its full path.
# Import module from directory where current script is located
Import-Module $PSScriptRoot\measure-data.psm1
# Call a function of the module
1,2,3,4,5 | Measure-Data
A module is the way to go when there are multiple functions in a single script file. It is also more efficient when a function will be called multiple times, because PowerShell needs to parse it only once (it remembers Import-Module calls).
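As a rough sketch of the "there is more to it" part (this goes beyond the original answer), a .psm1 can also state explicitly which functions it exports via Export-ModuleMember; anything not exported stays private to the module:
# measure-data.psm1 -- illustrative sketch only
Function Measure-Data {
    # ... the function body from the question ...
}
# Export only the functions callers should see
Export-ModuleMember -Function Measure-Data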
It works as-is; you just need to call it properly. Since the code is now a function, you cannot call it like before, when the code was directly in the file:
# method when code is directly in file with no Function Measure-Data {}
1,2,3,4,5 | .\measure-data
Now that you've defined the function, you instead need to dot-source the file so that it loads your function(s) into memory. Then you can call your function by its name (which happens to be the same as the filename, but doesn't have to be):
# Load the functions by dot-sourcing
. .\measure-data.ps1
# Use the function
1,2,3,4,5 | Measure-Data
You're not passing it an Object but an array of integers. If you change the parameter to:
Param (
[Parameter(Mandatory=$True,ValueFromPipeline=$True)]
[ValidateRange([int64]::MinValue,[int64]::MaxValue)]
[Int[]]$InputObject
)
Now things work:
PS> 1,2,3,4,5 | Measure-Data
Median Range
------ -----
3 4

How do I check/enforce the next cmdlet to take the pipeline object

I am intrigued by this question How to sort 30Million csv records in Powershell and came up with a solution which builds temporary files.
Now I am trying to come up with another approach, which comes down to first building a sorted index list ([int[]]) and then picking a bulk of those indices (e.g. 1e6) from the source file and dropping them onto the pipeline:
Function Sort-BigCsv {
    [CmdletBinding()] param(
        [string]$FilePath,
        [String]$Property,
        [Int]$BulkSize = 1e6,
        [System.Text.Encoding]$Encoding = [System.Text.Encoding]::Default
    )
    Begin {
        if ($FilePath.StartsWith('.\')) { $FilePath = Join-Path (Get-Location) $FilePath }
        $Index = 0
        $Dictionary = [System.Collections.Generic.SortedDictionary[string, int]]::new()
        Import-Csv $FilePath -Encoding $Encoding | Foreach-Object { $Dictionary[$_.$Property] = $Index++ }
        $IndexList = [int[]]($Dictionary.Values)
        $Dictionary = $Null # we only need the sorted index list
    }
    Process {
        $Start = 0
        While ($Start -lt $IndexList.Count) {
            [System.GC]::Collect()
            $End = $Start + $BulkSize - 1
            if ($End -ge $IndexList.Count) { $End = $IndexList.Count - 1 }
            Import-Csv $FilePath -Encoding $Encoding |
                Select-Object -Index $IndexList[$Start..$End] | # Note that the -Index parameter reorders the list
                Sort-Object $Property # Consider a smarter sort as this part has already been done before
            $Start = $End + 1
        }
    }
}
Example:
Sort-BigCsv .\Input.Csv Id -BulkSize 100 # | Export-Csv .\Output.Csv
I think that the general idea behind this should work, but I have second thoughts about what PowerShell is actually doing in terms of passing the objects on to the next cmdlet (or the display), and questions arise such as:
Will every single item (including multiple items created within one Process block cycle) always immediately be picked up and processed by the next cmdlet?
Will there be any difference for this function if I put everything in the Process block into the End block?
What if the next process block is slower than the current one?
Will it stall the current one?
Or will the items be buffered?
If they are buffered, can I force them to be taken by the next cmdlet, or wait till they are consumed?
Maybe it is just working as intended (it is hard to tell from e.g. the memory size in the task manager), but I would like to confirm this...
Is there any way to check and/or control whether an item is passed on (or is this simply always the case after a Write-Output? Meaning, if the last cmdlet stalls, the first cmdlet will also need to stall...)
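Not part of the original question, but a small experiment (with made-up function names) that illustrates PowerShell's default streaming behaviour: each object written to the pipeline is handed to the downstream process block before the producer continues, so a slow consumer stalls the producer rather than letting items pile up in a buffer:
function Send-Items {
    foreach ($i in 1..3) {
        Write-Host "Upstream emits $i"
        $i   # written to the pipeline immediately, not collected until the end
    }
}
function Receive-Items {
    param([Parameter(ValueFromPipeline = $true)]$InputObject)
    process {
        Write-Host "  Downstream receives $InputObject"
        Start-Sleep -Milliseconds 500   # a slow consumer; the producer waits here
    }
}
# The output interleaves: emits 1, receives 1, emits 2, receives 2, ...
Send-Items | Receive-Items
Note that this is only the default behaviour; a downstream cmdlet like Sort-Object still has to collect all of its input before it can emit anything.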

Resources