I have a PowerShell script with constants defined inside the script:
Set-Variable MY_CONST -option Constant -value 123
Write-Host "Hello, World!"
Write-Host $MY_CONST
Now, when I run this script once, it is fine.
When I run the script again, I get error messages:
Set-Variable : Cannot overwrite variable MY_CONST because it is read-only or constant.
I am running inside Visual Studio Code 2017.
If I exit and re-open Visual Studio Code, it works when I run it again (and fails the second time after that ...).
If you use -Option Constant, you're telling PowerShell that the resulting variable should not allow later modification.
Therefore, running Set-Variable again with the same variable name results in an error.
That said, you would only see that symptom if your script is "dot-sourced", i.e., executed directly in the caller's scope, which means that repeated invocations see definitions left behind by previous invocations.
Some environments implicitly perform "dot-sourcing" - notably, the PowerShell ISE and - as in your case - Visual Studio Code with the PowerShell extension.
A simple workaround is to add -ErrorAction Ignore to your Set-Variable call, given that it's fair to assume that the only possible failure reason is redefinition of the constant.
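For example, applied to the constant defined in the script above, the workaround looks like this:
Set-Variable MY_CONST -Option Constant -Value 123 -ErrorAction Ignore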
More generally, in environments such as the PowerShell ISE and Visual Studio Code, be aware that a given script's invocation may leave definitions behind that affect subsequent invocations.
By contrast, this is not a concern when invoking a script repeatedly from a PowerShell console/terminal window, because scripts there run in a child scope.
mhhollomon asks if using scope modifiers such as $script:... would work:
No, because when a script is dot-sourced into the global scope, as happens in Visual Studio Code, the script scope ($script:..., or Set-Variable -Scope Script ...) is that same scope, i.e., the global scope too.
If you did want to explicitly ensure that your script doesn't modify the calling scope even when "dot-sourced", you can wrap the entire script's content in & { ... } to ensure execution in a child scope.
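For instance, the script from the question could be wrapped as follows (a sketch; the body is unchanged, only enclosed in & { ... }):
& {
  Set-Variable MY_CONST -Option Constant -Value 123
  Write-Host "Hello, World!"
  Write-Host $MY_CONST
}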
Related
The variable in the Terraform file (infrastructure.tf) is declared like this:
variable "tags" {
  type = map(string)
}
This is the PowerShell code that executes the terraform command-line program with the plan command:
$command = "plan"
$options = @(
  "'--var=tags={a:\`"b\`"}'"
  "--out=path/to/out.tfplan"
)
Write-Host terraform,$command,$options
& terraform $command $options
The Write-Host command's output is:
terraform plan '--var=tags={a:\"b\"}' --out=path/to/out.tfplan
If I copy-paste that into an interactive PowerShell 7.2 (pwsh) session, it works. But the & terraform command fails with this error:
Error: Too many command line arguments
To specify a working directory for the plan, use the global -chdir flag.
I know the Terraform documentation for the plan command warns against using PowerShell to run the command, due to quoting issues (emphasis mine):
PowerShell on Windows cannot correctly pass literal quotes to external programs, so we do not recommend using Terraform with PowerShell when you are on Windows. Use Windows Command Prompt instead.
Unfortunately they don't specify if they mean Windows PowerShell 5.1 included in Windows (powershell.exe), or also PowerShell 7.2 running on Windows (pwsh). But with pwsh it is clearly possible to run the plan command in an interactive PowerShell session (I am using macOS, but some others use Windows) and pass literal quotes to an external program. It just seems that running the same command from a .ps1 file does not work.
Since our whole dev/ops tooling is PowerShell-based, I'd like to know: is this possible?
Because if it is not, then we will have to work around this limitation.
Edit:
Some things I've tried:
Use splatting for $options (e.g. @options)
Add the stop-parsing token (e.g. --%) as the first item in $options (that token is actually "only intended for use on Windows platforms")
Many variations of single/double quotes
Remove the \ (I am not actually sure why it seems required; the map literal syntax is either {"x"="y"} or {x:"y"}), but without the \, copy-pasting the printed command line into an interactive PowerShell also does not work.
tl;dr
Omit the embedded enclosing '...' around --var=..., because they will become a literal part of your argument.
The unfortunate need to manually \-escape the embedded " instances, even though PowerShell itself does not need it, is the result of a long-standing bug that was finally fixed in PowerShell (Core) 7.3.0. In 7.3.0 and up to at least 7.3.1, the fix is in effect by default, which breaks the solution below and therefore requires $PSNativeCommandArgumentPassing = 'Legacy'. However, it looks like the fix will become opt-in in the future, i.e. the old, broken behavior (Legacy) will become the default again - see this answer.
Using Write-Host to inspect the arguments isn't a valid test, because, as a PowerShell command, it isn't subject to the same rules as an external program.
For ways to troubleshoot argument-passing to external programs, see the bottom section of this answer.
$command = "plan"
$options = @(
  "--var=tags={a:\`"b\`"}" # NO embedded '...' quoting
  "--out=path/to/out.tfplan"
)
# No point in using Write-Host
& { # Run in a child scope to localize the change to $PSNativeCommandArgumentPassing
  # Note: Only needed if you're (also) running on PowerShell 7.3+
  $PSNativeCommandArgumentPassing = 'Legacy'
  & terraform $command $options
}
How to control the exact process command line on Windows / pass arguments with embedded double quotes properly on Unix:
Note: The solution above relies on PowerShell's old, broken behavior, and while it works in the case at hand, a fully robust and less conceptually confusing solution requires more explicit control over how the arguments are passed, as shown below.
A cross-edition, cross-version, cross-platform solution:
Assuming that terraform must see --var=tags={a:\"b\"} on its process command line on Windows, i.e. needs to see the argument as verbatim --var=tags={a:"b"} after parsing its command line, combine --%, the stop-parsing token, with splatting, which gives you full control over how the Windows process command line is built behind the scenes:
$command = "plan"
$options = @(
  '--%'
  '--var=tags={a:\"b\"}'
  '--out=path/to/out.tfplan'
)
& { # Run in a child scope to localize the change to $PSNativeCommandArgumentPassing
  # !! Required in v7.3.0 and up to at least v7.3.1, due to a BUG.
  $PSNativeCommandArgumentPassing = 'Legacy'
  & terraform $command @options
}
This creates the following process command line behind the scenes on Windows (using an example terraform path):
C:\path\to\terraform.exe plan --var=tags={a:\"b\"} --out=path/to/out.tfplan
Note:
In PowerShell (Core) 7.3.0 and at least up to 7.3.1, --% is broken by default, in that its proper functioning is mistakenly tied to the value of the v7.3+ $PSNativeCommandArgumentPassing preference variable. Thus, (temporarily) setting $PSNativeCommandArgumentPassing = 'Legacy' is required, as shown above - see GitHub issue #18664 for the bug report.
Even though --% is primarily intended for Windows, it works on Unix-like platforms too, as long as you use the Microsoft C/C++ command-line syntax rules to formulate the arguments; specifically, this means:
only use " characters for quoting (with syntactic function)
use \ only to escape " chars.
While you can use --% without splatting, doing so comes with severe limitations - see this answer.
A simpler, but Windows-only cross-edition, cross-version solution:
Calling via cmd /c also gives you control over how the command line is constructed:
$command = "plan"
$options = @(
  '--var=tags={a:\"b\"}'
  '--out=path/to/out.tfplan'
)
cmd /c "terraform $command $options"
Note: This is often more convenient than --%, but suboptimal, because:
The intermediary cmd.exe call creates extra overhead.
cmd.exe may interpret % characters and, in unquoted arguments, additional metacharacters such as & and ^; preventing that requires extra effort.
A v7.3+ cross-platform solution:
Relying on PowerShell's corrected behavior in v7.3+ (no need for manual \-escaping anymore) requires setting $PSNativeCommandArgumentPassing to 'Standard'.
Note: If you target only Unix-like platforms, that isn't necessary.
$command = "plan"
$options = @(
  '--var=tags={a:"b"}' # Note: NO \-escaping of " required anymore.
  '--out=path/to/out.tfplan'
)
& { # Run in a child scope to localize the change to $PSNativeCommandArgumentPassing
  # Necessary on Windows only.
  $PSNativeCommandArgumentPassing = 'Standard'
  & terraform $command $options
}
Note: On Windows, this creates a slightly different process command line than the solutions above; notably, --var=tags={a:\"b\"} is enclosed in "..." as a whole; however, well-behaved CLIs should parse this as verbatim --var=tags={a:"b"} too, whether enclosed in "..." or not.
C:\path\to\terraform.exe plan "--var=tags={a:\"b\"}" --out=path/to/out.tfplan
To load a script file into an open PowerShell console (e.g. to import functions), dot-sourcing or the Import-Module cmdlet is needed.
Using this inside a function (to create an alias) doesn't work, e.g.:
Function psinit1 { . C:\Scripts\scriptFunktions.ps1 }
Function psinit2 { Import-module C:\Scripts\scriptFunktions.ps1 -force}
When I call psinit1 or psinit2 I don't get an error, but my functions are not available. Why doesn't this work? Am I right in assuming that the function opens a new scope which loads the script (and gets closed once the function is done)?
How can I get it to work?
Unless you invoke a function via ., the dot-sourcing operator, its body executes in a child scope, so that any operations you perform inside of it - unless you explicitly target a different scope - are limited to that child scope and its descendant scopes.
Therefore, to make your functions work as intended, i.e. to make definitions visible to the caller's scope, dot-source their invocations too:
. psinit1
Generally, note that while Import-Module also accepts .ps1 scripts, its primary purpose is to act on modules. With .ps1 scripts, it effectively behaves like dot-sourcing, except that repeating an Import-Module call with a .ps1 script in a child scope fails, unless -Force is also specified (to force reloading).
The upshot: Do not use Import-Module with .ps1 scripts:
The primary reason to avoid it is that it makes a promise it cannot keep: because simple dot-sourcing takes place, no actual module is being imported - even though one nominally shows up in Get-Module's output, named for the script file's base file name (e.g., foo for script foo.ps1).
Because simple dot-sourcing takes place, you cannot use Remove-Module to unload the script's definitions later; while you can call Remove-Module on the imported pseudo-module, it has no effect: the dot-sourced definitions remain in effect.
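As an illustration of that behavior, using the script path from the question (a sketch, not a recommendation):
Import-Module C:\Scripts\scriptFunktions.ps1  # effectively dot-sources the script
Get-Module scriptFunktions                    # a pseudo-module named for the file's base name shows up
Remove-Module scriptFunktions                 # removes only the pseudo-module entry ...
# ... the functions the script defined remain available in the session.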
I'm trying to run an application that reads an environment variable that contains a JSON with about 22k characters. The project setup tells me to use $(cat ./path/to/file) to correctly configure it, but as I'm using Windows, this command does not work.
I've tried copying the contents of the file to the variable using the Environment Variables GUI, but its input truncates the value to a certain limit, which is not even half of the file.
After this I tried setting the variable using PowerShell with the command:
$env:myvar = iex '$(type path/to/file)'
and then saving the result with:
[System.Environment]::SetEnvironmentVariable('MYVAR', $env:MYVAR, [System.EnvironmentVariableTarget]::Machine)
After these commands, PowerShell is able to print the result correctly, but cmd.exe still prints only part of the value when I echo it.
This is very odd, because regedit shows the correct value, as suggested here.
The application still can't process the value because it is not complete.
Is there any fix for this?
Note: This answer applies to Windows.
tl;dr
While you can store up to 32,766 characters in a single environment variable, the standard retrieval mechanisms in cmd.exe and PowerShell / .NET (as of v7.1 / 5.0) support only up to 4,095.
A workaround in PowerShell is possible, but ultimately it comes down to whether the target executable that is meant to read an environment-variable value supports reading values up to the technical maximum length.
The technical limit for the number of characters in a single environment variable is 32,766 (32KB = 32768, minus 2).
Starting with Windows Server 2008 / Windows Vista, there is no longer a limit on the overall size of the environment block - see the docs.
However, depending on how the environment variable is retrieved, the limit may be lower:
Both cmd.exe and PowerShell, as of v7.1 / .NET 5.0, support retrieving at most 4,095 characters.
However, in PowerShell you can retrieve longer values, assuming the variable of interest is defined persistently in the registry, and assuming that you know whether it is defined at the machine or user level; e.g., for a MYVAR environment variable:
At the machine level:
Get-ItemPropertyValue 'registry::HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Environment' MYVAR
At the user level:
Get-ItemPropertyValue registry::HKEY_CURRENT_USER\Environment MYVAR
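As a quick sanity check, the following sketch persists a value longer than 4,095 characters at the user level and reads it back in full via the registry (MYVAR is simply the example name from the question):
$longValue = 'x' * 10000                                             # 10,000-character test value
[Environment]::SetEnvironmentVariable('MYVAR', $longValue, 'User')   # persist at the user level
(Get-ItemPropertyValue registry::HKEY_CURRENT_USER\Environment MYVAR).Length   # -> 10000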
Try the type command. It is the Windows equivalent of the Unix cat command. This means storing the JSON inside of a separate file and using the command "type <path_to_file>".
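For example, in PowerShell, where type is an alias of Get-Content (a sketch using the placeholder path from the question; -Raw reads the file as a single string):
$env:MYVAR = type -Raw ./path/to/file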
I'm trying to start a powershell instance, that loads a script and remains open so I can still call methods loaded by that script manually.
I'm trying to dot-source a script and pipe it to powershell like below, from a cmd instance / batch file:
echo . .\script.ps1 | powershell
The result in this case is that PowerShell starts, loads my script, executes it and exits. I've tried running with the -noexit argument; it has no effect.
I'm thinking of another option, to start a powershell process and pipe my dot source command to its stdin - but this probably won't allow me to interact with the process anymore because its stdin is opened by the host process.
If you need to run a script file so that the window stays open and variables are accessible after the execution, try dot-sourcing the script file like this:
powershell -noexit ". .\script.ps1"
Once the script is done, you can access any internal variable the script defined, assuming the variables are at the script-level scope.
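For instance, assuming a hypothetical script.ps1 with script-level definitions such as:
# script.ps1 (hypothetical contents)
$answer = 42                      # script-level variable
function Get-Answer { $answer }
After powershell -noexit ". .\script.ps1" has run, both $answer and Get-Answer remain available in that console session.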
Okay, here's essentially what I am trying to do. I have a process P1. This process needs to invoke the Visual Studio command-line compiler cl.exe in a separate process P2 (obviously). However, as everyone who has ever used the Visual Studio command-line compiler knows, you cannot simply invoke cl.exe and expect a good experience. You instead have to first run the batch script %VSXXXCOMNTOOLS%\vsvars32.bat (where XXX is the Visual Studio version number). This script sets a few key environment variables used by the compiler (such as what to use as the include path). Using a batch script, this is insanely easy to do:
call "%VS110COMNTOOLS%\vsvars32.bat"
...
cl Foo.cpp Bar.cpp ...
since just calling a batch file from a batch script runs in the same process (and thus the added environment variables are persistent). This is what I used to do before I realized that I need more flexibility and decided to port my script to C++ which, so far, has worked wonderfully. That is, until I got to the point where I need to implement the actual compilation.
So, that's the problem I am ultimately trying to solve. The best idea I have come up with is to invoke cmd.exe /c "%VS110COMNTOOLS%\vsvars32.bat" in a separate process P3 using CreateProcess, wait for that process to terminate, and then extract the modified environment variables from that child process. That is, P1 creates P3 and waits for it to finish. P1 then sets P3's environment variables as its own. P1 then creates P2 with these environment variables set. So the code looks roughly as follows (minus all error checking):
...
CreateProcess(TEXT("cmd"), TEXT("/c \"%VS110COMNTOOLS%\\vsvars32.bat\""), NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi);
WaitForSingleObject(pi.hProcess, INFINITE);
/* Set current process environment using pi.hProcess */
CloseHandle(pi.hProcess);
...
CreateProcess(TEXT("cl"), TEXT("..."), NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi);
That would be the preferred solution. I am not entirely sure if such a thing is possible, but based off my research, it appears there is a way to do this in .NET, and ProcessExplorer appears to be able to read the environment of arbitrary processes, so I would assume such a solution is possible. I just can't seem to find any documented functions that are able to get environment variables from child processes. There's also an old discussion that is similar to this on MSDN. One of the responses mentions setting Merge Environment to yes. Anyone know what that is/means? I can't seem to find any documentation on it.
If it turns out this is not possible, alternate solutions I have thought about are: (1) writing a batch script that simply calls vsvars32.bat and then invokes cl.exe with the input arguments, (2) invoking cmd instead of cl with arguments to run vsvars32.bat and then compile (similar to 1, but more extensible... but not sure if possible), or (3) printing the environment variables to a file and then reading those in. I'd prefer not to use any such solutions if possible.
I'm also open to alternate suggestions. Just know that 99% of what I need to do is already done, so clean, non-hacky solutions are preferred.
The clean way to do this is to run vcvars32 to set all the environment variables and then run your process P1.
cmd /C vcvars32.bat && P1
Note that the user doesn't have to do this manually. Create a shortcut with this target:
cmd /C ""C:\Some Path\vcvarsall.bat" && start /separate C:\SomeOtherPath\YourGui.exe"
This sets the environment variables and then launches your GUI app. The start /separate stops the command-prompt lingering once the GUI app has started. If you also want this to be convenient to run from the command-line you can put it all in a batch file.
If for some reason you don't want to do this, the simplest way to get the environment variables from a batch script is to run:
cmd /U /C vcvars32.bat && set
This writes the values to standard output in Unicode. You can use a pipe to retrieve the values. This is much less hacky than trying to retrieve the variables from the memory of another process.
N.B. If you want to test this at a command-prompt you need to run:
cmd /U /C "vcvars32.bat && set"
The quotes ensure that the set runs in the child command processor.
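Although the question involves a C++ caller, the same capture-and-parse idea can be sketched in PowerShell (an illustration only, using the %VS110COMNTOOLS% variable from the question; each NAME=VALUE line of set's output is applied to the current session):
cmd /c "`"$env:VS110COMNTOOLS\vsvars32.bat`" && set" | ForEach-Object {
  if ($_ -match '^([^=]+)=(.*)$') {
    Set-Item -Path "env:$($Matches[1])" -Value $Matches[2]
  }
}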
Do not retrieve and set environment variables across process boundaries. Run vsvars32.bat and cl.exe in the same process, like they are meant to be. cmd.exe lets you execute multiple commands at one time using its && operator:
cmd.exe /c "\"%VS110COMNTOOLS%\vsvars32.bat\" && cl ..."