What is causing this PowerShell script error?

I'm writing a PowerShell script that registers and unregisters Scheduled Tasks on Windows Server 2012 R2. In particular, I'm having trouble with this line:
Unregister-ScheduledTask -TaskPath "\Sensei\"
(Sensei is the code name of this project, and is the name of the folder that contains all the tasks I care about.)
When I run that command from ISE or the command line, it works fine. However, when I run it from a .ps1 file, it gives me a downright confusing error:
Unregister-ScheduledTask : Cannot retrieve the dynamic parameters for the cmdlet. Cannot bind argument to parameter 'TypeName' because it is null.
At Z:\server_scripts\clean.ps1:12 char:5
+ Unregister-ScheduledTask -TaskPath "\Sensei\"
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Unregister-ScheduledTask], ParameterBindingException
+ FullyQualifiedErrorId : GetDynamicParametersException,Unregister-ScheduledTask
As near as I can tell, TypeName is not a parameter that Unregister-ScheduledTask accepts, so it must be coming from somewhere internal to the cmdlet.
Does anyone have any idea what's causing this?
EDIT: Someone asked for the whole script. I omitted it since there's nothing else of note.
cd Z:\server_scripts
. .\variables.ps1
#Delete config files
del $SENSEIPATH\assets.cfg
del $SENSEIPATH\Web\Web.config
#Remove the virtual application
Remove-WebApplication -Site $IISSITE -Name "c"
#Clean out the scheduled tasks
Unregister-ScheduledTask -TaskPath "\Sensei\"
.\variables.ps1 just contains a few variable definitions, such as $SENSEIPATH or $IISSITE. I can't post it here, but I promise there's nothing fancy.

So, I don't understand what, exactly, the problem is, but here's what is going on.
I actually have a separate script, not listed above, that downloads the script in question, and then executes it. This is bootstrap.ps1. It looks something like this:
Read-S3Object -BucketName $BUCKET -Folder z:\server_scripts -KeyPrefix "server_scripts/app"
Z:\server_scripts\clean.ps1
When this ran from inside ISE, it would produce the error message above.
HOWEVER, if I ran clean.ps1 directly, ran bootstrap.ps1 outside of ISE, OR modified bootstrap.ps1 like this, it worked fine:
Read-S3Object -BucketName $BUCKET -Folder z:\server_scripts -KeyPrefix "server_scripts/app"
. Z:\server_scripts\clean.ps1
(I missed that some of these were succeeding because Unregister-ScheduledTask throws a different error when it doesn't find any tasks to unregister, and I wasn't paying close attention.)
So there might be some kind of scoping issue, but I am long past caring and just want this thing to work. Hopefully this helps someone else!
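For completeness, one variant I haven't tested in this exact setup: resolving the tasks first and piping them to Unregister-ScheduledTask avoids relying on the -TaskPath dynamic parameter lookup at all.
# Untested sketch: fetch the task objects explicitly, then pipe them in.
Get-ScheduledTask -TaskPath "\Sensei\" | Unregister-ScheduledTask -Confirm:$false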

Related

Powershell write to variable instead of file

I'm trying to use Azure PowerShell to pull an SSH key and add it to a VM.
The cmdlet is
Get-AzKeyVaultKey ... -OutFile filename
I'd like to avoid actually writing the key to the disk, but I need it in a variable. Is there any way to provide a variable acting like a file or something so I can go
-OutFile $someVariablePretendingToBeFile
and use that variable please?
The object returned by Get-AzKeyVaultKey is of type PSKeyVaultKey.
If I get its Key property and call ToRSA(), I get an RSACryptoServiceProvider,
but I still don't see where to get the public key string from!
It's annoying because -OutFile produces exactly the public key I need.
Thanks
Since Get-AzKeyVaultKey doesn't provide a way of doing this (that I know of), can you get it to work with a simple:
$key=(Get-AzKeyVaultKey XXX)
to get the result in a variable?
Let us know!
Not sure if this would work; it is a variant of the answer above. I can't test it just now:
$PublicKey = Get-AzKeyVaultKey -VaultName $vaultName -KeyName $keyName
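Building on that, here is a rough, untested sketch of getting the public key material into a variable without touching the disk, assuming the Key property and ToRSA() behaviour described in the question (note this yields raw modulus/exponent data, so reproducing the exact text that -OutFile writes may still need extra encoding):
# Untested sketch: pull the key and extract the public portion in memory.
$PublicKey = Get-AzKeyVaultKey -VaultName $vaultName -KeyName $keyName
$rsa       = $PublicKey.Key.ToRSA()            # RSA object holding the public parameters
$params    = $rsa.ExportParameters($false)     # $false = public part only (modulus + exponent)
$modulus   = [Convert]::ToBase64String($params.Modulus)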

Unable to copy item from mapped network drive using powershell as file contains wierd characters

I am trying to copy files from a mapped network drive. Some of them get copied, but others are not copied because the filename contains some weird characters.
For example, my mapped network drive Z: contains the following files:
skifteretsattest 1(1).pdf
MailBody.msg
kørekort terje(3).pdf
I am able to copy the first two files from the mapped network drive, but not the last one, using the command below:
Copy-Item -LiteralPath Z:\$name -Destination I:\Dat\SomePath\ss/ -Force
The error which I get is:
Copy-Item : Could not find file 'Z:\kørekort terje(3).pdf
I tried [WildcardPattern]::Escape($name) but that also did not work
Kindly help if anybody knows the solution
Maybe you could use robocopy.exe or xcopy.exe instead?
Maybe old "dir /x" can help to find out the old "8.3" filename (like "GET-GP~1.PS1" for "Get-GPProcessingTime.ps1") and this can be used to copy or rename the file?
I also remember something about bypassing file system logic using unc-like syntax like \\0\driveletter\directory or whatever - unfortunately I don't remember the exact syntax. Maybe someone else does?
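For the robocopy route, a minimal sketch run from PowerShell (destination path borrowed from the question; the ? wildcard stands in for the character that won't type reliably):
# Hypothetical robocopy call: copy just the problem file, matching the odd character with ?
robocopy Z:\ I:\Dat\SomePath\ss "k?rekort terje(3).pdf"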
Try something like this:
$files = Get-ChildItem -Path "Z:\"
$files | % { Copy-Item -LiteralPath $_.FullName -Destination "I:\Dat\SomePath\ss" -Force }

Admin vs Non-Admin Mode - Cannot overwrite variable because the variable has been optimized

During some testing today I came across an unexpected issue and I do not understand why it is happening. Below is the code I am using to duplicate the issue. It is only a very small portion of the larger project.
Testing is being done on Windows 10 Build 1709, if that helps.
Both the PS1 File and the BAT File are named the same.
Ways to cause the errors
Running the PS1 File via Right-Click - Run with PowerShell will cause the error
Opening PowerShell ISE in Non-Admin Mode, then opening/running the script will cause the error
Running BAT File as Admin or Non-Admin will cause the error
Ways to avoid the errors
Opening PowerShell ISE in Admin Mode, then opening/running the script will not cause the error
Adding the Script: scope modifier in front of the variables on the last 2 lines of code will not cause the error, no matter how the script is executed
Using VSCode, it works as shown below. When run in the integrated terminal, it sees that it is not running as an Admin, launches PowerShell.exe outside of VSCode, and works without issue
Why do I have Script: in front of the variables in the functions? It was the only way I could get variables set in the functions to be used outside the functions. The other 25 or so variables not listed in this post do not have an issue; however, they are not modified like these two are after they are set.
The Questions
Why does it work when the ISE is run in Admin Mode?
Why does it not work when the script relaunches itself as an Administrator?
Why does VSCode not care, and work regardless?
Something isn't making sense and I cannot pinpoint it.
Here are the errors
Cannot overwrite variable NetFX3 because the variable has been optimized. Try using the New-Variable or Set-Variable
cmdlet (without any aliases), or dot-source the command that you are using to set the variable.
At C:\Users\a502690530\Desktop\Testing2.ps1:14 char:5
+ [string]$Script:NetFX3 = $BAT_Files_Path + "NetFX3.zip"
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: (NetFX3:String) [], SessionStateUnauthorizedAccessException
+ FullyQualifiedErrorId : VariableNotWritableRare
Cannot overwrite variable Power_Plan because the variable has been optimized. Try using the New-Variable or
Set-Variable cmdlet (without any aliases), or dot-source the command that you are using to set the variable.
At C:\Users\a502690530\Desktop\Testing2.ps1:15 char:5
+ [string]$Script:Power_Plan = $BAT_Files_Path + "Power_Plan.zip"
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: (Power_Plan:String) [], SessionStateUnauthorizedAccessException
+ FullyQualifiedErrorId : VariableNotWritableRare
Here is the code
# Checks if running as an administrator. If not, it will relaunch as an administrator
If (-Not ([Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator")) {
$Arguments = "& '" + $MyInvocation.MyCommand.Definition + "'"
Start-Process Powershell -Verb RunAs -ArgumentList $Arguments
Break
}
[string]$ErrorActionPreference = "Continue"
[string]$BAT_Files = $Root_Path + "BAT_Files\"
Function Set-FilePaths ([string]$BAT_Files_Path) {
# BAT Files Paths (ZIPs only!!!)
[string]$Script:NetFX3 = $BAT_Files_Path + "NetFX3.zip"
[string]$Script:Power_Plan = $BAT_Files_Path + "Power_Plan.zip"
Set-Lists
}
function Set-Lists {
# List of BAT Files (ZIPs)
[System.Collections.ArrayList]$Script:List_Of_BAT_Files = @(
$NetFX3
$Power_Plan
)
}
Set-FilePaths `
-BAT_Files_Path $BAT_Files
PAUSE
$NetFX3 = ((Split-Path $NetFX3 -Parent) + "\NetFX3\")
$Power_Plan = ((Split-Path $Power_Plan -Parent) + "\Power_Plan\")
BAT File to launch
REG ADD "HKLM\SOFTWARE\Microsoft\PowerShell\1\ShellIds\Microsoft.PowerShell" /T REG_SZ /V ExecutionPolicy /D Unrestricted /F
Start PowerShell.exe -Command "& '%~dpn0.ps1'"
I have no specific answer, but a pointer:
Your issue sounds like a PowerShell bug related to the DLR (Dynamic Language Runtime), a technology PowerShell uses behind the scenes (since v3); there's at least one open bug report on GitHub that sounds related.
Aside from the workaround you already know of - using the Script: scope modifier consistently - I suggest avoiding variable access across scope boundaries as a general best practice, which should also avoid the problem.
PowerShell is very flexible in what it can return (output) from a function, so it's better to set variables in the caller's scope based on a function's output.
Specifically, I suggest refactoring your code as follows:
Function Get-FilePaths ([string]$BAT_Files_Path) {
# Output the paths as an *array*.
($BAT_Files_Path + "NetFX3.zip"), ($BAT_Files_Path + "Power_Plan.zip")
}
# Call the function in the script scope and capture its output in variables.
$List_Of_BAT_Files = Get-FilePaths -BAT_Files_Path $BAT_Files
# Use a destructuring assignment to store the elements of the array
# in individual variables
$NetFX3, $Power_Plan = $List_Of_BAT_Files
If there are a lot of individual variables to set, you can make the function output a hash table instead, and use the hash table's named entries instead of individual variables (requires PSv3+, due to use of [ordered] to create a hash table with ordered keys):
Function Get-FilePaths ([string]$BAT_Files_Path) {
# Output the paths as a *hash table*, using its
# entries for named access instead of individual variables.
$outHash = [ordered] @{
NetFX3 = $BAT_Files_Path + "NetFX3.zip"
Power_Plan = $BAT_Files_Path + "Power_Plan.zip"
}
# Add a 'List' entry that contains all values added above as an array.
# Note the need to use @(...) to force creation of a new array from the
# hash table's value collection.
$outHash.List = @($outHash.Values)
# Output the hash table.
$outHash
}
# Call the function in the script scope and capture its output in
# a single variable that receives the hash table.
$hash = Get-FilePaths -BAT_Files_Path $BAT_Files
# Now you can access the individual values by name - e.g., $hash.NetFX3 -
# or use $hash.List to get all values.
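As a hedged illustration only: the two re-assignments at the end of the question's script (the lines that triggered the error) could then be written against the hash table's entries instead of free-standing variables:
# Sketch: the question's final two lines, rewritten to target the hash table entries.
$hash.NetFX3     = (Split-Path $hash.NetFX3 -Parent) + "\NetFX3\"
$hash.Power_Plan = (Split-Path $hash.Power_Plan -Parent) + "\Power_Plan\"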

Downloading and opening a series of image urls

What I am trying to do is download 2 images from URL's and open them after download. Here's what I have:
#echo off
set files='https://cdn.suwalls.com/wallpapers/cars/mclaren-f1-gtr-42852-400x250.jpg','http://www.dubmagazine.com/home/media/k2/galleries/9012/GTR_0006_EM-2014-12-21_04_GTR_007.jpg'
powershell "(%files%)|foreach{$fileName='%TEMP%'+(Split-Path -Path $_ -Leaf);(new-object System.Net.WebClient).DownloadFile($_,$fileName);Invoke-Item $fileName;}"
I'm getting "Cannot find drive. A drive with the name 'https' cannot be found."
It's the Split-Path command that is having problems, but I can't seem to find a solution.
You could get away with basic string manipulation but, if the option is available, I would opt for using anything else that is data aware. In your case you could use the [uri] type accelerator to help with these. I would also just opt for pure PowerShell instead of splitting between batch and PS.
$urls = 'https://cdn.suwalls.com/wallpapers/cars/mclaren-f1-gtr-42852-400x250.jpg',
'http://www.dubmagazine.com/home/media/k2/galleries/9012/GTR_0006_EM-2014-12-21_04_GTR_007.jpg'
$urls | ForEach-Object{
$uri = [uri]$_
Invoke-WebRequest $_ -OutFile ([io.path]::combine($env:TEMP,$uri.Segments[-1]))
}
Segments will get you the last portion of the URL, which is a proper file name in your case. Combine() will build the target destination path for you. Feel free to add your Invoke-Item logic, of course.
This also lacks error handling if the url cannot be accessed or what not. So be aware of that possibility. The code above was meant to be brief to give direction.
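To cover the "open them after download" part, a minimal variant of the same loop (just capturing the target path so it can be handed to Invoke-Item) might look like this:
# Sketch: download each image to %TEMP% and open it once the download finishes.
$urls | ForEach-Object{
    $uri    = [uri]$_
    $target = [io.path]::Combine($env:TEMP, $uri.Segments[-1])
    Invoke-WebRequest $_ -OutFile $target
    Invoke-Item $target
}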

Error handling in batch files

I'm working on a script to automate a couple of tasks. Essentially, it just renames files if they exist. However, these files are critical to the operation of some other software. If renaming a file fails, or if it cannot find the file, I would like it to say which file it was. However, the REN command isn't very descriptive when it fails. I know that if it fails, it raises ERRORLEVEL to 1. Is there any other information I can get from the failure, like the file name? Or is it pretty much just a success or fail?
The machines this will be running on are Windows 7 and up, so PowerShell is also an option if necessary, but I would prefer batch.
Thanks
As Ken White points out in a comment on the question, the error messages emitted by cmd.exe's ren command do not include the filename(s) involved.
PowerShell, by contrast, provides verbose error messages that do include filenames.
Take the following sample script:
# Create helper sample files.
$tempFiles = "$env:TEMP\foo", "$env:TEMP\bar"
$null = New-Item -Type File $tempFiles
# Try to rename a nonexistent item.
Rename-Item \no\such -NewName foo
# Try to rename to a file that already exists.
Rename-Item $env:TEMP\foo -NewName bar
# Clean up the sample files.
Remove-Item $tempFiles
If you save the above to a *.ps1 file and run it (assuming you have permitted scripts to run), you'll see the following error output:
Rename-Item : Cannot rename because item at '\no\such' does not exist.
At C:\Users\jdoe\Desktop\pg\pg.ps1:8 char:5
+ Rename-Item \no\such -NewName foo
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [Rename-Item], PSInvalidOperationException
+ FullyQualifiedErrorId : InvalidOperation,Microsoft.PowerShell.Commands.RenameItemCommand
Rename-Item : Cannot create a file when that file already exists.
At C:\Users\jdoe\Desktop\pg\pg.ps1:11 char:5
+ Rename-Item $env:TEMP\foo -NewName bar
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: (C:\Users\jdoe\AppData\Local\Temp\foo:String) [Rename-Item], IOException
+ FullyQualifiedErrorId : RenameItemIOError,Microsoft.PowerShell.Commands.RenameItemCommand
While the output is verbose, it does contain all relevant information, and you can even inspect it programmatically, via the automatic $Error variable, which contains all errors that were reported in the session in reverse chronological order ($Error[0] contains the most recent error).
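For instance, immediately after one of those failures you could inspect the most recent error record (property names as defined on PowerShell's ErrorRecord type):
# The offending item, when the provider reports one (see the second error above):
$Error[0].TargetObject
# The message text:
$Error[0].Exception.Message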
Note that, by default, the PowerShell script will continue to run when these errors happen, because they are considered non-terminating.
The simplest way to cause the script to abort right away is to start your script with the line $ErrorActionPreference = 'Stop'.
Alternatively, you can use -ErrorAction Stop on a per-command basis.
Run Get-Help about_Preference_Variables and Get-Help about_CommonParameters to learn more.
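To tie this back to the renaming task, here is a rough sketch (the folder path and naming scheme are made up for illustration) of renaming files one by one and reporting exactly which file failed:
# Illustrative only: rename each .log file to .bak, warning with the exact path on failure.
foreach ($file in Get-ChildItem C:\SomeFolder -Filter *.log) {
    try {
        Rename-Item -LiteralPath $file.FullName -NewName ($file.BaseName + '.bak') -ErrorAction Stop
    }
    catch {
        Write-Warning "Could not rename '$($file.FullName)': $($_.Exception.Message)"
    }
}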
As already noted, ren working with wildcards doesn't tell you which file failed, so you have to process each file individually. Thankfully, there are commands for doing so: for and forfiles.
for %%a in (*) do ren "%%a" "destination.ext" 2>nul || echo FAILED: %%a
This tries to rename every file to "destination.ext", which works fine for the very first of them.
Renaming the rest of the files will fail (the destination already exists).
|| is an "if the previous command failed, then" operator.
2>nul suppresses the error message (useless for you, as it doesn't tell you the filename).
(Note: this is syntax for batch files. If you try it directly on the command line,
replace each %%a with %a.)
