Invoke-Command doesn't see local network drives - windows

I have two computers: A and B. I'm trying to automate some CI/CD tasks, and my task is to start a process on B remotely, from A. The .exe file itself is on the R drive, which is a local network drive. So I do this:
# here $cred has encrypted credentials, but it is off topic...
Invoke-Command -ComputerName B -Credential $cred -ScriptBlock {
R:\WebClient\Platform\UP_110\Proc.exe
}
So apparently this would be the same thing as typing R:\WebClient\Platform\UP_110\Proc.exe on B's PowerShell and hitting Enter.
Now the problem is that I get this error when running the above code on A:
The term 'R:\WebClient\Platform\UP_110\Proc.exe' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
+ CategoryInfo : ObjectNotFound: (R:\WebClient\Pl...IMS.UP.Host.exe:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
+ PSComputerName : B
Apparently it says that there is no such file as R:\WebClient\Platform\UP_110\Proc.exe on my B computer. But that is not true; I do have it.
As a matter of fact, I have this R drive on both A and B.
The code works fine if I move the .exe to any directory under the C drive (which is the system disk for me), but not for R.
Now even funnier is that I can run R:\WebClient\Platform\UP_110\Proc.exe on A and B manually. And it works.
So what's the issue here I'm facing? Thanks.

By default, PowerShell Remoting can only access drives that are mapped within the system context. Most commonly, these will be lettered drives backed by attached hardware (whether that be USB, SATA, SCSI, etc.).
Drives mapped in the user context, such as remote drives, are not available because a full logon does not occur the same way it does when you log in locally. There are two workarounds at your disposal:
Use the UNC path when accessing files over an SMB/CIFS share (e.g. \\server.domain.tld\ShareName\Path\To\Folder\Or\file.ext); a sketch of this follows the New-PSDrive example below.
Map the drive within the ScriptBlock passed to Invoke-Command using New-PSDrive:
# Single letter drive name
New-PSDrive -Name "R" -PSProvider FileSystem -Root "\\server.domain.tld\ShareName"
Get-ChildItem R:
# More descriptive drive name
New-PSDrive -Name "RemoteDrive" -PSProvider FileSystem -Root "\\server.domain.tld\ShareName"
Get-ChildItem RemoteDrive:
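A minimal sketch of the first (UNC path) workaround applied to the original command; the server and share names are placeholders:
Invoke-Command -ComputerName B -Credential $cred -ScriptBlock {
    # Reference the executable by its UNC path instead of the mapped R: drive
    & '\\server.domain.tld\ShareName\WebClient\Platform\UP_110\Proc.exe'
}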
Three things to note:
Get-ChildItem in the New-PSDrive example above is there to show that listing the contents of the new drive should return the files you expect to see in the remote directory. It can be omitted once you are sure the mapping works for you.
Using a longer drive name is a PowerShell feature; it does not mean you can map shared folders as a drive in File Explorer with more than a single character.
You may run into the double-hop issue trying to map a remote drive this way if you are attempting to use the same credential you initiated Invoke-Command with. Solving it properly is beyond the scope of Stack Overflow, as this is a major architectural consideration for Active Directory.
However, you can work around it by building the credential object and passing it to New-PSDrive from within the ScriptBlock (see the sketch below), or by running Invoke-Command with -Authentication CredSSP if your organization does not block it (many do).
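A rough sketch of that first workaround, building the credential inside the ScriptBlock (the share path, account name, and plain-text password handling are placeholders for illustration only):
Invoke-Command -ComputerName B -Credential $cred -ScriptBlock {
    # Build a separate credential for the share inside the remote session
    $pass      = ConvertTo-SecureString 'PasswordGoesHere' -AsPlainText -Force
    $shareCred = New-Object System.Management.Automation.PSCredential ('DOMAIN\user', $pass)

    # Map the share with the explicit credential, then run the executable
    New-PSDrive -Name R -PSProvider FileSystem -Root '\\server.domain.tld\ShareName' -Credential $shareCred | Out-Null
    & 'R:\WebClient\Platform\UP_110\Proc.exe'
}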

Related

Powershell Map Network Drive, write file to it but Notepad fails to find file

My PowerShell script:
New-PSDrive -Name J -Root \\myserver\mypath -PSProvider FileSystem
"test" | Out-File J:\test.txt
Get-Content -Path J:\test.txt
notepad J:\test.txt
The J: drive maps OK, the file gets created, and Get-Content can read it, BUT notepad (or any other .exe) cannot see the file.
What should I do to make the drive mapping visible to other executables run within the script?
Thanks.
New-PSDrive by default creates drives that are visible only to PowerShell commands, in the same session.
To create a regular mapped drive that all processes see, use the -Persist switch, in which case you're restricted to the usual single-letter drive names (such as J: in your example).
Note: Despite the switch's name, the resulting mapping is only persistent (retained across OS sessions) if you either invoke it directly from the global scope of your PowerShell session (possibly by dot-sourcing a script that contains the New-PSDrive call from there) or explicitly use -Scope Global.
Otherwise, the mapping goes out of scope (is removed) along with the scope in which it was defined.
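A minimal sketch of that, reusing the J: drive and the share path from the question:
# Create a regular mapped drive that non-PowerShell processes (such as notepad) can also see
New-PSDrive -Name J -Root \\myserver\mypath -PSProvider FileSystem -Persist -Scope Global
"test" | Out-File J:\test.txt
notepad J:\test.txt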

Create a virtual drive with PowerShell

I'm trying to create a D: drive in Windows, which points to some local directory (e.g. C:\D_Drive), using PowerShell.
This code runs fine:
New-PSDrive -Name D -Root "C:\D_Drive\" -PSProvider "FileSystem"
But no D: drive is visible in Windows Explorer.
How does one use that command correctly?
Also: the drive should be permanent, so I tried adding a -Persist parameter, but that leads to an error ("unknown parameter '-Persist'...").
Just run:
subst D: "C:\D_Drive\"
in a non-elevated PS session (don't run it as Administrator).
The New-PSDrive command only creates a mapping that is visible in PowerShell; it is not shown in Explorer at all. (Its -Persist parameter would create an Explorer-visible mapping, but it only accepts UNC roots, so it cannot expose a local folder, and it does not exist on older PowerShell versions, which explains the "unknown parameter" error.)
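If the mapping should also survive reboots, one common trick is to define the drive under the "DOS Devices" registry key (a sketch only; this needs an elevated session and takes effect after a reboot):
# Define a permanent D: alias for C:\D_Drive (system-wide, visible in Explorer after reboot)
New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices' `
    -Name 'D:' -Value '\??\C:\D_Drive' -PropertyType String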
Here are two other questions that ask the same thing:
https://community.spiceworks.com/topic/649234-powershell-mapped-drive-not-showing-in-my-computer
https://social.technet.microsoft.com/Forums/windowsserver/en-US/96222ba2-90f9-431d-b05a-82b804cdc76e/newpsdrive-does-not-appear-in-explorer?forum=winserverpowershell

UnauthorizedAccessException from PowerShell when using Invoke-WebRequest

I know that the error is pretty much self-explanatory, but I am still not able to find a solution. I am writing a PowerShell script to automate the set-up of the dev machines. There is a set of programs that must be installed, so what I want to do is first download each file and then install it. I am having problems with downloading a file from the web to the local machine.
My logic is as follows. I have an .xml file where I configure all the stuff. For the downloads it looks like this:
<download source="https://github.com/msysgit/msysgit/releases/download/Git-1.9.5-preview20150319/Git-1.9.5-preview20150319.exe"
destination="C:\Temp" filename="Git-1.9.5-preview20150319.exe"/>
Then in my PowerShell script file I have this function:
function install-tools() {
    Set-ExecutionPolicy RemoteSigned

    # $config is the XML configuration document loaded elsewhere in the script
    $xmlFileInformation = $config.SelectNodes("/setup/downloads/download");

    Foreach ($file in $xmlFileInformation)
    {
        $("Filename: " + $file.filename)
        $("File source: " + $file.source)
        $("File destination: " + $file.destination)
        $("****************************************************************");
        $("*** Downloading " + $file.filename);
        $("****************************************************************");

        Invoke-WebRequest $file.source -OutFile $file.destination
    }
    $("Task finished");
}
After executing I get the error from the title: UnauthorizedAccessException from PowerShell when using Invoke-WebRequest. Two things I can mention are that I have included Set-ExecutionPolicy RemoteSigned and that I execute the script running PowerShell as administrator. I've tried different paths but it's the same every time; I don't get permission to write anywhere. The only thing that I can't try is using another drive, since I have only one - C:\.
And one strange thing: my destination directory is C:\Temp, but during one of my attempts I didn't have such a directory in C:\, so I ended up with a file named Temp in my C:\ - this was the closest I got to getting a file.
I don't need to save those files in a particular place, since it's perfectly fine to delete the entire directory after a successful set-up; what I need is a way to let PowerShell save files somewhere on my C:\ drive. I'm not sure whether this is related to administering my system and setting the correct rights (I tried to lower the protection as much as I could) or whether I'm missing something in my PowerShell script.
You did not specify a file name to download to. Replace
-OutFile $file.destination
with
-OutFile (Join-Path $file.destination $file.filename)
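A sketch of the corrected loop body; it also creates the destination directory first, which avoids the stray file named Temp described in the question:
Foreach ($file in $xmlFileInformation)
{
    # Make sure the destination directory exists before downloading into it
    if (-not (Test-Path $file.destination)) {
        New-Item -ItemType Directory -Path $file.destination | Out-Null
    }
    Invoke-WebRequest $file.source -OutFile (Join-Path $file.destination $file.filename)
}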

Getting ERROR 5 (0x00000005) Creating Destination Directory while using robocopy to copy files

I am getting the above error while using the robocopy command. I have given all possible permissions on both the source and destination folders, but I am still getting this error. Any idea how to fix this?
Is this to and from NTFS partitions?
If you are copying to FAT or ext, then add the /FFT parameter to assume FAT file times (2-second granularity); ext2/ext3 also use 2-second granularity.
You could also try the /COPY:DT parameter; by default robocopy copies the data, attributes and timestamps, while /COPY:DT skips the attributes.
Also check your share permissions as well as your NTFS permissions.
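For example (the paths are placeholders; /E copies subdirectories, including empty ones):
robocopy C:\Source \\server\share\Destination /E /FFT /COPY:DT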
For me it worked fine when I ran the command directly on the server, but when I ran it from PowerShell remoting I would get this error. I was also trying to copy files from the local machine to a network share. The fix for me was to use:
Invoke-Command -ComputerName $sourceServer -Credential $credential -Authentication Credssp -ScriptBlock {
& RoboCopy "C:\Source" "\\OtherServer\C$\Destination" /E
}
Specifically, using -Credential $credential -Authentication Credssp fixed the issue for me.
You didn't provide enough info to know if this is the same issue you were having, but thought I'd mention it for others who encounter the same error message.

How can I convince powershell (run through task scheduler) to find my network drive?

I have a simple PowerShell script on Windows 7 that doesn't work properly (this is not an issue on XP):
get-psdrive
When I run it directly, I get
Name        Used (GB)  Free (GB)  Provider     Root
----        ---------  ---------  --------     ----
A                                 FileSystem   A:\
Alias                             Alias
C               12.30      11.60  FileSystem   C:\
cert                              Certificate  \
D                                 FileSystem   D:\
Env                               Environment
Function                          Function
HKCU                              Registry     HKEY_CURRENT_USER
HKLM                              Registry     HKEY_LOCAL_MACHINE
Q             1486.63     289.41  FileSystem   Q:\
Variable                          Variable
WSMan                             WSMan
When I run this through task scheduler, I get
Name        Used (GB)  Free (GB)  Provider     Root
----        ---------  ---------  --------     ----
A                                 FileSystem   A:\
Alias                             Alias
C               12.30      11.60  FileSystem   C:\
cert                              Certificate  \
D                                 FileSystem   D:\
Env                               Environment
Function                          Function
HKCU                              Registry     HKEY_CURRENT_USER
HKLM                              Registry     HKEY_LOCAL_MACHINE
Variable                          Variable
WSMan                             WSMan
Note that I'm missing my Q: drive. If there's any way to get this resolved, I'll be able to copy files there....
Network drives, and really all drive letters for that matter, are "mapped" to volumes for a given logon session. When you create a scheduled task, Windows creates a new logon session (even if you are currently logged in) and runs the task in that context. Thus, while you may be logged in and have a Q: drive mapped, the second session that is running the task has a completely different environment; Windows is just nice enough to automatically map C: (and the other physical drives) for all sessions.
You shouldn't need to map a drive when using PowerShell, other than perhaps for convenience. Unlike its cmd.exe predecessor, PowerShell is perfectly happy to change the current directory to a UNC-style path:
cd \\server\share\directory
Is it possible to accomplish what you need without mapping a drive at all? You mentioned copying files; if the task is running with your credentials, and assuming you have permissions to the Q: drive (let's say \\server\share), then your script should be able to do something like:
copy c:\logs\*.log \\server\share\logs
And work just fine without needing to map a drive.
Here is the complete command info for my test that worked; if your environment is different, please note how. The task is configured to run as my domain account, only when I am logged in, with highest privileges, and configured for Windows 7/Server 2008 R2.
The action is to Start a program:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Arguments
-command copy c:\logs\*.log \\server\share\logs
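For reference, roughly the same task could also be created from the command line with schtasks (a sketch; the task name, schedule, and paths are placeholders):
schtasks /Create /TN "CopyLogs" /SC DAILY /ST 02:00 /RL HIGHEST /TR "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -command copy c:\logs\*.log \\server\share\logs"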
Maybe before running get-psdrive in the script, first do something like this:
$net = new-object -comobject Wscript.Network
$net.mapnetworkdrive("Q:","\\path\to\share",0,"domain\user","password")
and after doing your job (copying files..):
$net.removenetworkdrive("Q:")
There is a hack if you don't want to have a password in the script, which I prefer to avoid:
Open your folder with sufficient privileges (a domain user, for example)
Open a PowerShell as Administrator and make a symlink from a local path to the UNC path:
New-Item -ItemType SymbolicLink -Path "C:\LocalTemp" -Value "\\server\share"
You can now use the local symlink path in your PowerShell script directly; it resolves to the UNC target and is opened with the credentials the scheduled task runs under.
There are probably some issues with credentials in scheduled tasks; however, in my opinion this is still better than a clear-text or pseudo-obfuscated password in a script.
