I have a simple PowerShell script that copies files from a mapped network drive if they were modified in the past day.
$source = "Z:\\"
$target = "E:\target"
$files = get-childitem $source
foreach ($file in $files) {
if($file.LastWriteTime -ge (get-date).AddDays(-1)) {
Copy-Item $file.FullName $target
}
}
This script runs fine if I execute it manually.
If I run it from a scheduled task, the copy does not happen. I confirmed the script itself is running by having it create a directory.
If I copy from a local drive instead of the network drive, the script works fine as a scheduled task.
The scheduled task runs under an admin account. In short: copying from the network drive works manually but not via the scheduled task, while copying from a local drive works either way.
Any ideas?
Try specifying the full UNC path rather than a network drive. (Network drives are a per-user configuration item.)
Alternatively, map the drive as a temporary PowerShell drive by adding the following as the first line of the script:
New-PSDrive -Name Z -PSProvider FileSystem -Root \\server\sharename
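For example, the script from the question could target the share directly; here \\server\sharename is only a stand-in for whatever share Z: actually points to:
# UNC stand-in for the Z: mapping; adjust to your actual share
$source = "\\server\sharename"
$target = "E:\target"
Get-ChildItem $source |
    Where-Object { $_.LastWriteTime -ge (Get-Date).AddDays(-1) } |
    Copy-Item -Destination $target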
I have a PowerShell script that runs at user login on one VM (00000281):
(Get-Content C:\sample.txt -TotalCount 1) | Set-Content C:\sample.txt
To execute the script automatically I have created a .cmd file and placed it in the following folder:
C:\Users\vs_domadmin\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup
The above VM (00000281) resides in my infra VM cluster, which contains three more infra servers. I have four servers in total, and I would like the script to run on all of them:
00000281, 00000282, 00000283, 00000284
At the moment my script runs only on 00000281. Here is the trick:
I need the script from 00000281 to execute on the rest of the infra servers as well (00000282, 00000283, 00000284), even though the script is stored only on my first infra server, 00000281. I believe that giving the script some conditions and specifying the paths of the other three VMs on VM 00000281 should do the work, correct?
VMs 00000282, 00000283, and 00000284 are remote machines with respect to the current user on VM 00000281.
You have to run the script on each machine itself using PowerShell Remoting, which means configuring all four machines to run remote PowerShell scripts.
This is done by enabling the WinRM remoting service on every machine.
For step by step details:
How to Run PowerShell Commands on Remote Computers
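For reference, a minimal sketch of that setup, assuming an elevated prompt on each machine (the TrustedHosts step is only needed when the machines are not in the same domain):
# Run on each of the four machines (elevated prompt)
Enable-PSRemoting -Force
# Only when the machines are not domain-joined, run on 00000281 (the initiating machine):
# Set-Item WSMan:\localhost\Client\TrustedHosts -Value "00000282,00000283,00000284" -Force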
Modify the script on VM 00000281 and add this as the last line:
Invoke-Command -ComputerName 00000282, 00000283, 00000284 -FilePath path_to_script_file.ps1
This way you run the same script concurrently on all the machines.
The account you log in with must have the appropriate permissions on these machines.
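If the logged-in account lacks rights on the remote machines, Invoke-Command can also take explicit credentials; a sketch, reusing the placeholder script path from above:
# Prompt for an account that has rights on the remote machines
$cred = Get-Credential
Invoke-Command -ComputerName 00000282, 00000283, 00000284 -FilePath path_to_script_file.ps1 -Credential $cred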
Solution identified:
$File = "C:\sample.txt"
Get-Content $File -TotalCount 1 | Set-Content $File
"00000282", "00000283", "00000284" | % {
$FileUnc = "\\$($_)\$($File.Replace(':', '$'))"
Get-Content $FileUnc -TotalCount 1 | Set-Content $FileUnc
}
I have the following piece of PowerShell code:
$files = Get-ChildItem E:\Local_Files\Performance\*.txt -Recurse
foreach ($file in $files) {
    (Get-Content $file.PSPath) |
        Where-Object { $_.Trim() -ne "" } |
        Set-Content $file.PSPath
}
Move-Item -Path E:\Local_Files\Performance\*.* -Destination E:\Local_Files\ -Force
It deletes empty rows from every file in a folder, then moves all files in that folder to a second one. Z:\ is a drive mapped to a network folder. If I run the script in PowerShell, it works. When I schedule it in Task Scheduler, only the first part (the empty-row removal) runs.
I have set up the same user account to run the job in both cases. If I use a local folder as the target for Move-Item, it works in Task Scheduler as well.
Do you have any idea why it might not be working?
I am on Windows Server 2012 R2.
Many thanks,
Mapped drives are only mapped when a user logs on interactively. For a scheduled task, there is no interactive logon and therefore no mapped drives, so any attempt to use them will fail.
You can either:
Map the drive letter in your script with New-PSDrive
Use the UNC path to the share (preferred method).
Also bear in mind that the user account under which your task executes must have appropriate permissions on that UNC path/share.
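A minimal sketch of both options, assuming \\server\share stands in for the share behind your Z: mapping:
# Option 1: map the letter inside the script (visible only to this session)
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\server\share" | Out-Null
Move-Item -Path E:\Local_Files\Performance\*.* -Destination Z:\ -Force
# Option 2 (preferred): skip the drive letter and use the UNC path directly
Move-Item -Path E:\Local_Files\Performance\*.* -Destination \\server\share -Force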
I'm currently trying to pull msinfo data from a remote server and then save that output onto a share located on another server. When I run the command, a progress bar appears and then completes without apparent issue, but the file isn't saved to the UNC path. I've verified that I have permissions on the share and that the nfo generation itself works. Any ideas?
C:\Windows\system32>msinfo32 /computer servername /nfo \\sharename\filename.nfo
Very strange: it works from CMD but not from PowerShell. I haven't had time to explore why, but if you need to run it in PowerShell you can use this workaround:
$TempFile = [System.IO.Path]::GetTempFileName()
# Generate the report to a local temp file first
C:\Windows\system32\msinfo32.exe /computer Computer /nfo $TempFile
# msinfo32 returns immediately, so wait until it has finished writing
do {
    Start-Sleep -Seconds 5
} until (-not (Get-Process msinfo32 -ErrorAction SilentlyContinue))
# Copy the result to the share, then clean up
Copy-Item $TempFile \\Computer\Share\output.nfo
Remove-Item $TempFile -Force
Got it figured out: I was able to use the switch-user parameter to save the file after pulling it from the server.
Thanks!
I need to be able to launch a .cmd file that is on a remote machine, from within the directory where the file resides on that machine.
I've tried invoke-command -ComputerName test123 -ScriptBlock { cmd /c c:/myfile.cmd } in PowerShell, which launches the .cmd but then fails because it can't find the other .cmd files that this one launches (they all reside in the same directory).
Is there a way to launch this .cmd file and have its execution persist, i.e. have the .cmd continue to run on the remote machine even after the PowerShell window is closed?
You need to change the working directory in the scriptblock. Add a Set-Location before calling the batch script:
Invoke-Command -ComputerName test123 -ScriptBlock {
Set-Location 'C:\'
& cmd /c ".\myfile.cmd"
}
If you need to create a detached process, you can do that for instance via WMI:
$hostname = 'test123'
$command = 'C:\path\to\script.cmd'
$workdir = 'C:\working\directory'
$p = [wmiclass]"\\$hostname\root\cimv2:Win32_Process"
$p.Create($command, $workdir)
Note that you need admin privileges on the remote host for this.
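On newer systems the same detached process can be created with the CIM cmdlets instead of the [wmiclass] accelerator; a sketch using the same variables:
# Create the process on the remote host via CIM (WinRM-based)
$procArgs = @{ CommandLine = $command; CurrentDirectory = $workdir }
Invoke-CimMethod -ComputerName $hostname -ClassName Win32_Process -MethodName Create -Arguments $procArgs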
I'm using the following PowerShell script to monitor new files coming into an IBM iSeries shared folder.
# variables
#$folder = "\\10.10.0.120\transform\BE\FORM"
#$folder = "C:\Users\Administrator.ALI\Desktop\AS400"
#$folder = "\\nb091002\Temp"
$folder = "I:\"
$filter = "*.txt"
$aswform = "C:\ASWFORM\aswform.exe"
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = $folder
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $false
$watcher.NotifyFilter = [System.IO.NotifyFilters]::LastWrite -bor [System.IO.NotifyFilters]::FileName
while ($true) {
    $result = $watcher.WaitForChanged([System.IO.WatcherChangeTypes]::Changed -bor [System.IO.WatcherChangeTypes]::Renamed -bor [System.IO.WatcherChangeTypes]::Created, 2000)
    if ($result.TimedOut) {
        continue
    }
    Write-Host $result.Name
    #$aswform $folder
}
This seems to work fine on local folders or domain shares.
I've tried mapping the iSeries shared folder to a network drive but it doesn't work.
(10.10.0.120 is the AS400)
I'm pretty sure it has something to do with credentials...
The strange thing is that I can access the shared folder from within Windows perfectly.
Does anybody have any clues or tips for me?
PS: one small detail: I'll be running this script through Task Scheduler with this command:
powershell -NoExit -WindowStyle Hidden -File "C:\ASWFORM\watcher.ps1"
But first I need it to work when running the script manually!
I have not been able to get FileSystemWatcher to work unless the target directory was a Windows NTFS drive. If I specify the drive letter of the mapped directory I get
Exception setting "Path": "The directory name W:\ is invalid."
If I use the UNC I get
Exception calling "WaitForChanged" with "2" argument(s): "Error reading the \\path.to.my.ibm.i\root\ directory."
Against a Novell file server I also get "the directory name is invalid" when I use a drive letter. If I use a UNC path against the Novell server it does run, but it doesn't detect any changes to the file system. It works fine against a local drive and also against a Windows file server on my network.
I solved the problem by writing a small C# console application that polls the folder instead of using the .NET FileSystemWatcher object.
I manually installed this program as a service (using instsrv.exe) and it seems to be running OK.
If you want the code, please send me a PM and I'll see to it that you get it.
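For reference, the polling idea can be sketched in PowerShell as well (this is an illustration of the approach, not the author's C# service):
# Poll the share every 5 seconds and report new or changed .txt files
$folder = "\\10.10.0.120\transform\BE\FORM"
$seen = @{}
while ($true) {
    Get-ChildItem -Path $folder -Filter *.txt | ForEach-Object {
        if (-not $seen.ContainsKey($_.FullName) -or $seen[$_.FullName] -lt $_.LastWriteTimeUtc) {
            $seen[$_.FullName] = $_.LastWriteTimeUtc
            Write-Host $_.Name
        }
    }
    Start-Sleep -Seconds 5
}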