I am attempting to make a PowerShell script to run every night, to copy over PDF files from C:\share1 to C:\share2 and write to the event log.
My current file copy script bit looks like this:
Try
{
    Get-ChildItem -Path C:\Share1 | ForEach-Object { Copy-Item $_.FullName C:\share2 -Verbose }
    # Writes to event log as success.
}
Catch
{
    # Logs the event as failed.
}
The issue I run into is that some of the files being copied/replaced are in use.
When a file is in use, the script stops copying at that file with an error:
PS>TerminatingError(Copy-Item): "The process cannot access the file 'C:/share2/1.pdf' because it is being used by another process."
I would like to modify my script so that it at least continues copying the remaining files.
If, for example, I had 100 files and the 3rd one was in use, the entire transfer stops.
How can I modify my script to continue with the remaining items?
You are looking for the common -ErrorAction parameter, set to SilentlyContinue:
Get-ChildItem -Path C:\Share1 | ForEach-Object { Copy-Item $_.FullName C:\share2 -Verbose -ErrorAction SilentlyContinue }
Note: You don't need the ForEach-Object cmdlet here; Copy-Item accepts pipeline input directly (the destination then goes to the -Destination parameter):
Get-ChildItem -Path C:\Share1 | Copy-Item -Destination C:\share2 -Verbose -ErrorAction SilentlyContinue
Note 2: As gvee mentioned, this will ignore all errors, not just in-use files. Another option would be to use handle.exe from the Sysinternals suite to check whether there is an open handle before copying.
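If you still want the per-file success/failure written to the event log (as in your original Try/Catch), you can move the try/catch inside the loop so one locked file doesn't abort the rest. This is a sketch; the event source name 'MyCopyScript' is a hypothetical placeholder that you would have to register once (elevated) with New-EventLog:

```powershell
# One-time setup (run elevated): New-EventLog -LogName Application -Source 'MyCopyScript'
Get-ChildItem -Path C:\Share1 -Filter *.pdf | ForEach-Object {
    try {
        # -ErrorAction Stop turns the non-terminating error into one the catch block sees.
        Copy-Item -Path $_.FullName -Destination C:\share2 -ErrorAction Stop
        Write-EventLog -LogName Application -Source 'MyCopyScript' -EntryType Information `
            -EventId 1000 -Message "Copied $($_.Name)"
    }
    catch {
        Write-EventLog -LogName Application -Source 'MyCopyScript' -EntryType Error `
            -EventId 1001 -Message "Failed to copy $($_.Name): $_"
    }
}
```

This way locked files are logged and skipped instead of silently ignored.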
Related
I want to stop all processes running from a specific directory: if a process was started from C:\CSV\X, kill it. The server is remote, and users have the files open from different sessions, so how can I kill the processes across all sessions?
$files = Get-ChildItem "C:\CSV\X\*"
foreach ($file in $files) {
    Get-Process |
        Where-Object { $_.Path -eq $file.FullName } |
        Stop-Process -Force -Verbose
}
I tried it, and it doesn't work.
For example, we have a folder named MyItem. Inside the folder there are:
MyItem/software.exe
MyItem/MyItem.exe
MyItem/MainMenu.exe
I look for the processes through fsmgmt.msc.
So I need to find a way to close everything whose source is MyItem/.
If you only want to close 'open files' accessed over the network, you can do something like this:
# Set the path to match against.
$path = 'C:\CSV\X\'
# Get SMB open files under that directory and close them.
# [regex]::Escape handles the backslashes, since -match uses regex.
Get-SmbOpenFile | Where-Object { $_.Path -match [regex]::Escape($path) } | Close-SmbOpenFile
No need to kill processes.
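Since the server is remote, the same approach can be run over a CIM session instead of logging on to the server. This is a sketch; 'FileServer01' is a hypothetical server name, and -Force on Close-SmbOpenFile suppresses the per-file confirmation prompt:

```powershell
# Sketch: close SMB open files under C:\CSV\X on a remote file server.
$session = New-CimSession -ComputerName FileServer01
$path = 'C:\CSV\X\'
Get-SmbOpenFile -CimSession $session |
    Where-Object { $_.Path -match [regex]::Escape($path) } |
    Close-SmbOpenFile -CimSession $session -Force
Remove-CimSession $session
```

Closing the SMB handle forcibly discards any unsaved changes on the client side, so use -Force deliberately.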
I am writing a script in PowerShell and need to delete a few files under APPDATA on Windows. I wrote this line of code, but it doesn't remove the items silently: it asks for confirmation even after using -Confirm:$false. How do I fix this?
My code:
Get-ChildItem -Path $env:APPDATA\"Microsoft\teams\blob_storage" | Remove-Item -Confirm:$false -Force
I get an unwanted confirmation prompt every time I run the script.
The prompt appears because some of the items returned by Get-ChildItem are directories that contain children; Remove-Item asks for confirmation before recursing into a non-empty directory unless you pass -Recurse. Adding it (together with -Force) suppresses the prompt:
Get-ChildItem -Path $env:APPDATA\"Microsoft\teams\blob_storage" | Remove-Item -Recurse -Force
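If the folder may also contain hidden or system items, a slightly more thorough variant (a sketch, not tested against Teams specifically) passes -Force to Get-ChildItem as well, so those items are enumerated and removed too:

```powershell
# Sketch: also pick up hidden/system items inside blob_storage.
# -ErrorAction SilentlyContinue skips files Teams still has locked.
Get-ChildItem -Path "$env:APPDATA\Microsoft\teams\blob_storage" -Force |
    Remove-Item -Recurse -Force -ErrorAction SilentlyContinue
```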
I am calling a PowerShell script on a remote server from my Java EE application through SSH (Apache). The script cannot run in parallel, so I need to check whether it is already running and, if so, wait until it finishes before starting the waiting instance. The script might be called several times, e.g. 20 times in a few seconds, and all of those calls need to run one by one.
Any idea how this can be achieved, or whether it is even possible? Thank you!
The way I've accomplished this in some of my scripts is to use a lock file:
$lockFile = "$PSScriptRoot\LOCK"
if (-not (Test-Path $lockFile)) {
New-Item -ItemType File -Path $lockFile | Out-Null
# Do other stuff...
Remove-Item -Path $lockFile -Force
}
You could modify it to something like this, so waiting instances poll until the lock is released:
$lockFile = "$PSScriptRoot\LOCK"
while (Test-Path $lockFile)
{
Start-Sleep -Seconds 2
}
New-Item -ItemType File -Path $lockFile | Out-Null
# Do other stuff...
Remove-Item -Path $lockFile -Force
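Note that the Test-Path/New-Item pair is not atomic: two waiting instances can both see the lock file disappear and then both proceed. A named system mutex avoids that race, since the operating system serializes acquisition. This is a sketch; 'Global\MyScriptLock' is a hypothetical mutex name (the Global\ prefix makes it visible across sessions):

```powershell
# Sketch: serialize script instances with a named, machine-wide mutex.
$mutex = [System.Threading.Mutex]::new($false, 'Global\MyScriptLock')
try {
    # Blocks until no other instance holds the lock.
    $null = $mutex.WaitOne()
    # Do other stuff...
}
finally {
    # Released even if the work throws, so waiters are never stranded.
    $mutex.ReleaseMutex()
    $mutex.Dispose()
}
```

Unlike a lock file, the mutex is released automatically by the OS if the process dies, so a crash can't leave a stale lock behind.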
I have the following piece of PowerShell code:
$files = Get-ChildItem E:\Local_Files\Performance\*.txt -Recurse
foreach ($file in $files) {
(Get-Content $file.PSPath) |
Where-Object { $_.Trim() -ne "" } |
Set-Content $file.PSPath
}
Move-Item -Path E:\Local_Files\Performance\*.* -Destination E:\Local_Files\ -Force
It deletes empty rows from all files in a folder, then moves every file in that folder to a second one. Z:\ is a mapped network drive for a network folder. If I run the script interactively in PowerShell, it works. When I schedule it in Task Scheduler, only the first part (the empty-row cleanup) works.
I have set up the same username to run the job in both cases. If I use a local folder as the target for Move-Item, it works from Task Scheduler as well.
Do you have any idea why it might not be working?
I am on Windows Server 2012 R2.
Many thanks,
Mapped drives are only mapped when a user logs on interactively. For a scheduled task, there is no interactive logon and therefore no mapped drives, so any attempt to use them will fail.
You can either:
Map the drive letter in your script with New-PSDrive
Use the UNC path to the share (preferred method).
Also bear in mind that the user account under which your task executes must have appropriate permissions on that UNC path/share.
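Either option can be done in a couple of lines at the top of the script. This is a sketch; '\\server\share' is a placeholder for your real UNC path:

```powershell
# Sketch: map the drive inside the script so the scheduled task does not
# depend on an interactive logon having mapped it.
New-PSDrive -Name Z -PSProvider FileSystem -Root '\\server\share' -ErrorAction Stop | Out-Null
Move-Item -Path 'E:\Local_Files\Performance\*.*' -Destination 'Z:\' -Force

# Or skip the drive letter entirely and target the UNC path directly:
# Move-Item -Path 'E:\Local_Files\Performance\*.*' -Destination '\\server\share\' -Force
```

The UNC form is simpler since there is no drive to map or clean up, which is why it is usually preferred for unattended tasks.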
I'm seeing a race condition when calling New-Item to create a directory on a foreign machine using a UNC path. The code is below:
New-Item $target -itemType Directory -Force -Verbose |
%{ Write-Host "Creating dir" $_.FullName }
Using Test-Path immediately afterwards returns false. I added a retry loop that sleeps for 1 second between Test-Path calls, and after one sleep Test-Path returns true.
Is New-Item a blocking call? Should I expect to have to wait after calling New-Item?
I cannot reproduce your problem.
PS > New-Item "test" -itemType Directory -Force -Verbose | %{ Test-Path $_.FullName }
VERBOSE: Performing the operation "Create Directory" on target "Destination: C:\Users\Frode\Desktop\test".
True
New-Item creates a new directory by getting a DirectoryInfo object for the parent directory and calling its CreateSubdirectory method, like:
DirectoryInfo subdirectory = new DirectoryInfo(parentPath).CreateSubdirectory(childName);
I'm not a developer, but AFAIK that means it's a blocking call, since it waits for a DirectoryInfo object in return. So maybe the problem is with your storage subsystem.
Try running the New-Item command in another process and wait for it:
Start-Process powershell -Argument "-Command `"New-Item `"$myNewDir`" -ItemType `"directory`"`"" -NoNewWindow -Wait
I was writing a script that would create a folder and then write a 7-Zip archive into it, but 7-Zip complained that the directory did not exist. This seemed to work around the issue.
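If the delay really comes from the remote storage or SMB caching, a bounded retry loop is a lighter-weight workaround than spawning a second PowerShell process. A sketch, assuming $target holds the UNC path:

```powershell
# Sketch: create the directory, then poll until it is visible (or give up).
New-Item -Path $target -ItemType Directory -Force | Out-Null
$tries = 0
while (-not (Test-Path $target) -and $tries -lt 10) {
    Start-Sleep -Milliseconds 500
    $tries++
}
if (-not (Test-Path $target)) {
    throw "Directory '$target' still not visible after $tries retries."
}
```

Bounding the loop matters: if the share is genuinely unreachable, the script fails with a clear error instead of hanging forever.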