We have some old applications that communicate via directory-based queues. Each item in the queue is a file, and there's a header file that maintains an ordered list of the filenames for the items in the queue.
Naturally, this old code needs to lock the queue while items are pushed and popped. What it's doing is creating a lock subdirectory, on the assumption that mkdir() is an atomic operation - if multiple processes attempt to create a directory, only one of them is going to succeed.
One of my co-workers has been trying to chase down an obscure problem, and he thinks the cause is that this locking no longer works when the processes are running on different machines and the filesystem in question is mounted on a SAN.
Is there any possibility that he might be correct?
Very old question I know, but I hope someone finds this interesting.
I was also getting confusing results using PowerShell to create a shared folder for use as a mutex, so I created a test script.
# Attempt to acquire the "mutex" by creating the folder; if the
# operation is atomic, creation succeeds for exactly one caller.
function New-FolderMutex {
    try {
        New-Item -ItemType directory -Path .\TheMutex -ErrorAction Stop > $null
        $true
    } catch {
        $false
    }
}

# Release the "mutex" by removing the folder.
function Remove-FolderMutex {
    Remove-Item -Path .\TheMutex
}

# Repeatedly acquire and release; run this simultaneously in two consoles.
1..100 | % {
    if (New-FolderMutex) {
        Write-Host "Inside loop $_"
        Remove-FolderMutex
    }
}
This script is run while the current directory is on a network share.
When I ran this script simultaneously in two separate PowerShell consoles, it was clear from the error messages that the approach was doomed. There are a number of different errors produced by the call to Remove-Item, even though it is being called only by the process that created the folder. It seems that behind the scenes there are a whole bunch of non-atomic steps occurring.
Of course, the OP was asking about mkdir (probably the system call) and my example uses the much higher level PowerShell cmdlets, but I hope this is of some interest.
Example output from one of two processes (edited for brevity)
Inside loop 30
Inside loop 31
Inside loop 32
Inside loop 33
Inside loop 34
Remove-Item : Access to the path 'H:\My Documents\PowerShell\MutexFolder\TheMutex' is denied.
At H:\My Documents\PowerShell\MutexFolder\Test-FolderMutex.ps1:93 char:5
+ Remove-Item -Path .\TheMutex
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : PermissionDenied: (H:\My Documents...Folder\TheMutex:String) [Remove-Item], UnauthorizedAccessException
+ FullyQualifiedErrorId : RemoveItemUnauthorizedAccessError,Microsoft.PowerShell.Commands.RemoveItemCommand
Inside loop 39
Remove-Item : H:\My Documents\PowerShell\MutexFolder\TheMutex is a NTFS junction point. Use the Force parameter to delete or modify.
At H:\My Documents\PowerShell\MutexFolder\Test-FolderMutex.ps1:93 char:5
+ Remove-Item -Path .\TheMutex
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: (H:\My Documents...Folder\TheMutex:DirectoryInfo) [Remove-Item], IOException
+ FullyQualifiedErrorId : DirectoryNotEmpty,Microsoft.PowerShell.Commands.RemoveItemCommand
Inside loop 42
Remove-Item : Could not find a part of the path 'H:\My Documents\PowerShell\MutexFolder\TheMutex'.
At H:\My Documents\PowerShell\MutexFolder\Test-FolderMutex.ps1:93 char:5
+ Remove-Item -Path .\TheMutex
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: (H:\My Documents...Folder\TheMutex:String) [Remove-Item], DirectoryNotFoundException
+ FullyQualifiedErrorId : RemoveItemIOError,Microsoft.PowerShell.Commands.RemoveItemCommand
Inside loop 44
Inside loop 45
When running the following on one of our Windows machines
$name = "My task"
Stop-ScheduledTask -TaskName $name -ErrorAction SilentlyContinue
I get this error:
Stop-ScheduledTask : Cannot connect to CIM server. The system cannot find the file specified.
+ Stop-ScheduledTask -TaskName $name -ErrorAction SilentlyContinue
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ResourceUnavailable: (PS_ScheduledTask:String) [Stop-ScheduledTask], CimJobException
+ FullyQualifiedErrorId : CimJob_BrokenCimSession,Stop-ScheduledTask
It does not look like a permission issue, since that would look more like:
Stop-ScheduledTask : Access is denied.
+ [void](Stop-ScheduledTask -TaskName $name -ErrorAction SilentlyContinue)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : PermissionDenied: (PS_ScheduledTask:Root/Microsoft/...p_ScheduledTask) [Stop-ScheduledTask], CimException
+ FullyQualifiedErrorId : HRESULT 0x80070005,Stop-ScheduledTask
A couple of facts to help:
I'm not running this remotely
Get-ScheduledTask will throw the same CimJob_BrokenCimSession error
Are you an administrator on the system? The error says permission denied (Misread OP)
When troubleshooting, it generally helps to drop -ErrorAction SilentlyContinue if you actually want to see what's erroring. Since the call isn't in a try/catch block you can still see the error here, but suppressing errors doesn't make sense while you're diagnosing one.
More suggestions:
Are you running this command remotely? If you're running it against many servers, check your server list.
What happens if you get all the tasks, e.g. $tasks = get-scheduledtask, then reference the one you want with $tasks | where taskname -like "whatever" | stop-scheduledtask (or by its index number in the array, e.g. $tasks[5])? See the sketch after these suggestions.
I didn't get any errors when I ran the preceding command against tasks that weren't running, but perhaps it does return an error in some cases if the task isn't in the running state.
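Putting those suggestions together, here is a minimal sketch (the task name "My task" comes from the question; the filter is otherwise illustrative):

# Enumerate first, then pipe the matching task to Stop-ScheduledTask.
# Note: no -ErrorAction SilentlyContinue, so any failure stays visible.
$tasks = Get-ScheduledTask
$tasks | Where-Object TaskName -like "My task" | Stop-ScheduledTask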
I have a network share hosted by a server (\\SERVER) which is being accessed by other servers/clients.
\\SERVER\SHARE\Folder\File
If I wanted to compress Folder and everything in it, is it possible to do using PowerShell WITHOUT having the files be downloaded to the machine that is running the command?
The files in question are large, and there is not enough room on the C drive to download and compress the files on the client. So for example if I navigated to the network share from the client using Windows File Explorer, and selected the folder to compress, it would start downloading the files to the client and then fail due to insufficient free space on the client.
What about PowerShell's Invoke-Command Option?
I do have the option to use Invoke-Command from the client to the server; however, the C drive of \\SERVER is too small to handle the request as well. There is a D drive (which hosts the actual \SHARE) with plenty of space, though. I would have to tell PowerShell to compress files on that drive somehow, instead of the default, which would be the C drive.
Error when running the following PowerShell command
Compress-Archive -Path "\\SERVER\SHARE\Folder" -DestinationPath "\\SERVER\SHARE\OtherFolder\Archive.zip"
Exception calling "Write" with "3" argument(s): "Stream was too long."
At
C:\Windows\system32\WindowsPowerShell\v1.0\Modules\Microsoft.PowerShell.Archive\Microsoft.PowerShell.Archive.psm1:820
char:29
+ ... $destStream.Write($buffer, 0, $numberOfBytesRead)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : IOException
The problem is caused by Compress-Archive's limits. Its maximum file size is 2 GB. Documentation mentions this:
The Compress-Archive cmdlet uses the Microsoft .NET API
System.IO.Compression.ZipArchive to compress files. The maximum file
size is 2 GB because there's a limitation of the underlying API.
As for a solution, compress smaller files, or use another tool such as 7-Zip. There's a PowerShell module available for it, though manual compression is not that complex. As 7-Zip is not a native tool, install either it or the PowerShell module first.
# Alias the 7-Zip executable for convenience
Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"
$src = "D:\somedir"
$tgt = "D:\otherdir\archive.7z"
# "a" = add the source to the archive
sz a $tgt $src
If the source files are small enough that no single file will ever create an archive larger than the limit, consider compressing each file by itself. For example:
$srcDir = "C:\somedir"
$dstDir = "D:\archivedir"
# List all the files, not subdirectories
$files = gci $srcDir -Recurse | ? { -not $_.PSIsContainer }
foreach ($f in $files) {
    # Create a new name for the compressed archive. Keep the file path, but
    # replace \ and : with _ so there are no name collisions.
    $src = $f.FullName
    $dst = Join-Path $dstDir ($src.Replace('\', '_').Replace(':', '_') + ".zip")
    # Remove -WhatIf to actually perform the compression
    Compress-Archive -WhatIf -Path $src -DestinationPath $dst
}
As a side note: use Enter-PSSession or Invoke-Command to run the script on the file server. There you can use local paths, though UNC paths should work pretty well too - those are processed via loopback, so the data isn't going through the network.
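For example, a minimal sketch of the remote approach, assuming PowerShell remoting is enabled on SERVER, 7-Zip is installed there, and the share's Folder lives at the illustrative local path D:\SHARE\Folder:

Invoke-Command -ComputerName SERVER -ScriptBlock {
    # Runs on the server itself, so local D: paths are used
    # and no file data crosses the network
    Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"
    sz a "D:\SHARE\OtherFolder\Archive.7z" "D:\SHARE\Folder"
}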
I am playing around with PowerShell. The idea is simple:
I want to verify whether a certain TCP port is open.
I can run this as a PowerShell script, or I can run it in the ISE.
In the ISE everything is fine; the script runs as it is supposed to.
When I run it as a PowerShell script, however, I get this error message:
Method invocation failed because [System.Net.Sockets.TcpClient] does not contain a method named 'ReceiveTimeout'.
At P:\checkTCP80.ps1:7 char:1
+ $tcpClient.ReceiveTimeout(5)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : MethodNotFound
Code:
$servery = gc .\servers.txt
foreach ($server in $servery)
{
    $tcpClient = New-Object System.Net.Sockets.TCPClient
    $tcpClient.ReceiveTimeout(5)
    $tcpClient.Connect($server,80)
    Write-Host ($server, $tcpClient.Connected)
}
I have 2 questions:
How come the output parameter works just fine from the ISE but does not work when this is launched as a script?
How to fix it?
According to the MS documentation on this class, ReceiveTimeout is a property, not a method.
Try changing $tcpClient.ReceiveTimeout(5) to $tcpClient.ReceiveTimeout = 5
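For reference, a corrected sketch of the loop. Note that ReceiveTimeout is expressed in milliseconds, so a value of 5 would be only 5 ms; 5000 is used below as an assumed more sensible value:

$servery = gc .\servers.txt
foreach ($server in $servery)
{
    $tcpClient = New-Object System.Net.Sockets.TCPClient
    # Property assignment, not a method call; the value is in milliseconds
    $tcpClient.ReceiveTimeout = 5000
    $tcpClient.Connect($server, 80)
    Write-Host ($server, $tcpClient.Connected)
    $tcpClient.Close()
}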
I am executing a PowerShell script and this line:
Copy-Item -Path "$A_DIRECTORY" -Destination "$ANOTHER_DIRECTORY" -Recurse -Force
intermittently fails with the following error:
Copy-Item : Access is denied
At C:\mydir\build.ps1:224 char:5
+ Copy-Item -Path "$A_DIRECTORY" -Destination "$ANOTHER_DIRECTORY" ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Copy-Item], UnauthorizedAccessException
+ FullyQualifiedErrorId : System.UnauthorizedAccessException,Microsoft.PowerShell.Commands.CopyItemCommand
Now, the error message is clear, but it does not make sense because the target directory does not exist. I have write permissions and have also tried running as administrator, with the same result.
Why would this statement fail sometimes but not always? Perhaps it is also worth noting that $ANOTHER_DIRECTORY is deleted in the command preceding this one. The delete operation never fails.
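One possible explanation, offered as an assumption rather than a diagnosis: on Windows, a directory can linger in a delete-pending state for a short time after Remove-Item returns, so an immediate Copy-Item to the same path can intermittently hit "Access is denied". A hedged workaround sketch is to retry the copy a few times:

# Retry the copy, since the just-deleted destination may still be releasing
$attempts = 0
while ($true) {
    try {
        Copy-Item -Path "$A_DIRECTORY" -Destination "$ANOTHER_DIRECTORY" -Recurse -Force -ErrorAction Stop
        break
    } catch {
        if (++$attempts -ge 5) { throw }   # give up after 5 tries
        Start-Sleep -Milliseconds 500
    }
}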
I am trying to use a script that ignores I/O errors on a HD, to copy whatever is good there into another HD.
I found this script here : http://81.165.15.172:1983/blog/2013/06/02/ignoring-device-io-errors-during-copy-with-powershell/comment-page-1/
(https://raw.github.com/DavorJ/PS-ForceCopy/master/Force-Copy.ps1)
that does just that... but I can't get it to work.
I am trying with command :
.\Force-Copy.ps1 -SourceFilePath "I:\Downloads\" -DestinationFilePath "H:\Downloads" -MaxRetries 6
but it gives me this weird error:
F:\SSDU\Desktop\Force-Copy.ps1 : Cannot validate argument on parameter 'SourceFilePath'. The "Test-Path -LiteralPath $_ -Type Leaf" validation script for the argument with value "I:\Downloads\" did not return true. Determine why the validation script failed and then try the command again.
At line:1 char:34
+ .\Force-Copy.ps1 -SourceFilePath "I:\Downloads\" -DestinationFilePath "H:\Downlo ...
+ ~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Force-Copy.ps1], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationError,Force-Copy.ps1
Does anyone know how to use this on Windows 8 64-bit?
-Thanks
So, it's not a PowerShell solution, but for getting whatever you can off a dying drive I recommend using Roadkil's Unstoppable Copier. You can download it from the author at:
http://www.roadkil.net/program.php/P29/Unstoppable%20Copier
I have had good success with that one.
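As for the original error itself: judging from the validation script quoted in the message ("Test-Path -LiteralPath $_ -Type Leaf"), the script's -SourceFilePath must point to a single file (a leaf), not a directory like "I:\Downloads\". A hedged example invocation, with a placeholder file name and the assumption that -DestinationFilePath likewise expects a file path:

# Pass a file, not a folder; "somefile.ext" is a hypothetical name
.\Force-Copy.ps1 -SourceFilePath "I:\Downloads\somefile.ext" -DestinationFilePath "H:\Downloads\somefile.ext" -MaxRetries 6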