I have a line in my script that downloads a large video file. After the download starts, I want to open the file while it is still downloading. The problem is that the download command hasn't finished yet, so the script stays stuck on the same line.
(Download-File command)
$allFiles = Get-ChildItem | Select-Object -Property Name, LastAccessTime | Measure-Object -Property LastAccessTime -Maximum
$videoFile = Get-ChildItem | Where-Object { $_.LastAccessTime -eq $allFiles.Maximum }
Start-Process $videoFile
(I want this to run in a loop while the Download-File command is running.)
That should be easy. All you need to do is run it on a different thread, using background jobs or runspaces. The example below uses a background job.
$ScriptBlock = { (Download-File command) }
Start-Job -ScriptBlock $ScriptBlock
do {
    $allFiles = Get-ChildItem | Select-Object -Property Name, LastAccessTime | Measure-Object -Property LastAccessTime -Maximum
    $videoFile = Get-ChildItem | Where-Object { $_.LastAccessTime -eq $allFiles.Maximum }
    Start-Process $videoFile
} while ($true)
Although, I am not sure you would want to open the video file in a loop. Even if your player supports opening an incomplete video file, you will end up with as many instances of it as loop iterations. Better to enclose the Start-Process call in an if (!(Get-Process -Name $playerName)) {} block (note that Get-Process matches the player's process name, not the video file) to prevent that.
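For example, a minimal sketch of that guard (the player name 'vlc' is hypothetical; substitute whichever process actually opens the file). Checking $job.State also lets the loop stop once the download finishes, instead of spinning forever:
$job = Start-Job -ScriptBlock { (Download-File command) }
while ($job.State -eq 'Running') {
    # Only launch the player if it is not already open ('vlc' is a stand-in)
    if (-not (Get-Process -Name 'vlc' -ErrorAction SilentlyContinue)) {
        $allFiles = Get-ChildItem | Measure-Object -Property LastAccessTime -Maximum
        $videoFile = Get-ChildItem | Where-Object { $_.LastAccessTime -eq $allFiles.Maximum }
        Start-Process $videoFile
    }
    Start-Sleep -Seconds 1
}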
Use the .WaitForExit() method.
Example:
$proc = Start-Process cmd.exe -PassThru
$proc.WaitForExit()
After the $proc.WaitForExit() line you can open your file.
It looks much better than a loop.
I have two identical running processes called RocketLeague.exe.
I want to store one of the processes' PIDs in a variable for later use in another command, by matching the processes' full file paths.
I have come up with two commands so far that pipe out the full paths of the processes, but I can't figure out how to continue piping the right PID into a final custom command.
How can I store the correct PID in a variable for use in my custom command?
1) Get-Process -Name 'RocketLeague' | Format-List -Property Path
2) Get-Process -Name 'RocketLeague'
Using the feedback from user lit, I was able to come up with this solution.
$procID = Get-Process -Name 'RocketLeague' | Select-Object -Property Id, Path | ForEach-Object {
    if ($_.Path -eq 'C:\Program Files (x86)\Steam\steamapps\common\rocketleague\Binaries\Win64\RocketLeague.exe') {
        Set-Variable -Name 'procSteam' -Value $_.Id; Write-Host $procSteam
    }
}
If you just want the specific process whose Path equals that string, you can use Where-Object or the .Where() method for filtering. The code reduces to:
# This:
$procID = (Get-Process -Name 'RocketLeague').Where({
$_.Path -eq 'C:\Program Files (x86)\Steam\steamapps\common\rocketleague\Binaries\Win64\RocketLeague.exe'
}) | Select-Object -Property Id, Path
# Or this:
$procID = Get-Process -Name 'RocketLeague' | Where-Object {
$_.Path -eq 'C:\Program Files (x86)\Steam\steamapps\common\rocketleague\Binaries\Win64\RocketLeague.exe'
} | Select-Object -Property Id, Path
And if, for example, only one of the paths ends with Win64\....exe, you can use the .EndsWith() method:
$procID = (Get-Process -Name 'RocketLeague').Where({
([string]$_.Path).EndsWith('\Win64\RocketLeague.exe')
}) | Select-Object -Property Id, Path
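From there, using the stored value in a later command is just a matter of referencing its Id property. A hypothetical example (Stop-Process is only a stand-in for your custom command, and -WhatIf keeps it harmless):
# $procID holds an object with Id and Path; pass the Id on
Stop-Process -Id $procID.Id -WhatIf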
I want to move all pictures from several folders to one destination folder, if they are listed in my txt-file.
The script works, but with about 81k pictures and 450k names (e.g. sample-green-bigpic-detail-3.jpg) in the txt-file, it is damn slow.
Is there a way to script it so it works faster?
$qpath = "c:\sample\picz\"
$Loggit = "c:\sample\pic_move.log"
$txtZeileU = "c:\sample\names.txt"
$d_pic = "C:\sample\moved_picz"
$arrZeileU = Get-Content -Path $txtZeileU
foreach ($Zeile in $arrZeileU) {
Get-ChildItem -Path $qpath -Recurse |
where {$_.Name -eq $Zeile} |
Move-Item -Destination $d_pic -Verbose -Force *>&1 |
Out-File -FilePath $Loggit -Append
}
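The main cost here is that the folder tree is re-scanned once per name, which makes the work roughly names × files. A sketch of one way around that (using the same paths as above): read all names into a hashtable once (hashtable keys are case-insensitive in PowerShell), then walk the tree a single time and move any file whose name is in the table.
$qpath = "c:\sample\picz\"
$Loggit = "c:\sample\pic_move.log"
$txtZeileU = "c:\sample\names.txt"
$d_pic = "C:\sample\moved_picz"
# Build a lookup table of wanted names: O(1) per check
$wanted = @{}
foreach ($Zeile in Get-Content -Path $txtZeileU) { $wanted[$Zeile] = $true }
# Enumerate the tree once instead of once per name
Get-ChildItem -Path $qpath -Recurse |
    Where-Object { -not $_.PSIsContainer -and $wanted.ContainsKey($_.Name) } |
    Move-Item -Destination $d_pic -Verbose -Force *>&1 |
    Out-File -FilePath $Loggit -Append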
I'm running a PowerShell script that, when run from the ISE, outputs one set of values, but when the same task is run through Task Scheduler, it seems to add a second value that doesn't appear when run manually. The code being executed is below:
Import-Module WebAdministration
$app_pool_name = <<app_pool_name_goes_here>>
$memused = ""
$cpuused = ""
$datetime = Get-Date -Format s
$memused = Get-WmiObject Win32_Process | where CommandLine -Match "$app_pool_name"
$id = dir IIS:\AppPools\$app_pool_name\WorkerProcesses\ | Select-Object -ExpandProperty processId
$cpuUsed = Get-WmiObject Win32_PerfFormattedData_PerfProc_Process | where IDProcess -Match $id
Add-Content -Path C:\Winsys\PSOutput\$app_pool_name-CPU-RAM_test.txt -Value "$datetime,$($memUsed.workingsetsize),$($cpuUsed.PercentProcessorTime)"
When running the script manually the output returned is:
Date,Mem,CPU
2016-08-02T14:09:36,15062687744,0
2016-08-02T14:09:38,15062425600,0
When running the script through task scheduler the output returned is:
Date,Mem,CPU
2016-08-02T13:58:25,15065047040 624189440,0
2016-08-02T14:05:01,15061901312 624713728,0
The difference is the Mem value: for some reason it's adding an extra number. Does anyone know why this is?
Turns out this was my own error: there are two app pools with very similar names, and the -match was catching both. It still didn't explain why it only showed both under Task Scheduler and not in the ISE. Ah well, it's resolved now by adding an -and -notmatch "text" clause.
E.g.
Get-WmiObject Win32_process | where {$_.CommandLine -Match "$app_pool_name" -and $_.CommandLine -notmatch "<<text in other command line>>"}
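Another option that sidesteps substring collisions entirely: the script already pulls the worker process id from IIS, so the memory lookup can match on the exact ProcessId instead of the command line. A sketch:
# Exact id comparison cannot accidentally match a similarly named pool
$id = dir IIS:\AppPools\$app_pool_name\WorkerProcesses\ | Select-Object -ExpandProperty processId
$memused = Get-WmiObject Win32_Process | Where-Object { $_.ProcessId -eq $id }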
I need my program to give me every folder containing files that are over Windows' character limit. That means if a file's path has more than 260 characters (248 for folders), I need it to write the address of the file's parent, and to write it only once. For now, I'm using this code:
$maxLength = 248
Get-ChildItem $newPath -Recurse |
Where-Object { ($_.FullName.Length -gt $maxLength) } |
Select-Object -ExpandProperty FullName |
Split-Path $_.FullName
But the Split-Path won't work (this is the first time I've used it). It tells me the -Path parameter has a null value (I can write -Path explicitly, but it doesn't change anything).
If you want an example of what I need: imagine folder3 has a 230-character address and file.txt has a 280-character address:
C:\users\folder1\folder2\folder3\file.txt
Would write:
C:\users\folder1\folder2\folder3
I'm using PS2, by the way.
Spoiler: the tool you are building may not be able to report paths over the limit, since Get-ChildItem cannot access them. You can try nevertheless, and also find other solutions in the links at the bottom.
Issue in your code: $_ only works in specific contexts, for example inside a ForEach-Object loop.
But here, at the end of the pipeline, you're only left with a string containing the full path (not the complete file object any more), so passing it directly to Split-Path should work:
$maxLength = 248
Get-ChildItem $newPath -Recurse |
Where-Object { ($_.FullName.Length -gt $maxLength) } |
Select-Object -ExpandProperty FullName |
Split-Path
as "C:\Windows\System32\regedt32.exe" | Split-Path would output C:\Windows\System32
Sidenote: what do (Get-Item C:\Windows\System32\regedt32.exe).DirectoryName and (Get-Item C:\Windows\System32\regedt32.exe).Directory.FullName output on your computer? These both show the directory on my system.
Adapted code example:
$maxLength = 248
Get-ChildItem $newPath -Recurse |
Where-Object { ($_.FullName.Length -gt $maxLength) } |
ForEach-Object { $_.Directory.FullName } |
Select-Object -Unique
Additional information about MAX_PATH:
How do I find files with a path length greater than 260 characters in Windows?
Why does the 260 character path length limit exist in Windows?
http://www.powershellmagazine.com/2012/07/24/jaap-brassers-favorite-powershell-tips-and-tricks/
https://msdn.microsoft.com/en-us/library/windows/desktop/aa365247%28v=vs.85%29.aspx
https://gallery.technet.microsoft.com/scriptcenter/Get-ChildItemV2-to-list-29291aae
You cannot use Get-ChildItem to list paths greater than the Windows character limit.
There are a couple of alternatives for you. Try an external library like AlphaFS, or use robocopy. Boe Prox has a script that utilizes robocopy; it is available on TechNet, but I am not sure whether it will work on PS v2. Anyway, you can give it a try.
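For reference, a rough sketch of what the robocopy route can look like (this is not Boe Prox's script, just a minimal illustration; /l lists without copying, so the destination argument C:\__dummy__ is a throwaway, and the other switches suppress headers, sizes, classes and directory lines so only full file paths remain; exact output formatting varies by robocopy version):
$maxLength = 248
robocopy $newPath C:\__dummy__ /l /e /njh /njs /fp /nc /ns /ndl |
    ForEach-Object { $_.Trim() } |
    Where-Object { $_.Length -gt $maxLength } |
    Split-Path |
    Select-Object -Unique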
I've had a similar problem and resolved it like this:
$PathTooLong = @()
Get-ChildItem -LiteralPath $Path -Recurse -ErrorVariable +e -ErrorAction SilentlyContinue
$e | where {$_.Exception -like 'System.IO.PathTooLongException*'} | ForEach-Object {
    $PathTooLong += $_.TargetObject
    $Global:Error.Remove($_)
}
$PathTooLong
On every path that is too long, or that the PowerShell engine can't handle, Get-ChildItem will throw an error. These errors are saved in the error variable e (the + prefix in -ErrorVariable +e appends to the variable instead of overwriting it).
When all errors are collected in $e you can filter out the ones you need by checking the error Exception for the string System.IO.PathTooLongException.
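And since the original goal was to get each parent folder only once, the collected paths reduce to that in one line (assuming, as above, that TargetObject holds the failing path):
# Each offending parent folder, listed once
$PathTooLong | Split-Path | Select-Object -Unique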
Hope it helps you out.
I have a file transfer/sync job that copies files from the main network into a totally secure network using a custom protocol (i.e. no SMB). The problem is that because I can't look back to see what files exist, the destination is filling up: the copy doesn't remove any files it hasn't touched (the way robocopy /MIR does).
Initially I wrote a script that:
1. Opens the log file and grabs the file paths out (this is quite quick and painless)
2. Does a Get-ChildItem on the destination folder (now using dir /s /b as it's way faster than gci)
3. Compares the two, and then removes the differences.
The problem is that there are more jobs that require this clean-up, but the log files are 100MB and the folders contain 600,000 files, so it's taking ages and using tons of memory. I have actually yet to see one finish. I'd really like some ideas on how to make this faster (memory/CPU use doesn't bother me too much, but speed is essential).
$destinationMatch = "//server/fileshare/folder/"
The log file contains some headers and footers and then 600,000 lines like this one:
"//server/fileshare/folder/dummy/deep/tags/20140826/more_stuff/Deeper/2012-07-02_2_0.dat_v2" 33296B 0B completed
Here's the script:
[CmdletBinding(SupportsShouldProcess=$True)]
param(
[Parameter(Mandatory=$True)]
[String]$logName,
[Parameter(Mandatory=$True)]
[String]$destinationMatch
)
$logPath = [string]("C:\Logs\" + $logName)
$manifestFile = gci -Path $logPath | where {$_.name -match "manifest"} | sort creationtime -descending | select Name -first 1
$manifestFileName = [string]$manifestFile.name
$manifestFullPath = $logPath + "\" + $manifestFileName
$copiedList = @()
(gc $manifestFullPath -ReadCount 0) | where {$_.trim() -match $DestinationMatch} | % {
    if ($_ -cmatch '(?<=")[^"]*(?=")') {
        $copiedList += ($matches[0]).replace("/","\")
    }
}
$dest = $destinationMatch.replace("/","\")
$actualPathString = (gci -Path $dest -Recurse | select fullname).fullname
Compare-Object -ReferenceObject $copiedList -DifferenceObject $actualPathString -PassThru | % {
    $leaf = Split-Path $_ -Leaf
    if ($leaf.Contains(".")) {
        $fsoData = gci -Path $_
        if (!($fsoData.PSIsContainer)) {
            Remove-Item $_ -Force
        }
    }
}
$actualDirectory | where {$_.PSIsContainer -and @(gci -LiteralPath $_.FullName -Recurse -WarningAction SilentlyContinue -ErrorAction SilentlyContinue | where {!$_.PSIsContainer}).Length -eq 0} | Remove-Item -Recurse -Force
OK, so let's assume that your file copy preserves the last-modified timestamp. If you really need to pull a directory listing and compare it against a log, I think you're doing a decent job of it. The biggest slowdown is obviously going to be pulling your directory listing; I'll address that shortly. For now, I would propose the following modification of your code:
[CmdletBinding(SupportsShouldProcess=$True)]
param(
[Parameter(Mandatory=$True)]
[String]$logName,
[Parameter(Mandatory=$True)]
[String]$destinationMatch
)
$logPath = [string]("C:\Logs\" + $logName)
$manifestFile = gci -Path $logPath | where {$_.name -match "manifest"} | sort creationtime -descending | select -first 1
$RegExPattern = [regex]::escape($DestinationMatch)
$FilteredManifest = gc $manifestfile.FullName | where {$_ -match "`"($RegexPattern[^`"]*)`""} | % {$matches[1] -replace '/','\'}
$dest = $destinationMatch.replace("/","\")
$DestFileList = gci -Path $dest -Recurse | select Fullname,Attributes
$DestFileList | Where-Object {$FilteredManifest -notcontains $_.FullName -and $_.Attributes -notmatch "Directory"} | ForEach-Object {Remove-Item -LiteralPath $_.FullName -Force}
$DestFileList | Where-Object {$FilteredManifest -notcontains $_.FullName -and $_.Attributes -match "Directory" -and @(gci -LiteralPath $_.FullName -Recurse -WarningAction SilentlyContinue -ErrorAction SilentlyContinue).Length -eq 0} | ForEach-Object {Remove-Item -LiteralPath $_.FullName -Recurse -Force}
This stops you from duplicating effort. There's no need to get your manifest file and then assign different variables to different properties of the file object; just reference them directly. Then later, when you pull your directory listing of the drive (the slow part here), keep the full name and attributes of the files/folders. That way you can easily filter against Attributes to see what's a directory and what's not, so we can deal with files first, then clean up directories after the files are done.
That script should be a somewhat more streamlined version of yours. Now, about pulling that directory listing... Here's the deal: using Get-ChildItem is going to be slower than some alternatives (such as dir /s /b), but it stops you from having to duplicate effort by later checking what's a file and what's a directory. I suppose if the actual files/folders that you are concerned with are a small percentage of the total, then the double work may actually be worth it: pull the list with something like dir /s /b, parse it against the log, and only pull folder/file info for the specific items you need to address.
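If you do go the dir /s /b route, the listing step might look something like this sketch (/a-d restricts the listing to files, so no Attributes check is needed; the paths come back as bare strings):
# Fast file-only listing via cmd; returns plain full-path strings
$DestFileList = cmd /c "dir /s /b /a-d `"$dest`""
$DestFileList | Where-Object { $FilteredManifest -notcontains $_ } |
    ForEach-Object { Remove-Item -LiteralPath $_ -Force }
One more note on speed: -notcontains scans the whole manifest list for every file, so with 600,000 entries a hashtable lookup would cut that down considerably.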