Using PowerShell I would like to invoke the print verb on multiple files. In Windows Explorer I can go into a folder, select a number of files, right-click and choose the print options. This opens up the Print Pictures dialog with all the selected files. I am able to do this for one file using:
$path = "C:\person.jpg";
Start-Process -FilePath $path -Verb Print | Out-Null;
Start-Sleep -s 150;
But I was wondering how I could do it for a number of files.
You can use COM to invoke a verb on multiple files in one operation.
Assuming...
$folderPath = 'X:\Test'
$verbName = 'print'
$verbArguments = ''
...you can print all objects in a folder with...
$application = New-Object -ComObject 'Shell.Application'
$folder = $application.NameSpace($folderPath)
$folderItems = $folder.Items()
Write-Verbose "Folder ""$($folder.Self.Name)"" contains $($folderItems.Count) item(s)."
$folderItems.InvokeVerbEx($verbName, $verbArguments)
I say "objects" because $folderItems will contain both files and folders. The default appears to be enumerate subfolders and ignore hidden objects, while verbs are ignored on objects that don't support them.
If, for example, you wanted to print only files with a certain extension that are immediate children, and to include hidden files, you can do so using the Filter() method...
New-Variable -Name 'SHCONTF_NONFOLDERS' -Option 'Constant' -Value 0x00040
New-Variable -Name 'SHCONTF_INCLUDEHIDDEN' -Option 'Constant' -Value 0x00080
$application = New-Object -ComObject 'Shell.Application'
$folder = $application.NameSpace($folderPath)
$folderItems = $folder.Items()
$folderItems.Filter($SHCONTF_NONFOLDERS -bor $SHCONTF_INCLUDEHIDDEN, '*.jpg')
Write-Verbose "Filtered folder ""$($folder.Self.Name)"" contains $($folderItems.Count) item(s)."
$folderItems.InvokeVerbEx($verbName, $verbArguments)
If you want to print some custom set of files that doesn't nicely align by extension, visibility, etc., then I'm not seeing a way to do that in a single operation. That would seem to require modifying $folderItems or creating a new FolderItems3 instance, and neither appears to be possible. I see there is a ShellFolderView type that supports item selection, but that looks like it's for interacting with an Explorer(-like) window.
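As a fallback for an arbitrary set of files, you could invoke the verb item by item instead of in one batched call (an untested sketch using hypothetical paths; each file gets its own print operation):
$application = New-Object -ComObject 'Shell.Application'
$filePaths = 'X:\Test\a.jpg', 'X:\Test\b.jpg'   # hypothetical list of files to print
foreach ($filePath in $filePaths) {
    $folder = $application.NameSpace((Split-Path $filePath -Parent))
    $item = $folder.ParseName((Split-Path $filePath -Leaf))
    if ($null -ne $item) { $item.InvokeVerb('print') }
}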
Documentation for types and constants used above:
Shell.Application property
Shell.NameSpace method
Folder object
FolderItems3 object
FolderItems2.InvokeVerbEx method
FolderItems3.Filter method
_SHCONTF Enumeration
As far as I know, you can use ForEach-Object to iterate over all the files you want to print, and invoke the print verb on each one of them as follows:
Get-ChildItem "C:\" -Include "*.jpg" | ForEach-Object {start-process $_.FullName –Verb Print}
Start-Sleep -s 150
This will:
get all the .jpg files found in C:\
apply the print verb to each file.
Note
You can use the -Wait parameter of Start-Process instead of Start-Sleep -s 150; it waits for the specified process and its descendants to complete before accepting more input.
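For example, the same pipeline with -Wait instead of the sleep:
Get-ChildItem "C:\*" -Include "*.jpg" | ForEach-Object { Start-Process $_.FullName -Verb Print -Wait }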
I was trying to use the dir command to recursively list all files that end with .cpp in a given directory. I tried to follow various solutions, but my PowerShell does not seem to accept any options after the '/' sign (error screenshot omitted).
The command I initially tried was 'dir sourcefolder "*.cpp"', but it only lists files in the given folder (because I can't provide any additional options, as seen in the Microsoft docs); the example commands provided there also fail for me with the same error.
Here is how to list all the .cpp files, with a small PowerShell script:
$path = "C:\temp\"
$filter = "*.cpp"
$files = Get-ChildItem -Path $path -Filter $filter -Recurse
Write-Host "here, all the .cpp files in '$path' :"
Write-Host $files -Separator "`r`n"
I prefer to use the "Get-ChildItem" cmdlet rather than "dir".
Here is the folder content for my test (screenshot omitted).
And why so many / ?
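Note that in PowerShell, dir is just an alias for Get-ChildItem, so cmd.exe-style switches such as /s don't work; the recursive listing the question asked for is spelled with the -Recurse parameter:
dir -Path $path -Filter $filter -Recurse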
I want to know if any of the folders in a directory have any subfolders or files in them. I tried just looking at the directory in PowerShell, but it gave me only the mode, last write time, and name. Is there any way of adding to this list to include folder metadata such as size or the number of subfiles/folders? All I want to know is whether they are empty or not, so there may be a simpler way I'm missing.
Thanks for any help you can give!
I see the question is tagged 'windows', so on Windows you could also use a COM object.
$fso = New-Object -ComObject Scripting.FileSystemObject
$folder = $fso.GetFolder($pathToFolder)
$folder will be an object with a bunch of interesting metadata on it, including SubFolders and Files. One of the interesting ones is Size. If Size is zero, there are no files in that directory, or in any nested subdirectories either.
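For example, a minimal sketch, assuming $pathToFolder points at an existing directory:
$fso = New-Object -ComObject Scripting.FileSystemObject
$folder = $fso.GetFolder($pathToFolder)
if ($folder.Size -eq 0) {
    # no files anywhere beneath this folder (empty subfolders may still exist)
    "'$pathToFolder' contains no files."
}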
If you just want to know if there are folders/subfolders and/or files then this will work:
$folder="C:\Test"
Get-ChildItem $folder -Recurse | Measure-Object
Output (in my case)
Count : 2
Average :
Sum :
Maximum :
Minimum :
Property :
If you want to see more properties then this might work for you:
Get-ChildItem -Path $folder -Recurse | Format-List *
Alternatively, you can also select the first x or last x items, or even skip items:
Get-ChildItem -Path $folder -Recurse |Select-Object -First 2| Format-List *
* -Recurse will check all folders below the given path
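If what you ultimately want is just an empty/not-empty flag per subfolder, a small sketch like this (assuming $folder from above) reports one for each immediate subfolder:
Get-ChildItem $folder -Directory | ForEach-Object {
    [pscustomobject]@{
        Name    = $_.Name
        # $true when nothing (not even hidden items) exists beneath the folder
        IsEmpty = -not (Get-ChildItem $_.FullName -Recurse -Force | Select-Object -First 1)
    }
}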
I've seen variations of this question answered, but typically using something like 7zip. I'm trying to find a solution that will work with the capabilities that come with windows absent any additional tools.
I have a directory that contains several hundred subdirectories. I need to individually compress each subdirectory, so I'll wind up with several hundred zip files, one per subdirectory. This is on a machine at work where I don't have administrative privileges to install new software, hence the desire to stay away from 7-Zip, WinRAR, etc.
If this has already been answered elsewhere, my apologies...
Never tried that myself, but there is Compress-Archive:
The Compress-Archive cmdlet creates a zipped (or compressed) archive file from one or more specified files or folders. An archive file allows multiple files to be packaged, and optionally compressed, into a single zipped file for easier distribution and storage. An archive file can be compressed by using the compression algorithm specified by the CompressionLevel parameter.
Because Compress-Archive relies upon the Microsoft .NET Framework API System.IO.Compression.ZipArchive to compress files, the maximum file size that you can compress by using Compress-Archive is currently 2 GB. This is a limitation of the underlying API.
Here's a sample script I just hacked together:
# configure as needed
$source = "c:\temp"
$target = "d:\temp\test"
# grab source file names and list them
$files = gci $source -recurse
$files
# target exists?
if( -not (test-path $target)) {
    new-item $target -type directory
}
# compress, I am using -force here to overwrite existing files
$files | foreach{
    $dest = "$target\" + $_.name + ".zip"
    compress-archive $_ $dest -CompressionLevel Optimal -force
}
# list target dir contents
gci $target -recurse
You may have to improve it a bit when it comes to subfolders. In the above version, subfolders are compressed as a whole into a single file. This might not exactly be what you want.
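If you want one zip per immediate subdirectory instead (as in the question), a variation along these lines should work (a sketch reusing $source and $target from above):
Get-ChildItem $source -Directory | ForEach-Object {
    # each top-level subfolder becomes its own archive
    Compress-Archive -Path $_.FullName -DestinationPath (Join-Path $target ($_.Name + '.zip')) -CompressionLevel Optimal -Force
}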
Get-ChildItem c:\path\of\your\folder | ForEach-Object {
    $path = $_.FullName
    Compress-Archive -Path $path -DestinationPath "$path.zip"
}
I put this together as a quick snippet. Don't hesitate to comment if this does not fit your request.
In a folder X, there are subfolders Y1, Y2...
Y1.zip, Y2.zip... will be created.
Using PowerShell, go to the path containing the folders you would like to compress, then do:
$folderlist = Get-ChildItem "." -Directory
foreach ($Folder in $folderlist) {
    Compress-Archive -Path $Folder.Name -DestinationPath "$($Folder.Name).zip"
}
I'm trying to copy files using copy-item. Specifically, I want to copy files with a particular extension that are within a folder or its subfolders to another location, and to retain the subfolder hierarchy. I've tried using -filter and -include to specify the file extension, but no files are copied.
My source and destination paths are stored in variables $packageSourcePath and $objPath. When called, $packageSourcePath will be something like ".\src\projects\Project1\PackageFiles" and $objPath will be something like ".\bld\Project1\obj".
The command I've tried using is this:
Copy-Item -Path $packageSourcePath\* -Filter *.resw -Destination $objPath -Recurse
I've also tried variations, such as leaving off * from the path, or using -Include instead of -Filter. Nothing works. If I leave out the -Filter argument, then files copy, but all of the files are copied. I only want files with the particular extension.
I've given up on Copy-Item. JohnLBevan's answer didn't actually do what I want since all files in the source root get copied, even though they don't match the filter. I tried piping Convert-Path | Select-String | Copy-Item but still got all files in the source root being copied.
A contact in a different context provided a couple of suggestions:
1)
Get-ChildItem -Force -Recurse -ErrorAction Ignore -Path $packageSourcePath -Filter *.resw | % {
    $src = $_.FullName
    $dst = Join-Path $objPath $src.SubString($packageSourcePath.Length)
    echo "copy ""$src"" ""$dst"""
}
I think this is a bit harder to follow, hence less maintainable for the next person (likely another PS-neophyte like me) a year from now. ("Why is the -ErrorAction parameter needed here? What's the behaviour of the Substring() method, and why can't I find that using Get-Help?")
This suggestion is a bit clearer, after re-familiarizing with attrib and checking the effect of the xcopy switches:
2)
cd $packageSourcePath
attrib -a /s
attrib +a *.resw /s
xcopy /eidlm $packageSourcePath $objPath
But if we're going to use xcopy, we don't need to call attrib:
xcopy $packageSourcePath\*.resw $objPath /s /i > $null
The only problem with this for my scenario is that xcopy emits an error if no matching files are found. My script is being used for a VSTS build task, and the xcopy errors cause the build task to fail. (For that reason, I'm guessing that suggestion 2 also wouldn't work for me.)
So, I've opted for this:
# In PS version 5.1, nothing gets copied using Copy-Item $packageSourcePath\* -Filter *.resw ...
# so resorting to using xcopy, which mostly works. The one issue is that xcopy will output an
# error if no matching file is found, so using GCI first to test for a matching file.
if ($(Get-ChildItem $packageSourcePath\*.resw -Recurse).count -gt 0) {
    xcopy $packageSourcePath\*.resw $objPath /s /i > $null
}
The condition using GCI is added to check there are matching files before calling xcopy, thereby avoiding any errors.
I'm still amazed that Copy-Item -Filter -Recurse didn't work.
This should do it (obviously this could be done in 1 line; I've assigned values to the variables just to help make it readable / self-explanatory):
[string]$filter = '*.resw'
[string]$source = Join-Path -Path $packageSourcePath -ChildPath '*'
[string]$target = $objPath
$source | Convert-Path | Copy-Item -Filter $filter -Recurse -Destination $target -Container #-Force
Notes:
We append the asterisk to the source path to ensure that we copy the contents of the source folder to the destination, without copying the source's root folder into the destination (i.e. say we're copying c:\temp\from to c:\temp\to, we don't want c:\temp\to\from (unless it's a copy of c:\temp\from\from)).
We use the Join-Path cmdlet to append this asterisk to ensure the appropriate slashes are inserted into the path.
We do a Convert-Path on the source to resolve the asterisk to the child folder/file names... for some reason copy-item doesn't handle these asterisks well. NB: Convert-Path will potentially return an array of paths; i.e. if there's more than one file/subfolder directly under the source folder. Get-Item or Resolve-Path could equally be used for this; I prefer Convert-Path since it returns a simple string array, rather than a more complex type; but there's no strong argument for using any one over the others.
We pipe these source paths to the Copy-Item command so it can be applied to each path returned by Convert-Path.
We include -Recurse to say we're interested in anything in the subfolders of the copied path.
We include the -Container parameter to say that we want to preserve any folder structure when copying. Strictly this is not needed, as this switch defaults to true (i.e. rather, we should specify if we don't want this behaviour: -Container:$false); but I like to be clear that I deliberately want to preserve the directory structure, as opposed to leaving the assumption that I may not have thought of this. There's a better explanation of this here: https://stackoverflow.com/a/21798660/361842.
You could optionally include -Force; this would mean that should an item of the same name already exist in the target we overwrite it instead of getting an error.
Related documentation:
Join-Path
Convert-Path
Copy-Item
Update 2018-01-03
Per comments, this solution should ensure that only those items you want get copied, and pre-existing directories shouldn't cause issues.
[string]$filter = '*.resw'
[string]$source = $packageSourcePath
[string]$target = $objPath
#copy all files in subfolders of the source
$source | Get-ChildItem -Directory | Copy-Item -Filter $filter -Recurse -Destination $target -Container -Force
#copy all files in root of the source
$source | Get-ChildItem -File -Filter $filter | Copy-Item -Destination $target -Container -Force
This solution uses 2 steps; there's probably a better option, but due to the peculiarities / bug in this cmdlet the above's a reliable option.
I'm converting all the Excel files in a specified directory using the code below. When I try to convert one file to CSV it works as intended, but when I try to convert all the Excel files to CSV, nothing happens: no exception is thrown and the files are not converted.
$path = get-childitem -path "\\sharedrive\excelfiles\" -filter *.xlsx
foreach ($file in $path)
{
    $Excelfilename = $file.fullname
    $CSVfilename = "" + $file.Basename
    $xlCSV = 6
    $Excel = New-Object -comobject Excel.Application
    $Excel.Visible = $False
    $Excel.displayalerts = $False
    $Workbook = $Excel.Workbooks.Open($ExcelFileName)
    $Workbook.SaveAs($CSVfilename,$xlCSV)
    $Excel.Quit()
}
I have run into this same issue before, and this technet article was very useful: https://technet.microsoft.com/en-us/library/ff730962.aspx
Basically you're doing everything right, but the .Quit() method isn't exactly what you're looking for, since it closes the program but not the COM object. We need to go one layer further and call into the .NET Framework to release the COM object, as below, then remove the $Excel variable.
$Excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel)
Remove-Variable Excel
This should give you what you're looking for and clean up your environment appropriately, allowing you to loop back through and make the change to all the .csv's
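Putting it together, one way to restructure the loop is to create Excel once, convert every workbook, then release the COM object at the end (a sketch under the question's assumptions: same share path, and saving each .csv next to its source file, since the original built only a bare base name):
$xlCSV = 6
$Excel = New-Object -ComObject Excel.Application
$Excel.Visible = $false
$Excel.DisplayAlerts = $false
foreach ($file in Get-ChildItem -Path "\\sharedrive\excelfiles\" -Filter *.xlsx) {
    $Workbook = $Excel.Workbooks.Open($file.FullName)
    # save the CSV alongside the source workbook
    $Workbook.SaveAs((Join-Path $file.DirectoryName ($file.BaseName + ".csv")), $xlCSV)
    $Workbook.Close($false)
}
$Excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel) | Out-Null
Remove-Variable Excel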