I am trying to create an Init.ps1 script for a NuGet package that adds items to a solution folder, which may or may not already exist in the solution.
The script successfully adds or replaces the files in the filesystem. If there was no such solution folder, the items are added to the newly created folder as expected. However, if the solution folder already existed, a new solution folder named NewFolder1 (or NewFolder2 and so on, if that already exists) is created and no files are added to the solution.
param($installPath, $toolsPath, $package)

$solutionNode = Get-Interface $dte.Solution ([EnvDTE80.Solution2])
$solutionItemsNode = $solutionNode.FindProjectItem("FolderName")
$solutionItemsProjectItems = Get-Interface $solutionItemsNode.ProjectItems ([EnvDTE.ProjectItems])
if (!$solutionItemsProjectItems) {
    $solutionItemsNode = $solutionNode.AddSolutionFolder("FolderName")
    $solutionItemsProjectItems = Get-Interface $solutionItemsNode.ProjectItems ([EnvDTE.ProjectItems])
}

$rootDir = (Get-Item $installPath).Parent.Parent.FullName
$deploySource = Join-Path $installPath '\sln\'
$deployTarget = Join-Path $rootDir '\FolderName\'
New-Item -ItemType Directory -Force -Path $deployTarget

Get-ChildItem $deploySource | ForEach-Object {
    $targetFile = Join-Path $deployTarget $_.Name
    Copy-Item $_.FullName $targetFile -Recurse -Force
    $solutionItemsProjectItems.AddFromFile($targetFile) > $null
} > $null
I have tried various alterations, such as checking $solutionItemsNode instead of $solutionItemsProjectItems (same effect), adding directly to the solution instead of under a solution folder (did nothing), and the following change, which behaves the same as the original:
$solutionItemsNode = $solution.Projects | Where-Object { $_.ProjectName -eq "FolderName" } | Select-Object -First 1
if (!$solutionItemsNode) {
    $solutionItemsNode = $solutionNode.AddSolutionFolder("FolderName")
}
Any pointers?
Related Q&As and MSDN links:
adding-solution-level-items-in-a-nuget-package
nuget-how-to-add-files-to-solution-folder
copy-files-to-solution-folder-with-init-ps1-and-nuget
link from above answer
MSDN: envdte80.solutionfolder
When you add a solution folder to a solution, it becomes available from the EnvDTE80.Solution2's Projects property.
So using $solutionNode.Projects works. The only modification to the code you had is the use of $solutionNode.Projects instead of $solution.Projects: $solution is not defined anywhere, so the result was always $null.
$solutionItemsNode = $solutionNode.Projects | Where-Object { $_.ProjectName -eq "FolderName" } | Select-Object -First 1
if (!$solutionItemsNode) {
    $solutionItemsNode = $solutionNode.AddSolutionFolder("FolderName")
}
The code above will not add a duplicate solution folder if one already exists.
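Put together, a minimal get-or-create sketch for the top of the Init.ps1 (same folder and variable names as in the question; $dte is available because NuGet runs Init.ps1 in the Package Manager Console):
param($installPath, $toolsPath, $package)

$solutionNode = Get-Interface $dte.Solution ([EnvDTE80.Solution2])

# solution folders show up as projects, so search the Projects collection
$solutionItemsNode = $solutionNode.Projects |
    Where-Object { $_.ProjectName -eq "FolderName" } |
    Select-Object -First 1

# only create the folder when the lookup came back empty
if (!$solutionItemsNode) {
    $solutionItemsNode = $solutionNode.AddSolutionFolder("FolderName")
}

$solutionItemsProjectItems = Get-Interface $solutionItemsNode.ProjectItems ([EnvDTE.ProjectItems])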
I want to move all images in a directory, including subdirectories, to a new location while maintaining the existing folder structure.
Following the example here, I put the objects into a variable, like so:
$picMetadata = Get-FileMetaData -folder (Get-childitem K:\myImages -Recurse -Directory).FullName
The move must be based on the results of a logical expression, such as the following:
foreach ($test01 in $picMetadata) {
    if ($test01.Height -match "^[0-9]?[0-9] ") {
        Write-Host "Test01.Height:" $test01.Height
    }
}
I'm still at an early testing phase, and so far I'm having no success even testing for the desired files. In the example above, I thought this simple regex test might match anything from "1 pixels" to "99 pixels", which would at least slim down my picture collection (an expression without the caret, like "[0-9][0-9] ", would also match "NNN pixels", "NNNNNN pixels", and so on).
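If the Height property really is a string such as "82 pixels" (an assumption based on the examples above), capturing the leading digits and comparing them numerically may be more robust than a regex on the raw text:
# hypothetical: capture the leading digits, then compare as a number
foreach ($test01 in $picMetadata) {
    if ($test01.Height -match '^(\d+)' -and [int]$Matches[1] -le 99) {
        Write-Host "Test01.Height:" $test01.Height
    }
}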
Once I figure out how to find the desired images based on a logical test of the image dimensions, I will then need a script to move the files. Robocopy /MOV would be nice, but I'm probably in over my head already.
I was going to base it on the example below (which was provided to a user attempting to COPY, not move, *.extension files). Unfortunately, such a simple operation will not help me, as I wish to move .jpg, .png, .gif, etc. based on dimensions, not file extension:
$sourceDir = 'K:\myImages\'
$targetDir = 'K:\myImages_psMoveTest\'
Get-ChildItem $sourceDir -Filter "*" -Recurse | ForEach-Object {
    $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
    New-Item -ItemType File -Path $targetFile -Force
    Copy-Item $_.FullName -Destination $targetFile
}
Perhaps you have a PowerShell script that could be used for my intended purpose? I'm just trying to move the smaller images out of my collection without overwriting same-name images, losing the folder structure, etc.
Thank you very much for reading, and for any advice!
(Edit: I'm never opposed to improving my PowerShell skills, but if you are aware of a freeware tool that would perform this operation, please advise.)
If I understand your question correctly, you want to move image files with a pixel height of 1 up to 99 pixels to a new destination folder, while leaving the subfolder structure intact.
If that is true, you can do:
# needed to use System.Drawing.Image
Add-Type -AssemblyName System.Drawing

$sourceDir = 'K:\myImages'
$targetDir = 'K:\myImages_psMoveTest'

Get-ChildItem $sourceDir -File -Recurse | ForEach-Object {
    $file = $_.FullName  # need this for when we hit the catch block
    try {
        # open the image file to determine the pixel height
        $img = [System.Drawing.Image]::FromFile($_.FullName)
        $height = $img.Height
        # dispose of the image to remove the reference to the file
        $img.Dispose()
        $img = $null
        if ($height -ge 1 -and $height -le 99) {
            $targetFolder = Join-Path -Path $targetDir -ChildPath $_.DirectoryName.Substring($sourceDir.Length)
            # create the target (sub)folder if it does not already exist
            $null = New-Item -Path $targetFolder -ItemType Directory -Force
            # next, move the file
            $_ | Move-Item -Destination $targetFolder -ErrorAction Stop
        }
    }
    catch {
        Write-Warning "Error moving file '$file': $($_.Exception.Message)"
    }
}
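One design note: the Dispose() call is not optional. System.Drawing.Image.FromFile keeps the file handle open until the image object is disposed, so skipping it would leave the file locked and make the subsequent Move-Item fail. Using -ErrorAction Stop on Move-Item turns any move failure into a terminating error, so it lands in the catch block and produces a warning instead of being silently ignored.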
We're migrating our FTP, and I would like to migrate only the folders that have had files used or written in the last 6 months. I would have thought this would be something I'd find all over the place with Google, but all the scripts I've found have the same fatal flaw.
Everything I find relies on the folder's "Date modified". The problem with that is, I have PLENTY of folders that show a "Date modified" of years ago, yet when you dig into them, there are files being created and written as recently as today.
Example:
D:\Weblogs may show a date modified of 01/01/2018; however, when you dig into it, there is some folder, say "Log6", and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like: get all folders at some top level, then foreach through the CONTENTS of those folders looking for files matching a LastWriteTime -lt (Get-Date).AddDays(-180) filter. If anything "new" is found, don't add the overarching directory to the array; if not, list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -Path $path -Directory | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-6) }
and
$filter = { $_.LastWriteTime -lt (Get-Date).AddDays(-180) }
#$OldStuff = gci "D:\FTP\BELAMIINC" -File | Where-Object $filter
$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
    gci "D:\FTP\BELAMIINC" -File | Where-Object $filter
}
Write-Host $OldFolders
Give this a try; I added comments for you to follow along with the thought process.
The use of -Force is mainly to find hidden files and folders.
$path = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)

# Get all the child folders recursively
$folders = Get-ChildItem $path -Recurse -Directory -Force

# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
    :inner foreach($file in Get-ChildItem $folder.FullName -File -Force)
    {
        if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
        {
            # If this condition is true at least once, break this loop and return
            # the folder's FullName, plus the file and its dates for reference
            [pscustomobject]@{
                FullName       = $folder.FullName
                File           = $file.Name
                LastAccessTime = $file.LastAccessTime
                LastWriteTime  = $file.LastWriteTime
            }
            break inner
        }
    }
}

$result | Out-GridView
If you need to find the folders that don't have recently modified files, you can use the $folders array and compare it against $result:
$folders.where({$_.FullName -notin $result.FullName}).FullName
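One caveat: the comparison above is per-folder, not per-subtree, so a parent directory whose only recent files sit in a deeper subfolder would still be listed as stale even though that subfolder is not. A sketch of a prefix test that treats a folder as stale only if nothing recent exists anywhere underneath it, with the result exported for the migration (the CSV path is just an example):
$stale = foreach($folder in $folders)
{
    $prefix = $folder.FullName
    # any entry in $result equal to, or nested below, this folder means it is still active
    $hasRecent = $result.FullName.Where({ $_ -eq $prefix -or $_ -like "$prefix\*" }, 'First')
    if(-not $hasRecent) { $folder }
}
$stale | Select-Object FullName, LastWriteTime | Export-Csv 'C:\temp\staleFolders.csv' -NoTypeInformation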
I have the following problem, and I would really appreciate some help on this front. I am getting a constant flow of XML files into a folder. A file name can look like this (the numeric prefix only goes up to 1005):
1001.order-asdf1234.xml
1002.order-asdf4321.xml
I want to sort the files into uniquely named folders that are not based on the file names. An example would be:
C:\Directory Path...\Peter (All files starting with 1001 go in there)
C:\Directory Path...\John (All files starting with 1002 go there)
How can I create a batch or PowerShell script that continuously sorts files into the specified folders? Since I only have 5 folders, I would like to simply specify the target folder for each prefix rather than build elaborate loops, but I don't know how to do that.
The easiest way is to create a lookup Hashtable where you define which prefix ('1001' .. '1005') maps to which destination folder:
# create a Hashtable to map the digits to a folder name
$folderMap = @{
    '1001' = 'Peter'
    '1002' = 'John'
    '1003' = 'Lucretia'
    '1004' = 'Matilda'
    '1005' = 'Henry'
}

# set source and destination paths
$rootFolder  = 'X:\Where\the\files\are'
$destination = 'Y:\Where\the\files\should\go'

# loop over the files in the root path
Get-ChildItem -Path $rootFolder -Filter '*.xml' -File |
    Where-Object { $_.BaseName -match '^\d{4}\.' } |
    ForEach-Object {
        $prefix = ($_.Name -split '\.')[0]
        $targetPath = Join-Path -Path $destination -ChildPath $folderMap[$prefix]
        $_ | Move-Item -Destination $targetPath -WhatIf
    }
Remove the -WhatIf safety switch once you are satisfied with the results shown on screen.
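Since the files arrive in a constant flow, the simplest way to keep sorting them is to re-run the snippet on an interval. A minimal polling sketch, assuming the code above is saved as C:\scripts\Sort-Xml.ps1 (a hypothetical path; a Windows Task Scheduler job that runs the script every few minutes works just as well):
# hypothetical wrapper: re-run the sorting script once a minute
while ($true) {
    & 'C:\scripts\Sort-Xml.ps1'
    Start-Sleep -Seconds 60
}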
You could use a switch statement to decide on the target folder based on the first part of the file name:
$files = Get-ChildItem path\to\folder\with\xml\files -Filter *.xml

switch($files)
{
    {$_.Name -like '1001*'} {
        $_ | Move-Item -Destination 'C:\path\to\Peter'
    }
    {$_.Name -like '1002*'} {
        $_ | Move-Item -Destination 'C:\path\to\John'
    }
    {$_.Name -like '1003*'} {
        # etc...
    }
    default {
        Write-Warning "No matching destination folder for file '$($_.Name)'"
    }
}
If you change your mind about loops, my preference would be to store the mapping in a hashtable and loop over the entries for each file:
$files = Get-ChildItem path\to\folder\with\xml\files -Filter *.xml

$targetFolders = @{
    '1001' = 'C:\path\to\Peter'
    '1002' = 'C:\path\to\John'
    '1003' = 'C:\path\to\Paul'
    '1004' = 'C:\path\to\George'
    '1005' = 'C:\path\to\Ringo'
}

foreach($file in $files){
    # find the first key whose prefix matches, then look up the mapped folder
    $prefix = $targetFolders.Keys.Where({$file.Name -like "${_}*"}, 'First')
    if($prefix){
        $file | Move-Item -Destination $targetFolders["$prefix"]
    }
}
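The 'First' argument to the .Where() method stops the search at the first matching key, and the if guard skips files whose prefix has no mapping rather than handing Move-Item an empty destination.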
I was downloading a huge torrent (1.2 TB with over 6000 folders), divided into 2 parts. I placed the 2nd part in the designated place, and that was not a problem since its master folder was named exactly what was needed. The 1st part's master folder, however, carried some generic torrent name instead of the name I needed. Instead of renaming the torrent to "source" (which I think would have worked and renamed the generic folder to "source"), I selected all the files in the Files tab, chose right-click > Relocate, and BitTorrent simply moved all of the files into the same directory, without any subfolders, and created a mess.
So I have an unfinished backup of this torrent whose files are in the right places. My idea was to take each file name from the unfinished copy, match it against the finished files, and put each finished file into the folder where the matching unfinished file lives. I hope that was clear.
I tried to resolve this using PowerShell, but I don't know much, so I came up with the following, and nothing happens; something is wrong. Does anyone know a solution?
$itemlistA = Get-ChildItem -Path "D:\BitTorrent\" |
    ForEach-Object {
        $objnameA = $_.Name
        $objPathA = $_.FullName
    }
$itemlistB = Get-ChildItem -Path "E:\DesiredPath\" -Recurse |
    ForEach-Object {
        $objnameB = $_.Name
        $objPathB = $_.FullName
    }
ForEach-Object {
    if($objnameA -eq $objnameB){
        Copy-Item -Path $objPathA -Destination $objPathB
        Write-Host "ffff Object ($objnameA) new Path ($objPathB) ffff"
    }
}
If I'm understanding your intent correctly, the script below will accomplish your goal, assuming your goal is to copy files from a flattened directory into some (potentially) nested directories so that the incoming files overwrite files with matching names.
The O(n^2) performance of the nested loops could be improved with a sort and a more efficient search; see the hashtable sketch after the script.
You'd need to edit the script's params to reflect your own environment.
param(
    $pathToFiles = "$PSScriptRoot\BitTorrent\",
    $desiredPath = "$PSScriptRoot\DesiredPath\"
)

$itemlistA = Get-ChildItem -Path $pathToFiles | Select-Object -Property Name, FullName
$itemlistB = Get-ChildItem -Path $desiredPath -Recurse | Select-Object -Property Name, FullName

foreach ($fileA in $itemlistA) {
    foreach ($fileB in $itemListB) {
        if ($fileB.Name -eq $fileA.Name) {
            Copy-Item -Path $fileA.FullName -Destination $fileB.FullName -Verbose
            break
        }
    }
}
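A sketch of the more efficient search mentioned above: build a name-to-path index of the destination tree once, so each incoming file becomes a single hashtable lookup instead of a scan of the whole list (same params as the script; if duplicate names exist in the destination, the last one found wins):
# index the destination tree by file name (one pass)
$indexB = @{}
Get-ChildItem -Path $desiredPath -Recurse -File | ForEach-Object {
    $indexB[$_.Name] = $_.FullName
}

# each source file is now a single lookup
Get-ChildItem -Path $pathToFiles -File | ForEach-Object {
    if ($indexB.ContainsKey($_.Name)) {
        Copy-Item -Path $_.FullName -Destination $indexB[$_.Name] -Verbose
    }
}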
I'm having a little trouble finishing a PowerShell script. I have a program that downloads images into a temporary folder and then deletes the image files within an hour. Occasionally I would like to keep some of the files, so I'm creating a PowerShell script to copy the image files out of the temporary folder into a new folder, where I can sort through them and keep what I want.
This is a one way copy from download folder to a destination folder.
Since the script will be run every 20 minutes by the Windows Task Scheduler, I want it to compare the files it finds in the download folder with the files in the destination folder to see whether each one is already there.
If the file is new, copy it; if it's already in the destination folder, ignore it.
This is purely matching on filename, not doing a binary compare.
Searching the web, I found a script on TomsITPro, "How To Sync Folders With PowerShell". That script copies files both ways between two folders. As I try to modify it to copy only one way, I'm getting PowerShell errors:
$DownloadFolder = 'D:\TempImages'
$KeepFolder = 'D:\KeepImages'

$DownloadFiles = Get-ChildItem -Path $DownloadFolder
$KeepFiles = Get-ChildItem -Path $KeepFolder

$FileDiffs = Compare-Object -ReferenceObject $DownloadFiles -DifferenceObject $KeepFiles

$FileDiffs | ForEach-Object {
    $copyParams = @{
        'Path' = $_.InputObject.FullName
    }
    if ($_.SideIndicator -eq '=>')
    {
        $copyParams.Destination = $KeepFolder
    }
    Copy-Item @copyParams
}
This script sort of works: it does the compare very well and only copies files that are not already in the destination. The problem is that when I run it from the d:\script folder, it copies all the files to the d:\script folder instead of the destination folder, D:\KeepImages. What am I doing wrong here?
Any help with this PowerShell script would be appreciated.
$DownloadFolder = 'D:\TempImages'
$KeepFolder = 'D:\KeepImages'

$DownloadFiles = Get-ChildItem -Path $DownloadFolder
$KeepFiles = Get-ChildItem -Path $KeepFolder

$FileDiffs = Compare-Object -ReferenceObject $DownloadFiles -DifferenceObject $KeepFiles

$FileDiffs | ForEach-Object {
    $downloadFile = $_.InputObject.FullName
    # '<=' marks files that exist only in the reference set (the download folder),
    # i.e. the new files to copy; files only in the keep folder would show '=>'
    if ($_.SideIndicator -eq '<=')
    {
        Copy-Item $downloadFile -Destination $KeepFolder
    }
}
Try this. Note the side indicator: with the download folder as the -ReferenceObject, new downloads show up as '<=', not '=>'. The root cause of your original problem is that whenever the side-indicator test did not match, $copyParams had no Destination, and Copy-Item without a destination copies to the current directory, which is why everything landed in d:\script. Alternatively, keep your original structure and use Copy-Item $_.InputObject.FullName -Force -Destination $KeepFolder instead of Copy-Item @copyParams, so a destination is always supplied.
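For what it's worth, a minimal sketch of the one-way, filename-only copy that avoids Compare-Object entirely (Compare-Object also throws if either folder happens to be empty); same folders as above:
$DownloadFolder = 'D:\TempImages'
$KeepFolder = 'D:\KeepImages'

# copy only the files whose names are not yet present in the keep folder
Get-ChildItem -Path $DownloadFolder -File |
    Where-Object { -not (Test-Path -Path (Join-Path $KeepFolder $_.Name)) } |
    Copy-Item -Destination $KeepFolder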