PowerShell: copy only newly added folders and email results - Windows

Is it possible to copy only new folders that have been added to the source location?
I have a source location that is updated with folders every 5 minutes. The PS1 script will run every 5 minutes and copy the folders to the destination location.
The issue I'm having is that it copies everything. I only want it to compare against what has already been copied and copy over only the newly added folders, instead of copying everything that is already there again. Is this possible?
Also, if possible, once only the recently added folders have been copied, can the script then email a completion summary of what it has done?
So far I have the following:
Copy-Item -Recurse \\192.168.1.37\d$\Transactions\* -Destination D:\UK_Copy\Transactions_Bk -Force -Verbose

You could create a script that just loops. Something like:
$Source = "\\192.168.1.37\d$\Transactions"
$Destination = "D:\UK_Copy\Transactions_Bk"
$LastScan = Get-ChildItem $Destination -Recurse -File
while ($true) {
    $NewScan = Get-ChildItem $Source -Recurse -File
    # Anything in the new scan that is new or changed compared to the last scan gets copied over
    Compare-Object -ReferenceObject $NewScan -DifferenceObject $LastScan -Property Name, LastWriteTime -PassThru |
        Where-Object SideIndicator -eq '<=' |
        ForEach-Object {
            $Target = $_.DirectoryName -replace [regex]::Escape($Source), $Destination
            if (-not (Test-Path $Target)) { New-Item $Target -ItemType Directory | Out-Null }
            Copy-Item $_.FullName -Destination $Target -Force
        }
    $LastScan = $NewScan
    Start-Sleep -Seconds 300
}
That will get a listing of files in the destination, then look for files in the source that are new, or have changed compared to the destination and copy those over. Then it sets that scan as the last known listing to compare to, sleeps for 5 minutes, and loops again looking for files that are new or updated since the last scan.
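For the email part of the question: if you capture the Compare-Object output in a variable (say $Copied) before piping it on to the copy step, you can mail a summary at the end of each pass with Send-MailMessage. A minimal sketch; the SMTP server and addresses below are placeholders you would replace with your own:
# $Copied is assumed to hold the files that were just copied in this pass
if ($Copied) {
    $Body = "Copied $($Copied.Count) item(s):`r`n" + (($Copied | ForEach-Object FullName) -join "`r`n")
    Send-MailMessage -SmtpServer 'smtp.example.com' `
                     -From 'backup@example.com' `
                     -To 'team@example.com' `
                     -Subject "Transactions copy completed $(Get-Date -Format 'yyyy-MM-dd HH:mm')" `
                     -Body $Body
}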

Related

PowerShell script to move files from one location to another, but only files that are more than 5 days old

How do I create a PowerShell script that moves files from one location to another, specifying that any file newer than 5 days does not move?
I used Move-Item to move files from one location to another, but I need to move only files which are more than 5 days old.
I think this pipeline should work for you:
Get-ChildItem -Path C:\Downloads -File | Where-Object { $_.CreationTime -lt (Get-Date).AddDays(-5) } | Move-Item -Destination C:\button
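One thing to double-check (an assumption about the intent): CreationTime is when the file was created on that volume. If "more than 5 days old" should be based on when the file was last modified, the same pipeline works with LastWriteTime instead:
Get-ChildItem -Path C:\Downloads -File | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-5) } | Move-Item -Destination C:\button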

Get folder permissions with only 3 levels of subfolders

First of all: sorry for my bad English.
So, I need to create various reports with all the permissions of specified folders.
After some searching I found 2 ways.
One is using AccessEnum, which is almost perfect, but it doesn't export all permissions, only the folders whose permissions differ from the root folder. And I need all of them, even if they are the same as the root folder's.
The second one is better, a PowerShell script, but it has one weakness: it is far too recursive, and one of the folders produced an output report of 7 GB. Holy shirt.
What I need: to modify the script so it only goes 3 levels of subfolders deep, for example:
"C:\Folder1" contains various subfolders but I want the script to go deep only to "C:\Folder1\Folder2\Folder3\"
How can I do it?
This is the script:
dir -Recurse "C:\FOLDER" | where { $_.PsIsContainer } | % {
    $path1 = $_.FullName
    Get-Acl $_.FullName | % {
        $_.Access | Add-Member -MemberType NoteProperty '.\Application Data' -Value $path1 -PassThru
    }
} | Export-Csv "C:\REPORT.csv"
Use Get-ChildItem instead. It has a -Depth parameter, and you can limit the output to folders only with -Directory.
Reference:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/get-childitem?view=powershell-7
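As a sketch of how the report script could look with those parameters (the column added to each access rule is renamed to Path here, and the depth value may need adjusting depending on how you count the levels):
# Folders only, down to the -Depth limit below the starting folder; one CSV row per access rule
Get-ChildItem -Path "C:\FOLDER" -Directory -Recurse -Depth 3 | ForEach-Object {
    $folder = $_.FullName
    (Get-Acl -Path $folder).Access |
        Add-Member -MemberType NoteProperty -Name 'Path' -Value $folder -PassThru
} | Export-Csv "C:\REPORT.csv" -NoTypeInformation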

PowerShell script to Visual Studio application

Firstly, I do apologise if this isn't the correct forum to be posting this question; if this isn't the place to be asking, can anyone direct me to a beginners' forum?
Secondly, you'll soon discover that I really don't know much about PowerShell or Visual Studio, but I'm learning... I'm sure the PowerShell script you're going to see could be better, but it works.
So my issue was with a system that outputs .txt files onto 2 PCs (LH & RH). These .txt files were output with different names, as it was using 3 different products.
We then needed these files to be filtered by product and copied to network drives, while also being archived and deleted from the original folder. Oh, and this needed to be done in real time.
So the PowerShell script I've been using is the following:
$folder = 'Target Folder'
$timeout = 1000
$filesystemwatcher = new-object system.IO.filesystemwatcher $folder
write-host "Monitoring... $folder
Transfering..."
while ($true) {
$result = $FileSystemWatcher.WaitForChanged('all', $timeout)
if ($result.TimedOut -eq $false)
{
write-warning ('file {0} : {1}' -f $result.changetype, $result.name)
}
$targetdirectory = "Target folder"
$sourcedirectory = "export folder"
if (-not(Test-Path -path $targetdirectory)) {
New-Item $targetdirectory -Type Directory}
Copy-Item -Path $sourcedirectory\"*.txt" -Destination $targetdirectory
$Files = Get-ChildItem -Path $sourcedirectory -Filter "*.txt" -Recurse
foreach($File in $Files)
{
if ($File.name -like "1*.txt")
{
Move-Item -Path $File.FullName "1 folder"
}
elseif ($File.name -like "2*.txt")
{
Move-Item -Path $File.FullName "2 folder"
}
elseif ($File.name -like "3*.txt")
{
Move-Item -Path $File.FullName "3 folder"
}
}
}
Now, this script gets the files moved and works, but it's a PowerShell script running 24/7, 365 days a year. Sometimes the script has stopped, sometimes the script has been messed with; it's just not reliable enough.
So I want to turn it into an application via Visual Studio.
Is it possible? Remember, I have never used Visual Studio before (I've been trying it for a few hours and learnt some basics).
Has anyone done anything similar to this before, or is there any guide anyone can think of that would suit my needs?
I'm looking for the application to have the following:
A status screen, i.e. what files it has found and where it has moved them to
A settings option to be able to set the number of filter strings/save paths, change the source & export/archive paths, etc.
Can anyone point me in the right direction? Being new, I don't know what to search for to find guides for my needs.
Cheers

Copy-Item : The given path's format is not supported

I am trying to copy configuration files for an ActiveX control to all user profiles on remote computers, and I am running into problems. I have tried several variations of the code to no avail; my most recent, simplified code is shown below, and it generates a "path format not supported" error:
$From = "C:\Interactive" $To = "C:\Users\$user\appdata\Microsoft\Internet Explorer\Downloaded Program Files" ForEach ($user in (Get-ChildItem C:\Users -Exclude Public)){Copy-Item -Path $From -Destination $To}
I assume there is an argument I am missing or some piece of syntax, but I cannot find it. I plan on deploying this script using the PS App Deploy Toolkit through SCCM once it is working (Group Policy is not currently a viable solution for me at this time).
I have spent my day trying to find a working script and have come up empty. I used to use Set-ActiveSetup -StubExePath, but that seems to not be working any longer.
Well, I found a way that works for me. I am including how I went about it; ignore the part with the DLL registration:
$Source = "C:\Temp\Downloaded Program Files"
$Destination = "C:\users\*"
$Items = Get-ChildItem -Path $Destination
foreach ($Item in $Items)
{
Write-Verbose "List of folders: $item" -Verbose ##added for visibility when I was testing
Copy-Item $Source -Destination "$item\AppData\Local\Microsoft\Internet Explorer" -Force -Recurse
$CKIDLL = "`"$item\Appdata\Local\Microsoft\Internet Explorer\Downloaded Program Files\CKInteractiveDriver.dll`""
Start-Process -Filepath 'regsvr32.exe' -Args "/s $CKIDLL"
}

How to get the Dropbox folder in Powershell in Windows

The same question exists for Python here: How can I get the Dropbox folder location programmatically in Python?, or here for OS X: How to get the location of the currently logged-in Dropbox folder.
Same thing in PowerShell: I need the path of Dropbox so I can copy files to it (building software and then copying it to Dropbox to share with the team).
This Dropbox help page tells us where this info is stored, i.e. in a JSON file in the user's AppData: https://www.dropbox.com/help/4584
function GetDropBoxPathFromInfoJson
{
$DropboxPath = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -ErrorAction Stop | ConvertFrom-Json | % 'personal' | % 'path'
return $DropboxPath
}
The line above is taken from: https://www.powershellgallery.com/packages/Spizzi.Profile/1.0.0/Content/Functions%5CProfile%5CInstall-ProfileEnvironment.ps1
Note that it doesn't check if you've got a Dropbox business account, or if you have both. It just uses the personal one.
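If you do need to handle both account types, the same info.json can be read once and the account picked explicitly. A small sketch, assuming info.json also exposes a 'business' entry with its own 'path' key for business accounts (an assumption about the file layout, not something verified here):
function GetDropBoxPath([ValidateSet('personal','business')]$AccountType = 'personal')
{
    # Read the whole info.json and pick the requested account's sync folder
    $info = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -ErrorAction Stop | ConvertFrom-Json
    if (-not $info.$AccountType) { throw "No '$AccountType' Dropbox account found in info.json" }
    return $info.$AccountType.path
}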
You can then use this base Dropbox folder to build your final path, for example:
$targetPath = Join-Path -Path (GetDropBoxPathFromInfoJson) -ChildPath 'RootDropboxFolder\Subfolder1\Subfolder2'
if (-not (Test-Path -Path $targetPath)) { throw "Path '$targetPath' not found!" }
--
An alternative way is to use the host.db file, as shown on this page:
http://bradinscoe.tumblr.com/post/75819881755/get-dropbox-path-in-powershell
$base64path = gc $env:appdata\Dropbox\host.db | select -index 1 # -index 1 is the 2nd line in the file
$dropboxPath = [System.Text.Encoding]::ASCII.GetString([System.Convert]::FromBase64String($base64path)) # convert from base64 to ascii
