Here is a simple script:
$srcpth = "C:\Users\Mark\Desktop\dummy\"
$files = Get-ChildItem -Path $srcpth -File -Recurse
foreach ($f in $files) {
    $filen = $f.Name
    $filesize = $f.Length
    Write-Output "$filen $filesize"
}
This correctly loops through all subfolders in C:\Users\Mark\Desktop\dummy and outputs each file name with its file size, but it does not show the relative path. How do I get the relative path? Thanks.
EDIT: Added the following to clarify the desired output:
For example, under C:\Users\Mark\Desktop\dummy are subfolders with files
C:\Users\Mark\Desktop\dummy\file00.txt
C:\Users\Mark\Desktop\dummy\folder01\file01_01.txt
C:\Users\Mark\Desktop\dummy\folder01\file01_02.txt
C:\Users\Mark\Desktop\dummy\folder01\file01_03.txt
C:\Users\Mark\Desktop\dummy\folder02\file02_01.txt
C:\Users\Mark\Desktop\dummy\folder02\file02_01.txt
C:\Users\Mark\Desktop\dummy\folder03\file03_01.txt
C:\Users\Mark\Desktop\dummy\folder03\file03_02.txt
C:\Users\Mark\Desktop\dummy\folder03\file03_03.txt
C:\Users\Mark\Desktop\dummy\folder03\file03_04.txt
The above code produces this output:
file00.txt 9
file01_01.txt 10
file01_02.txt 12
file01_03.txt 12
file02_01.txt 15
file02_01.txt 14
file03_01.txt 11
file03_02.txt 15
file03_03.txt 13
file03_04.txt 12
But what I want is:
file00.txt 9
\folder01\file01_01.txt 10
\folder01\file01_02.txt 12
\folder01\file01_03.txt 12
\folder02\file02_01.txt 15
\folder02\file02_01.txt 14
\folder03\file03_01.txt 11
\folder03\file03_02.txt 15
\folder03\file03_03.txt 13
\folder03\file03_04.txt 12
A preceding \, no slash, or .\ are all fine.
Here you go:
$srcpth = "C:\Users\Mark\Desktop\dummy\"
$files = Get-ChildItem -Path $srcpth -File -Recurse
foreach ($f in $files) {
    $filen = $f.Name
    $filesize = $f.Length
    $relativePath = $f.FullName.Remove(0, $srcpth.Length)
    Write-Output "$filen $filesize $relativePath"
}
There isn't an object property with the value you're looking for, but you can calculate it as above. It's always useful to look at the members of an object when you're trying to figure something like this out:
$files[0] | get-member
This will give you a better idea of what you can work with, what properties you can use, and what methods are available.
I would recommend outputting objects instead of strings as you're doing right now. In any case, you can get the relative paths either by using .Substring(..):
foreach ($f in Get-ChildItem -Path $srcpth -File -Recurse) {
    [pscustomobject]@{
        FileName     = $f.Name
        FileSize     = $f.Length
        RelativePath = $f.FullName.Substring($srcpth.Length + 1)
    }
}
Or if you're using PowerShell Core, you can access the .NET API Path.GetRelativePath(String, String):
foreach ($f in Get-ChildItem -Path $srcpth -File -Recurse) {
    [pscustomobject]@{
        FileName     = $f.Name
        FileSize     = $f.Length
        RelativePath = [IO.Path]::GetRelativePath($srcpth, $f.FullName)
    }
}
There is also the PathIntrinsics.NormalizeRelativePath(String, String) method, available in both Windows PowerShell and PowerShell Core, though it seems like overkill here:
$ExecutionContext.SessionState.Path.NormalizeRelativePath($f.FullName, $srcpth)
While the String.Substring() / .Remove() and [IO.Path]::GetRelativePath() solutions are sufficient when working only with absolute native paths, they fail when the -Path argument for Get-ChildItem is a relative path or a PowerShell-only path (see the examples at the end of this answer for how they can fail).
For a solution that additionally supports PowerShell paths and relative paths, I recommend using Resolve-Path -Relative:
# For this demo, create a PowerShell-only path.
$null = New-PSDrive -Name TempDrv -Root ([IO.Path]::GetTempPath()) -PSProvider FileSystem
$srcpth = 'TempDrv:\RelativePathTest'
$null = New-Item "$srcpth\subdir\test.txt" -Force
# Set base path for Get-ChildItem and Resolve-Path. This is necessary because
# Resolve-Path -Relative resolves paths relative to the current directory.
Push-Location $srcpth
try {
    foreach ($f in Get-ChildItem -File -Recurse) {
        [pscustomobject]@{
            FileName     = $f.Name
            FileSize     = $f.Length
            RelativePath = Resolve-Path $f.FullName -Relative
            # Alternative to remove ".\" or "./" prefix from the path:
            # RelativePath = (Resolve-Path $f.FullName -Relative) -replace '^\.[\\/]'
        }
    }
}
finally {
    # Restore the current directory, even in case of a script-terminating error
    Pop-Location
}
Output:
FileName FileSize RelativePath
-------- -------- ------------
test.txt 0 .\subdir\test.txt
Modes of failure:
This is how the String.Substring() method fails for the PowerShell path of the sample above, on my system (you may see a different outcome depending on the location of your temp directory):
FileName FileSize RelativePath
-------- -------- ------------
test.txt 0 ubdir\test.txt
And this is how [IO.Path]::GetRelativePath() fails:
FileName FileSize RelativePath
-------- -------- ------------
test.txt 0 ..\..\..\..\temp\RelativePathTest\subdir\test.txt
Related
I am creating a script that splits a target folder's files into subfolders of n files each, where n is specified dynamically.
So basically, if Folder A has 9000 files and I limit the number of files to 1000 per folder, the script would create nine sub-directories inside Folder A with 1000 files each.
Here is working code:
param (
    [Parameter(Mandatory, Position = 0)]
    [String]
    $FileList,

    [Parameter(Mandatory = $false, ValueFromPipelineByPropertyName)]
    [Int32]
    $NumFilesPerFolder = 1000,

    [Parameter(Mandatory = $false, ValueFromPipelineByPropertyName)]
    [Int32]
    $FolderNumberPadding = 2
)
$Folders = Get-Content $FileList
Set-Location -LiteralPath ([IO.Path]::GetTempPath())
function Move-Files {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0)]
        [System.Collections.ArrayList]
        $List,

        [Parameter(Mandatory)]
        [Int32]
        $Index
    )
    $BaseFolder = [System.IO.Path]::GetDirectoryName($List[0])
    $DestFolderName = $Index.ToString().PadLeft($FolderNumberPadding, '0')
    $DestFolder = New-Item -Path (Join-Path $BaseFolder $DestFolderName) -Type Directory -Force
    Move-Item $List -Destination $DestFolder -Force
}
foreach ($Folder in $Folders) {
    $Files = Get-ChildItem -LiteralPath $Folder -File -Force
    $filesidx = 1
    $totalidx = $null
    $groupidx = 0
    $FilesToMove = [System.Collections.ArrayList]@()
    foreach ($File in $Files) {
        if ($null -eq $totalidx) {
            $totalidx = $Files.Length
        }
        if ($filesidx -eq 1) {
            $groupidx++
        }
        # Suppress the index that ArrayList.Add() returns so it doesn't pollute the output.
        $null = $FilesToMove.Add($File)
        if ($filesidx -eq $NumFilesPerFolder) {
            Move-Files -List $FilesToMove -Index $groupidx
            $FilesToMove.Clear()
            $filesidx = 1
        } elseif ($totalidx -eq 1) {
            Move-Files -List $FilesToMove -Index $groupidx
            $FilesToMove.Clear()
            break
        } else {
            $filesidx++
        }
        $totalidx--
    }
}
Remove-Item $FileList -Force
$app = New-Object -ComObject Shell.Application
$appwin = $app.Windows()
foreach ($window in $appwin) {
    if ($window.Name -eq "File Explorer") {
        $window.Refresh()
    }
}
Invoke-VBMessageBox "Operation Complete" -Title "Operation Complete" -Icon Information -BoxType OKOnly
This code runs reasonably well, but it heavily bottlenecks when actually moving the files with Move-Item. I'd like to try and use RoboCopy here, but I am perplexed as to how I can implement it.
What I'm having trouble with is that the items I need to move are stored in a list (see the Move-Files function), and all of the items that need to be moved are in the same sub-directory. So I can't just do RoboCopy.exe C:\Source C:\Destination /mov.
How can I integrate RoboCopy here to accomplish my goal? I really need multi-threaded performance as this function will be responsible for moving thousands of files around in production on a frequent basis.
Any help would be greatly appreciated - please let me know if I can provide more information to further clarify my objective.
Thanks for any help at all!
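For what it's worth, one direction to explore (a sketch only, not tested against the code above): robocopy accepts individual file names after the source and destination directories, so a batch of files that all share one parent directory can be moved in a single multithreaded call. Move-FilesRobocopy is a hypothetical helper name, not part of the original script.

```powershell
function Move-FilesRobocopy {
    param(
        [string[]] $Files,        # full paths; assumed to share one parent directory
        [string]   $Destination,
        [int]      $Threads = 16
    )
    # Derive the common source folder and the bare file names.
    $source = Split-Path -Parent $Files[0]
    $names  = foreach ($f in $Files) { Split-Path -Leaf $f }
    # /MOV deletes the source files after copying; /MT enables multithreading;
    # /NJH /NJS /NFL /NDL quiet robocopy's console output.
    robocopy $source $Destination $names /MOV /MT:$Threads /NJH /NJS /NFL /NDL
}
```

A PowerShell array passed to a native command expands into separate arguments, so each file name becomes its own robocopy argument.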
I am writing a function in PowerShell 7 that flattens a directory.
It's ideally supposed to:
Copy / Move everything to a temp directory (Depending on whether a destination was supplied)
Rename all files that have identical filenames with a _XX numerical suffix (Padding controlled by a parameter)
Move everything back to the root of the original directory, or the destination directory supplied.
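As a sketch of the suffix rule described above (assuming "_XX" means an underscore plus a zero-padded counter inserted before the extension; Get-SuffixedName is a hypothetical helper, not part of the gist):

```powershell
function Get-SuffixedName {
    param(
        [string]$Name,          # original file name, e.g. 'photo.jpg'
        [int]   $Index,         # duplicate counter
        [int]   $Padding = 2    # zero-padding width
    )
    $base = [System.IO.Path]::GetFileNameWithoutExtension($Name)
    $ext  = [System.IO.Path]::GetExtension($Name)
    # Insert the padded counter between base name and extension.
    '{0}_{1}{2}' -f $base, $Index.ToString().PadLeft($Padding, '0'), $ext
}
```

So `Get-SuffixedName 'photo.jpg' 3` yields `photo_03.jpg`.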
Here is the gist.
Here is the relevant code, without documentation to save space as it's a long one:
function Merge-FlattenDirectory {
    [CmdletBinding(SupportsShouldProcess)]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [ValidateScript({
            if (!(Test-Path -LiteralPath $_)) {
                throw [System.ArgumentException] "Path does not exist."
            }
            if ((Test-IsSensitiveWindowsPath -Path $_ -Strict).IsSensitive) {
                throw [System.ArgumentException] "Path supplied is a protected OS directory."
            }
            return $true
        })]
        [Alias("source", "input", "i")]
        [string]
        $SourcePath,

        [Parameter(Mandatory = $false, Position = 1, ValueFromPipelineByPropertyName)]
        [Alias("destination", "dest", "output", "o")]
        [string]
        $DestinationPath = $null,

        [Parameter(Mandatory = $false)]
        [Switch]
        $Force,

        [Parameter(Mandatory = $false, ValueFromPipelineByPropertyName)]
        [ValidateSet(1, 2, 3, 4, 5)]
        [int32]
        $DuplicatePadding = 2
    )

    begin {
        # Trim trailing backslashes and initialize a new temporary directory.
        $SourcePath = $SourcePath.TrimEnd('\')
        $DestinationPath = $DestinationPath.TrimEnd('\')
        $TempPath = (New-TempDirectory).FullName
        New-Item -ItemType Directory -Force -Path $TempPath

        # Escape $SourcePath so we can use wildcards.
        $Source = [WildcardPattern]::Escape($SourcePath)

        # If there is no $DestinationPath supplied, we flatten only the SourcePath.
        # Thus, set DestinationPath to be the same as the SourcePath.
        if (!$DestinationPath) {
            $DestinationPath = $SourcePath
            # Since there is no destination supplied, we move everything to a temporary
            # directory for further processing.
            Move-Item -Path $Source'\*' -Destination $TempPath -Force
        } else {
            # We need to perform some parameter validation on DestinationPath:
            # Make sure the passed Destination is not a file.
            if (Test-Path -LiteralPath $DestinationPath -PathType Leaf) {
                throw [System.IO.IOException] "Please provide a valid directory, not a file."
            }
            # Make sure the passed Destination is a validly formed Windows path.
            if (!(Confirm-ValidWindowsPath -Path $DestinationPath -Container)) {
                throw [System.IO.IOException] "Invalid Destination Path. Please provide a valid directory."
            }
            # Make sure the passed Destination is not in a protected or sensitive OS location.
            if ((Test-IsSensitiveWindowsPath -Path $DestinationPath -Strict).IsSensitive) {
                throw [System.IO.IOException] "The destination path is, or resides in, a protected operating system directory."
            }

            # Since a destination was supplied, we copy everything to a new temp directory
            # instead of moving everything. We want the source directory to remain untouched.
            # Robocopy seems to be the most performant here.
            # Robocopy on Large Dataset:  ~789ms - ~810ms
            # Copy-Item on Large Dataset: ~1203ms - ~1280ms
            #
            # Copy-Item -Path $Source'\*' -Destination $TempPath -Force -Recurse
            Robocopy $Source $TempPath /COPYALL /B /E /R:0 /W:0 /NFL /NDL /NC /NS /NP /MT:48

            # Create the destination directory now, ready for population in the process block.
            New-Item -ItemType Directory -Force -Path $DestinationPath
        }

        # Grab all files as an array of FileInfo objects.
        $AllFiles = [IO.DirectoryInfo]::new($TempPath).GetFiles('*', 'AllDirectories')
        # Initialize a hashtable to store duplicate files.
        $Duplicates = @{}
    }
    process {
        ##
        # $Stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
        #
        # Iterate over all files.
        foreach ($File in $AllFiles) {
            # If our $Duplicates hashtable already contains the current filename, we have a duplicate.
            if ($Duplicates.Contains($File.Name)) {
                # Rename the duplicate file by appending a numerical index to the end of the file.
                $PathTemp = Get-ItemProperty -LiteralPath $File
                $RenamedFile = Rename-Item -LiteralPath $PathTemp.PSPath -PassThru -NewName ('{0}_{1}{2}' -f @(
                    $File.BaseName
                    $Duplicates[$File.Name].ToString().PadLeft($DuplicatePadding, '0')
                    $File.Extension
                ))
                # Increment the duplicate counter and pass $File down to be moved.
                $Duplicates[$File.Name]++
                $File = $RenamedFile
            } else {
                # No duplicates were detected. Add a value of 1 to the duplicates
                # hashtable to represent the current file. Pass $File down to be moved.
                $PathTemp = Get-ItemProperty -LiteralPath $File
                $Duplicates[$File.Name] = 1
                $File = $PathTemp
            }
            # If Force is specified, we don't have to worry about duplicate files,
            # as the operation will overwrite every file with a duplicate filename.
            if ($Force) {
                # Move the file to its appropriate destination. (Force)
                Move-Item -LiteralPath $File -Destination $DestinationPath -Force
            } else {
                try {
                    # Move the file to its appropriate destination. (Non-Force)
                    Move-Item -LiteralPath $File -Destination $DestinationPath -ErrorAction Stop
                } catch {
                    # Warn the user that files were skipped because of duplicate filenames.
                    Write-Warning "File already exists in the destination folder. Skipping this file."
                }
            }
            # Return each file to the pipeline.
            # $File
        }
        # $Stopwatch.Stop()
        # Write-Host "`$Stopwatch.Elapsed: " $Stopwatch.Elapsed -ForegroundColor Green
        # Write-Host "`$Stopwatch.ElapsedMilliseconds:" $Stopwatch.ElapsedMilliseconds -ForegroundColor Green
        # Write-Host "`$Stopwatch.ElapsedTicks: " $Stopwatch.ElapsedTicks -ForegroundColor Green
    }

    end {
    }
}
# Merge-FlattenDirectory "C:\Users\username\Desktop\Testing\Test" "C:\Users\username\Desktop\Testing\TestFlat" -Force
The function works great for the most part, but there is a major problem I didn't anticipate: the code is vulnerable to naming collisions. Here's a problematic directory structure:
(Root directory to be flattened is C:\Users\username\Desktop\Testing\Test)
Directory: C:\Users\username\Desktop\Testing\Test
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 11/4/2021 10:03 PM 1552565 1088_p_01.jpg
-a--- 11/4/2021 10:03 PM 1552565 1088_p_02.jpg
-a--- 11/4/2021 10:03 PM 1552565 1088_p_03.jpg
Directory: C:\Users\username\Desktop\Testing\Test\Folder
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 11/4/2021 10:03 PM 1552565 1088_p_03.jpg
-a--- 11/4/2021 10:03 PM 1552565 1088_p.jpg
Directory: C:\Users\username\Desktop\Testing\Test\Testing
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 11/4/2021 10:03 PM 1552565 1088_p_01.jpg
-a--- 11/4/2021 10:03 PM 1552565 1088_p.jpg
If I run the function to flatten C:\Users\username\Desktop\Testing\Test I get only six files instead of seven in the destination folder. The folder is missing the second 1088_p.jpg. I can verify this by going to my temp directory and looking at what's left:
C:\Users\username\AppData\Local\Temp\DdtElMvSoXbJf\Testing\1088_p.jpg is still in temp.
Anyway, if you're still with me after all this, I thank you generously for reading.
I really need to refactor the function in a way that accounts for this edge case, and I can't figure out how to do it at all gracefully. I could desperately use some help or guidance from someone who can point me in the right direction. I've been working on this function for a while now and I'd really like to wrap it up.
Many, many thanks.
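One possible direction (a hedged sketch, not a drop-in fix for the function above): rather than tracking counters per original name, probe the destination until a free name is found, which sidesteps collisions with names that already carry a _XX suffix. Get-UniqueName is a hypothetical helper.

```powershell
function Get-UniqueName {
    param(
        [string]$Directory,     # destination folder to probe
        [string]$Name,          # desired file name
        [int]   $Padding = 2    # zero-padding width for the suffix
    )
    $base = [System.IO.Path]::GetFileNameWithoutExtension($Name)
    $ext  = [System.IO.Path]::GetExtension($Name)
    $candidate = $Name
    $i = 1
    # Keep incrementing the suffix until the candidate doesn't exist yet.
    while (Test-Path -LiteralPath (Join-Path $Directory $candidate)) {
        $candidate = '{0}_{1}{2}' -f $base, $i.ToString().PadLeft($Padding, '0'), $ext
        $i++
    }
    $candidate
}
```

Because the check is against the actual destination contents, it stays correct no matter what suffixes the source files already have.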
Edit:
I have a working solution now. I added an additional layer of duplication checks, and moved the actual renaming of the file further down.
Here's the revised code (Only relevant portion included):
# Iterate over all files.
foreach ($File in $AllFiles) {
    # If our $Duplicates hashtable already contains the current filename, we have a duplicate.
    if ($Duplicates.Contains($File.Name)) {
        # Create a new name for the file by appending a numerical index to the end of the filename.
        $PathTemp = Get-ItemProperty -LiteralPath $File
        $NewName = ('{0}_{1}{2}' -f @(
            $File.BaseName
            $Duplicates[$File.Name].ToString().PadLeft($DuplicatePadding, '0')
            $File.Extension
        ))
        # Check if our new name collides with any other filenames in $Duplicates. If so, create
        # another new name by appending an additional numeric index to the end of the filename.
        $DuplicateCount = 1
        while ($Duplicates[$NewName]) {
            $NewName = ('{0}_{1}{2}' -f @(
                [System.IO.Path]::GetFileNameWithoutExtension($NewName)
                $DuplicateCount.ToString().PadLeft($DuplicatePadding, '0')
                [System.IO.Path]::GetExtension($NewName)
            ))
            Write-Warning $DuplicateCount.ToString().PadLeft($DuplicatePadding, '0')
            $DuplicateCount++
            # If we're at a depth of 8, throw. Something is obviously wrong.
            if ($DuplicateCount -ge 8) {
                throw [System.Exception] "Duplicate count reached limit."
            }
        }
        # Finally, rename the file with our new name.
        $RenamedFile = Rename-Item -LiteralPath $PathTemp.PSPath -PassThru -NewName $NewName
        # Increment the duplicate counters and pass $File down to be moved.
        $Duplicates[$File.Name]++
        $Duplicates[$NewName]++
        $File = $RenamedFile
    } else {
        # No duplicates were detected. Add a value of 1 to the duplicates
        # hashtable to represent the current file. Pass $File down to be moved.
        $PathTemp = Get-ItemProperty -LiteralPath $File
        $Duplicates[$File.Name] = 1
        $File = $PathTemp
    }
    # If Force is specified, we don't have to worry about duplicate files,
    # as the operation will overwrite every file with a duplicate filename.
    if ($Force) {
        # Move the file to its appropriate destination. (Force)
        Move-Item -LiteralPath $File -Destination $DestinationPath -Force
    } else {
        try {
            # Move the file to its appropriate destination. (Non-Force)
            Move-Item -LiteralPath $File -Destination $DestinationPath -ErrorAction Stop
        } catch {
            # Warn the user that files were skipped because of duplicate filenames.
            Write-Warning "File already exists in the destination folder. Skipping this file."
        }
    }
    # Return each file to the pipeline.
    $File
}
I have the following problem and I would really appreciate some help on this front. I am getting a constant flow of XML files into a folder. An XML file name can look like this; the numeric prefix only goes up to 1005.
1001.order-asdf1234.xml
1002.order-asdf4321.xml
I want to sort the files into uniquely named folders that are not based on the file names. An example of that would be
C:\Directory Path...\Peter (All files starting with 1001 go in there)
C:\Directory Path...\John (All files starting with 1002 go there)
How can I create a batch or a PowerShell script that continuously sorts files into the specified folders? Since I only have 5 folders, I would like to simply specify the target folder for each prefix rather than write elaborate loops, but I don't know how to do that.
The easiest way is to create a lookup Hashtable where you define which prefix ('1001' .. '1005') maps to which destination folder:
# create a Hashtable to map the digits to a folder name
$folderMap = @{
    '1001' = 'Peter'
    '1002' = 'John'
    '1003' = 'Lucretia'
    '1004' = 'Matilda'
    '1005' = 'Henry'
}
# set source and destination paths
$rootFolder = 'X:\Where\the\files\are'
$destination = 'Y:\Where\the\files\should\go'
# loop over the files in the root path
Get-ChildItem -Path $rootFolder -Filter '*.xml' -File |
    Where-Object { $_.BaseName -match '^\d{4}\.' } |
    ForEach-Object {
        $prefix = ($_.Name -split '\.')[0]
        $targetPath = Join-Path -Path $destination -ChildPath $folderMap[$prefix]
        $_ | Move-Item -Destination $targetPath -WhatIf
    }
Remove the -WhatIf safety switch when you are satisfied with the results shown on screen.
You could use a switch statement to decide on the target folder based on the first part of the file name:
$files = Get-ChildItem path\to\folder\with\xml\files -Filter *.xml
switch ($files)
{
    {$_.Name -like '1001*'} {
        $_ | Move-Item -Destination 'C:\path\to\Peter'
    }
    {$_.Name -like '1002*'} {
        $_ | Move-Item -Destination 'C:\path\to\John'
    }
    {$_.Name -like '1003*'} {
        # etc...
    }
    default {
        Write-Warning "No matching destination folder for file '$($_.Name)'"
    }
}
If you change your mind about loops, my preference would be to store the mapping in a hashtable and loop over the entries for each file:
$files = Get-ChildItem path\to\folder\with\xml\files -Filter *.xml
$targetFolders = @{
    '1001' = 'C:\path\to\Peter'
    '1002' = 'C:\path\to\John'
    '1003' = 'C:\path\to\Paul'
    '1004' = 'C:\path\to\George'
    '1005' = 'C:\path\to\Ringo'
}
foreach ($file in $files) {
    $targetFolder = $targetFolders.Keys.Where({$file.Name -like "${_}*"}, 'First')
    $file | Move-Item -Destination $targetFolder
}
I created a text file containing the child items of the parent key. I removed the header and re-saved the file. I am able to iterate through the file, but I am unable to ferret out the items in the new parent directory.
#create txt for array
Get-Item -Path HKLM:\test\Software\Microsoft\Windows\Shell\Bags\* | Out-File C:\test\shell.txt
$files = "C:\test\shell.txt"
#removes header
get-content $files | select -Skip 7 | set-content "$files-temp"
move "$files-temp" $files -Force
#iterates array
Get-Content $files | ForEach-Object { Get-Item -Path HKLM:\test\Software\Microsoft\Windows\Shell\Bags\$_\* }
I need to be able to iterate through the list and obtain the information inside the folders being iterated.
Current output example:
Hive: HKEY_LOCAL_MACHINE\TEST\Software\Microsoft\Windows\Shell\Bags
Name Property
---- --------
1
10
11
12
13
14
_____________________________________________________________________________
Solution
$filedir = "HKLM:\test\Software\Microsoft\Windows\Shell\Bags"
foreach ($file in Get-ChildItem -Recurse $filedir) {
    echo $file >> "C:\test\shell.csv"
}
Actually, I don't like how PowerShell does the registry that much. Here's a script called "get-itemproperty2.ps1":
param([parameter(ValueFromPipeline)]$key)

process {
    $valuenames = $key.GetValueNames()
    if ($valuenames) {
        $valuenames | foreach {
            $value = $_
            [pscustomobject] @{
                Path  = $key -replace 'HKEY_CURRENT_USER', 'HKCU:' -replace 'HKEY_LOCAL_MACHINE', 'HKLM:'
                Name  = $value
                Value = $key.GetValue($value)
                Type  = $key.GetValueKind($value)
            }
        }
    } else {
        [pscustomobject] @{
            Path  = $key -replace 'HKEY_CURRENT_USER', 'HKCU:' -replace 'HKEY_LOCAL_MACHINE', 'HKLM:'
            Name  = ''
            Value = ''
            Type  = ''
        }
    }
}
With that in place, you can do:
get-item HKLM:\Software\Microsoft\Windows\currentversion\run | get-itemproperty2
Path Name Value Type
---- ---- ----- ----
HKLM:\Software\Microsoft\Windows\currentversion\run SecurityHealth C:\Program Files\Windows Defender\MSASCuiL.exe ExpandString
HKLM:\Software\Microsoft\Windows\currentversion\run DagentUI C:\Program Files\Altiris\Dagent\dagentui.exe String
HKLM:\Software\Microsoft\Windows\currentversion\run KeyAccess kass.exe String
You can easily export that to CSV. Notice that I used Get-Item to get the top-level key's properties; Get-ChildItem can't even do that. But you can pipe Get-ChildItem -Recurse to get-itemproperty2.
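For example, exporting a whole subtree to CSV might look like this (a sketch, assuming get-itemproperty2.ps1 is saved in the current directory and the output path exists):

```powershell
# Recurse through a registry subtree, expand each key's values with the
# script above, and write the results to a CSV file.
Get-ChildItem -Recurse HKLM:\Software\Microsoft\Windows\CurrentVersion\Run |
    .\get-itemproperty2.ps1 |
    Export-Csv -Path C:\test\registry-values.csv -NoTypeInformation
```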
The folder structure is:
--root
--root\source-code\
--root\powershell-scripts\
I need the method below that is inside the \powershell-scripts folder to target files inside \source-code:
function Test($param)
{
    dir -Include ASourceCodeFile.txt -Recurse |
        % { SomeMethod $_ $param }
}
What am I missing?
The $PSScriptRoot automatic variable contains the path of the directory in which the current script is located. Use Split-Path to find its parent (your --root) and Join-Path to get the path to the source-code folder:
Join-Path -Path (Split-Path $PSScriptRoot -Parent) -ChildPath 'source-code'
$PSScriptRoot was introduced in PowerShell 3.0.
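If you need to support hosts where $PSScriptRoot is not populated (PowerShell 2.0 scripts, or code pasted into a console), a common fallback is $MyInvocation; a hedged sketch:

```powershell
# Prefer $PSScriptRoot; fall back to $MyInvocation when it's empty.
$scriptRoot = $PSScriptRoot
if (-not $scriptRoot -and $MyInvocation.MyCommand.Path) {
    $scriptRoot = Split-Path -Parent $MyInvocation.MyCommand.Path
}
if ($scriptRoot) {
    # Parent of the script folder, then down into source-code:
    Join-Path -Path (Split-Path $scriptRoot -Parent) -ChildPath 'source-code'
}
```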
A bit late, but maybe still helpful for someone:
Directory structure:
MyRoot\script\scriptrunning.ps1
config:
MyRoot\config\config.xml
To read the xml file from scriptrunning.ps1:
[xml]$Config = Get-Content -path "${PSScriptRoot}\..\config\config.xml"
If you have a script in --root\powershell-scripts\ and you want to reference something in --root\source-code\ (or, say, Get-Content a file there), you can do this:
cd --root\powershell-scripts\
get-content '..\source-code\someFile.txt'
The ..\ references the parent directory, which contains \source-code\, and then you reference or pull in files or scripts from that directory.
This was a trick that I used in VBS and converted to PowerShell:
$scriptPath = Split-Path $MyInvocation.MyCommand.Path -Parent
$a = $scriptPath.Split("\")
for ($i = 0; $i -lt $a.Count - 1; $i++) {
    $parentDir = $parentDir + $a[$i]
    if ($i -lt $a.Count - 2) { $parentDir = $parentDir + "\" }
}
Write-Output $parentDir
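The loop above can be reduced to a couple of Split-Path calls (assuming the goal is the parent of the folder containing the script); Get-ScriptParentDir is a hypothetical name for the sketch:

```powershell
function Get-ScriptParentDir {
    param([string]$ScriptFile)  # full path to the running script
    # First Split-Path drops the file name, second drops the script's folder.
    Split-Path (Split-Path $ScriptFile -Parent) -Parent
}

# Usage from within a script:
# Get-ScriptParentDir $MyInvocation.MyCommand.Path
```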