How to remove partial path from Get-Location output? - windows

I'm trying to write a custom prompt for PowerShell and I was wondering how I would filter out the 1...n directories in the output of Get-Location.
function prompt {
    "PS " + $(get-location) + "> "
}
So, if the path is too long I would like to omit some of the directories and just display PS...blah\blah> or something. I tried (get-container) - 1 but it doesn't work.

Use Split-Path with the -Leaf parameter if you want just the last element of a path:
function prompt {
    "PS {0}> " -f (Split-Path -Leaf (Get-Location))
}
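For example, Split-Path -Leaf keeps only the final element of whatever path it is given (made-up path shown):
Split-Path -Leaf 'C:\Users\me\Documents\Projects'
Projects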

I wanted to make a more dynamic function, using just basic string manipulation. You could do some logic with nested Split-Path calls, but the string manipulation approach is much more terse. Since what you want returned won't be a fully validated path anyway, I feel better offering this solution.
Function Get-PartialPath($path, $depth){
    If(Test-Path $path){
        "PS {0}>" -f (($path -split "\\")[-$depth..-1] -join "\")
    } else {
        Write-Warning "$path is not a valid path"
    }
}
Sample Function call
Get-PartialPath C:\temp\folder1\sfg 2
PS folder1\sfg>
So you can use this simple function. Pass in a string for the path; assuming it is valid, the path is carved up into as many trailing chunks as you want and rebuilt with -join. If you give a $depth number that is too high, the whole path is returned. So if you only want three folders shown, set $depth to 3.
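For instance, with the sample path above (which has to exist, since the function calls Test-Path), a depth of 3 keeps the last three folders:
Get-PartialPath C:\temp\folder1\sfg 3
PS temp\folder1\sfg>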

Ansgar Wiechers' answer will give you the last directory but if you want a way to do multiple directories at the end of the filepath (using the triple dot notation) you can cast the directory path to a uri and then just get and join the segments:
function prompt {
    $curPath = pwd
    $pathUri = ([uri] $curPath.ToString())
    if ($pathUri.Segments.Count -le 3) {
        "PS {0}>" -f $curPath
    } else {
        "PS...{0}\{1}>" -f $pathUri.Segments[-2..-1].trim("/") -join ""
    }
}
Or using just a string (no uri cast)
function prompt {
    $curPath = pwd
    $pathString = $curPath.ToString().split('\') #Changed; no reason for escaping
    if ($pathString.Count -le 3) {
        "PS {0}>" -f $curPath
    } else {
        "PS...{0}\{1}>" -f $pathString[-2..-1] -join ""
    }
}
$a = prompt
Write-Host $a
Then just change -2 to whatever you want to be the first directory and -le 3 to match. I typically use the uri cast when I have to run stuff through a browser or over connections to Linux machines (as it uses "/" as a path separator) but there is no reason to not use the string method for normal operations.
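For example, a variant of the string version that keeps the last three segments could look like this (note that -3 and -le 4 change together):
function prompt {
    $segments = (Get-Location).Path.Split('\')
    if ($segments.Count -le 4) {
        "PS {0}>" -f (Get-Location)
    } else {
        "PS...{0}>" -f ($segments[-3..-1] -join '\')
    }
}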

Related

Powershell IF conditional isn't firing in the way I expected. Unsure what I'm doing wrong

I am writing a simple script that makes use of 7zip's command-line to extract archives within folders and then delete the original archives.
There is a part of my script that isn't behaving how I would expect it to. I can't get my if statement to trigger correctly. Here's a snippet of the code:
if($CurrentRar.Contains(".part1.rar")){
    [void] $RarGroup.Add($CurrentRar)
    # Value of CurrentRar:
    # Factory_Selection_2.part1.rar
    $CurrentRarBase = $CurrentRar.TrimEnd(".part1.rar")
    # Value: Factory_Selection_2
    for ($j = 1; $j -lt $AllRarfiles.Count; $j++){
        $NextRar = $AllRarfiles[$j].Name
        # Value: Factory_Selection_2.part2.rar
        if($NextRar.Contains("$CurrentRarBase.part$j.rar")){
            Write-Host "Test Hit" -ForegroundColor Green
            # Never fires, and I have no idea why
            # [void] $RarGroup.Add($NextRar)
        }
    }
    $RarGroups.Add($RarGroup)
}
if($NextRar.Contains("$CurrentRarBase.part$j.rar")) is the line that I can't get to fire.
If I shorten it to if($NextRar.Contains("$CurrentRarBase.part")), it fires true. But as soon as I add the inline $j it always triggers false. I've tried casting $j to string but it still doesn't work. Am I missing something stupid?
Appreciate any help.
The issue seems to be your for statement combined with the fact that arrays / lists are zero-indexed (meaning they start at 0).
In your case, index 0 of $AllRarfiles is probably the part1 file, yet your for statement starts at 1. So for index 1 the string you build contains part1 ($NextRar.Contains("$CurrentRarBase.part$j.rar")), but the file name at that index is part2 ($j + 1), and the comparison never matches.
As a table comparison:

Index / $j   Value                           Built string for comparison (with index)
0            Factory_Selection_2.part1.rar   Factory_Selection_2.part0.rar
1            Factory_Selection_2.part2.rar   Factory_Selection_2.part1.rar
2            Factory_Selection_2.part3.rar   Factory_Selection_2.part2.rar
3            Factory_Selection_2.part4.rar   Factory_Selection_2.part3.rar
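One minimal fix along those lines, assuming the rest of the loop stays as posted, is to build the comparison string with the next part number ($j + 1) instead of $j:
for ($j = 1; $j -lt $AllRarfiles.Count; $j++){
    $NextRar = $AllRarfiles[$j].Name
    if($NextRar.Contains("$CurrentRarBase.part$($j + 1).rar")){
        Write-Host "Test Hit" -ForegroundColor Green
        [void] $RarGroup.Add($NextRar)
    }
}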
Another simpler approach
Since it seems you want to group split RAR files which belong together, you could also use a simpler approach with Group-Object
# collect and group all RAR files.
$rarGroups = Get-ChildItem -LiteralPath 'C:\somewhere\' -Filter '*.rar' | Group-Object -Property { $_.Name -replace '\.part\d+\.rar$' }
# do some stuff afterwards
foreach($rarGroup in $rarGroups){
    Write-Verbose -Verbose "Processing RAR group: $($rarGroup.Name)"
    foreach($rarFile in $rarGroup.Group) {
        Write-Verbose -Verbose "`tCurrent RAR file: $($rarFile.Name)"
        # do some stuff per file
    }
}
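The grouping key works because -replace strips the trailing .partN.rar, so every part of one archive produces the same key. For example:
'Factory_Selection_2.part1.rar' -replace '\.part\d+\.rar$'   # Factory_Selection_2
'Factory_Selection_2.part2.rar' -replace '\.part\d+\.rar$'   # Factory_Selection_2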

If File Exists Just Change File Name

Am I missing the obvious here, or have I coded incorrectly? While processing, I simply want to check whether the file exists; if it does, save to the exact same location but append "_RoundTwo" to the name of the second file. My syntax doesn't error, but the second file is never created. Can someone point out my error?
$SaveLocation = "C:\Completed\"
$WorkbookName = "Intro"
if ((Test-Path $SaveLocation\$WorkbookName + ".csv"))
{
    [IO.Path]::GetFileNameWithoutExtension($WorkbookName) + "_RoundTwo" + [IO.Path]::GetExtension($WorkbookName)
}
[IO.Path]::GetFileNameWithoutExtension
That method will not create a file, it just returns a string containing the filename with its extension stripped off.
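For example, calling it directly just yields a string; nothing is created or renamed on disk:
[IO.Path]::GetFileNameWithoutExtension('C:\Completed\Intro.csv')   # returns 'Intro'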
If you want to copy the file, then you need to copy it. There is a simpler way, making use of the fact that a pipeline with no objects in it does nothing:
dir "$SaveLocation\$WorkbookName.csv" |
    foreach-object {
        # build the destination: same folder, same base name with "_RoundTwo" appended, same extension
        $dest = $_.DirectoryName +
            '\' +
            [io.path]::GetFileNameWithoutExtension($_.FullName) +
            '_RoundTwo' +
            $_.Extension
        copy-item $_.FullName $dest
    }
If the dir does not match a file, then there is no object on the pipeline for foreach-object to process. Also the pipeline variable $_ contains lots of information to reuse (look at the results of dir afile | format-list *).

Awk command for powershell

Is there any command like awk in powershell?
I want to execute this command:
awk '
BEGIN {count=1}
/^Text/{text=$0}
/^Time/{time=$0}
/^Rerayzs/{retext=$0}
{
if (NR % 3 == 0) {
printf("%s\n%s\n%s\n", text, time, retext) > (count ".txt")
count++
}
}' file
to a powershell command.
Usually we like to see what you have tried. It at least shows that you are making an effort, and we aren't just doing your work for you. I think you're new to PowerShell, so I'm going to just spoon-feed you an answer, hoping that you use it to learn and expand your knowledge, and hopefully have better questions in the future.
I am pretty sure that this will accomplish the same thing as what you laid out. You have to give it an array of input (the contents of a text file, an array of strings, something like that), and it will generate several files depending on how many matches it finds for the trio "Text", "Time", and "Rerayzs". It will order them as Text, then a new line with Time, and then a new line with Rerayzs.
$Text,$Time,$Retext = $Null
$FileCounter = 1
gc c:\temp\test.txt | %{
    Switch($_){
        {$_ -match "^Text"}    {$Text = $_}
        {$_ -match "^Time"}    {$Time = $_}
        {$_ -match "^Rerayzs"} {$Retext = $_}
    }
    If($Text -and $Time -and $Retext){
        ("{0}`n{1}`n{2}") -f $Text,$Time,$Retext > "c:\temp\$FileCounter.txt"
        $FileCounter++
        $Text,$Time,$Retext = $Null
    }
}
That will get the text of a file C:\Temp\Test.txt and will output numbered files to the same location. The file I tested against is:
Text is good.
Rerayzs initiated.
Stuff to not include
Time is 18:36:12
Time is 20:21:22
Text is completed.
Rerayzs failed.
I was left with 2 files as output. The first reads:
Text is good.
Time is 18:36:12
Rerayzs initiated.
The second reads:
Text is completed.
Time is 20:21:22
Rerayzs failed.

How can I make Perl's File::Find faster?

I have a folder named Lib and I am using the File::Find module to search that folder in whole dir say, D:\. It's taking a long time to search, say even 5 mins if the drive has a lot of subdirectories. How can I search that Lib faster so it will be done in seconds?
My code looks like this:
find( \&Lib_files, $dir);
sub Lib_files
{
    return unless -d;
    if ($_ =~ m/^([L|l]ib(.*))/)
    {
        print "$_";
    }
    return;
}
Searching the file system without a preexisting index is IO bound. Otherwise, products ranging from locate to Windows Desktop Search would not exist.
Type D:\> dir /b/s > directory.lst and observe how long it takes for that command to run. You should not expect to beat that without indexing files first.
One major improvement you can make is to print less often. A minor improvement is not to use capturing parentheses if you are not going to capture:
my @dirs;
sub Lib_files {
    return unless -d $File::Find::name;
    if ( /^[Ll]ib/ ) {
        push @dirs, $File::Find::name;
    }
    return;
}
On my system, a simple script using File::Find to print the names of all subdirectories under my home directory with about 150,000 files takes a few minutes to run compared to dir %HOME% /ad/b/s > dir.lst which completes in about 20 seconds.
I would be inclined to use:
use File::Basename;
my @dirs = grep { fileparse($_) =~ /^[Ll]ib/ }
           split /\n/, `dir %HOME% /ad/b/s`;
which completed in under 15 seconds on my system.
If there is a chance there is some other dir.exe in %PATH%, cmd.exe's built-in dir will not be invoked. You can use qx! cmd.exe /c dir %HOME% /ad/b/s ! to make sure that the right dir is invoked.
How about not using the File::Find module at all:
use Cwd;

sub find {
    my ($wdir) = shift;
    my ($sdir) = &cwd;
    chdir($wdir) or die "Unable to enter dir $wdir:$!\n";
    opendir(DIR, ".") or die "Unable to open $wdir:$!\n";
    foreach my $name (readdir(DIR)) {
        next if ($name eq ".");
        next if ($name eq "..");
        if (-d $name) {
            &find($name);
            next;
        }
        print $name . "\n";
    }
    closedir(DIR);
    # restore the directory we started from so the caller's cwd is unchanged
    chdir($sdir) or die "Unable to change to dir $sdir:$!\n";
}

&find(".");

Locating a file on the path

Does anybody know how to determine the location of a file that's in one of the folders specified by the PATH environmental variable other than doing a dir filename.exe /s from the root folder?
I know this is stretching the bounds of a programming question but this is useful for deployment-related issues, also I need to examine the dependencies of an executable. :-)
You can use the where.exe utility in the C:\Windows\System32 directory.
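For example (from PowerShell you need the .exe suffix, since where on its own resolves to the Where-Object alias), this typically prints something like:
where.exe notepad.exe
C:\Windows\System32\notepad.exe
C:\Windows\notepad.exe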
For WindowsNT-based systems:
for %i in (file) do @echo %~dp$PATH:i
Replace file with the name of the file you're looking for.
If you want to locate the file at the API level, you can use PathFindOnPath. It has the added bonus of being able to specify additional directories, in case you want to search in additional locations apart from just the system or current user path.
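For instance, a rough P/Invoke sketch from PowerShell (the signature follows the shlwapi documentation; notepad.exe is just a placeholder file to look up):
# Declare PathFindOnPath from shlwapi.dll
Add-Type -Namespace Win32 -Name NativeMethods -MemberDefinition @'
[DllImport("shlwapi.dll", CharSet = CharSet.Unicode)]
public static extern bool PathFindOnPath([In, Out] System.Text.StringBuilder pszFile, [In] string[] ppszOtherDirs);
'@

# Seed a MAX_PATH-sized buffer with the file name; the call rewrites it with the full path on success
$buffer = New-Object System.Text.StringBuilder -ArgumentList 'notepad.exe', 260
if ([Win32.NativeMethods]::PathFindOnPath($buffer, $null)) {   # $null = search only the standard locations and PATH
    $buffer.ToString()                                         # e.g. C:\WINDOWS\system32\notepad.exe
}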
On Windows I'd say use %WINDIR%\system32\where.exe
Your question's title doesn't specify Windows, so I imagine some folks might find this question looking for the same thing with a POSIX OS in mind (like myself).
This PHP snippet might help them:
<?php
function Find( $file )
{
    foreach( explode( ':', $_ENV[ 'PATH' ] ) as $dir )
    {
        $command = sprintf( 'find -L %s -name "%s" -print', $dir, $file );
        $output = array();
        $result = -1;
        exec( $command, $output, $result );
        if ( count( $output ) == 1 )
        {
            return( $output[ 0 ] );
        }
    }
    return null;
}
?>
This is slightly altered production code I'm running on several servers (i.e., taken out of its OO context, with some sanitization and error checking left out for brevity).
Using PowerShell on Windows...
Function Get-ENVPathFolders {
    #.Synopsis Split $env:Path into an array
    #.Notes
    # - Handle 1) folders ending in a backslash 2) double-quoted folders 3) folders with semicolons 4) folders with spaces 5) double-semicolons i.e. blanks
    # - Example path: 'C:\WINDOWS\;"C:\Path with semicolon; in the middle";"E:\Path with semicolon at the end;";;C:\Program Files;
    # - 2018/01/30 by Chad@ChadsTech.net - Created
    $NewPath = @()
    $env:Path.ToString().TrimEnd(';') -split '(?=["])' | ForEach-Object { # remove a trailing semicolon from the path, then split it into an array using a double-quote as the delimiter, keeping the delimiter
        If ($_ -eq '";') { # throw away a blank line
        } ElseIf ($_.ToString().StartsWith('";')) { # if the line starts with "; remove the "; and any trailing backslash
            $NewPath += ($_.ToString().TrimStart('";')).TrimEnd('\')
        } ElseIf ($_.ToString().StartsWith('"')) { # if the line starts with " remove the " and any trailing backslash
            $NewPath += ($_.ToString().TrimStart('"')).TrimEnd('\')
        } Else { # split by semicolon and remove any trailing backslash
            $_.ToString().Split(';') | ForEach-Object { If ($_.Length -gt 0) { $NewPath += $_.TrimEnd('\') } }
        }
    }
    Return $NewPath
}
$myFile = 'desktop.ini'
Get-ENVPathFolders | ForEach-Object { If (Test-Path -Path $_\$myFile) { Write-Output "Found [$_\$myFile]" } }
I also blogged the answer with some details over at http://blogs.catapultsystems.com/chsimmons/archive/2018/01/30/parse-envpath-with-powershell
In addition to the 'where' (MS Windows) and 'which' (unix/linux) utilities, I have written my own utility which I call 'findinpath'. In addition to finding the executable that would be executed if handed to the command line interpreter (CLI), it finds all matches, returned in path-search order, so you can track down path-order problems. In addition, my utility returns not just executables, but any file-specification match, to catch those times when a desired file isn't actually executable.
I also added a feature that has turned out to be very nifty; the -s flag tells it to search not just the system path, but everything on the system disk, known user-directories excluded. I have found this feature to be incredibly useful in systems administration tasks...
Here's the 'usage' output:
usage: findinpath [ -p <path> | -path <path> ] | [ -s | -system ] <file>
or findinpath [ -h | -help ]
where: <file> may be any file spec, including wild cards
-h or -help returns this text
-p or -path uses the specified path instead of the PATH environment variable.
-s or -system searches the system disk, skipping /d /l/ /nfs and /users
Writing such a utility is not hard and I'll leave it as an exercise for the reader. Or, if asked here, I'll post my script - it's in 'bash'.
Just for kicks, here's a one-liner PowerShell implementation:
function PSwhere($file) { $env:Path.Split(";") | ? { test-path $_\$file* } }
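Usage is the same idea as where.exe, except it prints the PATH folders that contain a match (output varies by machine):
PSwhere notepad.exe
C:\WINDOWS\system32
C:\WINDOWS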
