How to step backwards in a path using PowerShell (Windows)

I am searching through folders to find the one that has the contents I need.
$path = dir "C:\windows\ccmcache\*\Office.en-us" -Directory
echo $path
It returns:
Directory: C:\windows\ccmcache\c
But when I run my command:
Start-Process "$path\setup.exe /uninstall ProPlus /config Uninstall.xml" -Wait
It tries to run:
C:\windows\ccmcache\c\Office.en-us\setup.exe............
Which doesn't exist! So how can I go back a step so I can run the setup.exe command out of the c folder?
Something like:
$path2 = $path\cd..
Thank you all in advance.

You can simply do:
$Path2 = Resolve-Path (Join-Path $Path '..')
Note:
Join-Path is the cross-platform way of concatenating path strings.
Resolve-Path gives you a fully qualified path name.
The Resolve-Path step is optional, since Windows will traverse the .. for you, but it helps to see the actual folder the path resolves to.
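Applying that to the original command: Start-Process treats its first positional argument as the file to run, so the arguments have to be passed separately. A minimal sketch, assuming setup.exe accepts those switches:
Start-Process -FilePath (Join-Path $Path2 'setup.exe') -ArgumentList '/uninstall', 'ProPlus', '/config', 'Uninstall.xml' -Wait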
Does this help?

You are using Get-ChildItem to return a System.IO.DirectoryInfo object. The path you are looking for already exists there as the Parent property.
$path2 = $path.Parent.FullName
No other cmdlets are needed here. You don't even need to save it into another variable if you don't want to.
Beware that $path could contain multiple results, which will have consequences later in your code. If you only care about the first one, you can append | Select-Object -First 1 to guarantee a single result.
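Putting it together, a minimal sketch (Select-Object -First 1 guards against multiple matches):
$path = Get-ChildItem 'C:\windows\ccmcache\*\Office.en-us' -Directory | Select-Object -First 1
$path2 = $path.Parent.FullName   # e.g. C:\windows\ccmcache\c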

It can be done simply using the Resolve-Path cmdlet.
Suppose the structure is as follows:
root
    Folder1
    Folder2
Our current working directory is Folder1, and we want to reach a file in Folder2:
$path2 = Resolve-Path "$path\..\Folder2\fileinFolder2"

I'm not sure I completely understand what you're asking. Do you mean how to move back up in your file directory? That command is cd ..
Or do you mean how to call $path and THEN move one level higher in the directory? If so, you'll need to create a new variable that points one level higher before calling your setup.exe, e.g. with the one-liner below.
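For the second case, a one-line sketch using Split-Path (reusing $path from the question):
$parent = Split-Path $path.FullName -Parent   # one level up from Office.en-us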

Related

How do I reference multiple paths in PowerShell?

I am writing a script that pulls from multiple directories located at the root folder of the script. When I use this:
$ScriptPath="$PSScriptRoot\Scripts"
$BinaryPath="$PSScriptRoot\Binaries"
$DataFolderPath="$PSScriptRoot\Data"
PowerShell complains about two of the three paths, saying they can't be found.
I also tried this, but no luck:
$ScriptPath=Join-Path (Split-Path $PSScriptRoot) -ChildPath Scripts
$BinaryPath=Join-Path (Split-Path $PSScriptRoot) -ChildPath Binaries
$DataFolderPath=Join-Path (Split-Path $PSScriptRoot) -ChildPath Data
I would try:
$ScriptPath=$PSScriptRoot+"\Scripts"
$BinaryPath=$PSScriptRoot+"\Binaries"
$DataFolderPath=$PSScriptRoot+"\Data"
How are you running this?
$PSScriptRoot is only populated when you run the full script file; it is empty when you run code in the ISE/VSCode console or another editor pane.
Lastly, $PSScriptRoot, as documented...
About Automatic Variables
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_automatic_variables?view=powershell-7.1
$PSScriptRoot — Contains the full path of the executing script's parent directory.
Thus, if the script's directory does not contain that set of named subfolders, you should expect this to fail.
If you want this to work, check for the existence of the subfolders before you begin; if they are not already there, have your script create them, then run the rest of your code.
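A minimal sketch of such a pre-flight check, assuming the three folder names from the question:
foreach ($name in 'Scripts', 'Binaries', 'Data') {
    $dir = Join-Path $PSScriptRoot $name
    # Create the subfolder if it doesn't exist yet
    if (-not (Test-Path $dir)) {
        New-Item -Path $dir -ItemType Directory | Out-Null
    }
}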

PS script copy new files

I'm pretty new at writing PS scripts. I wrote this up and have been actively using it, although it still requires some manual intervention; I would like to automate my goal completely.
I will try my best to explain clearly.
I am trying to copy '.bak' files to a specific directory from a source folder that has files dropped into it on a daily basis. The problem is that, the way I created the script, every time it runs it creates a new folder with some of the same files as previously copied.
The files being copied all follow the same name structure in date sequence;
xxxx_2018_01_01_2131231.bak
xxxx_2018_01_02_2133212.bak
xxxx_2018_01_03_2199531.bak
How could I write the script so that it copies only newer files and not what has already been copied previously?
It would also be nice to only create a new folder when a certain part of the file name changes.
Here is the script:
$basedir = "Path:\path"
$today = (Get-Date).ToString('MM_dd_yy')
$Filter = '*.bak'
$location = New-Item -Path $basedir -Type Directory -Name $today
Copy-Item -Path 'Path:\path' -Destination $location -Filter $Filter -Recurse
Any pointers are greatly appreciated!
Thanks in advance for your help!
I'm not sure if there is an easy way to code this, but the general answer would be using the Get-ChildItem cmdlet.
"The Get-ChildItem cmdlet gets the items in one or more specified locations. If the item is a container, it gets the items inside the container, known as child items. You can use the -Recurse parameter to get items in all child containers and use the -Depth parameter to limit the number of levels to recurse."
By using Get-ChildItem, you can get the listing of files in both directories and compare them to see if they have the same name, then build an if() condition based on whatever criteria you wish to compare on.
It's not the complete answer, but it is a good starting point.
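A sketch along those lines (the paths are hypothetical placeholders): list the .bak files already in the destination, then copy only the source files whose names aren't in that list.
$source = 'C:\source'   # hypothetical paths
$dest   = 'C:\dest'
$copied = Get-ChildItem -Path $dest -Filter '*.bak' -Recurse | Select-Object -ExpandProperty Name
Get-ChildItem -Path $source -Filter '*.bak' |
    Where-Object { $copied -notcontains $_.Name } |
    Copy-Item -Destination $dest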
Thanks everyone for pitching in, much appreciated!
I have switched over to the batch file route and have created the following to accomplish my goal:
@echo off
setlocal
set _source="C:\users\user1\desktop\source"
set _dest="C:\users\user1\desktop\dest"
robocopy %_source% %_dest% *.bak /mir /XC /XN /XO
Any opinion on this script is encouraged!
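For reference: /mir mirrors the source tree (including deleting extra files in the destination), while /XC, /XN and /XO exclude changed, newer and older files, so only files missing from the destination get copied. A rough PowerShell sketch of just the copy-only-new-files part (it does not mirror deletions):
$source = 'C:\users\user1\desktop\source'
$dest   = 'C:\users\user1\desktop\dest'
Get-ChildItem -Path $source -Filter '*.bak' |
    Where-Object { -not (Test-Path (Join-Path $dest $_.Name)) } |
    Copy-Item -Destination $dest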

How to use short-cut paths to Compress-Archive to zip current folder into same destination

I am using Compress-Archive and want to zip the current directory into the same path. However I do not want to have to type out the entire file path both times. Is there an easy way to do this?
I am using windows 10 pro.
This works for the most part: Compress-Archive . test.zip. But I want the archive to be on the same level as the current directory, so I need to put it back one spot.
Something like this is what I want:
path/test
path/test.zip
What I am getting:
path/test
path/test/test.zip
The archive is going inside the actual folder, which is not what I want.
You probably want this:
Compress-Archive * ..\test.zip
The wildcard * prevents the name of the folder itself from being put inside the zip.
Using .. for the output path goes one level up in the directory tree.
This command will fail if test.zip already exists. Either add the parameter -Update to update the archive or add -Force to overwrite it. Both can be used even if the archive does not already exist.
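For example (using the test.zip name from the question):
Compress-Archive * ..\test.zip -Update   # add to / refresh an existing archive
Compress-Archive * ..\test.zip -Force    # overwrite an existing archive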
If the current working directory is "t", its contents can be compressed into .\t.zip using the following command. I would note that I do not think putting the destination .zip file inside the directory being compressed is a good idea.
Compress-Archive -Path $(Get-ChildItem -Recurse -Exclude t.zip) -DestinationPath .\t.zip -Force
It is shorter if you are willing to use aliases and cryptic switches.
Compress-Archive $(gci -r -e t.zip) .\t.zip -Force
If I have misinterpreted your situation, please leave a comment or improve the information provided by editing the question.

How do I find all exe files using the command line in Windows?

I'm a newbie and am trying to figure out how to use the command line. Could you please tell me what command I should enter to get a list of all the exe files on my computer? Thanks.
You can use dir to search a directory and all of its child directories while filtering on a particular file type:
dir /s /b *.exe | findstr /v .exe.
(The findstr /v .exe. part filters out names where .exe is followed by more text, such as *.exe.config files.)
If you want to find all the executable files that are on the path and/or in the current directory, i.e., all the files you can run from the command line without specifying a path, this should work:
where *.exe
To get the names of all .exe files that are currently running, type tasklist in cmd.
http://ss64.com/nt/tasklist.html
Here's another method I use a lot for tasks like this.
Open PowerShell and navigate to your root directory by entering the command:
cd c:/
cd stands for "change directory" and is an alias for the Set-Location cmdlet. We are setting the location to C:/.
Next run the following command:
Get-ChildItem -Filter "*.exe" -Recurse
Get-ChildItem is a cmdlet that gets the files and folders in a file system drive; by default it runs against whatever directory you are currently in.
-Filter "*.exe" specifies to only find file names that end in ".exe". (The * is a wildcard character, not a regular expression.)
-Recurse specifies to search all child directories. This makes the command run on "C:/", but also on all child directories of C:/, and all child directories of those directories, and so on, searching the entire drive.
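Put together as one command, a sketch: -ErrorAction SilentlyContinue is an extra (not part of the steps above) that suppresses the access-denied errors you would otherwise get on protected system folders.
Get-ChildItem -Path C:\ -Filter '*.exe' -Recurse -ErrorAction SilentlyContinue |
    Select-Object -ExpandProperty FullName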

Powershell find and replace not recursing through sub-directories

I would like to recurse through all subfolders of a given folder, and for all files of a given extension (optionally wildcard), search and replace some piece of text.
There's a similar previous question here, and I'm trying to get a modified version of the top answer working for my purpose. It uses PowerShell for the find and replace. However, my modified version below is not recursing. I don't remember where I found the recurse part of the first line, so maybe I'm using it incorrectly.
Get-ChildItem *.* -Exclude *.ps1* -Recurse |
    ForEach-Object {
        $c = ($_ | Get-Content)
        $c = $c -replace 'foo','bar'
        [IO.File]::WriteAllText($_.FullName, ($c -join "`r`n"))
    }
I am running the code from the PowerShell command line as a .ps1 script file, hence the exclusion of that extension.
This works fine if I run it directly in a folder that has some files I want searched, but if I run it from a parent of that folder, the search/recursion of subfolders does not happen. It doesn't even try to, which I can tell because the command finishes instantaneously. If I run it in my first subfolder, which has about 50 text files, there's a couple seconds' delay before it finishes, indicating it is actually searching those files.
I've also tried this with the exclude portion removed and the behaviour is the same.
My guess is that *.* matches only items with a dot in their name. Folders usually have no extension, so they aren't matched and never get recursed into.
gci -rec -force -exclude *.ps1 | ... should do what you want; see the fuller sketch below.
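A fuller sketch of the fixed loop (the -File switch, available in PowerShell 3.0 and later, keeps directories out of the pipeline so Get-Content only ever sees files):
Get-ChildItem -Recurse -Force -File -Exclude *.ps1 |
    ForEach-Object {
        $c = Get-Content $_.FullName
        $c = $c -replace 'foo', 'bar'
        [IO.File]::WriteAllText($_.FullName, ($c -join "`r`n"))
    }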
