I am writing an uninstall script for Office using PowerShell. I need the script to search through:
C:\windows\ccmcache\
There are a number of folders in there. Is there a way to find which folder has the contents that I am looking for? So let's say I am looking for a folder that contains:
office.en-us
office64.en-us
and so on. How can I return that exact path? That is where I need to run the uninstall from, but the catch is that I don't know which folder inside ccmcache contains Office.
Thank you all in advance.
No loop needed.
dir "C:\windows\ccmcache\*\Office.en-us" -Directory
Note that dir is an alias for Get-ChildItem.
@Tomalak: when running the code you posted, it works if there aren't a large number of files and folders to sort through, but if you really have no idea where the file or folder you're looking for lives, specifying the -Recurse parameter would be helpful:
Get-ChildItem "C:\*\Office.en-us" -Recurse
While this is sometimes necessary (in this case it doesn't seem to be), it can take an extended amount of time to run if there are large folders to sort through...
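Either way, once a match comes back, the exact path the question asks for is simply the parent of the matched folder. Here is a minimal sketch of that idea; the setup.exe call at the end is only a placeholder for whatever uninstall command you actually run from that folder:
# Capture the parent folder of the matching Office.en-us directory
$officeSource = Get-ChildItem "C:\Windows\ccmcache\*\Office.en-us" -Directory | Select-Object -First 1
if ($officeSource) {
    $packageRoot = $officeSource.Parent.FullName    # the ccmcache subfolder holding the Office payload
    Write-Host "Office install source found in: $packageRoot"
    # & "$packageRoot\setup.exe" /uninstall ...     # placeholder for the actual uninstall command
}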
I'm pretty new at writing PowerShell scripts. I wrote this up and have been actively using it, although it still requires some manual intervention, and my goal is to automate the process completely.
I will try my best to explain clearly.
I am trying to copy '.bak' files to a specific directory from a source folder that has files dropped into it on a daily basis. The problem is that, the way I created the script, every time it runs it creates a new folder containing some of the same files that were already copied previously.
The files being copied all follow the same name structure, in date sequence:
xxxx_2018_01_01_2131231.bak
xxxx_2018_01_02_2133212.bak
xxxx_2018_01_03_2199531.bak
How could I write the script so that it copies newer files only and not what has already been copied previously?
It would also be nice to only create a new folder when a certain part of the file name changes.
Here is the script:
$basedir = "Path:\path"                              # destination root (placeholder path)
$today = (Get-Date).ToString('MM_dd_yy')             # date stamp used as the new folder name
$Filter = '*.bak'
$location = New-Item -Path $basedir -Type Directory -Name $today    # creates a new dated folder on every run
Copy-Item -Path 'Path:\path' -Destination $location -Filter $Filter -Recurse    # source is also a placeholder path
Any pointers are greatly appreciated!
Thanks in advance for your help!
I'm not sure if there is an easy way to code this, but the general answer would be using the Get-ChildItem cmdlet.
"The Get-ChildItem cmdlet gets the items in one or more specified locations. If the item is a container, it gets the items inside the container, known as child items. You can use the -Recurse parameter to get items in all child containers and use the -Depth parameter to limit the number of levels to recurse."
By using Get-ChildItem, you could get the listing of files in both directories and then compare them to see whether they have the same names. Then build an if() statement (or a Where-Object filter) based on the criteria you wish to compare them by; a sketch of that idea follows below.
It's not the complete answer, but it is a good starting point.
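A minimal sketch of that approach, assuming the placeholder paths below stand in for your real source folder and destination root (and that earlier copies live in subfolders under the destination):
$source = 'Path:\source'              # placeholder: folder where the .bak files are dropped
$destination = 'Path:\destination'    # placeholder: root of the copied files
# Names of .bak files already copied (searched recursively because copies sit in dated subfolders)
$existing = Get-ChildItem -Path $destination -Filter '*.bak' -Recurse | Select-Object -ExpandProperty Name
# Copy only the .bak files whose names are not already present in the destination
Get-ChildItem -Path $source -Filter '*.bak' |
    Where-Object { $existing -notcontains $_.Name } |
    Copy-Item -Destination $destination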
Thanks everyone for pitching in, much appreciated!
I have switched over to the batch file route and have created the following to accomplish my goal:
@echo off
setlocal
rem Source and destination folders
set _source="C:\users\user1\desktop\source"
set _dest="C:\users\user1\desktop\dest"
rem /mir mirrors the tree (removing destination files absent from the source); /XC /XN /XO skip changed, newer and older files, so only files missing from the destination are copied
robocopy %_source% %_dest% *.bak /mir /XC /XN /XO
Any opinion on this script is encouraged!
Is there a batch command which checks whether a specific directory (not recursing into subdirectories) has a file with a specific size (like 109485 bytes)?
If I have to scan the entire directory just to check this, is it possible to scan only the 5 most recently changed files and check whether their size matches?
I can't scan an entire directory because this code will execute every 3 seconds, so I really need something efficient.
If you're willing to involve PowerShell in the process, you could put something like this in a batch file.
PowerShell -Command "ls C:\Windows\System32\ | Where-Object {$_.length -eq 30720}"
This command lists all the files in System32 and then filters the list down to files that are exactly 30720 bytes. Where-Object can filter on plenty of other file properties and comparison operators as well.
https://learn.microsoft.com/en-us/previous-versions/windows/it-pro/windows-powershell-1.0/ee177028(v=technet.10)
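If you also want to limit the check to the 5 most recently changed files, as asked, a variation like the following could work; the folder path here is just a placeholder:
PowerShell -Command "Get-ChildItem 'C:\some\folder' -File | Sort-Object LastWriteTime -Descending | Select-Object -First 5 | Where-Object { $_.Length -eq 109485 }"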
I am using Compress-Archive and want to zip the current directory into the same path. However I do not want to have to type out the entire file path both times. Is there an easy way to do this?
I am using windows 10 pro.
This works for the most part: Compress-Archive . test.zip. But I want the archive to end up on the same level as the current directory, so I need to put it back one level.
Something like this is what I want:
path/test
path/test.zip
What I am getting:
path/test
path/test/test.zip
It is going inside the actual folder, which is not what I want.
You probably want this:
Compress-Archive * ..\test.zip
The wildcard * prevents the name of the folder itself from being stored inside the zip.
Using .. for the output path puts the archive one level up in the directory tree.
This command will fail if test.zip already exists. Either add parameter -update to update the archive or add -force to overwrite the archive. Both can be used even if the archive does not already exist.
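For example, assuming the layout from the question (run from inside path\test, producing path\test.zip next to it):
Set-Location path\test                  # placeholder path from the question
Compress-Archive * ..\test.zip -Force   # creates (or overwrites) path\test.zip one level up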
If the current working directory is "t", it can be included using the following command. I would note that I do not think putting the destination .zip file in the directory being compressed is a good idea.
Compress-Archive -Path $(Get-ChildItem -Recurse -Exclude t.zip) -DestinationPath .\t.zip -Force
It is shorter if you are willing to use aliases and cryptic switches.
Compress-Archive $(gci -r -e t.zip) .\t.zip -Force
If I have misinterpreted your situation, please leave a comment or improve the information provided by editing the question.
I've got 218GB of assorted files recovered from a failing hard drive using PhotoRec. The files do not have their original file names and they're not sorted in any manner.
How can I go about sorting the files into separate folders by file type? I've tried searching for .jpg, for example, and I can copy those results into a new folder. But when I search for something like .txt, I get 16GB of text files as the result and there's no way I've found to select them all and copy them into their own folder. The system just hangs.
This is all being done on Windows 10.
Open PowerShell. Change to the recovered data folder: cd c:\...\recovered_files. Make a directory for the text files: mkdir text_files. Do the move: mv *.txt text_files.
You really just want to move/cut the files like this instead of copying, because moving the files is just a name change (very fast), but to copy would have to duplicate all of the data (quite slow).
If your files are distributed among many directories, you would need something like a find command. In Linux, this would be quite simple with the find command. In Windows, I have never tried anything like this. On MSDN there is an article about PowerShell that features an example which seems reminiscent of what you want to do. MSDN Documentation
The gist of it is that you would use the command:
cd <your recovered files directory containing the recup_dir folders>
Get-ChildItem -Path ".\*.txt" -Recurse | Move-Item -Verbose -Destination "Z:\stock_recovered\TXT"
Note that the destination is outside of the search path, which might be important!
Since I have never tried this before, there is NO WARRANTY. Supposing it works, I would be curious to know.
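If you want to split everything out by file type in one pass instead of one extension at a time, a rough sketch like the following might help, with the same no-warranty caveat; it assumes you run it from the top of the recovered-files directory:
# Collect the file list first so newly created folders are not re-scanned mid-run
$files = Get-ChildItem -File -Recurse
foreach ($file in $files) {
    # Folder name taken from the extension, e.g. "jpg" or "txt"; extensionless files go to "no_extension"
    $ext = if ($file.Extension) { $file.Extension.TrimStart('.') } else { 'no_extension' }
    $target = Join-Path (Get-Location) $ext
    if (-not (Test-Path $target)) { New-Item -Path $target -ItemType Directory | Out-Null }
    Move-Item -Path $file.FullName -Destination $target
}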
I have a set of files in a folder with names like abcd.15678
I want to remove the . and replace it with _
Please suggest the Windows command to do this.
This solution is reposted from How to Batch Rename Files in Windows: 4 Ways to Rename Multiple Files by Chris Hoffman
PowerShell offers much more flexibility for renaming files in a command-line environment. Using PowerShell, you can pipe the output of one command (known as a "commandlet" in PowerShell terms) to another command, just like you can on Linux and other UNIX-like systems.
First of all, open Powershell ISE and then navigate to the directory (folder) that has the files and folders you'd like to rename by using this command:
cd "C:\your\directory\"
The two important commands you’ll need are Dir, which lists the files in the current directory, and Rename-Item, which renames an item (a file, in this case). Pipe the output of Dir to Rename-Item and you’re in business.
After you launch PowerShell ISE, use the cd command to enter the directory containing your files. You should put the files in their own directory so you don’t accidentally rename other files.
For example, let’s say we don’t want the dot character in our file names – we’d rather have an underscore instead.
The following command lists the files in the current directory and pipes the list to Rename-Item, which replaces each dot character with an underscore. Because -replace takes a regular expression, the literal dot has to be escaped as \. in the pattern:
Dir | Rename-Item -NewName { $_.Name -replace "\.","_" }
Consult Microsoft’s documentation on the Rename-Item commandlet if you want help performing other, more advanced operations.
There isn't a single Windows command to do this. You should consider writing a script of some sort that obtains a directory listing, enumerates each entry, changes the dot to an underscore, and calls the Windows rename command appropriately.
Actually, this works as well, using the string Replace method, which does a literal (non-regex) replacement:
Dir | Rename-Item -NewName { $_.Name.Replace(".","_") }
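One tip for either version: Rename-Item supports -WhatIf, so you can preview the renames before actually running them:
Dir | Rename-Item -NewName { $_.Name.Replace(".","_") } -WhatIf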