There's something I want to accomplish with administering my Windows file server:
I want to change the "Last Modified" date of all the folders on my server (just the folders and subfolders, not the files within them) to match the most recent "Created" (or maybe "Last Modified") date of any file within the folder. (In many cases, the date on the folder is MUCH newer than the newest file within it.)
I'd like to do this recursively, from the deepest subfolder to the root. I'd also like to do this without me manually entering any dates and times.
I'm sure that with a combination of scripting and a Windows port of "touch" I could maybe accomplish this. Do you have any suggestions?
This closed topic seems really close, but I'm not sure how to touch only the folders without touching the files inside them, or how to get the most recent file's date: Recursive touch to fix syncing between computers
If it's for backup purposes, in Windows there is the Archive flag (instead of modifying the timestamp). You can set it recursively with ATTRIB /S (see ATTRIB /?)
If it is for other purposes, you can use one of the touch.exe implementations together with a recursive FOR:
FOR /R (see FOR /?)
http://ss64.com/nt/for_r.html
http://ss64.com/nt/touch.html
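For reference, a minimal PowerShell sketch of the same "recursive touch, folders only" idea, in case you'd rather not install a touch port. The share path is a placeholder, and note this just stamps folders with the current time rather than the newest file's date:
# Stamp every folder (and only folders) under the share with "now".
# 'D:\Share' is a placeholder path.
Get-ChildItem -Path 'D:\Share' -Recurse |
    Where-Object { $_.PSIsContainer } |
    ForEach-Object { $_.LastWriteTime = Get-Date }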
I think that you can do this in PowerShell. I just tried throwing something together and it seems to work correctly. You could invoke this in PowerShell using Set-DirectoryMaxTime(".\Directory") and it will operate recursively on each directory under that.
function Set-DirectoryMaxTime([System.IO.DirectoryInfo]$directory)
{
    # Grab a list of all the files in the directory
    $files = Get-ChildItem -File $directory

    # Get the current CreationTime of the directory we are looking at
    $maxdate = Get-Date $directory.CreationTime

    # Find the most recently edited file's LastWriteTime
    foreach($file in $files)
    {
        if($file.LastWriteTime -gt $maxdate) { $maxdate = $file.LastWriteTime }
    }

    # This needs to be in a try/catch block because there is a reasonable chance of it failing
    # if a folder is currently in use
    try
    {
        # Give the directory a LastWriteTime equal to the newest file's LastWriteTime
        $directory.LastWriteTime = $maxdate
    } catch {
        # One of the directories could not be updated
        Write-Host "Could not update directory: $directory"
    }

    # Get all the subdirectories of this directory
    $subdirectories = Get-ChildItem -Directory $directory

    # Jump into each of the subdirectories and do the same thing to each of them
    foreach($subdirectory in $subdirectories)
    {
        Set-DirectoryMaxTime($subdirectory)
    }
}
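For example, to run it against a whole share (the path below is just a placeholder, and the function has to be defined in the session first):
Set-DirectoryMaxTime "D:\Share"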
I have several hundred folders where I will have multiple files called filename.ext but also another file called filename.ext.url
I need a way of checking: if filename.ext.url exists, does filename.ext also exist? If they both exist, delete filename.ext.url.
I can't just search for and delete all *.url files, as they will still be needed where the normal file does not exist.
I then need to repeat that in all subdirectories of a specific directory.
I don't mind if it's a batch script, a PowerShell script, or anything else really; I'm just stumped on how to do what I want.
Currently I'm doing it folder by folder, manually comparing file names, file sizes and file icons.
foreach ($file in ls -Recurse c:\files\*.url) {
    if (ls -ErrorAction Ignore "$($file.PSParentPath)\$($file.basename)") {
        remove-item $file.fullname -whatif
    }
}
Remove -WhatIf when you are ready to actually delete.
BaseName strips the last extension, so for file.ext.url it gives file.ext, and the script checks whether that file exists. BaseName also drops the path, so we pull the parent path back in via PSParentPath.
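If you want to see what BaseName gives you for one of these double extensions, the equivalent .NET helper shows it (the file name here is just an example):
# Strips only the final extension, i.e. the .url part
[System.IO.Path]::GetFileNameWithoutExtension('report.pdf.url')   # -> report.pdf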
An alternative way (that more closely matches what you described) is something like:
foreach ($file in ls -Recurse "c:\files\*.url") {
    # '\.url$' is a regex meaning ".url at the end of the name"; -replace with no
    # replacement string strips it, leaving the path of the companion file
    if (ls -ErrorAction Ignore ($file.FullName -replace '\.url$')) {
        remove-item $file.fullname -whatif
    }
}
for /r "startingdirectoryname" %b in (*.url) do if exist "%~dpnb" ECHO del "%b"
This is expected to be executed directly from the prompt. If it is used inside a batch file, each % needs to be doubled (i.e. %%).
This also assumes that the whatever.ext file is in the same directory as the whatever.ext.url file.
Note that the filenames that are to be deleted will merely be echoed to the console. To actually delete the files, remove the echo keyword.
Test against a test directory first!
[untested]
To check for "filename.ext", you can check for "filename.ext." instead; they refer to the same file.
In CMD.EXE, you can do: IF EXIST "filename.ext." CALL :DoIt
I downloaded a backup folder of about 3,000 files from our email service provider. None of the files have an associated filetype; instead the file extension was appended to the name of each individual file. For example:
community-involvement-photo-1-jpg
social-responsibility-31-2012-png
report-02-12-15-pdf
I can manually change the last dash to a period and the files work just fine. I'm wondering if there is a way to batch convert all of the files so they can be sorted and organized properly. I know in the Command Line I can do something like ren *. *.jpg but there are several different file types contained in the folder, so it wouldn't work for all of them. Is there any way I can tell it to convert the last "-" in each file name into a "." ?
I'm on Windows 10; unable to install any filename conversion programs unless I want to go through weeks of trouble with the IT group.
$ordner = "c:\temp\pseudodaten"
$Liste = (get-childitem -path $Ordner).Name
cd $ordner
foreach ($Datei in $Liste) {
    # Assumes every name ends in "-xxx" with a three-character extension:
    # keep everything except the last four characters, then re-append the
    # final three characters with a "." instead of the "-"
    $Length = $datei.length
    $NeuerName = $Datei.Substring(0, $Length - 4) + "." + $datei.Substring($Length - 3, 3)
    rename-item -Path $Datei -NewName $NeuerName
}
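If the extensions are not all exactly three characters long, a hedged alternative is to replace the last "-" with a regex instead of counting characters. The folder path below is a placeholder, and -WhatIf only previews the renames; remove it to apply them:
# Turn the final dash into a dot: report-02-12-15-pdf -> report-02-12-15.pdf
Get-ChildItem -Path 'C:\temp\backup' -File |
    Where-Object { $_.Name -match '-[^-]+$' } |
    Rename-Item -NewName { $_.Name -replace '-([^-]+)$', '.$1' } -WhatIf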
I feel like this one should be easy, but it's giving me a bit of trouble. The goal is to get downloaded wsus updates migrated to a separate drive that can be burned to a disk and moved to a secure environment.
The current process was set up by another guy and I am taking over. He has two batch files that run; we will call them first.bat and second.bat.
first.bat is run and spits out a log of how many new files there are; we'll call it new.txt. It simply contains the hash file paths for the changes, e.g. C:\folder\sub\1A\HASHFILE.cab
Then we copy the file paths in new.txt by hand and manually paste them into second.bat, adding an xcopy command and a destination folder on a new drive, e.g. xcopy C:\folder\sub\1A\HASHFILE.cab E:\export\1A. However, we have to manually add in the hash folder identifier (1A in this example). I would like the script to pick it up from the source folder path and add it to the destination, to eliminate the potential for human error.
What I am trying to accomplish with a script is this:
run first.bat
take the list of new files from new.txt
build an xcopy command for every new file in new.txt, including the destination folder path
automatically append the hash folder identifier to the destination folder path (e.g. E:\export\1A)
run second.bat
Anyone have any ideas? I would like to wrap all of this up (or a similar function if that's easier; I imagine it might be) into a handy script that automates this tedious process.
I have tinkered with a few ideas; I can get it to spit out the destination path, but it doesn't actually move anything. Here is what I have so far. It successfully tags the hash marker onto the end of the destination folder, but does nothing else:
$source = "C:\temp\test folder\"
$dest = "C:\Temp2\export\"
$folders = Get-ChildItem $source | Where-Object {$_.PSIsContainer -eq $true} | Sort-Object
foreach ($folder in $folders){
    $fname = $folder.FullName
    $sourcefolder = $fname.Replace($source,"")
    $newdest = $dest + $sourcefolder
    Write-Output $newdest
}
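One hedged way to finish that snippet is to create each hash folder under the destination and copy its contents across. This is only a sketch using the same test paths as above; it uses Copy-Item in place of xcopy, and it copies whole hash folders rather than only the files listed in new.txt:
$source = "C:\temp\test folder\"
$dest = "C:\Temp2\export\"
$folders = Get-ChildItem $source | Where-Object {$_.PSIsContainer -eq $true} | Sort-Object
foreach ($folder in $folders){
    # Recreate the hash folder (e.g. 1A) under the destination
    $newdest = Join-Path $dest $folder.Name
    if (-not (Test-Path $newdest)) {
        New-Item -ItemType Directory -Path $newdest | Out-Null
    }
    # Copy the contents of the source hash folder into it
    Copy-Item -Path (Join-Path $folder.FullName '*') -Destination $newdest -Recurse -Force
}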
I have an existing directory, let's say "C:\Users\Test", that contains files (with various extensions) and subdirectories. I'm trying to write a PowerShell script that will put each file in "C:\Users\Test" into a uniquely named subdirectory, such as "\001", "\002", etc., while ignoring any existing subdirectories and the files therein. Example:
Before running script:
C:\Users\Test\ABC.xlsx
C:\Users\Test\QRS.pdf
C:\Users\Test\XYZ.docx
C:\Users\Test\Folder1\TUV.gif
After running script:
C:\Users\Test\001\ABC.xlsx
C:\Users\Test\002\QRS.pdf
C:\Users\Test\003\XYZ.docx
C:\Users\Test\Folder1\TUV.gif
Note:
Names, extensions, and number of files will vary each time the script is run on a batch of files. The order in which files are placed into the new numbered subdirectories is not important, just that each subdirectory has a short, unique name. I have another script that will apply a consistent sequential naming convention for all subdirectories, but first I need to get all files into separate folders while maintaining their native file names.
This is where I'm at so far:
$id = 1
Get-ChildItem | where {!$_.PsIsContainer} | % {
    MD ($_.root + ($id++).tostring('000'));
    MV $_ -Destination ($_.root + (001+n))
}
The MD expression successfully creates the subdirectories, but I'm not sure how to write the MV expression to actually move the files into them. I've written (001+n) to illustrate the concept I'm going for, where n would increment from 0 up to the total number of files. Or perhaps an entirely different approach is needed.
$id = 1
Get-ChildItem C:\Test\ -File | % {
    # Create the next numbered folder, then move the current file into it
    New-Item -ItemType Directory -Path C:\Test -Name $id.ToString('000')
    Move-Item $_.FullName -Destination C:\Test\$($id.ToString('000'))
    $id++
}
Move-Item is what you were looking for.
OK, I think I figured it out. Running the following scripts sequentially produces the desired result. The trick was resetting the $id counter before running the MV expression. This can probably be improved though, so let me know if you have a better way! Edited: @ArcSet has provided a better answer in a single script! Thank you!
$id = 1
Get-ChildItem | where {!$_.PsIsContainer} | % {
    MD ($_.root + ($id++).tostring('000'))
}

$id = 1
Get-ChildItem | where {!$_.PsIsContainer} | % {
    MV $_ -Destination ($_.root + ($id++).tostring('000'))
}
I'm posting this question after extensive searches did not yield a solution to my problem.
Here's the problem: I have a folder in windows, with multiple sub folders. Each of them has 1 or more compressed (rar) folders:
-Master_folder
    sub_folder1
        rarfolder1
    Sub_folder2
        rarfolder1
and so on
Is there a way to extract the folder that sub_folderX (where X varies from 1 to 300) contains into sub_folderX itself, and so on for all the other subfolders?
All the posts/solutions out there on extracting multiple files simultaneously (even using the CLI) talk about extracting everything into a single location, and I saw similar results when experimenting with the WinRAR GUI options.
However, I don't want to put everything in a single location, since the extracted folders have the same name; their location within their outer folder is what differentiates them.
If you are open to scripting, you can recursively iterate over the subfolders using command-line WinRAR (unrar.exe) and a little PowerShell:
# Root drive where the rar files are located
$Directory = "T:\*"
$rar = Get-ChildItem -Path $Directory -Recurse -Include *.rar
foreach ($line in $rar) {
    # Extract each archive into the same folder that contains it
    $unradir = $line.Directory
    $rarFileLocation = $line.VersionInfo.FileName
    & "C:\Program Files (x86)\WinRAR\unrar.exe" e -ro- $rarFileLocation $unradir
}