Compare filenames and delete one version of the filename - Windows

I have several hundred folders where I will have multiple files called filename.ext, but also another file called filename.ext.url.
I need a way of checking, if filename.ext.url exists, whether filename.ext also exists. If they both exist, delete filename.ext.url.
I can't just do a search and delete all *.url files, as they will be needed if the normal file does not exist.
I then need to repeat that in all subdirectories of a specific directory.
I don't mind if it's a batch script or a PowerShell script that does it, or any other way really. I'm just stumped on how to do what I want.
Currently I'm doing it folder by folder, manually comparing file names, file sizes and file icons.

foreach ($file in ls -Recurse C:\files\*.url) {
    if (ls -ErrorAction Ignore "$($file.PSParentPath)\$($file.BaseName)") {
        Remove-Item $file.FullName -WhatIf
    }
}
Remove -WhatIf when you're ready to actually delete.
BaseName strips the last extension, so if the files are all named *.ext.url it will check whether filename.ext exists. BaseName also removes the path, so we pull the parent path back in with PSParentPath.
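For example (the file name and path here are purely illustrative):

PS> $file = Get-Item 'C:\files\report.pdf.url'   # hypothetical shortcut file
PS> $file.BaseName
report.pdf

So for that file, the check looks for C:\files\report.pdf next to the shortcut.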
An alternative way (that more closely matches what you're describing) is something like:
foreach ($file in ls -Recurse "c:\files\*.url") {
### replacing '.url$' means .url at the end of the line in Regex
if (ls -ErrorAction Ignore ($file.FullName -replace '\.url$')) {
remove-item $file.fullname -whatif
}
}

for /r "startingdirectoryname" %b in (*.url) do if exist "%~dpnb" ECHO del "%b"

This is expected to be executed directly from the prompt. If it is run from a batch file, each % needs to be doubled (i.e. %%b). The %~dpnb modifier expands to the drive, path and name of %b without its final extension, so for whatever.ext.url it yields whatever.ext.
This also assumes that the whatever.ext file is in the same directory as the whatever.ext.url file.
Note that the filenames to be deleted will merely be echoed to the console. To actually delete the files, remove the ECHO keyword.
Test against a test directory first!
[untested]

To check for "filename.ext", you can check for "filename.ext." instead; Windows ignores a trailing dot, so they refer to the same file.
In CMD.EXE, you can do: IF EXIST "filename.ext." CALL :DoIt

Related

How to get the nested paths of all empty Windows directories using cmd

I have a folder with numerous nested folders inside it. Many of the folders are empty, and I have to copy an empty.property file into those empty folders, so that in the end no folder is completely empty: each will contain another folder, some other file(s), or this empty.property. I have tried to get all the paths using dir /b /s, but it returns all the paths, not only the empty ones. Can anyone help me do this efficiently? Thanks.
You can use PowerShell to do it several ways, one being:
Get-ChildItem -Path C:\Temp -Directory |
    Where-Object { $_.GetFileSystemInfos().Count -eq 0 } |
    ForEach-Object -Process { Copy-Item -Path "My\File\to\copy\Path" -Destination $_.FullName }
Basically this checks which directories have no files or folders in them, then pipes each one into a ForEach-Object that processes a Copy-Item request for whatever file/folder you want copied into the empty folder.
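Since the question is about nested folders, a recursive variant may be what's needed; a minimal sketch, assuming the file to copy lives at a hypothetical path C:\source\empty.property:

# -Recurse also visits nested subdirectories, not just the top level
Get-ChildItem -Path C:\Temp -Directory -Recurse |
    Where-Object { $_.GetFileSystemInfos().Count -eq 0 } |
    ForEach-Object { Copy-Item -Path 'C:\source\empty.property' -Destination $_.FullName }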

Automatic copying and renaming of a file as soon as it is in the directory

I'm relatively new to programming so I need some help.
I have 2 directories
C:\test1
C:\test2
test1 will constantly receive new files, which look like this:
testA000_00001.txt0..txt
test00A0_00102.txt1..txt
test00A0_00102_00123.txt45..txt
...
testG000_00999.txt999..txt
testH000_00013.txt0..txt
Since it's essential that the files in test1 stay exactly the way they are, I need copies of them in test2.
And since test2 needs to hold the current version, the copy has to happen the moment the files land in test1.
But without the .txt0. - .txt999. part:
testA000_00001.txt
test00A0_00102.txt
test00A0_00102_00123.txt
...
testG000_00999.txt
testH000_00013.txt
It's also essential that these files are only copied once, since they aren't going to stay in test2 for long.
I tried it with xcopy and some other variants of copy, but each time the files get copied back into test2: after I move the files out of test2, they are copied into it again.
(Sorry, I can't comment yet.)
I don't have exact code (working on it). What I think would work is to:
Check a text file for things it already copied (see the sketch after this list)
Schedule a task to run every minute or so
Move to the folder you want
Use the dir command to take all files in the current directory
Copy them
Write their names to the text file
Change the names
Loop
When I get the code done I will edit.
Also, moving the files as soon as they enter the directory is very resource intensive, but it might be possible if the code is very efficient.
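A minimal sketch of the "already copied" log idea (the log path C:\test\copied.log is an assumption, and this is separate from the full script below):

$log = 'C:\test\copied.log'   # assumed location for the log of copied names
# names we have copied before; empty list on the first run
$done = if (Test-Path $log) { Get-Content $log } else { @() }
foreach ($file in Get-ChildItem -File 'C:\test\test1') {
    if ($done -notcontains $file.Name) {
        Copy-Item $file.FullName -Destination 'C:\test\test2'
        Add-Content $log $file.Name   # remember it so it is only copied once
    }
}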
# make a temporary working folder
md "C:\test\test3"
# move to test1
cd "C:\test\test1"
# presets
$txt = ".txt"
# get the files in the current directory (dir already returns one object per file)
$files = dir -File
# for each file it found
foreach ($line in $files) {
    # copy the file from test1 to test3
    Copy-Item "C:\test\test1\$($line.Name)" -Destination "C:\test\test3"
    # take the first 14 chars from the file name
    $line_name = $line.Name.Substring(0, 14)
    # add a .txt extension
    $line_name = $line_name + $txt
    # rename the file in test3
    Rename-Item -Path "C:\test\test3\$($line.Name)" -NewName "$line_name"
    # move the file to test2; if it already exists, replace (update) it
    Move-Item -Path "C:\test\test3\$line_name" -Destination "C:\test\test2" -Force
}
# remove the temporary working folder
Remove-Item -Path "C:\test\test3" -Recurse -Force
This code is super close to working, but I might not be able to finish today.
It only fails on the file test00A0_00102_00123.txt45..txt, because its real name is longer than the 14 characters the Substring keeps. I will update the script when it works fully.
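A fixed-width Substring can't handle that longer name. One alternative, assuming every incoming name follows the .txtN..txt pattern shown in the examples, is to strip the suffix with a regex instead:

foreach ($file in Get-ChildItem -File "C:\test\test1") {
    # turn e.g. test00A0_00102_00123.txt45..txt into test00A0_00102_00123.txt
    $newName = $file.Name -replace '\.txt\d+\.\.txt$', '.txt'
    Copy-Item $file.FullName -Destination (Join-Path "C:\test\test2" $newName) -Force
}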

PowerShell: Place all files in a specified directory into separate, uniquely named subdirectories

I have an existing directory, let's say "C:\Users\Test", that contains files (with various extensions) and subdirectories. I'm trying to write a PowerShell script that will put each file in "C:\Users\Test" into a uniquely named subdirectory, such as "\001", "\002", etc., while ignoring any existing subdirectories and the files therein. Example:
Before running script:
C:\Users\Test\ABC.xlsx
C:\Users\Test\QRS.pdf
C:\Users\Test\XYZ.docx
C:\Users\Test\Folder1\TUV.gif
After running script:
C:\Users\Test\001\ABC.xlsx
C:\Users\Test\002\QRS.pdf
C:\Users\Test\003\XYZ.docx
C:\Users\Test\Folder1\TUV.gif
Note:
Names, extensions, and number of files will vary each time the script is run on a batch of files. The order in which files are placed into the new numbered subdirectories is not important, just that each subdirectory has a short, unique name. I have another script that will apply a consistent sequential naming convention for all subdirectories, but first I need to get all files into separate folders while maintaining their native file names.
This is where I'm at so far:
$id = 1
Get-ChildItem | where {!$_.PsIsContainer} | % {
    MD ($_.root + ($id++).tostring('000'));
    MV $_ -Destination ($_.root + (001+n))
}
The MD expression successfully creates the subdirectories, but I'm not sure how to write the MV expression to actually move the files into them. I've written (001+n) to illustrate the concept I'm going for, where n would increment from 0 to the total number of files. Or perhaps an entirely different approach is needed.
$id = 1
Get-ChildItem C:\Test\ -File | % {
    New-Item -ItemType Directory -Path C:\Test -Name $id.ToString('000')
    Move-Item $_.FullName -Destination C:\Test\$($id.ToString('000'))
    $id++
}
Move-Item is what you were looking for.
Ok, I think I figured it out. Running the following scripts sequentially produces the desired result. The trick was resetting the $id increment before running the MV expression. This can probably be improved though, so let me know if you have a better way! Edit: @ArcSet has provided a better answer in a single script! Thank you!
$id = 1
Get-ChildItem | where {!$_.PsIsContainer} | % {
    MD ($_.root + ($id++).tostring('000'))
}

$id = 1
Get-ChildItem | where {!$_.PsIsContainer} | % {
    MV $_ -Destination ($_.root + ($id++).tostring('000'))
}

Recursively replace certain text in all file names matching a certain pattern with other text

I have certain files contained in different directories inside a parent directory. Some of these files are prefixed with certain text. I want to replace this text with other text using PowerShell. I have tried the code below, but no luck: PowerShell outputs as if the file names had been renamed, but when I checked back in the directory, the change was not actually reflected:
Your ForEach-Object loop just takes the name of the files, replaces the prefix, then echoes the modified string. Use Rename-Item instead of ForEach-Object:
Get-ChildItem abc* | Rename-Item -NewName { $_.Name.Replace('abc', 'uvw') }
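Since the files sit in different directories under the parent, a recursive variant may be needed. A sketch, with C:\parent standing in for the actual parent directory and abc/uvw for the real prefixes:

# -Recurse descends into subdirectories; ^ anchors the pattern so only
# the leading "abc" is replaced, not an occurrence elsewhere in the name
Get-ChildItem -Path C:\parent -Filter 'abc*' -Recurse -File |
    Rename-Item -NewName { $_.Name -replace '^abc', 'uvw' }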

Recursive "touch" on fileserver

There's something I want to accomplish with administrating my Windows file server:
I want to change the "Last Modified" date of all the folders on my server (just the folders and subfolders, not the files within them) to be the same as the most recent "Created" (or maybe "Last Modified") date file within the folder. (In many cases, the date on the folder is MUCH newer than the newest file within it.)
I'd like to do this recursively, from the deepest subfolder to the root. I'd also like to do this without me manually entering any dates and times.
I'm sure that with a combination of scripting and a Windows port of "touch" I could maybe accomplish this. Do you have any suggestions?
This closed topic seems really close, but I'm not sure how to touch only the folders without touching the files inside, or how to get the most recent file's date: Recursive touch to fix syncing between computers
If it's for backup purposes, in Windows there is the Archive flag (instead of modifying the timestamp). You can set it recursively with ATTRIB /S (see ATTRIB /?)
If it is for other purposes, you can use a touch.exe implementation together with a recursive FOR:
FOR /R (see FOR /?)
http://ss64.com/nt/for_r.html
http://ss64.com/nt/touch.html
I think that you can do this in PowerShell. I just tried throwing something together and it seems to work correctly. You could invoke this in PowerShell using Set-DirectoryMaxTime(".\Directory") and it will operate recursively on each directory under that.
function Set-DirectoryMaxTime([System.IO.DirectoryInfo]$directory)
{
    # Grab a list of all the files in the directory
    $files = Get-ChildItem -File $directory

    # Get the current CreationTime of the directory we are looking at
    $maxdate = Get-Date $directory.CreationTime

    # Find the most recently edited file's LastWriteTime
    foreach ($file in $files)
    {
        if ($file.LastWriteTime -gt $maxdate) { $maxdate = $file.LastWriteTime }
    }

    # This needs to be in a try/catch block because there is a reasonable chance
    # of it failing if a folder is currently in use
    try
    {
        # Give the directory a LastWriteTime equal to the newest file's LastWriteTime
        $directory.LastWriteTime = $maxdate
    } catch {
        # One of the directories could not be updated
        Write-Host "Could not update directory: $directory"
    }

    # Get all the subdirectories of this directory
    $subdirectories = Get-ChildItem -Directory $directory

    # Jump into each of the subdirectories and do the same thing
    foreach ($subdirectory in $subdirectories)
    {
        Set-DirectoryMaxTime($subdirectory)
    }
}
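The function above updates a folder before descending into it, so a parent's timestamp only reflects its own files. If the "deepest subfolder to the root" ordering matters, a post-order sketch along these lines (Set-DirectoryMaxTimeDeepestFirst is a made-up name) would also let parents pick up the times of their freshly updated subfolders:

function Set-DirectoryMaxTimeDeepestFirst([System.IO.DirectoryInfo]$directory)
{
    # Recurse into the children first, so their LastWriteTimes are already updated
    foreach ($subdirectory in Get-ChildItem -Directory $directory)
    {
        Set-DirectoryMaxTimeDeepestFirst($subdirectory)
    }
    # Newest LastWriteTime among the directory's files and (updated) subfolders
    $newest = Get-ChildItem $directory | Sort-Object LastWriteTime | Select-Object -Last 1
    if ($newest)
    {
        try { $directory.LastWriteTime = $newest.LastWriteTime }
        catch { Write-Host "Could not update directory: $directory" }
    }
}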
