Modifying folder without modifying timestamp - windows

I'd like to change folder contents without having Date modified change.
I sometimes do cleanup runs on old folders, trying to free space, clean up temp files and dead ends, or add relevant links or tags for findability. When I do this, I don't want the folder's Date modified to change to today, because my folders are sorted by Date modified. A project from 2010 should keep its last-modified timestamp from 2010, because I've only made meta-changes, not actual changes.
Currently I use SK Timestamp or Attribute Changer. I right-click each folder before making changes to it, keep the Properties window open, make my modifications, then hit Apply to restore the original timestamps.
I'd like something more automated: set a Cleanup Mode on the root folder D:\Workspace, and until I clear that state, no timestamps ever get changed in the subdirectories Project2010, Project2013..., Project2015.
Either that, or at least be able to copy the timestamp between two files, like in this answer that mentions COPY /B %1+,, %1, but without using the current date.
Usage like: touch D:\Temp\Project2010\Source.txt Destination.ext

I had commented above with a couple of suggestions -- one involving modifying the system's date / time, and another involving robocopy. The system date / time one is a bit of trouble and requires Internet access, and the robocopy /dcopy:t switch didn't work at all in my tests.
But I found a better solution anyway. Use a for loop to capture the folder's date and time in a variable. Then, after you've made whatever changes you wish, use PowerShell to put the folder's date/time back the way it was.
@echo off
setlocal
set "dir=path\to\directory"
for %%I in ("%dir%") do set "datetime=%%~tI"
:: Do whatever deletions and other maintenance you want here.
:: Then after all changes have completed, reset the dir timestamp.
powershell -command "(Get-Item '%dir%').LastWriteTime=(Get-Date '%datetime%')"


How to make this windows script skip existing files?

I have a windows script here that takes all files in a source folder, whether they're nested or not, and moves them to one single destination folder.
However I have some duplicates within the source folders and every time it comes across one it prompts to copy or skip it. It is quite cumbersome as some of the source folders have a lot of duplicates.
Can someone advise on how to edit this script so it auto-skips any duplicates without prompting me every time, or maybe even moves both and just appends a suffix to the duplicate's name? Either one would work. I'm still new to the whole batch-script scene and it's driving me nuts.
The script I have is:
FOR /R "C:\source folder" %i IN (*) DO MOVE "%i" "C:\destination folder"
Thank you!

Windows Server: audit space usage (latest files/folders added and size)

I work in a design agency (designer, little knowledge of programming). We have a Windows Server machine (Windows Storage Server 2008) that holds all our jobs. It's going to go under maintenance soon, because it got really slow after a thunderstorm.
However, I'd like to know if there's a native way, or something that can be done, to find out the latest files/folders created and the space they use, because in the last month there's been a huge increase in space usage, coming to something like 20GB in a single day (it shouldn't be more than 4GB/day).
I'm looking for a way to find which jobs carry this unnecessary extra space and help people work the right way.
Thank you!
There's no simple way to do this that I know of. However, if you're OK with using the command prompt (on the server, of course), then this answer from @learnScrapy at Super User (with a few tweaks) should do the trick.
Something like:
forfiles /P D:\ /M *.* /S /D +01/17/2012 /C "cmd /c if @fsize gtr 209715200 echo @path @fsize @fdate @ftime"
You'd need to change D:\ to whatever path you store all your design work at; change the date (01/17/2012) to a date you know the issue wasn't present yet (its order may also need adjusting to the server's regional settings: MM/DD/YYYY vs. DD/MM/YYYY vs. ...); and finally change the value 209715200 to whatever minimum size you'd like to search files for (that value is 200MB).
If your projects span multiple files, though, and each is smaller than the size you choose but there's an awful lot of them adding up to several GB, this won't find them. It only reports files newer than the date you specified and larger than the size you specified under the path you specified.

Windows remembering lower case filename, how to force it to forget?

Here's my problem:
I've got source files I'm publishing (.dita files, publishing using Oxygen) and I need to change capitalization on a lot of them, along with folders and subfolders that they're in. Everything is in source control using SVN.
When I change only an initial cap, say, and leave everything about the filename the same otherwise, Windows "remembers" the lower case name, and that's what gets published, even though the source name is now upper case.
I can even search for the filename, for example Foobar.dita, and the search results will show me "foobar.dita". When I go to that location directly in the file explorer, the file is named Foobar.dita. It's not a duplicate, it's the same file.
What I understand from reading up on this is that Windows isn't case-sensitive, but it "remembers" the filename as one case or the other. So my question is, if I can't force Windows to be case-sensitive, can I somehow force Windows to forget the filename? I've tried deleting it from both Windows and SVN, and recreating it, but it still gets read as lower case when it's initial cap.
If I rename the file, even slightly, it solves the problem, but many of the filenames are just what they need to be, and it's a lot more work to rename them (to think of another good filename) than just to change to initial cap.
UPDATE:
Here's where I read about the "remembering" idea, in response two, the one with 7 recommendations.
To be explicit: I'm not updating from SVN and thus turning it back to lower case, it's upper case in SVN. It appears upper case in the Windows folder.
UPDATE II: This seems to be what I'm up against:
http://support.microsoft.com/kb/100625
In NTFS, you can create unique file names, stored in the same directory, that differ only in case. For example, the following filenames can coexist in one directory on an NTFS volume:
CASE.TXT
case.txt
case.TXT
However, if you attempt to open one of these files in a Win32 application, such as Notepad, you would only have access to one of the files, regardless of the case of the filename you type in the Open File dialog box.
So it sounds like the only answer is to rename the files outright, not just change their case.

Tfs2010: How do I get the server path of the source file in a rename operation using a pending change in a shelf?

When I perform tf rename $/Project/Main/File1.cs $/Project/Main/File2.cs in TFS2010, I know that once I check in there will be a "rename" change on the $/Project/Main/File2.cs slot, and a "delete, source rename" change on the $/Project/Main/File1.cs slot.
However, while the changes are still pending, only the rename change exists as a pending change. Nothing is displayed in Pending Changes to indicate that $/Project/Main/File1.cs is being renamed. In fact, if you execute tf status $/Project/Main/File1.cs, tf.exe claims there are no pending changes, which is totally false.
In my situation, I have a series of about 100 files that I manually merged as part of a branch integration operation. Following a re-execution of the tf merge command at the command line, I am trying to undo, in my workspace, the files my shelved changes apply to so that I can unshelve the merged changes.
However, the Tfs object model's PendingChange objects can supply me only with the ServerPath, which refers to the "source rename" item, not the "rename" item. I am at a loss about how I can trace my shelved pending changes to the items that would need to be undone in my workspace.
How can I get the original pre-rename server path for items in a shelf that have been renamed?
You could get a specific version from a point in time when the files were mapped correctly and take note of the paths there.
Also, you could try tf rollback. If you supply the changeset where you manually merged those 100 files, you will at the very least get a list of all relevant files when you roll back (you don't have to check in the rollback; just use it to visualize the files). From there, you should be able to figure out the related files.

Are windows file creation timestamps reliable?

I have a program that uses save files. It needs to load the newest save file, but fall back on the next newest if that one is unavailable or corrupted. Can I use the Windows file-creation timestamp to tell the order in which they were created, or is this unreliable? I'm asking because the "changed" timestamps seem unreliable. I can embed the creation time/date in the name if I have to, but it would be easier to use the file system dates if possible.
If you have a directory full of arbitrary and randomly named files and 'time' is the only factor, it may make more sense to encode the timestamp in the filename itself, eliminating the need for tools to view it.
2008_12_31_24_60_60_1000
Would be my recommendation for a flatfile system.
Sometimes if you have a lot of files, you may want to group them, ie:
2008/
2008/12/
2008/12/31
2008/12/31/00-12/
2008/12/31/13-24/24_60_60_1000
or something larger
2008/
2008/12_31/
etc etc etc.
(Moreover, if you're not embedding the time, what is your other distinguishing characteristic? You can't have a null file name, and creating monotonically increasing sequences is much harder. Need more info.)
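The flat-file naming suggested above can be generated straight from the clock; a minimal Python sketch (the underscore layout follows the example filename, with a 3-digit millisecond field, so names sort lexicographically in time order):

```python
from datetime import datetime

def timestamp_name(dt=None):
    """Build a filename-safe, lexicographically sortable timestamp
    such as 2008_12_31_23_59_59_999."""
    dt = dt or datetime.now()
    # year..second via strftime, then milliseconds appended by hand
    return dt.strftime("%Y_%m_%d_%H_%M_%S_") + f"{dt.microsecond // 1000:03d}"
```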
What do you mean by "reliable"? When you create a file, it gets a timestamp, and that works. Now, the resolution of that timestamp is not necessarily high: on FAT16 it was 2 seconds, I think; on FAT32 and NTFS it is probably 1 second. So if you are saving your files at a rate of less than one per second, you should be good there. Keep in mind that the user can change the timestamp value arbitrarily. If you are concerned about that, you'll have to embed the timestamp in the file itself (although in my opinion that would be overkill).
Of course if the user of the machine is an administrator, they can set the current time to whatever they want it to be, and the system will happily timestamp files with that time.
So it all depends on what you're trying to do with the information.
Windows timestamps are in UTC, so if your timezone changes (i.e., when daylight saving starts or ends) the displayed timestamp will move forward or back an hour. Apart from that, and the accuracy of about 2 seconds, there is no reason to think the timestamps are invalid, and it's certainly OK to use them. But I think it's bad practice when you can simply put the timestamp in the name, or even in the file itself.
What if the system time is changed for some reason? It seems handy, but perhaps some other version number counting up would be better.
Added: A similar question, but with databases, here.
I faced some issues with the created time of a file after deleting and recreating it under the same name.
Something similar to this comment in GetFileInfoEx docs
Problem getting correct Creation Time after file was recreated
I tried to use GetFileAttributesEx and then get ftCreationTime field of
the resulting WIN32_FILE_ATTRIBUTE_DATA structure. It works just fine
at first, but after I delete file and recreate again, it keeps giving
me the original already incorrect value until I restart the process
again. The same problem happens for FindFirstFile API, as well. I use
Window 2003.
This is said to be related to something called tunnelling.
Try using this when you want to rename the file:
Path.Combine(ArchivedPath, currentDate + " " + fileInfo.Name)
