I have recently used iCloud for Windows (10) to download all pictures from iCloud onto my local machine. I then wanted to create a backup so I copy-pasted all (15'000) pictures onto an external hard disk.
I noticed that because of the copy-paste action, the "Date Created" has -- in hindsight obviously -- been changed from the date the picture was taken to the date/time of the copy action.
Since the copy action, and before I noticed the changed dates, I have put many hours into sorting pictures into subfolders, etc. I would now like to put the original date/time back into the metadata.
My idea is to make a directory listing of the original iCloud archive, returning filename, MD5 hash, and Date Created. I then want to write a script (PowerShell?) to find the matching file in my subfolders and update the date.
A few questions:
Does this seem like the best approach?
Is there a better way to copy photos in the future that keeps the original Date Created?
Any pointers in the right direction, including whether PowerShell is a good fit for this, would be greatly appreciated.
Manually changing timestamps using PowerShell
To change the Date Created, Date Modified, and Date Accessed file properties using PowerShell, I'd recommend checking out this website here. It explains how to change the Date Created/Modified/Accessed file properties for a single file, for all files in a given folder, or even for a folder itself. For example:
For a single file:
To set the date created time for "filename.txt" to 5 December 2012 at, say, 9:57:05 PM, you'd write:
(Get-Item "C:\Users\Path_to_file\filename.txt").creationtime=$(Get-Date "2012-12-05 21:57:05")
(Note that Get-Date parses the date string according to your system's locale, so a string like "05/12/2012" could mean 5 December or 12 May depending on the machine; the unambiguous yyyy-MM-dd format avoids that.)
Similarly, to set the date modified, you'd write:
(Get-Item "C:\Users\Path_to_file\filename.txt").lastwritetime=$(Get-Date "2012-12-05 21:57:05")
and for date accessed time:
(Get-Item "C:\Users\Path_to_file\filename.txt").lastaccesstime=$(Get-Date "2012-12-05 21:57:05")
You can even set one property equal to another. For example:
To set the date created equal to the date modified, you'd write:
(Get-Item "C:\Users\Path_to_file\filename.txt").creationtime=$(Get-Item "C:\Users\Path_to_file\filename.txt").lastwritetime
For all files in a folder named "Test":
Get-ChildItem -force 'C:\Users\Path_to_folder\Test\' * | ForEach-Object{$_.CreationTime = ("3 August 2019 17:00:00")}
Get-ChildItem -force 'C:\Users\Path_to_folder\Test\' * | ForEach-Object{$_.LastWriteTime = ("3 August 2019 17:10:00")}
Get-ChildItem -force 'C:\Users\Path_to_folder\Test\' * | ForEach-Object{$_.LastAccessTime = ("3 August 2019 17:10:00")}
Note that the -force parameter ensures that hidden files are also included. Also, keep in mind that if you want to change each picture's timestamps to those of its own corresponding original (i.e., if you want to intelligently automate the process), you'll need to write a script that handles each file individually.
Copy files while preserving timestamps (and more)
The simplest tool to use is Windows' own, built-in tool: robocopy. See Microsoft's documentation on the robocopy command here. You just run the program via the command prompt (with administrator privileges).
For your needs, suppose you have images and videos in their original form (the ones that have all the correct timestamps, etc.) in the folder "Original", located at "C:\Users\Person\Desktop\Original", and you'd like to copy all those images and videos to a folder located on your external hard drive, at "D:\Pictures\Copied". The following command would probably work best:
robocopy "C:\Users\Person\Desktop\Original" "D:\Pictures\Copied" *.* /e /copy:DAT /dcopy:DAT /mt:16 /j /xjd /xa:s /r:1 /w:0 /log:"filename_path.txt"
Each argument is explained in detail on Microsoft's documentation page for robocopy, but I'll explain them here too.
The 1st argument specifies the path to the source folder. Please note that there are no trailing backslashes! If you include a backslash at the end, robocopy will misread the input, because the backslash escapes the closing quotation mark.
The 2nd argument specifies the path to the destination folder.
The 3rd argument can actually be multiple arguments. Here, *.* specifies the file or files to be copied. Since wildcard characters (* or ?) are supported, *.* matches all files in the source directory (i.e., everything will be copied). *.* is the default, so if you want all files copied you don't even need to specify it. To copy, say, all files that end in .jpg, you'd write *.jpg. Or, to copy just the two files file1.jpg and file2.mp4, you'd list them explicitly, one after the other: robocopy "C:\Users\Person\Desktop\Original" "D:\Pictures\Copied" file1.jpg file2.mp4 /e /copy:DAT /dcopy:DAT /mt:16 /j /xjd /xa:s /r:1 /w:0 /log:"filename_path.txt".
The 4th argument /e copies subdirectories. This option automatically includes empty directories.
/copy:DAT specifies which file properties to copy. Here, D, A, and T means that the file's Data, Attributes, and Time Stamps properties will be copied. Refer to robocopy's documentation for more details.
/dcopy:DAT does the same as /copy:DAT, but for folders.
/mt:16 creates multi-threaded copies with 16 threads. You can specify an integer between 1 and 128.
/j copies using unbuffered I/O (recommended for large files).
/xjd excludes junction points for directories.
/xa:s excludes "System" files.
/r:1 specifies the number of retries on failed copies. Here, it's set to 1. The default value is 1,000,000 (one million retries)!
/w:0 specifies the wait time between retries, in seconds. Here, it's set to 0, so 0 seconds are spent waiting.
/log:"filename_path.txt" writes the status output to the log file.
Related
When I run this file:
xcopy .\*.odt .\source.zip
I am prompted to specify what source.zip is:
xcopy .\*.odt .\source.zip
Does .\source.zip specify a file name
or directory name on the target
(F = file, D = directory)?
In my case, I want it to find the .odt file, copy it, and place the copy in the same directory under the new name source.zip. Is there an approach to avoid the prompting, since I always want the destination to be treated as a file, not a directory?
Any .odt file (being in fact a .zip archive) is a binary file; see OpenDocument Document Representation:
As a collection of several sub-documents within a package, each of which stores part of the complete document. This is the common
representation of OpenDocument documents. It uses filename extensions
such as .odt, .ott, .ods, .odp ... etc. The package is a
standard ZIP file with different filename extensions and with a
defined structure of sub-documents. Each sub-document within a package
has a different document root and stores a particular aspect of the
XML document. All types of documents (e.g. text and spreadsheet
documents) use the same set of document and sub-document definitions.
Therefore, you need to treat it as a binary file (read copy /?):
copy /B .\*.odt .\source.zip
The above command will work smoothly only if there is just one file with the .odt extension. Otherwise, it will prompt you with Overwrite .\source.zip? (Yes/No/All):. To stay on the safe side:
from the command line: for %G in (.\*.odt) do copy /B "%G" ".\source_%~nG.zip"
from a batch script: for %%G in (.\*.odt) do copy /B "%%G" ".\source_%%~nG.zip"
%~nG (or %%~nG in a batch file) expands to the file name without its extension; for details, read Parameter Extensions.
I am copying a list of files using a prefix (i.e., ABCD*) to match files in a batch script. However, some files that appear to match are being left behind while other files that don't match are getting grabbed.
I ran a dir /X and found that the shortname for a handful of the files didn't match their longname:
4/17/2015 02:04 PM 554 ABCDEF~1.TXT abcdefghijklmnopqrs.txt
4/17/2015 02:08 PM 123 ABCDEF~2.TXT 1234567890.txt
4/17/2015 03:18 PM 233 987654~1.TXT abcdefg123456.txt
Any idea why something like this might happen and how to resolve it?
If your sample data is representative of your actual files, you could specify ABCDEFG* to work around this issue.
EDIT
Since the above suggestion is not an option, you could use FSUTIL to remove all of the 8.3 names.
This command will analyze the files in the current directory (.) and display the changes without actually making them.
fsutil 8dot3name strip /t .
Remove the /t parameter to actually remove the 8.3 names.
You can also run:
fsutil 8dot3name strip
to see all of the options.
Short and long file names are not required to match. The default algorithm is documented here under "How NTFS Generates Short File Names". You can also find it on Wikipedia.
You can change the short file name with
fsutil file setshortname longFileName shortFileName
I need some help with this one.
I have a directory that contains subdirectories for various applications. Let's say the directory is c:\home and each application has a subdirectory named after the application, so we will have
c:\home\app1
c:\home\app2
etc.
These applications write large log files that get recreated every hour, each time in a different subdirectory named according to date and time, like mmddyyyyhh. This directory is created within each application's subdirectory, and a log file with the exact same name will be inside it for each app. So we end up with this:
c:\home\app1\1015201410\app1.log
c:\home\app1\1015201411\app1.log
c:\home\app1\1015201412\app1.log
c:\home\app2\1015201410\app2.log
c:\home\app2\1015201411\app2.log
c:\home\app2\1015201412\app2.log
I want to go through the directories every hour and collect the latest log from each application. In other words, in this instance I want to collect only the following two, as they are the latest (the trailing 12 shows it is the 12th hour):
c:\home\app1\1015201412\app1.log
c:\home\app2\1015201412\app2.log
Now, getting the files one by one is easy enough, but the script would become too long and would need to be edited on a regular basis to allow for new applications added to the directories.
I am able to do the copying, formatting the time/date section etc. I just need to find a way to search through the home directories for all subdirectories containing the latest timedate and then copy a file from it elsewhere.
So I tried this. Note timedateformat has been predefined:
for /D %%d in (c:\home\*\%timedateformat%\*) do (
    for %%f in ("%%d\*.log") do (
        xcopy "%%f" C:\destination\
    )
)
but this obviously does not like the * part and therefore I will get no result.
If anyone is able to assist, I would greatly appreciate it.
for /d %%F in ("c:\home\*") do xcopy "%%F\%timedateformat%\*.log" "c:\destination\"
I have a folder with 552k pictures. (552115 to be precise).
They are from a camera that took X photos every day over a few years, which I would later use to create a timelapse.
The problem is that the guy who set this up configured it to take a picture every two minutes 24/7.
Now I have a bunch of files I need to delete.
I only need about 1-5 pics between 09:00 and 15:00 each day and definitely not those taken at night.
The filename is Name-11-04-01_19-25-17-01.jpg
The filename is Name-YY-MM-DD_HH-MM-SS-MS.jpg
Date and time information is also available in the metadata.
I would prefer if this could be done in Batch or PowerShell, but I'm open to other scripting languages if it can be done easier than using batch/ps.
I appreciate your response! :)
The following should do the trick. You must adjust "yourFolder" to point to the folder containing the files. It assumes all of your files have an extension, which you did not show in your question.
The script creates a SAVE folder, and then attempts to move 4 files for each day into the SAVE folder. It looks for the earliest file each day for the hours of 9, 11, 13, and 15 where the minutes starts with 0.
After the script has run, and you confirm you have what you need, you can delete all remaining files from the original folder if you want, and keep the files in the SAVE folder.
@echo off
setlocal enableDelayedExpansion
pushd "yourFolder"
mkdir save
dir /b /a-d * | findstr /re "_09-0.-..-..\..* _11-0.-..-..\..* _13-0.-..-..\..* _15-0.-..-..\..*" >files.txt
for /f %%F in (files.txt) do (
set "name=%%~nF"
if not exist "save\!name:~0,-9!-*" move "%%F" save
)
You should be able to modify the FINDSTR search patterns to adjust the number of files per day. For example, to keep one file per day, you could use a single string: "_09-0.-..-..\..*". One advantage of this is the light level and direction should be more consistent throughout the time lapse.
I would recommend PowerShell over batch files. I have written the solution here as a PowerShell script.
Obviously test this thoroughly before running it on your prized photos.
Get-ChildItem -Filter "*_*.jpg" | ForEach-Object { $hour = [int]$_.Name.Substring($_.Name.IndexOf("_") + 1, 2); if ($hour -lt 9 -or $hour -ge 15) { Remove-Item $_.FullName } }
I am a Batch-newbie, so please accept my apologies and Thanks in advance !
This "tool" is to automate the slimming down of Windows (XP) by disabling certain system driver, DLL, and EXE files. Instead of outright deletion, I wish to rename them in place, thus "removing" them from the OS without losing sight of where they belong (should any need to be "restored"). Renaming is accomplished by appending a new suffix to the existing filename (e.g., "wdmaud.drv.group_1"). The renaming suffix should be another input variable.
The target-list is approx. 1100 files long (divided into various groups/phases), so manual renaming is out of the question. Each group will be processed in a separate run of the batch file, varying the target-list input file for each execution.
Target-list is plain text file, one filename per line (no other data in the files). Number of entries per group varies. Target list will look like this:
-- example start --
netapi.dll
netcfgx.dll
netdde.exe
netevent.dll
neth.dll
netid.dll
netrap.dll
nic1394.sys
-- example end --
Filenames may be in UPPER, lower, or MiXeD case. The files may be present in more than one folder in the C:\Windows hierarchy, or may not be present at all. If a file is not found anywhere in the system, its name should be written to a text file, one entry per line.
The specific folders of interest are:
C:\WINDOWS\
C:\WINDOWS\system\
C:\WINDOWS\system32\
C:\WINDOWS\system32\dllcache
C:\WINDOWS\system32\drivers
The renaming will be done by connecting the target OS drive to another XP computer, so locked system files should not be a problem.
Any help you can offer will be greatly appreciated.
A double FOR loop may help you. This is a very simple example, just to get you started:
for /f "tokens=*" %%f in (%targetlist%) do (
for /f "tokens=*" %%d in (%dirlist%) do (
if exist "%%d\%%f" echo %%f found in %%d
)
)
see HELP FOR.