I'm currently sorting a folder structure that contained copy after copy of the same files.
Duplicates have already been eliminated.
I have a folder structure of I:\Images\YYYY\MM filled with NEF, TIF, and JPG files.
I noticed too late (after sorting them into the new structure) that the sidecar XMP files had not been written beforehand and were therefore not sorted along with the image files.
I've since written the sidecar files separately from a backup, so they have different creation dates than the original images but exactly the same filenames apart from the extension.
All these sidecar files are now in I:\xmp
Is there a way to copy only the files with matching filenames into the corresponding subfolders?
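One way, if a small script is acceptable: a minimal sketch in Go that walks the image tree and, for every image, copies the same-named .xmp from I:\xmp into the folder next to it. The paths and extensions come from the question; everything else is an assumption, so try it on a copy first.

```go
// Copy I:\xmp\<name>.xmp next to the image file of the same
// base name anywhere under I:\Images\YYYY\MM.
package main

import (
	"fmt"
	"io"
	"os"
	"path/filepath"
	"strings"
)

func copyFile(src, dst string) error {
	in, err := os.Open(src)
	if err != nil {
		return err
	}
	defer in.Close()
	out, err := os.Create(dst)
	if err != nil {
		return err
	}
	defer out.Close()
	_, err = io.Copy(out, in)
	return err
}

func main() {
	xmpDir := `I:\xmp`
	imgRoot := `I:\Images`

	err := filepath.WalkDir(imgRoot, func(path string, d os.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		ext := strings.ToLower(filepath.Ext(path))
		if ext != ".nef" && ext != ".tif" && ext != ".jpg" {
			return nil
		}
		// Look for a sidecar with the same base name in I:\xmp.
		base := strings.TrimSuffix(filepath.Base(path), filepath.Ext(path))
		src := filepath.Join(xmpDir, base+".xmp")
		if _, err := os.Stat(src); err != nil {
			return nil // no sidecar for this image
		}
		dst := filepath.Join(filepath.Dir(path), base+".xmp")
		fmt.Println(src, "->", dst)
		return copyFile(src, dst)
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```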
Related
I have files located in different folders, and with the help of a program I pulled files out of all these folders while leaving copies behind.
Having worked with the files, I now want to put them back, but there is a problem: arranging the files into their folders by hand would take a very long time.
So I'm looking for a way to arrange the files quickly.
Since the copies have the same names as the originals, I would like to simply replace the originals by matching names.
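If scripting is an option, here is a rough sketch in Go with placeholder paths (C:\work for the edited copies and C:\library for the original tree are both assumptions): it indexes every file in the tree by bare name, then overwrites each one that has a same-named file in the working folder.

```go
// Replace files in a destination tree with same-named files
// from a flat working folder. Paths below are placeholders.
package main

import (
	"fmt"
	"io"
	"os"
	"path/filepath"
)

func overwrite(src, dst string) error {
	in, err := os.Open(src)
	if err != nil {
		return err
	}
	defer in.Close()
	out, err := os.Create(dst) // truncates the existing file
	if err != nil {
		return err
	}
	defer out.Close()
	_, err = io.Copy(out, in)
	return err
}

func main() {
	workDir := `C:\work`     // folder holding the edited copies (assumption)
	destRoot := `C:\library` // original folder tree (assumption)

	// Index every file in the destination tree by its bare name.
	targets := map[string][]string{}
	filepath.WalkDir(destRoot, func(path string, d os.DirEntry, err error) error {
		if err == nil && !d.IsDir() {
			name := filepath.Base(path)
			targets[name] = append(targets[name], path)
		}
		return err
	})

	// Overwrite each target that has a same-named edited copy.
	entries, err := os.ReadDir(workDir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		for _, dst := range targets[e.Name()] {
			if err := overwrite(filepath.Join(workDir, e.Name()), dst); err != nil {
				fmt.Fprintln(os.Stderr, err)
			}
		}
	}
}
```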
I have a main folder filled with 30 subfolders, each of them containing 13 subfolders filled with JPEG images.
Their total size is too big for uploading and distributing through links, so I need to bulk-compress them while maintaining the folder structure. Is there any way to do this, probably using a batch script or something similar?
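A scripted route is also possible. A rough sketch in Go using the standard image/jpeg package, re-encoding each JPEG at a lower quality into a mirrored tree (the root paths and the quality value are assumptions):

```go
// Re-encode every JPEG under srcRoot at lower quality,
// mirroring the folder structure under dstRoot.
package main

import (
	"fmt"
	"image/jpeg"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	srcRoot := `C:\photos`            // assumption
	dstRoot := `C:\photos-compressed` // assumption
	quality := 60                     // assumption: 1-100, lower = smaller

	err := filepath.WalkDir(srcRoot, func(path string, d os.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		ext := strings.ToLower(filepath.Ext(path))
		if ext != ".jpg" && ext != ".jpeg" {
			return nil
		}
		// Recreate the relative path under the destination root.
		rel, _ := filepath.Rel(srcRoot, path)
		dst := filepath.Join(dstRoot, rel)
		if err := os.MkdirAll(filepath.Dir(dst), 0o755); err != nil {
			return err
		}
		in, err := os.Open(path)
		if err != nil {
			return err
		}
		defer in.Close()
		img, err := jpeg.Decode(in)
		if err != nil {
			return err
		}
		out, err := os.Create(dst)
		if err != nil {
			return err
		}
		defer out.Close()
		return jpeg.Encode(out, img, &jpeg.Options{Quality: quality})
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```

Note that a plain re-encode like this drops EXIF metadata; a dedicated tool handles that better.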
I ended up using a software called Caesium. It did what I needed without any complications.
I downloaded it from this link:
https://saerasoft.com/caesium
Thanks for the answers
So I am looking for a tool that can compare files in folders based on checksums (this is common, not hard to find). However, my use case is that the files can live in fairly deep folder paths that change; I am expected to compare the sets every few months and ONLY create a package of the files that differ. I don't care what folders the files are in: the same file can move between folders regularly, and the files don't change names much, only content (so checksums are a must).
My issue is that almost all of the tools I can find do care about the folder paths when they compare folders; I don't, and I actually want them to ignore the folder paths. I'd rather not develop anything, or at least only have to develop a small part of the process, to save time.
To be clear, the order in which I'm looking for things to happen is:
Program scans the directory from 1/1/2020 (A).
Program scans the directory from 4/1/2020 (B).
Program finds all files whose checksums in B don't exist in A and creates a new folder with the differences (C).
Any ideas?
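If developing a small part of the process is acceptable, the core is short. A sketch in Go using SHA-256 (the folder names A, B, and C mirror the question and are assumptions): hash everything under A, then copy into C each file under B whose checksum was never seen in A.

```go
// Copy into C every file under B whose SHA-256 does not
// appear anywhere under A, ignoring folder paths entirely.
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"os"
	"path/filepath"
)

func hashFile(path string) (string, error) {
	f, err := os.Open(path)
	if err != nil {
		return "", err
	}
	defer f.Close()
	h := sha256.New()
	if _, err := io.Copy(h, f); err != nil {
		return "", err
	}
	return hex.EncodeToString(h.Sum(nil)), nil
}

func main() {
	a, b, c := "A", "B", "C" // scan roots and output folder (assumptions)

	// Collect every checksum present in A.
	seen := map[string]bool{}
	filepath.WalkDir(a, func(path string, d os.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		sum, err := hashFile(path)
		if err != nil {
			return err
		}
		seen[sum] = true
		return nil
	})

	// Copy files from B whose checksum is new into C (flat).
	// Note: same-named files from different folders will collide in C.
	os.MkdirAll(c, 0o755)
	filepath.WalkDir(b, func(path string, d os.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		sum, err := hashFile(path)
		if err != nil || seen[sum] {
			return err
		}
		data, err := os.ReadFile(path)
		if err != nil {
			return err
		}
		fmt.Println("new:", path)
		return os.WriteFile(filepath.Join(c, filepath.Base(path)), data, 0o644)
	})
}
```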
I am using goroutines to concurrently download data from S3. For context, I currently have a group of samples. Each sample contains data in the form of a map whose keys are file names and whose values point to paths in S3. Each sample has about 10 files that need to be downloaded from S3. I download all of these files in parallel and write to a shared zip file object (I've got the mutexes and such figured out). I've figured out the concurrency aspect of this problem, but the issue I face is organizing the zip file object. I was wondering if it is possible to create a subdirectory within a zip file object; otherwise, I'm left with a massive zip object of all the data I need that is not organized in any tangible way. Ideally, I'd be able to create a folder in the zip file object for each sample and save all of that sample's file data there, but I don't know if that's possible.
The ZIP format has no real notion of a folder or directory; an archive is essentially just a list of files.
The file names may be composed so that they contain folder paths, so the folders are only "virtual": they are implied by the names rather than recorded the way they are in real file systems.
So no, you can't create a directory in a zip file as a distinct object, but you get the same effect by putting the path into each entry's name.
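For the Go case in the question specifically, the standard archive/zip package follows exactly this model, so passing a slash-separated name to (*zip.Writer).Create is all it takes. A small sketch (the sample and file names are made up):

```go
package main

import (
	"archive/zip"
	"log"
	"os"
)

func main() {
	out, err := os.Create("samples.zip")
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()

	zw := zip.NewWriter(out)
	// The "sample1/" prefix in the entry name acts as a virtual
	// folder; archive tools will display it as a directory.
	w, err := zw.Create("sample1/reads.dat") // made-up names
	if err != nil {
		log.Fatal(err)
	}
	w.Write([]byte("file contents here"))

	if err := zw.Close(); err != nil {
		log.Fatal(err)
	}
}
```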
Suppose I have a folder with a few files (images, texts, whatever; it only matters that there are multiple files) and the folder is rather large (> 100 MB). Now I want to update five files in this folder, but I want to do it atomically. Normally I would just create a temporary folder, write everything into it, and, if that succeeds, replace the existing folder with it. But because I/O is expensive, I don't really want to go that way: resaving hundreds of files just to update five seems like huge overhead. So how am I supposed to write these five files atomically? Note that I want the writing of all the files to be atomic as a group, not each file separately.
You could adapt your original solution:
Create a temporary folder full of hard links to the original files.
Save the five new files into the temporary folder.
Delete the original folder and move the folder of hard links in its place.
Creating a few links should be speedy, and it avoids rewriting all the files.
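A sketch of those steps in Go, with placeholder folder names, assuming everything lives on one volume (hard links and rename don't cross file systems). One subtlety: the files being replaced must not be hard-linked first, because writing through a hard link would clobber the originals.

```go
// Update a few files in a large folder via a hard-linked staging copy.
package main

import (
	"log"
	"os"
	"path/filepath"
)

func main() {
	dir := "data"     // the big folder (placeholder)
	tmp := "data.tmp" // staging folder, must be on the same volume
	old := "data.old" // where the original goes during the swap

	// The five updates (names and contents are placeholders).
	newFiles := map[string][]byte{
		"a.txt": []byte("new contents"),
		// ...
	}

	// 1. Fill a temporary folder with hard links to the originals.
	//    Skip the files being replaced: both names of a hard link
	//    share one inode, so writing through the link would also
	//    overwrite the original.
	if err := os.Mkdir(tmp, 0o755); err != nil {
		log.Fatal(err)
	}
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatal(err)
	}
	for _, e := range entries {
		if _, replaced := newFiles[e.Name()]; replaced {
			continue
		}
		if err := os.Link(filepath.Join(dir, e.Name()), filepath.Join(tmp, e.Name())); err != nil {
			log.Fatal(err)
		}
	}

	// 2. Write the new files into the staging folder.
	for name, data := range newFiles {
		if err := os.WriteFile(filepath.Join(tmp, name), data, 0o644); err != nil {
			log.Fatal(err)
		}
	}

	// 3. Swap. Each rename is atomic on its own; the pair is not,
	//    but no file contents are copied and the window is tiny.
	if err := os.Rename(dir, old); err != nil {
		log.Fatal(err)
	}
	if err := os.Rename(tmp, dir); err != nil {
		log.Fatal(err)
	}
	os.RemoveAll(old)
}
```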