When does an unresolved merge conflict happen? - visual-studio-2010

I've been digging through Visual Studio's settings and I found a default task list comment token, UnresolvedMergeConflict, that has high priority. Since the keyword is written without spaces, I guess it is inserted automatically during some (merging?) process. I'd like to know when and how this token may be inserted, and whether it's safe to delete it from that list.

Generally speaking, a software configuration management system (see e.g. http://en.wikipedia.org/wiki/Software_configuration_management) is used to keep track of source files (and other files) as a project progresses. When file versions are merged (see http://en.wikipedia.org/wiki/Merge_%28revision_control%29), some changes can be merged automatically, but some cannot, hence the conflict warning. Such conflicts typically occur when two different changes have been made to the same line (or to adjacent lines), with both changes compared against a common ancestor version of the file.
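When the tool cannot reconcile the two sides, it typically writes both versions into the file between marker lines and leaves the choice to you. As a hedged illustration (Git-style markers shown; the exact markers, and whether a comment containing a token like UnresolvedMergeConflict gets added, depend on the tool doing the merge):

    <<<<<<< yours
    timeout := 30
    =======
    timeout := 60
    >>>>>>> theirs

The point of a high-priority task list token is simply that any comment containing it shows up prominently in Visual Studio's Task List until the conflict is resolved. Deleting the token from the settings only stops such comments from being flagged; it does not change how merging itself behaves.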

Related

Find next file (but not FindNextFile)

If a user opens a file in a program (for example using GetOpenFileNameW, DragQueryFileW, a command-line argument, or whatever else to get the path, followed by a CreateFileW call), is there a way to find the next file in the parent directory of the opened file?
The obvious solution is to cycle through the results from FindNextFileW or NtQueryDirectoryFileEx until the opened file is encountered, and just open the next file (a sketch of this approach follows below).
However, this seems undesirable.
First, because these functions use paths (instead of, for example, a handle), the original file is decoupled from the search algorithm, so the original file might not even be encountered in that search. This is not much of an issue (failing in this case is the expected outcome), and it could probably be resolved by (temporarily) changing the sharing mode, using LockFile or similar (though I would like to avoid that).
Second, this cycling search would have to be done every time, because the contents of the directory might have changed (retaining hFindFile does not work, because only FindFirstFileW calls NtQueryDirectoryFileEx and enumerates the contents of the directory). That seems like unnecessary work and might even hurt performance (for example, if the directory contains a lot of files).
In theory, any file system has some way of enumerating the files in a directory, meaning there is some ordered data structure of the files' metadata. Getting the next file should then only involve going back from the existing file handle to that file's entry and getting the next entry from that data structure. So there does not seem to be a fundamental reason why this cannot be done more sanely.
I thought maybe there exists a better way to do this somewhere in the WinAPI...
Same question for finding the previous file.
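For reference, here is a minimal sketch of the cycling approach mentioned above, written in Go against the raw Win32 calls (nextFileAfter and the paths are made-up names, not an existing API). Finding the previous file is the same loop, except you remember the last name seen before hitting the target:

    //go:build windows

    package main

    import (
        "fmt"
        "os"
        "strings"
        "syscall"
    )

    // nextFileAfter enumerates dir with FindFirstFileW/FindNextFileW and
    // returns the first regular file that follows target in whatever order
    // the file system reports. An empty result means the target was last,
    // or was never encountered (e.g. renamed or deleted meanwhile).
    func nextFileAfter(dir, target string) (string, error) {
        pattern, err := syscall.UTF16PtrFromString(dir + `\*`)
        if err != nil {
            return "", err
        }
        var fd syscall.Win32finddata
        h, err := syscall.FindFirstFile(pattern, &fd)
        if err != nil {
            return "", err
        }
        defer syscall.FindClose(h)

        seen := false
        for {
            name := syscall.UTF16ToString(fd.FileName[:])
            if seen && fd.FileAttributes&syscall.FILE_ATTRIBUTE_DIRECTORY == 0 {
                return name, nil // first plain file after the target
            }
            if strings.EqualFold(name, target) {
                seen = true
            }
            if err := syscall.FindNextFile(h, &fd); err != nil {
                if err == syscall.ERROR_NO_MORE_FILES {
                    return "", nil
                }
                return "", err
            }
        }
    }

    func main() {
        next, err := nextFileAfter(`C:\some\dir`, "opened.txt") // made-up paths
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            return
        }
        fmt.Println("next:", next)
    }

As the question notes, this re-enumerates the directory on every call; I am not aware of a documented WinAPI call that goes from an open file handle directly to "the next directory entry".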

Fast deletion of files in a Setup project

I am maintaining a Setup project under VS2008. The project contains thousands of files arranged in a hierarchy of folders.
Every now and then, I want to renew part of this hierarchy, which means deleting a number of nodes and reinserting the new content. I need to do that because some of the files are obsolete and must be removed; it is much safer to delete everything than to chase down the obsolete files one by one.
Unfortunately, this is a very tedious task: you can't delete a folder until it is empty, so you have to delete every node in the hierarchy one by one. In addition, for a large project, every deletion takes seconds.
Do you know of a way to speed up or automate that task? Merely erasing lines in the .vdproj file doesn't seem to work.
If you do not feel too queasy about it... this is worth the effort if changes are frequent.
The .vcproj file is in XML format. You could use Windows Explorer to manage your files, and write a small utility that checks for and removes the (then) missing files from your project. The files are marked with tags, as in
    <File
        RelativePath=".\AudioPlayerPane.cpp"
        >
    </File>
You have to remove the entire "File" tag (that's three lines, or more if the file has special compilation options, etc.) up to and including the closing "</File>" tag. In addition, you will want to delete the .suo, .ncb and .cache files.
It's too bad that boost.property_tree's XML parser does not support any encoding other than UTF-8, as I would otherwise suggest using it to recursively walk the .vcproj file; that would make the utility very easy to write and ensure the resulting file is correct. Maybe you can use the encoding capabilities of Notepad++ to manually change the encoding of the file before and after the changes.
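For what it's worth, here is a rough sketch of such a utility in Go (the file name project.vcproj is made up). It leans on the one-tag-per-line layout shown above and treats the project file as plain text, so whatever encoding it uses passes through untouched; it prints every line to stdout except the <File>...</File> blocks whose RelativePath no longer exists on disk:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "strings"
    )

    var relPath = regexp.MustCompile(`RelativePath="([^"]+)"`)

    func main() {
        // Run from the project directory so RelativePath entries resolve.
        data, err := os.ReadFile("project.vcproj") // made-up file name
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        lines := strings.Split(string(data), "\n")
        out := bufio.NewWriter(os.Stdout)
        defer out.Flush()

        for i := 0; i < len(lines); i++ {
            if strings.TrimSpace(lines[i]) != "<File" {
                fmt.Fprintln(out, lines[i])
                continue
            }
            // Buffer the whole <File>...</File> block.
            block := []string{lines[i]}
            for i+1 < len(lines) && strings.TrimSpace(lines[i]) != "</File>" {
                i++
                block = append(block, lines[i])
            }
            // Keep the block only if the referenced file still exists.
            keep := true
            for _, l := range block {
                if m := relPath.FindStringSubmatch(l); m != nil {
                    if _, err := os.Stat(m[1]); os.IsNotExist(err) {
                        keep = false
                    }
                }
            }
            if keep {
                for _, l := range block {
                    fmt.Fprintln(out, l)
                }
            }
        }
    }

Redirect the output to a new file and diff it against the original before trusting it, and delete the .suo/.ncb/.cache files afterwards as noted above.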

Monitor A File For Additions And Get Last Added Line

I'm having trouble monitoring a file for changes. I need to be able to know when a file changes, and when it does, I need the new line that was added. I intend to parse each line and find ones that match certain criteria, and act on information in those lines. I know the expected number of matching lines ahead of time, but I do not know how many lines in total will be added to the file, or where the matching lines will be.
I've tried two packages so far, to no avail.
fsnotify/fsnotify
As far as I can tell, fsnotify can only tell me when a file is modified, not what the details of the modification were. Since I need to know exactly what was added to the file, this is no good for me.
(As a side-question, can this be run in a loop? The example that I tried exited after just one modification. I need to monitor for multiple modifications.)
hpcloud/tail
This package tries to mimic the Unix tail command, but it seems to have its own issues. The output I get includes timestamps and other data, while I just want the added line and nothing else. It also seems to think a file has been modified multiple times even when there was just one edit. The deal breaker, though, is that it does not output the last line if that line is not followed by a newline character.
Delegating to tail
I came across this answer, which suggests delegating this work to the tail command itself, but I need this to work cross-platform (specifically macOS, Linux and Windows), and I don't believe an equivalent command exists on Windows.
How do I go about tackling this?
@user2515526,
Usually a change diff is outside the scope of a file watcher's functionality because, you know, you could change an image, and the watcher would need to keep several MB of diff in memory; and what if there are thousands of files?
However, as bad as it sounds, this may be exactly the way you want to implement it (it depends on your app, of course, but it could be fine for text files), i.e. keeping a map of diffs (one diff per file) since the last modification. I can't say I like it, but it sounds like fsnotify has no support for the changes/diffs you need.
Also, regarding your question about running in a loop, maybe you can get some hints here: https://github.com/kataras/iris/blob/8370d76910cdd8de043753ed81ae080eae8dc798/utils/file.go
It's a framework for building a server that watches for TypeScript file changes, so it sounds similar to your case.
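To make that concrete, here is a minimal sketch (the file name app.log and the assumption that lines are only ever appended are mine): run the watcher in a plain for loop, and on each Write event re-open the file, seek past what you already processed, and scan only the newly appended lines:

    package main

    import (
        "bufio"
        "fmt"
        "io"
        "log"
        "os"

        "github.com/fsnotify/fsnotify"
    )

    func main() {
        watcher, err := fsnotify.NewWatcher()
        if err != nil {
            log.Fatal(err)
        }
        defer watcher.Close()
        if err := watcher.Add("app.log"); err != nil { // made-up file name
            log.Fatal(err)
        }

        var offset int64 // how far into the file we have already read

        for { // loop forever: events keep arriving on the channels
            select {
            case ev, ok := <-watcher.Events:
                if !ok {
                    return
                }
                if ev.Op&fsnotify.Write == 0 {
                    continue
                }
                f, err := os.Open("app.log")
                if err != nil {
                    log.Println(err)
                    continue
                }
                if _, err := f.Seek(offset, io.SeekStart); err == nil {
                    sc := bufio.NewScanner(f)
                    for sc.Scan() {
                        fmt.Println("new line:", sc.Text()) // parse/match here
                    }
                    // Caveat: a line appended without a trailing newline is
                    // reported here once, as a partial line.
                    offset, _ = f.Seek(0, io.SeekCurrent)
                }
                f.Close()
            case err, ok := <-watcher.Errors:
                if !ok {
                    return
                }
                log.Println("watch error:", err)
            }
        }
    }

This sidesteps the diff problem for append-only text files; editors that rewrite or truncate the file would need the offset reset, so treat it as a starting point, not a complete solution.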
Cheers,
-D

Using ReadDirectoryChangesW to read changes to the folder itself (WINDOWS)

From the doc (ReadDirectoryChangesW):
"Retrieves information that describes the changes within the specified directory. The function does not report changes to the specified directory itself."
My question is: What do I use to report changes to the specified directory itself?
I want to be able to capture changes not only to things and sub-things in the folder itself but also to detect for example, when the folder itself has been deleted.
One strategy would be to actually monitor for changes on the parent of the folder I'm really interested in and then use that to generate an event when the folder I'm interested in is deleted. This works but has the potential to generate thousands of 'uninteresting' events.
A second strategy is to have a recursive monitor for stuff under the folder I'm actually interested in, and then a non-recursive monitor on its parent. The non-recursive monitor would then be able to tell me when the real folder of interest is deleted.
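In outline, the non-recursive parent monitor looks something like this (sketched in Go for brevity; the folder names are made up):

    //go:build windows

    package main

    import (
        "fmt"
        "log"
        "syscall"
        "unsafe"
    )

    func main() {
        parent := `C:\data`  // made-up parent directory
        target := "watched"  // made-up folder of interest inside parent

        name, err := syscall.UTF16PtrFromString(parent)
        if err != nil {
            log.Fatal(err)
        }
        // FILE_FLAG_BACKUP_SEMANTICS is required to open a directory handle.
        h, err := syscall.CreateFile(name, syscall.FILE_LIST_DIRECTORY,
            syscall.FILE_SHARE_READ|syscall.FILE_SHARE_WRITE|syscall.FILE_SHARE_DELETE,
            nil, syscall.OPEN_EXISTING, syscall.FILE_FLAG_BACKUP_SEMANTICS, 0)
        if err != nil {
            log.Fatal(err)
        }
        defer syscall.CloseHandle(h)

        buf := make([]byte, 64*1024)
        for {
            var n uint32
            err := syscall.ReadDirectoryChanges(h, &buf[0], uint32(len(buf)),
                false, // non-recursive: direct children of parent only
                syscall.FILE_NOTIFY_CHANGE_DIR_NAME, &n, nil, 0)
            if err != nil {
                log.Fatal(err)
            }
            // Walk the packed FILE_NOTIFY_INFORMATION records in the buffer.
            for off := uint32(0); ; {
                info := (*syscall.FileNotifyInformation)(unsafe.Pointer(&buf[off]))
                fileName := unsafe.Slice(&info.FileName, info.FileNameLength/2)
                if info.Action == syscall.FILE_ACTION_REMOVED &&
                    syscall.UTF16ToString(fileName) == target {
                    fmt.Println("folder of interest was deleted")
                }
                if info.NextEntryOffset == 0 {
                    break
                }
                off += info.NextEntryOffset
            }
        }
    }

The recursive monitor on the folder of interest itself is the same call with the watch-subtree flag set to true on a handle for that folder.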
That second strategy generates fewer events and is the one I would like to use. BUT: it doesn't work 'in process'. That is, if I start monitoring the folder of interest recursively (HANDLE A) and its parent non-recursively (HANDLE B), and then, in the same process, I delete the folder of interest, no removal event is generated for it (even though I can verify from a console that the folder no longer exists). My suspicion is that this is because HANDLE A on the folder is still open; even though I included the "FILE_SHARE_DELETE" flag in the CreateFileW call that gave me HANDLE A, it simply can't work.
Note that 'out of process', i.e. when I delete the folder from a completely separate process, the above strategy does work.
So, what are my options?
Many thanks,
Ben.

Strategies for backing up packages on Mac OS X

I am writing a program that synchronizes files across file systems, much like rsync, but I'm stuck when it comes to handling packages. These are folders that the system identifies as containing a coherent set of files; Pages and Numbers can use packages rather than monolithic files, and applications are actually packages, for example. My problem is that I want to keep the most recent version and also keep a backup copy. As far as I can see, I have two options:
1. I can just treat the whole thing as a regular folder and handle the contents entry by entry.
2. I can look at the modification dates of all the contents and keep the complete folder tree of the version with the most recently modified contents.
I was going for (2), and then I found that the iPhoto library is actually stored as a package, which would mean copying the whole library (tens, or even hundreds, of gigabytes) even if only one photograph was altered.
My worry with (1) is that handling the content files individually might break things. I haven't really come up with a good solution that guarantees the package will still work and won't involve unnecessarily huge backups in some cases. If it is just iPhoto, then I can probably put in a special case, or perhaps change strategy if the package is bigger than some user-specified limit.
Packages are surprisingly mysterious, and what the system treats as a package does not seem to be just a matter of setting an extended attribute on a folder.
It depends on how you treat the "backup" version. Do you keep two versions of each file (the current and first previous), or two versions of the sync snapshot (i.e. if a file hasn't changed between the last two syncs, you only store one version)?
If it's two versions of the sync, packages shouldn't be a big problem: just provide a way to restore the "backup" version, which, if necessary, splices together the changed files from the "backup" with the unchanged files from the current sync. There are some things to watch out for, though: make sure you correctly handle files that are deleted or added between the two snapshots.
If you're storing two versions of each file, things are much more complicated: you need some way to record which versions of the files within the package "go together". In this case I'd be tempted to only store backup versions of files within the package from the last time something within the package changed. So, for example, say you sync a package called preso.key. On the second sync, preso.key/index.apxl.gz and preso.key/splash.png are modified, so the old versions of those two files get stored in the backup. On the third sync, preso.key/index.apxl.gz is modified again, so you store a new backup version of it and remove the backup version of preso.key/splash.png.
BTW, another way to save space would be hard linking. If you want to store two "full" versions of a big package without wasting space, just store one copy of each unchanged file and hard-link it into both backups.
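A hedged sketch of that hard-link idea (the paths are made up, and "unchanged" is judged by size plus modification time, which a real sync tool might refine): walk the current version of the package, copy files that changed since the previous snapshot, and hard-link the ones that did not, so unchanged content exists on disk only once:

    package main

    import (
        "io"
        "log"
        "os"
        "path/filepath"
    )

    // snapshot writes a backup of cur into dst, hard-linking any file
    // that is unchanged relative to the previous snapshot in prev.
    func snapshot(prev, cur, dst string) error {
        return filepath.Walk(cur, func(path string, info os.FileInfo, err error) error {
            if err != nil {
                return err
            }
            rel, err := filepath.Rel(cur, path)
            if err != nil {
                return err
            }
            target := filepath.Join(dst, rel)
            if info.IsDir() {
                return os.MkdirAll(target, info.Mode())
            }
            // Unchanged since the previous snapshot? Link instead of copying.
            old, err := os.Stat(filepath.Join(prev, rel))
            if err == nil && old.Size() == info.Size() && old.ModTime().Equal(info.ModTime()) {
                return os.Link(filepath.Join(prev, rel), target)
            }
            return copyFile(path, target)
        })
    }

    func copyFile(src, dst string) error {
        in, err := os.Open(src)
        if err != nil {
            return err
        }
        defer in.Close()
        out, err := os.Create(dst)
        if err != nil {
            return err
        }
        defer out.Close()
        _, err = io.Copy(out, in)
        return err
    }

    func main() {
        // Made-up layout: backups/1/preso.key is the previous snapshot.
        if err := snapshot("backups/1/preso.key", "live/preso.key", "backups/2/preso.key"); err != nil {
            log.Fatal(err)
        }
    }

Both backups then look like full copies of the package, but each unchanged file occupies space only once; the trade-off is that hard links share content, so the backups must be treated as read-only.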
