Using NSPersistentDocument to create 'Documents' - macos

I would like to create an app that uses:
- Swift
- CoreData
- 'Documents' which work in the standard macOS fashion [custom extension, a single 'file'/file wrapper containing all data relating to that document]
This does not appear to be possible. The documentation states very clearly that
"NSPersistentDocument does not support some document behaviors: File wrappers. [..]"
which makes me think that the usual ways of dealing with images in CoreData - binary data with 'Allows External Storage', or saving the images to a different location and storing their URLs in the database - cannot be used with NSPersistentDocument. I want my users to be able to do the usual Finder operations on my 'file' (duplicate, move to external storage, restore from an external backup), and I need all my data to be in one single package.
The SQLite store type results in the usual three-fold stack when saving - .sqlite, .sqlite-shm, .sqlite-wal - which is useless as a 'document'.
Is there a solution I have overlooked? (Examples are very sparse; the Big Nerd Ranch sample does not solve this either, and neither Marcus Zarra nor objc.io touch on NSPersistentDocument.)

The only option that will work with NSPersistentDocument the way you want is to store the images directly in the database. You need a Binary Data attribute on your entity, but you must not turn on the Allows External Storage option.
If you turn on this option, Core Data will decide - depending on the size - whether to store an image directly in the database or in a hidden folder inside the folder where your document is located.
(I made that folder visible by pressing Cmd-Shift-. in the Finder.) My sample document is named Test 1.doof and contains three images.
The hidden folder .Test 1_SUPPORT/EXTERNAL DATA contains two files: the two bigger images (1.3 MB and 494 KB). The third one, at only 50 KB, is stored inside Test 1.doof itself. If you move Test 1.doof into another folder, the hidden folder is left behind, and opening the file in that other folder leads to two missing images.
Storing the images inside the database is not that bad if you put the binary data into a separate entity with a one-to-one relationship to the rest of the data.
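A minimal sketch of that model in code (entity and attribute names are hypothetical; in practice you would set this up in Xcode's model editor):

import CoreData

// Hypothetical pair of entities: `Item` carries the searchable
// attributes, `ImageBlob` carries only the binary payload.
final class Item: NSManagedObject {
    @NSManaged var title: String
    @NSManaged var image: ImageBlob?   // to-one relationship
}

final class ImageBlob: NSManagedObject {
    // Binary Data attribute; leave "Allows External Storage" OFF so the
    // bytes stay inside the document's store file.
    @NSManaged var payload: Data
}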
That way the image does not interfere with any search or sort operation. NSPersistentDocument gives you a lot of cool functionality for free, so you should use it anyway if possible.
Two additional remarks:
If you turn on Allows External Storage for an attribute, you do not have to care about URLs or where to store the images; Core Data does that for you (but not in a useful way for document-based apps).
These -shm and -wal files are temporary SQLite journaling files that appear "sometimes", for databases without external storage as well. If they stick around, you can safely remove them once your app is closed.
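As an aside, SQLite only creates the -shm/-wal pair in write-ahead-logging mode. If the extra files are a problem, one approach (a sketch, assuming you are willing to trade WAL for the classic rollback journal) is to pass SQLite pragmas through the store options:

import Cocoa
import CoreData

// Sketch: request the rollback journal instead of WAL so the store
// stays a single .sqlite file with no -shm/-wal companions.
class Document: NSPersistentDocument {
    override func configurePersistentStoreCoordinator(
        for url: URL,
        ofType fileType: String,
        modelConfiguration configuration: String?,
        storeOptions: [String : Any]?) throws {
        var options = storeOptions ?? [:]
        options[NSSQLitePragmasOption] = ["journal_mode": "DELETE"]
        try super.configurePersistentStoreCoordinator(for: url,
                                                      ofType: fileType,
                                                      modelConfiguration: configuration,
                                                      storeOptions: options)
    }
}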

If you want to put more than just a database in your document, you should implement NSDocument instead of NSPersistentDocument. In that case you don't get built-in support for Core Data, but you can use your document as a container for multiple file types.
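A rough sketch of that route (file names inside the package are hypothetical, and wiring a Core Data stack to the store file is then entirely up to you):

import Cocoa

// A package document: a directory file wrapper that can hold a Core
// Data store alongside arbitrary other files.
class PackageDocument: NSDocument {
    var storeWrapper: FileWrapper?   // e.g. the SQLite store file
    var notes = Data()               // any extra content you manage yourself

    override func fileWrapper(ofType typeName: String) throws -> FileWrapper {
        var children: [String: FileWrapper] = [:]
        if let store = storeWrapper {
            children["store.sqlite"] = store
        }
        children["notes.plist"] = FileWrapper(regularFileWithContents: notes)
        return FileWrapper(directoryWithFileWrappers: children)
    }

    override func read(from fileWrapper: FileWrapper, ofType typeName: String) throws {
        storeWrapper = fileWrapper.fileWrappers?["store.sqlite"]
        notes = fileWrapper.fileWrappers?["notes.plist"]?.regularFileContents ?? Data()
        // Point your Core Data stack at the store file here.
    }
}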
See also "Is NSDocument and CoreData a possible combination, or is NSPersistentDocument the only way?"

Related

(why) is FSCTL_SET_OBJECT_ID dangerous?

NTFS files can have object IDs. These IDs can be set using FSCTL_SET_OBJECT_ID. However, the MSDN article says:
Modifying an object identifier can result in the loss of data from portions of a file, up to and including entire volumes of data.
But it doesn't go into any more detail. How can this result in loss of data? Is it talking about potential object ID collisions in the file system, and does NTFS rely on them in some way?
Side note: I did some experimenting with this before I found that paragraph, and set the object IDs of some newly created files; here's hoping that my file system is still intact.
I really don't think this can directly result in loss of data.
The only way I can imagine it being possible is if, for example, a backup program assumes that (1) every file has an object ID, and (2) it is keeping track of all IDs at all times. In that case it might assume that an ID that is not in its database must refer to a file that should not exist, and it might delete the file.
Yeah, I know it sounds ridiculous, but that's the only way I can think of in which this might happen. I don't think you can lose data just by changing IDs.
Object IDs are used by the Distributed Link Tracking service, which enables client applications to track link sources that have moved. The link tracking service maintains its link to an object only by using these object identifiers (IDs).
So, coming back to your question:
Is it talking about potential object ID collisions in the file system?
I don't think so. Windows does provide the option to set object IDs using FSCTL_SET_OBJECT_ID, but that doesn't bring a risk of ID collision:
Attempting to set an object identifier on an object that already has an object identifier will fail.
... and does NTFS rely on them in some way?
Yes. Object identifiers are used to track files and directories. An index of all object IDs is stored on the volume. Rename, backup, and restore operations preserve object IDs. However, copy operations do not preserve object IDs, because that would violate their uniqueness.
How can this result in loss of data?
You won't get into serious trouble if you change (or rather set) the object ID of user-created files, as you did. However, if a user (knowingly or unknowingly) sets an object ID that is used by a shared object file/library, the change will not be reflected as is.
Since Windows doesn't want everyone (except developers) to play with crucial library files, it issues a generic warning:
Modifying an object identifier can result in the loss of data from portions of a file, up to and including entire volumes of data.
Bottom line: change it only if you know what you are doing.
There's another MSDN article on distributed link tracking and object identifiers.
Hope it helps!
EDIT:
Thanks to @Mehrdad for pointing this out. I didn't mean the object identifiers of DLLs themselves, but the ones they use internally.
OLEACC (a DLL) provides the Active Accessibility runtime and manages requests from Active Accessibility clients [source]. It uses the OBJID_QUERYCLASSNAMEIDX object identifier [source].

cocoa document with many files generated a bit at a time

I have a problem with a document-based project in Cocoa. I've searched for a while but didn't find anything that resembles my goal. What I want to build is a (computation-intensive) simulation program which generates a lot of data (probably on the order of gigabytes) and stores it to disk for later visualization (so I cannot write/read the files all at once).
I created a document-based project (I don't know if that is the way to go...) with the idea of saving all the data in many binary files within a package, so the user sees it as a single file. I have already tried that part and was able to save the document with NSFileWrapper. But the simulation files are generated while the simulation is running, and here comes the problem.
Is there a way to force the user to save the document and retrieve its path, so I can put all the generated files there? Or is it better to save the simulation files in a temporary location and then save the document periodically, so that it picks up all the files ready for saving? Or what else can I do? The usage of the NSDocument architecture in this case is not clear to me, nor what a good way to achieve my goal would be.
The document also has a couple of other files containing the simulation parameters and the initial state, so I can resume the simulation at a later time.
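A sketch of the temporary-location idea mentioned in the question (class name, file names, and chunk format are all hypothetical): chunks are written to a scratch directory as the simulation runs, and the document's file wrapper references them at save time.

import Cocoa

class SimulationDocument: NSDocument {
    // Scratch directory that holds chunks until the document is saved.
    let scratchURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString, isDirectory: true)
    private(set) var chunkURLs: [URL] = []

    // Called by the simulation as each batch of results is produced.
    func appendChunk(_ data: Data) throws {
        try FileManager.default.createDirectory(at: scratchURL,
                                                withIntermediateDirectories: true)
        let url = scratchURL.appendingPathComponent("chunk-\(chunkURLs.count).bin")
        try data.write(to: url)
        chunkURLs.append(url)
        updateChangeCount(.changeDone)   // mark the document dirty
    }

    // At save time, build a package that contains every chunk file.
    override func fileWrapper(ofType typeName: String) throws -> FileWrapper {
        var children: [String: FileWrapper] = [:]
        for url in chunkURLs {
            children[url.lastPathComponent] = try FileWrapper(url: url)
        }
        return FileWrapper(directoryWithFileWrappers: children)
    }
}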

Cocoa (Mac): splitting / combining a core data based application from / into multiple save files

I'm working in OS X Lion on a Core Data based Cocoa application where I need to be able to save different parts, let's say partA and partB, of the data model into separate files.
I need to be able to save both files together as a project file/package, but also need to be able to load and save partA independently from partB.
Loading a new partA file should replace all data currently associated with partA.
Saving partA should not save data changed in partB.
Entities in partA do need to maintain relationships with entities in partB but these can (and most likely have to) be weak.
My main question is: What would be the best approach for implementing the desired features?
My first approach has been one NSManagedObjectModel containing two configurations, one for each of the parts. I have two NSPersistentStore instances assigned to my NSPersistentStoreCoordinator, one for each of the configurations, and one NSManagedObjectContext instance with the storeCoordinator assigned to it.
Saving and opening separate files are currently my main concerns in this approach. The NSManagedObjectContext -save: message seems to save both configurations. Is it possible to save only the changes made to objects that belong to a specific configuration of the NSManagedObjectModel? Or would I need two NSManagedObjectContext instances, one for each of the configurations?
Opening a file by adding a store to the persistentStoreCoordinator for partA adds data to the context, and so far I have not been able to replace data. Is there a way to know which store is associated with a certain configuration, perhaps by sending a message to the persistentStoreCoordinator?
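For reference, the pieces of that first approach look roughly like this in code (configuration names and URLs are hypothetical). Each NSPersistentStore exposes its configurationName, and a store can be removed and re-added to swap files:

import CoreData

// Hypothetical: model, partAURL, partBURL are provided elsewhere.
func makeStack(model: NSManagedObjectModel,
               partAURL: URL, partBURL: URL) throws -> NSManagedObjectContext {
    let coordinator = NSPersistentStoreCoordinator(managedObjectModel: model)

    // Two stores on one coordinator, one per model configuration.
    _ = try coordinator.addPersistentStore(ofType: NSSQLiteStoreType,
                                           configurationName: "PartA",
                                           at: partAURL, options: nil)
    _ = try coordinator.addPersistentStore(ofType: NSSQLiteStoreType,
                                           configurationName: "PartB",
                                           at: partBURL, options: nil)

    let context = NSManagedObjectContext(concurrencyType: .mainQueueConcurrencyType)
    context.persistentStoreCoordinator = coordinator
    return context
}

// Finding the store that belongs to a configuration, e.g. to replace it.
// Callers will usually also reset their NSManagedObjectContext afterwards
// so it no longer holds objects from the removed file.
func replacePartA(on coordinator: NSPersistentStoreCoordinator,
                  with newURL: URL) throws {
    if let old = coordinator.persistentStores.first(where: { $0.configurationName == "PartA" }) {
        try coordinator.remove(old)
    }
    _ = try coordinator.addPersistentStore(ofType: NSSQLiteStoreType,
                                           configurationName: "PartA",
                                           at: newURL, options: nil)
}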
I'm also thinking of a second approach: setting up two subclasses of NSPersistentDocument, one for each configuration in my data model, to be able to save the data into separate files. But I'm not sure whether having two separate NSManagedObjectContext instances would allow setting up relationships between NSManagedObject subclasses in the different configurations.
If anyone has a good idea or can point me in the right direction or even has an example of how I could implement the above features, that would be highly appreciated.

Using Core Data with many images per entity?

I'm new to Core Data and I'm working on my first personal iOS app.
I have an entity, let's call it Car, which has a thumbnail as well as a gallery of other images associated with it. The data is synced to an online service using ASIHTTPRequest and JSONKit. The app doesn't need to create new Cars, just display them.
The thumbnail could be around 100 kB, so I may store that as blob data within the Car entity.
However, I'm not sure how I should store the other images.
The images would be around 800 kB to 1 MB each, so storing them in the Core Data store doesn't seem to be recommended.
The only options I can think of are:
Store the URL of each photo within another entity, CarImage, and rely on ASIHTTPRequest's cache.
Create a folder structure, save each image into its corresponding Car's folder, and keep references to the file paths in the CarImage entity.
Because the data is synced, there is the potential for Cars to be deleted, so images in folders would have to be deleted as well. I can see this getting out of hand pretty quickly.
I would appreciate any advice. Thanks.
I'd take your first option.
Regarding the images that would have to be deleted: isn't that taken care of automatically by ASIHTTPRequest's cache, once they expire? At least that's what I'd expect from a cache...
I'd go with the first option. I've done something similar in the past, though I actually did store the image binary data in Core Data as well. I wouldn't recommend storing the data, though, as this caused problems for me - just rely on ASIHTTPRequest's cache.
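If you do end up with the folder approach (option 2 above), one way to keep files from being orphaned when Cars disappear is to delete each image file together with its database row. A sketch with hypothetical names:

import CoreData
import Foundation

// Hypothetical CarImage entity: stores a path relative to the Caches
// directory and removes the file when its row is deleted.
final class CarImage: NSManagedObject {
    @NSManaged var relativePath: String

    // Core Data calls this when the object is marked for deletion,
    // so the file on disk goes away with its row.
    override func prepareForDeletion() {
        super.prepareForDeletion()
        let caches = FileManager.default.urls(for: .cachesDirectory,
                                              in: .userDomainMask)[0]
        try? FileManager.default.removeItem(at: caches.appendingPathComponent(relativePath))
    }
}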

How to store preferences for an application?

I am a newbie in Ruby, coming from web development with mainly PHP/SQL. I was thinking about how to store preferences in my application - for instance, storing a path as default_path and having it set again when the user restarts the application.
In the web world one would probably store this in a database or in XML. A database seems overkill for a standalone application, but I am unsure whether XML/YAML/some other write format is the way to go. And if so, where should I store these preferences? Should they be, for instance on a Mac, in ~/Library/MyAppName?
I like using YAML because it's very easily read/written by a lot of languages, making it possible for several apps to share the same configuration info. It's a well documented standard so there should be very little chance of data falling into a hole with it.
Also, because it's easy for a human to understand, and doesn't take any special tools to change, it works nicely for any data that might occasionally change in an app, either for fine-tuning or to enable special behaviors.
A little creative coding on your part that periodically checks the last-modified time of the YAML file could make your app modify its behavior on the fly as the prefs file is tweaked. I had a big app I didn't want to shut down for changes, so I set up that behavior. It ran three weeks straight while I tweaked its operating parameters via its config file; it would read the file every minute and pick up any parameter changes on the fly.
Databases are a good way to store parameters/preferences if it's a centralized server or web-based app. For something distributed that runs on individual machines it makes no sense.
Ruby gives you another method for storing data, called marshaling. This lets you store an object to a file and reconstitute it later. If all of your user preferences are stored in a single object (or you can create an object which holds all of the data you need), it may be easiest to marshal the data instead of writing import/export routines for a text-based format or pulling in an additional library or gem.
As to where on the disk to store the data, that's up to you. Most platforms have a standard location for storing application data based on whether it's available to a single user or all users. It's usually safest to follow the common practice on your target platform of choice.
Update: The simplest example of marshaling would probably be this: Say that you have a class called UserPrefs that you use to store all of your user preferences. You can use the following code to store the preferences data into a file:
my_prefs = UserPrefs.new
# ... fill in the 'my_prefs' object with the user's preferences, etc. ...

# Store the object into a file
File.open("user_prefs.data", "wb") do |file|
  Marshal.dump(my_prefs, file)
end
The next time that you load the application, you can restore those preferences using the following:
# Load prefs from file
my_prefs = nil
File.open("user_prefs.data", "rb") { |f| my_prefs = Marshal.load(f) }
At this point, the my_prefs object should be exactly the same as it was when the marshaling code was originally run. This essentially lets you take a 'snapshot' of an object at one point in time (say, when your program shuts down) and restore it later (say, when your program loads). Internally, all of the data in the structure is encoded into a single string, and that string is what is stored to disk; the Marshal module simply takes care of the encoding and decoding for you.
Here is another example of using marshaling to store and retrieve data.
The default encode/decode routines built into the Marshal module are usually sufficient for most data-storing classes. Particularly complex classes may have problems, and if that is the case then you can define your own encode and decode methods (the first link includes an example of defining custom methods).
Some types of data, however, cannot be marshaled (things like handles to open files, Proc objects, etc.) since they don't normally persist across Ruby sessions. If you need to marshal a class that includes members like this that Marshal doesn't like, you can use custom encode/decode functions to marshal the rest of the class and omit the problematic members.
I have seen some applications use Ruby's GConf2 bindings.
