I am new to Mac programming and, as the title of this post suggests, I have some questions about how a component for QuickTime is written. I have listed my questions as points; feel free to answer any or all of them. Thanks in advance.
Which QuickTime component type is for codecs? Is it the image decoder component type?
Which component type is for custom containers (non-.mov files)?
How does QuickTime determine the appropriate container and codec handler? Does it query every component listed under '/System/Library/QuickTime' until it finds one that works?
I looked into projects such as Perian, which have '.exp' files exporting names like '_AC3MovieImportComponentDispatch'. With the prefix '_AC3', how does the Component Manager recognise this as a dispatch function?
I created a 'test.component' bundle with an Info.plist very similar to those of other component files and placed it under '/System/Library/QuickTime'. The Component Manager documentation suggests components are registered 'automatically'. When do new component files for QuickTime get registered? Is a call to 'RegisterComponentResourceFile()' necessary?
My system is:
Mac OS X 10.4.11 Tiger
For information about writing a file container reader and decoder for QuickTime 7, see http://sanje2v.wordpress.com/2014/08/02/writing-file-container-reader-and-decoder-for-quicktime-7-faq/
I'm sure this is right in front of my face, but I'm a bit of a noob...
How can I specify which Photos library is loaded, instead of the current behaviour, which loads the System Library?
I presume it's somewhere along these lines, from the sample code?
// Set up the media library to load only photos; don't include other source types.
let options: [String: AnyObject] =
    [MLMediaLoadSourceTypesKey: MLMediaSourceType.image.rawValue as AnyObject,
     MLMediaLoadIncludeSourcesKey: [MLMediaSourcePhotosIdentifier, MLMediaSourceiPhotoIdentifier] as AnyObject]
// Create our media library instance to get our photo.
mediaLibrary = MLMediaLibrary(options: options)
Is it a matter of having selected MLMediaSourceiPhotoIdentifier, which defaults to the System Library? If so, how do you go about opening other libraries?
Help! Thanks
Per the documentation of MLMediaLibrary, there are only three options besides the standard media sources:
Non-App-Specific Media Source Identifiers:
MLMediaSourceCustomFoldersIdentifier
The media source for custom folders. Currently, the only custom folder is the folder containing audio loops from Apple.
MLMediaSourceAppDefinedFoldersIdentifier
The media source for app-defined folders. This identifies a media source created from a relative path inside the caller’s app bundle.
MLMediaSourceMoviesFolderIdentifier
The media source for the user’s Movies folder.
Based on this, I'd say it's highly likely that the functionality to create an MLMediaLibrary from an arbitrary file path does not exist. The API will only let you interface with the library that is currently in use by iPhoto, iMovie, or the other apps, respectively.
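To make that concrete, here is a minimal sketch (Swift; the asynchronous, KVO-driven loading is how the MediaLibrary framework works, but treat the exact code as an untested assumption). The only knob you have is which of the fixed identifiers you request; whichever library the source app itself points at is the one you get:

import MediaLibrary

let options: [String: Any] = [
    MLMediaLoadSourceTypesKey: MLMediaSourceType.image.rawValue,
    MLMediaLoadIncludeSourcesKey: [MLMediaSourcePhotosIdentifier]
]
let library = MLMediaLibrary(options: options)

// mediaSources is nil until the framework finishes loading it asynchronously,
// so observe the property with KVO rather than reading it immediately.
let token = library.observe(\.mediaSources, options: [.new]) { lib, _ in
    guard let photos = lib.mediaSources?[MLMediaSourcePhotosIdentifier] else { return }
    // This is always the library that Photos itself is configured to use.
    print("Loaded source:", photos.mediaSourceIdentifier)
}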
I'm making an iOS and Android app with Xamarin.Forms (PCL project).
I need to resize the image file from the 'MediaFile' class that is returned by 'CrossMedia.Current.PickPhotoAsync()'.
What's the best way to do it?
I have searched for a long time and noticed that many people use 'WriteableBitmap'.
But the 'WriteableBitmap' NuGet package cannot be added; perhaps because the Xamarin platform has been updated?
I got a great answer from here:
https://forums.xamarin.com/discussion/comment/199212#Comment_199212
It uses DependencyService, and it works beautifully.
After applying DependencyService, I have a question.
To my thinking, an image file is not platform-dependent, so why don't we process it directly in Forms using some image-processing library in .NET? (I'm not a .NET developer, but I believe there are many libraries for processing image files.)
I suspect it is done this way because we want to use the built-in image-processing libraries (like UIKit) and thereby avoid adding a new dependency?
Am I correct?
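For what it's worth, the platform-specific route exists because the resize itself leans on each platform's built-in imaging stack. Here is a minimal sketch of the UIKit work an iOS DependencyService implementation typically wraps, written in Swift for illustration (the real Xamarin implementation would be C# calling the same UIKit APIs; the function name and parameters are my own assumptions):

import UIKit

// Scale image data down so its longest side is at most maxDimension,
// re-encoding the result as JPEG. Returns nil if the data is not a decodable image.
func resizedImageData(_ data: Data, maxDimension: CGFloat) -> Data? {
    guard let image = UIImage(data: data) else { return nil }
    let scale = min(maxDimension / image.size.width,
                    maxDimension / image.size.height,
                    1)  // never scale up
    let newSize = CGSize(width: image.size.width * scale,
                         height: image.size.height * scale)
    UIGraphicsBeginImageContextWithOptions(newSize, false, 1)
    defer { UIGraphicsEndImageContext() }
    image.draw(in: CGRect(origin: .zero, size: newSize))
    return UIGraphicsGetImageFromCurrentImageContext()?
        .jpegData(compressionQuality: 0.9)
}

Doing this in shared .NET code instead would mean pulling in a third-party imaging library, which is exactly the extra dependency the DependencyService approach avoids.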
I'm a beginner at App Inventor and I don't know what I'm doing wrong with the sharing component. I'm trying to share a sound file but it gives me the following error:
No Activity found to handle Intent { act=android.intent.action.SEND flg=0x1 (has clip) (has extras) }
The code is the following:
I also tried the following things:
This doesn't work.
This reproduces the sound perfectly. That means that the picker works and the file exists.
I put a file in my root folder and it works, but I want to send the file that was picked from the list, which is in the following folder: "/storage/sdcard0/MyDocuments/Downloads/app_inventor_1431867799168.mpeg".
I don't know where the problem is; I have looked in a lot of forums and none of the solutions solved my problem.
Thanks for your help.
PS: I just found that the downloaded files are stored in .MPEG format, which is a compressed format. It is possible that the sharing function doesn't find any app that accepts this type of format. If this is the problem, please tell me what I have to do to make App Inventor keep the original .mp3 file unchanged.
OK, problem solved. As I said in the PS above, the problem was that by default App Inventor stores the downloaded audio files in .MPEG format, which is not recognized by WhatsApp, Gmail, etc.
The solution was to give each file a name ending in .mp3 when I download it.
With this extension the programs on my device recognize them, and now I can share my audio files.
Thanks to everyone who tried to solve this problem.
I am busy creating a small note-taking application but I have run into a bit of an issue.
I cannot seem to get an NSTextView to work with Core Data. I have watched this video https://www.youtube.com/watch?v=qypMqkT20LU and I have also read "Swift Development with Cocoa".
"Swift Development with Cocoa" uses an NSTextView, but not with Core Data. From this book I have gathered that I need to use NSAttributedString for the content of the NSTextView.
The issue that I am having is that I cannot find out how to use that with Core Data.
I am trying to build this app the way the video does, so that I do not need to write any code and can use pure bindings.
I have also tried using the Binary Data type for the attribute in my entity, as well as Transformable, but then the application cannot start because it cannot find its saved data.
Any help on how to use an NSTextView with Core Data and bindings would be much appreciated.
It turns out that the .storedata file created when the application launches was not being overwritten, so every time I changed an attribute in the model it would break because the attributes no longer matched.
To fix this, I had to delete the .storedata file every time I ran the app. The file can be found in /Users/{user}/Library/Application Support/{app name}
Usually the app name would be com.yourcompany.appname
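On the original NSAttributedString question: once the stale store is out of the way, a Transformable attribute can hold the text view's content if you register a value transformer for it. Below is a minimal sketch (the class and transformer names are my own, not from the video or the book), archiving the attributed string as RTF data:

import Cocoa

// Converts NSAttributedString <-> Data (RTF) for a Transformable Core Data attribute.
class AttributedStringTransformer: ValueTransformer {
    override class func transformedValueClass() -> AnyClass { NSData.self }
    override class func allowsReverseTransformation() -> Bool { true }

    override func transformedValue(_ value: Any?) -> Any? {
        guard let text = value as? NSAttributedString else { return nil }
        return try? text.data(
            from: NSRange(location: 0, length: text.length),
            documentAttributes: [.documentType: NSAttributedString.DocumentType.rtf])
    }

    override func reverseTransformedValue(_ value: Any?) -> Any? {
        guard let data = value as? Data else { return nil }
        return NSAttributedString(rtf: data, documentAttributes: nil)
    }
}

// Register before the persistent store loads (e.g. in applicationDidFinishLaunching),
// and enter "AttributedStringTransformer" as the attribute's transformer name in the model:
// ValueTransformer.setValueTransformer(AttributedStringTransformer(),
//     forName: NSValueTransformerName("AttributedStringTransformer"))

With that in place, the NSTextView's Attributed String binding can point at the entity attribute without any further code.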
I'm getting quite a few emails with logs in XML format attached, and I want to associate my own app with the XML file type on Windows Phone Mango, overriding the default XML viewer.
If overriding isn't possible, I can have the logs generated with a custom extension. I would then need to associate that extension with my app the same way Adobe Reader does for PDFs.
Is this possible?
Thanks
Currently there is no way to set your app as the default viewer for any file extension. The closest you can get at the moment is the extensibility points, but these are limited to Photos, Music, and Search, not custom extensions. (Even then, your app wouldn't be the default viewer, but it would be accessible from the respective hubs.)