I'm sure this is right in front of my face, but I'm a bit of a nooob...
How can I define which Photos Library is loaded vs the current method which loads the System Library?
I presume it's somewhere along the lines of here from the sample code?
// Setup the media library to load only photos, don't include other source types.
let options: [String: AnyObject] =
    [MLMediaLoadSourceTypesKey: MLMediaSourceType.image.rawValue as AnyObject,
     MLMediaLoadIncludeSourcesKey: [MLMediaSourcePhotosIdentifier, MLMediaSourceiPhotoIdentifier] as AnyObject]
// Create our media library instance to get our photo.
mediaLibrary = MLMediaLibrary(options: options)
Is it a matter of having selected MLMediaSourceiPhotoIdentifier, which defaults to the System Library? If so, how do you go about opening other libraries?
Help! Thankssss
Per the documentation of MLMediaLibrary, there are only three other options besides the standard media sources:
Non-App-Specific Media Source Identifiers:
MLMediaSourceCustomFoldersIdentifier
The media source for custom folders. Currently, the only custom folder is the folder containing audio loops from Apple.
MLMediaSourceAppDefinedFoldersIdentifier
The media source for app-defined folders. This identifies a media source created from a relative path inside the caller’s app bundle.
MLMediaSourceMoviesFolderIdentifier
The media source for the user’s Movies folder.
Based on this, I'd say it's highly likely that the functionality to create an MLMediaLibrary from an arbitrary file path does not exist. Indeed, the API will only let you interface with the library that is currently being used by iPhoto, iMovie and the other respective apps.
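If it helps, here is a rough, untested sketch of how you could list whichever sources the framework does expose once loading finishes (it observes mediaSources via KVO, since the framework populates it asynchronously); note that none of the identifiers take a file path, which is exactly the limitation described above:
import MediaLibrary

final class MediaSourceLister {
    private let library: MLMediaLibrary
    private var observation: NSKeyValueObservation?

    init() {
        let options: [String: Any] = [
            MLMediaLoadSourceTypesKey: MLMediaSourceType.image.rawValue,
            MLMediaLoadIncludeSourcesKey: [MLMediaSourcePhotosIdentifier, MLMediaSourceiPhotoIdentifier]
        ]
        library = MLMediaLibrary(options: options)
        // mediaSources stays nil until the framework finishes loading; watch it via KVO.
        observation = library.observe(\.mediaSources, options: [.new]) { lib, _ in
            for identifier in (lib.mediaSources ?? [:]).keys {
                print("Registered media source:", identifier)
            }
        }
        _ = library.mediaSources // first access kicks off the asynchronous load
    }
}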
Related
Is it possible to include resource links (i.e., res://...) within a web view? My attempts so far suggest not. I can include standard tags and reference local files, but those are not scaled for the various display densities.
res:// is a format that only NativeScript's file-system module understands; it's more of a custom shorthand.
If you have the image inside Android's drawable folder, you may try this:
file:///android_res/drawable/YOUR_FILE_NAME
For iOS, you will have to load it with an absolute path.
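For what it's worth, here is a rough, untested helper along those lines; it assumes a current @nativescript/core project (older projects import from tns-core-modules instead) and that the image also ships in an images folder inside the app for the iOS case:
import { isAndroid, knownFolders, path } from "@nativescript/core";

// Hypothetical helper: build a WebView-friendly src for a packaged image.
export function packagedImageSrc(fileName: string): string {
    if (isAndroid) {
        // Android can serve drawables straight from the res folder.
        return `file:///android_res/drawable/${fileName}`;
    }
    // iOS needs an absolute path into the app folder.
    return "file://" + path.join(knownFolders.currentApp().path, "images", fileName);
}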
I'm creating a folder with the following code:
var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyPictures);
var directoryname = Path.Combine(documents, "XX");
Directory.CreateDirectory(directoryname);
but the folder does not exist at the specified path. May I know what the reason is?
Regards.
If you need the main picture (DCIM) folder on Android, then your code should look like this:
string AndroidDcimFolder = Android.OS.Environment.GetExternalStoragePublicDirectory(Android.OS.Environment.DirectoryDcim).AbsolutePath;
From the Xamarin official site:
Not all directories in this enumeration will be available on all
platforms for all users. When running on the Windows operating system,
the operating system determines the paths to these directories.
Here, "enumeration" means Environment.SpecialFolder.
You need to implement a DependencyService and call System.Environment.GetFolderPath from within the platform-specific implementation.
Check this sample.
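Roughly, the DependencyService pattern looks like this; it is only a sketch with hypothetical names (IFolderService, MyApp.Droid.FolderService), not the linked sample itself:
// --- Shared (PCL) project, e.g. IFolderService.cs ---
public interface IFolderService
{
    string GetPicturesPath();
}

// --- Android project, e.g. FolderService.cs ---
using Xamarin.Forms;

[assembly: Dependency(typeof(MyApp.Droid.FolderService))]

namespace MyApp.Droid
{
    public class FolderService : IFolderService
    {
        // Platform-specific call, exactly as suggested above.
        public string GetPicturesPath() =>
            System.Environment.GetFolderPath(System.Environment.SpecialFolder.MyPictures);
    }
}

// --- Usage from shared code ---
// var dir = Path.Combine(DependencyService.Get<IFolderService>().GetPicturesPath(), "XX");
// Directory.CreateDirectory(dir);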
As far as I know, your code will not work in a Xamarin.Forms PCL project.
Another thing you can do: in your Android and iOS projects, put this code to get the platform-specific path:
using Xamarin.Forms;
// Get the current path.
Config.PathApp = System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal);
I implemented the document provider app extension feature in my own iOS app. The problem is that the app extension is not able to access the containing app's images/assets or the containing app's resource data.
Does anyone know how to achieve this?
The extension has its own target in your Xcode project. In order to access resources in the extension you also need to add them to that target (in the File Inspector also check your extension target in the Target Membership section). Just be aware that those resources also get copied into the extension bundle, which increases your overall app size.
Sidenote: In my extension I didn't add my default image asset catalogue (Images.xcassets) to the extension, but was still able to access the containing images. Maybe that's an exception.
UPDATE
If you are supporting iOS 8 and up, you can create a framework to share between your app and its extensions, and embed resources, strings, etc. inside of it. Make sure to use the [UIImage imageNamed:inBundle:compatibleWithTraitCollection:] and NSLocalizedStringFromTableInBundle methods, passing in the bundle that is your framework.
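For example, something along these lines should work from either the app or the extension (SharedAssetsClass, SharedIcon and ShareTitle are placeholder names; use a class that actually lives inside the shared framework):
#import <UIKit/UIKit.h>

// Resolve the shared framework's bundle from a class that lives inside it.
NSBundle *frameworkBundle = [NSBundle bundleForClass:[SharedAssetsClass class]];

UIImage *icon = [UIImage imageNamed:@"SharedIcon"
                           inBundle:frameworkBundle
      compatibleWithTraitCollection:nil];

NSString *title = NSLocalizedStringFromTableInBundle(@"ShareTitle",
                                                     @"Localizable",
                                                     frameworkBundle,
                                                     @"Title shown in the extension");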
PREVIOUS ANSWER
Given that extensions are (and hopefully always will be) in a sub-folder of the main app at /Plugins/PluginName and have permissions to access files just like the main app, you can do the following (and I do in some of my apps on the store):
// image
UIImage *image = [UIImage imageNamed:@"../../image_name.png"];
// data
NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"../../file.json"];
NSData *data = [NSData dataWithContentsOfFile:path];
This will save you a lot of space, especially if your extension uses a lot of images.
I am new to Mac programming and, as the title of this post suggests, I have some questions on how a component for QuickTime is written. I have listed my questions as points; feel free to answer any, if not all, of them. Thanks in advance.
Which QuickTime component type is for codecs? Is it image decoder component type?
What component type is for custom containers? (Non .mov files)
How does QuickTime determine appropriate container and codec handler? Does it query every component listed under '/System/Library/QuickTime' until it finds a component that works?
I looked into projects such as Perian, which have '.exp' files that export names like '_AC3MovieImportComponentDispatch'. Given the '_AC3' prefix, how does the Component Manager recognise this as a dispatch function?
I created a 'test.component' bundle with an Info.plist very similar to those of other component files and placed it under '/System/Library/QuickTime'. The Component Manager documentation suggests components are registered 'automatically'. When do new component files for QuickTime get registered? Is a call to 'RegisterComponentResourceFile()' necessary?
My system is:
Mac OSX 10.4.11 Tiger
For information about writing a file container reader and decoder for QuickTime 7, go to http://sanje2v.wordpress.com/2014/08/02/writing-file-container-reader-and-decoder-for-quicktime-7-faq/
I'm using Adobe AIR to make APK file for my Android phone. Inside the APK package there is a SWF file and in the same folder as the SWF file there is a subfolder named "images", which contains image.jpg.
I'm trying to read that image.jpg into the program at runtime but can't get the location of it (the program doesn't seem to find it).
Here's what I'm trying:
var loader:Loader = new Loader();
loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onComplete);
loader.load(new URLRequest("images/image.jpg"));
while (bitmapData==null) {} //wait until loaded
return bitmapData;
The onComplete function sets the bitmapData field. However, the COMPLETE event never fires, so the while loop never exits.
I have also tried:
loader.load(new URLRequest("/images/image.jpg"));
loader.load(new URLRequest("app:/images/image.jpg"));
loader.load(new URLRequest("app-storage:/images/image.jpg"));
loader.load(new URLRequest("file:/images/image.jpg"));
None of it works.
I can't get this to work even when trying without the APK file, when only running the SWF file that is generated (the images subfolder is in the same folder as the SWF file).
I can't embed the JPG into the SWF file itself because I need to have a lot of image files and Flash CS6 runs out of memory when making such a big SWF file (reading the images externally would also mean a whole lot less compilation time).
Any ideas? How can I get the external JPG file into a BitmapData? Remember that the image is not outside of the APK file; it is packaged inside it.
Note: I'll also export this as an iOS IPA package later, so it has to work there as well.
Edit because I can't self-answer due to my reputation.
Strille made me realize that the ActionScript Virtual Machine is single-threaded.
So it's impossible to make a thread sleep (wait) for an image being loaded. Instead I have to (sigh) rewrite a lot of code so that things halt and continue when the onComplete function is called.
This, which I tried before, actually works but couldn't complete because of the while loop:
loader.load(new URLRequest("app:/images/image.jpg"));
If I remember correctly, you should be able to do it like this:
loader.load(new URLRequest(new File("app:/images/image.jpg").url));
Since we're using the File class, the above will not run in the Flash Player, just on iOS or Android.
From my experience there are two practical methods to load assets from a running Android app using AIR:
using URLLoader and the .url property of the File instance
using FileStream (which receives a File instance)
Adding to project
You can package the files inside the APK using the same technique as packaging icons, inside the .xml descriptor file for your app.
Also make sure you add them in your IDE, in the Android section (this applies to IntelliJ and Flash Builder), under "Files and folders to package": add the path to the assets folder you want, and give it a relative name alias.
Elaboration
The assets themselves are placed (when the app is installed) inside the applicationDirectory, which is available to AIR. However, because of newer Android OS versions and Flash Player security restrictions, it is not possible to access those assets via nativePath, only via the url property of the File instance.
Best Practice
Use URLLoader and the .url property of the File instance, as it is also asynchronous by nature and can decode PNG/JPG files with the native decoder, instead of having to do this manually by reading a FileStream (which returns a ByteArray):
var f:File = File.applicationDirectory.resolvePath('images/myimage.png');
var loader:Loader = new Loader();
loader.load(new URLRequest(f.url));
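For completeness, here is a rough, untested sketch of the FileStream route mentioned above: read the raw bytes yourself and let Loader.loadBytes() do the decoding.
import flash.display.Loader;
import flash.events.Event;
import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;
import flash.utils.ByteArray;

var file:File = File.applicationDirectory.resolvePath("images/myimage.png");
var stream:FileStream = new FileStream();
stream.open(file, FileMode.READ);

var bytes:ByteArray = new ByteArray();
stream.readBytes(bytes); // length 0 (the default) reads the whole file
stream.close();

var byteLoader:Loader = new Loader();
byteLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, function(e:Event):void {
    // byteLoader.content is now a Bitmap ready for the display list.
});
byteLoader.loadBytes(bytes);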
And if I'm missing more ways, please let me know.