Is it possible to distribute Image Units with my application? - macos

Mac OS X Mavericks update
I was told that this issue was fixed in Mac OS X 10.9.
Original question
I read the documentation and didn't find the answer. It suggests creating Image Units, but requires putting the unit inside either ~/Library/Graphics/Image Units or /Library/Graphics/Image Units (putting the image unit inside Your.app/Contents/Library/Graphics/Image Units has no effect).
There is another, non-recommended way to create an Image Unit, which allows you to distribute CIKernels and access the filter from your code. But it prevents you from creating non-executable filters, which is a big loss of performance.
I looked through the contents of the bundles of applications like Pixelmator and Acorn and found that they don't use Image Units either. I hope I am mistaken and there is a way to distribute Image Units within an application bundle.
I'm looking for a solution that will be accepted by Mac App Store validation.
Solution which doesn't allow you to use non-executable filters
From the CIPlugIn header:
/** Loads a plug-in specified by its URL. */
+ (void)loadPlugIn:(NSURL *)url allowNonExecutable:(BOOL)allowNonExecutable
AVAILABLE_MAC_OS_X_VERSION_10_0_AND_LATER_BUT_DEPRECATED_IN_MAC_OS_X_VERSION_10_7;
/** Loads a plug-in specified by its URL.
If allowExecutableCode is NO, filters containing executable code will not be loaded. If YES, any kind of filter will be loaded. */
+ (void)loadPlugIn:(NSURL *)url allowExecutableCode:(BOOL)allowExecutableCode
AVAILABLE_MAC_OS_X_VERSION_10_7_AND_LATER;
The new method isn't listed in the official docs. So, to load the bundle you simply do:
if (floor(NSAppKitVersionNumber) <= NSAppKitVersionNumber10_6)
{
    [CIPlugIn loadPlugIn:[[NSBundle mainBundle] URLForResource:@"YourPlugin" withExtension:@"plugin"]
      allowNonExecutable:YES];
}
else
{
    [CIPlugIn loadPlugIn:[[NSBundle mainBundle] URLForResource:@"YourPlugin" withExtension:@"plugin"]
     allowExecutableCode:YES];
}
Unfortunately, if you try to use the CIFilter with the QuartzCore framework (e.g. with a CALayer), the app will crash because of a stack overflow:
frame #0: 0x00007fff8b6a5d36 CoreImage`-[CIFilter copyWithZone:] + 12
frame #1: 0x00007fff8b7d1c7e CoreImage`-[CIPlugInStandardFilter _provideFilterCopyWithZone:] + 18
frame #2: 0x00007fff8b6a5d59 CoreImage`-[CIFilter copyWithZone:] + 47
frame #3: 0x00007fff8b7d1c7e CoreImage`-[CIPlugInStandardFilter _provideFilterCopyWithZone:] + 18
frame #4: 0x00007fff8b6a5d59 CoreImage`-[CIFilter copyWithZone:] + 47
frame #5: 0x00007fff8b7d1c7e CoreImage`-[CIPlugInStandardFilter _provideFilterCopyWithZone:] + 18
frame #6: 0x00007fff8b6a5d59 CoreImage`-[CIFilter copyWithZone:] + 47

As well as the Library paths you mention, Mac OS X will also look in YourApp.app/Contents/Library.
I think everything should work if you put your Image Unit in YourApp.app/Contents/Library/Graphics/Image Units/.
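If the Image Unit is indeed picked up from there, a minimal sketch of loading and using it might look like the following; "YourFilterName" is a placeholder for whatever name your Image Unit actually registers:
// Ask Core Image to load every Image Unit it can find; per the answer above,
// this should include YourApp.app/Contents/Library/Graphics/Image Units/.
[CIPlugIn loadAllPlugIns];

// "YourFilterName" is hypothetical - use the name your unit registers.
CIFilter *filter = [CIFilter filterWithName:@"YourFilterName"];
[filter setDefaults];
NSLog(@"Loaded filter: %@", filter);   // nil here means the unit was not found or not loaded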

Related

Preload (all) image assets in a Flutter app

I would like an easy approach to preload/cache all of my static image assets so that they can be rendered/served without a delay.
I've seen that there is a precacheImage() call that can be used to pre-load/cache the AssetImage. This needs a context and it is recommended to call this in the didChangeDependencies() override.
Shouldn't there be a way to make this easier and more general? My app uses a total of 1.5 MB of image data (and that number includes the 2.0x and 3.0x upscaled versions). PNG images that are 50 KB (with no upscaled versions) take a noticeable amount of time to display, maybe 300-600 ms on both the emulator and fast devices. These are local assets, not fetched over the network. I find that irritating, and I'm frustrated that there doesn't seem to be a better way to handle this.
I've also seen the tip to use FadeInImage but again - it's not really what I'm looking for.
I'm displaying the image in a stateless widget (a custom button). It's not possible to use precacheImage in a stateless widget afaik. So I'd need to build the Image.asset() in my parent widget, call precacheImage and then pass the image widget to my stateless widget and render it in build - this is cumbersome.
Furthermore, the images will be displayed in different places (different parent widgets). Sometimes the image widgets differ in size between widgets, and since size is a parameter to Image.asset() I guess I would need to precache each unique size and pass these precached image widgets around. Isn't it possible to tell Flutter to "cache" the data of the PNG so that when an Image.asset is requested it reads the PNG from the cache, without having to pass precached image widgets around?
I would like a precacheAllImageAssets() call, or to be able to call precacheImage() with a string, so that every Image.asset() that references the same asset would be cached.
I guess that Flutter internally caches the image widget (including its size and other properties) as some internal render object, so pre-caching two different sizes of the same image would require two different cache entries. With that being said, I'd still want a precacheAllImageAssets() call that could at least read the PNG data into memory and serve it more quickly, even if it would need to do some processing to get the PNG data into an actual widget with a size before it could be rendered. With such a cache I could maybe get a render delay of < 50 ms instead of the current 300-600 ms.
Any idea if this is possible? If not possible - am I missing something obvious or could this be a (likely) future improvement of the Flutter framework?
Here is my equivalent of precacheAllImageAssets(), but you need to list all the image paths yourself.
import 'package:flutter/widgets.dart';

// List every asset path you want to warm up by hand.
final List<String> _allAsset = [
  // tabbar
  'images/tabbar/tabar_personal.png',
  'images/tabbar/tabar_personal_slt.png',
  'images/tabbar/tabar_home.png',
  'images/tabbar/tabar_home_slt.png',
  'images/...',
  'images/...',
];

void main() {
  final binding = WidgetsFlutterBinding.ensureInitialized();
  binding.addPostFrameCallback((_) async {
    final context = binding.renderViewElement;
    if (context != null) {
      for (final asset in _allAsset) {
        await precacheImage(AssetImage(asset), context);
      }
    }
  });
  // runApp(...) goes here as usual.
}
UPDATE: after some research I've figured out that the previous version of my answer was not correct. Here is the relevant one (you can see the old one in the edit history).
You do not need to use the same ImageProvider instance for images to be precached. So you can run precacheImage() at init time and then create another image with the same path, and it will be obtained from the cache (provided the cache was not cleared one way or another).
Internally, precacheImage() uses ImageProvider.obtainKey(), which is a pair of (imagePath, scale), as the key to store the image in the in-memory cache. So as long as you are using the same key, it does not matter which instance of ImageProvider/Image you are using.
For further insight you can inspect the ImageCache documentation. Specifically, take a look at the putIfAbsent() method, as it is the main caching endpoint. To understand how images generate their key (at which they are stored in the ImageCache), start with the ImageProvider class and then look into its implementations.
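To illustrate the point, here is a minimal sketch (the asset path 'images/logo.png' and the widget names are made up): precache once in any stateful widget, and every later Image.asset with the same path is served from the same ImageCache entry.
import 'package:flutter/widgets.dart';

class WarmUp extends StatefulWidget {
  const WarmUp({Key? key, required this.child}) : super(key: key);
  final Widget child;

  @override
  State<WarmUp> createState() => _WarmUpState();
}

class _WarmUpState extends State<WarmUp> {
  @override
  void didChangeDependencies() {
    super.didChangeDependencies();
    // Stores the decoded image in the global ImageCache, keyed by (asset path, scale).
    precacheImage(const AssetImage('images/logo.png'), context);
  }

  @override
  Widget build(BuildContext context) => widget.child;
}

// Elsewhere - even in a different, stateless widget - this resolves to the same cache entry:
class Logo extends StatelessWidget {
  const Logo({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) => Image.asset('images/logo.png');
}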

Using ActionScript 3 with Flash Builder 4.7, I am not able to load an image with dimensions 16000 x 16000

I found the following problem in ActionScript 3.
Using ActionScript 3 with Flash Builder 4.7, when I try to load an image with dimensions 16000 x 16000 using the code below, it crashes my app and cannot show the image.
The image size is 4.6 MB and the image dimensions are 16000 x 16000 (width x height).
When I try another image with dimensions 10000 x 3000 (width x height), it works.
var mapLoader:Loader = new Loader();
var loaderInfo:LoaderInfo = mapLoader.contentLoaderInfo;
loaderInfo.addEventListener(Event.COMPLETE, function(event:Event):void
{
    var image:Image = new Image();
    image.source = mapLoader.content;
    image.width = image.source.width * 0.6;
    image.height = image.source.height * 0.6;
    image.smooth = true;
});
loaderInfo.addEventListener(IOErrorEvent.IO_ERROR, function(e:IOErrorEvent):void
{
    //some code
});
mapLoader.load(new URLRequest(mapSrc));
Please help me and thanks in advance
Before Flash Player 11, there was a limitation on the size of any loaded image. As of Flash Player 11, this limitation has been removed, and the maximum size depends on the operating system.
Considering that Flash internally handles images as bitmaps, your 16000 x 16000 px image requires around 1 GB of RAM by itself (16000 × 16000 pixels × 4 bytes per pixel ≈ 1.02 GB). This may be more than what your app is allowed (or even what your system is capable of).
Since you seem to be making an app, I would recommend using a tile system: the full map is cut down into smaller images, which are then placed side by side. Using this system, your app can load and show only the required tiles, strongly reducing the memory required as well as the necessary bandwidth. If you don't want to show blank spaces while the tiles are loading, you can add a low-res image of the full map under the tiles, so that the user sees a blurred version of that section before the app finishes loading the corresponding tile and shows the high-res version.
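A rough sketch of that idea (the tile size, the 2x2 block of tiles and the tile URL pattern are all assumptions; a real app would compute which tiles intersect the current viewport):
import flash.display.Loader;
import flash.display.Sprite;
import flash.events.Event;
import flash.net.URLRequest;

var tileSize:int = 1000;            // assumed: the map was pre-cut into 1000x1000 px tiles
var map:Sprite = new Sprite();
addChild(map);

function loadTile(col:int, row:int):void
{
    var tileLoader:Loader = new Loader();
    tileLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, function(e:Event):void
    {
        tileLoader.x = col * tileSize;   // place the tile at its grid position
        tileLoader.y = row * tileSize;
        map.addChild(tileLoader);
    });
    tileLoader.load(new URLRequest("tiles/tile_" + col + "_" + row + ".png"));
}

// Only request the tiles that are currently visible, e.g. a 2x2 block in the top-left corner:
loadTile(0, 0);
loadTile(1, 0);
loadTile(0, 1);
loadTile(1, 1);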

Does Qt only load widgets that fit on the window?

I was wondering why image loading in Qt appears to be so much faster than in a game I'm working on.
I've created a simple test app that loads 70 500x500 PNG images using QPixmap and then displays them in 70 QLabels in a QVBoxLayout. It opens nearly instantaneously, while my game takes one or two seconds to load these using libpng.
Not all labels are visible in the window - only two in fact - so I'm wondering: Does Qt perhaps only load images that are actually used and visible on the screen?
No, they are loaded as soon as the appropriate QPixmap constructor is called. 70 500x500 PNGs is not so much that it should take several seconds to load; try profiling your algorithms.
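If it helps, a quick way to see where the time goes is to time the constructors directly; the image paths here are hypothetical:
#include <QApplication>
#include <QDebug>
#include <QElapsedTimer>
#include <QPixmap>
#include <QString>
#include <QVector>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);   // QPixmap needs a Q(Gui)Application to exist

    QElapsedTimer timer;
    timer.start();

    // QPixmap decodes the PNG as soon as it is constructed, so this loop measures the real load cost.
    QVector<QPixmap> pixmaps;
    for (int i = 0; i < 70; ++i)
        pixmaps.append(QPixmap(QString("images/img_%1.png").arg(i)));

    qDebug() << "Loaded" << pixmaps.size() << "pixmaps in" << timer.elapsed() << "ms";
    return 0;
}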

Displaying full-sized camera raw files in OSX

This has been driving me mad for months: I have a little app to preview camera raw images. As the files in question can be quite big and are stored on a slow network drive, I wanted to offer the user a chance to stop the loading of the image.
Handily I found this thread:
Cancel NSData initWithContentsOfURL in NSOperation
and am using Nick's great convenience method to cache the data and be able to issue a cancel request halfway through.
Anyway, once I have the data I use:
NSImage *sourceImage = [[NSImage alloc]initWithData:data];
The problem comes when looking at Nikon .NEF files; sourceImage returns only a thumbnail and not the full size. Displaying Canon .CR2 files and, in fact, any other .TIFFs and .JPEGs seems fine, and sourceImage is the expected size. I've checked the amount of data that is being loaded (with NSLog and [data length]) and it does seem that all of the Nikon file's 12 MB is there for the -initWithData:.
If I use
NSImage *sourceImage = [[NSImage alloc]initWithContentsOfURL:myNEFURL];
then I get the full sized image of the Nikon files but of course the app blocks.
So, after poking around for what is beginning to feel like my entire life, I think the problem is related to the Nikon metadata stating that the file's DPI is 300, whereas Canon et al. use 72.
I hoped a solution would be to lazily access the file with:
NSImage *tempImg = [[NSImage alloc] initByReferencingURL:myNEFURL];
and having seen similar postings here and elsewhere I found a common possible answer of simply
[sourceImage setSize:tempImg.size];
but of course this just resizes the tiny thumbnail up to 3000x2000 or thereabouts.
I've been messing with the following hoping that they would provide a way to get the big picture from the .NEF:
CGImageSourceRef isr = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
CGImageRef isrRef = CGImageSourceCreateImageAtIndex(isr, 0, NULL);
and
NSBitmapImageRep *bitMapIR = [[NSBitmapImageRep alloc] initWithData:data];
But checking the sizes on these shows similar thumbnail widths and heights. In fact, isrRef returns an even smaller thumbnail, one that is 4.2 times smaller. It is perhaps worth noting that 300 / 72 ≈ 4.2, so isrRef is taking account of the DPI on an image where the DPI (possibly) has already been observed.
Please! Can someone [nicely] put me out of my misery and help me get the full-sized image from the loaded data?! Currently, I'm special-casing the NEF files with a case-insensitive search on the file extension and then loading the URL with the blocking methods. I have to take the hit of the app blocking, and the search can't be fool-proof in the long run.
As an aside: is this actually a bug in the OS? It does seem like NSImage's -initWithData: and -initWithContentsOfURL: methods use different engines to actually render the image. Would it not be reasonable to assume that -initWithContentsOfURL: simply loads the data, which then gets rendered just as though it had been presented to the class with -initWithData:?
It's a bug - confirmed when I did a DTS incident. Apparently I need to file a bug report. Currently the only way is to use the NSURL methods. Instead of checking the file extension, though, I should probably traverse the metadata dictionaries and check the manufacturer's entry for "Nikon".
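Something along these lines should work for that check, using ImageIO on the data that is already loaded; treat the exact key path as an assumption, it is simply where TIFF-based raw files usually report the camera make:
#import <ImageIO/ImageIO.h>

CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
NSDictionary *props = source ? CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(source, 0, NULL)) : nil;
NSString *make = props[(__bridge NSString *)kCGImagePropertyTIFFDictionary][(__bridge NSString *)kCGImagePropertyTIFFMake];
BOOL isNikon = (make && [make rangeOfString:@"NIKON" options:NSCaseInsensitiveSearch].location != NSNotFound);
if (source) CFRelease(source);
// When isNikon is YES, fall back to the blocking -initWithContentsOfURL: path for the full-size image.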

High-resolution icon for file in Mac OS X?

I am looking for a method exactly like -[NSWorkspace iconForFile:] but which returns the icon in a higher resolution if possible. In particular, I have an app which makes use of QuickLook to display previews of files, and I'd like it to fall back to the file icon if no quick look plugin is available. Using the iconForFile: method, however, yields a small 32x32 icon. Is there a better method around? One that returns an NSImage or CGImageRef is preferred, but less accessible methods might be fine too.
The returned image of -[NSWorkspace iconForFile:] contains multiple representations, including higher resolution ones.
If you try drawing it at 512x512 it will automatically pick the appropriate representation.
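In other words, something like this (a sketch meant to run inside a view's -drawRect:; the file path and the 512x512 rect are arbitrary):
NSImage *icon = [[NSWorkspace sharedWorkspace] iconForFile:@"/Applications/Safari.app"];
// Drawing into a large rect lets NSImage choose its highest-resolution representation.
[icon drawInRect:NSMakeRect(0, 0, 512, 512)
        fromRect:NSZeroRect
       operation:NSCompositeSourceOver
        fraction:1.0];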
Here is the way to make the icon bigger:
NSImage *icon = [[NSWorkspace sharedWorkspace] iconForFile:yourPath];
[icon setSize:NSMakeSize(64, 64)];
That's it. Good luck!
