FireBreath plugin to draw an image in the browser - macOS

I'm developing a plugin that implements an image-viewer application using FireBreath on Mac OS X; the image is located in the local filesystem. Help me implement the getDrawingPrimitive function. I have the following code snippet:
`FB::PluginWindowMac *wnd = dynamic_cast<FB::PluginWindowMac*>(win);
wnd->getDrawingPrimitive();
/* code related to OpenGL */
CALayer* layer = [CALayer new];
[layer setContents:(id)[ImageHandler setImageWithURL:@"somePath.jpg"]];`
setContents: expects an argument of type CGImageRef (bridged to id). Is this the right way to set the image on a CALayer object?

Calling wnd->StartAutoInvalidate(interval) causes your onDraw function to be invoked periodically, depending on the drawing model and event model selected. In onDraw you obtain the window's context, of type CGContextRef, and can use it to draw whatever is required. To draw an image, convert it to a CGImageRef and draw it into the context with CGContextDrawImage.
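The drawing step above can be sketched as follows — shown in Swift for brevity (FireBreath plugins are C++, but the CoreGraphics calls are the same C API); the helper name and path parameter are assumptions:

```swift
import Foundation
import CoreGraphics
import ImageIO

// Hypothetical helper: load a CGImage from a file on disk and draw it
// into the CGContextRef obtained inside onDraw.
func drawImage(atPath path: String, in context: CGContext, bounds: CGRect) {
    let url = URL(fileURLWithPath: path) as CFURL
    guard let source = CGImageSourceCreateWithURL(url, nil),
          let image = CGImageSourceCreateImageAtIndex(source, 0, nil) else {
        return // missing file or undecodable image
    }
    // Equivalent of CGContextDrawImage(context, bounds, image) in C.
    context.draw(image, in: bounds)
}
```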

Related

How to override UIImageView's image property in an IBDesignable subclass?

I made a subclass of UIImageView which uses Core Graphics to generate a new image that is cropped to a circle with an optional border. It works fine when I run the app. In Interface Builder, however, the generated image renders properly, but it does so underneath the "no image set" placeholder for a UIImageView. Also, the image property shows up twice in IB, and the new image is only generated if I set the overridden field. If I set the image in the regular UIImageView field, it just acts as though it isn't subclassed. Is this just a bug in IB, or is there a fix?
@IBInspectable override var image: UIImage? {
    didSet {
        // Make the image a circle
        makeCircleImage()
    }
}
We are stuck with Apple's IB implementation, I'm afraid. I recommend finding a workaround rather than overriding image. You can implement prepareForInterfaceBuilder and awakeFromNib and do it there, perform the operation once in the draw or layout methods, or add a boolean IBInspectable var that performs the operation when set.
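A minimal sketch of that last suggestion — a boolean @IBInspectable that applies the effect without touching the image property. Class and property names are hypothetical, and a layer mask stands in for the questioner's Core Graphics cropping code:

```swift
import UIKit

@IBDesignable
class CircleImageView: UIImageView {
    // Toggling this in IB re-applies the circle effect; no need to
    // override the `image` property itself.
    @IBInspectable var isCircular: Bool = true {
        didSet { applyCircleIfNeeded() }
    }

    override func awakeFromNib() {
        super.awakeFromNib()
        applyCircleIfNeeded()
    }

    override func prepareForInterfaceBuilder() {
        super.prepareForInterfaceBuilder()
        applyCircleIfNeeded()
    }

    private func applyCircleIfNeeded() {
        // Mask the view to a circle; stands in for the original
        // Core Graphics image-cropping implementation.
        layer.cornerRadius = isCircular ? min(bounds.width, bounds.height) / 2 : 0
        layer.masksToBounds = isCircular
    }
}
```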

NSImage drawInRect and TemplateImages

I have an NSImage that is a template image (that is, [NSImage isTemplate] returns YES).
When I use it inside an NSImageView, it is drawn correctly as a template image.
However, if I draw it manually using drawInRect:fromRect:operation:fraction:, it is drawn as a flat black image.
How can I draw an NSImage manually, and still get the 'template' effect?
The special drawing of template images is part of CoreUI and not public.
You can imitate the effect with Quartz though. Details and code can be found in this answer here on SO:
https://stackoverflow.com/a/7138497/100848
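The gist of the linked answer can be sketched like this in Swift. The tint color is an assumption, and CoreUI's real template rendering adds shading effects this simple recolor omits:

```swift
import AppKit

// Build a recolored copy of a template image by drawing it and then
// filling with .sourceAtop, which keeps the image's alpha but swaps
// its color — a rough imitation of what NSImageView does for templates.
func tintedTemplate(_ image: NSImage, tint: NSColor) -> NSImage {
    return NSImage(size: image.size, flipped: false) { rect in
        image.draw(in: rect)
        tint.set()
        rect.fill(using: .sourceAtop)
        return true
    }
}
```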

CGAffineTransformMakeScale cocoa?

How can I use CGAffineTransformMakeScale in Cocoa? On iOS I do this:
something.transform = CGAffineTransformMakeScale(2, 2);
But how can I do it on the Mac?
CGAffineTransform, including all the related helper functions, works the same on Mac OS X as on iOS.
You do need to be linking against the Core Graphics or Application Services framework, and importing the header for whichever one you link against.
An NSView doesn't have a transform property like a UIView has, so if something was a UIView in your example, you will have to send your NSView a scaleUnitSquareToSize: message instead.
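In Swift that looks roughly like this (a standalone helper; the function name is an assumption):

```swift
import AppKit

// Scale an NSView's coordinate system by 2x — the Cocoa counterpart
// of setting a UIView's transform on iOS.
func doubleScale(_ view: NSView) {
    view.scaleUnitSquare(to: NSSize(width: 2, height: 2))
    view.needsDisplay = true // redraw with the new scale
}
```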
Your question is unclear. The CGAffineTransformMakeScale function is available on the Mac and it works in exactly the same way as on iOS. It is part of the Application Services framework, so you will need to add that framework to your project and import it with:
#import <ApplicationServices/ApplicationServices.h>
If on the other hand you're referring to view transforms, then on the Mac, NSView objects are not layer-backed by default and do not have a transform property.
If you make the view layer-backed then you can access the view's CALayer object via the layer property of the view, and you can then apply a transform to that:
aView.layer.transform = CATransform3DMakeScale(2.0, 2.0, 1.0);
Note that both iOS and Mac OS X use CATransform3D structures for their layer transform property. If you want to set the layer transform to a CGAffineTransform then you need to use the -setAffineTransform: method of CALayer.
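Put together, a sketch of the layer-backed approach in Swift (`aView` is an assumed view; note that Swift renames CGAffineTransformMakeScale to a CGAffineTransform initializer):

```swift
import AppKit
import QuartzCore

// Apply a 2x scale to a layer-backed NSView.
func scaleLayer(of aView: NSView) {
    aView.wantsLayer = true // make the view layer-backed
    aView.layer?.transform = CATransform3DMakeScale(2.0, 2.0, 1.0)
    // Or, starting from a CGAffineTransform:
    aView.layer?.setAffineTransform(CGAffineTransform(scaleX: 2, y: 2))
}
```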

Converting vector image to Quartz 2D code

Is it possible to convert a vector image into Quartz 2D code (Mac) so that the image can be drawn programmatically?
Not easily; you would have to write all the code yourself to do this. You might like to have a look at the Opacity image editor, which allows you to generate images and export them as Quartz or Cocoa drawing code.
What kind of vector image?
NSImage loads PDFs the same way it loads bitmaps.
NSImage is Quartz 2D drawing, but if you meant that you need a CGImage, NSImage in 10.6 has a method for getting one. However, CGImage is explicitly bitmap based, unlike NSImage. The parameters you pass to -[NSImage CGImageForRect:context:hints:] will determine how the art is rasterized. It will be rasterized the same way it would be if drawing to the passed rect in the passed context.
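For example, rasterizing an NSImage (such as one loaded from a PDF) into a CGImage might look like this in Swift; passing a nil context means the rect alone determines the rasterization:

```swift
import AppKit

// Rasterize an NSImage into a CGImage at the image's natural size.
// The proposed rect (and context/hints, nil here) control how the
// vector art is rasterized.
func rasterize(_ image: NSImage) -> CGImage? {
    var rect = CGRect(origin: .zero, size: image.size)
    return image.cgImage(forProposedRect: &rect, context: nil, hints: nil)
}
```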
You can use Vector code http://www.vectorcodeapp.com to generate Core Graphics code, which you may use programmatically, or even use to generate PostScript.

Getting a CGImageRef from IKImageBrowserView

I feed the image browser view with image filenames and it manages loading them.
Is there a way to retrieve the CGImageRef of those images from the browser after it loads them? I'd like to do some Core Animation with them when the user clicks on an image.
Probably the simplest way to do this is to use the NSView method -bitmapImageRepForCachingDisplayInRect: to build an NSBitmapImageRep for the area, then -cacheDisplayInRect:toBitmapImageRep: to draw into it, then NSBitmapImageRep's -CGImage method to get a CGImageRef.
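In Swift, those three steps fit in one small helper; the view and rect parameters are whatever area of the browser you want to capture:

```swift
import AppKit

// Snapshot part of a view (e.g. the clicked IKImageBrowserView cell)
// into a CGImage for use with Core Animation.
func snapshotCGImage(of view: NSView, rect: NSRect) -> CGImage? {
    guard let rep = view.bitmapImageRepForCachingDisplay(in: rect) else {
        return nil
    }
    view.cacheDisplay(in: rect, to: rep) // draw the view into the rep
    return rep.cgImage
}
```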
