Render off-screen WKWebView into NSImage - macos

I've tried rendering an offscreen WKWebView into an image using
func cacheDisplayInRect(rect: NSRect, toBitmapImageRep bitmapImageRep: NSBitmapImageRep)
and func drawLayer(layer: CALayer, inContext ctx: CGContext)
without success. The resulting image is always empty (white or transparent). Has anyone managed to do this on Yosemite?

You can do it using drawViewHierarchyInRect:. None of the other methods seem to work. Use it like so:
UIGraphicsBeginImageContextWithOptions(newRect.size, YES, 0);
BOOL ok = [view drawViewHierarchyInRect:newRect afterScreenUpdates:YES];
if (!ok) {
    NSLog(@"Problem with drawView...");
}
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
I'm doing the same on iOS, but unfortunately this method is slow, must be run on the main thread, and only works if afterScreenUpdates is set to YES. See this answer: How can I take a snapshot of a UIView that isn't rendered?
Also, from what I can see, there's no way to tell whether any aspect of the webpage needs redrawing.
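For reference, on macOS 10.13 and later WKWebView has a built-in snapshot API, takeSnapshot(with:completionHandler:), which sidesteps these workarounds (it isn't available on Yosemite, which the question targets). A minimal Swift sketch, assuming webView is an off-screen WKWebView that has finished loading:

// Sketch only: `webView` is assumed to be loaded; the API requires macOS 10.13+.
let config = WKSnapshotConfiguration()
config.rect = webView.bounds

webView.takeSnapshot(with: config) { image, error in
    if let image = image {
        // `image` is an NSImage of the rendered page
        print("Snapshot size: \(image.size)")
    } else if let error = error {
        print("Snapshot failed: \(error)")
    }
}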

Related

Creating a png image from SpriteKit Node/SpriteNode in Xcode for Mac

Currently I'm making an isometric map editor (in Swift) where each map tile is added to an SKNode called mapLayer. Is it possible to produce a PNG image from this mapLayer once I have finished designing a map?
This isn't in Swift, but it should give you a good idea of how to do it.
You can capture your screen to a UIImage by doing this:
CGRect bounds = self.scene.view.bounds;
UIGraphicsBeginImageContextWithOptions(bounds.size, NO, [UIScreen mainScreen].scale);
[self drawViewHierarchyInRect:bounds afterScreenUpdates:YES];
UIImage* screenshotImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Then create a PNG image from the UIImage (and write it to disk) like this:
// Create paths to output images
NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Test.png"];
NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Test.jpg"];
Source here.
// Write a UIImage to JPEG with minimum compression (best quality)
// The value 'image' must be a UIImage object
// The value '1.0' represents image compression quality as value from 0.0 to 1.0
[UIImageJPEGRepresentation(image, 1.0) writeToFile:jpgPath atomically:YES];
// Write image to PNG
[UIImagePNGRepresentation(image) writeToFile:pngPath atomically:YES];
// Let's check to see if files were successfully written...
// Create file manager
NSError *error;
NSFileManager *fileMgr = [NSFileManager defaultManager];
// Point to Document directory
NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
// Write out the contents of home directory to console
NSLog(@"Documents directory: %@", [fileMgr contentsOfDirectoryAtPath:documentsDirectory error:&error]);
Source here.
Here's an example of how to write the contents of the view to a PNG file. First, define a UIView extension that captures the contents of the view and converts it to a UIImage.
extension UIView {
    func screenGrab() -> UIImage {
        // Uncomment this (and comment out the next statement) for retina screen capture
        // UIGraphicsBeginImageContextWithOptions(bounds.size, false, UIScreen.mainScreen().scale)
        UIGraphicsBeginImageContextWithOptions(bounds.size, false, 1.0)
        drawViewHierarchyInRect(self.bounds, afterScreenUpdates: true)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
}
Then, define a UIImage extension that writes a UIImage to a file.
extension UIImage {
    func writeToFile(fileName: String) {
        let path = NSHomeDirectory().stringByAppendingPathComponent("Documents/" + fileName)
        UIImagePNGRepresentation(self).writeToFile(path, atomically: true)
    }
}
Lastly, capture and write the contents of the view to a PNG file with
self.view?.screenGrab().writeToFile("capture.png")
and here's how to determine where the PNGs will be stored:
println(NSHomeDirectory().stringByAppendingPathComponent("Documents"))
This is one of the instances where macOS and iOS behave very differently. Spoiler: the usual methods of capturing the contents of an NSView to NSImage don't work.
NSView.dataWithPDF(inside:) (wrapped in NSImage(data:)) and NSView.bitmapImageRepForCachingDisplay(_:) with cacheDisplay(in:to:) work fine for an ordinary NSView, but if you embed an SKView or apply them to the SKView directly, you'll only get a blank canvas.
The solution lies in thinking in SpriteKit terms:
An SKScene is an SKNode at heart (more precisely, an SKEffectNode subclass), and SKView has a method called texture(from: SKNode) which gives you an SKTexture. SKTexture has a cgImage() method, and there you are.
This assumes three properties in the NSViewController that presents the scene: skView: SKView, scene: SKScene?, and an NSImageView for display named snapView.
if let rendered = skView.texture(from: scene!) {
    snapView.image = NSImage(cgImage: rendered.cgImage(), size: self.skView.bounds.size)
}
(Don't force unwrap; it's only for demo purposes.)
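To get from that NSImage back to the PNG file the question asks for, one option is to go through NSBitmapImageRep. A minimal Swift sketch, where rendered is the SKTexture from above and outputURL is a hypothetical destination file URL:

// Sketch only: `outputURL` is a placeholder, e.g. a URL in the user's Documents folder.
let bitmapRep = NSBitmapImageRep(cgImage: rendered.cgImage())
if let pngData = bitmapRep.representation(using: .png, properties: [:]) {
    do {
        try pngData.write(to: outputURL)
    } catch {
        print("Failed to write PNG: \(error)")
    }
}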

UITextField resize issue

I have found an issue with UITextField. I have created a subclass of UITextField that allows the user to click on the text to start editing and then rotate and resize the text.
If you resize the text field by making the height larger, the centered text moves to the right even though the width of the text field does not increase. I have investigated this and found that UITextField has an internal view of type UIFieldEditor, which in turn has a _UIFieldEditorContentView view. UIFieldEditor seems to be a subclass of UIScrollView, and the content size of this scroll view becomes much larger than the size of the UITextField. When the text field increases its height, the scroll view's contentSize width increases. I guess this might be an internal Auto Layout issue.
I have added a demo project here that demonstrates the issue. Click on the text to start editing, then drag the resize icon so that the height increases and you will see the issue.
https://github.com/permagnus/UITextField-Resize-Issue-Demo
Screenshots from Reveal:
Incorrect size of the underlying view in the scroll view: https://github.com/permagnus/UITextField-Resize-Issue-Demo/blob/master/Screenshots/screenshot-showing-incorrect-size.png
The actual size of the UITextField: https://github.com/permagnus/UITextField-Resize-Issue-Demo/blob/master/Screenshots/screenshot-showing-textfield-size.png
Any ideas on how to fix this issue?
I found two ways to fix the issue.
The problem lies within the underlying scroll view. One way is to find the scroll view, see how far it is offset, and compensate for the wrong offset:
- (CGRect)editingRectForBounds:(CGRect)bounds
{
    CGRect editRect = [super editingRectForBounds:bounds];
    UIScrollView *scrollView = [self findScrollViewFromView:self];
    if (scrollView)
    {
        float diff = (self.bounds.size.width - scrollView.contentSize.width) / 2;
        return CGRectInset(editRect, diff, 0);
    }
    return editRect;
}

- (UIScrollView *)findScrollViewFromView:(UIView *)view
{
    if ([view isKindOfClass:[UIScrollView class]])
    {
        return (UIScrollView *)view;
    }
    for (UIView *v in view.subviews)
    {
        UIScrollView *scrollView = [self findScrollViewFromView:v];
        if (scrollView)
        {
            return scrollView;
        }
    }
    return nil;
}
I also contacted Apple Support to get their take on the problem. They confirmed that this is probably a bug, and I have submitted it as one. Their suggestions are the following:
The field editor is only activated for the current editing session, so you can end the editing session of the text field (by calling resignFirstResponder) before resizing it (in touchesBegan... ?). In your scenario, I guess keeping the editing session might not be necessary.
If you really need to keep the editing session, one solution (ugly) I can see is to reset the text and make sure the cursor is at the beginning of the document:
self.text = [self.text copy];
UITextPosition *beginningOfDocument = [self positionFromPosition:self.beginningOfDocument offset:0];
self.selectedTextRange = [self textRangeFromPosition:self.beginningOfDocument toPosition:beginningOfDocument];
Both of these solutions are ugly hacks and are not recommended, so use them at your own risk :)
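For what it's worth, here is a minimal Swift sketch of the first suggestion (ending the editing session before the resize). The callback name is hypothetical and stands in for wherever the resize gesture starts:

// Sketch only: inside the custom UITextField subclass, called from a hypothetical
// resize-handle touch handler before the frame is changed.
func resizeHandleDragBegan() {
    if isFirstResponder {
        resignFirstResponder()   // end the editing session so the field editor goes away
    }
    // ... apply the new height/frame here ...
}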

CCLayer to UIImage - Anti-aliasing?

When I grab a snapshot of a CCLayer as a UIImage with the help of CCRenderTexture, it seems like I'm losing the anti-aliasing, resulting in the output image looking slightly different from what the screen actually looks like.
Is there a way of getting an output image that corresponds more exactly to what is shown on the screen?
This is how I'm getting my UIImage:
- (UIImage *)layerRepresentation {
    CCLayer *layer1 = self;
    CCRenderTexture *renderer01 = [CCRenderTexture renderTextureWithWidth:layer1.contentSize.width height:layer1.contentSize.height];
    [renderer01 begin];
    [self visit];
    [renderer01 end];
    UIImage *image = [renderer01 getUIImage];
    return image;
}
When a CCRenderTexture is created, it sends a setAliasTexParameters message to its texture (which turns anti-aliasing off). Try
[renderer01.sprite.texture setAntiAliasTexParameters];

Image won't appear in Xcode

I'm a first-time programmer and I'm trying to make a soundboard app. It's not finished yet and it is pretty basic. I have a main menu with one button. When this button is pressed, a different background image should appear along with a back button. The image does not change. Here is the code:
- (IBAction)redgradient {
    UIImage *img = [UIImage imageNamed:@"redgradient.jpg"];
    [imageView setImage:img];
}
- (IBAction)redgradient2 {
    UIImage *img = [UIImage imageNamed:@"redgradient2.png"];
    [imageView setImage:img];
}
I have declared the IBActions in the .h file and connected an IBOutlet for the image view. Help is appreciated.
Here's a simpler way to do that; test whether this works better:
- (IBAction)redgradient {
    imageView.image = [UIImage imageNamed:@"redgradient.jpg"];
}
- (IBAction)redgradient2 {
    imageView.image = [UIImage imageNamed:@"redgradient2.png"];
}
Also check whether the image files really end in .jpg and .png respectively; a wrong extension could be the problem.
Here are a few things to check:
The image files have been added to the project.
The image names are exactly as they appear in the project (case-sensitive and with the correct extension).
The outlet connections are properly established in Interface Builder.
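One quick way to verify the first two items is to check whether imageNamed: returns nil at runtime. A minimal Swift sketch (the asset name is just the example from the question):

// Sketch only: UIImage(named:) returns nil when the asset isn't in the bundle
// or the name/extension doesn't match exactly.
if let img = UIImage(named: "redgradient.jpg") {
    imageView.image = img
} else {
    print("redgradient.jpg was not found in the app bundle")
}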

NSButton with variable size width, rounded corners

What is the best way to create an NSButton with a custom background image that can have a variable width, without making the corner bezel look stretched? I know there are convenience methods to do this with UIButton: http://jainmarket.blogspot.com/2009/04/create-uibuttonbutton-with-images.html but I haven't seen anything similar for NSButton.
I needed a custom button background; here's how I did it. I made an NSButton subclass and overrode the drawRect: method:
- (void)drawRect:(NSRect)dirtyRect
{
    // My buttons don't have a variable height, so I make sure that the height is fixed
    if (self.frame.size.height != 22) {
        self.frame = CGRectMake(self.frame.origin.x, self.frame.origin.y,
                                self.frame.size.width, 22.0f);
    }

    switch (self.state) {
        // On-state graphics
        case NSOnState:
            NSDrawThreePartImage(self.bounds,
                                 [NSImage imageNamed:@"btnmain_lb_h.png"],
                                 [NSImage imageNamed:@"btnmain_bg_h.png"],
                                 [NSImage imageNamed:@"btnmain_rb_h.png"],
                                 NO, NSCompositeSourceAtop, 1.0, NO);
            break;
        // Off-state graphics
        default:
        case NSOffState:
            NSDrawThreePartImage(self.bounds,
                                 [NSImage imageNamed:@"btnmain_lb.png"],
                                 [NSImage imageNamed:@"btnmain_bg.png"],
                                 [NSImage imageNamed:@"btnmain_rb.png"],
                                 NO, NSCompositeSourceAtop, 1.0, NO);
            break;
    }
    [super drawRect:dirtyRect];
}
Then I could place the buttons using Interface Builder, and to get the custom graphics I just had to change the class to my new subclass.
This worked perfectly fine for me:
[self.addBuddyCommitButton.cell setBezelStyle:NSRoundedBezelStyle];
NSButton doesn't have the same convenience methods for background images as UIButton (which is odd, and here's hoping Apple bridges that gap). You'll need to create a custom button by subclassing NSView and handling the variable width and corners yourself. I don't think it will be easy, but I don't think it would be terribly difficult either.
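As an aside, on OS X 10.10 and later NSImage gained capInsets and resizingMode, which give you UIImage-style resizable images with a single asset. A minimal Swift sketch of a draw override using that approach (the asset name and inset values are placeholders, not from the original answer):

// Sketch only: "btnmain_bg" and the inset values are hypothetical.
// Cap insets keep the left/right edges of the image at their natural size
// while only the middle stretches, so the corners don't look distorted.
class StretchableBackgroundButton: NSButton {
    override func draw(_ dirtyRect: NSRect) {
        if let background = NSImage(named: "btnmain_bg") {
            background.capInsets = NSEdgeInsets(top: 0, left: 6, bottom: 0, right: 6)
            background.resizingMode = .stretch
            background.draw(in: bounds)
        }
        super.draw(dirtyRect)
    }
}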
