I have two NSViews: the first is the background (a transparent black circle), and the second contains some labels.
Now the problem: when I run the app, the background layer sits above the label layer, so if I set the background view's alphaValue to full opacity, the labels become invisible.
The strange thing is that if I run Debug View Hierarchy, the layers seem to be ordered correctly!
I hope it's just a bug in Yosemite, but I'm not sure, so I decided to ask here.
Thanks for the help, and sorry for my English!
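For reference, a simplified sketch of the setup (placeholder names, not my actual code); I would expect addSubview(_:positioned:relativeTo:) to pin the label view above the background:

    import Cocoa

    // A simplified sketch of the setup (placeholder names, not the actual code).
    let container = NSView(frame: NSRect(x: 0, y: 0, width: 200, height: 200))
    container.wantsLayer = true

    let background = NSView(frame: container.bounds)      // the transparent black circle
    background.wantsLayer = true
    background.layer?.backgroundColor = NSColor.black.withAlphaComponent(0.5).cgColor
    background.layer?.cornerRadius = container.bounds.width / 2

    let labelView = NSView(frame: container.bounds)       // holds the labels

    container.addSubview(background)
    // Explicitly pin the label view above the background in the z-order:
    container.addSubview(labelView, positioned: .above, relativeTo: background)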
I've been working hard, searching the internet about this problem for 3 days, and I'm now running out of resources.
I'm currently porting an iOS app to macOS (deployment target 10.11). The problem:
I have a view hierarchy as below:
NSScrollView
    documentView
        grouping view
            tiling view one
                array of NSImageViews (each one being a tile)
            tiling view two
                array of NSImageViews (each one being a tile)
The two tiling views overlap completely; depending on the UI state, one may be hidden, or the second may have its opacity set below 1.0 to blend the two tiled views.
Because of the opacity requirement, as well as performance, the views are CALayer-backed. This is done from IB, where the top NSScrollView has Core Animation checked; thereby, the whole view tree is (implicitly) layer-backed.
It works as expected: scrolling, magnifying, etc.
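In code, the relevant part of the setup amounts to something like this (a sketch; the real thing is wired up in IB, and the names are placeholders):

    import Cocoa

    // Rough sketch of the setup described above (placeholder names; the real
    // configuration is done in IB). The top scroll view is layer-backed, so
    // the whole tree becomes implicitly layer-backed.
    func configureBlending(scrollView: NSScrollView,
                           tilingViewTwo: NSView,
                           blend: CGFloat?) {
        scrollView.wantsLayer = true            // the "Core Animation" checkbox in IB
        if let blend = blend {
            tilingViewTwo.isHidden = false
            tilingViewTwo.alphaValue = blend    // below 1.0 blends the two tiled views
        } else {
            tilingViewTwo.isHidden = true       // show only the first tiling view
        }
    }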
Then I need to make an image out of the document view to generate an SCNMaterial's content (3D view).
On iOS, renderInContext on the documentView's layer works as expected and allows an image to be created.
On AppKit, the context stays transparent, and so is the image: a valid object, but as if filled with clearColor.
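For reference, this is my reconstruction of the failing AppKit path (a sketch, not the exact code):

    import Cocoa

    // Reconstruction of the failing path: render the document view's layer
    // into the current context, as the iOS version does with renderInContext.
    func snapshot(of documentView: NSView) -> NSImage {
        let image = NSImage(size: documentView.bounds.size)
        image.lockFocus()
        if let ctx = NSGraphicsContext.current?.cgContext {
            documentView.layer?.render(in: ctx)   // comes back transparent on AppKit
        }
        image.unlockFocus()
        return image
    }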
If documentView.canDrawSubviewsIntoLayer is set at creation, the view tree renders OK. This can't be the solution, though, since it prevents the opacity setting from working.
Even when one tiling view is hidden (so there is no opacity compositing), rendering fails.
I read that some kinds of views are not rendered; I don't use them. There are no filters and no masks, besides a default masksToBounds setting on the whole view tree. I don't know why or where it is set. I tried to unset it on all the views at creation, with no success: it gets set again somehow on the grouping view below the documentView. This may be the problem, but why is this property out of my control?
The alternative way to render a view tree, bitmapImageRepForCachingDisplayInRect: / cacheDisplayInRect:toBitmapImageRep:, behaves the same: OK with canDrawSubviewsIntoLayer, KO otherwise. Apple's code examples for making a texture out of a view use one of these two methods.
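That alternative path, for reference, looks roughly like this (a sketch):

    import Cocoa

    // Sketch of the alternative path; it behaves the same way (fails unless
    // canDrawSubviewsIntoLayer is set).
    func cachedSnapshot(of view: NSView) -> NSImage? {
        guard let rep = view.bitmapImageRepForCachingDisplay(in: view.bounds) else {
            return nil
        }
        view.cacheDisplay(in: view.bounds, to: rep)
        let image = NSImage(size: view.bounds.size)
        image.addRepresentation(rep)
        return image
    }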
There are plenty of posts, mainly on SO, complaining about CALayer's renderInContext, along with code to custom-render a layer tree. Nevertheless, most of them are quite old, and by now there must be a simple, standard way to achieve this.
Edit: among other attempts, I tried setting wantsLayer on each view, with no success.
Well, as often happens when you post a request for help, you end up finding the answer yourself… Here it is:
Given the view tree listed in the question, I managed to render it by setting canDrawSubviewsIntoLayer on each tiling view. This way, the opacity compositing between the layers works, AND the views are rendered.
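In code, the fix boils down to this (a sketch; the names are placeholders for the two tiling views):

    import Cocoa

    // Sketch of the fix: flatten each tiling view's tiles into that view's own
    // layer, while the two tiling views stay separate layers so their
    // alphaValue can still composite against each other.
    func prepareForRendering(tilingViews: [NSView]) {
        for tilingView in tilingViews {
            tilingView.canDrawSubviewsIntoLayer = true
        }
    }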
I'm posting this as an answer because it solves the problem.
As to WHY this works, here is my guess, though it is not an authoritative answer: each NSImageView tile (the subviews of a tiling view) is positioned by its frame origin. That is not a transform on the layer but a position of the frame in the coordinates of the tiling view. This is a difference from the iOS version of the code, where the tiles are positioned by a transform. I'm going to test further and see whether using a transform rather than a frame origin makes a difference to renderInContext.
Edit: after more testing, it appears that the iOS version applies both a transform AND an offset to each tile.
So the only clue is that the layer system gets lost when rendering in context on OS X when some subviews have an offset?
Summary:
The key point of the solution is finding the right view on which to set canDrawSubviewsIntoLayer.
Good afternoon, helpful people!
I'm building a Mac app that displays images. The app will be in fullscreen mode 100% of the time it is running.
My issue is that the images I am displaying do not fill the entire screen, which leaves a grayish background showing. Is there any way to change this background color to black? Or maybe it's the NSImageView that needs its background color changed?
I don't see anywhere in Xcode's Attributes Inspector to change the color.
Thanks!
The solution to this, in my case, is to add an NSBox in the storyboard and set its constraints so that it ALWAYS fills the view. Once you have done this, go to the Attributes Inspector and change its Fill Color to your desired color.
The important part here is the constraints: set the Top, Bottom, Left, and Right space to 0. Hope that makes sense; I'm new to constraints.
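If you'd rather do it in code than in the storyboard, the equivalent is roughly this (a sketch, untested against your project):

    import Cocoa

    // Programmatic equivalent of the storyboard setup (a sketch): a custom
    // NSBox pinned to all four edges, filled with black, behind everything else.
    func addBlackBackground(to view: NSView) {
        let box = NSBox()
        box.boxType = .custom
        box.borderWidth = 0
        box.fillColor = .black
        box.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(box, positioned: .below, relativeTo: nil)
        NSLayoutConstraint.activate([
            box.topAnchor.constraint(equalTo: view.topAnchor),
            box.bottomAnchor.constraint(equalTo: view.bottomAnchor),
            box.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            box.trailingAnchor.constraint(equalTo: view.trailingAnchor),
        ])
    }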
I have an application where I drag a circular UIImageView on top of another UIImageView. Since a circle can never fill a square, the white borders of the image inevitably show, so I wonder if there is a way to remove or hide them (not by making them the same color).
I expect you are looking for the image masking feature. Have a look at this tutorial and this one too. Both of them use Core Graphics functions and are really quick in response.
Please note that it is not the Swift version, but it will at least give you a starting point.
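Since those tutorials are Objective-C, here is a rough Swift sketch of the same masking idea (using a circular CAShapeLayer mask rather than the tutorials' exact CGImage masking):

    import UIKit

    // Rough Swift sketch of the masking idea: clip the square image view to a
    // circle with a CAShapeLayer mask, so the white corners are never drawn.
    func applyCircularMask(to imageView: UIImageView) {
        let mask = CAShapeLayer()
        mask.path = UIBezierPath(ovalIn: imageView.bounds).cgPath
        imageView.layer.mask = mask
    }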
Try setting image.layer.borderWidth = 0
(this is the Swift version)
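In context, that looks something like this (assuming image is the circular UIImageView; the clipping lines are my addition, not part of the one-line answer):

    import UIKit

    // image is assumed to be the circular UIImageView being dragged.
    image.layer.borderWidth = 0
    // My addition, not part of the one-line answer: if the white corners come
    // from the square bitmap itself, clipping the layer to a circle hides them.
    image.layer.cornerRadius = image.bounds.width / 2
    image.clipsToBounds = true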
So I'm still getting accustomed to the world of Auto Layout in iOS 6, and it's been a fun (or in plain English -- tough) migration coming over from springs & struts.
I have a UIImageView that's the background for a game I'm working on. Here's what a regular 3.5" Retina Display looks like in Interface Builder:
but if I change the "Size" pop-up in the Simulated Metrics field for the content view to "Retina 4 Full Screen", here's what I see:
And you can see the ugly black bar appearing along the bottom edge of the simulated iPhone 5 screen. This same ugly black bar makes it over to the compiled app running in the iPhone 5 simulator.
Are there any attributes or constraints I can apply via Interface Builder to get the UIImageView's frame to size correctly for the appropriate iPhone device screen size?
Or do I have to enter constraints via code? (ugh)
I've watched the three WWDC videos, and if the engineers covered the topic of sizing a view to fit its parent, they must have glossed over it really fast, because I haven't yet found or heard a decent method to get either UIImageView or NSImageView to size correctly to its parent view on both the iOS and macOS sides.
From what you're showing here, I think that what you want to do is drag the image to meet the bottom of the superview so that it takes up all of the space you want. On the bottom right in the IB view you will see a small, grey pill-shaped set of 3 buttons. Click the center one (once you have your imageView selected). It's the one that looks a little like this: |--|. It will bring up a list of constraints. Once you have done that, select "Bottom Space to Superview". You should then be able to switch between phones and have the image resize automatically.
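If you do end up adding it in code instead, the equivalent of "Bottom Space to Superview" is short (a sketch using the iOS 6-era NSLayoutConstraint API; imageView and view stand in for your outlet and the controller's root view):

    import UIKit

    // Code equivalent of the "Bottom Space to Superview" constraint (a sketch).
    func pinImageViewToBottom(imageView: UIImageView, in view: UIView) {
        imageView.translatesAutoresizingMaskIntoConstraints = false
        view.addConstraint(NSLayoutConstraint(
            item: imageView, attribute: .bottom, relatedBy: .equal,
            toItem: view, attribute: .bottom, multiplier: 1, constant: 0))
    }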
I have an NSImageView in a view that utilizes Core Animation. Prior to using Core Animation the image looked fine, but now it's blurry and low quality. If I let the NSImageView have a bezel border, the issue goes away, but I need it to have no border. Has this happened to anyone else?
Image in the background with no border; the same image in front with a border.
Thanks
EDIT: I forgot to mention that the image is an icon file (ICNS), so it comes in various sizes. The bordered view loads the correct size, while the transparent one loads the smallest and stretches it.
Although not the way I wanted to, I managed to create a fix for the issue. The problem seemed to be the way NSImageView was drawing the image, so I created a custom NSView subclass with support for the same bindings I used in my original image view. I'm not sure why the blurriness happened in the first place, but drawing the image by hand in an NSView seems to do the trick.
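The replacement can be as simple as this (a sketch of the approach, minus the bindings support):

    import Cocoa

    // Sketch of the workaround: a plain NSView that draws the NSImage itself,
    // letting NSImage pick the best ICNS representation for the target size.
    class ImageDrawingView: NSView {
        var image: NSImage? {
            didSet { needsDisplay = true }
        }

        override func draw(_ dirtyRect: NSRect) {
            image?.draw(in: bounds,
                        from: .zero,            // a zero rect means "whole image"
                        operation: .sourceOver,
                        fraction: 1.0)
        }
    }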
Your image may be drawing in a non-pixel-aligned way. Have you tried shifting it by a half pixel?
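A quick way to test that theory (a sketch): snap the view's frame to the backing store's pixel grid with backingAlignedRect(_:options:).

    import Cocoa

    // Sketch: snap the image view's frame onto whole backing-store pixels so
    // the bitmap isn't resampled across pixel boundaries.
    func pixelAlign(_ imageView: NSImageView) {
        guard let superview = imageView.superview else { return }
        imageView.frame = superview.backingAlignedRect(imageView.frame,
                                                       options: .alignAllEdgesNearest)
    }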
Apple has a good demonstration of this in the BlurryView app in their "Cocoa Tips and Tricks" sample code.