UIKit controls visible after pushScene in cocos2d - uikit

I've got a scene with two UIKit controls (UITextViews). From this scene the user can go to another scene to check something and then come back. After pushScene, the UIKit controls are still visible on screen and the user can tap on them and edit them.
How can I get rid of them so that they are not visible after pushScene?
I don't want to remove one scene and add another, because I want to preserve whatever happened in it. That also means I can't release the controls and create new ones after the user does popScene.

When calling pushScene, set the UITextViews as hidden:
myTextView.hidden = YES;
After popping the pushed scene, simply unhide the text views again. While hidden, UIViews don't receive input events, but they remain in the view hierarchy.
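For illustration, a minimal Swift sketch of that pattern - the class and property names here are made up, and you'd call these methods around your own pushScene/popScene calls:

import UIKit

// Minimal sketch (hypothetical names): keep references to the UIKit overlays
// and toggle their visibility around pushScene/popScene.
final class EditorSceneOverlays {
    let myTextView = UITextView()
    let notesTextView = UITextView()

    // Call right before CCDirector's pushScene.
    func hideForPushedScene() {
        myTextView.isHidden = true     // stops drawing and touch handling
        notesTextView.isHidden = true  // text, selection, etc. are preserved
    }

    // Call right after popScene brings the original scene back.
    func showAfterPop() {
        myTextView.isHidden = false
        notesTextView.isHidden = false
    }
}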

Related

UIScrollView on tvOS

The question is simple: how do I enable scrolling and zooming inside a UIScrollView on tvOS?
I tried the same initializer code from iOS and returned the scroll view for the focusedView var, but nothing happens when I touch the remote.
I also tried adding a custom UIPanGestureRecognizer to the scroll view, and that actually works, but I don't want to handle the pan with custom code; I just want the same pan behavior as on iOS.
Let me know, thanks.
You can configure the scroll view's built-in pan gesture to recognize touches on the Siri Remote. It doesn't do that automatically, because normally scroll views on tvOS aren't scrolled directly by touches: they're scrolled automatically as focus moves between views within the scroll view.
If you really want the scroll view to move directly from touches, you'll need to add UITouchTypeIndirect to the allowedTouchTypes of the scroll view's panGestureRecognizer:
scrollView.panGestureRecognizer.allowedTouchTypes = @[ @(UITouchTypeIndirect) ];
You'll also need to make sure that either the scroll view itself is the focused view, or is a parent of the focused view, since all touches from the remote will start at the center of the focused view: you need to make sure the scroll view is getting hit-tested for the events to work.
Zooming won't work, because the Siri Remote can only recognize one touch at a time, so you can't do a pinch gesture on it.
Swift 4 version (from here: https://stackoverflow.com/a/41000183/945247)
scrollView.panGestureRecognizer.allowedTouchTypes = [NSNumber(value: UITouchType.indirect.rawValue)]
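For context, a minimal Swift sketch putting both pieces together, assuming the scroll view itself is what should receive focus (the subclass name and sizes are made up; newer SDKs spell the touch type UITouch.TouchType):

import UIKit

// Sketch: a scroll view that can become focused, so indirect (Siri Remote)
// touches are hit-tested against it and drive its built-in pan recognizer.
final class RemoteScrollView: UIScrollView {
    override var canBecomeFocused: Bool { return true }
}

let scrollView = RemoteScrollView(frame: CGRect(x: 0, y: 0, width: 1920, height: 1080))
scrollView.contentSize = CGSize(width: 1920, height: 4000)   // arbitrary example size
scrollView.panGestureRecognizer.allowedTouchTypes = [NSNumber(value: UITouchType.indirect.rawValue)]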

How is a subview of an NSView dynamically positioned?

I am attempting to change the coordinates of an NSButton that is contained within a parent NSView, and something is clearly not working, because the button's position does not change. Both elements are defined in a nib file, and the parent view has animation applied to it using Core Animation.
I have tried the following.
button.frame.origin.x = 500
and...
var frame: CGRect = button.frame
frame.origin.x = 500
button.frame = frame
Even with the animations disabled, I cannot seem to position the subview dynamically. Is there some feature that prevents child views from being positioned programmatically?
Please note that I am using Swift with Xcode 6.3.1.
I'm guessing you're using Auto Layout constraints, given that you're using the latest tools.
If so, setting a subview's frame directly won't work the way you're expecting (if it does anything at all, it will cause strange drawing glitches and flashing when mixed with animation). You have to create outlets for your layout constraints and modify those instead.
If you're not using Auto Layout, I suggest having a look at your button outlet to make sure it's actually connected (i.e., you're not talking to nil). Even if the outlet is connected, make sure it's not nil at runtime; you may be trying to talk to the button before the nib is loaded and the outlet/action connections are restored.
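As a rough sketch of the constraint-outlet approach (the outlet and class names are assumptions, not from the question):

import Cocoa

class ButtonMoverViewController: NSViewController {
    @IBOutlet weak var button: NSButton!
    @IBOutlet weak var buttonLeadingConstraint: NSLayoutConstraint!   // connected in the nib

    func moveButtonRight() {
        // Move the button by changing the constraint, not the frame,
        // so Auto Layout doesn't immediately undo the change.
        buttonLeadingConstraint.constant = 500
        view.layoutSubtreeIfNeeded()
    }
}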

NSTextField dragging values

I am trying to create a text field on OS X that allows a user to select a number and drag horizontally to adjust that number up and down.
I know this can be done, because Apple has implemented it in the inspector panel of the SpriteKit emitter editor.
I have tried subclassing NSTextField to capture mouse drag events and doing the math, but the drag events don't seem to get passed through. The mouse down event does, though...
I have also tried placing a dummy view over the top and catching events as they come through. This works for mouse down, but again, mouse dragged is never called. I know the view can receive the mouse dragged event, though, because if I place that view somewhere that doesn't have a text field under it, everything works fine.
What are my options here? I feel I have tried everything, and the only option left is to create a new control from scratch. I don't mind that too much, but I also want the user to be able to enter equations in the text field and drag individual values - that means the full functionality of NSTextField would need to be rebuilt just to add this dragging feature... Is there anything else?
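For reference, this is roughly the kind of subclass described above (names are illustrative); the problem is exactly that mouseDragged(with:) never fires once the field editor takes over:

import Cocoa

// Scrubbing sketch: record the value on mouse down, then turn horizontal
// drag distance into a new value.
class ScrubbableTextField: NSTextField {
    private var dragStartX: CGFloat = 0
    private var startValue: Double = 0

    override func mouseDown(with event: NSEvent) {
        dragStartX = event.locationInWindow.x
        startValue = doubleValue
        super.mouseDown(with: event)
    }

    override func mouseDragged(with event: NSEvent) {
        // In practice this is never reached, which is the issue being asked about.
        let delta = Double(event.locationInWindow.x - dragStartX)
        doubleValue = startValue + delta
    }
}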

How do I do something after the next NSView -layout has occurred?

In response to a user event, I want to:
add a new NSView to the window, and then
show an NSPanel positioned just below that view
I have each half of this done. I can add a new subview, and the container view's -updateConstraints identifies it and adds the correct layout constraints, so that the next time layout is performed, it's positioned correctly in the window. Also, I have an NSWindowController subclass that puts the panel on the screen.
Unfortunately, there's an ordering problem. My panel's controller just looks at the new NSView's frame property to decide where to put it, but during this iteration of the main event loop, the -layout method hasn't been called yet, so the view is still positioned at (0,0).
(If I separate these two pieces of functionality, and require two separate user events for "add view" and "create panel", then the panel is correctly positioned below the view.)
Is there a way to attach an NSPanel to an NSView, as if with a layout constraint? Or is there a way to say "do this (window controller stuff), but only after the next -layout call"?
Just call -layoutSubtreeIfNeeded on your NSView’s superview as soon as you add it and its constraints, so it will lay out immediately, then add the panel.
Or use an NSPopover, although those draw a certain way and you might not want that.
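A minimal sketch of that ordering, in Swift; the PanelPositioning protocol and showPanel(below:) stand in for the question's NSWindowController subclass and are hypothetical:

import Cocoa

// Hypothetical stand-in for the panel's window controller.
protocol PanelPositioning {
    func showPanel(below screenRect: NSRect)
}

func addViewAndShowPanel(containerView: NSView, newView: NSView,
                         panelController: PanelPositioning) {
    containerView.addSubview(newView)
    containerView.updateConstraintsForSubtreeIfNeeded()   // runs -updateConstraints now
    containerView.layoutSubtreeIfNeeded()                 // runs -layout now, so newView.frame is valid

    // Convert the freshly laid-out frame to screen coordinates for the panel.
    let frameInWindow = newView.convert(newView.bounds, to: nil)
    let frameOnScreen = containerView.window!.convertToScreen(frameInWindow)
    panelController.showPanel(below: frameOnScreen)
}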

Custom NSCursor is being reset when subviews are added to a view in the window

I'm using NSTrackingArea to define two areas in an NSView subclass, and mouseEntered/mouseExited to change the cursor to a custom one.
So all works fine when the mouse enters the top tracking area and the custom cursor gets set as expected. All is still good when I mouse down and drag in the top tracking area. But I have another part of the UI that updates when the mouse is dragged, and it adds subviews to a view elsewhere in the same window.
As soon as the first subview is added elsewhere, my custom cursor disappears and it reverts back to the arrow cursor. I thought I might be able to force the cursor back to the custom one using cursorUpdate for my view but for some reason it never gets called, even when set as an option in the NSTrackingArea.
Am a bit stumped with this one...
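For reference, the setup described here looks roughly like this (the class name, region, and stand-in cursor are assumptions):

import Cocoa

// Tracking area over part of the view, switching cursors on entry/exit.
class CursorTrackingView: NSView {
    override func updateTrackingAreas() {
        super.updateTrackingAreas()
        trackingAreas.forEach(removeTrackingArea)

        let topHalf = NSRect(x: 0, y: bounds.midY, width: bounds.width, height: bounds.height / 2)
        let area = NSTrackingArea(rect: topHalf,
                                  options: [.mouseEnteredAndExited, .cursorUpdate, .activeInKeyWindow],
                                  owner: self, userInfo: nil)
        addTrackingArea(area)
    }

    override func mouseEntered(with event: NSEvent) {
        NSCursor.openHand.set()   // stand-in for the custom cursor
    }

    override func mouseExited(with event: NSEvent) {
        NSCursor.arrow.set()
    }
}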
I solved it by overriding cursorUpdate in the window's custom contentView. An empty cursorUpdate method stopped the update from getting passed up the chain and the custom cursor now remains as I've set it.
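In code, that fix amounts to something like this (the class name is made up; the key point is the empty override that doesn't call super):

import Cocoa

// Custom contentView whose empty cursorUpdate(with:) swallows the event, so
// AppKit doesn't reset the cursor to the arrow when subviews are added
// elsewhere in the window.
class CursorPreservingContentView: NSView {
    override func cursorUpdate(with event: NSEvent) {
        // Intentionally empty: don't call super.
    }
}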
