NSView - userInteractionEnabled equivalent - macOS

Is there a reliable/official way to completely disable user interaction with a view? Similar SO questions only suggest intercepting mouse events, but I'm looking for a complete solution that disables all interaction with the view and its descendants, including:
mouse events
trackpad
keyboard focus/events
accelerator keys
voice input
mind control techniques
any other official way for the user to generate control events from the UI
The view (and its descendants) should also immediately lose keyboard/mouse focus once interaction is disabled.
This should be similar to what transitionFromViewController:toViewController:options:completionHandler: does without NSViewControllerTransitionAllowUserInteraction flag, but I can't find a way to do that outside an animation.
Update:
Another way to describe what I'm looking for: the view must behave exactly as if it were hidden, but still be drawn on screen.
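For reference, the mouse-event-only approach those similar SO questions suggest can be sketched as below (a minimal sketch; the class name is hypothetical). It covers hit-testing and key focus, but not the accessibility, voice-input, or other input paths the question asks about:

```swift
import AppKit

// A view that swallows mouse interaction for itself and its
// descendants by opting out of hit-testing, and refuses key focus.
// This is only the partial, mouse-centric approach the question
// says is insufficient on its own.
final class NonInteractiveView: NSView {
    var interactionEnabled = true {
        didSet {
            // Kick keyboard focus out of this subtree as soon as
            // interaction is disabled.
            if !interactionEnabled, let window = window,
               let responder = window.firstResponder as? NSView,
               responder.isDescendant(of: self) {
                window.makeFirstResponder(nil)
            }
        }
    }

    // Returning nil removes this view (and everything under it)
    // from mouse/trackpad hit-testing.
    override func hitTest(_ point: NSPoint) -> NSView? {
        interactionEnabled ? super.hitTest(point) : nil
    }

    override var acceptsFirstResponder: Bool { interactionEnabled }
}
```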

Related

OnGesture events are empty

I'm trying to get interactive gesture recognition (specifically, zooming and panning) working for my Delphi / C++Builder 10.2 Tokyo app.
What I've done so far:
Add a TGestureManager, GestureManager1 to my form.
Set the form's Touch.GestureManager to GestureManager1.
Leave everything under Touch.Gestures unchecked, because I want interactive gestures (zoom and pan), not "standard" gestures.
Make sure that Touch.InteractiveGestures.igZoom is checked.
Assign an OnGesture event handler.
The OnGesture event handler is triggered as expected, but the event's EventInfo.GestureID (which is supposed to give the type of gesture - pan, zoom, etc.) is always 0.
What am I doing wrong?
"Standard" gestures (the various lines and shapes under Touch.Gestures.Standard) and "interactive" gestures (panning, zooming, rotating) are mutually exclusive.
To process "standard" or "custom" gestures, add a TGestureManager.
To receive "interactive" gestures, you need to remove the TGestureManager. This Embarcadero DocWiki article, which explains how gestures work in VCL and FireMonkey, specifically says:
In order to use Interactive Gestures such as zoom and rotate in a component, you do not need to associate the component with a Gesture Manager.
This Intel article has more details on Windows' various gesture interfaces. What Delphi calls "interactive" gestures correspond to Windows' WM_GESTURE message.

Between windowDidMove and windowWillMove

I've been trying with windowDidMove and windowWillMove (NSWindowDelegate) but I think I need something between these two...
Is there any other way to detect when I move my window in Cocoa? I want to trigger a function when I drag a window to the bottom of the screen, but the function should run even before I release the mouse button.
The middle ground you are seeking is handling the mouse events yourself and implementing the window dragging. If you do this, you determine how dragging works, so you can constrain the window to an area of the screen, trigger events when the window reaches a screen edge, etc.
You'll need to do some reading; you could start with Apple's Handling Mouse Events.
If you have problems once you've done the reading and written some code, ask a new question showing your code and explaining the problem you've hit. Somebody will probably help you out.
HTH
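A minimal sketch of that approach, with hypothetical names (`DraggableView`, `onReachedBottomEdge`): the view tracks `mouseDragged`, moves its own window, and fires a callback mid-drag as soon as the frame touches the bottom of the screen, before the mouse button is released.

```swift
import AppKit

// Pure helper so the edge check is testable without a real window.
func windowTouchesBottom(windowFrame: NSRect, screenFrame: NSRect,
                         threshold: CGFloat = 2) -> Bool {
    windowFrame.minY <= screenFrame.minY + threshold
}

// Hypothetical drag-handling view: moves its own window during
// mouseDragged and reports, while the drag is still in progress,
// when the window reaches the bottom edge of the screen.
final class DraggableView: NSView {
    var onReachedBottomEdge: (() -> Void)?
    private var initialLocation = NSPoint.zero

    override func mouseDown(with event: NSEvent) {
        // Remember where inside the window the drag started.
        initialLocation = event.locationInWindow
    }

    override func mouseDragged(with event: NSEvent) {
        guard let window = window, let screen = window.screen else { return }
        let current = window.convertPoint(toScreen: event.locationInWindow)
        let origin = NSPoint(x: current.x - initialLocation.x,
                             y: current.y - initialLocation.y)
        window.setFrameOrigin(origin)
        // Fires before the mouse button is released.
        if windowTouchesBottom(windowFrame: window.frame,
                               screenFrame: screen.visibleFrame) {
            onReachedBottomEdge?()
        }
    }
}
```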

How to keep a hidden view in responder chain?

I have an NSSplitViewController, and one of its items contains multiple buttons with keyboard shortcuts.
Once the user hides the item, the shortcuts don't fire.
Is there any way to keep the buttons in the hidden view as part of the responder chain?
Sounds like the simple answer is no, according to Apple's docs. A simple workaround, however, might be to move the buttons out of the visible area by, say, shifting their frames right by 10,000 points or so. If they are in a scroll view/clip view that would expand to show the items in their new position, this would not work so well; but if they aren't, it ought to work fine. If they are in a scroll view, you might instead find a way to make them completely transparent, to achieve a similar effect.
That said, perhaps it is worth considering whether you have the right design in the first place, since having buttons that are not visible respond to key events is a questionable design from a user-interface perspective (as reflected by the fact that Apple tries to prevent it). Maybe those keyboard events should really be getting handled by a view higher in the view hierarchy, or by the window, or some such entity?
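The offscreen-shift workaround above can be sketched like this (a hypothetical helper, not an official API): the buttons stay in the view hierarchy, so their key equivalents keep firing, but they sit far outside the visible area.

```swift
import AppKit

// Sketch of the workaround: instead of hiding the buttons (which
// removes them from key-equivalent handling), push them far outside
// the visible area while keeping them in the view hierarchy.
func setButtons(_ buttons: [NSButton], visuallyHidden hidden: Bool,
                offset: CGFloat = 10_000) {
    for button in buttons {
        var origin = button.frame.origin
        // Shift right to "hide", shift back left to "show".
        origin.x += hidden ? offset : -offset
        button.setFrameOrigin(origin)
    }
}
```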

Scroll Gestures not Passed to IScrollInfo implementing panel in Windows Phone 7 CTP

I am using a custom panel as the ItemsPanel for an ItemsControl, with a custom template that provides a scroll viewer. (See XAML below.) As long as my panel does not implement IScrollInfo, scrolling works in this scenario.
I implement IScrollInfo and update my viewport and extent sizes in MeasureOverride. The scroll bar shows the correct relative size, and if I call the IScrollInfo methods directly, scrolling works as expected. However, the drag and flick gestures no longer scroll the content. Putting a breakpoint at the entry of every IScrollInfo method shows that drag and flick are not calling the interface. Removing the IScrollInfo interface declaration restores the scroll-on-drag-and-flick behavior.
Is there a simple way to restore the flick and pan gestures to an ItemsControl whose panel implements IScrollInfo?
An unfortunate answer I received from Eric Sink, an MSFT forum moderator:
I believe that what is happening is that, when you inherit from IScrollInfo, your panel takes over all of the scroll functionality, but we use an internal interface, as Martin mentioned, to control the flick animation. Since your object does not inherit from this interface, the underlying code will bypass this functionality. I think that you should still be able to override the OnManipulation* events and set up your own storyboard animation.
It sounds like if you want to do IScrollInfo, you're on your own for the manipulation.

Programmatically closing an NSWindow when it loses focus

I am making an image picker that displays an n-by-n grid of selectable buttons when the picker pops up. This grid of buttons will be contained within an NSWindow, but I would like the window to close automatically if the user clicks outside it. Is there a flag that can be set so that when the window loses focus it is closed automatically?
There are two notifications that you may be interested in: NSWindowDidResignKeyNotification and NSWindowDidResignMainNotification. You can simply register for the one you're interested in in awakeFromNib (or windowDidLoad if you have a custom controller) and then close or hide the window as appropriate when you receive the notifications.
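That registration can be sketched as follows (the class name is hypothetical; in modern Swift the key-resignation notification is spelled `NSWindow.didResignKeyNotification`). The observer hides the window as soon as it stops being key:

```swift
import AppKit

// Sketch: an object that dismisses a given window as soon as the
// window stops being key (i.e. the user clicks somewhere else).
final class AutoClosingWindowObserver: NSObject {
    private let window: NSWindow
    private(set) var didClose = false

    init(window: NSWindow) {
        self.window = window
        super.init()
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(windowDidResignKey(_:)),
            name: NSWindow.didResignKeyNotification,
            object: window)   // observe only this window
    }

    @objc private func windowDidResignKey(_ note: Notification) {
        didClose = true
        window.orderOut(nil)   // hide; call close() instead to dispose
    }

    deinit { NotificationCenter.default.removeObserver(self) }
}
```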
I won't delve too much into whether or not this is a good idea from UI standpoint. But, it might be a better idea to have either an overlay view or a panel for the functionality you describe.
You might check out NSPanel. It's an NSWindow subclass that will hide itself when the app is in the background, and that behavior sounds very similar to what you are looking for.