Where do mouseDown events go when the control key is down?

I have subclassed NSComboBox for a number of reasons, including a strategy for displaying contextual menus without the OS adding arcane things to them. (“Add to iTunes as a spoken track”???) Here are my mouse event methods:
public override func mouseDown(event: NSEvent) {
    NSLog("mouseDown")
    if NSEvent.modifierFlags().contains(.ControlKeyMask) {
        self.rightMouseDown(event)
    } else {
        super.mouseDown(event)
    }
}

public override func rightMouseDown(event: NSEvent) {
    NSLog("rightMouseDown")
    super.menu?.delegate = self
    super.menu?.allowsContextMenuPlugIns = false
    super.menu?.popUpMenuPositioningItem(nil,
        atLocation: self.convertPoint(event.locationInWindow, fromView: nil),
        inView: self)
}
The rightMouseDown method does the last-second menu configuration I want. And I think the (left) mouseDown method would also work (it’s there only because ctrl-left-click is a traditional alternate to right-click), except that with the control key down it never sees the mouse event. The event seems to get to the superclass by going around rather than through my subclass, because NSComboBox does display a menu, just not the one I want (and the menu delegate isn’t right, etc).
I suspect there is some kind of legacy propagation path for ctrl-left-clicks, from the era when Apple mice had only one button. If I knew where these events were directed (I don’t think they’re going to my NSPanel), I might be able to intercept them. Does anyone know where they go? Is there something in NSEvent documentation I’m staring at and not seeing?
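One diagnostic idea, purely a hedged sketch in current Swift and not something from the original post: every event passes through the window's sendEvent(_:) before AppKit dispatches it, so an NSPanel subclass (hypothetical name below) can at least log which view the control-click is being routed toward.
// Hedged diagnostic sketch (InspectingPanel is a hypothetical subclass name):
// log where AppKit is about to route control-left-clicks before dispatch.
class InspectingPanel: NSPanel {
    override func sendEvent(_ event: NSEvent) {
        if event.type == .leftMouseDown && event.modifierFlags.contains(.control) {
            let hit = contentView?.hitTest(event.locationInWindow)
            NSLog("ctrl-click routed toward: \(String(describing: hit))")
        }
        super.sendEvent(event)
    }
}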

Related

How to align a toolbar (or its items) with the leading edge of a split view controller's child?

In iOS, a toolbar can be added to any view. In macOS however, it seems only possible to add a toolbar to a window.
I'm working on an app that has a split view controller and a toolbar, but the toolbar's items are only meaningful in the context of the right view controller.
E.g. let's say I have a text editor of some sort, where the left pane shows all documents (like in the Notes app) and the right pane shows the actual text which can be edited. The formatting buttons only affect the text in the right pane. Thus, it seems very intuitive to place the toolbar within that right pane instead of stretching it over the full width of the window.
Is there some way to achieve this?
(Or is there a good UX reason why this would be a bad practice?)
I've noticed how Apple solved this problem in terms of UX in their Notes app: They still use a full-width toolbar but align the button items that are only related to the right pane with the leading edge of that pane.
So, in case there is no way to place a toolbar in a view controller, how can I align the toolbar items with the leading edge of the right view controller, as Apple does in the Notes app?
Edit:
According to TimTwoToes' answer and the posts linked by Willeke in the comments, it seems to be possible to use Auto Layout to constrain a toolbar item to the split view's child view. This solution would work if there were a fixed toolbar layout. However, Apple encourages (for good reason) letting users customize your app's toolbar.
Thus, I cannot add constraints to a fixed item in the toolbar. Instead, a viable solution seems to be to use a leading flexible space and adjust its size accordingly.
Initial Notes
It turns out this is tricky because there are many things that need to be considered:
Auto Layout doesn't seem to work properly with toolbar items. (I've read a few posts mentioning that Apple has classified this as a bug.)
Normally, the user can customize your app's toolbar (add and remove items). We should not deprive the user of that option.
Thus, simply constraining a particular toolbar item with the split view or a layout guide is not an option (because the item might be at a different position than expected or not there at all).
After hours of "hacking", I've finally found a reliable way to achieve the desired behavior that doesn't use any internal / undocumented methods.
How To
Instead of a standard NSToolbarFlexibleSpaceItem create an NSToolbarItem with a custom view. This will serve as your flexible, resizing space. You can do that in code or in Interface Builder:
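For the in-code route, here is a minimal sketch (my addition, using current AppKit names; "TabSpace" is a hypothetical identifier that you would also add to the toolbar's allowed and default items) of vending the custom-view item from the toolbar delegate:
// Sketch only: vend the flexible "tab space" item from the NSToolbarDelegate.
func toolbar(_ toolbar: NSToolbar,
             itemForItemIdentifier itemIdentifier: NSToolbarItem.Identifier,
             willBeInsertedIntoToolbar flag: Bool) -> NSToolbarItem? {
    guard itemIdentifier.rawValue == "TabSpace" else { return nil }
    let item = NSToolbarItem(itemIdentifier: itemIdentifier)
    item.view = NSView(frame: NSRect(x: 0, y: 0, width: 1, height: 1)) // placeholder; resized below
    return item
}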
Create outlets/properties for your toolbar and your flexible space (inside the respective NSWindowController):
@IBOutlet weak var toolbar: NSToolbar!
@IBOutlet weak var tabSpace: NSToolbarItem!
Create a method inside the same window controller that adjusts the space width:
private func adjustTabSpaceWidth() {
    for item in toolbar.items {
        if item == tabSpace {
            guard
                let origin = item.view?.frame.origin,
                let originInWindowCoordinates = item.view?.convert(origin, to: nil),
                let leftPane = splitViewController?.splitViewItems.first?.viewController.view
            else {
                return
            }
            let leftPaneWidth = leftPane.frame.size.width
            let tabWidth = max(leftPaneWidth - originInWindowCoordinates.x, MainWindowController.minTabSpaceWidth)
            item.set(width: tabWidth)
        }
    }
}
Define the set(width:) method in an extension on NSToolbarItem as follows:
private extension NSToolbarItem {
    func set(width: CGFloat) {
        minSize = .init(width: width, height: minSize.height)
        maxSize = .init(width: width, height: maxSize.height)
    }
}
Make your window controller conform to NSSplitViewDelegate and assign it to your split view's delegate property.1 Implement the following NSSplitViewDelegate protocol method in your window controller:
func splitViewDidResizeSubviews(_ notification: Notification) {
    adjustTabSpaceWidth()
}
This will yield the desired resizing behavior. (The user will still be able to remove the space completely or reposition it, but he can always add it back to the front.)
1 Note:
If you're using an NSSplitViewController, the system automatically assigns that controller to its split view's delegate property and you cannot change that. As a consequence, you need to subclass NSSplitViewController, override its splitViewDidResizeSubviews(_:) method and notify the window controller from there. You can achieve that with the following code:
protocol SplitViewControllerDelegate: class {
    func splitViewControllerDidResize(_ splitViewController: SplitViewController)
}

class SplitViewController: NSSplitViewController {

    weak var delegate: SplitViewControllerDelegate?

    override func splitViewDidResizeSubviews(_ notification: Notification) {
        super.splitViewDidResizeSubviews(notification) // keep NSSplitViewController's own handling
        delegate?.splitViewControllerDidResize(self)
    }
}
Don't forget to assign your window controller as the split view controller's delegate:
override func windowDidLoad() {
    super.windowDidLoad()
    splitViewController?.delegate = self
}
and to implement the respective delegate method:
extension MainWindowController: SplitViewControllerDelegate {
    func splitViewControllerDidResize(_ splitViewController: SplitViewController) {
        adjustTabSpaceWidth()
    }
}
There is no native way to achieve a "local" toolbar. You would have to create the control yourself, but I believe it would be simple to make.
Aligning the toolbar items using Auto Layout is described here. Aligning with a custom toolbar item is described by Mischa.
The macOS way is to use the toolbar solution and make the items context sensitive. In this instance the text attribute buttons would be enabled when the right pane has focus and disabled when it loses focus.
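To illustrate that approach, here is a hedged sketch (EditorViewController and textView are illustrative names, and it assumes the formatting items' target is that view controller): validate the items against the current first responder via NSToolbarItemValidation.
// Hedged sketch: formatting items stay enabled only while the editor's text
// view has focus. Names are illustrative, not from the answer above.
extension EditorViewController: NSToolbarItemValidation {
    func validateToolbarItem(_ item: NSToolbarItem) -> Bool {
        return view.window?.firstResponder === textView
    }
}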

Dismissing keyboard in UITextField with RAC(5)?

Newbie to ReactiveCocoa and ReactiveSwift here... Sorry if the answer is obvious.
I am trying to adapt the Start Developing iOS Apps with Swift sample to ReactiveSwift / ReactiveCocoa, and I am running into an issue with "translating" the UITextFieldDelegate method that dismisses the keyboard and essentially ends the editing (so I can capture the text field's value in the mealNameLabel):
func textFieldShouldReturn(_ textField: UITextField) -> Bool
I am using
nameTextField.reactive.textValues.observeValues { value in
    viewModel.mealName.swap(value ?? "")
}

// Set up bindings to update the view's meal label
// based on data from the view model.
mealNameLabel.reactive.text <~ viewModel.mealLabel
to get the value from the text field into the view model and percolate the view model's label back to the UILabel (convoluted...)
That works fine, as long as I keep the view controller as the UITextField's delegate and still implement the method depicted in the tutorial and mentioned above. Essentially:
override func viewDidLoad() {
    super.viewDidLoad()
    nameTextField.delegate = self
    // view controller logic
    ...
}

func textFieldShouldReturn(_ textField: UITextField) -> Bool {
    // Hide the keyboard.
    textField.resignFirstResponder()
    return true
}
I tried using
nameTextField.reactive.controlEvents
but that failed miserably due to my lack of understanding of controlEvents (docs anywhere ?).
So what do I need to do to make the keyboard disappear when the user is done editing, the "reactive way"?
Thanks !!!
(Of course right after I post my question...)
It looks like this might actually do the trick :
nameTextField.reactive.controlEvents(UIControlEvents.primaryActionTriggered)
    .observeValues { textField in
        textField.resignFirstResponder()
    }
After fiddling with the different event types, it looks like .primaryActionTriggered is what gets triggered when the "Done" button is pressed.
Any better way to do this ?
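One possible refinement, offered only as a hedged sketch: .editingDidEndOnExit is the control event UIKit associates with the return key ending editing, and merely having an observer for it usually makes the text field dismiss the keyboard on its own; resignFirstResponder() is kept here just in case.
// Hedged alternative sketch; behaviour not verified against the RAC 5 docs.
nameTextField.reactive.controlEvents(.editingDidEndOnExit)
    .observeValues { textField in
        textField.resignFirstResponder()
    }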

Scrolling in NSScrollView stops when overriding the scrollWheel function

I experience a weird bug when I override the scrollWheel(theEvent: NSEvent) function in an NSScrollView.
Here's my code:
import Cocoa

class GraphScrollView: NSScrollView {
    var axisScrollViewInstance: AxisScrollView?

    override func scrollWheel(theEvent: NSEvent) {
        if theEvent.deltaY != 0 {
            axisScrollViewInstance?.scrollEventFromOtherScrollView(theEvent)
            super.scrollWheel(theEvent)
        } else if theEvent.deltaX != 0 {
            super.scrollWheel(theEvent)
        }
    }
}

class AxisScrollView: NSScrollView {
    var graphScrollViewInstance: GraphScrollView?

    override func scrollWheel(theEvent: NSEvent) {
        if theEvent.deltaY != 0 {
            super.scrollWheel(theEvent)
            graphScrollViewInstance?.scrollWheel(theEvent)
        }
    }

    func scrollEventFromOtherScrollView(theEvent: NSEvent) {
        if theEvent.deltaY != 0 {
            super.scrollWheel(theEvent)
        }
    }
}
The code basically checks in which direction the user is scrolling and forwards the NSEvent to another NSScrollView if the direction is vertical. I need this so that I can have two NSScrollViews next to each other, which both scroll vertically and one of them scrolls horizontally as well. As an example you can look at the Mac Calendar app, which has an hour column on the left that only scrolls vertically and a week overview on the right that scrolls horizontally and vertically.
However, if I override the scrollWheel method and call the super function from inside, it leads to a weird problem with the Mighty Mouse. I can only scroll upwards and to the right, whereas it does nothing if I scroll in any other direction, although an NSEvent always occurs, regardless of which direction I scroll. After a while it stops scrolling altogether.
I have already tried to simply override the scrollWheel method and only call the super function with the event, like this:
override func scrollWheel(theEvent: NSEvent) {
    super.scrollWheel(theEvent)
}
but it leads to the same problems. I have a second (non-Apple) mouse connected to my system, which has a 'normal' scroll wheel and with which my code works perfectly fine.
Do you have an idea what the problem could be? I heard that the NSEvents are of different subtype (1: Apple Device, 0: Any other Device). Maybe that has something to do with it?
Otherwise, do you know how I can have two NSScrollViews next to each other that scroll vertically simultaneously?
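As an aside, and not part of the original thread: the usual alternative to forwarding scrollWheel events is the "synchronized scrolling" pattern from Apple's scroll view documentation. A hedged sketch in current Swift, with illustrative names:
import Cocoa

// Sketch: mirror the leader's vertical offset onto the follower whenever the
// leader's clip view scrolls. Keep the returned observer token alive.
func synchronizeVertically(follower: NSScrollView, leader: NSScrollView) -> NSObjectProtocol {
    leader.contentView.postsBoundsChangedNotifications = true
    return NotificationCenter.default.addObserver(
        forName: NSView.boundsDidChangeNotification,
        object: leader.contentView,
        queue: .main
    ) { [weak follower] note in
        guard let follower = follower,
              let leaderClip = note.object as? NSClipView else { return }
        var origin = follower.contentView.bounds.origin
        origin.y = leaderClip.bounds.origin.y // copy only the vertical offset
        follower.contentView.scroll(to: origin)
        follower.reflectScrolledClipView(follower.contentView)
    }
}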
Okay, so it's a bug with the Interface Builder in Xcode 6.4 and maybe in earlier versions as well. You have to make sure that the boxes "Show Vertical Scroller" and "Show Horizontal Scroller" in the Attributes Inspector of the NSScrollView are checked.
If you check both boxes you won't have problems with overriding the scrollWheel(theEvent: NSEvent) function. You can then remove the scroll bars programmatically again without problems by setting the hasHorizontalScroller and hasVerticalScroller properties to false in e.g. your init().
Update at 12.10.2015
I've reported the 'bug' to Apple; however, it seems that by overriding the -scrollWheel function, the view loses its responsive scrolling ability.
This is the answer of the Apple Dev Team:
If you override -scrollWheel, then that scrollView cannot perform responsiveScrolling.
If responsive scrolling is not on, then the scroller must be shown in order for scrollView to scroll.
See Documentation for +isCompatibleWithResponsiveScrolling
In the documentation it says:
The default implementation of this method returns true unless the class overrides the lockFocus or scrollWheel: method
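Reading that documentation, one possible workaround (a hedged sketch, not something confirmed in this thread) is to declare compatibility explicitly in the subclass, on the assumption that the override only forwards events and remains compatible:
// Hedged sketch: opt back into responsive scrolling despite the override.
class SynchronizedScrollView: NSScrollView {
    override class var isCompatibleWithResponsiveScrolling: Bool { return true }

    override func scrollWheel(with event: NSEvent) {
        super.scrollWheel(with: event)
        // forward the event to the paired scroll view here
    }
}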
I had the same issue and needed to override horizontal scrolling; what seems to work is this:
override func scrollWheel(with event: NSEvent) {
    self.hasHorizontalScroller = true
    self.hasVerticalScroller = true
    // do your magic here
    self.hasHorizontalScroller = false
    self.hasVerticalScroller = false
}

Cocoa: Key down event on NSView not firing

I have made a custom NSView and have implemented the keyDown: method. However, when I press keys the method is never called. Do I have to register to receive those events? FYI, I am making a document-based application and can handle this code anywhere (it doesn't have to be in this view). What is the best place to do this in a document-based application such that the event will occur throughout the entire application?
You need to override -acceptsFirstResponder to return YES.
In Swift:
class MDView: NSView {
    override var acceptsFirstResponder: Bool { return true }
}
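Two things worth adding, as a hedged sketch rather than part of the original answer: the view must also actually be the first responder to receive key events, and keyDown(with:) is where they arrive. MDView is reused here; claiming first-responder status in viewDidMoveToWindow is just one option.
// Sketch: claim first-responder status and log key presses.
class MDView: NSView {
    override var acceptsFirstResponder: Bool { return true }

    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        window?.makeFirstResponder(self) // become first responder once in a window
    }

    override func keyDown(with event: NSEvent) {
        NSLog("keyDown: %@", event.charactersIgnoringModifiers ?? "")
    }
}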

Custom NSView in NSMenuItem not receiving mouse events

I have an NSMenu popping out of an NSStatusItem using popUpStatusItemMenu. These NSMenuItems show a bunch of different links, and each one is connected with setAction: to the openLink: method of a target. This arrangement has been working fine for a long time. The user chooses a link from the menu and the openLink: method then deals with it.
Unfortunately, I recently decided to experiment with using NSMenuItem's setView: method to provide a nicer/slicker interface. Basically, I just stopped setting the title, created the NSMenuItem, and then used setView: to display a custom view. This works perfectly, the menu items look great and my custom view is displayed.
However, when the user chooses a menu item and releases the mouse, the action no longer works (i.e., openLink: isn't called). If I just simply comment out the setView: call, then the actions work again (of course, the menu items are blank, but the action is executed properly). My first question, then, is why setting a view breaks the NSMenuItem's action.
No problem, I thought, I'll fix it by detecting the mouseUp event in my custom view and calling my action method from there. I added this method to my custom view:
- (void)mouseUp:(NSEvent *)theEvent {
    NSLog(@"in mouseUp");
}
No dice! This method is never called.
I can set tracking rects and receive mouseEntered: events, though. I put a few tests in my mouseEntered routine, as follows:
if ([[self window] ignoresMouseEvents]) { NSLog(@"ignoring mouse events"); }
else { NSLog(@"not ignoring mouse events"); }

if ([[self window] canBecomeKeyWindow]) { dNSLog((@"canBecomeKeyWindow")); }
else { NSLog(@"not canBecomeKeyWindow"); }

if ([[self window] isKeyWindow]) { dNSLog((@"isKeyWindow")); }
else { NSLog(@"not isKeyWindow"); }
And got the following responses:
not ignoring mouse events
canBecomeKeyWindow
not isKeyWindow
Is this the problem? "not isKeyWindow"? Presumably this isn't good because Apple's docs say "If the user clicks a view that isn't in the key window, by default the window is brought forward and made key, but the mouse event is not dispatched." But there must be a way to detect these events. HOW?
Adding:
[[self window] makeKeyWindow];
has no effect, despite the fact that canBecomeKeyWindow is YES.
Add this method to your custom NSView and it will work fine with mouse events
- (void)mouseUp:(NSEvent *)event {
    NSMenuItem *mitem = [self enclosingMenuItem];
    NSMenu *m = [mitem menu];
    [m cancelTracking];
    [m performActionForItemAtIndex:[m indexOfItem:mitem]];
}
But I'm having problems with key handling; if you solved this problem, maybe you can go to my question and help me a little bit.
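For reference, a Swift translation of the mouseUp trick above (my sketch, not from the original answer):
// Sketch: end menu tracking and fire the enclosing menu item's action.
override func mouseUp(with event: NSEvent) {
    guard let item = enclosingMenuItem, let menu = item.menu else { return }
    menu.cancelTracking()
    menu.performActionForItem(at: menu.index(of: item))
}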
Add this to your custom view and you should be fine:
- (BOOL)acceptsFirstMouse:(NSEvent *)theEvent
{
    return YES;
}
I added this method to my custom view, and now everything works beautifully:
- (void)viewDidMoveToWindow {
    [[self window] becomeKeyWindow];
}
Hope this helps!
I've updated this version for SwiftUI (Swift 5.3):
final class HostingView<Content: View>: NSHostingView<Content> {
    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        window?.becomeKey()
    }
}
And then use like so:
let item = NSMenuItem()
let contentView = ContentView()
item.view = HostingView(rootView: contentView)
let menu = NSMenu()
menu.items = [item]
So far, the only way to achieve the goal is to register a tracking area manually in updateTrackingAreas (which thankfully is called), like this:
override func updateTrackingAreas() {
    super.updateTrackingAreas()
    let trackingArea = NSTrackingArea(rect: bounds, options: [.enabledDuringMouseDrag, .mouseEnteredAndExited, .activeInActiveApp], owner: self, userInfo: nil)
    addTrackingArea(trackingArea)
}
Recently I needed to show a Custom view for a NSStatusItem, show a regular NSMenu when clicking on it and supporting drag and drop operations on the Status icon.
I solved my problem using, mainly, three different sources that can be found in this question.
Hope it helps other people.
See the sample code from Apple named CustomMenus
In there you'll find a good example in the ImagePickerMenuItemView class.
It's not simple or trivial to make a view in a menu act like a normal NSMenuItem.
There are some real decisions and coding to do.
