A bit of context first. Essentially I have a window that covers the desktop. On it I have a few WebKit WebView views which allow user interaction. By default, as one would expect, when another application is active my window does not receive these events (such as hovering, mouse entered, and clicking). I can make it work by clicking my window first and then moving the mouse, but this is not good for usability. I've also managed to make it activate the window when the cursor enters, but that is far from ideal and rather hacky.
So instead I'm trying to use a tracking area. At the moment the WebViews' superview has this tracking area:
NSTrackingArea *trackingArea = [[NSTrackingArea alloc] initWithRect:[self visibleRect]
                                                            options:NSTrackingMouseEnteredAndExited | NSTrackingMouseMoved | NSTrackingInVisibleRect | NSTrackingActiveAlways
                                                              owner:self
                                                           userInfo:nil];
This works as I want it to: I'm receiving all the mouse events. However, the WebViews don't seem to be responding as intended. JavaScript mouse move events only fire while I hold the mouse button down and drag, not when I simply hover.
I've tried using hitTest: to get the correct view, but nothing seems to work. Here's an example method; I'm using the isHandlingMouse flag because without it an infinite loop seemed to be created for some reason:
- (NSView *)handleTrackedMouseEvent:(NSEvent *)theEvent
{
    if (isHandlingMouse)
        return nil;
    isHandlingMouse = true;

    NSView *hit = [self hitTest:theEvent.locationInWindow];
    if (hit && hit != self) {
        return hit;
    }
    return nil;
}
- (void)mouseMoved:(NSEvent *)theEvent
{
    NSView *hit = [self handleTrackedMouseEvent:theEvent];
    if (hit) {
        [hit mouseMoved:theEvent];
    }
    isHandlingMouse = false;
}
The 'hit' view is a WebHTMLView, which appears to be a private class. Everything seems like it should be working, but perhaps there's something I'm doing that's breaking it, or I'm sending the event to the WebHTMLView incorrectly.
Post a sample Xcode project to make it easier for people to test solutions to this problem.
I was doing something similar and it took a lot of trial and error to find a solution. You will likely need to subclass NSWindow and add - (BOOL)canBecomeKeyWindow { return YES; }, and then, whenever you detect that the mouse is over your window, call [window orderFrontRegardless] so it can properly capture the mouse events.
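A minimal sketch of that combination (the subclass name is made up, and where you call orderFrontRegardless depends on how you detect the mouse entering your window):
// Hypothetical borderless overlay window subclass.
@interface DesktopOverlayWindow : NSWindow
@end

@implementation DesktopOverlayWindow

// Borderless windows refuse key status by default, which is what keeps
// hover and mouse-moved events away from the WebViews.
- (BOOL)canBecomeKeyWindow
{
    return YES;
}

@end

// Then, wherever you detect the mouse over your window (e.g. in mouseEntered:):
// [window orderFrontRegardless];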
My goal is simple and yet I cannot find a solution in spite of lots of searching.
Basically, when my app is in full-screen (kiosk) mode, I want only the toolbar to auto-hide, and I want the menu bar hidden entirely.
Apparently this combination is not valid. I've tried:
- (NSApplicationPresentationOptions)window:(NSWindow *)window willUseFullScreenPresentationOptions:(NSApplicationPresentationOptions)proposedOptions
{
    return (NSApplicationPresentationFullScreen |
            NSApplicationPresentationHideDock |
            NSApplicationPresentationHideMenuBar |
            NSApplicationPresentationAutoHideToolbar);
}
I get the following exception:
"... fullscreen presentation options must include NSApplicationPresentationAutoHideMenuBar if NSApplicationPresentationAutoHideToolbar is included"
Thing is, I don't want the menu bar displayed at all!
So, I'm presuming this is not possible using the standard presentation options. Any ideas how I might approach implementing this behaviour manually?
I'm thinking along the lines of: detect the mouse position and only show/hide the toolbar when the mouse is at/near the top of the screen.
I'm very new to Cocoa so not sure where I would start to achieve this. Any help much appreciated!
Many thanks,
John
I've got it to work, but only by using private APIs.
First I had to find out how to prevent the menu bar from appearing. I discovered the functions _HIMenuBarPositionLock and _HIMenuBarPositionUnlock from Carbon (link the app with Carbon.framework).
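Because they're private there is no public header to import, so the prototypes have to be declared by hand (the no-argument signatures below are an assumption):
// Private Carbon functions; declared manually since no public header exists.
extern void _HIMenuBarPositionLock(void);
extern void _HIMenuBarPositionUnlock(void);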
Then I had to create a custom subclass of NSToolbar. In awakeFromNib I register notification observers that lock and unlock the menu bar when the window enters and exits full screen, respectively:
- (void)awakeFromNib
{
    [super awakeFromNib];

    [[NSNotificationCenter defaultCenter] addObserverForName:NSWindowWillEnterFullScreenNotification object:[self _window] queue:[NSOperationQueue mainQueue] usingBlock:^(NSNotification *note) {
        // Lock the menu bar position when entering fullscreen so it doesn't appear when the mouse is at the top of the screen.
        _HIMenuBarPositionLock();
    }];

    [[NSNotificationCenter defaultCenter] addObserverForName:NSWindowWillExitFullScreenNotification object:[self _window] queue:[NSOperationQueue mainQueue] usingBlock:^(NSNotification *note) {
        // Unlock the menu bar position when exiting fullscreen.
        _HIMenuBarPositionUnlock();
    }];

    [self _setupToolbarHotspotTrackingView];
}
_setupToolbarHotspotTrackingView is a method on SOToolbar that adds a view to the window; this view is used to track the mouse location and show or hide the toolbar accordingly.
- (void)_setupToolbarHotspotTrackingView
{
    NSView *contentView = [self _window].contentView;

    self.toolbarHotspotTrackingView = [[SOToolbarTrackingView alloc] initWithFrame:contentView.bounds];
    [contentView addSubview:self.toolbarHotspotTrackingView];
    self.toolbarHotspotTrackingView.autoresizingMask = NSViewWidthSizable | NSViewHeightSizable;
    self.toolbarHotspotTrackingView.toolbar = self;
}
I also had to override _attachesToMenuBar on SOToolbar so the animation works properly.
- (BOOL)_attachesToMenuBar
{
    return NO;
}
SOToolbarTrackingView sets up a tracking area for mouse moved events and checks to see if the mouse is at the top of the window. It then calls some methods on the private class NSToolbarFullScreenWindowManager to show and hide the toolbar.
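A rough sketch of what that view does (the hot-spot height, the tracking options, and the calls into the private toolbar machinery are stand-ins, not the exact code from the project):
#import <Cocoa/Cocoa.h>

@class SOToolbar;

@interface SOToolbarTrackingView : NSView
@property (weak) SOToolbar *toolbar;
@end

@implementation SOToolbarTrackingView

- (void)updateTrackingAreas
{
    [super updateTrackingAreas];
    // One tracking area covering the whole view; NSTrackingInVisibleRect keeps it sized correctly.
    if (self.trackingAreas.count == 0) {
        NSTrackingArea *area = [[NSTrackingArea alloc] initWithRect:self.bounds
                                                            options:NSTrackingMouseMoved | NSTrackingActiveAlways | NSTrackingInVisibleRect
                                                              owner:self
                                                           userInfo:nil];
        [self addTrackingArea:area];
    }
}

- (void)mouseMoved:(NSEvent *)theEvent
{
    NSPoint p = [self convertPoint:theEvent.locationInWindow fromView:nil];
    // Treat the top few points of the view as the toolbar hot spot (10 pt is an arbitrary choice).
    BOOL inHotspot = (NSMaxY(self.bounds) - p.y) < 10.0;
    if (inHotspot) {
        // Show the toolbar here (the project drives this through the private NSToolbarFullScreenWindowManager).
    } else {
        // Hide the toolbar again.
    }
}

@end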
There's too much stuff to explain it all in detail here, so I've uploaded my experimental project for you to take a look at. Download the sample project here.
I'm playing around with an idea: basically I want an NSStatusItem with an NSPopover. I've read about all the problems people have had, but I just want to try it. Is there a clean way to do it by now? All the versions I've seen are at least a year old and super hacky.
This was my approach so far, but when I click my app in the status bar nothing happens...
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    self.statusItem = [[NSStatusBar systemStatusBar] statusItemWithLength:NSVariableStatusItemLength];
    //[self.statusItem setView:view];
    [self.statusItem setTitle:@"Test"];
    [self.statusItem setHighlightMode:YES];
    [self.statusItem setAction:@selector(activatePopover:)];
}
- (IBAction)activatePopover:(id)sender
{
    BOOL isEnabled = NO;
    if (isEnabled) {
        [self.popover showRelativeToRect:NSMakeRect(0, 0, 50, 50) ofView:statusItem.view preferredEdge:NSMinYEdge];
    } else {
        [self.popover close];
    }
}
Any ideas how to get this running?
Thanks
This will not work without using a custom view on the status item. If you don't set a custom view, the view property will be empty (it only returns custom views, not whatever view NSStatusItem uses internally when you just call setTitle:).
Unfortunately, as per Apple's docs, you'll need to provide your own view and handle clicks yourself if you want to use NSPopover.
I haven't seen a complete example that covers correct handling of all of this (the default status item implementation does rather a lot that you'll have to reproduce manually) and also fixes the popover quirks; a partial sketch follows the list:
NSPopover, by default, won't become the key window (so some controls won't work) unless you override canBecomeKeyWindow on the popover's window
Correctly dismissing menus of other status items (you can call popUpStatusItemMenu with an empty menu to correctly focus your status item)
Drawing the highlighted background with drawStatusBarBackgroundInRect
Reacting to both left and right mouse clicks
Using NSRunningApplication.currentApplication.activateWithOptions to make sure all windows of your status item become active (otherwise your popover will, erratically, not be the receiver of keyboard input)
Dismissing the NSPopover with NSEvent.addGlobalMonitorForEventsMatchingMask (the built-in dismissal mechanism popovers come with doesn't work with status items)
Removing the status item on termination with NSStatusBar.systemStatusBar.removeStatusItem
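To give a flavour of the custom-view approach, here is a heavily simplified sketch covering a few of the points above (the class name, property names, and monitor handling are illustrative; a production menulet still needs the rest of the list):
#import <Cocoa/Cocoa.h>

// Hypothetical custom view installed with [statusItem setView:...].
@interface StatusItemView : NSView
@property (strong) NSStatusItem *statusItem;
@property (strong) NSPopover *popover;
@property (strong) id clickMonitor;
@end

@implementation StatusItemView

- (void)mouseDown:(NSEvent *)theEvent
{
    if (self.popover.isShown) {
        [self.popover close];
        return;
    }
    [self.popover showRelativeToRect:self.bounds ofView:self preferredEdge:NSMinYEdge];
    [self setNeedsDisplay:YES];

    // Dismiss the popover on any click outside the app; the built-in transient
    // behaviour alone is not reliable for status items.
    self.clickMonitor = [NSEvent addGlobalMonitorForEventsMatchingMask:(NSLeftMouseDownMask | NSRightMouseDownMask)
                                                               handler:^(NSEvent *event) {
        [self.popover close];
        [NSEvent removeMonitor:self.clickMonitor];
        self.clickMonitor = nil;
        [self setNeedsDisplay:YES];
    }];
}

- (void)drawRect:(NSRect)dirtyRect
{
    // Draw the standard status item highlight while the popover is open.
    [self.statusItem drawStatusBarBackgroundInRect:self.bounds withHighlight:self.popover.isShown];
    // ...draw your icon or title on top...
}

@end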
I hope to have a blog post about this out sometime soon (note: I'm using RubyMotion, not Objective-C), that explains all these issues and hopefully provides an easier base to create menulets. I'll update this comment if I write that post.
Code:
- (void)initializeStatusBarItem
{
    self.statusItem = [[NSStatusBar systemStatusBar] statusItemWithLength:NSSquareStatusItemLength];

    NSImage *image = [NSImage imageNamed:@"image"];
    // [image setTemplate:YES];
    self.statusItem.button.image = image;
    self.statusItem.highlightMode = NO;
    self.statusItem.button.action = @selector(statusBarItemDidClick:);
}
- (void)statusBarItemDidClick:(NSStatusBarButton *)sender
{
    MainViewController *mainView = [[MainViewController alloc] init];

    self.popoverView = [[NSPopover alloc] init];
    [self.popoverView setContentViewController:mainView];
    self.popoverView.contentSize = CGSizeMake(300, 400);
    self.popoverView.behavior = NSPopoverBehaviorTransient;
    [self.popoverView showRelativeToRect:sender.bounds ofView:sender preferredEdge:NSMaxYEdge];
}
[self.scrollView scrollRectToVisible:textField.bounds animated:YES];
I can't seem to get my UIScrollView to scroll at all so that it doesn't obscure my UITextField. I thought scrollRectToVisible: would be my savior, but it looks like a no-go. Maybe I'm missing something, like translating the coordinates of my textField to my scrollView. Either way, check out my sample project.
https://github.com/stevemoser/Programming-iOS-Book-Examples/tree/master/ch20p573scrollViewAutoLayout2
Oh, and this project might be missing the delegate connection but I checked that and it still doesn't scroll.
I've seen other questions similar to this but none that mention Autolayout.
I was having issues with scrollRectToVisible:animated: as well after converting to Auto Layout. I just changed it to a direct call to setContentOffset:animated: and it started working again.
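For reference, a minimal sketch of that substitution, assuming scrollView and textField properties like the ones in the question, and converting the field's frame into the scroll view's coordinate space first:
// Scroll just far enough that the text field's bottom edge becomes visible.
CGRect target = [self.textField convertRect:self.textField.bounds toView:self.scrollView];
CGFloat visibleBottom = self.scrollView.contentOffset.y + self.scrollView.bounds.size.height;
if (CGRectGetMaxY(target) > visibleBottom) {
    CGPoint offset = self.scrollView.contentOffset;
    offset.y = CGRectGetMaxY(target) - self.scrollView.bounds.size.height;
    [self.scrollView setContentOffset:offset animated:YES];
}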
I had the same problem: I wanted to scroll an auto-laid-out UITextField into view without making it the first responder.
For me the issue was that the bounds of the UITextField were only set later, during the auto layout pass, so if you scroll immediately after setting up the layout the bounds are not valid yet.
To work around this I created a subclass of UITextField, overrode setBounds:, and used a zero-delay timer to scroll into view "later on" (you can't scroll at that exact moment because the system's auto layout pass might not be finished yet):
@interface MyTextField : UITextField
{
    bool _scrollIntoView;
}
...
@end

@implementation MyTextField

- (void)setBounds:(CGRect)bounds
{
    bool empty = CGRectIsEmpty(self.bounds);
    bool isFirstResponder = self.isFirstResponder;

    [super setBounds:bounds];

    if (empty && !isFirstResponder && _scrollIntoView)
        [self performSelector:@selector(scrollIntoViewLater) withObject:nil afterDelay:0];
    else if (empty && isFirstResponder)
        [self performSelector:@selector(becomeFirstResponder) withObject:nil afterDelay:0];
}

- (void)scrollIntoViewLater
{
    // scrollView is a reference to the enclosing scroll view (an ivar in the real code).
    CGRect r = [scrollView convertRect:self.bounds fromView:self];
    [scrollView scrollRectToVisible:r animated:TRUE];
}

@end
If the field should additionally be editable with the on-screen keyboard, simply call becomeFirstResponder later on: it scrolls automagically into view above the keyboard using the private scrollTextFieldToVisible API, which in turn calls scrollRectToVisible:animated: on the scroll view.
Your sample link is broken btw...
I am writing an application targeting OS X Lion and Snow Leopard. I have a view that I want to have respond to swipe events. My understanding is that three-finger swipes will call -[NSResponder swipeWithEvent:] if that method is implemented in my custom view. I have already looked at this question and corresponding answers, and tried the following modified stub implementation of Oscar Del Ben's code:
@implementation TestView

- (id)initWithFrame:(NSRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code here.
    }
    return self;
}

- (void)drawRect:(NSRect)dirtyRect
{
    [[NSColor redColor] set];
    NSRectFillUsingOperation(dirtyRect, NSCompositeSourceOver);
}

- (void)swipeWithEvent:(NSEvent *)event {
    NSLog(@"Swipe event detected!");
}

- (void)beginGestureWithEvent:(NSEvent *)event {
    NSLog(@"Gesture detected!");
}

- (void)endGestureWithEvent:(NSEvent *)event {
    NSLog(@"Gesture end detected!");
}

- (void)mouseDown:(NSEvent *)theEvent {
    NSLog(@"mouseDown event detected!");
}

@end
This compiles and runs fine, and the view renders as expected. The mouseDown: event is properly registered. However, none of the other events are triggered. Neither the begin/endGestureWithEvent: methods, nor the swipeWithEvent: method. Which makes me wonder: do I need to set a project/application setting somewhere to properly receive and/or interpret gestures? Thanks in advance for the help.
To receive swipeWithEvent: messages, you have to ensure that the three-finger swipe gesture is not mapped to anything that might cause a conflict. Go to System Preferences -> Trackpad -> More Gestures, and set these preferences to one of the following:
Swipe between pages:
Swipe with two or three fingers, or
Swipe with three fingers
Swipe between full-screen apps:
Swipe left or right with four fingers
Specifically, the swipe between full-screen apps should not be set to three fingers, otherwise you will not get swipeWithEvent: messages.
Together, these two preference settings cause swipeWithEvent: messages to be sent to the first responder.
Of course, you still have to implement the actual swipe logic. And if you want to perform a fluid scroll-swipe à la iOS, then you will need to do a little more work. There is an example of how to do this in the Lion App Kit release notes under the section "Fluid Swipe Tracking."
See http://developer.apple.com/library/mac/#releasenotes/Cocoa/AppKit.html
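Once the messages do arrive, a minimal swipeWithEvent: body could look like the following (note that the sign convention of deltaX/deltaY for swipe direction is easy to get backwards, so verify it on your target OS before relying on it):
- (void)swipeWithEvent:(NSEvent *)event
{
    CGFloat dx = [event deltaX];
    CGFloat dy = [event deltaY];
    if (dx != 0) {
        // Horizontal three-finger swipe; the sign of deltaX indicates the direction.
        NSLog(@"horizontal swipe, deltaX = %f", dx);
    } else if (dy != 0) {
        // Vertical swipe.
        NSLog(@"vertical swipe, deltaY = %f", dy);
    }
}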
Try [self setAcceptsTouchEvents:YES]; where it says // Initialization code here.
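That is, in the question's TestView (a sketch of where the call goes):
- (id)initWithFrame:(NSRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Opt this view in to touch/gesture events.
        [self setAcceptsTouchEvents:YES];
    }
    return self;
}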
Not sure if it's the problem, but only the key window receives Gestures. Is your window key?
Is your view accepting first responders?
- (BOOL)acceptsFirstResponder
{
    return YES;
}
I have an NSMenu popping out of an NSStatusItem using popUpStatusItemMenu. These NSMenuItems show a bunch of different links, and each one is connected with setAction: to the openLink: method of a target. This arrangement has been working fine for a long time. The user chooses a link from the menu and the openLink: method then deals with it.
Unfortunately, I recently decided to experiment with using NSMenuItem's setView: method to provide a nicer/slicker interface. Basically, I just stopped setting the title, created the NSMenuItem, and then used setView: to display a custom view. This works perfectly, the menu items look great and my custom view is displayed.
However, when the user chooses a menu item and releases the mouse, the action no longer works (i.e., openLink: isn't called). If I just simply comment out the setView: call, then the actions work again (of course, the menu items are blank, but the action is executed properly). My first question, then, is why setting a view breaks the NSMenuItem's action.
No problem, I thought, I'll fix it by detecting the mouseUp event in my custom view and calling my action method from there. I added this method to my custom view:
- (void)mouseUp:(NSEvent *)theEvent {
    NSLog(@"in mouseUp");
}
No dice! This method is never called.
I can set tracking rects and receive mouseEntered: events, though. I put a few tests in my mouseEntered routine, as follows:
if ([[self window] ignoresMouseEvents]) { NSLog(@"ignoring mouse events"); }
else { NSLog(@"not ignoring mouse events"); }

if ([[self window] canBecomeKeyWindow]) { dNSLog((@"canBecomeKeyWindow")); }
else { NSLog(@"not canBecomeKeyWindow"); }

if ([[self window] isKeyWindow]) { dNSLog((@"isKeyWindow")); }
else { NSLog(@"not isKeyWindow"); }
And got the following responses:
not ignoring mouse events
canBecomeKeyWindow
not isKeyWindow
Is this the problem? "not isKeyWindow"? Presumably this isn't good, because Apple's docs say "If the user clicks a view that isn't in the key window, by default the window is brought forward and made key, but the mouse event is not dispatched." But there must be a way to detect these events. HOW?
Adding:
[[self window] makeKeyWindow];
has no effect, despite the fact that canBecomeKeyWindow is YES.
Add this method to your custom NSView and it will work fine with mouse events:
- (void)mouseUp:(NSEvent *)event {
    NSMenuItem *mitem = [self enclosingMenuItem];
    NSMenu *m = [mitem menu];
    [m cancelTracking];
    [m performActionForItemAtIndex:[m indexOfItem:mitem]];
}
But I'm having problems with key handling; if you solved that problem, maybe you can go to my question and help me a little bit.
Add this to your custom view and you should be fine:
- (BOOL)acceptsFirstMouse:(NSEvent *)theEvent
{
    return YES;
}
I added this method to my custom view, and now everything works beautifully:
- (void)viewDidMoveToWindow {
    [[self window] becomeKeyWindow];
}
Hope this helps!
I've updated this version for SwiftUI (Swift 5.3):
final class HostingView<Content: View>: NSHostingView<Content> {
    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        window?.becomeKey()
    }
}
And then use like so:
let item = NSMenuItem()
let contentView = ContentView()
item.view = HostingView(rootView: contentView)
let menu = NSMenu()
menu.items = [item]
So far, the only way I've found to achieve the goal is to register a tracking area manually in updateTrackingAreas (which thankfully is called), like this:
override func updateTrackingAreas() {
    super.updateTrackingAreas()
    // Remove stale areas so they don't accumulate on repeated calls.
    trackingAreas.forEach(removeTrackingArea)
    let trackingArea = NSTrackingArea(rect: bounds, options: [.enabledDuringMouseDrag, .mouseEnteredAndExited, .activeInActiveApp], owner: self, userInfo: nil)
    addTrackingArea(trackingArea)
}
Recently I needed to show a custom view for an NSStatusItem, show a regular NSMenu when clicking on it, and support drag and drop operations on the status icon.
I solved my problem using, mainly, three different sources that can be found in this question.
Hope it helps other people.
See the sample code from Apple named CustomMenus
In there you'll find a good example in the ImagePickerMenuItemView class.
It's not simple or trivial to make a view in a menu act like a normal NSMenuItem.
There are some real decisions and coding to do.
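For instance, a view-based item has to draw its own selection highlight, which a plain NSMenuItem gets for free; a minimal sketch of that part:
- (void)drawRect:(NSRect)dirtyRect
{
    // Mimic the standard menu highlight when our enclosing item is highlighted.
    if ([[self enclosingMenuItem] isHighlighted]) {
        [[NSColor selectedMenuItemColor] set];
        NSRectFill(self.bounds);
    }
    // ...then draw the item's text and icon on top...
}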