How can I track opening and closing events of an NSWindow? - cocoa

I tried -windowDidExpose: but it didn't work. What should I try instead?
My window is a utility window.
-- edit for more clarity --
What I want are the equivalents of these Cocoa Touch methods:
viewWillAppear
viewWillDisappear
viewDidLoad
viewDidUnload

Very old question, but for documentation purposes:
Track open:
In your window controller, override the method:
-(void)showWindow:(id)sender
{
    // Add this to track the window close
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(windowWillClose)
                                                 name:NSWindowWillCloseNotification
                                               object:nil];
    [super showWindow:sender];
    // Do here what you want...
}
Track close:
Implement the method:
-(void)windowWillClose
{
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    // Do here what you want...
}

There is the windowWillClose: delegate method, but that only tells you about closing; if you're sending your window an orderOut: message, I don't think that counts.
You probably need to either just track it from whatever code you're ordering the window in and out from, or subclass the window's class and override methods like makeKeyAndOrderFront: and orderOut: (whatever you're using, at least) to post custom notifications before calling up to super.
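For illustration, here is a minimal sketch of that subclassing approach; the subclass name and the notification names are made up, so use whatever fits your project:

@interface MyTrackedWindow : NSWindow
@end

@implementation MyTrackedWindow

- (void)makeKeyAndOrderFront:(id)sender {
    // Announce that the window is about to come on screen.
    [[NSNotificationCenter defaultCenter] postNotificationName:@"MyWindowWillAppearNotification"
                                                        object:self];
    [super makeKeyAndOrderFront:sender];
}

- (void)orderOut:(id)sender {
    // Announce that the window is about to go off screen.
    [[NSNotificationCenter defaultCenter] postNotificationName:@"MyWindowWillDisappearNotification"
                                                        object:self];
    [super orderOut:sender];
}

@end

Any object that cares can then observe those two names with NSNotificationCenter, passing the window as the object.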

For Swift
Track open: In your window controller, override the method:
override func showWindow(sender: AnyObject?) {
    // Add this to track the window close
    NSNotificationCenter.defaultCenter().addObserver(self,
                                                     selector: #selector(windowWillClose),
                                                     name: NSWindowWillCloseNotification,
                                                     object: nil)
    super.showWindow(sender)
}
Track close: Implement the method:
func windowWillClose() {
    NSNotificationCenter.defaultCenter().removeObserver(self)
    // Do here what you want...
}

I came up with a hack for dealing with this. There is no notification that signals that a window has been put on screen, but there's a notification that's pretty much guaranteed to be sent when a window is put on screen. I'm speaking of NSWindowDidUpdateNotification, which indicates that a window has refreshed itself.
Of course, it's not only sent when the window appears—it's sent every time the window updates. Needless to say, this notification is sent a lot more than once. So you want to watch for it the first time, do your thing, and ignore any subsequent notifications. In my case, I wanted to add a sheet to a window that another part of my app would order in later. So I did something like this:
__block id observer = [NSNotificationCenter.defaultCenter addObserverForName:NSWindowDidUpdateNotification
                                                                       object:window
                                                                        queue:nil
                                                                   usingBlock:^(NSNotification *note) {
    [self showSetupSheet];
    [NSNotificationCenter.defaultCenter removeObserver:observer];
}];
There's no particular reason you would have to use a block-based observer—a method-based observer would work just as well.
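For completeness, a minimal sketch of the method-based variant; the observer method name here is just an example:

// Register once, e.g. right after you create or load the window:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(windowDidUpdateForSetup:)
                                             name:NSWindowDidUpdateNotification
                                           object:window];

// Then, in the same class:
- (void)windowDidUpdateForSetup:(NSNotification *)note {
    [self showSetupSheet];
    // Stop observing after the first update so this only runs once.
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:NSWindowDidUpdateNotification
                                                  object:note.object];
}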

Related

Changing the selection behaviour of NSCollectionView

In my Mac app I have an NSCollectionView with multi-select enabled. In my app, being able to select more than one item is the norm, and having to press cmd while clicking to select multiple items frustrates some users; most don't realise they can do it (I get a lot of feature requests asking for multi-select).
So, I want to change the behaviour so that:
When a user clicks a second item, the first item remains selected (without the need for holding cmd)
When a user clicks a selected item, the item is deselected
I've tried overriding setSelected on my own subclass of NSCollectionViewItem like so:
-(void)setSelected:(BOOL)flag
{
    [super setSelected:flag];
    [(MyView*)[self view] setSelected:flag];
    [(MyView*)[self view] setNeedsDisplay:YES];
}
Calling super's setSelected: is required to make sure the collection view functions correctly, but it also seems to be what is responsible for the default behaviour.
What should I do instead?
You could try intercepting all left-mouse-down events using a local events monitor. Within this block you'd then work out if the click happened on your collection view. If it did, create a new event which mimics the event you intercepted but add in the command key mask if it isn't already present. Then, at the end of the block return your event rather than the one you intercepted. Your collection view will behave as if the user had pressed the command key, even though they haven't!
I had a quick go with this in a very simple demo app and it looks like a promising approach - though I expect you'll have to negotiate a few gotchas along the way.
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    [NSEvent addLocalMonitorForEventsMatchingMask:NSEventMaskFromType(NSLeftMouseDown)
                                          handler:^NSEvent *(NSEvent *originalEvent) {
        // Did this left-mouse-down event occur on your collection view?
        // If it did, add in the command key.
        NSEvent *newEvent =
            [NSEvent mouseEventWithType:NSLeftMouseDown
                               location:originalEvent.locationInWindow
                          modifierFlags:NSCommandKeyMask // assuming it's not already present
                              timestamp:originalEvent.timestamp
                           windowNumber:originalEvent.windowNumber
                                context:originalEvent.context
                            eventNumber:originalEvent.eventNumber
                             clickCount:originalEvent.clickCount
                               pressure:0];
        return newEvent; // or originalEvent if it's nothing to do with your collection view
    }];
}
Edit (by question author):
This solution is so heavily based on the original answer that this answer deserves credit (feel free to edit)
You can also intercept the mouse event by subclassing NSCollectionView and overriding mouseDown: like this:
@implementation MyCollectionView

- (void)mouseDown:(NSEvent *)originalEvent {
    NSEvent *mouseEventWithCmd =
        [NSEvent mouseEventWithType:originalEvent.type
                           location:originalEvent.locationInWindow
                      modifierFlags:NSCommandKeyMask
                          timestamp:originalEvent.timestamp
                       windowNumber:originalEvent.windowNumber
                            context:originalEvent.context
                        eventNumber:originalEvent.eventNumber
                         clickCount:originalEvent.clickCount
                           pressure:originalEvent.pressure];
    [super mouseDown:mouseEventWithCmd];
}

@end

Activity Indicator above Button prevents Click Recognition

I have a UIButton "bn" and a UIActivityIndicatorView "ai" which is above the button (ai.center = bn.center).
As long as ai is visible and animating, I can't press the button underneath ai's frame; outside that area the button works fine.
Do I have to add a gesture recognizer to ai, or is there a smarter way to let taps on ai reach the button?
Kind regards. $h#rky
Can you simply set ai.userInteractionEnabled = NO;? I'm surprised it is enabled anyway, on an activity indicator - is this a standard component or have you made a subclass?
As an aside, it is usually poor UI design to have an interactive element that is covered by another view, particularly one which is used to indicate that something is "busy", but in your example of a clickable thumbnail image, it seems to make sense.
You need to override the UIView method
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
in the parent view. If the hit is within the button, then you return the button; else you return nil. This is part of the discussion of this method:
This method traverses the view hierarchy by sending the
pointInside:withEvent: message to each subview to determine which
subview should receive a touch event. If pointInside:withEvent:
returns YES, then the subview’s hierarchy is traversed; otherwise, its
branch of the view hierarchy is ignored. You rarely need to call this
method yourself, but you might override it to hide touch events from
subviews.
EDIT:
Say you have a UIView subclass which contains bn and ai; you can implement the method like this:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if (CGRectContainsPoint(bn.frame, point)) {
        return bn;
    }
    return nil;
}
That way your button will get the touch events (if they are within its frame) no matter whether something is on top of it or not. You do not need to do anything else.
Use the following approach:
First, create a subclass of UIActivityIndicatorView with the following method override:
@interface MyActivityIndicator : UIActivityIndicatorView
@end

@implementation MyActivityIndicator

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    return NO;
}

@end
After that, use MyActivityIndicator everywhere in your project instead of UIActivityIndicatorView (in a NIB file or in your code, depending on where you create it).

-[NSResponder swipeWithEvent:] not called

I am writing an application targeting OS X Lion and Snow Leopard. I have a view that I want to have respond to swipe events. My understanding is that three-finger swipes will call -[NSResponder swipeWithEvent:] if that method is implemented in my custom view. I have already looked at this question and corresponding answers, and tried the following modified stub implementation of Oscar Del Ben's code:
@implementation TestView

- (id)initWithFrame:(NSRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code here.
    }
    return self;
}

- (void)drawRect:(NSRect)dirtyRect
{
    [[NSColor redColor] set];
    NSRectFillUsingOperation(dirtyRect, NSCompositeSourceOver);
}

- (void)swipeWithEvent:(NSEvent *)event {
    NSLog(@"Swipe event detected!");
}

- (void)beginGestureWithEvent:(NSEvent *)event {
    NSLog(@"Gesture detected!");
}

- (void)endGestureWithEvent:(NSEvent *)event {
    NSLog(@"Gesture end detected!");
}

- (void)mouseDown:(NSEvent *)theEvent {
    NSLog(@"mouseDown event detected!");
}

@end
This compiles and runs fine, and the view renders as expected. The mouseDown: event is properly registered. However, none of the other events are triggered. Neither the begin/endGestureWithEvent: methods, nor the swipeWithEvent: method. Which makes me wonder: do I need to set a project/application setting somewhere to properly receive and/or interpret gestures? Thanks in advance for the help.
To receive swipeWithEvent: messages, you have to ensure that the three-finger swipe gesture is not mapped to anything that might cause a conflict. Go to System Preferences -> Trackpad -> More Gestures, and set these preferences as follows:
Swipe between pages: "Swipe with two or three fingers" or "Swipe with three fingers"
Swipe between full-screen apps: "Swipe left or right with four fingers"
Specifically, the swipe between full-screen apps should not be set to three fingers, otherwise you will not get swipeWithEvent: messages.
Together, these two preference settings cause swipeWithEvent: messages to be sent to the first responder.
Of course, you still have to implement the actual swipe logic. And if you want to perform a fluid scroll-swipe à la iOS, then you will need to do a little more work. There is an example of how to do this in the Lion App Kit release notes under the section "Fluid Swipe Tracking."
See http://developer.apple.com/library/mac/#releasenotes/Cocoa/AppKit.html
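As a rough sketch of what that fluid tracking looks like (not a drop-in implementation; the actual scrolling math is left out), something along these lines would go in the view's scrollWheel: override:

- (void)scrollWheel:(NSEvent *)event {
    // Only track when the user has scroll-based swipes enabled and a new gesture just began.
    if (![NSEvent isSwipeTrackingFromScrollEventsEnabled] || event.phase != NSEventPhaseBegan) {
        [super scrollWheel:event];
        return;
    }
    [event trackSwipeEventWithOptions:NSEventSwipeTrackingLockDirection
             dampenAmountThresholdMin:-1.0
                                  max:1.0
                         usingHandler:^(CGFloat gestureAmount, NSEventPhase phase, BOOL isComplete, BOOL *stop) {
        // gestureAmount runs from 0 toward +/-1 as the swipe progresses;
        // move your content by a proportional amount here.
        if (isComplete) {
            // Finish (or cancel) the page transition.
        }
    }];
}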
Try adding [self setAcceptsTouchEvents:YES]; where it says // Initialization code here.
Not sure if it's the problem, but only the key window receives Gestures. Is your window key?
Is your view accepting first responders?
- (BOOL)acceptsFirstResponder
{
    return YES;
}

How to hide window of UIAgent process with cocoa

I have a UIAgent application with one window. I want to hide/show it from another application. How do I do it with Cocoa? It seems the hide/unhide methods of NSRunningApplication don't affect UIAgent processes.
Thanks in advance
I solved it with distributed notifications (NSDistributedNotificationCenter). In the UIAgent application I add an observer for a @"QuitProcessNotification" notification (any other name works too):
[[NSDistributedNotificationCenter defaultCenter]
              addObserver:self
                 selector:@selector(quit:)
                     name:@"QuitProcessNotification"
                   object:@"com.MyCompany.MyApp"
       suspensionBehavior:NSNotificationSuspensionBehaviorDeliverImmediately];
The callback looks like this:
- (void)quit:(NSNotification *)notification
{
    [NSApp terminate:nil];
}
In the main application, send the notification:
[[NSDistributedNotificationCenter defaultCenter]
     postNotificationName:@"QuitProcessNotification"
                   object:@"com.MyCompany.MyApp"
                 userInfo:nil /* no dictionary */
       deliverImmediately:YES];
Be sure that the object parameter is indeed your sender application's bundle identifier.

Custom NSView in NSMenuItem not receiving mouse events

I have an NSMenu popping out of an NSStatusItem using popUpStatusItemMenu. These NSMenuItems show a bunch of different links, and each one is connected with setAction: to the openLink: method of a target. This arrangement has been working fine for a long time. The user chooses a link from the menu and the openLink: method then deals with it.
Unfortunately, I recently decided to experiment with using NSMenuItem's setView: method to provide a nicer/slicker interface. Basically, I just stopped setting the title, created the NSMenuItem, and then used setView: to display a custom view. This works perfectly, the menu items look great and my custom view is displayed.
However, when the user chooses a menu item and releases the mouse, the action no longer works (i.e., openLink: isn't called). If I just simply comment out the setView: call, then the actions work again (of course, the menu items are blank, but the action is executed properly). My first question, then, is why setting a view breaks the NSMenuItem's action.
No problem, I thought, I'll fix it by detecting the mouseUp event in my custom view and calling my action method from there. I added this method to my custom view:
- (void)mouseUp:(NSEvent *)theEvent {
    NSLog(@"in mouseUp");
}
No dice! This method is never called.
I can set tracking rects and receive mouseEntered: events, though. I put a few tests in my mouseEntered routine, as follows:
if ([[self window] ignoresMouseEvents]) { NSLog(@"ignoring mouse events"); }
else { NSLog(@"not ignoring mouse events"); }
if ([[self window] canBecomeKeyWindow]) { dNSLog((@"canBecomeKeyWindow")); }
else { NSLog(@"not canBecomeKeyWindow"); }
if ([[self window] isKeyWindow]) { dNSLog((@"isKeyWindow")); }
else { NSLog(@"not isKeyWindow"); }
And got the following responses:
not ignoring mouse events
canBecomeKeyWindow
not isKeyWindow
Is this the problem? "not isKeyWindow"? Presumably this isn't good because Apple's docs say "If the user clicks a view that isn't in the key window, by default the window is brought forward and made key, but the mouse event is not dispatched." But there must be a way to detect these events. HOW?
Adding:
[[self window] makeKeyWindow];
has no effect, despite the fact that canBecomeKeyWindow is YES.
Add this method to your custom NSView and it will work fine with mouse events
- (void)mouseUp:(NSEvent *)event {
    NSMenuItem *mitem = [self enclosingMenuItem];
    NSMenu *m = [mitem menu];
    [m cancelTracking];
    [m performActionForItemAtIndex:[m indexOfItem:mitem]];
}
But I'm having problems with key handling; if you solved this problem, maybe you can go to my question and help me a little bit.
Add this to your custom view and you should be fine:
- (BOOL)acceptsFirstMouse:(NSEvent *)theEvent
{
    return YES;
}
I added this method to my custom view, and now everything works beautifully:
- (void)viewDidMoveToWindow {
    [[self window] becomeKeyWindow];
}
Hope this helps!
I've updated this version for SwiftUI Swift 5.3:
final class HostingView<Content: View>: NSHostingView<Content> {
    override func viewDidMoveToWindow() {
        window?.becomeKey()
    }
}
And then use like so:
let item = NSMenuItem()
let contentView = ContentView()
item.view = HostingView(rootView: contentView)
let menu = NSMenu()
menu.items = [item]
So far, the only way to achieve the goal is to register a tracking area manually in updateTrackingAreas (which thankfully does get called), like this:
override func updateTrackingAreas() {
    super.updateTrackingAreas()
    // This gets called repeatedly, so clear stale areas before adding a fresh one.
    trackingAreas.forEach(removeTrackingArea)
    let trackingArea = NSTrackingArea(rect: bounds, options: [.enabledDuringMouseDrag, .mouseEnteredAndExited, .activeInActiveApp], owner: self, userInfo: nil)
    addTrackingArea(trackingArea)
}
Recently I needed to show a custom view for an NSStatusItem, show a regular NSMenu when clicking on it, and support drag and drop operations on the status icon.
I solved my problem using, mainly, three different sources that can be found in this question.
Hope it helps other people.
See the sample code from Apple named CustomMenus
In there you'll find a good example in the ImagePickerMenuItemView class.
It's not simple or trivial to make a view in a menu act like a normal NSMenuItem.
There are some real decisions and coding to do.
