I am writing a custom app using an MKMapView, and I need to add a placemark to the map by clicking in the map view, but the mouseDown method is never called. I cannot find any documentation on this in the Apple developer help.
Placing an MKAnnotation is not a problem.
Can anyone help me with this?
You'll need to disable mouse interaction with the MKMapView by setting its scrollEnabled and zoomEnabled properties to NO.
Once those are set, mouseDown and the other mouse events fire as expected.
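For example, a minimal sketch of that setup on macOS (untested; the subclass name and the awakeFromNib setup are illustrative, and it assumes MapKit's convert(_:toCoordinateFrom:) to turn the click into a map coordinate):

import Cocoa
import MapKit

class ClickableMapView: MKMapView {

    override func awakeFromNib() {
        super.awakeFromNib()
        // disable built-in mouse handling so mouseDown reaches this subclass
        isScrollEnabled = false
        isZoomEnabled = false
    }

    override func mouseDown(with event: NSEvent) {
        // convert the click from window coordinates to a map coordinate
        let point = convert(event.locationInWindow, from: nil)
        let coordinate = convert(point, toCoordinateFrom: self)

        let annotation = MKPointAnnotation()
        annotation.coordinate = coordinate
        addAnnotation(annotation)
    }
}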
Related
I have an NSView subclass in which I need to detect left and right mouse down events. mouseDown: is working just fine, but rightMouseDown: doesn't fire until the mouse button has been released, at which point both the down and up methods are called in succession. How can I make the right mouse down event trigger its corresponding method immediately?
The problem was that I had an NSPanGestureRecognizer added to the NSView with its buttonMask set to 0x2 (right mouse button). If I remove or disable this gesture recognizer, rightMouseDown: is called as soon as the right button is pressed. I'm still trying to figure out why, but at least now I have a starting point.
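A stripped-down sketch of the setup, with illustrative names: the pan recognizer bound to the right mouse button is what delays rightMouseDown:, and disabling or removing it restores the immediate call.

import AppKit

class RightClickView: NSView {

    override func awakeFromNib() {
        super.awakeFromNib()
        let pan = NSPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        pan.buttonMask = 0x2       // right mouse button
        pan.isEnabled = false      // while enabled, rightMouseDown: is deferred until mouse-up
        addGestureRecognizer(pan)
    }

    @objc private func handlePan(_ recognizer: NSPanGestureRecognizer) {
        // right-button drag handling would go here
    }

    override func rightMouseDown(with event: NSEvent) {
        // fires on press only while the recognizer is disabled or removed
        super.rightMouseDown(with: event)
    }
}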
Subclassing NSButton to capture mouseUp and mouseDown events while retaining the drawing behavior of the NSButton superclass.
My goal is as stated above: to subclass NSButton and have it perform its regular superclass functionality while allowing me to override mouseDown and mouseUp and send the NSEvent back to the action, so the button code can examine what type of event occurred and respond accordingly.
As I experimented with the NSButton subclass, I noticed that when you override mouseDown as below:
override open func mouseDown(with event: NSEvent) {
    super.mouseDown(with: event)
    // call the target with left button down
    _ = target?.perform(action, with: event)
}
What appears to be happening is that super.mouseDown runs NSButton's own mouse-tracking loop, which captures the subsequent mouse events, so your NSButton subclass never receives the corresponding mouseUp event.
Seeing as that is how things are working, I simply overrode mouseDown and mouseUp without calling any of the superclass implementations. This does deliver an event for both mouseDown and mouseUp; I forward the event to the action and let the action code work out which event occurred, and everything is fine. The only caveat is that the default behavior for the button state no longer occurs: the button stays in its original, non-selected state. If I change the state and force an update to the button, one would think it would draw the button in its selected state, but this does not happen, which is why I am writing for some assistance.
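For reference, a simplified version of what I'm doing now looks roughly like this (the class name is illustrative):

import AppKit

class EventForwardingButton: NSButton {

    override func mouseDown(with event: NSEvent) {
        // deliberately not calling super, so NSButton's tracking loop never runs
        _ = target?.perform(action, with: event)
    }

    override func mouseUp(with event: NSEvent) {
        _ = target?.perform(action, with: event)
    }
}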
I would love to have the default drawing behavior of NSButton's mouseDown occur. Is there a way to set a property of NSButton (i.e. its state) and force a redraw? I can't seem to manage it. If there is no way to do this, then I will be forced to draw the button's content in its selected state somehow by overriding the draw method.
Any help would be appreciated.
You could inspect the mouseUp event in the @IBAction of the button instead:
@IBAction private func buttonClicked(_: AnyObject?) {
    NSLog("Event: %@", String(describing: NSApp.currentEvent))
}
If you don’t need the mouseDown event then there might be no need to subclass.
Note that the button can also be activated with the keyboard, in which case you won't receive mouse events, but you will still receive the event from the action above.
I've spent the last couple of days reading through docs and answers trying to find a solution for this. So I resort to this, at the risk of it being a duplicate, as a cry for help.
I have a GestureOverlayView that reads horizontal swipe gestures. This sits on top of a ViewPager.
I need to disable the functioning of the view pager when the touch event is read on the gesture panel.
I've extended ViewPager to let me enable and disable it with a member method.
I overrode the onTouch event of the gesture overlay to return true (consuming the event) and to disallow intercepting touch events on its parent (only on ACTION_MOVE/ACTION_DOWN). I also tried to disable the ViewPager in onTouch, but the ViewPager's onTouch event is fired first.
How do I achieve this?
I want to respond to the drag event when a file is being dragged, but not when a window is being dragged.
I got the mouseDragged event like this:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSLeftMouseDraggedMask
                                       handler:^(NSEvent *mouseDraggedEvent) {
    // do something with the event
}];
The drag-and-drop system is implemented on top of the event-handling system. The event-handling system (which is what you are monitoring) has no concept of what is being dragged, only that a mouse drag is occurring.
If you want to know what is being dragged, you will need to record the location of the mouse at the start of the drag and use that to work out what was under it. You could use the accessibility APIs for this.
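A rough sketch of that approach in Swift is below. It assumes the app has been granted Accessibility permission, and the class and method names are illustrative, not an established API:

import Cocoa
import ApplicationServices

final class FileDragObserver {
    private var dragStartLocation = CGPoint.zero
    private var monitors: [Any] = []

    func start() {
        // Remember where the left button went down. CGEvent locations use the
        // top-left-origin screen coordinates that the accessibility calls expect.
        if let down = NSEvent.addGlobalMonitorForEvents(matching: .leftMouseDown, handler: { [weak self] _ in
            self?.dragStartLocation = CGEvent(source: nil)?.location ?? .zero
        }) {
            monitors.append(down)
        }

        // On drag, ask the accessibility API what sits under the original mouse-down point.
        if let drag = NSEvent.addGlobalMonitorForEvents(matching: .leftMouseDragged, handler: { [weak self] _ in
            self?.inspectElementUnderDragStart()
        }) {
            monitors.append(drag)
        }
    }

    private func inspectElementUnderDragStart() {
        let systemWide = AXUIElementCreateSystemWide()
        var element: AXUIElement?
        guard AXUIElementCopyElementAtPosition(systemWide,
                                               Float(dragStartLocation.x),
                                               Float(dragStartLocation.y),
                                               &element) == .success,
              let element = element else { return }

        var role: CFTypeRef?
        if AXUIElementCopyAttributeValue(element, kAXRoleAttribute as CFString, &role) == .success {
            // e.g. distinguish a drag that started on a file icon or list row
            // from one that started on a window title bar
            print("Dragging something with role: \(String(describing: role))")
        }
    }
}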
I have two subclasses of NSView that are subviews of a common superview. They don't overlap, and they both intercept mouseDragged: calls. When I drag from one of the subviews into the other, mouseDragged: keeps being called on the view where the drag started until I release the mouse button, even if I drag all over the screen. I thought the default behavior was for mouseDragged: to be called only while the mouse was over the bounds of the receiver.
I am also using NSTrackingArea for mouse entered, exited, and moved events, but from what I've been reading it does not cover drag events.
Thank you for your time,
Jose.
You could subclass the NSWindow and override sendEvent:. That way, you can intercept the NSLeftMouseDragged events and dispatch them in whatever way you wish.
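For example, a minimal sketch of that idea (untested; the class name is illustrative), routing dragged events to whichever view is currently under the cursor:

import AppKit

class DragRoutingWindow: NSWindow {

    override func sendEvent(_ event: NSEvent) {
        guard event.type == .leftMouseDragged, let contentView = contentView else {
            super.sendEvent(event)
            return
        }
        // hitTest(_:) takes a point in the receiver's superview coordinates; the
        // window's frame view fills the window, so window coordinates work here.
        if let target = contentView.hitTest(event.locationInWindow) {
            target.mouseDragged(with: event)
        } else {
            super.sendEvent(event)
        }
    }
}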