I have an MKMapView where move up, move down, and zoom work out of the box on key press. However, move left/move right doesn't.
It works in the default Maps.app, and Maps doesn't use any subclassing.
I tried it with a map that displays the compass, without success.
I have subclassed MKMapView and I am not getting the right/left key presses:
- (void)keyDown:(NSEvent *)event
{
[super keyDown:event];
}
What am I missing?
Tested on macOS 10.12.
In your subclass of MKMapView you need to override acceptsFirstResponder:
override var acceptsFirstResponder: Bool {
return true
}
Then everything works. I just tested it.
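For reference, here is a minimal Swift sketch of such a subclass (the class name is mine) that keeps MKMapView's default arrow-key handling:
import AppKit
import MapKit

// Accepting first-responder status lets the map receive key events,
// including left/right arrow panning.
class KeyboardMapView: MKMapView {
    override var acceptsFirstResponder: Bool {
        return true
    }

    override func keyDown(with event: NSEvent) {
        // Let MKMapView do its built-in pan/zoom handling.
        super.keyDown(with: event)
    }
}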
Related
I'm trying to perform a certain action by pressing, for example, the spacebar (anywhere). In my code I've got the acceptsFirstResponder method and the keyDown method, but I'm not getting an NSLog message.
Here's the code:
- (BOOL)acceptsFirstResponder
{
return YES;
}
- (void)keyDown:(NSEvent *)theEvent {
NSLog(#"test");
}
You need to put your -keyDown: method on an NSView subclass, and that NSView subclass has to be put in a window, and that window has to be on-screen, and you have to click on your view before you hit a key. Then the key will go to your view.
Check the diagram “The Path of Key Events” in Apple's Cocoa Event Handling Guide.
In addition to implementing keyDown in your NSView, you also need to implement acceptsFirstResponder and have it return YES:
- (BOOL)acceptsFirstResponder
{
return YES;
}
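Putting both pieces together, a minimal Swift sketch (the class name and the spacebar check are illustrative) of a view that reacts to key presses:
import AppKit

// The view must be in an on-screen window and be clicked (or otherwise made
// first responder) before it receives keyDown events.
class KeyCatcherView: NSView {
    override var acceptsFirstResponder: Bool {
        return true
    }

    override func keyDown(with event: NSEvent) {
        if event.keyCode == 49 { // 49 is the spacebar's virtual key code (kVK_Space)
            NSLog("test")
        } else {
            super.keyDown(with: event)
        }
    }
}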
[self.scrollView scrollRectToVisible:textField.bounds animated:YES];
I can't seem to get my UIScrollView to scroll at all so that it doesn't obscure my UITextField. I thought scrollRectToVisible:animated: would be my savior, but it looks like a no-go. Maybe I'm missing something, like translating the coordinates of my textField to my scrollView. Either way, check out my sample project:
https://github.com/stevemoser/Programming-iOS-Book-Examples/tree/master/ch20p573scrollViewAutoLayout2
Oh, and this project might be missing the delegate connection but I checked that and it still doesn't scroll.
I've seen other questions similar to this but none that mention Autolayout.
I was having issues with scrollRectToVisible:animated: as well after converting to Auto Layout. I just changed it to a direct call to setContentOffset:animated: and it started working again.
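Something along these lines; a sketch (the helper name is mine, and it assumes the text field is a descendant of the scroll view):
import UIKit

// Scroll so the text field is visible by setting the content offset directly
// instead of relying on scrollRectToVisible:animated:.
func scroll(_ scrollView: UIScrollView, toReveal textField: UITextField, animated: Bool = true) {
    let target = scrollView.convert(textField.bounds, from: textField)
    let visibleHeight = scrollView.bounds.height - scrollView.contentInset.bottom
    let offsetY = max(0, target.maxY - visibleHeight)
    scrollView.setContentOffset(CGPoint(x: 0, y: offsetY), animated: animated)
}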
I had the same problem: I wanted to scroll an auto-laid-out UITextField into view without making it the first responder.
For me the issue was that the bounds of the UITextField were only set later on, during the Auto Layout pass, so if you do it immediately after setting up the layout, the bounds are not valid yet.
To work around this, I created a subclass of UITextField, overrode setBounds:, and added a zero-delay timer to scroll into view "later on" (you can't scroll at that moment because the system's Auto Layout pass might not be finished yet):
@interface MyTextField : UITextField
{
    bool _scrollIntoView;
}
..
@end

@implementation MyTextField

- (void)setBounds:(CGRect)bounds
{
    bool empty = CGRectIsEmpty(self.bounds);
    bool isFirstResponder = self.isFirstResponder;
    [super setBounds:bounds];
    if (empty && !isFirstResponder && _scrollIntoView)
        [self performSelector:@selector(scrollIntoViewLater) withObject:nil afterDelay:0];
    else if (empty && isFirstResponder)
        [self performSelector:@selector(becomeFirstResponder) withObject:nil afterDelay:0];
}

- (void)scrollIntoViewLater
{
    // scrollView is an ivar/property referencing the enclosing UIScrollView (set elsewhere)
    CGRect r = [scrollView convertRect:self.bounds fromView:self];
    [scrollView scrollRectToVisible:r animated:YES];
}

@end
If the field should also be editable with the on-screen keyboard, simply call becomeFirstResponder later on: it scrolls automatically into view above the keyboard using the private scrollTextFieldToVisible API, which in turn calls scrollRectToVisible:animated: on the scroll view.
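For completeness, the deferred call can be as small as this sketch (the helper name is mine):
import UIKit

// Defer becoming first responder until after the current Auto Layout pass,
// so UIKit's scroll-above-keyboard behavior has valid bounds to work with.
func focusLater(_ textField: UITextField) {
    DispatchQueue.main.async {
        textField.becomeFirstResponder()
    }
}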
Your sample link is broken btw...
I have a view-based NSTableView. Each view in the table has a custom text field.
I'd like to fire an action when the user clicks on the text field (label) inside the table's view (imagine having a hyperlink with a custom action in each table cell).
I've created a basic NSTextField subclass to catch mouse events. However, they only fire on the second click, not the first click.
I tried using an NSButton and that fires right away.
Here's the code for the custom label:
@implementation HyperlinkTextField

- (void)mouseDown:(NSEvent *)theEvent {
    NSLog(@"link mouse down");
}

- (void)mouseUp:(NSEvent *)theEvent {
    NSLog(@"link mouse up");
}

- (BOOL)acceptsFirstResponder {
    return YES;
}

- (BOOL)acceptsFirstMouse:(NSEvent *)theEvent {
    return YES;
}

@end
I had the same problem. The accepted answer here didn't work for me. After much struggle, it magically worked when I set the table view's "Highlight" option in IB to "None" instead of the default "Regular" (the other option being "Source List")!
Edit: The accepted answer turns out to be misleading, as the method is to be overridden on the table view and not on the text field as the answer suggests. This is explained more clearly at https://stackoverflow.com/a/13579469/804616, but in any case, having to be this specific feels a bit hacky.
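For reference, the programmatic equivalent of that IB setting is a sketch like this:
import AppKit

// Same effect as choosing "None" for the table view's Highlight option in IB.
func disableRowHighlight(in tableView: NSTableView) {
    tableView.selectionHighlightStyle = .none
}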
It turned out that NSTableView and NSOutlineView handle first-responder status for NSTextField instances differently than for an NSButton.
The key to getting the label to respond to the first click like a button is to override -[NSResponder validateProposedFirstResponder:forEvent:] and return YES when the proposed responder is my custom text field class.
Documentation:
http://developer.apple.com/library/mac/documentation/Cocoa/Reference/ApplicationKit/Classes/NSResponder_Class/Reference/Reference.html#//apple_ref/occ/instm/NSResponder/validateProposedFirstResponder:forEvent:
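Following the note above that the override belongs on the table view, a Swift sketch might look like this (the table view class name is mine):
import AppKit

// Approve the custom label as first responder for the incoming click so it
// reacts on the first click instead of the second.
class ClickThroughTableView: NSTableView {
    override func validateProposedFirstResponder(_ responder: NSResponder, for event: NSEvent?) -> Bool {
        if responder is HyperlinkTextField {
            return true
        }
        return super.validateProposedFirstResponder(responder, for: event)
    }
}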
The behavior that you're seeing is because the table view is the first responder, which it should be or the row won't change when you click on the label -- this is the behavior that a user expects when clicking on a table row. Instead of subclassing the label, I think it would be better to subclass the table view and override mouseDown: there. After calling the super's implementation of mouseDown:, you can do a hit test to check that the user clicked over the label.
@implementation CustomTable

- (void)mouseDown:(NSEvent *)theEvent
{
    [super mouseDown:theEvent];
    NSPoint point = [self convertPoint:theEvent.locationInWindow fromView:nil];
    NSView *theView = [self hitTest:point];
    if ([theView isKindOfClass:[NSTextField class]])
    {
        NSLog(@"%@", [(NSTextField *)theView stringValue]);
    }
}

@end
In the exact same situation, embedding an NSButton with transparent set to true/YES worked for me.
class LinkButton: NSTextField {
    var clickableButton: NSButton?

    override func viewDidMoveToSuperview() {
        let button = NSButton()
        self.addSubview(button)
        // Setting constraints to cover the whole textfield area (I'm making use of SnapKit here;
        // you should add the constraints your way or use frames)
        button.snp_makeConstraints { (make) -> Void in
            make.edges.equalTo(NSEdgeInsetsZero)
        }
        button.target = self
        button.action = Selector("pressed:")
        button.transparent = true
    }

    func pressed(sender: AnyObject) {
        print("pressed")
    }
}
You use window.makeFirstResponder(myTextfield) to begin editing the text field. You send this message from your override of the mouseDown(with event: NSEvent) method.
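A sketch of that approach on the label subclass (the class name is mine):
import AppKit

// Begin editing on the first click by making the field first responder
// from mouseDown, then let the normal handling proceed.
class ClickToEditTextField: NSTextField {
    override func mouseDown(with event: NSEvent) {
        window?.makeFirstResponder(self)
        super.mouseDown(with: event)
    }
}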
I am writing an application targeting OS X Lion and Snow Leopard. I have a view that I want to have respond to swipe events. My understanding is that three-finger swipes will call -[NSResponder swipeWithEvent:] if that method is implemented in my custom view. I have already looked at this question and corresponding answers, and tried the following modified stub implementation of Oscar Del Ben's code:
@implementation TestView

- (id)initWithFrame:(NSRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code here.
    }
    return self;
}

- (void)drawRect:(NSRect)dirtyRect
{
    [[NSColor redColor] set];
    NSRectFillUsingOperation(dirtyRect, NSCompositeSourceOver);
}

- (void)swipeWithEvent:(NSEvent *)event {
    NSLog(@"Swipe event detected!");
}

- (void)beginGestureWithEvent:(NSEvent *)event {
    NSLog(@"Gesture detected!");
}

- (void)endGestureWithEvent:(NSEvent *)event {
    NSLog(@"Gesture end detected!");
}

- (void)mouseDown:(NSEvent *)theEvent {
    NSLog(@"mouseDown event detected!");
}

@end
This compiles and runs fine, and the view renders as expected. The mouseDown: event is properly registered. However, none of the other events are triggered: neither the begin/endGestureWithEvent: methods nor the swipeWithEvent: method. Which makes me wonder: do I need to set a project/application setting somewhere to properly receive and/or interpret gestures? Thanks in advance for the help.
To receive swipeWithEvent: messages, you have to ensure that the three-finger swipe gesture is not mapped to anything that might cause a conflict. Go to System Preferences -> Trackpad -> More Gestures, and set these preferences to one of the following:
Swipe between pages:
Swipe with two or three fingers, or
Swipe with three fingers
Swipe between full-screen apps:
Swipe left or right with four fingers
Specifically, the swipe between full-screen apps should not be set to three fingers, otherwise you will not get swipeWithEvent: messages.
Together, these two preference settings cause swipeWithEvent: messages to be sent to the first responder.
Of course, you still have to implement the actual swipe logic. And if you want to perform a fluid scroll-swipe à la iOS, then you will need to do a little more work. There is an example of how to do this in the Lion App Kit release notes under the section "Fluid Swipe Tracking."
See http://developer.apple.com/library/mac/#releasenotes/Cocoa/AppKit.html
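With the trackpad preferences set as above, a minimal Swift sketch of the view-side handling (the class name is mine; for swipe events, deltaX/deltaY report the swipe direction):
import AppKit

// A view that accepts first-responder status and logs three-finger swipes.
class SwipeView: NSView {
    override var acceptsFirstResponder: Bool {
        return true
    }

    override func swipe(with event: NSEvent) {
        if event.deltaX != 0 {
            print("horizontal swipe, deltaX = \(event.deltaX)")
        } else if event.deltaY != 0 {
            print("vertical swipe, deltaY = \(event.deltaY)")
        }
    }
}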
Try [self setAcceptsTouchEvents:YES]; where it says // Initialization code here.
Not sure if it's the problem, but only the key window receives gestures. Is your window key?
Is your view accepting first responders?
- (BOOL) acceptsFirstResponder
{
return YES;
}
I have an NSMenu popping out of an NSStatusItem using popUpStatusItemMenu. These NSMenuItems show a bunch of different links, and each one is connected with setAction: to the openLink: method of a target. This arrangement has been working fine for a long time. The user chooses a link from the menu and the openLink: method then deals with it.
Unfortunately, I recently decided to experiment with using NSMenuItem's setView: method to provide a nicer/slicker interface. Basically, I just stopped setting the title, created the NSMenuItem, and then used setView: to display a custom view. This works perfectly, the menu items look great and my custom view is displayed.
However, when the user chooses a menu item and releases the mouse, the action no longer works (i.e., openLink: isn't called). If I just simply comment out the setView: call, then the actions work again (of course, the menu items are blank, but the action is executed properly). My first question, then, is why setting a view breaks the NSMenuItem's action.
No problem, I thought, I'll fix it by detecting the mouseUp event in my custom view and calling my action method from there. I added this method to my custom view:
- (void)mouseUp:(NSEvent *)theEvent {
NSLog(#"in mouseUp");
}
No dice! This method is never called.
I can set tracking rects and receive mouseEntered: events, though. I put a few tests in my mouseEntered routine, as follows:
if ([[self window] ignoresMouseEvents]) { NSLog(@"ignoring mouse events"); }
else { NSLog(@"not ignoring mouse events"); }
if ([[self window] canBecomeKeyWindow]) { dNSLog((@"canBecomeKeyWindow")); }
else { NSLog(@"not canBecomeKeyWindow"); }
if ([[self window] isKeyWindow]) { dNSLog((@"isKeyWindow")); }
else { NSLog(@"not isKeyWindow"); }
And got the following responses:
not ignoring mouse events
canBecomeKeyWindow
not isKeyWindow
Is this the problem? "not isKeyWindow"? Presumably this isn't good because Apple's docs say "If the user clicks a view that isn’t in the key window, by default the window is brought forward and made key, but the mouse event is not dispatched." But there must be a way to detect these events. HOW?
Adding:
[[self window] makeKeyWindow];
has no effect, despite the fact that canBecomeKeyWindow is YES.
Add this method to your custom NSView and it will work fine with mouse events:
- (void)mouseUp:(NSEvent*) event {
NSMenuItem* mitem = [self enclosingMenuItem];
NSMenu* m = [mitem menu];
[m cancelTracking];
[m performActionForItemAtIndex: [m indexOfItem: mitem]];
}
But I'm having problems with key handling; if you solved that problem, maybe you can go to my question and help me a little bit.
Add this to your custom view and you should be fine:
- (BOOL)acceptsFirstMouse:(NSEvent *)theEvent
{
return YES;
}
I added this method to my custom view, and now everything works beautifully:
- (void)viewDidMoveToWindow {
[[self window] becomeKeyWindow];
}
Hope this helps!
I've updated this version for SwiftUI (Swift 5.3):
final class HostingView<Content: View>: NSHostingView<Content> {
override func viewDidMoveToWindow() {
window?.becomeKey()
}
}
And then use it like so:
let item = NSMenuItem()
let contentView = ContentView()
item.view = HostingView(rootView: contentView)
let menu = NSMenu()
menu.items = [item]
So far, the only way to achieve the goal is to register a tracking area manually in updateTrackingAreas (which thankfully is called), like this:
override func updateTrackingAreas() {
    super.updateTrackingAreas()
    // Remove previously added areas so they don't pile up on repeated calls.
    trackingAreas.forEach(removeTrackingArea)
    let trackingArea = NSTrackingArea(rect: bounds, options: [.enabledDuringMouseDrag, .mouseEnteredAndExited, .activeInActiveApp], owner: self, userInfo: nil)
    addTrackingArea(trackingArea)
}
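Once that tracking area is registered, the usual mouse overrides fire; here is a sketch of what the custom menu-item view might do with them (the class name is mine, and the mouseUp forwarding mirrors the earlier answer):
import AppKit

// Reacts to hover and forwards clicks to the menu item's action.
// Assumes the updateTrackingAreas override above registered the area.
class MenuItemView: NSView {
    override func mouseEntered(with event: NSEvent) {
        needsDisplay = true   // redraw with a highlight in draw(_:)
    }

    override func mouseExited(with event: NSEvent) {
        needsDisplay = true   // redraw without the highlight
    }

    override func mouseUp(with event: NSEvent) {
        if let item = enclosingMenuItem, let menu = item.menu {
            menu.cancelTracking()
            menu.performActionForItem(at: menu.index(of: item))
        }
    }
}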
Recently I needed to show a custom view for an NSStatusItem, show a regular NSMenu when clicking on it, and support drag-and-drop operations on the status icon.
I solved my problem using, mainly, three different sources that can be found in this question.
Hope it helps other people.
See the sample code from Apple named CustomMenus.
In there you'll find a good example in the ImagePickerMenuItemView class.
It's not simple or trivial to make a view in a menu act like a normal NSMenuItem.
There are some real decisions and coding to do.