I'm currently trying to build and run an Unreal Engine 4 demo app, which uses SDL2, on an embedded armv7 Linux system with an X server running but no window manager.
What I'm seeing is that the app does not respond to keyboard events, although the mouse works fine.
Digging further, it turns out the problem is that UE4 ignores a keyboard event if the SDL_Event.key.windowID does not match the ID of the app's SDL_Window. I verified this by calling SDL_GetKeyboardFocus(): the window ID for the keyboard focus is 0, whereas the app's window ID is 5.
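For reference, the check amounts to something like the following sketch (logKeyboardFocus is just a name for illustration; window stands for the app's SDL_Window):

    #include <SDL.h>

    // Diagnostic sketch: compare SDL's idea of keyboard focus with the app window.
    static void logKeyboardFocus(SDL_Window *window)  // window = the app's SDL_Window
    {
        SDL_Window *focus = SDL_GetKeyboardFocus();           // window SDL thinks has keyboard focus
        Uint32 focusID = focus ? SDL_GetWindowID(focus) : 0;  // 0 here means no window has focus
        SDL_Log("keyboard focus ID: %u, app window ID: %u", focusID, SDL_GetWindowID(window));
    }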
Digging a bit further inside SDL2, it looks like the keyboard focus is set to a particular SDL window after a FocusIn event is received from the X server (see X11_DispatchEvent() in SDL_x11events.c).
However, if you're running X11 without a window manager, this FocusIn event is never generated by the X server, as already answered here:
FocusIn/FocusOut not generated
After hacking X11_RaiseWindow() in SDL_x11window.c by adding:
    X11_XSetInputFocus(display, data->xwindow, RevertToParent, CurrentTime);  /* ask the X server for input focus directly */
    SDL_SetKeyboardFocus(data->window);                                       /* tell SDL which window now has keyboard focus */
I was finally able to get the keyboard events processed.
Does the above assessment sound correct? If so, is there a clean solution to this? I'm still digging around, but any suggestion would be greatly appreciated.
Thanks
This turned out to be a bug in SDL, which we just pushed a fix for:
https://hg.libsdl.org/SDL/rev/aa4e4768c6c1
This fix will be in SDL 2.0.4, and should bubble over to Unreal shortly thereafter.
Related
I created a Qt application with QML that contains a TextField. If I call SDL_CreateWindow() to create a new SDL2 window, typing in the TextField duplicates every character. For example, if I type "hello" in the TextField, it shows "hheelllloo".
This only occurs after the SDL window has been created. It also happens only on macOS; I compile the same application on Windows and Linux, and the problem does not appear there.
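A minimal repro might look like the sketch below (a plain QLineEdit stands in for the QML TextField for brevity, and the question of pumping SDL's own event loop is deliberately ignored):

    #include <QApplication>
    #include <QLineEdit>
    #include <SDL.h>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);

        QLineEdit edit;  // stand-in for the QML TextField
        edit.show();

        // Creating the SDL window is what reportedly triggers the doubled
        // characters ("hello" -> "hheelllloo") on macOS.
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *window = SDL_CreateWindow("SDL window",
                                              SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                              640, 480, 0);

        int rc = app.exec();  // type into the line edit while both windows exist
        SDL_DestroyWindow(window);
        SDL_Quit();
        return rc;
    }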
I have also seen a similar bug in my Mac game using Cocoa windows with SDL2 2.0.14.
If the text fields are in a modal window, as a workaround you could call SDL_EventState(SDL_TEXTINPUT, SDL_DISABLE) before displaying the modal window and SDL_EventState(SDL_TEXTINPUT, SDL_ENABLE) when it closes.
With a non-modal window, you could try disabling SDL text input when the window becomes active and re-enabling it when it becomes inactive.
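The modal variant might look like this (a sketch; the QDialog is a placeholder for whatever hosts the text fields):

    #include <SDL.h>
    #include <QDialog>

    void runModalWithoutSDLTextInput(QDialog &dialog)
    {
        SDL_EventState(SDL_TEXTINPUT, SDL_DISABLE);  // stop SDL from generating text-input events
        dialog.exec();                               // modal loop; Qt alone handles the typing
        SDL_EventState(SDL_TEXTINPUT, SDL_ENABLE);   // restore normal SDL behaviour afterwards
    }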
Hope that helped.
Hopefully someone will fix this bug soon; I may look into it myself at some point. Note that I am not an SDL maintainer.
Soon I will have to work with OS X, and tools like Hammerspoon are missing some important capabilities for me. I need to be able to completely intercept keyboard and mouse events from the focused application. Say I ctrl+alt+apple+left_click on an application: I don't want the application to know about that left click. So far the only thing I've come up with is to build a transparent fullscreen application, though I'm not sure how feasible that is yet.
Any better idea or hints how to go about this in a language of your choice?
Thanks!
You will need to create an event tap. However, the application will have to run as the root user, or the user will have to grant the application access to the accessibility features.
Apple's documentation can be found here.
Interestingly enough, I am in the process of writing a blog post about how to use event taps (including an Objective-C API that I wrote for my own use), but the post won't be available for another week or so.
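In the meantime, here is a bare-bones sketch of an event tap that swallows ctrl+alt+cmd+left-click, assuming the process is root or accessibility-trusted. This is only an illustration, not the API from the blog post:

    #include <ApplicationServices/ApplicationServices.h>

    static CGEventRef tapCallback(CGEventTapProxy, CGEventType type, CGEventRef event, void *)
    {
        if (type == kCGEventLeftMouseDown) {
            CGEventFlags wanted = kCGEventFlagMaskControl | kCGEventFlagMaskAlternate | kCGEventFlagMaskCommand;
            if ((CGEventGetFlags(event) & wanted) == wanted)
                return NULL;  // returning NULL drops the click; the focused app never sees it
        }
        return event;         // pass everything else through unchanged
    }

    int main()
    {
        CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                             kCGEventTapOptionDefault,  // active tap: may modify or drop events
                                             CGEventMaskBit(kCGEventLeftMouseDown),
                                             tapCallback, NULL);
        if (!tap)
            return 1;  // creation fails without the required privileges

        CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
        CFRunLoopRun();  // service the tap until the process is killed
        return 0;
    }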
I've just encountered a really bizarre scenario and can't find any info on it elsewhere. When Xcode breaks at my breakpoints, keyboard entry for the whole system becomes unresponsive. I can switch to another app, but no keystrokes are registered. Xcode itself is unresponsive to keyboard input.
Anybody else seen this?
I'm running 10.10.1 and Xcode 6.1.
Based on the comments above, it would seem that this issue has to do with behind-the-scenes details of Powerbox. To explain further: my app is sandboxed and calls NSOpenPanel. When breaking (at an Xcode breakpoint) in the completion block of NSOpenPanel, I experience system-wide keyboard input loss.
Keyboard entry behaves normally at breakpoints outside of the call to NSOpenPanel. After working past this area of code, I observed that my subsequent operations (queued in the background from the completion block) often finish before the NSOpenPanel is completely torn down (disappears from the screen). My assumption is that until the NSOpenPanel is removed from the screen (and maybe for a while after), Powerbox won't release control of the keyboard.
Much of this is assumption since I don't have the actual Powerbox code and can't step into it but it seems to fit.
I worked around my debugging issues by utilizing print statements and stepping through code with the variable inspector open. Mouse input continues to function so you can right-click (if you have a two-button mouse) on the variable and print its description at least.
Thanks for the help Ken.
UPDATE
I am now delaying execution of any of my post-NSOpenPanel actions using dispatch_after. On my system a delay of 1 second does the trick. I really don't like adding arbitrary delays, but this seems to work.
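The delay looks roughly like this (a sketch using the C-function variant dispatch_after_f; handlePanelResult and its context are placeholders for whatever previously ran directly in the completion block):

    #include <dispatch/dispatch.h>

    static void handlePanelResult(void *context)
    {
        // ... the work that used to run directly in the NSOpenPanel completion block ...
    }

    static void schedulePostPanelWork(void *context)
    {
        // Give Powerbox a second to tear the panel down before touching anything else.
        dispatch_time_t when = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1 * NSEC_PER_SEC));
        dispatch_after_f(when, dispatch_get_main_queue(), context, handlePanelResult);
    }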
I have a cross-platform Qt application that's running into some trouble on OS X. There's a feature of OS X that I didn't even know existed: the 'Help' key. My MBP doesn't have one, and neither does my Apple wired keyboard purchased a year ago. It seems to be mostly something that older Macs have. Apparently it generates the same scan code as the Insert key on PC keyboards.
Anyway, when the Help key is pressed, the cursor over our application (or any application that receives the Help key event) turns into a little question mark. This seems to be part of what's called 'context-sensitive help mode', as documented for NSHelpManager's setContextHelpModeActive: method and NSApplication's activateContextHelpMode: method. From the docs:
In this mode, the cursor becomes a question mark, and help appears for any user interface item the user clicks. Most applications don't use this method. Instead, applications enter context-sensitive mode when the user presses the Help key. Applications exit context-sensitive help mode upon the first event after a help window is displayed.
How many Cocoa developers actually know about this? I'm assuming that clicking on something in the application with this question-mark cursor should do something like bring up a help message, but I haven't found a single Cocoa application where it actually does anything at all; not even Apple's apps do anything. In fact, it even seems to put a lot of applications into a strange mode where cursor text selection is enabled.
The problem is that when we change the application cursor programmatically in Qt while in this help-question-cursor mode, bad things happen. Specifically, our application crashes. The crash happens deep inside Cocoa, in NSApplication's NSHelpManager. I'd like to find out why we're seeing this crash, but I'm actually more interested in how we can suppress this 'help' mode. There's nothing in Qt or Cocoa that I can see that would stop it, other than perhaps intercepting and squashing an event, which I haven't tried yet.
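On the Qt side, the event-squashing idea might look like the sketch below: an application-wide event filter that consumes Qt::Key_Help. This is untested against the crash described above, and it may not help if Cocoa enters help mode before Qt ever sees the key:

    #include <QApplication>
    #include <QKeyEvent>

    class HelpKeyFilter : public QObject
    {
    protected:
        bool eventFilter(QObject *watched, QEvent *event) override
        {
            if (event->type() == QEvent::KeyPress || event->type() == QEvent::KeyRelease) {
                auto *key = static_cast<QKeyEvent *>(event);
                if (key->key() == Qt::Key_Help)
                    return true;  // swallow the event so it propagates no further
            }
            return QObject::eventFilter(watched, event);
        }
    };

    // Installed on the application object, the filter sees every event:
    //     app.installEventFilter(new HelpKeyFilter);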
Does anyone know any more about this?
I want to observe any window on OS X and detect when it is moved. I don't own the windows, so I can't get at them directly; I think I have to use the Accessibility APIs. I found a solution for the currently active application here: How can my app detect a change to another app's window? but I can't figure out how to modify it so that it works for any open window. I hope somebody can give me a hint about which direction to look in.
As I mentioned in the comments, people usually only want to detect window-move events on focused windows (as unfocused windows seldom move). If you want to detect application switches, you can poke into this sample project by Apple, which shows how to update iChat status with the frontmost application's name. And as you said, there's already a solution for the active window.
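If you do need every window, the per-application part of that solution generalizes to something like the sketch below: register an AXObserver for kAXMovedNotification on each window of a given process, then repeat for every running application (re-scanning as apps launch and quit is omitted here). The process must be trusted for accessibility:

    #include <ApplicationServices/ApplicationServices.h>

    static void movedCallback(AXObserverRef, AXUIElementRef element, CFStringRef, void *)
    {
        // element is the window that just moved
    }

    static void watchAppWindows(pid_t pid)
    {
        AXObserverRef observer = NULL;
        if (AXObserverCreate(pid, movedCallback, &observer) != kAXErrorSuccess)
            return;  // bail out on failure

        AXUIElementRef app = AXUIElementCreateApplication(pid);
        CFArrayRef windows = NULL;
        if (AXUIElementCopyAttributeValue(app, kAXWindowsAttribute,
                                          (CFTypeRef *)&windows) == kAXErrorSuccess) {
            for (CFIndex i = 0; i < CFArrayGetCount(windows); ++i) {
                AXUIElementRef window = (AXUIElementRef)CFArrayGetValueAtIndex(windows, i);
                AXObserverAddNotification(observer, window, kAXMovedNotification, NULL);
            }
            CFRelease(windows);
        }
        CFRelease(app);
        CFRunLoopAddSource(CFRunLoopGetCurrent(),
                           AXObserverGetRunLoopSource(observer), kCFRunLoopDefaultMode);
    }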