I am trying to add full-screen support to my Cocoa application, and I am having trouble figuring out how to handle events. I use the method enterFullScreenMode:withOptions: to enter full-screen mode, but when I do this it seems that an NSFullScreenWindow becomes the first responder and receives events. I am confused about how to override this class to handle keyboard and mouse events (I could find no way to set which class of window becomes the full-screen window). Am I totally off base? Should I use another method to achieve full-screen mode?
I was able to overcome this problem by making the view that became full screen the first responder.
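In code, that amounts to something like this (a minimal sketch; `view` stands in for whatever NSView you send enterFullScreenMode:withOptions: to):

    // In the NSView subclass: agree to become first responder,
    // otherwise makeFirstResponder: will be refused.
    - (BOOL)acceptsFirstResponder {
        return YES;
    }

    // Wherever you enter full screen: after the call, [view window] is the
    // full-screen window, so hand first-responder status back to the view.
    [view enterFullScreenMode:[NSScreen mainScreen] withOptions:nil];
    [[view window] makeFirstResponder:view];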
I want to detect in real time when a window is being moved, and figured that CGDisplayStreamCreate etc. should provide just that. But I'm having difficulty determining which window is being moved when my CGDisplayStreamFrameAvailableHandler is called. Is there a direct way to match the updated rects with an app and its windows?
CGDisplayStream cannot tell you which applications/windows are responsible for a given screen update. You might be able to use another API like Accessibility to determine window locations and then guess which of the kCGDisplayStreamUpdateMovedRects corresponds to each window, but that will not be very reliable. If you're going to go the route of Accessibility, you may as well use Accessibility notifications for window move events: How can my app detect a change to another app's window?.
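If you do go the notification route, a minimal sketch might look like the following, assuming your process is trusted for Accessibility (AXIsProcessTrusted()) and `pid` is the target application's process ID:

    #import <ApplicationServices/ApplicationServices.h>

    // Called whenever one of the target app's windows moves.
    static void windowMoved(AXObserverRef observer, AXUIElementRef element,
                            CFStringRef notification, void *refcon) {
        // `element` is the window that moved; read kAXPositionAttribute
        // from it here if you need the new origin.
    }

    void watchWindowMoves(pid_t pid) {
        AXUIElementRef app = AXUIElementCreateApplication(pid);
        AXObserverRef observer = NULL;
        if (AXObserverCreate(pid, windowMoved, &observer) == kAXErrorSuccess) {
            AXObserverAddNotification(observer, app,
                                      kAXWindowMovedNotification, NULL);
            CFRunLoopAddSource(CFRunLoopGetCurrent(),
                               AXObserverGetRunLoopSource(observer),
                               kCFRunLoopDefaultMode);
        }
    }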
If you also need the pixel contents of the windows when they are moving, then you'll need to do some unfortunate time alignment between CGDisplayStream and Accessibility callbacks.
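For reference, here is a hedged sketch of pulling the moved rects out of a display stream handler; note that the rects describe changed screen regions, not windows, so any mapping back to a window is guesswork:

    // Watch the main display; 'BGRA' is one of the documented pixel formats.
    CGDirectDisplayID display = CGMainDisplayID();
    CGDisplayStreamRef stream = CGDisplayStreamCreateWithDispatchQueue(
        display,
        CGDisplayPixelsWide(display),
        CGDisplayPixelsHigh(display),
        'BGRA',
        NULL,
        dispatch_get_main_queue(),
        ^(CGDisplayStreamFrameStatus status, uint64_t displayTime,
          IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
            if (status != kCGDisplayStreamFrameStatusFrameComplete) return;
            size_t count = 0;
            const CGRect *moved = CGDisplayStreamUpdateGetRects(
                updateRef, kCGDisplayStreamUpdateMovedRects, &count);
            for (size_t i = 0; i < count; i++) {
                // Compare moved[i] against window frames fetched elsewhere
                // (e.g. via Accessibility) to guess which window it was.
                NSLog(@"moved rect: %@",
                      NSStringFromRect(NSRectFromCGRect(moved[i])));
            }
        });
    if (stream) CGDisplayStreamStart(stream);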
Imagine I save my window's position in my preferences file. Now, the user moves the window to a second monitor, then quits my app. Then he disconnects the second monitor and launches my app again.
Now my app wants to restore the window's saved location. But if it blindly restores the old window coordinates, the window will be off-screen.
I used to use ConstrainWindowToScreen in my Carbon app, but now that I'm porting it to Cocoa, I cannot find an equivalent.
The docs suggest that, somehow, Cocoa would automatically prevent this from happening. While that might be the case when the monitors change while the window is open, it won't help in my case, where I've stored the window location myself and restore it when I re-open the window at launch. I need to invoke Cocoa's magic functionality on demand, but how?
(Note: I am aware that I could iterate over all available screens, but that's quite a pain to write myself if I want to get it foolproof. Still, if you can present a complete C or Objective-C function that solves it this way, that'd be appreciated, too.)
See the "Managing Window Frames in User Defaults" section in the NSWindow Class Reference. Those methods ensure that a window will be placed entirely on screen.
If you want to save and restore the window location yourself (as a string), use -stringWithSavedFrame and -setFrameFromString:.
Use -saveFrameUsingName: and -setFrameUsingName: to have NSWindow save and restore its frame in the user defaults, given a window name.
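A hedged sketch of both routes (@"MainWindow" is just an example defaults key):

    // Automatic route: NSWindow keeps the frame in the user defaults.
    [window setFrameUsingName:@"MainWindow"];   // restore (and constrain) at launch
    [window saveFrameUsingName:@"MainWindow"];  // save, e.g. before quitting

    // Manual route: keep the string in your own preferences file.
    NSString *saved = [window stringWithSavedFrame];
    [window setFrameFromString:saved];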
I agree with Darren's suggestion to use the built-in mechanism for restoring window positions. Really, it's as easy as setting a window's frame autosave name in IB (or with -setFrameAutosaveName:).
That said, if a window has a title bar, then all of the methods which order it onto the screen (e.g. -orderFront: or -makeKeyAndOrderFront:) will automatically reposition it to make sure at least the title bar and a significant chunk of the window is on the screen. It's honestly difficult to get a titled window to be theoretically visible but actually off-screen.
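For illustration, a sketch; the autosave name is arbitrary, and if you still want an on-demand equivalent of Carbon's ConstrainWindowToScreen, NSWindow's -constrainFrameRect:toScreen: is worth a look:

    // One line (or set the same name on the window in IB) and NSWindow
    // persists and restores its frame automatically.
    [window setFrameAutosaveName:@"MainWindow"];

    // On-demand constraining of an arbitrary frame to a screen:
    NSRect constrained = [window constrainFrameRect:[window frame]
                                           toScreen:[NSScreen mainScreen]];
    [window setFrame:constrained display:YES];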
Is there any way to detect and respond when a control on a window loses focus?
I want to run some code when the user leaves an NSTableView.
Thanks,
You can do this in 10.6 and later by using KVO to observe the window's firstResponder. It will change when the focused control in the window changes.
Put the code you want to run in the observing object's observeValueForKeyPath:ofObject:change:context: method.
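A hedged sketch (self.tableView is an assumed outlet to the table view in question):

    // Register once, e.g. in -windowDidLoad (10.6+).
    [self.window addObserver:self
                  forKeyPath:@"firstResponder"
                     options:NSKeyValueObservingOptionOld
                     context:NULL];

    - (void)observeValueForKeyPath:(NSString *)keyPath
                          ofObject:(id)object
                            change:(NSDictionary *)change
                           context:(void *)context {
        // The old value is whatever was focused before the change.
        if ([keyPath isEqualToString:@"firstResponder"] &&
            [change objectForKey:NSKeyValueChangeOldKey] == self.tableView) {
            // The table view just lost focus; run your code here.
        }
    }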
I want to write a console program that handles mouse events (only mouse scroll). How do I do it in VC++? The application will listen only to scroll events.
Description: if the user scrolls down, the desktop window fades out, and it fades back in when the user scrolls up.
Here I just need to know how to listen to mouse events in a console app.
Note: I am developing using the Win32 API, and my development environment is VS2010.
I've never actually done this myself. It seems that a console application responding to mouse events almost belies its nature and intended purpose. Generally, you would only need to respond to keyboard input from a console app and leave the mouse stuff to a GUI app.
That being said, this tutorial indicates that it is in fact possible to capture these mouse events from a Win32 console application. Generally, the suggestion is to use the ReadConsoleInput function and extract the information of interest from the INPUT_RECORD structure that it fills. The only tricky thing is that ReadConsoleInput is a blocking call: it will not return until an input event fires. You'll need to structure your application's code accordingly. Mouse events are covered in detail about three-quarters of the way down the page.
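Along the lines of that tutorial, a minimal sketch might look like this; it's plain Win32 C, untested, and ENABLE_EXTENDED_FLAGS is set so QuickEdit mode doesn't swallow the mouse input:

    #include <windows.h>
    #include <stdio.h>

    int main(void) {
        HANDLE hIn = GetStdHandle(STD_INPUT_HANDLE);
        // ENABLE_EXTENDED_FLAGS without ENABLE_QUICK_EDIT_MODE turns
        // QuickEdit off so the console passes mouse events through.
        SetConsoleMode(hIn, ENABLE_EXTENDED_FLAGS | ENABLE_MOUSE_INPUT);

        INPUT_RECORD rec;
        DWORD read;
        for (;;) {
            // Blocks until an input event arrives.
            if (!ReadConsoleInput(hIn, &rec, 1, &read)) break;
            if (rec.EventType == MOUSE_EVENT &&
                rec.Event.MouseEvent.dwEventFlags == MOUSE_WHEELED) {
                // High word of dwButtonState: positive means scroll up.
                short delta = (short)HIWORD(rec.Event.MouseEvent.dwButtonState);
                puts(delta > 0 ? "scroll up" : "scroll down");
            }
        }
        return 0;
    }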
I've been searching forever for a solution to this, so I thought I'd seek out the brainpower of greater minds than mine. I'm developing a Cocoa app that uses the Accessibility API to manipulate another program (it's a hotkey app). The app I'm controlling typically has multiple windows open, with some hidden behind others. What I would like to do, if it's possible, is to send mouse events to windows using the Accessibility API in a way that presses a button in the window without bringing it to the foreground (interact with the window but don't activate it). The reason I'm trying to do this is that sending the mouse event to this other window will force it to the foreground and disrupt the user's interaction with the foremost window.
This is apparently possible on Windows - apps similar to mine do it there - but I'm getting the feeling that it isn't possible with Cocoa, given the way the window manager works. Am I mistaken?
Accessibility is higher-level than that. You send, for example, AXPress actions to AXButton objects, but “press” is not necessarily a click—pressing the space bar while a view is focused, for example, is also a “press”. AXPress is a high-level action that means “do your thing”, which obviously has meaning for some views (such as buttons) and not others (such as fields).
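For illustration, performing the press looks roughly like this (a sketch; `pid` is assumed known, and finding the right AXButton element is left out):

    AXUIElementRef app = AXUIElementCreateApplication(pid);
    CFArrayRef windows = NULL;
    AXUIElementCopyAttributeValue(app, kAXWindowsAttribute,
                                  (CFTypeRef *)&windows);
    // ...descend through kAXChildrenAttribute until you find the button...
    AXError err = AXUIElementPerformAction(button, kAXPressAction);
    if (err != kAXErrorSuccess) {
        NSLog(@"AXPress failed: %d", (int)err);
    }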
Accessibility activating the application does make sense when you look at it from its intended purpose: Assistive devices for disabled users. If the user “presses” something by whatever means, they probably intend to activate the application and work in it.
Quartz Event Services will get you almost there: You can create an event tap for the process you want to control, and you can forge events and send them to a tap. The catch is that you can only send events to a tap when the tap fires—i.e., when the application already has an event to deal with. When it doesn't, you're stuck.
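If you want to experiment with that, here is a rough, untested sketch of the tap approach, using the older GetProcessForPID and CGEventTapCreateForPSN calls; the injected click and its coordinates are made-up examples:

    static CGEventRef tapCallback(CGEventTapProxy proxy, CGEventType type,
                                  CGEventRef event, void *refcon) {
        static BOOL injected = NO;
        if (!injected) {
            injected = YES;
            // Only possible here, while the tap is firing: forge a click
            // and post it to the tap.
            CGEventRef down = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDown,
                                                      CGPointMake(100, 100),
                                                      kCGMouseButtonLeft);
            CGEventTapPostEvent(proxy, down);
            CFRelease(down);
        }
        return event;  // pass the original event through untouched
    }

    // Setup: tap the target process and pump the tap on the run loop.
    ProcessSerialNumber psn;
    GetProcessForPID(targetPid, &psn);  // targetPid assumed known
    CFMachPortRef tap = CGEventTapCreateForPSN(&psn, kCGHeadInsertEventTap,
                                               kCGEventTapOptionDefault,
                                               kCGEventMaskForAllEvents,
                                               tapCallback, NULL);
    CFRunLoopSourceRef source =
        CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
    CGEventTapEnable(tap, true);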