NSWindow events at kCGDesktopWindowLevel (macOS)

I am trying to make an interactive desktop application that replaces the wallpaper image. It sounds like it is not possible to have an NSWindow at that level that receives events. According to all of the documentation I can find, a window needs to be at level -598 in order to allow its views to accept such events, but the desktop level is -1000. Is there a way to get past this limitation (if there even is a limitation) with a lightweight API?

Solved. I am so embarrassed!
I printed the value of kCGDesktopWindowLevel and it turned out to be -2147483623, far below the -1000 I had assumed. When I set the level of the window to -1000, which therefore sits well above the real desktop level, it started to work perfectly.
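For anyone landing here later, here is a minimal Objective-C sketch of the fix (the helper name is mine, and the window is assumed to be an existing borderless NSWindow):

    #import <Cocoa/Cocoa.h>

    // Print the real value of kCGDesktopWindowLevel rather than trusting the
    // number quoted elsewhere, then park the window just above the desktop.
    // The original fix simply used -1000, which also sits far above the actual
    // desktop level of -2147483623, so views can still receive events.
    static void MoveWindowAboveDesktop(NSWindow *window) {
        NSLog(@"kCGDesktopWindowLevel = %d", (int)kCGDesktopWindowLevel);
        [window setLevel:kCGDesktopWindowLevel + 1];
        [window makeKeyAndOrderFront:nil];
    }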

Related

Prevent screencapture from capturing app screen on macOS

I'm trying to figure out how to prevent users of my app from snapping a screenshot of any of my app's windows. I'm mainly concerned with users automating screenshots using /usr/sbin/screencapture with cron. At first I thought there was no way to prevent it, but then I discovered that some apps are doing something that causes the screenshot to come out all black or the color of the desktop. If I could pull that off I would be golden. I've seen other posts that touch on the subject, but nothing that actually works in my situation. I'm running Catalina. Any and all insights would be greatly appreciated.
Set the sharingType property on NSWindow to NSWindowSharingNone.
setSharingType: specifies whether the window content can be read and/or written from another process. The default sharing type is NSWindowSharingReadOnly, which means other processes can read the window content (e.g. for window capture) but cannot modify it. If you set your window sharing type to NSWindowSharingNone, so that the content cannot be captured, your window will also not be able to participate in a number of system services, so this setting should be used with caution. If you set your window sharing type to NSWindowSharingReadWrite, other processes can both read and modify the window content.
@property NSWindowSharingType sharingType API_AVAILABLE(macos(10.5));
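For illustration, a hedged Objective-C sketch of applying that setting (the helper is hypothetical; window creation is omitted):

    #import <Cocoa/Cocoa.h>

    // Opt a window out of window capture. Per the documentation quoted above,
    // NSWindowSharingNone also excludes the window from some system services,
    // so apply it only where it is really needed.
    static void ExcludeWindowFromCapture(NSWindow *window) {
        window.sharingType = NSWindowSharingNone;
    }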

How to improve CGWindowListCopyWindowInfo performance

The documentation for CGWindowListCopyWindowInfo says
Generating the dictionaries for system windows is a relatively expensive operation. As always, you should profile your code and adjust your usage of this function appropriately for your needs.
My question is how can I "adjust" my use of this function? For a code automation process I frequently need to check what window is frontmost among those of document or modal level. That is, I call CGWindowListCopyWindowInfo, ignore the windows that belong to other processes or have levels that I don't care about, and identify the first window that remains.
If there were a way to ask for information about just the windows owned by my process, say, that would be nice, but I see no way to do that. Or if there were a way to be notified when my windows change. I could watch for Carbon Events when windows are hidden or shown, but of course that is a deprecated technology.
You can use [NSWindow windowNumbersWithOptions:0] to get the window numbers of just the current application's windows (on the active space) in z-order.
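As a rough sketch of that approach (the function name and the levels checked are illustrative assumptions), you can map the window numbers back to NSWindow objects and filter by level without calling CGWindowListCopyWindowInfo at all:

    #import <Cocoa/Cocoa.h>

    // Walk this app's windows on the active space, front-to-back, and return
    // the first one at a level we care about.
    static NSWindow *FrontmostDocumentOrModalWindow(void) {
        for (NSNumber *number in [NSWindow windowNumbersWithOptions:0]) {
            NSWindow *window = [NSApp windowWithWindowNumber:number.integerValue];
            if (window && (window.level == NSNormalWindowLevel ||
                           window.level == NSModalPanelWindowLevel)) {
                return window;
            }
        }
        return nil;
    }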

Using CGDisplayStream to detect window movement

I want to detect when a window is being moved in real time and figured that CGDisplayStreamCreate etc. should provide just that. But I'm having difficulty determining which window is being moved when my CGDisplayStreamFrameAvailableHandler is called. Is there a direct way to match the updated rects with an app and its windows?
CGDisplayStream cannot tell you which applications/windows are responsible for a given screen update. You might be able to use another API like Accessibility to determine window locations and then guess which of the kCGDisplayStreamUpdateMovedRects corresponds to each window, but that will not be very reliable. If you're going to go the route of Accessibility, you may as well use Accessibility notifications for window move events: How can my app detect a change to another app's window?.
If you also need the pixel contents of the windows when they are moving, then you'll need to do some unfortunate time alignment between CGDisplayStream and Accessibility callbacks.
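For reference, a rough sketch of the CGDisplayStream side (display, sizes, and queue are placeholder choices; on recent macOS this also requires the Screen Recording permission). It only logs the moved rects; correlating them with windows is the part the API does not help with:

    #import <Cocoa/Cocoa.h>

    // Stream main-display updates to a serial queue and log the rects that
    // the window server reports as moved content.
    static CGDisplayStreamRef StartMoveLoggingStream(void) {
        CGDirectDisplayID display = CGMainDisplayID();
        size_t width = CGDisplayPixelsWide(display);
        size_t height = CGDisplayPixelsHigh(display);
        dispatch_queue_t queue =
            dispatch_queue_create("displaystream.moves", DISPATCH_QUEUE_SERIAL);

        CGDisplayStreamRef stream = CGDisplayStreamCreateWithDispatchQueue(
            display, width, height, 'BGRA', NULL, queue,
            ^(CGDisplayStreamFrameStatus status, uint64_t displayTime,
              IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
                if (status != kCGDisplayStreamFrameStatusFrameComplete) return;
                size_t count = 0;
                const CGRect *moved = CGDisplayStreamUpdateGetRects(
                    updateRef, kCGDisplayStreamUpdateMovedRects, &count);
                for (size_t i = 0; i < count; i++) {
                    NSLog(@"moved rect: %@",
                          NSStringFromRect(NSRectFromCGRect(moved[i])));
                }
            });
        if (stream) CGDisplayStreamStart(stream);
        return stream;
    }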

How to redirect an arbitrary window to be rendered to an in-memory backbuffer?

I am experimenting with a home-grown application hosting framework, and I'd like to abstract the input/output so I can gracefully handle crashes. Chrome uses a very similar model.
Is there any way I can take an arbitrary window handle and persuade it to start rendering to a back-buffer? Or should I create my own window first, and then reparent the client app into it?
As the comments said, you can do anything if you're willing to dig in and hook the APIs themselves, but according to the remarks on the MSDN WM_PAINT page, WM_PRINT is the supported way to force a window to paint into a specific DC.
It sounds like you also need to keep the window from showing up on the desktop - in that case you can use WM_SETREDRAW as described in On Win32, can I disable painting of a window for a period of time?.

How do I get keyboard events in an NSStatusWindowLevel window while my application is not frontmost?

After creating a translucent window (based on example code by Matt Gemmell) I want to get keyboard events in this window. It seems that keyboard events arrive only when my application is the active application, whereas I want keyboard events even when my application isn't active but the window is visible.
Basically I want behavior like that provided by the Quicksilver application (by blacktree).
Does anybody have any hints on how to do this?
There are two options:
Use GetEventMonitorTarget() with a tacked-on Carbon run loop to grab keyboard events. Sample code is available on this page at CocoaDev.
Register an event tap with CGEventTapCreate (a minimal sketch follows below). Sample code can be found in this thread from the Apple developer mailing list.
Edit: Note that these methods only work if you check off “Enable access for assistive devices” in the Universal Access preference pane.
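A hedged sketch of the event-tap route (option 2), subject to the assistive-access caveat just mentioned; the names and the listen-only choice are my own:

    #import <Cocoa/Cocoa.h>

    // Listen-only key-down tap at session level.
    static CGEventRef KeyDownCallback(CGEventTapProxy proxy, CGEventType type,
                                      CGEventRef event, void *refcon) {
        if (type == kCGEventKeyDown) {
            int64_t keycode =
                CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
            NSLog(@"key down, keycode %lld", keycode);
        }
        return event;  // a listen-only tap passes the event through unchanged
    }

    static CFMachPortRef InstallKeyDownTap(void) {
        CFMachPortRef tap = CGEventTapCreate(
            kCGSessionEventTap, kCGHeadInsertEventTap, kCGEventTapOptionListenOnly,
            CGEventMaskBit(kCGEventKeyDown), KeyDownCallback, NULL);
        if (!tap) return NULL;  // most likely the accessibility permission is missing
        CFRunLoopSourceRef source =
            CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetMain(), source, kCFRunLoopCommonModes);
        CFRelease(source);
        CGEventTapEnable(tap, true);
        return tap;
    }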
A simpler route that may work better for you is to make your app background-only. The discussion on CocoaDev of the LSUIElement plist key explains how to set it up. Basically, your application will not appear in the Dock or the app switcher, and will not replace the current application's menu bar when activated. From a user perspective it's never the 'active' application, but any windows you open can get activated and respond to events normally. The only caveat is that you'll never get to show your menu bar, so you'll probably have to set up an NSStatusItem (one of those icon menus that show up on the right side of the menu bar) to control your application (e.g. quit, bring up prefs).
Edit: I completely forgot about the Non-Activating Panel checkbox in Interface Builder. You need to use an NSPanel instead of an NSWindow to get this choice. This setting lets your panel accept clicks and keyboard input without activating your application. I'm betting that some mix of this setting and the Carbon Hot Keys API is what Quicksilver is using for its UI.
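If you create the panel in code instead of Interface Builder, the checkbox corresponds (in current SDKs) to the NSWindowStyleMaskNonactivatingPanel style mask; a small sketch with arbitrary sizing and flags:

    #import <Cocoa/Cocoa.h>

    // A panel that accepts clicks and key input without activating the app.
    static NSPanel *MakeNonActivatingPanel(void) {
        NSPanel *panel = [[NSPanel alloc]
            initWithContentRect:NSMakeRect(0, 0, 400, 120)
                      styleMask:(NSWindowStyleMaskTitled |
                                 NSWindowStyleMaskNonactivatingPanel)
                        backing:NSBackingStoreBuffered
                          defer:NO];
        panel.floatingPanel = YES;          // keep it above normal windows
        panel.becomesKeyOnlyIfNeeded = NO;  // take key status when clicked
        [panel makeKeyAndOrderFront:nil];
        return panel;
    }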
Update:
By the way, Apple actually seems to have changed everything again starting with 10.5 (I recently upgraded and my sample code did not work as before).
Now you can only capture key-down events with an event tap if you are either root or assistive devices are enabled, regardless of the level at which you plan to capture and regardless of whether you create the tap as an active tap (which allows you to modify and even discard events) or as listen-only. You can still be notified when modifier flags change (and actually even change them) and receive other events, but key-down events under no other circumstances.
However, using a Carbon event handler and RegisterEventHotKey() you can register a hot key and get notified when it is pressed; you neither need to be root for that nor do you need anything like assistive devices enabled. I think Quicksilver is probably doing it that way.
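For completeness, a minimal Carbon hot-key sketch (the key combination, signature, and IDs are arbitrary); no root privileges or assistive-device access are required for this:

    #import <Carbon/Carbon.h>
    #import <Foundation/Foundation.h>

    // Called on the application's Carbon event target whenever the registered
    // hot key is pressed.
    static OSStatus HotKeyPressed(EventHandlerCallRef nextHandler,
                                  EventRef event, void *userData) {
        NSLog(@"hot key pressed");
        return noErr;
    }

    static void InstallGlobalHotKey(void) {
        EventTypeSpec spec = { kEventClassKeyboard, kEventHotKeyPressed };
        InstallApplicationEventHandler(NewEventHandlerUPP(HotKeyPressed),
                                       1, &spec, NULL, NULL);

        EventHotKeyID hotKeyID;
        hotKeyID.signature = 'htk1';
        hotKeyID.id = 1;
        EventHotKeyRef hotKeyRef = NULL;
        // Cmd-Option-Space, chosen arbitrarily for the example.
        RegisterEventHotKey(kVK_Space, cmdKey | optionKey, hotKeyID,
                            GetApplicationEventTarget(), 0, &hotKeyRef);
    }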
