OSX Carbon: Quartz event taps to get keyboard input

I want to get keyboard input on OSX using C++ without using Cocoa, without the deprecated Carbon UPP handlers, and if possible without using IOHID, since that's a lot of extra work.
I already implemented a simple mouse class using Quartz event taps and it works like a charm, and now I'd like to use them to implement a keyboard class. Anyway, as the reference states under CGEventTapCreate:
http://developer.apple.com/library/mac/#documentation/Carbon/Reference/QuartzEventServicesRef/Reference/reference.html
you can only access key events if one of the following is true:
- The current process is running as the root user.
- Access for assistive devices is enabled. In Mac OS X v10.4, you can enable this feature using System Preferences, Universal Access panel, Keyboard view.
That is a very serious limitation, since I also want my application to work without requiring any unusual settings. Is there any way to work around this? If not, is there any alternative to using taps in Carbon?
Thanks!

The easiest way is to use the semi-deprecated Carbon function RegisterEventHotKey; see this SO Q&A, for example.
If not, you need to live with that restriction. The restriction is there to prevent a bad actor from installing a keylogger behind the scenes. You need to ask the user to open the preferences, type the admin password, etc.
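For reference, a minimal RegisterEventHotKey sketch in plain C++ might look roughly like the following. The hot key ID ('htk1', 1) and the Cmd+Option+K combination are arbitrary choices for illustration, not anything required by the API.

```cpp
// Minimal sketch: register Cmd+Option+K as a global hot key via the
// semi-deprecated Carbon RegisterEventHotKey API.
// Compile with: clang++ hotkey.cpp -framework Carbon -o hotkey
#include <Carbon/Carbon.h>
#include <cstdio>

static OSStatus HotKeyHandler(EventHandlerCallRef nextHandler,
                              EventRef event, void *userData) {
    // Pull out the ID of the hot key that fired.
    EventHotKeyID hotKeyID;
    GetEventParameter(event, kEventParamDirectObject, typeEventHotKeyID,
                      NULL, sizeof(hotKeyID), NULL, &hotKeyID);
    std::printf("Hot key %u pressed\n", (unsigned)hotKeyID.id);
    return noErr;
}

int main() {
    // Install a handler for hot-key-pressed events on the application target.
    EventTypeSpec eventType = { kEventClassKeyboard, kEventHotKeyPressed };
    InstallApplicationEventHandler(NewEventHandlerUPP(HotKeyHandler),
                                   1, &eventType, NULL, NULL);

    // Register Cmd+Option+K ('htk1'/1 is an arbitrary ID for this example).
    EventHotKeyID hotKeyID = { 'htk1', 1 };
    EventHotKeyRef hotKeyRef = NULL;
    RegisterEventHotKey(kVK_ANSI_K, cmdKey | optionKey, hotKeyID,
                        GetApplicationEventTarget(), 0, &hotKeyRef);

    RunApplicationEventLoop();  // Blocks; hot key events arrive in the handler.
    return 0;
}
```

The handler receives a kEventHotKeyPressed event whose direct-object parameter is the EventHotKeyID you registered, so one handler can serve several hot keys.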

You could try to use AXMakeProcessTrusted. This is supposed to be the equivalent of enabling Access for assistive devices, but on a per-process basis.
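For completeness, here is a rough sketch of the kind of key-event tap the question asks for, in plain C++ with no Cocoa. It assumes the process is already trusted for assistive access (or running as root); otherwise CGEventTapCreate simply returns NULL, as the documentation warns.

```cpp
// Listen-only keyboard event tap sketch.
// Compile with: clang++ keytap.cpp -framework ApplicationServices -o keytap
#include <ApplicationServices/ApplicationServices.h>
#include <cstdint>
#include <cstdio>

static CGEventRef KeyCallback(CGEventTapProxy proxy, CGEventType type,
                              CGEventRef event, void *refcon) {
    if (type == kCGEventKeyDown || type == kCGEventKeyUp) {
        int64_t keycode =
            CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
        std::printf("%s keycode %lld\n",
                    type == kCGEventKeyDown ? "down" : "up",
                    (long long)keycode);
    }
    return event;  // Pass the event through unchanged.
}

int main() {
    CGEventMask mask =
        CGEventMaskBit(kCGEventKeyDown) | CGEventMaskBit(kCGEventKeyUp);

    // Listen-only tap at the session level; returns NULL if the process
    // is not allowed to tap key events (not root, not trusted).
    CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,
                                         kCGHeadInsertEventTap,
                                         kCGEventTapOptionListenOnly,
                                         mask, KeyCallback, NULL);
    if (!tap) {
        std::fprintf(stderr, "Failed to create key tap (not trusted/root?)\n");
        return 1;
    }

    CFRunLoopSourceRef source =
        CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
    CGEventTapEnable(tap, true);
    CFRunLoopRun();
    return 0;
}
```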

Related

CGEventTap mouse event position overwrite only possible when running app as root user?

I am developing a macOS app which takes control of the cursor. I am using a CGEventTap and I am adding some arithmetic to the CGEvents in order to offset the final mouse position. Although the app is in principle working as expected, in some cases - more specifically, when running the app alongside certain popular illustration software and using a stylus pen - the app produces some flickering 'ghost' positions for the mouse at its original location. The good thing is, this problem goes away when running the app while logged in as the root user. I have read quite a few SO posts, but this particular post addresses the issue best:
https://stackoverflow.com/a/9899901/5066660
As described in this post the issue is probably:
Unfortunately, the CGEventTapCreate() doc says:
Only processes running as the root user may locate an event tap at the point where HID events enter the window server; for other users, this function returns NULL.
Well, the function is definitely not returning NULL, because the tap is in effect. I also tried all possible combinations of arguments for that function, but they all behave the same. Further down in that post it is proposed to tackle the problem as follows:
Perhaps you can spin this functionality off into a separate process that has super-user permissions, leaving the rest of your app in normal user mode? I believe there's also a way to request root permissions for just a specific action taken by your program.
Now if this is a possible solution, I would love to implement it! So my question is: how? I've stumbled upon ways to run scripts with elevated permissions, but not plain CoreGraphics code such as a CGEventTap. Is this possible? Could anybody give me an example of how this could be accomplished, or suggest any other solution to the problem?
All help is welcome, thank you very very much.
From Apple documentation:
Event taps receive key up and key down events if one of the following conditions is true:
- The current process is running as the root user.
- Access for assistive devices is enabled. In OS X v10.4, you can enable this feature using System Preferences, Universal Access panel, Keyboard view.
So giving the application Accessibility rights solved it for me (no need to run as root). This can be done in System Preferences -> Security & Privacy -> Accessibility: add your program there.
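To illustrate both halves of this, here is a hedged C++ sketch that first asks for Accessibility trust via AXIsProcessTrustedWithOptions (available since 10.9, and equivalent to adding the app under Security & Privacy -> Accessibility) and then installs an active mouse tap that shifts the reported cursor position. The +50 px offset and the particular event types tapped are placeholder assumptions, not the asker's actual arithmetic.

```cpp
// Prompt for Accessibility trust, then offset mouse positions in a tap.
// Compile with: clang++ mousetap.cpp -framework ApplicationServices -o mousetap
#include <ApplicationServices/ApplicationServices.h>
#include <cstdio>

static CGEventRef OffsetCallback(CGEventTapProxy proxy, CGEventType type,
                                 CGEventRef event, void *refcon) {
    if (type == kCGEventMouseMoved || type == kCGEventLeftMouseDragged) {
        CGPoint p = CGEventGetLocation(event);
        p.x += 50.0;                   // placeholder offset, for illustration
        CGEventSetLocation(event, p);  // modify the event in place
    }
    return event;
}

int main() {
    // Ask macOS to show the "grant this app Accessibility access" prompt.
    CFStringRef keys[]   = { kAXTrustedCheckOptionPrompt };
    CFTypeRef   values[] = { kCFBooleanTrue };
    CFDictionaryRef opts = CFDictionaryCreate(
        kCFAllocatorDefault, (const void **)keys, (const void **)values, 1,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    bool trusted = AXIsProcessTrustedWithOptions(opts);
    CFRelease(opts);
    if (!trusted)
        std::fprintf(stderr, "Not trusted yet; grant access and relaunch.\n");

    CGEventMask mask = CGEventMaskBit(kCGEventMouseMoved) |
                       CGEventMaskBit(kCGEventLeftMouseDragged);
    // kCGEventTapOptionDefault (an active filter) is required so that the
    // modified location actually takes effect downstream.
    CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,
                                         kCGHeadInsertEventTap,
                                         kCGEventTapOptionDefault,
                                         mask, OffsetCallback, NULL);
    if (!tap) return 1;

    CFRunLoopSourceRef src =
        CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), src, kCFRunLoopCommonModes);
    CGEventTapEnable(tap, true);
    CFRunLoopRun();
    return 0;
}
```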

Detecting full-screen applications on Mac

I am developing a simple application in Cocoa, and I want to detect whether any application is running in full screen mode. Is this possible?
Through the runningApplications API, I can get various pieces of information, but there is no specific property related to full-screen mode. Does anyone know how to detect it? Is there any Carbon event or API for this?
I ran into this in the spring and spent forever trying to get it to work. I ended up packaging my code up into a little GitHub project, but I completely forgot to share it here.
https://github.com/shinypb/FullScreenDetector
Hope this is useful for someone.
Anyway, after trying out many options and digging into NSWorkspace, I have found a way to achieve this: there is a notification,
"NSWorkspaceActiveSpaceDidChangeNotification"
The Apple docs say it is "Posted when a Spaces change has occurred", so we can register for it. Along with this, we need to use NSWindow's "isOnActiveSpace" property; together, these let us detect when an application enters full-screen mode and exits from it.
You want to key-value observe -[NSApplication currentSystemPresentationOptions]. When the active app is in full-screen mode, that property will include NSApplicationPresentationFullScreen.
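If you also need to check windows belonging to other applications, one rough alternative heuristic (not the KVO approach above, and only an approximation) is to walk the on-screen window list with the CGWindow C API and look for ordinary, layer-0 windows whose bounds exactly cover a display:

```cpp
// Rough heuristic: report on-screen windows that exactly cover the main display.
// Compile with: clang++ fullscreen.cpp -framework ApplicationServices -o fullscreen
#include <ApplicationServices/ApplicationServices.h>
#include <cstdio>

int main() {
    CGRect screen = CGDisplayBounds(CGMainDisplayID());
    CFArrayRef windows = CGWindowListCopyWindowInfo(
        kCGWindowListOptionOnScreenOnly | kCGWindowListExcludeDesktopElements,
        kCGNullWindowID);

    for (CFIndex i = 0; i < CFArrayGetCount(windows); ++i) {
        CFDictionaryRef info =
            (CFDictionaryRef)CFArrayGetValueAtIndex(windows, i);

        // Ordinary application windows live on layer 0.
        int layer = 0;
        CFNumberGetValue(
            (CFNumberRef)CFDictionaryGetValue(info, kCGWindowLayer),
            kCFNumberIntType, &layer);
        if (layer != 0) continue;

        CGRect bounds;
        CGRectMakeWithDictionaryRepresentation(
            (CFDictionaryRef)CFDictionaryGetValue(info, kCGWindowBounds),
            &bounds);

        if (CGRectEqualToRect(bounds, screen)) {
            CFStringRef owner =
                (CFStringRef)CFDictionaryGetValue(info, kCGWindowOwnerName);
            char name[256] = "?";
            if (owner) CFStringGetCString(owner, name, sizeof(name),
                                          kCFStringEncodingUTF8);
            std::printf("Possible full-screen window owned by: %s\n", name);
        }
    }
    CFRelease(windows);
    return 0;
}
```

This misses cases such as full-screen windows on secondary displays unless you also iterate over all active displays, so treat it as a starting point rather than a definitive check.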

Control to record user hotkeys in an OSX application

I'm dealing with hotkey registration, and I can't find a way to let the user register their own hotkey.
I tried ShortcutRecorder, but it seems impossible to make it work; for me it crashes with some error messages related to CG.
I'd like to know if there is a way to make ShortcutRecorder work in an ARC environment with Xcode 4.0, or which other control you use to capture user hotkeys.
You should check out this fork of ShortcutRecorder:
https://github.com/youknowone/shortcutrecorder
The original appears to have not been updated in some time but I was able to get this fork running on 10.7 without any changes.

Switching flight mode programmatically

Is there any way to switch flight mode on/off programmatically in Windows Phone 7.5? What I want to do is create a background task which will check the time and switch flight mode on/off.
Thanks in advance.
No, this functionality is not available.
It was a design principle behind the platform that applications should not be able to do things without the user knowing it.
If such functionality were available, then it would be possible for an app (either deliberately or accidentally, through a bug) to get the device's state into a setting other than what the user may expect. In such a scenario users will typically blame the phone/platform for what has happened, not a misbehaving application.
Though you cannot do it programmatically (as others have mentioned), you can send the user directly to the proper page in the settings panel and allow them to do it. Here's an example of using the ConnectionSettingsTask:
http://msdn.microsoft.com/en-us/library/hh394011(v=VS.92).aspx
You would want to set the ConnectionSettingsType property to 'AirplaneMode':
http://msdn.microsoft.com/en-us/library/microsoft.phone.tasks.connectionsettingstask.connectionsettingstype(v=VS.92).aspx

How to get a list of all open NSWindows from all running applications?

Is there a way to get a list of open or visible NSWindows from the Mac desktop?
Note that not all windows are necessarily NSWindows, and that NSWindow only provides an interface to windows in your own address space.
The supported way to access every window is the CGWindow API. Take a look at the Son of Grab sample code to see how it's done.
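A minimal C++ sketch of that approach, along the lines of (but much simpler than) what Son of Grab does, could look like this; it prints the window number and owning application for every on-screen window, which is about as much as you can get without the other app's cooperation:

```cpp
// Enumerate every on-screen window system-wide with the CGWindow API.
// Compile with: clang++ windowlist.cpp -framework ApplicationServices -o windowlist
#include <ApplicationServices/ApplicationServices.h>
#include <cstdio>

int main() {
    CFArrayRef windows = CGWindowListCopyWindowInfo(
        kCGWindowListOptionOnScreenOnly, kCGNullWindowID);

    for (CFIndex i = 0; i < CFArrayGetCount(windows); ++i) {
        CFDictionaryRef info =
            (CFDictionaryRef)CFArrayGetValueAtIndex(windows, i);

        // The system-wide CGWindowID of this window.
        int windowNumber = 0;
        CFNumberGetValue(
            (CFNumberRef)CFDictionaryGetValue(info, kCGWindowNumber),
            kCFNumberIntType, &windowNumber);

        // The name of the application that owns the window.
        char owner[256] = "?";
        CFStringRef ownerName =
            (CFStringRef)CFDictionaryGetValue(info, kCGWindowOwnerName);
        if (ownerName)
            CFStringGetCString(ownerName, owner, sizeof(owner),
                               kCFStringEncodingUTF8);

        std::printf("window %d owned by %s\n", windowNumber, owner);
    }
    CFRelease(windows);
    return 0;
}
```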
You can use the accessibility API (accessibility must be enabled under System Preferences for it to work) to get information on windows (and other UI elements) from other processes. This question might be just what you're looking for.
ALL running applications? No. You can only get the NSWindows of your own app. You may be able to use Universal Access or Core Graphics APIs to get some information about windows of other apps, but not full access.
