Programming for the Apple infrared remote control - Cocoa

How do I get started with programming for the Apple infrared remote control?
To start with, I only intend to support one control and one type of receiver: the one on the current unibody MacBooks.
What I mean by programming is: how do I get started writing an OS X app, preferably in Cocoa if there are APIs, which intercepts commands from the remote and then sends commands to the OS?
As a start, for example, I'd like to be able to simply pick up a button press from the remote control and then emit a keyboard command to the OS.
Say I've got this listener app running: if you press the menu button while you're in TextEdit, it prints the letter "a", for example.

Some searching around has revealed:
https://github.com/martinkahr/apple_remote_control
http://www.iospirit.com/developers/
Other resources:
http://www.cocoadev.com/index.pl?UsingTheAppleRemoteControl
a video presentation and the slides
If I find more I'll edit my answer.
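The frameworks linked above are meant to handle the listening half, giving you a callback per remote button. For the other half, emitting a keystroke to whatever app is frontmost, a minimal sketch using the Quartz Event Services C API (callable from any Cocoa app) might look like the following; the helper name is mine, and it assumes a US keyboard layout, where virtual keycode 0 is "a":

#include <ApplicationServices/ApplicationServices.h>

// Post a synthetic 'a' keystroke to the frontmost application (e.g. TextEdit).
// Call this from whichever remote-button callback your chosen framework provides.
static void postLetterA(void)
{
    CGEventRef keyDown = CGEventCreateKeyboardEvent(NULL, (CGKeyCode)0, true);  // keycode 0 == 'a' on US layouts
    CGEventRef keyUp   = CGEventCreateKeyboardEvent(NULL, (CGKeyCode)0, false);
    CGEventPost(kCGHIDEventTap, keyDown);
    CGEventPost(kCGHIDEventTap, keyUp);
    CFRelease(keyDown);
    CFRelease(keyUp);
}

Wiring the remote button to a specific keyboard command then reduces to mapping the framework's button identifier to a keycode and posting the corresponding events.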

Related

Responding to Exposé in macOS

Is it possible for an app to respond to Exposé on macOS such that it can choose what gets displayed?
I often use Exposé to switch windows. The problem is that if the windows look similar, it's harder to know at a glance which one I want.
Trying to switch to one of the four Visual Studio Code windows or one of the four terminal windows is not as easy as it could be if each window showed more info, for example the document name.
I can hover the mouse over each window, but that's not "at a glance", so what I'd like to do is, if and only if Exposé appears, choose what Exposé shows for my app.
One solution I thought of is to change the display when my app's window does not have focus. Unfortunately that's not a solution: often I have windows side by side so I can see both, and if I changed the display of the unfocused window, that would make side-by-side use unusable.
Does macOS provide an API for responding to Exposé? Note: I'm asking a programming question. If I wanted to add this feature to Visual Studio Code, iTerm2, or my own app, what is the macOS-level event I trigger on? I've never seen an app change its representation in Exposé, so I'm guessing there is no way to do it.

How does the on-screen (virtual) keyboard work in Win10?

I haven't found anything relevant on Google or on any Microsoft site about this, so I decided to ask a question here.
Everybody knows that Windows-based OSes have a virtual keyboard. I also know that *nix-based OSes have one too. So, the question is:
HOW DOES IT WORK INSIDE?
For example, say I have opened the on-screen keyboard in Windows 10. What's the actual difference between:
input via the hardware keyboard, when I'm using it and press the X key
..and input via the virtual keyboard, when I press the same key?
Imagine I have admin access to the terminal/computer: is there any way to track/distinguish that the second time, the key was pressed not on the hardware keyboard but on the on-screen (mouse-clicked) version of it?
There is also a lot of different software, like AutoIt (yes, it's a language, but it's relevant to this example), that emulates pressing the X key. How does it work on a Windows-based OS? Does it have anything in common with the default on-screen keyboard, using the same driver/WinAPI, or is there a difference between them?
And the second case, what is the difference between:
the default on-screen keyboard
a compiled AutoIt script
..and any other software that emulates pressing the X key?
I guess the only way to find out "how exactly the key was pressed" is to check the current process list via Task Manager and see whether anything has been launched. Or am I totally wrong here and missing something?
THE SCOPE
I have written a node.js script which emulates key-pressing behaviour in a Windows app.
TL;DR of the business logic: open notepad.exe and type `Hello world`.
Could someone give me advice on, or recommend, a PowerShell/batch script (or any other solution) demonstrating GetAsyncKeyState checking behaviour, with which I could easily check my own node.js script (not its functionality, but whether it triggers the press-X-key event)?
I found an answer for the node.js case here: Detecting Key Presses Across Applications in Powershell
SendInput is the preferred method to generate user input in software. The Windows on-screen keyboard probably uses it for everything except Ctrl+Alt+Delete, which I believe has some kind of special handling. The on-screen keyboard is only able to generate Ctrl+Alt+Delete in certain configurations.
Software-generated input is merged with normal hardware input in the RIT (Raw Input Thread) in the kernel.
A low-level keyboard hook can detect software-generated input.
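To see that last point in practice, here is a minimal sketch (mine, not the on-screen keyboard's code) of a WH_KEYBOARD_LL hook that reports whether each key-down was injected in software (SendInput, keybd_event, AutoIt, etc.) or came from hardware, using the LLKHF_INJECTED flag. Build it as a console program; note that input synthesized below the Win32 input layer, e.g. by a driver, will look like hardware and not carry the flag.

#include <windows.h>
#include <stdio.h>

// Low-level keyboard hook: distinguishes injected (software-generated)
// keystrokes from hardware ones via the LLKHF_INJECTED flag.
static LRESULT CALLBACK KeyboardProc(int code, WPARAM wParam, LPARAM lParam)
{
    if (code == HC_ACTION && (wParam == WM_KEYDOWN || wParam == WM_SYSKEYDOWN)) {
        const KBDLLHOOKSTRUCT *kb = (const KBDLLHOOKSTRUCT *)lParam;
        printf("vk=0x%02lX  %s\n", kb->vkCode,
               (kb->flags & LLKHF_INJECTED) ? "injected (SendInput etc.)" : "hardware");
    }
    return CallNextHookEx(NULL, code, wParam, lParam);
}

int main(void)
{
    // A low-level hook needs a message loop on the thread that installs it.
    HHOOK hook = SetWindowsHookExW(WH_KEYBOARD_LL, KeyboardProc, GetModuleHandleW(NULL), 0);
    MSG msg;
    while (GetMessageW(&msg, NULL, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessageW(&msg);
    }
    UnhookWindowsHookEx(hook);
    return 0;
}

Running this while your node.js script types into Notepad should show the synthesized keystrokes flagged as injected, while typing on the physical keyboard shows them as hardware.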

Mac OS X 10.10 Find window by title, find button by label and press it

I use Mac OS X 10.10 and I would like to write a program that continuously looks for a window by analyzing the names of all open windows. When the window appears, I would like the program to look for a button with a specific label and, once it is found, send it a "pressed message".
I would be able to do this under Windows, but I am not as familiar with the Mac.
I have found a question related to mine (How do I get a list of the window titles on the Mac OSX?), but I think the most difficult part is finding the button and sending it a "pressed message".
Thank you in advance!
What you are looking for is the Accessibility APIs. These are mostly Core Foundation-style C APIs, typically prefixed with AX.
You might also want to consider additional identifiers beyond the window title, as window titles are not necessarily unique.
Using the AX APIs is not easy, and they are extremely verbose. You can use them to explore the UI, find things, and interact with them, but you may have more limited success observing user interaction. That might require a more fragile combination with event monitoring via an NSEvent global monitor or a CGEventTap, depending on the UI widgets involved.
Also note that using the AX APIs to control anything outside your app is not sandbox-capable.
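Not a complete answer, but a compressed sketch of the shape this takes with those AX C APIs. It assumes you already have the target app's pid, that your process has been granted Accessibility access (see AXIsProcessTrusted()), and the helper names are mine; error handling is omitted.

#include <ApplicationServices/ApplicationServices.h>
#include <stdbool.h>

// Depth-first search of an accessibility element tree for a button whose
// title matches `label`; press it when found. Returns true on success.
static bool pressButtonIn(AXUIElementRef element, CFStringRef label)
{
    CFStringRef role = NULL, title = NULL;
    AXUIElementCopyAttributeValue(element, kAXRoleAttribute, (CFTypeRef *)&role);
    AXUIElementCopyAttributeValue(element, kAXTitleAttribute, (CFTypeRef *)&title);

    bool pressed = false;
    if (role && title && CFEqual(role, kAXButtonRole) &&
        CFStringCompare(title, label, 0) == kCFCompareEqualTo) {
        pressed = (AXUIElementPerformAction(element, kAXPressAction) == kAXErrorSuccess);
    } else {
        CFArrayRef children = NULL;
        AXUIElementCopyAttributeValue(element, kAXChildrenAttribute, (CFTypeRef *)&children);
        for (CFIndex i = 0; !pressed && children && i < CFArrayGetCount(children); i++) {
            pressed = pressButtonIn((AXUIElementRef)CFArrayGetValueAtIndex(children, i), label);
        }
        if (children) CFRelease(children);
    }
    if (role)  CFRelease(role);
    if (title) CFRelease(title);
    return pressed;
}

// Walk the windows of the application with the given pid, match the window
// title, and try to press the named button somewhere inside that window.
static bool pressButton(pid_t pid, CFStringRef windowTitle, CFStringRef buttonLabel)
{
    AXUIElementRef app = AXUIElementCreateApplication(pid);
    CFArrayRef windows = NULL;
    bool pressed = false;
    AXUIElementCopyAttributeValue(app, kAXWindowsAttribute, (CFTypeRef *)&windows);
    for (CFIndex i = 0; !pressed && windows && i < CFArrayGetCount(windows); i++) {
        AXUIElementRef window = (AXUIElementRef)CFArrayGetValueAtIndex(windows, i);
        CFStringRef title = NULL;
        AXUIElementCopyAttributeValue(window, kAXTitleAttribute, (CFTypeRef *)&title);
        if (title && CFStringCompare(title, windowTitle, 0) == kCFCompareEqualTo) {
            pressed = pressButtonIn(window, buttonLabel);
        }
        if (title) CFRelease(title);
    }
    if (windows) CFRelease(windows);
    CFRelease(app);
    return pressed;
}

The "continuously look for the window" part would be a timer or polling loop around pressButton(); matching by label alone is fragile, which is why additional identifiers (role, position in the hierarchy) are worth considering.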

How can one view the AppleScript code that executes on a particular application?

There is one application that controls Microsoft Word 2011 for Mac using AppleScript.
It does really nice things that I want to implement in my own app.
So, is it possible to intercept AppleScript calls to a particular application and reconstruct the source code of the AppleScript that made those calls?
It is impossible to view the source code of an AppleScript that executes against a particular app.
But debugging Apple events can shed some light on what is going on.
So I just opened Terminal.app and executed a command:
env AEDebugReceives=1 /Applications/Microsoft\ Office\ 2011/Microsoft\ Word.app/Contents/MacOS/Microsoft\ Word
That will force Microsoft Word (in fact, almost any application) to print all received Apple events to the terminal.

OS X design decisions. Terminate the app on last window close?

Unlike on Windows, GNOME, and most other GUIs, OS X applications do not all terminate when the main window (or all of the application's windows) is closed.
For example, fire up Firefox, Safari, Word, or most document-based apps. Either click the red dot in the corner or press Cmd-W to close the window. You can see that the menu of that program is still active and the program is still running. With OS X newbies, you will sometimes see dozens of these windowless zombies running, and they wonder why their computer is getting slower.
With some document-based programs, there is some sense in not terminating the application when it has no windows. For example, with Safari or Word, you can still press Cmd-N and get a new document window for whatever that application was designed to do: browse the web (Safari) or write a new document (Word).
Apple is inconsistent in its design philosophy on this. Some of its apps close when the last window is closed and some do not. Third-party apps are even more mixed.
There are other apps that do close when their red close button is clicked. System Preferences, Dictionary, the Mac App Store, iPhoto and Calculator do terminate when the sole or last window is closed. iCal, Address Book, iTunes, DVD Player do not terminate.
What I find particularly annoying are the applications that have no logical "New Document" or "Open" function yet do not terminate when the document window is closed. Example: fire up iTunes or Address Book and close the main window. There sits a zombie with no window and no function other than manually selecting "Quit".
It is easy to close the application after the last window closes. Cocoa even gives you notification of that event. Just add this to your application delegate:
- (BOOL)applicationShouldTerminateAfterLastWindowClosed:(NSApplication *)sender
{
    return YES;
}
My question is this: Is there any reason I should NOT terminate my application after the last window closes? Why is this so variable in OS X software? Unless the app has a "new" or "open" or some other clearly understood reason not to terminate with no window open, the failure to terminate seems like a bug to me.
Per Apple's Human Interface Guidelines (a guide for Mac developers):
In most cases, applications that are not document-based should quit when the main window is closed. For example, System Preferences quits if the user closes the window. If an application continues to perform some function when the main window is closed, however, it may be appropriate to leave it running when the main window is closed. For example, iTunes continues to play when the user closes the main window.
In general, never close a document-based application when the last window closes. The user will expect to be able to open a new document without relaunching the application, and it will confuse them if they can't.
For non-document based applications, you need to consider a few things:
How long does it take for my application to open? If it takes more than a second, you should probably not quit.
Does my application need a window to be useful? If your application can do work without windows, you should not quit.
iTunes doesn't quit because, as Anne mentioned, you don't need a window to play music (question 2). It is also not based on Cocoa, so it is much more difficult to close after the last window, especially since it allows you to open windows for specific playlists so there are an indefinite number of possible windows to be open.
In my opinion, Address Book does not need to stay open. This could be a left-over design decision from older versions of OS X, or it could be that someone at Apple just thought it was better to leave it open (maybe so you can add a contact?). Both iTunes and Address Book provide access to their main interfaces through the Window menu, as well as a keyboard shortcut (Option+Command+1 for iTunes, Command+0 for Address Book).
The main iTunes window can be reopened from the 'Window' menu. Mail.app has similar behavior. I can't think of any applications that close when the last window is closed, and as such I don't think there's a good reason that your app should behave that way (in fact, I'm surprised it's not in Apple's user experience guidelines, as it would really bother me!).
One reason why you'd want to close, say, the iTunes main window but keep the app open is to be able to use the app as a sort of server for third-party scripts/applications. I rarely use the main iTunes interface, but instead control my music with a third-party app. I also write AppleScripts for other apps that I launch instead of interacting with that app's interface.
