Is X11's design tightly coupled to a program's usage capabilities?

Say I want to create my own button with an Arduino. Would I be able to generate an X11 event from that button every time it was clicked, so that my running program would see it as a mouse click? Does the window have to be in focus for this to happen? Is the mechanism behind graphical user interfaces (X11) tightly dependent on the conventional windowing system, mouse, keyboard, and screen?
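For what it's worth, on the X11 side this kind of injection is usually done through the XTest extension rather than by hand-crafting protocol events. A minimal sketch, assuming XTest is available; the serial-port code that would actually read the Arduino button is left out:

```c
/* Build with: gcc fake_click.c -o fake_click -lX11 -lXtst */
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    /* In the real program you would block here until the Arduino
       reports a press over serial/USB; that part is omitted. */

    /* Inject a left-button press and release at the server level.
       The server delivers these like real input, so they reach
       whatever window would normally receive a click at the current
       pointer position. */
    XTestFakeButtonEvent(dpy, 1, True,  CurrentTime);
    XTestFakeButtonEvent(dpy, 1, False, CurrentTime);

    XFlush(dpy);
    XCloseDisplay(dpy);
    return 0;
}
```

Because XTest injects at the server level, the events are delivered exactly like physical input, so focus behaves the same way it would for a real mouse. (XSendEvent can target a specific window regardless of focus, but many toolkits ignore events that carry the send_event flag, which is why XTest is usually preferred.)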

Related

Running a Windows process while controlling the mouse and keyboard

I have a Windows program which has a GUI that runs on a PC.
In order to automate some of the GUI actions, I want to be able to move the mouse and type using the keyboard, but without interfering with the user's activity.
I know that I could simulate input events using SendMessage and PostMessage, but that requires the window to be in focus, and I want to eliminate this requirement.
My question is - is it possible to implement a sort of 'wrapper' that internally runs the original program while patching its mouse and keyboard input, providing it with a 'virtual' mouse and keyboard?
I think of it as taking only the mouse and keyboard capabilities of a VM. Does something of that kind exist?
Thanks!
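For reference, the PostMessage route mentioned above looks roughly like this; the window title is only a placeholder, and programs that read the physical input state directly will not be fooled by it:

```c
#include <windows.h>

int main(void)
{
    /* "Untitled - Notepad" is only a placeholder; substitute the
       title (or class) of the actual target window. */
    HWND hwnd = FindWindowW(NULL, L"Untitled - Notepad");
    if (!hwnd)
        return 1;

    /* Post a left click at client coordinates (50, 50) straight into
       the window's message queue. This does not move the real cursor
       or change focus, but it also bypasses the input system, so
       applications that query hardware state (GetAsyncKeyState,
       GetCursorPos, ...) will not see it. */
    LPARAM pos = MAKELPARAM(50, 50);
    PostMessageW(hwnd, WM_LBUTTONDOWN, MK_LBUTTON, pos);
    PostMessageW(hwnd, WM_LBUTTONUP,   0,          pos);
    return 0;
}
```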

AS3 Fullscreen AIR Application loses focus

I'm currently building a kiosk program with AS3 + airkinect, so the whole interface is driven by the Kinect plus one external button that acts as keyboard input. Somehow the .exe program I've built loses window focus after about an hour, which means the keyboard listener stops listening. I can detect an Event.DEACTIVATE; how can I make sure the keyboard listener always works?

Simulating Mac Mouse Events beyond CGEventCreateMouseEvent

I've been successfully using CG mouse events to simulate mouse down/drag/up events using a specialized hardware controller. However, I've come across some applications in which these CG mouse events have no effect: I can click and drag the actual mouse to change controls within a certain area of the application, but simulating the exact same movements using CGEventCreateMouseEvent (tried posting to the HID system state and the combined session state) does not work.
Perhaps these apps are listening specifically for a mouse/touchpad hardware device? Is there any way to more "realistically" simulate mouse events so that these apps think the actual mouse/touchpad is dragging?
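For context, the basic Quartz injection being described looks roughly like this (the coordinates are placeholders). When an application ignores such events, it is often because it reads hardware state directly instead of the event stream:

```c
/* Build with: clang click.c -o click -framework ApplicationServices */
#include <ApplicationServices/ApplicationServices.h>

int main(void)
{
    CGPoint where = CGPointMake(200, 300);  /* placeholder coordinates */

    CGEventRef down = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDown,
                                              where, kCGMouseButtonLeft);
    CGEventRef up   = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseUp,
                                              where, kCGMouseButtonLeft);

    /* kCGHIDEventTap posts as close to the hardware as Quartz allows;
       kCGSessionEventTap (the "combined session state") posts one
       level up. A drag would use kCGEventLeftMouseDragged between
       the down and up events. */
    CGEventPost(kCGHIDEventTap, down);
    CGEventPost(kCGHIDEventTap, up);

    CFRelease(down);
    CFRelease(up);
    return 0;
}
```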

VC++ mouse events

I want to write a console program that handles mouse events (only mouse scroll). How do I do it in VC++? The application will listen only for scroll events.
Description: if the user scrolls down, the desktop window fades out, and it fades back in when the user scrolls up.
Here I just need to know how to listen for mouse events in a console app.
Note: I am developing against the Win32 API, using VS2010 as the development environment.
I've never actually done this myself. It seems that a console application responding to mouse events almost belies its nature and intended purpose. Generally, you would only need to respond to keyboard input from a console app and leave the mouse stuff to a GUI app.
That being said, this tutorial indicates that it is in fact possible to capture these mouse events from a Win32 console application. Generally, the suggestion is to use the ReadConsoleInput function and extract the information of interest from the INPUT_RECORD structure that it fills. The only tricky thing is that ReadConsoleInput is a blocking call, meaning it will not return until an input event fires. You'll need to structure your application's code accordingly. Mouse events are covered in detail about 3/4 of the way down the page.
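A minimal sketch of that approach; note that mouse reporting only arrives once quick-edit mode is switched off via SetConsoleMode:

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE hIn = GetStdHandle(STD_INPUT_HANDLE);

    /* Enable mouse reporting. ENABLE_EXTENDED_FLAGS without
       ENABLE_QUICK_EDIT_MODE turns quick-edit off, which would
       otherwise swallow mouse events. */
    SetConsoleMode(hIn, ENABLE_EXTENDED_FLAGS | ENABLE_MOUSE_INPUT);

    INPUT_RECORD rec;
    DWORD count;

    /* ReadConsoleInput blocks until an event arrives. */
    while (ReadConsoleInput(hIn, &rec, 1, &count)) {
        if (rec.EventType == MOUSE_EVENT &&
            (rec.Event.MouseEvent.dwEventFlags & MOUSE_WHEELED)) {
            /* The high word of dwButtonState holds the wheel delta:
               positive means the wheel moved up, negative means down. */
            short delta = (short)HIWORD(rec.Event.MouseEvent.dwButtonState);
            printf(delta > 0 ? "scrolled up\n" : "scrolled down\n");
        }
    }
    return 0;
}
```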

Cocoa accessibility API, can I click a window in the background without activating it?

I've been searching forever for a solution to this, so I thought I'd seek out the brainpower of greater minds than mine. I'm developing a Cocoa app that uses the Accessibility API to manipulate another program (it's a hotkey app). The app I'm controlling typically has multiple windows open, with some hidden behind others. What I would like to do, if it's possible, is to send mouse events to windows using the Accessibility API in a way that presses a button in the window without bringing it to the foreground (interact with the window but don't activate it). The reason I'm trying to do this is that sending the mouse event to this other window will force it to the foreground and disrupt the user's interaction with the foremost window.
This is possible on Windows - apparently, because apps similar to mine do it there - but I'm getting the feeling that this isn't possible with Cocoa, given the way the window manager works. Am I mistaken?
Accessibility is higher-level than that. You send, for example, AXPress actions to AXButton objects, but “press” is not necessarily a click—pressing the space bar while a view is focused, for example, is also a “press”. AXPress is a high-level action that means “do your thing”, which obviously has meaning for some views (such as buttons) and not others (such as fields).
Accessibility activating the application does make sense when you look at it from its intended purpose: Assistive devices for disabled users. If the user “presses” something by whatever means, they probably intend to activate the application and work in it.
Quartz Event Services will get you almost there: You can create an event tap for the process you want to control, and you can forge events and send them to a tap. The catch is that you can only send events to a tap when the tap fires—i.e., when the application already has an event to deal with. When it doesn't, you're stuck.
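A rough sketch of that arrangement, using the per-process CGEventTapCreateForPSN; the PSN lookup and the click coordinates are placeholders:

```c
/* Build with: clang tap.c -o tap -framework ApplicationServices */
#include <ApplicationServices/ApplicationServices.h>
#include <stdbool.h>

/* Called whenever the target process handles an event matching the
   tap's mask. Only while this callback is running can forged events
   be injected with CGEventTapPostEvent, which is exactly the
   limitation described above. */
static CGEventRef tap_callback(CGEventTapProxy proxy, CGEventType type,
                               CGEventRef event, void *refcon)
{
    CGPoint where = CGPointMake(100, 100);  /* placeholder button location */

    CGEventRef down = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDown,
                                              where, kCGMouseButtonLeft);
    CGEventRef up   = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseUp,
                                              where, kCGMouseButtonLeft);
    CGEventTapPostEvent(proxy, down);
    CGEventTapPostEvent(proxy, up);
    CFRelease(down);
    CFRelease(up);

    return event;  /* pass the original event through unmodified */
}

int main(void)
{
    /* Placeholder; obtain the target's PSN, e.g. via GetProcessForPID. */
    ProcessSerialNumber psn = { 0, kNoProcess };

    CFMachPortRef tap = CGEventTapCreateForPSN(&psn, kCGTailAppendEventTap,
                                               kCGEventTapOptionDefault,
                                               kCGEventMaskForAllEvents,
                                               tap_callback, NULL);
    if (!tap)
        return 1;

    CFRunLoopSourceRef src = CFMachPortCreateRunLoopSource(NULL, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), src, kCFRunLoopCommonModes);
    CGEventTapEnable(tap, true);
    CFRunLoopRun();
    return 0;
}
```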
