I'm brand new to AppleScript and trying to write a simple script that, while running, logs a timestamp any time the mouse is clicked anywhere.
How can I detect/listen for mouse events, and execute a command (logging a date-time) when they happen?
(This seems to me like a pretty basic thing to try to do, but I may be misunderstanding what AppleScript is for.)
Mouse-clicks are events handled and distributed by the system. In order to capture them you'd need to set up an event monitor, but there are no hooks for that in vanilla AppleScript; the language isn't designed for it.
If you wanted to switch to AppleScriptObjC and create an application, you could certainly set up an event monitor of this sort (using something like NSEvent's addGlobalMonitorForEventsMatchingMask:handler:). But I doubt AppleScript is the best language for it. At least, I can't think of a compelling reason to do this in AppleScript rather than writing a small Cocoa app, and I anticipate a number of headaches trying to make the callback work correctly.
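For what it's worth, here's a rough C sketch of what the native-code version could look like. It uses Quartz Event Services (CGEventTapCreate), the C-level counterpart of the NSEvent monitor mentioned above, rather than AppleScriptObjC; the compile line and the need to grant the process assistive/Accessibility access are assumptions about your setup.

    /*
     * Minimal sketch of a global click logger using Quartz Event Services.
     * Compile with: cc clicklog.c -framework ApplicationServices -o clicklog
     * The process may need Accessibility (assistive) access to receive events.
     */
    #include <ApplicationServices/ApplicationServices.h>
    #include <stdio.h>
    #include <time.h>

    static CGEventRef logClick(CGEventTapProxy proxy, CGEventType type,
                               CGEventRef event, void *refcon)
    {
        time_t now = time(NULL);
        printf("click at %s", ctime(&now));   /* ctime() appends a newline */
        return event;                          /* pass the event through untouched */
    }

    int main(void)
    {
        /* Listen-only tap for left-mouse-down events, session-wide. */
        CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,
                                             kCGHeadInsertEventTap,
                                             kCGEventTapOptionListenOnly,
                                             CGEventMaskBit(kCGEventLeftMouseDown),
                                             logClick, NULL);
        if (!tap) return 1;

        CFRunLoopSourceRef src = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), src, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
        CFRunLoopRun();                        /* blocks; Ctrl-C to stop */
        return 0;
    }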
I am not sure how to ask the question, so here is a picture of an idea that came to mind.
So for example, when you run my "custom launcher", it displays a window with a couple of buttons on the side which you can assign values to. When you click a button, the corresponding program runs in the big panel on the right (in windowed mode).
This is all from the user's perspective, of course. They will just see that the program they want to run appears in that panel. The actual implementation may have nothing to do with "one program running inside another program".
My own use case is limited to Windows desktop platforms only, but if it is possible to generalize it, that would be nice as well.
Is this actually possible? Can I write a program that will run another program inside a panel? The program that's launched may be someone else's, such as MS Paint or Calculator.
Just to expand on my comment above, here is an approach that may work for you: Fake it :)
When you launch the program, intercept all window messages to it that control its position on screen. That way it 'appears' to be fixed in place, but in reality it's still attached to the normal Windows desktop.
Here's some light reading for you:
Windows Event Hooks
A hook is a mechanism by which an application can intercept events, such as messages, mouse actions, and keystrokes. A function that intercepts a particular type of event is known as a hook procedure. A hook procedure can act on each event it receives, and then modify or discard the event.
I would recommend against this in a commercial application, because you would be modifying the behavior of software you don't own; that software may make assumptions about what its parent window is. For experimentation, though, there's the SetParent Win32 function.
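A rough C sketch of the SetParent idea might look like the following; the window title and the panel handle are hypothetical, and plenty of real applications misbehave when re-parented like this.

    /* Sketch: re-parent another application's top-level window into our panel.
     * "panel" is a hypothetical HWND of a child control in our own window;
     * the window title is whatever the launched program actually uses. */
    #include <windows.h>

    void adoptWindow(HWND panel)
    {
        HWND child = FindWindowW(NULL, L"Untitled - Paint");   /* hypothetical title */
        if (!child) return;

        SetParent(child, panel);                               /* attach to our panel */

        /* Strip the caption/borders so it looks embedded, then fill the panel. */
        LONG_PTR style = GetWindowLongPtrW(child, GWL_STYLE);
        style = (style & ~(WS_CAPTION | WS_THICKFRAME)) | WS_CHILD;
        SetWindowLongPtrW(child, GWL_STYLE, style);

        RECT rc;
        GetClientRect(panel, &rc);
        MoveWindow(child, 0, 0, rc.right, rc.bottom, TRUE);
    }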
There is one application that controls Microsoft Word 2011 for Mac using AppleScript.
It does really nice things that I want to implement in my own app.
So, is it possible to intercept AppleScript calls to a particular application and reconstruct the AppleScript source code that made those calls?
It is impossible to recover the source code of the AppleScript that is driving a particular app.
But debugging the Apple events it receives can shed light on what is going on.
So I just opened Terminal.app and executed a command:
env AEDebugReceives=1 /Applications/Microsoft\ Office\ 2011/Microsoft\ Word.app/Contents/MacOS/Microsoft\ Word
That will force Microsoft Word (in fact, almost any application) to print all received Apple events in the Terminal.
I want to write a script that takes action when a document is opened in a certain application, or before an application quits, etc.
Is there a way to attach a script to an event in an application? Does AppleScript support any form of hooks at all?
If not, can I hack my way into getting what I want?
AppleScript only has certain "event listeners". There are Folder Action scripts, which might be considered event listeners, and InDesign has real event listeners, which I won't get into at the moment.
If you want a blanket listener for any application quitting, you may find what you're looking for in QuicKeys, though I'm not certain of this, as it has been a long time since I messed around with QuicKeys.
But all in all, the answer is, for the most part, no.
hth
Mike
EDIT: more tools that may help, brought up by kch:
FastScripts
QuickSilver
Keyboard Maestro
"Some apps, eg. iChat, have script hooks in the preferences. In iChat, the Alerts preference pane, you can set it to run a script when a certain event is triggered, like message received, file transfer request, etc." – kch
I don't really know where to begin. Let's start with the stupid questions:
What language should I use for this? What is suited for the task at hand?
Next, the real ones:
Is there a way to stop the screensaver from starting, short of changing the cursor position? If not, will changing the cursor position even work?
SetThreadExecutionState will prevent the screensaver from coming on or the machine from automatically going to sleep if you pass the ES_CONTINUOUS and ES_DISPLAY_REQUIRED flags.
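A minimal sketch of that call; clear the requirement with a plain ES_CONTINUOUS call when you're done:

    #include <windows.h>

    void keepDisplayAwake(BOOL enable)
    {
        /* ES_DISPLAY_REQUIRED keeps the display on (and the screensaver off)
         * until the requirement is cleared with a plain ES_CONTINUOUS call. */
        SetThreadExecutionState(enable ? (ES_CONTINUOUS | ES_DISPLAY_REQUIRED)
                                       : ES_CONTINUOUS);
    }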
I wrote an app a while ago that does exactly what you are asking for. It runs as an icon in the system tray, not the taskbar, and uses a global message hook to prevent the WM_SYSCOMMAND/SC_SCREENSAVE notification from reaching any application. If that notification does not reach the DefWindowProc() function, the screen saver will never run.
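The global version of that hook needs SetWindowsHookEx and a hook DLL, which is more than I can show here; but the core of the idea, within a single process, is just a window procedure that swallows the notification before DefWindowProc() can see it, roughly like this:

    #include <windows.h>

    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        /* The low four bits of wParam are used internally by the system,
         * so mask them off before comparing. */
        if (msg == WM_SYSCOMMAND && (wParam & 0xFFF0) == SC_SCREENSAVE)
            return 0;   /* swallow it: DefWindowProc never sees SC_SCREENSAVE */

        return DefWindowProc(hwnd, msg, wParam, lParam);
    }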
Your program does not need to be visible in the task bar at all.
You don't even need a program at all, if you can disable the screensaver in the registry.
What you want to do can perhaps be achieved by sending a MOUSE_MOVE event to the desktop window. If you want to use C# (the only language I am current with right now), you can look at this article, but maybe a simple C program using the WinAPI is better suited for this task.
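If you go the plain C/WinAPI route, note that posting WM_MOUSEMOVE to a window generally does not reset the system idle timer; the usual trick is to inject synthetic mouse input with SendInput. A sketch:

    #include <windows.h>

    /* Inject a synthetic mouse move with zero displacement: the pointer does
     * not visibly move, but the system sees fresh input. If a zero-delta move
     * is ignored on your system, move by one pixel and back instead. */
    void nudgeMouse(void)
    {
        INPUT in = {0};
        in.type = INPUT_MOUSE;
        in.mi.dwFlags = MOUSEEVENTF_MOVE;
        in.mi.dx = 0;
        in.mi.dy = 0;
        SendInput(1, &in, sizeof(in));
    }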
.NET will easily allow you to put an application in the system tray (check out the NotifyIcon class in System.Windows.Forms).
I believe you can use the SetCursorPos (http://msdn.microsoft.com/en-us/library/ms648394(VS.85).aspx) API call to prevent the screen saver; just make sure you set it to the current location so you don't actually move the mouse.
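Something along these lines, as a sketch of that suggestion:

    #include <windows.h>

    /* "Move" the cursor to where it already is; the pointer stays put but
     * the system sees fresh mouse input. Call this periodically (e.g. from
     * a timer) while you need the screen saver held off. */
    void touchCursor(void)
    {
        POINT p;
        if (GetCursorPos(&p))
            SetCursorPos(p.x, p.y);
    }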
Using the Apple OS X Cocoa framework, how can I post a sheet (slide-down modal dialog) on the window of another process?
Edit: Clarified a bit:
My application is a Finder extension for doing Subversion version control (http://scplugin.tigris.org/). Part of my application is a plug-in (a contextual menu item for Finder); the bulk of it, however, is in a separate daemon process. For several reasons, we've chosen to put virtually all the code into the daemon; the plug-in only defines the menu itself and sends Apple events over to the daemon.
Sometimes the daemon needs to prompt the user for further information. It can toss a window on-screen for this, but that's disruptive (randomly positioned), and it seems to me the workflow here is legitimately modal, for example: select a file, pick 'Commit' from the menu, provide commit comments, do the operation.
Interprocess cooperation (such as passing a reference of some kind) is acceptable: both processes are mine, but I want to avoid binding the sheet's code into the primary process.
Really, it sounds like you're trying to have your inter-process communication happen at the view level, which isn't really how Cocoa generally works. Things will be much easier if you separate your layers a bit more than that.
Why don't you want to put the sheet code into the other process? It's view code, and view code is inherently process-specific. The right thing to do here is probably to add somewhat generic modal-sheet support to your plugin code, and an IPC call that your daemon can make to summon that code. Trying to ship view objects over to the remote process is going to be nightmarish if you can make it work at all.
You're fighting the frameworks with this approach.
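If you do move the sheet code into the plugin, here's a sketch of the shape that "summon the sheet" IPC call could take. Distributed notifications are used purely as an example (the notification name is made up); Apple events, which you're already using between the two processes, would work just as well.

    #include <CoreFoundation/CoreFoundation.h>

    /* Daemon side: ask the in-Finder plugin to run its own sheet code.
     * "com.example.scplugin.ShowCommitSheet" is a hypothetical name. */
    static void requestCommitSheet(void)
    {
        CFNotificationCenterPostNotification(
            CFNotificationCenterGetDistributedCenter(),
            CFSTR("com.example.scplugin.ShowCommitSheet"),
            NULL, NULL, true);
    }

    /* Plugin side: register once at load time, then show the sheet locally. */
    static void onShowCommitSheet(CFNotificationCenterRef center, void *observer,
                                  CFNotificationName name, const void *object,
                                  CFDictionaryRef userInfo)
    {
        /* The sheet-presenting code lives here, next to the rest of the
         * plugin's view code. */
    }

    static void installObserver(void)
    {
        CFNotificationCenterAddObserver(
            CFNotificationCenterGetDistributedCenter(), NULL, onShowCommitSheet,
            CFSTR("com.example.scplugin.ShowCommitSheet"), NULL,
            CFNotificationSuspensionBehaviorDeliverImmediately);
    }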
You can't add a sheet to a window in another process, because you have at most only the most restricted access to the windows in the other process.
Please don't do this. Make the interaction nonmodal if at all possible. Especially in something like a commit, it's much nicer to be able to browse around your files while you're writing commit comments.
OS X does have window groups, but I don't think they can (easily) span applications.
Another thing to consider is that in OS X it's possible to have many Finder windows open on the same folder (unlike in OS 9). Even if you did have sufficient privileges/APIs to add a sheet to a Finder window, it's not like the modality of that window would prevent the user from being able to continue working with the files.
(My personal opinion as a long-time Mac user is that this kind of interaction would drive me right up the wall.)