Can AppleScript listen for events?

I want to write a script that takes action when a document is opened in a certain application, or before an application quits, etc.
Is there a way to attach a script to an event in an application? Does AppleScript support any form of hooks at all?
If not, can I hack my way into getting what I want?

AppleScript has only certain "event listeners". There are Folder Action scripts, which might be considered event listeners, and InDesign has real event listeners, which I won't get into at the moment.
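For example, here is a minimal Folder Action sketch (my own illustration, not from the answer; it assumes you attach it to a folder with Folder Actions Setup, and that you are on a system where display notification is available):

on adding folder items to thisFolder after receiving theItems
    -- runs whenever new items land in the folder this script is attached to
    repeat with anItem in theItems
        display notification (anItem as text) with title "New item added"
    end repeat
end adding folder items to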
If you want a blanket listener for any application quitting, you may find what you're looking for in QuicKeys, though I'm not certain of this, as it has been a long time since I messed around with QuicKeys.
But all in all, the answer is, for the most part, no.
HTH,
Mike
EDIT: more tools that may help, contributed by kch:
FastScripts
QuickSilver
Keyboard Maestro
"Some apps, eg. iChat, have script hooks in the preferences. In iChat, the Alerts preference pane, you can set it to run a script when a certain event is triggered, like message received, file transfer request, etc." – kch

Related

AppleScript — listen for mouse click?

I'm brand new to AppleScript and trying to write a simple script that, while running, logs a timestamp any time the mouse is clicked anywhere.
How can I detect/listen for mouse events, and execute a command (logging a date-time) when they happen?
(This seems to me like a pretty basic thing to try to do, but I may be misunderstanding what AppleScript is for.)
Mouse-clicks are events handled and distributed by the system. In order to capture them you'd need to set up an event monitor, but there are no hooks for that in vanilla AppleScript; the language isn't designed for it.
If you wanted to switch to AppleScriptObjC and create an application, you could certainly set up an event monitor of this sort (using something like NSEvent's addGlobalMonitorForEventsMatchingMask:handler:). But I doubt AppleScript is the best language for it; at least, I can't think of a compelling reason to do this in AppleScript as opposed to writing a small Cocoa app, and I anticipate a number of headaches trying to make the callback work correctly.

Redirected to AppleScript Editor on clicking Apple notification

I am using
osascript -e 'display notification "Lorem ipsum dolor sit amet" with title "Title"'
to display notifications on the Mac. However, on clicking the notification, I get redirected to the AppleScript Editor. Is it possible for me to redirect the user to a URL, or open up a directory, when the generated notification is clicked?
The run handler will only get called if the script is saved as an app, preferably a stay-open app. In any case, the app still has to be running when someone clicks the notification. You won't get this behavior from a simple osascript string.
You could get osascript to run a compiled script file (which can store properties persistently), but you will still need to distinguish between the run event that happens when you run the script, and the run event that gets called when someone clicks the notification.
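A minimal sketch of that distinction, assuming the script is saved as a stay-open application; the property flag is my own device, not part of the original suggestion, and since properties persist in saved applets, a real script may want to reset the flag on quit:

property notificationPosted : false

on run
    if notificationPosted then
        -- a second run event, i.e. the user clicked the notification
        open location "http://www.stackoverflow.com"
        set notificationPosted to false
    else
        set notificationPosted to true
        display notification "Lorem ipsum dolor sit amet" with title "Title"
    end if
end run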
I can suggest a couple of solutions here.
1. Use a Python library to fire off notifications and forget about AppleScript/OSA. You can find some information, and various solutions, at this Stack Overflow link: Python post osx notification
2. Set up a stay-open AppleScript app as a kind of "notification server" and send a message to it (possibly with osascript, unless you can send a raw Apple event to the "server" from Python) when you want some notification interaction; a sketch follows the links below. This is tricky, and seems overcomplex compared to my first suggestion. In particular, you may still need to mess about with the privacy settings (especially on Mavericks or later) to allow osascript access to System Events.
Here are a couple of links which may guide you with the latter approach, but I really think the first suggestion will get you further, with fewer tears:
http://jacobsalmela.com/bash-script-enable-access-assistive-devices-programmatically-os-x-mavericks-10-9-x-simulate-keystrokes/
http://support.apple.com/kb/HT6026?viewlocale=en_US&locale=en_US
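Here is a rough sketch of the "notification server" idea from the second suggestion. The applet name "NotifyServer" and the notify handler are hypothetical, and the script must be saved as a stay-open application:

-- save as a stay-open application named "NotifyServer" (hypothetical name)
on notify(theMessage)
    display notification theMessage with title "NotifyServer"
end notify

on idle
    return 300 -- keep the applet alive; wake it every five minutes
end idle

A client could then message it from the shell with something like: osascript -e 'tell application "NotifyServer" to notify("hello")'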
So yes, there is a way to do what you would like; there is a tutorial here. This is a simplified version that does what you want; however, you must save it as an application and drag a file onto it.
on open theItems
    display notification "Open stackoverflow ?" with title "Stackoverflow"
    delay 2
end open

on run
    tell application "Safari"
        tell window 1
            set current tab to (make new tab with properties {URL:"http://www.stackoverflow.com"})
        end tell
    end tell
end run

How to write a program that runs another GUI program inside it

I am not sure how to ask the question, so here is a picture of an idea that came to mind.
So for example, when you run my "custom launcher" it displays a window with a couple buttons on the side which you can assign values to. When you click on a button, the appropriate program will run in the big panel on the right (in window mode).
This is all from the user's perspective of course. They will just see that the program they want to run appears in that panel. The actual implementation may have nothing to do with "one program running inside another program"
My own use case is limited to Windows desktop platforms only, but if it is possible to generalize it, that would be nice as well.
Is this actually possible? Can I write such a program that will run another program inside a panel? The program that's launched may be someone else's, such as MS paint or calculator.
Just to expand on my comment above, here is an approach that may work for you: Fake it :)
When you launch the program, intercept all Windows messages to the program that control its position on screen. That way it 'appears' to be fixed in place, but in reality it's still attached to the normal Windows desktop.
Here's some light reading for you:
Windows Event Hooks
A hook is a mechanism by which an application can intercept events, such as messages, mouse actions, and keystrokes. A function that intercepts a particular type of event is known as a hook procedure. A hook procedure can act on each event it receives, and then modify or discard the event.
I would recommend against it in a commercial application, because you are modifying the behavior of software you don't own (that software may make assumptions about what its parent window is), but for experimentation there's the SetParent Win32 function.

Can I know what called applicationShouldHandleReopen?

I'm looking to differentiate a Dock click from a click on the app icon in the Finder.
Can I know what called applicationShouldHandleReopen, or is there another way to do it?
applicationShouldHandleReopen:hasVisibleWindows: is sent to the application's delegate. Delegate messages are normally sent by the delegating object, which in this case would be the application object.
The application object sends that message to its delegate in order to handle the reopen-application Apple Event. So, to find the sender, install your own Apple Event handler for that event and get the sender from the event. (The sample code is in Pascal and uses Apple Event Manager, but you can translate it to Objective-C and NSAppleEventDescriptor.)
That said, what you're doing is very dubious from a UI perspective. Reopening is meant to do the same thing no matter which application is reopening you—and it is not limited to the Finder or the Dock. In the common case, it is literally the user trying to launch the app while it is already open.
It may make more sense to perform your “reopen” behavior only when no windows are open. Cocoa's built-in support for document-based apps does this automatically; if you don't respond to applicationShouldHandleReopen:hasVisibleWindows: or you return YES, the application tries to open a new document. You can perform the same check (the message even tells you whether you have any windows open) and perform your desired behavior under the same condition.

Cocoa accessibility API, can I click a window in the background without activating it?

I've been searching forever for a solution to this, so I thought I'd seek out the brainpower of greater minds than mine. I'm developing a Cocoa app that uses the Accessibility API to manipulate another program (it's a hotkey app). The app I'm controlling typically has multiple windows open, with some hidden behind others. What I would like to do, if it's possible, is to send mouse events to windows using the Accessibility API in a way that presses a button in the window without bringing it to the foreground (interact with the window but don't activate it). The reason I'm trying to do this is that sending the mouse event to this other window will force it to the foreground and disrupt the user's interaction with the foremost window.
This is possible on Windows - apparently, because apps similar to mine do it there - but I'm getting the feeling that this isn't possible with Cocoa, given the way the window manager works. Am I mistaken?
Accessibility is higher-level than that. You send, for example, AXPress actions to AXButton objects, but “press” is not necessarily a click—pressing the space bar while a view is focused, for example, is also a “press”. AXPress is a high-level action that means “do your thing”, which obviously has meaning for some views (such as buttons) and not others (such as fields).
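For concreteness, this is what sending AXPress looks like from AppleScript GUI scripting via System Events (the target application and element indices are assumptions, and Accessibility access must be granted); note, per the next paragraph, that the target application may still end up activated:

tell application "System Events"
    tell process "TextEdit" -- hypothetical target application
        -- send the AXPress action to a button in one of the process's windows
        perform action "AXPress" of button 1 of window 1
    end tell
end tell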
Accessibility activating the application does make sense when you look at it from its intended purpose: Assistive devices for disabled users. If the user “presses” something by whatever means, they probably intend to activate the application and work in it.
Quartz Event Services will get you almost there: You can create an event tap for the process you want to control, and you can forge events and send them to a tap. The catch is that you can only send events to a tap when the tap fires—i.e., when the application already has an event to deal with. When it doesn't, you're stuck.
