Is it possible, in Cocoa, to determine whether an application was launched from AppleScript or just by clicking its icon?
I would be very surprised if this actually was possible.
AppleScript is just a scripting language, meaning that its capabilities are somewhat limited as far as applications go.
So I'd say you're probably out of luck (sorry :( ).
I'm a total rookie when it comes to Objective-C, so please bear with me...
I've been thinking of learning the basics and trying to create some software of my own. One thing that's been bothering me (and never seems to show up as an option in any updates) is the ability to require a double-click to start an app from the Dock. I always seem to manage to click in the wrong place when switching between apps...
Yes, I am very well aware of Cmd+Tab, thank you :) I really want this feature, and it shouldn't be too hard to set up as long as overriding the default behavior of the Dock is possible. Thoughts/suggestions? Perhaps just a Terminal command is enough...
I am looking for a lightweight solution that would allow me to detect which form/dialog is open in an application, then emit some keystrokes/mouse moves and clicks. I do not have control over (nor the source code for) the application.
I am familiar with MacroMaker, and testing products like SQA/Mercury offer similar functionality. The last time I had hands-on exposure in this area was late 2004, so I welcome any pointers to bring my knowledge up to date.
AutoIt is a scripting environment for Windows with a long history. It's quite easy to use and flexible enough to do things like detecting the open window or dialog, switching to another one, typing something, etc. I would definitely recommend it.
In case anybody is curious, in the end I decided to use Microsoft UI Automation. Here is a nice intro:
http://msdn.microsoft.com/en-us/magazine/cc163288.aspx
I often see all this crazy stuff with AppleScript, involving telling menus, menu items, UI elements, and all that crazy sort of stuff to do things. I don't mind that it's kind of a crazy way to get things done - as long as it works - but my question is this: how do you debug that stuff? I mean, how do you know what your options are?
I have AppleScript Editor and Script Debugger, but I'm not sure how to use them to see what the options are. I've tried searching with Google but I haven't come up with anything.
I do some web development, so I'm used to using Firebug to examine the DOM of a web page; I just assume there should be something similarly easy to help with AppleScript.
Thoughts?
I'm not sure if you're asking how to script the GUI or how to tell which GUI elements are available in an application. If the former, try starting with Graphical User Interface (GUI) Scripting.
If you want to find out what the GUI hierarchy is for an application, check out UI Browser, which will allow you to see the UI elements of an application and provide the information you need to target one of them.
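For example, here's a minimal GUI-scripting sketch (the process name "Safari" and the menu path are just placeholders; swap in whatever UI Browser shows for your target app, and note that GUI Scripting needs "Enable access for assistive devices" turned on in Universal Access):
tell application "System Events"
	tell process "Safari"
		-- see what is there: list the front window's UI elements
		get every UI element of front window
		-- once you know an element's name, you can "press" it, e.g. a menu item
		click menu item "New Window" of menu "File" of menu bar 1
	end tell
end tell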
I am writing a program to sit in the background on OS X 10.6, listen for keystrokes and record them, grouping them by window title. (No, I am not writing malicious software. I do not need this program to be sneaky in any way; I just want a safety net for when I have typed a huge email and then accidentally refresh the page (Cmd+R) instead of opening a new tab (Cmd+T).) I have already found Apple's EventMonitorTest example for the keystroke-capturing code, now I just need to find the "key window" title.
Does anyone know where I can find examples for this kind of functionality? Thank you!
A couple of possibilities:
You could use the Accessibility API (though of course keep in mind that 64-bit Carbon does not support this)
You could use the CGWindow functions introduced in Leopard
I suspect the first option will be easier to do this with, since the CGWindow API is somewhat low-level and treats all windows (application windows, menu bars, dock icons, etc.) more or less equally.
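If it helps to see the kind of information the Accessibility route exposes, here is a rough AppleScript/System Events sketch of grabbing the frontmost window's title (in your Cocoa program you would do the equivalent through the AXUIElement API, and assistive access must be enabled):
tell application "System Events"
	-- the application process that currently owns the key window
	set frontApp to first application process whose frontmost is true
	-- the title of its frontmost window
	set winTitle to name of window 1 of frontApp
end tell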
Everything I know about AppleScript I taught myself, and I was wondering if I missed any cool features. I know you can make the computer talk to and control applications, but is there anything else it can do, or is it time to move on to a new language?
The coolest thing about AppleScript I've recently discovered is that you can script almost anything on your Mac. So even applications that don't support AppleScript natively can be used in a workflow.
This is possible because you can just "press" buttons as if you were sitting at the computer.
tell application "GhostReader" to activate
tell application "System Events" to keystroke "n" using command down
I used this to copy and paste a website from Safari and have it read aloud by GhostReader, a proprietary text-to-speech tool.
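A rough sketch of how such a workflow can be glued together (the delays and the Cmd+N shortcut for GhostReader are assumptions; adapt them to the app you're driving):
-- copy the current Safari page's text to the clipboard
tell application "Safari" to activate
delay 1
tell application "System Events" to keystroke "a" using command down
tell application "System Events" to keystroke "c" using command down
-- hand it to GhostReader by "pressing" its shortcut
tell application "GhostReader" to activate
delay 1
tell application "System Events" to keystroke "n" using command down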
When it comes to AppleScript, application control is where the action is. There's not much of a "wow" factor within AppleScript itself unless you're a real language nerd. It's really more about presenting a set of easy-to-use tools to control the "wow" factor of other applications.
I've seen (and have) examples of AppleScript playing simple card games and other text-based fun (well... as much fun as one can have viewing one display dialog after another), but these are (at best) academic exercises to show off the robustness of the language itself or a specific feature of AppleScript.
Simple, but I use this all the time!
tell application "System Events"
display dialog "$msg" with icon stop buttons {"Foo", "Bar", "OK"} default button "OK"
end tell
Whenever I'm doing some shell programming, it's convenient to have the operation surface a dialog in the Finder so I notice what's happening.
Very handy.
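In case it isn't obvious, $msg above is a shell variable, not AppleScript; a minimal sketch of how this can be called from a shell script (osascript reads an unquoted here-document, so the shell expands $msg before AppleScript ever sees it):
msg="Backup finished"
osascript <<EOF
tell application "System Events"
	display dialog "$msg" with icon stop buttons {"Foo", "Bar", "OK"} default button "OK"
end tell
EOF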
You can automate everything on your Mac, which is a great time saver.
I remember coding shell automation in C++ on Windows; it's just a pain to automate Windows.