How to grant AppleScript permissions through AppleScript? - applescript

If I give my AppleScript to another person, they would have to manually allow AppleScript control of their computer by going into System Preferences, clicking Security & Privacy, then Privacy, then Accessibility, and finally adding the script. Is there any way I can make AppleScript add itself so they don't have to? Is there another solution? Without them doing this, AppleScript can't click things.
Without the script added so it can have control, I get the error, "Script Editor is not allowed assistive access."

I believe that you are trying to enable assistive access for devices. For instance, GUI scripting requires this to be enabled. You can't enable it directly, but you can point the user to the place they need to go. The following finds out whether assistive access is enabled and brings up the System Preferences pane where the option is.
tell application "System Events"
set UI_enabled to UI elements enabled
end tell
if UI_enabled is false then
tell application "System Preferences"
activate
set current pane to pane id "com.apple.preference.universalaccess"
display dialog "This script utilizes the built-in Graphic User Interface Scripting architecture of Mac OS x which is currently disabled." & return & return & "You can activate GUI Scripting by selecting the checkbox \"Enable access for assistive devices\" in the Universal Access preference pane." with icon 1 buttons {"Cancel"} default button 1
end tell
end if
That should be enough for your users to get the hint.
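On newer macOS versions (10.9 Mavericks and later) that checkbox no longer exists; assistive access is granted per application under Security & Privacy > Privacy > Accessibility, as described in the question. A script still can't tick the box for the user, but it can open the right list for them. A minimal sketch, assuming the classic System Preferences app (not the newer System Settings) is what's installed:
tell application "System Preferences"
    activate
    -- jump straight to the Accessibility section of the Privacy tab
    reveal anchor "Privacy_Accessibility" of pane id "com.apple.preference.security"
end tell
display dialog "Please add this app to the Accessibility list and tick its checkbox, then run the script again." buttons {"OK"} default button 1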

You can theoretically do it via the command line:
https://raymii.org/s/snippets/OS-X-Enable-Access-for-assistive-devices-via-command-line.html
For example, in Lion and Mountain Lion it's as simple as this:
touch /private/var/db/.AccessibilityAPIEnabled
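If you wanted to run that same command from within an AppleScript on one of those older systems, a minimal sketch could wrap it in do shell script with administrator privileges, since /private/var/db is writable only by root. Note that macOS 10.9 and later ignore this file and manage assistive access per application instead:
-- Lion/Mountain Lion only; newer macOS versions ignore this file
try
    do shell script "touch /private/var/db/.AccessibilityAPIEnabled" with administrator privileges
on error errMsg
    display dialog "Could not enable assistive access: " & errMsg buttons {"OK"} default button 1 with icon caution
end try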

Related

Sending keystrokes in AppleScript via Automator in any application

I set up an AppleScript Quick Action in Automator to leave a Zoom meeting. No matter which application is in the foreground, I would like to check whether Zoom is running and, if so, leave the meeting.
set appName to "Zoom.us"
if application appName is running then
    tell application id (id of application appName)
        activate
    end tell
    tell application "System Events"
        keystroke "w" using command down
        keystroke tab
        keystroke return
    end tell
end if
It works! The problem is, if I'm in a Zoom meeting and another app is in the foreground, I have to grant that app permission in System Preferences before it can send the keystrokes. For example, if I'm in Chrome, I have to allow Chrome to send keystrokes; after that, Chrome always works.
I would have to do this for every single possible app. Is there a way to send the keystrokes without going through this security step in Big Sur? I don't mind bringing Zoom to the foreground.
There are issues at play here with using global keyboard shortcuts with Automator workflows saved as a Service/Quick Action.
The keyboard shortcut assigned to the Service/Quick Action must not conflict with a default keyboard shortcut of whichever application is frontmost at the time it's pressed; otherwise there may be unwanted behavior.
Every application that is frontmost when the keyboard shortcut is pressed for a Service/Quick Action that uses UI Scripting in its Run AppleScript action will need to have accessibility privileges granted for it (as you've already found out, hence the question).
To work around the accessibility-privileges issue, here are three methods that come to mind.
The first, my preferred method for running AppleScript scripts with a keyboard shortcut, is to use a third-party application named FastScripts. It does not require that every application that happens to be frontmost, and hasn't yet been granted privileges, be given accessibility privileges in order to run the AppleScript code shown in your question. I'd imagine other similar third-party applications that assign keyboard shortcuts to scripts would bypass the issue too, but I have only tested the aforementioned.
The second method can be done with Automator as a Service/Quick Action using a Run Shell Script action, assigned a keyboard shortcut; it works without having to give accessibility privileges to whichever application is frontmost when the keyboard shortcut is pressed.
The third method can be done with Automator as a Service/Quick Action using a Run AppleScript action, provided you change a zoom.us default preference under zoom.us > Preferences… > General by unchecking [] Ask me to confirm when I leave a meeting. Once assigned a keyboard shortcut, it also works without having to give accessibility privileges to whichever application is frontmost when the shortcut is pressed.
All testing was done under macOS Big Sur using zoom.us (Version: 5.4.7 (59780.1220)), with my Language & Region setting in System Preferences set to US English.
Method 1
The first method uses the example AppleScript code shown further below, run via FastScripts with the keyboard shortcut ⌃⌥⌘W assigned, and works for me as coded.
In System Preferences > Security & Privacy > Privacy > Accessibility I had the following added and checked:
FastScripts
System Events
Then, with zoom.us running and several other applications frontmost when the keyboard shortcut was pressed, I did not have to grant accessibility privileges to those other applications; zoom.us was brought to the front and closed.
Example AppleScript code:
if application "zoom.us" is running then
tell application "zoom.us" to activate
delay 0.5
tell application "System Events" to ¬
tell application process "zoom.us"
keystroke "w" using command down
delay 0.5
key code 36
end tell
end if
Note: for testing purposes, after testing Method 1 I quit FastScripts, as it would otherwise have been triggered by the same keyboard shortcut assigned when testing the next two methods.
FastScripts can be run as a free app with up to 10 keyboard shortcuts, or upgraded for $24.95 USD to unlock unlimited keyboard shortcuts. I have no affiliation with Red Sweater Software, LLC, other than as a user of FastScripts.
Method 2
The second method was tested using Automator and a Service/Quick Action with the setting Workflow receives [no input] in [any application], using a Run Shell Script action with its default settings; the following example shell script code is all that was used (pgrep -x checks for a process named exactly zoom.us, and if one is found, pkill -x sends it a terminate signal):
[[ -z $(pgrep -x 'zoom.us') ]] || pkill -x 'zoom.us'
In System Preferences > Keyboard > Shortcuts > Services I assigned it the keyboard shortcut: ⌃⌥⌘W
Then, with zoom.us running and several other applications frontmost when the keyboard shortcut was pressed, I did not have to grant accessibility privileges to those other applications; zoom.us was closed.
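If you would rather keep everything inside a Run AppleScript action, roughly the same check can be expressed with do shell script. This is only a sketch of the idea, not something tested as part of the answer above; because it never touches System Events, it should not trigger the accessibility prompt either:
-- if pgrep finds no zoom.us process it exits non-zero, which raises an
-- AppleScript error and skips the pkill line
try
    do shell script "pgrep -x 'zoom.us'"
    do shell script "pkill -x 'zoom.us'"
end try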
Method 3
The third method was tested using Automator and a Service/Quick Action with the setting Workflow receives [no input] in [any application], using a Run AppleScript action with its default code replaced by just the following example AppleScript code:
tell application "zoom.us" to quit
In System Preferences > Keyboard > Shortcuts > Services I assigned it the keyboard shortcut ⌃⌥⌘W after removing it from the Service/Quick Action created in Method 2.
Then, with zoom.us running and several other applications frontmost when the keyboard shortcut was pressed, I did not have to grant accessibility privileges to those other applications; zoom.us was closed.
This of course works because the [] Ask me to confirm when I leave a meeting preference was unchecked in the preferences for zoom.us.
To recap, if you do not mind changing the mentioned default preference in zoom.us, then Method 3 is probably the easiest and best way to resolve your issue: it allows a graceful quit (unlike Method 2), does not require any third-party application, and does not require any accessibility privileges to be granted or zoom.us to be frontmost. It just works.
I mentioned the other methods first because method one addresses the UI Scripting issue with Automator and a Service/Quick Action, and method two works with the default settings in zoom.us.

Automatically Add Application to Allow assistive access

Part of an application changes the scroll direction of the trackpad with the AppleScript below (used as AppleScriptObjC in an Xcode AppleScript application).
When it runs, it often pops up a message telling me that my app has not been allowed accessibility access (the usual message...).
Here is the code:
on TrackpadIsAttached()
    try
        tell application "System Preferences" to quit
        tell application "System Preferences"
            activate
            set current pane to pane id "com.apple.preference.trackpad"
        end tell
        tell application "System Events" to tell application process "System Preferences"
            -- select the second tab if it is not already selected
            tell radio button 2 of tab group 1 of window 1 to if value is 0 then click
            -- untick the first checkbox in that tab if it is ticked
            tell checkbox 1 of tab group 1 of window 1 to if value is 1 then click
        end tell
        tell application "System Preferences" to quit
        functionSuccessful("Mouse control optimisation completed successfully.")
    on error errMsg
        my errorReporting(errMsg, "Fatal Mouse Optimisation (Trackpad) Error")
        tell application "System Preferences" to quit
    end try
end TrackpadIsAttached
I know I can do it manually via System Preferences > Security & Privacy > Privacy > Accessibility, but can it also be done automatically with AppleScriptObjC or do shell script? I don't mind if the user has to type in a password to authenticate it, etc.
Any help is greatly appreciated.
Two things:
GUI Scripting is the automation option of absolutely last resort: humiliatingly clunky and brittle as hell, and enabling it either manually or automatically opens up a potential security risk, which is not a liability you want to place on either yourself or your users if you can possibly help it. If there is any programmatic way to perform the operation you need, use that instead. Which leads us to…
You haven't said why your users need and/or want to change their trackpad direction. What is the purpose of your app? Is it intended to give users who frequently reverse trackpad direction (e.g. while playing third-party games) an easier way to do it? Or is it trying to reverse it for its own purposes? (For example, if your app just needs to know which way the user's finger is really moving, use -[NSEvent isDirectionInvertedFromDevice].)
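To illustrate that last point, here is a minimal AppleScriptObjC sketch of reading the real finger direction inside a custom view, rather than flipping the system-wide setting. The class name and handler body are hypothetical; only the NSEvent calls are actual API:
script ScrollAwareView
    property parent : class "NSView"

    on scrollWheel:theEvent
        -- true when "natural" scrolling is on, i.e. the delivered deltas are
        -- inverted relative to the physical finger movement
        set flipped to (theEvent's isDirectionInvertedFromDevice()) as boolean
        set deltaY to theEvent's scrollingDeltaY()
        if flipped then set deltaY to -deltaY -- recover the raw finger direction
        -- use deltaY as needed
    end scrollWheel:
end script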
Update your question to explain your overall goal, and you're a lot more likely to receive robust high-quality answers that won't further dig you into any holes of your own unintended making.

Accessing an Applescript from 2 different menus produces different results. Why?

I've got an application that has an embedded Script menu for running AppleScripts. However, if you try to run scripts with certain functions (mostly UI related), they won't work unless you run them from the system Script menu (in the menu bar).
For example, if in theApplication you say:
tell application "System Events" to tell application process "theApp" to get all windows
it will return an empty list when run from the program's script menu, but a list of 2 windows when run from the system script menu. I've also tried:
tell application "System Events"
tell application process "theApp"
set allElements to UI elements
display dialog (count of allElements)
end tell
end tell
-- returns 2 when run from System script menu but 0 when run from within theApp.
Also, if you run it from Script Editor it works fine. GUI scripting is enabled for the application in System Preferences, so I'm curious why this is happening, and whether there are any workarounds (other than running the script from the system Script menu). The dictionary shows the Standard Suite, so it should have access to windows...
Any ideas?
I believe you haven't provisioned the app in question to use assistive devices.
You should open System Preferences, click the "Security" button, unlock the pane, and click the button for Assistive Devices (the blue icon with a white figure), then drag your app onto it (you can reveal the app from its Dock menu) and permit it to control your machine; remember to lock the pane afterwards.
I cannot guarantee this will work, but it is certainly worth a try.

Access context menu items by name using AppleScript

I'm trying to use AppleScript to click on context menu items in Logic Pro, preferably by simply providing the name of the menu item. It seems like this should be possible because I'm able to set up keyboard shortcuts for these context menu items using system preferences and providing the name of the command.
For instance, if you right-click on the main editing window in Logic, a menu pops up with an option called "Add Audio File..." If I create a System Preferences keyboard shortcut for Logic and give it this menu item name, it executes just fine. I'd like to recreate this with a script. I'm familiar with accessing normal menu items using the hierarchy like so:
tell process "Logic Pro"
tell menu bar 1
tell menu bar item "File"
tell menu "File"
click menu item "Save"
but as far as I know, there's no way to access the context menu (right-click menu) that I want like this. It seems there should be a way to simply access a non-menu-bar menu item by name, since System Preferences is obviously able to do so.
Logic Pro is not scriptable, so my suggestion would be to set a keyboard shortcut in System Preferences and then use System Events to send that shortcut.
For example, to enter find mode (assuming there is a find mode, since I don't own Logic Pro):
tell application "Logic Pro" to activate
tell application "System Events"
tell application process "Logic Pro"
keystroke "f" using command down
end tell
end tell
I don’t think you need to use the context menus. “Add Audio File…” is available in other parts of the Logic Pro X user interface. If you open the Project Audio window, there is an “Audio File” menu button with an “Add Audio File…” button in it. So this AppleScript will activate the “Add Audio File…” command:
tell application "System Events"
tell application "Logic Pro X" to activate
tell process "Logic Pro X"
tell menu bar 1
click menu item "Open Project Audio" of menu "Window"
end tell
delay 1
tell window 1
click menu button "Audio File" of group 1 of group 1
click menu item "Add Audio File..." of menu 1 of menu button "Audio File" of group 1 of group 1
end tell
end tell
end tell
One thing to keep in mind if distributing a GUI script is that the above script will only work in Logic Pro X running on a Mac set to US English (and maybe other varieties of English), because the names of the menus change if the system is set to another language. What you can do is replace the names in the above script with index numbers, which as far as I know is a trial-and-error process: you have to try different numbers and see which one keeps working.
So you may be able to replace:
menu button "Audio File" of group 1 of group 1
… in the above script with:
menu button 1 of group 1 of group 1
… and get the same functionality, and the script would work on any Mac. Or you may need to use “menu button 2.” Same goes for the other named items in the above script.
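Putting those substitutions together, a fully index-based version of the window portion might look like the sketch below; every index value of 1 is a guess that has to be verified by the trial-and-error process just described:
tell window 1
    -- index-based references; confirm each index on your own machine
    click menu button 1 of group 1 of group 1
    click menu item 1 of menu 1 of menu button 1 of group 1 of group 1
end tell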
Also keep in mind that the user you distribute this script to has to give System Events permission to control their Mac in the Security pane of System Preferences or this script won’t work. That can be a giant obstacle to distributing GUI scripts. And if you save your script as an Application, you have to digitally sign it or it won’t run on other people’s computers, and that can be complicated.
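As for driving the contextual menu itself, System Events can sometimes pop up an element's right-click menu through its AXShowMenu accessibility action. Whether Logic Pro's editing area exposes that action is an assumption, and the element path below is a placeholder you would need to find with Accessibility Inspector or a similar tool, so treat this purely as a sketch:
tell application "System Events"
    tell process "Logic Pro X"
        set frontmost to true
        -- "group 1 of window 1" is a placeholder element path
        perform action "AXShowMenu" of group 1 of window 1
        delay 0.5
        -- if a contextual menu appears, it is exposed as menu 1 of that element
        click menu item "Add Audio File..." of menu 1 of group 1 of window 1
    end tell
end tell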

Click menu item on Mac OSX Lion using AppleScript

I have a problem using a simple AppleScript on Mac OSX 10.7.3.
With the following simple AppleScript, which I found everywhere, OS X raises the error 'The action "Run AppleScript" encountered an error'.
I open Automator, create a Service, drop in a "Run AppleScript" action and enter the following code, which I assume is correct because, as I said, it is the way a lot of people are doing it without any complaints.
AppleScript:
tell application "Terminal" to activate
tell application "System Events"
tell process "Terminal"
click menu item "New Window" of menu "Shell" of menu bar 1
tell application "Terminal" to close the front window
end tell
end tell
EDIT: When running it in Automator I also get an error description:
Run AppleScript failed - 1 error
"Access for assistive devices is disabled"
Is "Enable access for assistive devices" enabled? If so, have you tried to re-enable it?
Well, I guess I'll answer the question anyways (and thanks for editing your question to give a bit more useful detail).
Go to the "Universal Access" pane in System Preferences and at the bottom of the "Seeing" tab, you'll see a "Enable access for assistive devices" checkbox.
Turn that on and I suspect Automator will work.
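If you want the workflow to fail with a clearer message when that checkbox is off, the same test used in the first answer above can be dropped into the Run AppleScript action. A small sketch:
tell application "System Events" to set UIScriptingOn to UI elements enabled
if not UIScriptingOn then
    display alert "GUI Scripting is disabled" message ¬
        "Turn on \"Enable access for assistive devices\" in the Universal Access pane of System Preferences, then run the workflow again."
    return
end if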
