Desired Results:
1. Open Screen Sharing.app
2. Input desired IP address and connect (changes depending on environment)
3. Auto Mute Microphone
4. Wait for session to connect
5. Auto switch to Observe Mode
6. Session is complete and Screen Sharing.app is closed
7. Auto Unmute Microphone
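Steps 3 and 7 I handle with the standard volume scripting addition; roughly, the sketch looks like this (the variable name is just illustrative):
-- remember the current microphone level so it can be restored later
set savedInputLevel to input volume of (get volume settings)
-- step 3: mute the microphone
set volume input volume 0
-- ... the screen sharing session happens here ...
-- step 7: restore the microphone
set volume input volume savedInputLevel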
What's been done and what's still needed:
1. I have successfully done steps 1, 2, 3, 6, and 7.
2. I am stuck trying to find a way to trigger Observe Mode without using a timer. If the connection isn't complete within the designated time, or the window is not selected, the switch fails and an error occurs.
3. I am using a combination of Automator and AppleScript. Most of the steps are set up by AppleScript.
My question:
How can I tell Automator or AppleScript to wait for Screen Sharing to finish connecting before proceeding with the remaining tasks?
Apart from the GetURL command, Screen Sharing.app is not scriptable, so you cannot ask it whether the VNC connection is up or not. Screen Sharing.app also has an internal timeout that triggers an error window ("can't open...") after a while.
So you cannot know in advance whether Screen Sharing will end up showing the session window or just the error dialog. The workaround I used is to check, before opening the vnc URL, that the target IP address answers ping requests. But you may have some servers which don't answer pings at all!
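A minimal sketch of that check (the address and timeout are just placeholders for your environment):
set vncHost to "192.168.1.20" -- placeholder address
try
    -- one ping, give up after 2 seconds; throws an error if the host doesn't answer
    do shell script "ping -c 1 -t 2 " & vncHost
    open location "vnc://" & vncHost
on error
    display dialog "Host " & vncHost & " did not answer the ping." buttons {"OK"} default button 1
end try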
The second part is easier, because the switch to Observe Mode can be done via a keystroke (it is in the Screen Sharing menu) with the script below:
tell application "Screen Sharing" to activate
tell application "System Events"
tell process "Screen Sharing"
keystroke "x" using {option down, command down}
end tell
end tell
Of course, this is only valid once Screen Sharing has opened the VNC window. And you must allow GUI scripting for your application in System Preferences.
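If you want to avoid a fixed timer, one option is to poll System Events for a Screen Sharing window before sending the keystroke. A rough sketch (note that the "can't open..." error dialog is also a window, so you may also want to compare the window's name against your host):
-- wait up to ~30 seconds for Screen Sharing to show a window
tell application "System Events" to tell process "Screen Sharing"
    repeat 30 times
        if (exists window 1) then exit repeat
        delay 1
    end repeat
    if not (exists window 1) then error "Screen Sharing never opened a window."
end tell
-- then bring it forward and send the Observe Mode keystroke as above
tell application "Screen Sharing" to activate
tell application "System Events" to keystroke "x" using {option down, command down}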
I am trying to make a simple AppleScript to open Spotify on my computer. I have assigned Spotify to always open on Desktop 2. However, every time it opens, it switches my focus from whatever desktop I am currently using to Desktop 2 where Spotify is being opened. Is there any way for me to prevent this, that is, for the app to open without switching my focus to a different desktop?
Current code:
tell application "Spotify"
activate
delay 3
playpause
end tell
I have also tried to open Spotify using commands like open -g -a 'Spotify', but this just opens the app in the background and still switches desktops.
Thanks for the help.
I'm writing a program that executes do JavaScript in Safari. The only problem is that I'm trying to make the app give itself permission to do it. I'm trying to locate the file that handles the Safari developer preferences so that I can do this. Does anyone have any idea where this might be or how to change these settings?
It's in Safari's preferences plist at ~/Library/Preferences/com.apple.Safari.plist. The key you want is AllowJavaScriptFromAppleEvents. You can set it using defaults:
#to turn it on
defaults write -app Safari AllowJavaScriptFromAppleEvents 1
#to turn it off
defaults write -app Safari AllowJavaScriptFromAppleEvents 0
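With that key set to 1, a script along these lines can run JavaScript in a page (a minimal sketch; it assumes Safari is running with a loaded tab in the front window):
tell application "Safari"
    -- returns the title of the frontmost tab's page
    do JavaScript "document.title" in current tab of front window
end tell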
The virtual keyboard thing did not work for me. As StarPlayr on Apple's developer forum found out, the problem lies elsewhere.
For me, the problem occurred when I tried to do this on a remote Mac.
For some people, plugging a keyboard and mouse into the server allowed them to turn on JavaScript Apple Events in Safari and set the password.
However, for me that wasn't an option, so the next best thing is to use the accessibility scripting feature and have the machine think a user is doing the clicks, allowing you to set the password:
-- The delays can be shorter, and the coordinates may vary
-- The best way to get the coordinates is with Apple's screen-capture selection (command-shift-4), which shows the pointer coordinates as you drag
-- If one spends the time, the click events can be converted to Accessibility AppleScript objects by capturing them as variables, or by checking the events and using those instead of the click coordinates
tell application "System Events"
tell application "Safari"
activate
end tell
delay 1
-- click develop menu (make sure its on first)
click at {430, 12}
delay 1
-- click Allow Javascript menu from Apple Events
click at {615, 615}
delay 1
-- Click the Allow Button
click at {1010, 386}
end tell
I'm looking for a way to lock the user's screen programmatically without putting the Mac to sleep.
Right now, I'm able to trigger the lock screen with the kAESleep event, but that's more of a hack and it puts the computer to sleep.
Is it possible?
Thanks
Configure the screensaver to require a password immediately after it starts, then start the screensaver programmatically. I have it bound to a keyboard shortcut to help my Windows folks transition to using real computers ;).
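Starting the screensaver from a script can be as simple as the sketch below (assuming "Require password immediately after sleep or screen saver begins" is already checked in the Security preferences):
tell application "System Events"
    -- kicks off whatever screensaver is currently selected
    start current screen saver
end tell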
The following AppleScript will do it for you. Note that because of security limitations in OS X, AppleScript pauses for five seconds before it executes a UI function, so it takes a little while to run. I'm using Quicksilver to bind it to a hotkey.
(As a bonus, this script will also pause a couple of your music players. Feel free to remove those lines.)
#
# Tell our noisy programs to shut up
#
tell application "Spotify"
pause
end tell
tell application "iTunes"
pause
end tell
#
# Lock the screen without going to sleep. Requires that Keychain Access
# is set up properly.
#
tell application "System Events" to tell process "SystemUIServer" to click (first menu item of menu 1 of ((click (first menu bar item whose description is "Keychain menu extra")) of menu bar 1) whose title is "Lock Screen")
You will need to set up Keychain Access so that it shows the lock icon in the menu bar, though.
I was successfully able to lock the screen on macOS in Python with the following:
import ctypes, ctypes.util
# load Apple's private login framework
login = ctypes.CDLL('/System/Library/PrivateFrameworks/login.framework/login')
# call the undocumented function that locks the screen immediately
login.SACLockScreenImmediate()
I discovered this through scarce information on the Internet and trial and error. As far as I know, Apple doesn't document the SACLockScreenImmediate() function at all.
If anyone can find the official reference documentation for the "Login Framework" library, please drop it in the comments :)
Source
This is used in the BusKill app, which locks the screen when the magnetic breakaway connection of a USB dead man's switch is severed:
https://github.com/BusKill/buskill-app/tree/master/src/packages/buskill
When my system boots, I would like to run an AppleScript that opens files in three different desktops/spaces.
First Space: Mail and Things (my to do list program)
Second Space: Textmate and a Safari for my first project
Third Space: Textmate and a Safari for my second project
First, in Mission Control I created two more desktops, which will remain there the next time my system boots unless they are manually removed. Instead of creating one long script, I chained three AppleScripts (boot1, boot2 and boot3) to break it up into simpler blocks of code. At the end of boot1 you will see:
run script file "<drive name>:Users:<username>:boot2.scpt"
In boot2 and boot3 you will see a bunch of delay lines. One thing I dislike about AppleScript is that it often starts processing the next command before the OS finishes responding to the prior one. This causes inconsistencies and errors. Delays are a hack to force things to slow down. They help, but even when you use them things are still a bit dicey. In boot2.scpt:
# this emulates the keyboard shortcut to move to desktop 2
# there doesn't seem to be any way to modify an `open` command to open a file on desktop 2
tell application "System Events"
    delay 2
    # key code 19 is the key code for the number 2.
    # <ctrl>-2 is the shortcut to get to desktop 2
    key code 19 using control down
end tell
tell application "TextMate"
    activate
    # 'sites' is the name of the directory my projects are in
    open "/users/<username>/sites/project1/"
end tell
tell application "Terminal"
    activate
    do script "cd /users/<username>/sites/project1/"
    delay 2
    do script "rails s" in front window
    delay 2
    tell application "System Events" to tell process "Terminal" to keystroke "t" using command down
    tell application "System Events" to tell process "Terminal" to keystroke return
    delay 2
    do shell script "open -a Safari http://localhost:3000"
end tell
OK... so this mostly works to get desktop 2 in place, except for inconsistencies when the delays aren't long enough. boot3.scpt is almost the same as boot2, but when it tries to open an application on desktop 3, the system jumps back to desktop 2 because there is already a window on desktop 2. This is the next problem. How do I overcome that?
2305491 is no longer relevant because space preferences are gone.
Thanks.
boot3.scpt is almost the same as boot2, but when it tries to open an application on desktop 3, the system jumps back to desktop 2 because there is already a window on desktop 2.
There is an option in the Mission Control Preferences called "When switching to an application, switch to a Space with open windows for the application". Uncheck this.
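If you'd rather toggle it from a script, the same checkbox appears to be backed by the AppleSpacesSwitchOnActivate global default; treat the key name as an assumption and verify it on your system:
-- assumption: this global default backs the Mission Control checkbox above
do shell script "defaults write -g AppleSpacesSwitchOnActivate -bool false"
-- Mission Control behaviour is owned by the Dock, so relaunch it to pick up the change
do shell script "killall Dock"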
OK... so this mostly works to get desktop 2 in place except for inconsistencies when the delays aren't long enough.
A better solution is always something like this:
repeat until something exists
    delay 0.1
end repeat
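For example, to block until TextMate has actually opened a window before moving on (TextMate is just the example from your script; wait for whatever element you actually need):
tell application "System Events"
    tell process "TextMate"
        repeat until (exists window 1)
            delay 0.1
        end repeat
    end tell
end tell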
I have two AIR applications that I wrote. They go fullscreen automatically after 10 seconds. Before then, they need to be sent to their proper displays: "app_1" needs to run on display 1, and "app_2" needs to run on display 2.
Essentially, I have this code:
do shell script "cd /Applications/app_1.app/Contents/MacOS/ ; open app_1;"
which works for me flawlessly. Both apps are launched that way, and there is some code for ensuring that the apps weren't already open, and closing them if they were.
I tried to add in a script to position the app after it is launched:
do shell script "cd /Applications/app_1.app/Contents/MacOS/ ; open app_1;"
tell first window of application "app_1" to set bounds to {0,0,1920,1080}
This gives me an error:
app_1 got an error: Can't set bounds of window 1 to {0,0,1920,1080}
I tried adding a delay of a couple seconds before the set bounds, in case the application hadn't yet launched when the set bounds fired off, however this didn't change anything.
I also tried setting the bounds to something like {100,100,200,200} just to see if I had the screen coordinates wrong or something, but I still got the exact same error, only with {100,100,200,200} instead of the original 1920x1080 coordinates.
Anyone have any insight on this? I've been trying to find the solution on Google for a couple of hours now.
It sounds like your app isn't exposing the standard "window" class. I don't know if AIR apps are supposed to automatically take care of this and it's not working—if so, you'll want to debug that.
But another alternative is to use UI Scripting to control its windows externally. Instead of this:
tell first window of application "app_1" to set bounds to {0,0,1920,1080}
Do this:
tell application "System Events"
set position of first window of application process "app_1" to {0, 0}
set size of first window of application process "app_1" to {1920,1080}
end tell
However, this will only work if you've gone to the Universal Access pane of System Preferences and checked "Enable access for assistive devices" (or done the same via API, "sudo touch /var/db/.AccessibilityAPIEnabled", etc.).
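As a usage example, with two 1920x1080 displays arranged side by side (so the second display's origin is at x = 1920; adjust for your actual arrangement), positioning both apps might look like this:
tell application "System Events"
    -- app_1 fills display 1
    set position of first window of application process "app_1" to {0, 0}
    set size of first window of application process "app_1" to {1920, 1080}
    -- app_2 fills display 2, which starts at x = 1920 in this side-by-side arrangement
    set position of first window of application process "app_2" to {1920, 0}
    set size of first window of application process "app_2" to {1920, 1080}
end tell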