How can I intercept what is being said to dictation on OS X 10.11?

I am using a MacBook Pro (Early 2011) running OS X El Capitan. Apple re-added the enhanced dictation feature that allows constant listening, which I have been using to trigger AppleScripts (for example, saying "Computer, quit" runs a quit script). I would love to expand on this, though.
My question: where can I find what the computer is interpreting? I want to use the entire phrase, like "Computer, I want to listen to music", and have a script search that phrase for the keyword "music" (then open iTunes or something). This way, instead of being limited to short commands like "quit" or "shut down", it could interpret whole sentences. Also, how would I create a program that keeps running in the background, watching for these keywords?
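For the keyword-matching part, this is roughly what I picture (a sketch only; the handlePhrase handler is hypothetical, and getting the recognized phrase into the script as a string is exactly the part I don't know how to do):
-- Hypothetical handler: thePhrase would be whatever dictation recognized.
-- How to actually capture that phrase is the open question.
on handlePhrase(thePhrase)
    -- search the whole recognized sentence for keywords
    if thePhrase contains "music" then
        tell application "iTunes" to activate
    else if thePhrase contains "shut down" then
        tell application "System Events" to shut down
    end if
end handlePhrase

-- example call with a full sentence
handlePhrase("Computer, I want to listen to music")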
Thank you for the help (I am quite the novice),
rednaxelaf7

Related

How to differentiate between repeating events in Apple Calendar

On macOS 10.12.4.
I have a list of repeating events in my Apple Calendar that are set to run an AppleScript at certain times. However, if I happen to be away (on vacation, etc.),
then when I switch my Mac back on it runs all the scripts that did not run while the computer was off. Annoying, and it takes a long time.
I tried some AppleScript code that checks that the script about to run is the one for today and today only (not for the days before), but I could not find a way to discriminate between repeating events: they all have the same properties!
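The guard I tried looks roughly like this (a sketch; the scheduled hour is hard-coded because I could not find any property on the event that says which repeat instance fired):
-- Guard at the top of the script: catch-up runs fire at wake time,
-- not at the scheduled time, so comparing the hour filters them out.
set scheduledHour to 9 -- the hour this event is meant to fire (hard-coded assumption)
if (hours of (current date)) is not scheduledHour then
    return -- stale catch-up run from a day the Mac was off; bail out
end if
-- ...the actual work of the script goes here...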
Does anyone have a way around this?
Thanks

Mac OS X: interacting with an application programmatically

I am working on a project where I need to call methods on an existing application (my own) and use some of its functionality. For example, my application ThunderBolt runs on Mac OS X 10.10. It also provides a dictionary of events that can be called externally through AppleScript, or through some other way that I don't know yet.
My question is: what are the different (and better) ways of interacting with an application programmatically on Mac OS X? If I use something like the following code in AppleScript Editor:
tell application "ThunderBolt"
    set open_file to (choose file with prompt "Choose the file you wish to parse")
    set theContents to read open_file as data
    set retPict to (image convert theContents)
end tell
then it is going to launch ThunderBolt with a splash screen and then call "image convert". This can be done via NSAppleScript as well, but it would still launch the application and call methods/events on it.
Is it possible to somehow create an instance of (or get a pointer to) one of the classes inside the application and use that? Something similar to COM or automation on a Windows system?
If you're working on OS X 10.10, you might consider taking a look at JavaScript for Automation (JXA).
With it you can apparently build methods into your app that can be invoked from client scripts written in JS (although I'm not yet familiar with the particulars of how to handle implementation of such a thing on the app side). But many of the apps that ship as part of OS X Yosemite have such APIs built in (e.g. iTunes and Finder).
Here's a great tutorial on JXA written by Alex Guyot: http://www.macstories.net/tutorials/getting-started-with-javascript-for-automation-on-yosemite/
The JXA-Cookbook repo also appears to be a nice resource: https://github.com/dtinth/JXA-Cookbook/wiki
Here's a brief example - this script makes iTunes go back one track. Try it while iTunes is playing (by putting the text into Script Editor, with the language option set to JavaScript, and hitting the Run button):
iTunes = Application('iTunes')
state = iTunes.playerState()
// Console msgs show up in the Messages tab of the bottom view:
console.log("playerState: " + state)
iTunes.backTrack()
Alternatively, you can place the code into a .js file and run it on the command line:
$ osascript itunes-backTrack.js
playerState: playing
The 'tell application' approach you already use is the best way, in my opinion.
What does your app do that needs to be called externally? Maybe some of the functionality can be done with AppleScript? That would simplify things a lot.

Mac OS X daemon for a task to maximize windows

I want a small piece of functionality for all my Mac OS X application windows. When I double-click a window's title bar, either nothing happens or the window gets minimized (if the appropriate option is checked). Instead, I want it to be maximized completely (not full screen).
I am assuming that I should write a daemon for this, but I am quite new to coding.
So my question is:
Can my "goal" be achieved by a daemon?
No, it cannot. There is no way to implement this functionality with public APIs on Mac OS X.

Is there a way to capture console controls (keyboard/mouse/remote) in OS X so they can be replicated on another machine?

I need to mirror GUI console activity happening on one MacBook so that it is duplicated on a second, identical MacBook.
The idea is to control an application that will run on two Macbooks simultaneously. The application is sort of a presentation with two variations in content, but identical controls. Think of it as two versions of PowerPoint presentation with some slides that are different.
I'm thinking that it may be possible to capture the keypresses and mouse events on one Mac, then use RFB protocol to send these across the network to the other Mac. I'm looking at rfbproxy and rfbplaymacro, but these are somewhat inelegant hacks, and any solution built on these will also be a bit of a hack. And of course, I'd prefer to avoid a solution that requires me to compile and perhaps debug software that hasn't been touched in half a decade. :-)
I could conceivably use Cliclick or xdotool (from MacPorts) to initiate console events on the "slave" Mac. But then I don't know what I'd use to capture the events on the "master". Or would an xdotool-based solution require that both Macs be slaves, and then use some other device as a master?
Input devices could be a presentation mouse, an Apple Remote, or, in a pinch, the keyboard of one of the MacBooks or even a third device.
Can you suggest tools? Or is there another strategy I haven't thought of?
If the computers are in the same room, a single Apple Remote can control both Macs as long as the remote is not paired to either one. I'm assuming you need a solution that will work over any arbitrary distance, though.
Have you considered AppleScript? It's pretty good at sending keystrokes to ssh-accessible Macs. The receiving application doesn't even need to be aware of AppleScript (i.e. scriptable). You'll just have to be sure GUI scripting is enabled on the targets by checking the Enable access for assistive devices option in the Universal Access system prefs panel.
Here's an example of a shell command that will send a keystroke to the frontmost app via applescript:
osascript -e "tell application \"System Events\" to keystroke \"a\""
If you set up key-based ssh auth between the master and slaves you can simply tack ssh onto the front of this command:
ssh slave osascript -e "tell application \"System Events\" to keystroke \"a\""
For elegance, you could wrap any number of desired keystrokes into a menu-based bash script and run it from a third computer.
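As a sketch of that same idea in AppleScript instead (the hostnames slave1 and slave2 are placeholders, and key-based ssh auth is assumed), one handler can fan a keystroke out to both machines:
-- Fan one keystroke out to both presentation Macs.
on sendKey(k)
    repeat with h in {"slave1", "slave2"}
        do shell script "ssh " & h & " osascript -e 'tell application \"System Events\" to keystroke \"" & k & "\"'"
    end repeat
end sendKey

sendKey(" ") -- e.g. spacebar to advance both presentations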
A few years ago I tried to synchronise non-Mac systems like this using rfbproxy and rfbplaymacro, which you already know about. The systems were both X terminals running at the same resolution. We still had problems because different font-size settings put application controls in different places, but the basic VNCiness of the solution seemed to work just fine.
That said, if you want to write a stand-alone application to send events using osascript or cliclick or xdotool, and you have a Wii, you might get some joy from DarwiinRemote.
Kind of convoluted, but you could use ClusterSSH for OS X to start shell sessions from a third machine's master window, and then send commands to the two slaves. This could be paired with a screen-control utility similar to the ones you list above, another of which is pymaCursor.
If everything could instead be recorded in advance, you could try good ol' AppleScript/Automator recording, or a newer project like Sikuli - http://sikuli.org/

Calling a command line in Mono on Mac OS X

I want to be able to call Automator or Unix commands like ls from a Mono app and get the results back.
This can be accomplished easily on Windows. The question is: how is this done on the Mac?
Caveat: I've never written a char of Mono in my life.
I imagine it's a matter of firing up a process and redirecting its stdout. This Linux forum thread shows that you can do pretty much that - OS X will behave mostly as a UNIX-like system for you, I reckon.
Oh, by the way: if you want to fire up an OS X application, have a dig around inside the ".app" bundle. OS X shows these as a single file, but they're actually directories. In the Finder you can right-click and choose "Show Package Contents", or you can open a terminal and cd into them. For instance, you can launch Automator like this from the terminal:
/Applications/Automator.app/Contents/MacOS/Automator
I don't know if you would want to go down this route, but if you're going to be interfacing with OS X (GUI) apps, you might want to look at using AppleScript as some "glue" between Mono and the app.
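As a toy illustration of that glue idea, AppleScript itself can shell out to Unix commands and hand the results back as a string (a minimal sketch):
-- Run a Unix command and capture its output as an AppleScript string.
set fileListing to do shell script "ls -l /Applications"
-- Launch Automator the ordinary way (no need to dig into the bundle):
tell application "Automator" to activate
return fileListing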
