Add "Close" button to macOS notifications - macos

I'm using a plugin for my shell that displays a notification when a long-running command has completed. To do this on macOS, it uses AppleScript:
osascript -e "display notification \"$message\" with title \"$title\""
When this notification pops up, it only disappears once it times out; I cannot dismiss it.
This often blocks me, as it covers my browser tabs in the top right of the screen.
I've seen similar notifications that have buttons to close them, e.g. Slack message notifications.
How can I add a "Close" button to the notification?

The "Mac Automation Scripting Guide" gives an example of this.
The guide states the difference is determined by your settings.
Notifications are shown as alerts or banners, depending on the user’s settings
To change notification settings to include buttons, go to
System Preferences > Notifications > Script Editor > Alert Style > Alerts
It's unclear how to change the AppleScript command to include the buttons when using the default "banner" notification style. It may be possible as other applications always show their notifications with buttons.
For more info on AppleScript commands and its parameters, (1) launch Script Editor and (2) open the Standard Additions dictionary, then (3) navigate to the command’s definition.
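For comparison, applications whose notifications always carry buttons (Slack, the App Store, etc.) typically post them through Apple's UserNotifications framework rather than display notification. Below is a minimal Swift sketch of that approach, assuming it runs inside a bundled app that has been granted notification permission; the identifiers are illustrative.

import UserNotifications

// A category with a custom action is what puts a button on the notification.
let center = UNUserNotificationCenter.current()
center.requestAuthorization(options: [.alert, .sound]) { granted, _ in
    guard granted else { return }

    let close = UNNotificationAction(identifier: "close-action",
                                     title: "Close",
                                     options: [])
    let category = UNNotificationCategory(identifier: "long-command",
                                          actions: [close],
                                          intentIdentifiers: [],
                                          options: [])
    center.setNotificationCategories([category])

    let content = UNMutableNotificationContent()
    content.title = "Command finished"
    content.body = "Your long-running command has completed."
    content.categoryIdentifier = "long-command"

    // A nil trigger delivers the notification immediately.
    center.add(UNNotificationRequest(identifier: UUID().uuidString,
                                     content: content,
                                     trigger: nil))
}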

Possibly of interest: there's an open-source command-line utility on GitHub, https://github.com/vjeantet/alerter, which lets you do this and lets you specify many more notification options on an alert-by-alert basis. Alerter gives you much more control over alerts than AppleScript's display notification.
Alerter lets you create an alert with numerous options: a custom icon, custom button labels, up to three different kinds of actions to replace the default "show" button on the alert, a specified application or URL to open when the alert is clicked, a Reply box like the one you get for an iMessage, a way to group and remove previously displayed alerts, and many more.
Alerter gives you complete control over the buttons displayed, per alert, by specifying options in the shell command you enter to create each alert. You can specify apps or URLs to open when the alert is clicked, or, if you use multiple options in the alert, your script can process the text returned to determine which option was selected, as in the sketch below.
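As a hedged illustration, here is a Swift sketch that launches alerter and parses the text it prints to stdout. The flag names and the @TIMEOUT/@CLOSED sentinel values follow the project's README, and the install path is an assumption; verify both against your installed version.

import Foundation

let alerter = Process()
alerter.executableURL = URL(fileURLWithPath: "/usr/local/bin/alerter") // assumed install path
alerter.arguments = ["-title", "Build finished",
                     "-message", "Deploy to staging?",
                     "-actions", "Deploy,Later",
                     "-closeLabel", "Dismiss",
                     "-timeout", "30"]

let out = Pipe()
alerter.standardOutput = out

do {
    try alerter.run()
    alerter.waitUntilExit()
    let data = out.fileHandleForReading.readDataToEndOfFile()
    let answer = String(data: data, encoding: .utf8)?
        .trimmingCharacters(in: .whitespacesAndNewlines) ?? ""

    switch answer {
    case "Deploy":   print("user chose Deploy")
    case "Later":    print("user chose Later")
    case "@TIMEOUT": print("alert timed out")          // sentinel per the README
    case "@CLOSED":  print("user dismissed the alert") // sentinel per the README
    default:         print("unexpected result: \(answer)")
    }
} catch {
    print("could not launch alerter: \(error)")
}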
(Note: https://github.com/julienXX/terminal-notifier is the original project Alerter was forked from. Terminal-notifier includes a few minor extra abilities that Alerter lacks, but doesn't give you the same control over the buttons. If you need additional functionality, it may be worth reviewing both projects.)
I'm not involved with Alerter or Terminal-notifier, just a user.

To expand on Dennis' answer:
You can set the style of notification on a per-app basis in:
System Preferences > Notifications > Script Editor > Alert Style > Alerts
In the screenshot in his answer, he's changing the setting for notifications generated by the 'Script Editor' app.
I had to change it for 'Automator' to get my custom scripts to have close buttons.

Related

AppleScripts cannot seem to target the Preferences Pane (Ventura)

I'm trying to use Automator to record an action on System Preferences, in the Accessibility display pane. I want to change the value of the slider for the color filter intensity. (There are many similar questions asked here on SO, but they seem to be from 4-11 years ago. I think the issue may be changes with Ventura, etc.)
I use Automator's Watch Me Do to record the action. I reset the preferences manually. Clicking Run does not cause the setting to change in System Preferences.
So I drag the actions from the Watch Me Do panel to the workflow timeline, which pops the action out into an AppleScript. Running this is also unsuccessful.
Can I manually set this preference in some way via the CLI? When I search the Apple developer docs, all I'm finding is Swift code for writing an application. (I realize that the app making this change, whether Automator, Terminal, or otherwise, will need to be granted accessibility permissions under the privacy/security system preferences.)
Or how do I get/write an AppleScript to do this?
Also, I have checked the OSAX dictionary for System Preferences, and it does not have more specific controls registered beyond the basic window/pane level.

Is it possible to launch the task pane if certain conditions are met?

I want the user to press a command button which will call an API. If the API returns results, I want this to launch the task pane and then display the result of the call there.
Is this possible?
If we're talking about web add-ins, the task pane is launched by the button click, independently of API results. At runtime you may decide what to display in the task pane.
But if you mean a custom task pane as part of a COM add-in, you can do whatever you like: hide it, show it, and so on.
For web add-ins, launching a task pane after running some code/API is not possible today. We track Outlook add-in feature requests on our Tech Community page. Please submit your request there and choose the appropriate label(s). Feature requests on Tech Community are considered when we go through our planning process.
Here are two alternatives I would suggest considering, to see if they can work for your scenario:
Adding a command with ExecuteFunction as an action (https://learn.microsoft.com/en-us/office/dev/add-ins/reference/manifest/functionfile) and launching a dialog (displayDialogAsync).
Or, running an ExecuteFunction that adds a notification message with an action link the user can click to open a task pane (https://learn.microsoft.com/en-us/javascript/api/outlook/office.notificationmessageaction?view=outlook-js-preview).

Controlling the "Open in ..." button in the Quick Look (preview) panel

The QLPreviewPanel window has a button that will allow the user to open the quick look document they are currently previewing by launching its original application.
Is it possible to (a) disable this button for some documents and (b) learn whether the user has clicked that button?
My problem is that some of the QLPreviewItem objects I'm passing to QLPreviewPanel are actually placeholders that aren't intended to be opened, while others are temporary documents that get created spontaneously.
In the latter case, I normally delete these when the preview is done, but obviously I don't want to do that if the user has opened them in an application.
I've looked at the API for QLPreviewItem, QLPreviewPanel, and QLPreviewPanelDelegate and don't see any notifications or messages that occur when the user opens an item.
If there's no API, I might just try to hack the UI by searching the QLPreviewPanel for an NSButton and hooking its action, but I don't like hacks and I'm sure this would be a fragile one.
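For what it's worth, a rough Swift sketch of that view-hierarchy search might look like the following. It relies entirely on private UI, so the button's position, class, and action can change in any macOS release; inspect the candidates it prints before hooking anything.

import Cocoa
import Quartz

// Recursively collect every NSButton beneath a view.
func buttons(in view: NSView) -> [NSButton] {
    view.subviews.flatMap { subview -> [NSButton] in
        var found = buttons(in: subview)
        if let button = subview as? NSButton { found.append(button) }
        return found
    }
}

if let panel = QLPreviewPanel.shared(), let root = panel.contentView {
    for button in buttons(in: root) {
        // Identify the "Open in ..." button heuristically, then swap in
        // your own target/action (forwarding to the original afterwards).
        print(button.title, button.action.map { String(describing: $0) } ?? "no action")
    }
}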

AppleScript Notifications with Buttons

Notification alerts are fairly easy to program in AppleScript:
osascript -e 'display notification "notification text content" with title "Alert!" sound name "Purr"'
However, I have yet to find a web page or Stack Overflow question that explains how to program them with buttons, similar to the type that appears when the App Store pesters you to upgrade to OS X Yosemite. In addition, the syntax needs to work through the osascript -e command.
How do I program an AppleScript notification with buttons through Notification Center?
Thanks in advance.
The display notification command of Standard Additions provides only the parameters sound name, subtitle, and with title, plus the body of the notification as the direct parameter, because it's not possible to handle the callbacks fired when the user presses a button in the AppleScript runner environment.
You can go to Preferences > Notifications and scroll through the applications to find Script Editor, which is the application that sends the notification when you execute osascript -e '...'. Select Alerts as the alert style and you are all set!
The documentation is pretty clear here:
Notifications are shown as alerts or banners, depending on the user's settings in System Preferences > Notifications
But it doesn't provide example code to catch the button click. (If you try, it just opens the AppleScript editor.)

How do I get keyboard events in an NSStatusWindowLevel window while my application is not frontmost?

After creating a translucent window (based on example code by Matt Gemmell), I want to get keyboard events in this window. It seems that there are only keyboard events when my application is the active application, while I want keyboard events even when my application isn't active but the window is visible.
Basically I want behavior like that provided by the Quicksilver application (by blacktree).
Does anybody have any hints on how to do this?
There are two options:
Use GetEventMonitorTarget() with a tacked-on Carbon run loop to grab keyboard events. Sample code is available on this page at CocoaDev.
Register an event tap with CGEventTapCreate. Sample code can be found in this thread from the Apple developer mailing list.
Edit: Note that these methods only work if you check off “Enable access for assistive devices” in the Universal Access preference pane.
A simpler route that may work better for you is to make your app background-only. The discussion on CocoaDev of the LSUIElement plist key explains how to set it up. Basically, your application will not appear in the dock or the app switcher, and will not replace the current application's menu bar when activated. From a user perspective it's never the 'active' application, but any windows you open can get activated and respond to events normally. The only caveat is that you'll never get to show your menu bar, so you'll probably have to set up an NSStatusItem (one of those icon menus that show up on the right side of the menu bar) to control (i.e. quit, bring up prefs, etc.) your application.
Edit: I completely forgot about the Non-Activating Panel checkbox in Interface Builder. You need to use an NSPanel instead of an NSWindow to get this choice. This setting lets your panel accept clicks and keyboard input without activating your application. I'm betting that some mix of this setting and the Carbon Hot Keys API is what QuickSilver is using for their UI.
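A minimal Swift sketch of that panel setup (the geometry is illustrative):

import Cocoa

// An NSPanel with the .nonactivatingPanel style can become key and take
// keystrokes without activating the owning application.
let panel = NSPanel(
    contentRect: NSRect(x: 200, y: 200, width: 400, height: 120),
    styleMask: [.titled, .nonactivatingPanel],
    backing: .buffered,
    defer: false
)
panel.level = .statusBar        // the NSStatusWindowLevel the question asks about
panel.isFloatingPanel = true
panel.hidesOnDeactivate = false
panel.makeKeyAndOrderFront(nil)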
Update:
Apple actually seems to have changed everything again starting with 10.5, by the way (I recently upgraded and my sample code no longer worked as before).
Now you can only capture key-down events with an event tap if you are either root or assistive devices are enabled, regardless of which level you plan to capture at, and regardless of whether you chose an active tap (which allows you to modify and even discard events) or a listen-only one. You can still be notified when modifier flags change (and can actually even change them) and receive other events, but key-down events under no other circumstances.
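A Swift sketch of such a listen-only key-down tap, subject to the root/assistive-access caveat above (on current systems that means the Accessibility permission):

import Cocoa

let mask = CGEventMask(1 << CGEventType.keyDown.rawValue)
guard let tap = CGEvent.tapCreate(
    tap: .cgSessionEventTap,
    place: .headInsertEventTap,
    options: .listenOnly,            // listen-only; use .defaultTap to modify events
    eventsOfInterest: mask,
    callback: { _, type, event, _ in
        if type == .keyDown {
            let keyCode = event.getIntegerValueField(.keyboardEventKeycode)
            print("key down, keycode \(keyCode)")
        }
        return Unmanaged.passUnretained(event)
    },
    userInfo: nil
) else {
    fatalError("tap creation failed; check permissions")
}

let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
CGEvent.tapEnable(tap: tap, enable: true)
CFRunLoopRun()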
However, using a Carbon event handler and the function RegisterEventHotKey() allows you to register a hotkey, and you'll get notified when it is pressed; you neither need to be root for that, nor do you need anything like assistive devices enabled. I think Quicksilver is probably doing it that way.
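A Swift sketch of the RegisterEventHotKey() route, assuming it runs inside an app with a run loop; the key combination and the four-character signature are arbitrary:

import Carbon.HIToolbox

var eventSpec = EventTypeSpec(eventClass: OSType(kEventClassKeyboard),
                              eventKind: UInt32(kEventHotKeyPressed))

// Called whenever any hot key registered by this app fires.
InstallEventHandler(GetApplicationEventTarget(), { (_, _, _) -> OSStatus in
    print("hot key pressed")
    return 0 // noErr
}, 1, &eventSpec, nil, nil)

var hotKeyRef: EventHotKeyRef?
let hotKeyID = EventHotKeyID(signature: OSType(0x44454D4F), // "DEMO"
                             id: 1)
RegisterEventHotKey(UInt32(kVK_Space),           // Cmd-Opt-Space
                    UInt32(cmdKey | optionKey),
                    hotKeyID,
                    GetApplicationEventTarget(),
                    0,
                    &hotKeyRef)

RunApplicationEventLoop() // or rely on your app's normal run loop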
