Using the Apple OS X Cocoa framework, how can I post a sheet (slide-down modal dialog) on the window of another process?
Edit: Clarified a bit:
My application is a Finder extension to do Subversion version control (http://scplugin.tigris.org/). Part of my application is a plug-in (a Contextual Menu Item for Finder); the bulk of my application, however, is in a separate daemon process. For several reasons, we've chosen to put virtually all the code into the daemon; the plug-in only defines the menu itself and sends Apple Events over to the daemon.
Sometimes, the daemon needs to prompt the user for further information. It can toss a window on-screen for this, but that's disruptive (randomly positioned), and it seems to me the workflow here is legitimately modal, for example "select a file, pick 'commit' from the menu, provide commit comments, do the operation."
Interprocess cooperation (such as passing a reference of some kind) is acceptable: both processes are mine, but I want to avoid binding the sheet's code into the primary process.
It sounds like you're trying to have your inter-process communication happen at the view level, which isn't how Cocoa is designed to work. Things will be much easier if you separate your layers a bit more than that.
Why don't you want to put the sheet code into the other process? It's view code, and view code is inherently process-specific. The right thing to do here is probably to add somewhat generic modal-sheet support to your plugin code, and an IPC call that your daemon can make to summon that code. Trying to ship view objects over to the remote process is going to be nightmarish if you can make it work at all.
You're fighting the frameworks with this approach.
You can't add a sheet to a window in another process, because you have at most only the most restricted access to the windows in the other process.
Please don't do this. Make the interaction nonmodal if at all possible. Especially in something like a commit, it's much nicer to be able to browse around your files while you're writing commit comments.
OS X does have window groups, but I don't think they can (easily) span applications.
Another thing to consider is that in OS X it's possible to have many Finder windows open on the same folder (unlike in OS 9). Even if you did have sufficient privileges/APIs to add a sheet to a Finder window, it's not like the modality of that window would prevent the user from being able to continue working with the files.
(My personal opinion as a long-time Mac user is that this kind of interaction would drive me right up the wall.)
Related
So my question is: Is it possible to open, for instance, Discord, Chrome, Spotify, or any files in "one process/window"? What I mean by that is, if I wanted to open 3 apps but I don't want them to take so many tabs and so much space, I would want to have those 3 apps in one window. So let's say I were working on 3 different projects and I don't want to always switch desktops or look for the apps every time I switch projects; I would just want to alt+tab to that project. It would also be nice to be able to prioritise one app in the task manager instead of 5. If that is not possible, then my thought was to create a "screen sharing program" that captures the other apps and displays them in a UI that I could design. If it were possible, how would I go about starting that?
Simply put, no. You could, however, try creating different desktops on Windows.
See this page for more information:
https://support.microsoft.com/en-us/windows/multiple-desktops-in-windows-36f52e38-5b4a-557b-2ff9-e1a60c976434
Yes, it is possible to put a process's window inside another process's window. I did it in a very limited case once, but that involved placing a non-interactive (display-only) window inside of an application I wrote which was used by only a few people. I have no idea how well it would work with multiple interactive windows.
Use SetParent in your "screen sharing program" to place the desired applications inside your window. However, take note of Adrian McCarthy's answer to Good or evil - SetParent() win32 API between different processes, and note Raymond Chen's blog posts from the comments:
Is it legal to have a cross-process parent/child or owner/owned window relationship?
Sharing an input queue takes what used to be asynchronous and makes it synchronous, like focus changes.
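To make those caveats concrete, here is a minimal Win32 sketch in C++ of the reparenting approach; the function name, the host window handle, and the target window title are placeholders, and real code would need per-application tweaks to styles and sizing:

#include <windows.h>

// Reparent another process's top-level window into a host window we own.
// The target is found by its title-bar text, which is a placeholder here.
bool AdoptWindow(HWND hHost, const wchar_t* targetTitle)
{
    HWND hTarget = FindWindowW(nullptr, targetTitle);
    if (!hTarget)
        return false;

    // Cross-process caveat: after SetParent the two windows share an input
    // queue, so focus and activation changes become synchronous between them.
    if (!SetParent(hTarget, hHost))
        return false;

    // Strip the caption so it looks embedded, then fit it to our client area.
    LONG style = GetWindowLongW(hTarget, GWL_STYLE);
    style = (style & ~(WS_POPUP | WS_CAPTION)) | WS_CHILD;
    SetWindowLongW(hTarget, GWL_STYLE, style);

    RECT rc;
    GetClientRect(hHost, &rc);
    MoveWindow(hTarget, 0, 0, rc.right - rc.left, rc.bottom - rc.top, TRUE);
    return true;
}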
I have a friend who's working at a company that offers pretty poor support for its developers (scoring a 1/12 on the Joel Test).
Their build process is locked down pretty tight, and depending on the size of the project it could take 40+(x2) mouse clicks to deploy. So I thought, "Hey, why not automate the clicks using the win32api?" (Specifically using Python). I've built him a really nice tool that works just fine except for one issue - the tool that they use has a navigation pane that may or may not be open.
You can open and close it with a button press, but I'm not sure how I could make sure it was either open or closed. It's irrelevant to the build process - the only problem is that it alters where the mouse needs to click on the screen depending on its open status. The application is written in .NET and it exposes a function call that applications are able to use to toggle the panel, so I've been looking around for ideas and so far I've got two of them:
Attach to the process via a debugger and execute the function call somehow.
Take a screenshot at the location of the panel's title bar (which I've got through the win32 API and which doesn't appear to change regardless of whether the panel is hidden or not).
Is there an easier way to figure out the state of this panel? The developers are given an admin account on their machine in addition to their regular account, so I can entertain ideas that require admin access, though I don't think that should be necessary?
UPDATE:
It looks like there's a button that can close the pane. In UIAVerify, something shows up as "text" "Navigation" "btnClose". It says its AutomationId is btnClose, but it's a ControlType.Text.
What technology is this panel built with? Is it standard GDI or WPF? If it's GDI, it should have an HWND. You should be able to find this HWND through either a class name or window title. Once you have the HWND, you can get its width.
If it's built with WPF, er, I have no idea, but Snoop does this kind of thing, so I know it's possible.
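If the pane does turn out to be a plain HWND, something like the following sketch could report its state. It is C++, but the same calls are available from Python through pywin32 (win32gui.FindWindowEx, IsWindowVisible, GetWindowRect). The "Navigation" window text and the width threshold are assumptions; verify them with Spy++ or UIAVerify first.

#include <windows.h>

// Infer whether the navigation pane is expanded from its window width.
bool IsNavigationPaneOpen(HWND hMainWindow)
{
    // Look for a direct child whose window text is "Navigation"; if the pane
    // is nested deeper, walk the hierarchy with EnumChildWindows instead.
    HWND hPane = FindWindowExW(hMainWindow, nullptr, nullptr, L"Navigation");
    if (!hPane || !IsWindowVisible(hPane))
        return false;

    RECT rc;
    GetWindowRect(hPane, &rc);
    // A collapsed pane usually keeps only a thin strip (or zero width).
    return (rc.right - rc.left) > 50;
}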
I'm facing a problem for an application I'm writing (http://code.google.com/p/blazingstars/issues/detail?id=25), where my program is a menulet (menu bar) application that uses the Accessibility API to interact with and control another program. I do the usual things like registering for the API notifications and getting the window list through API calls, etc., but I realized a while ago that if my program is started in a second Space (virtual desktop) after the program I'm interacting with is started in the first, my program will crash and burn because it can't access any information about its target. (Is there a way around that problem I'm missing?)
A simple solution would be to popup a dialog asking the user to restart the program in the correct Space, but for the life of me I can't figure out how to tell which Space my target is in, either through NSWorkspace or the Accessibility API, so that I can compare it to the Space that I'm in. Any ideas?
Note that setting the collection behaviour to NSWindowCollectionBehaviorCanJoinAllSpaces isn't going to do me any good because I have to do a bunch of work upon launch, so I have to be in the same space as my target right from the start.
I think you can do this with the APIs in CGWindow.h.
Specifically see CGWindowListCopyWindowInfo() and kCGWindowWorkspace.
I've used these APIs to do all types of things like getting window contents, window frames, etc...
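As a rough sketch of how that lookup could go (C-level CoreGraphics, shown here as C++), the code below walks the window list and returns the workspace number for the first window owned by a given PID. Getting the target PID, and the assumption that kCGWindowWorkspace is actually populated on the OS version in question, are left outside this snippet.

#include <sys/types.h>
#include <ApplicationServices/ApplicationServices.h>

// Return the Space (workspace) number of the first window owned by
// targetPID, or -1 if none is found.
int WorkspaceForPID(pid_t targetPID)
{
    int workspace = -1;
    CFArrayRef windows = CGWindowListCopyWindowInfo(kCGWindowListOptionAll,
                                                    kCGNullWindowID);
    if (!windows)
        return workspace;

    for (CFIndex i = 0; i < CFArrayGetCount(windows); i++) {
        CFDictionaryRef info =
            (CFDictionaryRef)CFArrayGetValueAtIndex(windows, i);
        CFNumberRef pidRef =
            (CFNumberRef)CFDictionaryGetValue(info, kCGWindowOwnerPID);
        CFNumberRef wsRef =
            (CFNumberRef)CFDictionaryGetValue(info, kCGWindowWorkspace);
        if (!pidRef || !wsRef)
            continue;

        int pid = 0;
        CFNumberGetValue(pidRef, kCFNumberIntType, &pid);
        if (pid == (int)targetPID) {
            CFNumberGetValue(wsRef, kCFNumberIntType, &workspace);
            break;
        }
    }
    CFRelease(windows);
    return workspace;
}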
If that doesn't work then you might want to try this private API:
extern CGSError CGSGetWindowWorkspace(const CGSConnectionID cid,
CGSWindowID wid,
CGSWorkspaceID *workspace);
The trick would be getting the connection ID of the target process.
You should probably redesign your app so that it delays its initialization until the app you want to control is in the current space.
There is no easy way to do this under Leopard because there are no official "space change" notifications, but the blog post and comments on this page may help.
Is it possible to embed Finder functionality in a cocoa app, now that Finder is itself cocoa (assuming the app were to function only in snow leopard)?
What I mean is to have a file browser pane as part of the app, actually browsing the file system itself (to edit in another pane), but without writing all the functionality of the Finder. Thanks!
The Finder itself is just an application. It is not a components library nor a framework. While you cannot "embed" Finder functionality in your application, you can influence Finder functionality and invoke Finder functionality.
First off, you can attach Folder Actions to folders. These will trigger when a user does something to the contents of a folder - for instance, drops a file into it. You set this up in the Finder. You should learn a little AppleScript if interacting with the Finder is something you want to do.
Second, since the Finder supports AppleEvents, you can affect the Finder using AppleScript. Take a look at My First AppleScript and My First AppleScript Part II to see how to do this. Here is much more in-depth information, in AppleScript Overview: Scripting with AppleScript. Here is some More Finder Scripting.
Third, there is also support for developing ways for the Finder to do complex things for the user at the click of a button, using Automator (Mac OS X 10.5). You can also create a Service in Automator, beginning in Mac OS X 10.6 ("Snow Leopard"). Take a look at Automator and Finder Actions in Mac OS X 10.6 for an introduction to this latter technique.
Even though Finder windows themselves are not an embeddable component, if you really want to provide the ability to Open, Print, Delete, Duplicate, etc. Files/Folders, and navigate from Folder to Folder, you can develop a simple Folder browser in your application.
It should not be a huge amount of work to do this, so long as you do not set your sights on mimicking the Finder or duplicating all of its functionality - just the essential basics I have mentioned.
You would need to know how to program the Macintosh, however - not just use AppleScript. The normal way to do this would be to learn the Objective-C programming language and the Cocoa framework. You would need to get familiar with writing applications using a Model-View-Controller architecture.
You would create a subclass of NSObject named something like MyFile, and a subclass of a collection class named something like MyFolder. When the application creates the browsing windows, and each time the application activates (becomes the frontmost application), you would refresh the contents of the browsing window.
You could put a menu in your menu bar with commands in it: Open, Print, Delete, Duplicate. When the user performs one of those commands, your application carries out the appropriate actions itself or sends the request to the Finder. After the action has been completely carried out, you refresh the browsing window for the currently displayed Folder, or the newly displayed Folder if the user navigated to a different one.
If you are familiar with design patterns, object-oriented programming, and frameworks in general, reading up on Cocoa Design Patterns will speed up your learning immensely.
These are various techniques you can use to harness some of the power of the Finder. As you look these over, I suggest getting very clear in your mind just what benefit this brings to the user of your application. Writing down what the overall objective of this feature is, and what commands you wish to support, will make it easier to choose the path you take in developing it.
The user can always bring up a Finder folder window with a single click, since the Finder is always running. So avoid simply duplicating that functionality for its own sake. Focus on the benefit you are providing the user. Make sure that you handle the situations where the user updates the Folder whose contents you are showing from another application and then switches back to your application.
No, they have not made Finder simply a host for a framework, like Preview. You still have to write this yourself.
I'm writing a routine to provide user-definable keyboard shortcuts for any menu item in my Windows Mobile 5 application, which is in C++/MFC. To do this I am getting all of the available menu command IDs and using CWnd::PostMessage(WM_COMMAND, MyMenuID) to post them to the application. I use this technique to good effect elsewhere for inter-thread comms, but not with menu command IDs. Any ideas why this doesn't work? The app is document/view, and I have tried posting to both the CMainFrame and the CView-derived windows. I could write a god-awful switch statement, but I feel posting a message should work.
Edit: OK, I've tried a number of things, including suggestions from this post, to no avail. Big ugly switch statement it is for now; I'll update again if I find anything better.
The only reason I can think of is the message is going to the wrong window. Don't forget that not all menu commands are always processed by a particular window. Some menu commands like Cut are usually processed by a view window. Others are processed by frame windows and some possibly by the application object.
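For what it's worth, a minimal MFC sketch of sending the command through the main frame, so MFC's normal command routing (frame, active view, document, application) gets a chance to find the handler, might look like this; the command ID is a placeholder:

#include <afxwin.h>

// Dispatch a menu command ID as if it had been chosen from the menu.
void DispatchShortcut(UINT nCommandID)
{
    CWnd* pMainWnd = AfxGetMainWnd();   // normally the CMainFrame window
    if (pMainWnd == NULL)
        return;

    // WM_COMMAND from a menu: LOWORD(wParam) = command ID,
    // HIWORD(wParam) = 0, lParam = 0 (no control HWND).
    pMainWnd->PostMessage(WM_COMMAND, MAKEWPARAM(nCommandID, 0), 0);

    // If the posted message still gets lost, try SendMessage instead so the
    // command is routed synchronously:
    // pMainWnd->SendMessage(WM_COMMAND, MAKEWPARAM(nCommandID, 0), 0);
}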
Hope this helps.