I'm writing an application with plugin support. I use C++ and JUCE for this, and I want my application to run on Windows and Mac OS X (and maybe Linux some day).
My plugins have their own GUIs. The usual way to display a GUI from a shared library seems to be the following (a rough code sketch follows the list):
create a new window
get the native handle for it
pass it on to the library
let the library attach its GUI to that handle.
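In JUCE terms, that flow looks roughly like the sketch below. The Plugin interface and attachGUI() are hypothetical stand-ins for whatever your plugin API actually exposes; Component::getWindowHandle() is the real JUCE call for obtaining the native handle.

#include <juce_gui_basics/juce_gui_basics.h>

// Hypothetical plugin-side interface -- a stand-in for your real plugin API.
struct Plugin
{
    virtual ~Plugin() = default;
    virtual void attachGUI (void* nativeWindowHandle) = 0;  // HWND / NSView*
};

// Host side: hand the plugin the native handle of an on-screen component.
void openPluginEditor (Plugin& plugin, juce::Component& hostWindowContents)
{
    // getWindowHandle() returns the HWND (Windows) or NSView* (macOS) of the
    // component's top-level native window, once the component is on screen.
    if (void* handle = hostWindowContents.getWindowHandle())
        plugin.attachGUI (handle);
}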
AFAIK that always requires creating a new window for the plugin's GUI. My problem is: I would like the plugins' GUIs to appear inside the GUI of my host application (i.e. not as separate windows).
I think this is a common thing to do, but I just can't find any established approach for it. How would you solve such a problem?
Thank you very much for ideas and hints!
I am trying to find the best way to handle application windows on Mac OS from a script. I am open to any language, but I want it to have the following capabilities:
Get list of open applications
Get location of specific application
Change focus, position, and size of specific application
Send key events to only specific application
I have managed to get a list of applications using JNA, Apple's Core Graphics library, and some code from this question, but the capabilities seem to be limited. Is there a way to implement the functionality I am looking for, or is Mac OS too closed down for this? Like I said, I am open to all languages.
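For reference, the Core Graphics enumeration mentioned above looks roughly like this minimal C++ sketch (link with -framework CoreGraphics -framework CoreFoundation; which fields are present per window varies):

#include <CoreGraphics/CoreGraphics.h>
#include <CoreFoundation/CoreFoundation.h>
#include <cstdio>

int main()
{
    // Ask the window server for every on-screen window.
    CFArrayRef windows = CGWindowListCopyWindowInfo (kCGWindowListOptionOnScreenOnly,
                                                     kCGNullWindowID);

    for (CFIndex i = 0; i < CFArrayGetCount (windows); ++i)
    {
        auto info  = (CFDictionaryRef) CFArrayGetValueAtIndex (windows, i);
        auto owner = (CFStringRef) CFDictionaryGetValue (info, kCGWindowOwnerName);

        char name[256];
        if (owner != nullptr
            && CFStringGetCString (owner, name, sizeof (name), kCFStringEncodingUTF8))
            std::printf ("%s\n", name);   // one line per window's owning application
    }

    CFRelease (windows);
    return 0;
}

Note that this API is read-only: moving, resizing, or focusing another application's windows generally goes through the Accessibility API (AXUIElement) instead, which requires the user to grant access in System Preferences.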
Thanks so much!
For a project I have been assigned, I have been given two robots: one runs ROS and the other basically uses Windows. My task is to develop one graphical user interface that can be used for both robots.
From the GUI, a user should be able to:
- Connect to the robot
- Move and control the robot
- Change speed, etc.
I would like to ask for advice as I am about to start this project. How can I go about this, and which framework has better support for my requirements?
From my research I have read that people recommend Qt for cross-platform development. Are there any other alternatives? Any book recommendations?
The goal is to have a GUI that is compatible with both systems. Any recommendations or help are welcome.
First, set up ROS on Windows using WSL (there are other ways to do it, but WSL is the most stable).
After that, make sure everything you want the GUI to do can be done on the robot from a ROS terminal.
Then write the GUI. You can choose any framework you want (you need C++ or Python for compatibility with ROS), but Qt is the most widely used framework for multi-platform applications and has a lot of support.
Compatibility with the non-ROS robot is something you will have to implement in your application yourself, e.g. by letting the user choose which robot to connect to.
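To make the suggestion concrete, here is a minimal sketch of a Qt control panel publishing velocity commands over ROS. It assumes ROS 1 and the common cmd_vel topic convention; the node name, topic, and speed are placeholders for whatever your robot expects.

#include <QApplication>
#include <QPushButton>
#include <ros/ros.h>
#include <geometry_msgs/Twist.h>

int main (int argc, char** argv)
{
    ros::init (argc, argv, "robot_gui");           // hypothetical node name
    ros::NodeHandle nh;
    ros::Publisher cmdVel =
        nh.advertise<geometry_msgs::Twist> ("cmd_vel", 10);

    QApplication app (argc, argv);
    QPushButton forward ("Move forward");

    // Publish a velocity command each time the button is pressed.
    QObject::connect (&forward, &QPushButton::pressed, [&cmdVel] {
        geometry_msgs::Twist msg;
        msg.linear.x = 0.2;                        // m/s -- pick a safe speed
        cmdVel.publish (msg);
    });

    forward.show();
    return app.exec();                             // Qt event loop drives the GUI
}

For the non-ROS robot, the same button handler would instead call into whatever API that robot's Windows SDK provides.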
PySimpleGUI is a framework built on top of tkinter that runs on the Pi. There are some example programs written to do robot remote control. There are GUI buttons designed specifically for "realtime" control of hardware that will provide immediate and constant feedback when a button is held.
It runs on Python 2.7 and 3 (3 is recommended).
There is a recipe in the Cookbook that matches your problem, located here.
If you use PySimpleGUI in your project and have any questions, post in the Issues area on GitHub and you'll get support.
I have a Qt application that needs to be used from a VST plugin. However embedding a Qt application into a plugin seems like an incredibly complex task (because of the QCoreApplication event loop, because the host might also use a conflicting version of Qt, and because the plugin needs to find its own set of Qt libraries).
As a workaround I'd like to render my standalone Qt application to the VST plugin's window (for which I know the HWND/NSView).
It's easy to do on Windows, but a little more tricky on macOS.
tl;dr: I've read about NSWindowSharingType / NSWindowSharingReadWrite, which seems to offer what I need on macOS (rendering one process's window into another process's window), but I can't find any example using it.
Does anybody know about this and how to use it? Or any other way that would allow me to render my Qt widgets into an NSView from a different process?
The solutions for this are quite nominal:
Your copy of Qt must be put into its own unique namespace (Qt's configure script supports this via the -qtnamespace option), i.e. you have to build your own Qt. In a professional setting you're supposed to be doing this anyway.
The QCoreApplication event loop is fully integrated with NSRunLoop. You don't need to call exec() other than to prime the event loop: i.e. quit the event loop as soon as it is started, and let the host application do the rest. The idiom for this is:
QTimer::singleShot(0, app, &QCoreApplication::quit); // quit as soon as the loop starts
app->exec();                                         // primes the event loop, then returns
// return to the host app here; NSRunLoop now drives Qt's event dispatch
The plugin can and should bundle its own Qt, either as a bundled framework, or through static linking.
You can also pass an NSView* to QWindow::fromWinId(), IIRC.
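A rough sketch of that route, assuming Qt 5 and that the plugin runs Qt in-process as described above (hostView stands for the NSView* or HWND the VST host hands to the plugin):

#include <QWindow>
#include <QWidget>

// Re-parent a Qt widget into the host's native view.
void embedInto (void* hostView, QWidget* pluginUi)
{
    QWindow* hostWindow = QWindow::fromWinId (reinterpret_cast<WId> (hostView));
    pluginUi->winId();                                // force creation of a native window
    pluginUi->windowHandle()->setParent (hostWindow); // attach it to the host's view
    pluginUi->show();
}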
I am new to the Mac OS X plug-in world. I saw a few plug-ins in the "/Library/Internet Plug-Ins" directory on my Mac (10.10) and was wondering if it is possible to use any of these directly. By directly I mean loading and creating an instance of a class inside these plug-ins and using it in my Cocoa OS X application.
The plug-in that I want to use is NPAPI based. Is it possible to load this plug-in directly (outside of a browser) and use it as a component?
You could create your own NPAPI host, and if it does a good enough job of implementing the NPAPI architecture then you would be able to load plugins with it.
It's not going to be a trivial undertaking, however.
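To give a feel for what is involved, here is a heavily simplified sketch of just the first step: loading a plug-in binary and resolving its NPAPI entry points. The struct declarations are stand-ins for the real npapi.h/npfunctions.h headers, and the path is a made-up example; everything past this point (implementing working browser-side NPNetscapeFuncs callbacks) is where the real work lies.

#include <dlfcn.h>
#include <cstdio>

// Stand-ins for the real NPAPI headers (npapi.h / npfunctions.h).
typedef short NPError;
struct NPNetscapeFuncs;   // browser-side callback table your host must fill in
struct NPPluginFuncs;     // plugin-side function table filled in by the plugin

typedef NPError (*NP_InitializeFunc)     (NPNetscapeFuncs*);
typedef NPError (*NP_GetEntryPointsFunc) (NPPluginFuncs*);

int main()
{
    void* lib = dlopen ("/Library/Internet Plug-Ins/Example.plugin/Contents/MacOS/Example",
                        RTLD_NOW);
    if (lib == nullptr)
    {
        std::fprintf (stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    auto initialize     = (NP_InitializeFunc)     dlsym (lib, "NP_Initialize");
    auto getEntryPoints = (NP_GetEntryPointsFunc) dlsym (lib, "NP_GetEntryPoints");

    std::printf ("NP_Initialize: %p, NP_GetEntryPoints: %p\n",
                 (void*) initialize, (void*) getEntryPoints);

    // A real host would now fill an NPNetscapeFuncs table with working
    // callbacks, call NP_Initialize, then NP_GetEntryPoints, then NPP_New.
    dlclose (lib);
    return 0;
}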
I'd like to be able to switch the sound output source in Mac OS X without any GUI interaction.
There are tools to control the sound output, such as SoundSource, and an AppleScript to open the preferences dialog.
What I am looking for is something that switches the preference instantly, like SoundSource, but it has to be scriptable. The goal is to switch between my digital and analog output with one keystroke. I have a helper application that will launch a program or AppleScript on one keypress. All I need now is the AppleScript or application that switches the sound source quickly without any user interaction.
I'm willing to write some Objective-C if that is what it takes, but I'm pretty much a newbie at Cocoa development.
Do you have a one-click solution, or can you point me to a good tutorial on controlling sound system preferences from a Cocoa app or the command line?
I created a command-line application to do exactly this.
You may download it at http://code.google.com/p/switchaudio-osx/downloads. Source code is available on the project site as well.
UPDATE (Dec. 2014): the code is now hosted on GitHub -- https://github.com/deweller/switchaudio-osx. It works just fine in Yosemite.
Don’t think of it in terms of preferences; there’s no centralized system preference framework for this sort of thing. I believe what you need to do is use Core Audio to set the kAudioHardwarePropertyDefaultOutputDevice and kAudioHardwarePropertyDefaultSystemOutputDevice properties of the AudioSystemObject (using AudioHardwareSetProperty()).
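A minimal sketch of that call, using the legacy AudioHardwareSetProperty API named above (obtaining the AudioDeviceID in the first place, e.g. by enumerating kAudioHardwarePropertyDevices, is omitted here):

#include <CoreAudio/CoreAudio.h>

// Make 'deviceID' the default output device and the device used for
// system sounds (alerts etc.). Link with -framework CoreAudio.
OSStatus setDefaultOutputDevice (AudioDeviceID deviceID)
{
    OSStatus err = AudioHardwareSetProperty (kAudioHardwarePropertyDefaultOutputDevice,
                                             sizeof (deviceID), &deviceID);
    if (err != noErr)
        return err;

    return AudioHardwareSetProperty (kAudioHardwarePropertyDefaultSystemOutputDevice,
                                     sizeof (deviceID), &deviceID);
}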