OpenMP with OpenCV on OS X - Xcode

I'm having a problem getting OpenMP and OpenCV to play nicely with a new project in Xcode. The project in its current state does nothing but grab frames from the default camera and put them into a window. This functionality works. However, I would like to grab the frames in a separate thread, and I was hoping I could get some experience with OpenMP.
Merely checking the checkbox to enable OpenMP in Xcode wreaks havoc. The program, while it will compile, load and run just fine, will not respond to any events -- period. It just sits there, drawing grabbed frames. (I do get the OS X beachball, too, even though it's running fine.) I eventually have to force quit the app or kill it from Xcode. Keep in mind I get this behavior even without any OpenMP #pragmas -- I have only to enable the option in Xcode.
Any ideas on how to solve this?

I'm just guessing here. You might need to make sure that all OpenGL drawing commands are called from one thread.
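In the same spirit, and assuming the goal is still just "grab frames on one thread, show them on another", here is a minimal, hypothetical sketch (not the poster's project) of how OpenMP could split that work while the initial thread keeps all HighGUI calls to itself. It assumes OpenCV's C++ API (cv::VideoCapture) and a compiler with OpenMP enabled:
#include <opencv2/opencv.hpp>
#include <omp.h>

int main()
{
    cv::VideoCapture cap(0);            // default camera
    if (!cap.isOpened())
        return 1;

    cv::Mat latest;                     // most recently grabbed frame
    bool quit = false;

    #pragma omp parallel num_threads(2) shared(cap, latest, quit)
    {
        if (omp_get_thread_num() != 0)  // worker thread: grab frames
        {
            cv::Mat frame;
            for (;;)
            {
                bool stop;
                #pragma omp critical(state_lock)
                stop = quit;
                if (stop)
                    break;

                cap >> frame;
                #pragma omp critical(frame_lock)
                frame.copyTo(latest);
            }
        }
        else                            // thread 0 (the initial thread): all GUI work
        {
            cv::namedWindow("camera");
            for (;;)
            {
                cv::Mat display;
                #pragma omp critical(frame_lock)
                latest.copyTo(display);

                if (!display.empty())
                    cv::imshow("camera", display);
                if (cv::waitKey(10) >= 0)   // any key quits
                    break;
            }
            #pragma omp critical(state_lock)
            quit = true;
        }
    }
    return 0;
}
The key point is that only thread 0 (the thread that entered the parallel region, i.e. the main thread) ever touches the window; the second thread only grabs and hands frames over under a critical section.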

Related

macOS app: proc_listpids() fails, but works in another program

I am new to macOS and especially its lower-level internals. I built a CLI program to inject a dylib into a vulnerable process. It works fine, but I wanted to make a GUI program to do this, and now on my call to
proc_listallpids(NULL, 0);
the return value is always 0. Looking into some source code I found:
if ((error = proc_security_policy(PROC_NULL, PROC_INFO_CALL_LISTPIDS, type, NO_CHECK_SAME_USER)))
return (error);
But I was not able to find the source code for this function to figure out why it fails (assuming this is the reason it does fail).
Is proc_listallpids() not allowed to be called from apps? If so, is there a way I can still make a GUI program that uses this function? I thought this might have to do with App Store restrictions, but I wouldn't want my program on the App Store anyway.
Also, I was wondering whether there is a better site to go to for topics like this.
The answer was App Sandbox; I'm not sure if this is worth keeping around or not. I turned it off in the project settings and the function works now.
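For reference, here is a minimal, hypothetical sketch (not the asker's code) of the usual two-call pattern for proc_listallpids() from <libproc.h>: the first call with no buffer asks how many PIDs exist, the second fills the buffer. As noted above, it only returns a useful count once App Sandbox is disabled for the target:
#include <libproc.h>
#include <sys/types.h>
#include <cstdio>
#include <vector>

int main()
{
    // First call with no buffer: ask how many PIDs there are.
    int count = proc_listallpids(nullptr, 0);
    if (count <= 0)
    {
        std::perror("proc_listallpids");   // with App Sandbox enabled, you end up here
        return 1;
    }

    // Second call fills the buffer; leave a little slack for processes spawned in between.
    std::vector<pid_t> pids(count + 16);
    count = proc_listallpids(pids.data(), (int)(pids.size() * sizeof(pid_t)));
    for (int i = 0; i < count; ++i)
        std::printf("%d\n", pids[i]);
    return 0;
}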

Broken text in Qt Creator UI

Sometimes (not always), certain text items in the Qt Creator UI are rendered broken.
Any idea what causes it? Or a workaround?
Maybe the problem is my cheap video card (an Intel on-board one). This theory is supported by the fact that Creator is probably QML-powered by now, meaning it's running on OpenGL.
I tried restarting Creator and that fixed it, but after switching between the Welcome and Edit tabs a few times, it happens again.
I tried making Creator's UI use the software QML renderer rather than OpenGL, in the hope that it would render correctly. I did that by running the following in cmd.exe:
> set QMLSCENE_DEVICE=softwarecontext
> qtcreator.exe
But that didn't fix it.
It turns out the solution is the same as for the problem I posted next, i.e. making the app (Qt Creator in this case) use ANGLE. Except I don't want to recompile Qt Creator just to add one line of code, so I use the alternative approach from the Qt doc page, which is to set the environment variable QT_OPENGL to angle before launching Creator.
At least so far it hasn't bugged out on me.
Note: The setting of an env var can be done conveniently with a batch file as described here.
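For completeness, here is what the "one line of code" alternative looks like for an application you build yourself (so not applicable to a prebuilt Qt Creator): since Qt 5.3 you can request the ANGLE/OpenGL ES backend programmatically instead of setting QT_OPENGL=angle in the environment. A minimal, hypothetical sketch:
#include <QApplication>

int main(int argc, char *argv[])
{
    // Must be set before the Q(Core)Application object is constructed.
    QCoreApplication::setAttribute(Qt::AA_UseOpenGLES);  // routes Qt's rendering through ANGLE on Windows
    QApplication app(argc, argv);
    // ... create and show your widgets / QML scene as usual ...
    return app.exec();
}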

Mac OS X App fails to execute completion handler in release build but works in debug build

I know this is rather vague, but does anyone have any pointers as to why a completion handler does not get run in the release build but does run in the debug build?
I have no idea where to begin trying to solve this, other than changing the code to not use a completion handler.
Any pointers would be welcome.
App is written in Swift and built for OS X 10.10
UPDATE
I just modified the code to post an NSNotification rather than run the completion handler, and now things work fine. I still have the completion handler code in, but it's not being called.
We can't read minds. Can't help you if you don't show us the real code.
BUT: it is possible that you've come across a compiler bug. I've had a similar problem, where the Swift compiler would generate incorrect code in an optimized (Release) build, while the same thing worked fine in Debug. (rdar://18906781) I've also heard of other production apps written in Swift that simply do not work correctly in optimized builds (and so are shipped unoptimized).
Try reworking the code to achieve the same thing with different code — maybe you'll work around the bug.
If you have the time, I encourage you to narrow the bug down as much as possible into a test case and report it at http://bugreport.apple.com.

Same QtOpenGL code runs about 15 times slower when moving to Carbon (vs. Cocoa)

I'm developing a very simple application for the Mac OS X platform, using Qt and OpenGL (and QtOpenGL) so that going cross-platform is easier.
The application receives a variable number of video streams that have to be rendered to the screen. Each frame of these video streams is used as a texture mapped onto a rectangle in 3D space (very similar to a video wall).
Apart from things such as receiving, locking, and uploading video data, and synchronizing threads, I consider it clear that it's a quite simple application.
The fact is that everything behaves OK when using the Cocoa-based Qt 4.7 binaries (the default ones) on a 10.5 Mac.
But my code has to run fine on all OS X versions from 10.4 (inclusive) onwards. So I tried the code on a 10.4 machine and it crashed right at startup. After a few hours of reading online, I discovered that for a Qt application to target 10.4, the Carbon-based Qt has to be used. So I rebuilt the whole project against the new framework.
When the resulting binary is run, everything works well except that the application's frame rate falls to about 2 fps! And it behaves the same on both machines (the 10.5 computer has noticeably better specs).
I've spent quite some time working on this but have not reached a solution. Any suggestions?
More information about the application and things I've tried:
the code was not modified when recompiling against the Carbon-based Qt
only two videos (256x256 textures) are used, to rule out a bandwidth problem (although I know it shouldn't be one, because the first build worked)
the 2 video streams arrive over the (local) network
when a video frame arrives, a signal is emitted and the data is uploaded to an OpenGL texture (glTexSubImage2D)
a timer triggers the render (paintGL) about every 20 ms (~50 fps); this pattern is sketched in the code after this question
the render code uses the textures (updated or not) to draw the rectangles
rendering only when a frame arrives won't work because there are 2 (asynchronous) video streams; besides, more things have to be drawn on screen
only basic OpenGL commands are used (no PBOs, FBOs, VBOs, ...); the only potentially problematic thing could be the use of shaders (available only from Qt 4.7), but their code is trivial
I've used OpenGL Profiler and Instruments; nothing special/strange was observed
Some things I suspect (conclusions):
it's clearly not a hardware issue; the same computer behaves differently
it feels like a threading/locking problem, but why?
Carbon is 32-bit while the 10.5 application was 64-bit, and it's not possible to develop 64-bit with Carbon
to rule out 32-bit as the cause, I also rebuilt the first (Cocoa) project as 32-bit; it behaved practically the same
I've read something about Carbon having problems (more than usual) with context switching
maybe the OpenGL implementation is multithreaded and the code is not (or the opposite)? That could cause a lot of stalls
maybe Carbon handles events differently from Cocoa (I mean signal/event dispatching, the main loop, ...)?
OK, this (sorry for the long write-up) is my current headache. Any suggestion or idea would be very much appreciated.
Thanks in advance.
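For clarity, here is a rough, hypothetical sketch (not the poster's code) of the upload/render pattern described in the list above: frames are uploaded with glTexSubImage2D as they arrive, and a ~20 ms QTimer drives the repaints of a QGLWidget. Class and method names are made up for illustration:
#include <QApplication>
#include <QGLWidget>
#include <QTimer>

class StreamWidget : public QGLWidget
{
public:
    explicit StreamWidget(QWidget *parent = 0) : QGLWidget(parent), m_tex(0)
    {
        QTimer *timer = new QTimer(this);
        connect(timer, SIGNAL(timeout()), this, SLOT(updateGL()));  // updateGL() is an existing QGLWidget slot
        timer->start(20);                                           // repaint roughly every 20 ms (~50 fps)
    }

    // In the real application this would be a slot connected to the
    // network decoder's "frame arrived" signal.
    void uploadFrame(const unsigned char *rgba256x256)
    {
        makeCurrent();
        glBindTexture(GL_TEXTURE_2D, m_tex);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 256, 256,
                        GL_RGBA, GL_UNSIGNED_BYTE, rgba256x256);
    }

protected:
    void initializeGL()
    {
        glGenTextures(1, &m_tex);
        glBindTexture(GL_TEXTURE_2D, m_tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, 0);                 // allocate storage, no data yet
    }

    void paintGL()
    {
        glClear(GL_COLOR_BUFFER_BIT);
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, m_tex);
        glBegin(GL_QUADS);                                          // one textured rectangle of the "video wall"
        glTexCoord2f(0, 0); glVertex2f(-0.5f, -0.5f);
        glTexCoord2f(1, 0); glVertex2f( 0.5f, -0.5f);
        glTexCoord2f(1, 1); glVertex2f( 0.5f,  0.5f);
        glTexCoord2f(0, 1); glVertex2f(-0.5f,  0.5f);
        glEnd();
    }

private:
    GLuint m_tex;
};

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    StreamWidget widget;
    widget.show();
    return app.exec();
}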
May I ask a diagnostic question? Can you ensure that it's not being passed to the software renderer?
I remember that when 10.4 was released, there was some confusion about Quartz Extreme, Quartz, and Carbon, with some of it disabled, and hardware renderers disabled by default on some of them, which required configuration by the end user to get things working correctly. I'm not sure whether this information is pertinent, because you say that, having targeted 10.4, the problem shows up on both the 10.4 and the 10.5 machine, yes?
It's possible (though admittedly I'm grasping at straws here) that even on 10.5 Carbon doesn't use the hardware renderers by default. I'd like to think that OS X prefers hardware renderers to software renderers in all scenarios, but it may be worth spending a little time looking into, given how thoroughly you're already looking into other options.
Good luck.
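If it helps, here is a minimal, hypothetical probe to answer that question: print which renderer the QGLWidget context actually got. On OS X, a GL_RENDERER string containing "Apple Software Renderer" means the build fell back to software rendering:
#include <QApplication>
#include <QGLWidget>
#include <QDebug>

class RendererProbe : public QGLWidget
{
protected:
    void initializeGL()
    {
        // Print which OpenGL implementation this context is bound to.
        qDebug() << "GL_VENDOR:  " << reinterpret_cast<const char *>(glGetString(GL_VENDOR));
        qDebug() << "GL_RENDERER:" << reinterpret_cast<const char *>(glGetString(GL_RENDERER));
        qDebug() << "GL_VERSION: " << reinterpret_cast<const char *>(glGetString(GL_VERSION));
    }
};

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    RendererProbe probe;
    probe.show();   // the GL context is created and initializeGL() runs when the widget is shown
    return app.exec();
}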
If you are using Qt, I guess your code would work on a Windows or Linux platform. Have you tried your application on these platforms?
That would quickly reveal whether the problem comes from Qt or from the Mac OS X version.

How to use GLUT outside the main thread on OS X?

I once tried to open a GLUT window from a sub-thread and got lots of nasty problems. I remember this post on lists.apple.com:
GLUT functions may only be called from the application's main thread
Has anything changed in this regard with GLUT on Mac OS X? Is there a thread-safe GLUT that lets you open windows from any thread?
If GLUT is not an option, is there a tiny library that replaces GLUT and would work from any thread?
[edit]
Here is the result of my tests triggered by the various solutions proposed as answers:
GLFW looked nice but did not compile (current branch is 3 years old)
Agar was another contender, but it's too big for the tiny need I had
SDL is not BSD-license compatible, and it's a huge library for code that should fit in a single file
GLUT cannot run in any thread.
I decided to reinvent the wheel (yes, that's good sometimes) and the final class is just 200 lines of code. It lets me open and close a window from any thread (OpenGL drawing happens in a new thread) and I have full control over vertical sync and such (SDL uses double buffering = slow for OpenGL). I had to work around NSApp a bit to properly start and stop the application (which does not use an event loop otherwise).
To those telling me that OpenGL is not thread-safe: that's not exactly true. You can run OpenGL from multiple threads, and draw commands are executed against the context that is current on the calling thread. OpenGL state is thread-specific.
If anyone needs some bare-bones code to create OpenGL windows using Cocoa: gl_window.mm
GLUT is not thread safe. You'll need locking primitives with whatever solution you choose to implement. I'd recommend setting up your own GL view in Cocoa and rewriting the plumbing that GLUT provides.
Take a look at SDL as a modern GLUT replacement. It should give you all the cross-platform support you want. As far as cross-platform threading goes, Boost provides a portable library.
As a replacement for GLUT, have a look at GLFW. It's similar in purpose and workings, but better. And it does not have a glfwMainLoop that your program is stuck with; it allows you full control. Never since I discovered GLFW have I had a need to switch back to GLUT.
Note that GLFW is not thread-safe, in the sense that it is unsafe to call GLFW functions from different threads (FAQ entry). However, as long as you call all GLFW functions from the same thread, it's your choice which thread that will be.
Not only is GLUT not thread-safe, but OpenGL is a state machine and therefore isn't thread-safe either. Having said that, you can have multithreaded applications that use OpenGL. Just make sure all your OpenGL calls are made from the same thread.
The next step up from GLUT on Mac OS X is the Cocoa OpenGL Sample Code. This is a true Cocoa application that demonstrates the Cocoa way of setting up an OpenGL window, with interactivity using the Cocoa event model. From this starting point, it's fairly easy to add code to handle your program logic in a separate thread (or threads) from your OpenGL drawing code.
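To make the "one thread owns all GL calls" advice concrete, here is a minimal, hypothetical sketch using GLFW 3.2+ (a much newer release than the branch the poster tried): the main thread creates the window and pumps events, which Cocoa requires, while a single worker thread makes the context current and issues every OpenGL call:
#include <GLFW/glfw3.h>
#include <atomic>
#include <thread>

int main()
{
    if (!glfwInit())
        return 1;

    // Window creation and event processing stay on the main thread (required on macOS).
    GLFWwindow *window = glfwCreateWindow(640, 480, "render thread demo", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }

    std::atomic<bool> running(true);

    // All OpenGL work happens on this one thread; the context is current here only.
    std::thread renderThread([&]() {
        glfwMakeContextCurrent(window);
        while (running) {
            glClearColor(0.1f, 0.2f, 0.3f, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT);
            glfwSwapBuffers(window);
        }
    });

    while (!glfwWindowShouldClose(window))
        glfwWaitEventsTimeout(0.1);   // keep the main thread responsive to window events

    running = false;
    renderThread.join();
    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}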
