Corona SDK get OpenGL context from native code - opengl-es

Is there a way to get the OpenGL context from native code? Let's say I need to draw something from my Obj-C code: anything, some object, a complex Bezier curve, etc. I know that I need an Enterprise account, so I'm asking just about the OpenGL context. What would it look like, and how would I do it?

Corona Enterprise has APIs that allow you to interact with the Corona "Environment" from native code. But I don't think it is possible to add something inside the Corona OpenGL view (it may be possible, but Corona doesn't make it easy for you).
Usually, when you draw or add something from native code, such as an image, you add it to an overlay view that sits above the Corona OpenGL view. (In fact, that is why, when using Corona Pro, native objects always appear above your Corona elements.)

Related

layer control in iWatch using Xamarin

Is there any way to layer one control on top of another? I want to put one label on top of another to build out a word. I am using Xamarin Studio and programming for the iWatch (watchOS 2).
Unlike UIKit, WatchKit has no z-index or view hierarchy. As far as I know, the closest you can get to this is to use a WKInterfaceGroup with a custom background. The hard way, which may get you the result you are looking for, is to render an image and display it.

OpenGL ES in SmartEyeGlass

I'm looking at the Sony SmartEyeGlass and it seems like the only way to interact with the "augmented reality layer" (what's drawn on the glasses) is through the proprietary Sony APIs.
I'm wondering whether there is a way to let OpenGL ES manage this layer as a GLSurfaceView?
Or is there an alternative way to do 3D rendering on the glasses?
At the moment, there isn't a special API to connect with OpenGL. The way to achieve OpenGL rendering with SmartEyeglass is to render your content directly to a Bitmap and show it using SmartEyeglassControlUtils.showBitmap(Bitmap bitmap).
Here you will find a solution for how to render OpenGL to a Bitmap:
Run Android OpenGL in Background as Rendering Resource for App?
Please let us know in a comment what kind of application you need this OpenGL feature for.
Good luck.
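The linked approach boils down to rendering a frame offscreen and reading the pixels back so they can be handed to the bitmap-based display API. Below is a minimal, hypothetical C++/EGL sketch of that offscreen step (the SmartEyeglass answer itself goes through the Android Java APIs and SmartEyeglassControlUtils.showBitmap; the 640x480 size here is a placeholder, not the device's real resolution):

```cpp
#include <EGL/egl.h>
#include <GLES2/gl2.h>
#include <vector>
#include <cstdio>

// Offscreen rendering sketch: create a pbuffer-backed GLES2 context, draw a
// frame (a plain clear stands in for real content), then read the pixels back
// so they can be converted into a bitmap for display.
int main() {
    const int width = 640, height = 480;  // placeholder size

    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(dpy, nullptr, nullptr);

    const EGLint cfgAttribs[] = {
        EGL_SURFACE_TYPE,    EGL_PBUFFER_BIT,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8, EGL_ALPHA_SIZE, 8,
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint numCfg = 0;
    eglChooseConfig(dpy, cfgAttribs, &cfg, 1, &numCfg);

    const EGLint pbufAttribs[] = { EGL_WIDTH, width, EGL_HEIGHT, height, EGL_NONE };
    EGLSurface surf = eglCreatePbufferSurface(dpy, cfg, pbufAttribs);

    const EGLint ctxAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctxAttribs);
    eglMakeCurrent(dpy, surf, surf, ctx);

    // ... draw your scene here ...
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    // Read the finished frame back to CPU memory (RGBA, bottom-up row order).
    std::vector<unsigned char> pixels(width * height * 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
    std::printf("first pixel: %u %u %u %u\n",
                pixels[0], pixels[1], pixels[2], pixels[3]);

    eglMakeCurrent(dpy, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
    eglDestroyContext(dpy, ctx);
    eglDestroySurface(dpy, surf);
    eglTerminate(dpy);
    return 0;
}
```

On Android you would wrap those pixels into a Bitmap (via JNI, or by using the Java-side EGL14/GLES20 equivalents of the same calls) before passing it to showBitmap.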

Monodroid camera preview with OpenGL overlay

I have created an augmented reality application using Monodroid and it works fine on a technical basis. However, the graphics I used were drawn on a canvas and are really too slow.
The application is a simple heads-up compass and speed display, à la Luke Skywalker's binoculars.
I am trying to get a camera preview going with an OpenGL translucent/transparent overlay, and yes, I have read what's available, but it's all pure Android SDK / Java.
Does anyone know of a method for getting this effect in C# and Monodroid, possibly using the AndroidGameView? Whatever I do, I can see one or the other but never both at the same time.
http://bobpowelldotnet.blogspot.fr/2012/10/monodroid-camera-preview-as-opengl.html

Render a Win32 widget in the OpenGL context

Is it possible to intercept a control's paint event and make it draw in the OpenGL context?
I don't know if this is possible, but it amounts to writing your own GUI.
It would be simpler to use a complete OpenGL GUI library:
http://libufo.sourceforge.net/
If you're using Qt, there's a fun demo showing (working) Qt widgets rendering 3D in an OpenGL context. How useful that is to you depends on how hooked you are on the native Win32 controls specifically.
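The Qt demo mentioned above embeds live widgets in a GL scene; a simpler variant of the same idea is to paint the widget into a QImage with QWidget::render() and upload that image as a texture. The sketch below (Qt 5 style, with made-up class and variable names, and the actual textured-quad drawing left as a comment) shows the shape of that approach:

```cpp
#include <QApplication>
#include <QOpenGLWidget>
#include <QOpenGLFunctions>
#include <QPushButton>
#include <QImage>
#include <QPainter>

// Sketch: capture an ordinary widget's painting into a QImage each frame and
// upload it as an OpenGL texture, so it can be drawn inside a GL scene.
class GlCanvas : public QOpenGLWidget, protected QOpenGLFunctions {
public:
    explicit GlCanvas(QWidget *widgetToCapture) : m_source(widgetToCapture) {}

protected:
    void initializeGL() override {
        initializeOpenGLFunctions();
        glGenTextures(1, &m_tex);
    }

    void paintGL() override {
        // "Intercept" the widget's normal painting by rendering it offscreen.
        QImage img(m_source->size(), QImage::Format_RGBA8888);
        img.fill(Qt::transparent);
        QPainter painter(&img);
        m_source->render(&painter);
        painter.end();

        // Upload the captured pixels as a texture.
        glBindTexture(GL_TEXTURE_2D, m_tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, img.width(), img.height(),
                     0, GL_RGBA, GL_UNSIGNED_BYTE, img.constBits());

        glClearColor(0.1f, 0.1f, 0.15f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        // ... draw a textured quad with m_tex using your own shader/VBO setup ...
    }

private:
    QWidget *m_source = nullptr;
    GLuint m_tex = 0;
};

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);
    QPushButton button("I live in a texture");  // the control we want inside GL
    GlCanvas canvas(&button);
    canvas.resize(640, 480);
    canvas.show();
    return app.exec();
}
```

This doesn't help with native Win32 controls directly, but the same capture-to-texture pattern applies if you can get the control's pixels into a buffer you can upload (for example by asking the window to paint into a memory DC).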

What is the best way to create a nested OpenGL canvas?

I would like to write a library that draws some OpenGL content on a given window handle.
How can I initialize an OpenGL context inside a given window?
Is it possible to do that platform-independently using SDL or some other library?
Qt provides a very good, cross-platform mechanism for opening an OpenGL context and drawing into it. For details, see QtOpenGL.
You could use Open Producer or OpenSceneGraph to do it as well.
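If you go the SDL route, the minimal SDL2 pattern for getting a context and drawing into it looks like the sketch below. SDL_CreateWindowFrom() can wrap an existing native window handle instead of creating a new window, though driving OpenGL on such a foreign window needs extra care (newer SDL2 releases expose a hint for it); this example creates its own window to stay self-contained:

```cpp
#include <SDL.h>
#include <SDL_opengl.h>

// Minimal SDL2 sketch: create a window with an OpenGL context, clear it,
// present the frame, then tear everything down.
int main(int argc, char *argv[]) {
    (void)argc; (void)argv;
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        return 1;
    }

    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);

    // With an existing handle you would call SDL_CreateWindowFrom(nativeHandle)
    // here instead of SDL_CreateWindow().
    SDL_Window *win = SDL_CreateWindow("nested GL canvas",
                                       SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                       640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    glClearColor(0.2f, 0.2f, 0.25f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapWindow(win);
    SDL_Delay(2000);  // keep the window up long enough to see the clear color

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

With Qt, QOpenGLWidget (or the older QGLWidget from the QtOpenGL module) gives you the same thing, with the widget system managing the context for you.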
