How can I install libEGL.dylib on macOS?

I'm trying to use EGL via LWJGL on macOS for offscreen rendering.
It throws an exception about the missing library libEGL.dylib.
I couldn't find this library in package distribution services like Homebrew.
I did find several minor repositories on GitHub containing libEGL.dylib, but I don't think those are safe to use.
How can I find an official binary, or the source to build one?

There isn't any canonical libEGL implementation. The header files are part of the standard, but the implementation isn't.
MoltenGL has a binary download available (the second one, "OpenGL ES 2 for macOS").
If you prefer an open-source implementation, Google's ANGLE also implements EGL and supposedly supports macOS. For what it's worth, Apple appears to use it in WebKit on iOS as well.
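If it helps, here is a minimal C smoke test for whichever libEGL.dylib you end up installing (ANGLE's, say). The include paths and link flags are assumptions that depend on where you put the headers and the dylib:

#include <stdio.h>
#include <EGL/egl.h>

int main(void)
{
    /* Ask the implementation for its default display and bring EGL up. */
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    if (dpy == EGL_NO_DISPLAY) {
        fprintf(stderr, "no EGL display\n");
        return 1;
    }
    EGLint major, minor;
    if (!eglInitialize(dpy, &major, &minor)) {
        fprintf(stderr, "eglInitialize failed: 0x%x\n", eglGetError());
        return 1;
    }
    printf("EGL %d.%d, vendor: %s\n", (int)major, (int)minor,
           eglQueryString(dpy, EGL_VENDOR));
    eglTerminate(dpy);
    return 0;
}

Build with something like cc egltest.c -I/path/to/headers -L/path/to/lib -lEGL; if it prints a version and vendor string, LWJGL should be able to load the same library.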

Related

Can I use pre-"profiles" OpenGL versions on Macs?

I want to use a deprecated GL function, glPushAttrib. Ideally I'd do that by using the compatibility profile of a recent GL version, but Macs don't support that. So I'm happy to settle for using an old GL version, from before GL profiles existed. My question, though, is: does Mac support that?
Note: don't tell me that I shouldn't use glPushAttrib unless you can link to a good library that replaces it. I don't want to write my own, and using a full-blown engine would be much more trouble than it's worth for my use case ;)
I found the answer in the OpenGL wiki:
MacOSX gives you a choice: core profile for versions 3.2 or higher, or just version 2.1
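To make that concrete, here is a sketch of the two choices using GLFW (GLFW is my assumption here; any context-creation API exposes the same decision):

#include <GLFW/glfw3.h>

/* Call after glfwInit(). Pass want_core = 1 for a 3.2 core profile
   context, or 0 to take the legacy 2.1 default. */
GLFWwindow *create_context(int want_core)
{
    if (want_core) {
        /* 3.2+ core: modern GL, but no glPushAttrib. */
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE);
    }
    /* With no hints, macOS hands back a legacy 2.1 context, where
       glPushAttrib still exists. */
    return glfwCreateWindow(640, 480, "demo", NULL, NULL);
}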

Is Simple DirectMedia Layer's SDL_GL_GetProcAddress working with OpenGL ES 2.0 for embedded systems

I'm building an application with OpenGL ES 2.0 and SDL2 for Android. Does SDL_GL_GetProcAddress work with OpenGL ES 2.0 on Android? Also, I know OpenGL ES 2.0 is a subset of OpenGL, so could the same method run on desktop systems too?
From a quick browse of the SDL repository it should be.
SDL_video.c defines the implementation of SDL_GL_GetProcAddress simply to check that you've started OpenGL and then to call _this->GL_GetProcAddress, where _this is a global instance of the video driver.
SDL_androidvideo.c sets its GL_GetProcAddress to be Android_GLES_GetProcAddress, which is a preprocessor substitution for SDL_EGL_GetProcAddress.
So, so far: if you call SDL_GL_GetProcAddress, you'll get through to SDL_EGL_GetProcAddress.
SDL_egl.c implements SDL_EGL_GetProcAddress but declines to call eglGetProcAddress on Android. This looks like it's probably an error — the reason given is this bug but the status for that bug switched to 'Released' in June 2013, which I believe means that this has been fixed in Android for more than three years.
That aside, the fallback is to use SDL_LoadFunction, first with the direct function name, then with it preceded by an underscore, provided it's short enough to fit into the statically-declared buffer. Which this one is.
(so, caveat: SDL_GL_GetProcAddress is definitely not thread-safe, even if you've taken appropriate share group steps to use multiple GL contexts, but if you're writing an SDL program you probably don't care)
Android should be using the dlopen version of SDL_sysloadso so it looks like SDL_LoadFunction is implemented directly as a call to dlsym. Which has no issues that I'm aware of under Android.
So, in summary: yes, that call should work. It'll use the platform-specific dynamic library loader rather than the EGL call though it probably doesn't need to, but that's just an implementation detail.
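As a sanity check, here is a minimal sketch of that flow: create an ES 2.0 context with SDL2, then pull an entry point through SDL_GL_GetProcAddress (the function chosen here is arbitrary):

#include <SDL.h>

typedef void (*glClearColor_fn)(float, float, float, float);

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_ES);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 0);

    SDL_Window *win = SDL_CreateWindow("es2", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 640, 480,
                                       SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* Only valid after a context exists. */
    glClearColor_fn clear_color =
        (glClearColor_fn)SDL_GL_GetProcAddress("glClearColor");
    if (clear_color)
        clear_color(0.0f, 0.0f, 0.0f, 1.0f);

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}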

glReadBuffer vs glReadBufferNV in OpenGL ES 2.0

I'm trying to build OpenSceneGraph 3.2 for the Ubuntu armhf architecture, but I'm getting a compile error about a symbol not being found. The symbol in question is glReadBuffer. I looked at the GLES2/gl2.h header, and indeed, that symbol is not there. However, the symbol is present in GLES3/gl3.h, and documentation online suggests that the function was added in OpenGL ES 3.0. However, I did find a function named glReadBufferNV in GLES2/gl2ext.h (which is not #include'd in the source files).
I'm wondering if glReadBufferNV can be used instead of glReadBuffer, and what the possible side effects might be. I suspect that NV stands for Nvidia, and that it is an Nvidia-only implementation. Is this correct? If so, is there any way to get glReadBuffer in OpenGL ES 2.0 (I am under the impression that OpenSceneGraph can be built under OpenGL ES 2.0)?
Edit: As it turned out, the code that builds this portion of OpenSceneGraph was excluded when building with OpenGL ES or OpenGL 3.0. However, I'm still interested in what's special about glReadBufferNV.
As your research suggests, glReadBuffer was added to ES for 3.0; it is not defined in 2.0. Prior to that, as per the header file you found, an extension defined glReadBufferNV — specifically the NV_read_buffer extension.
So what's happened is that the functionality wasn't in the spec, but Nvidia thought it would be useful, so they implemented it as an OpenGL extension. It was subsequently discussed at Khronos, had all the edge cases and ambiguities dealt with, and eventually made its way into the core spec.
That's generally how GL development proceeds: extensions come along to provide functionality that's not yet in the main library, they're discussed and refined and adopted into the core spec if appropriate.
Comparing the official specification for glReadBuffer with the extension documentation, the extension has a few ties into other extensions that you wouldn't expect to make it into the core spec (e.g. COLOR_ATTACHMENTi_NV is supported as a source), but see resolved issue 7:
Version 6 of this specification isn't compatible with OpenGL ES 3.0.
For contexts without a back buffer, this extension makes FRONT the
default read buffer. ES 3.0 instead calls it BACK.
How can this be harmonized?
RESOLVED: Update the specification to match ES 3.0 behavior. This
introduces a backwards incompatibility, but few applications are
expected to be affected. In the EGL ecosystem where ES 2.0 is
prevalent, only pixmaps have no backbuffer and their usage remains
limited.
So the extension has retroactively been modified to bring it into line with what was agreed for the core spec.
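For illustration, here is a hedged sketch of how you would use the extension from an ES 2.0 context today, assuming an EGL platform so that eglGetProcAddress is available:

#include <string.h>
#include <EGL/egl.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

/* Select the read buffer on ES 2.0 if (and only if) NV_read_buffer
   is advertised; core glReadBuffer needs ES 3.0. */
static void select_read_buffer(GLenum buf)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    if (exts && strstr(exts, "GL_NV_read_buffer")) {
        PFNGLREADBUFFERNVPROC readBufferNV =
            (PFNGLREADBUFFERNVPROC)eglGetProcAddress("glReadBufferNV");
        if (readBufferNV)
            readBufferNV(buf); /* e.g. GL_COLOR_ATTACHMENT0_NV */
    }
    /* Without the extension, ES 2.0 gives you no choice of read buffer. */
}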

Problems with glext.h

I was just looking through the OpenGL updates on OS X Lion when I found something that now has me scared to use glext.h.
So, here's the bug. Lion's OpenGL.framework has a glext.h with the following definition.
typedef void *GLhandleARB;
But the glext.h from the OpenGL registry has the following instead.
typedef unsigned int GLhandleARB;
Now, the trouble is that when building for x86_64 on Lion we have sizeof(void*)==8, but sizeof(unsigned int)==4. So what do you trust? Lion's header? Or the OpenGL registry's header? Well, of course you trust the system headers, because apparently they claim to know that the ABI on 64-bit Lion has a 64-bit GLhandleARB type.
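To see the mismatch concretely (the typedef names here are mine, standing in for the two headers):

#include <stdio.h>

typedef void *GLhandleARB_lion;        /* Lion's OpenGL.framework */
typedef unsigned int GLhandleARB_reg;  /* OpenGL registry's glext.h */

int main(void)
{
    /* On x86_64 this prints 8 and 4: the two headers disagree on the ABI. */
    printf("Lion:     %zu\n", sizeof(GLhandleARB_lion));
    printf("Registry: %zu\n", sizeof(GLhandleARB_reg));
    return 0;
}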
Now, this raises a few issues in my mind about various platforms:
If you must use Apple's glext.h, but Apple's glext.h doesn't provide access to anything later than OpenGL 2.1, then how do you get at 3.0+ features on newer cards?
Is it unsafe to use the OpenGL registry's glext.h on Linux? Or must you use the system's glext.h there as well? In that case, question #1 applies here as well.
How the heck do you handle things on Windows, where there is never a glext.h on the system? You clearly can't use a driver vendor's glext.h, because different vendors may disagree on the sizes of various types. (Or is that not true?) What's the deal here?
I see no problem.
Just use the headers provided by the OS/drivers, or better, use a multi-platform OpenGL extension loader that does the trick for you (e.g. GLEW).
In your code you will only ever use GLhandleARB, nothing more specific, so on Mac it will be a void* (no problem), on Linux something different (no problem), on Linux with AMD's header something different again (still no problem).
Source code is portable across different platforms; binaries are not. So I see no problem here.
1) You can't get a newer OpenGL than the version served by Apple. So currently you can get at most the OpenGL 3.2 core profile on 10.7. (I've heard that Nvidia bypassed this on some GPUs with its own headers exposing OpenGL 3.3, but I have no way to check it myself.)
2) It depends. If you target OpenGL 2.1 and below, the open-source drivers support it, but higher versions are supported only by the proprietary drivers, so you should use their headers.
But in code you just write #include <GL/gl.h> and then link against the appropriate .so library.
3) I don't know how things stand on Windows, but vendors probably use the glext.h from the OpenGL registry.
All of this, though, is based on a wrong assumption: you do NOT have to know the answers to these questions. Just use software that already knows how to handle this burden (e.g. GLEW).
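If you want to see what that looks like, here is a sketch using GLEW, with GLFW for context creation (GLFW is just one option; GLEW only needs some current context):

#include <GL/glew.h>    /* must be included before any gl.h */
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    if (!glfwInit())
        return 1;
    GLFWwindow *win = glfwCreateWindow(640, 480, "glew", NULL, NULL);
    if (!win)
        return 1;
    glfwMakeContextCurrent(win);

    /* glewInit() resolves entry points against the current context,
       so you never depend on any particular glext.h. */
    if (glewInit() != GLEW_OK)
        return 1;

    if (GLEW_VERSION_3_0)
        printf("OpenGL 3.0 entry points are available\n");
    glfwTerminate();
    return 0;
}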
You should use the official OpenGL mechanism to query the extensions supported by the OpenGL instance you are actually running against: glGetString(GL_EXTENSIONS).
As for which type you should use, I think this has already been answered on Apple's mailing lists: http://lists.apple.com/archives/mac-opengl/2005/Nov/msg00182.html
Both; the spec doesn't make any claims about what a GLhandleARB is,
other than that it's at least 32 bits wide. Note that in the OpenGL
2.0 shading language API there is no GLhandle type, it uses GLuint
like textures. Also note that GLuint is not an unsigned int on Mac OS
X, it's an unsigned long, so you're still screwed :)

Haskell or Ocaml with OpenGL and SDL precompiled distribution for Windows

I want to learn OCaml or Haskell, and I want to do it by writing a simple game. Apparently there's one small problem: nobody cares about Windows, and I want to do it on Windows, natively.
Haskell has Cabal, which has an SDL package, but it doesn't build due to a trivial problem with no workaround (the order of parameters passed to gcc). OCaml doesn't even have that; it's all source packages, be it GLCaml or OcamlSDL or whatever.
Is there a place where I can get a working SDL for Haskell or OCaml on Windows without fighting with a dozen versions of compilers?
The Haskell Platform comes with a binding to OpenGL which should work out of the box on Windows.
Concerning the SDL package on Hackage, you can use cabal unpack SDL to get the source code and fix things yourself. To install the package with your changes, run cabal install in the unpacked directory. In any case, drop a line to the maintainer; I'm sure he'll help out.
It's not related to SDL, but you've mentioned OpenGL. There is the LablGL binding for OpenGL in OCaml, which works out of the box. The Wikipedia example (http://en.wikipedia.org/wiki/Objective_Caml#Triangle_.28graphics.29) compiles and works just fine.
The best instructions I've found for getting SDL to work on Windows with the most recent Haskell Platform are at this blog. I followed everything step by step and it worked perfectly, despite some configure error messages.
It isn't SDL but GLFW works on Windows with Haskell through Cabal.
My article High-fidelity graphics with OpenGL 2 (25th Feb 2008) explained how the GLCaml bindings can be used to write OpenGL-based applications in OCaml that use vertex and fragment shaders (a phong shader is given as an example). There are 9 articles in the OCaml Journal on OpenGL, albeit mostly using the older LablGL library for OpenGL 1.1.
I tried and failed to get OpenGL working from Haskell under Linux in 2007. The Haskell Platform may have changed that, but I have neither had time to try it myself nor heard of anyone using it for this.
However, both OCaml and Haskell must rely upon fragile low-level bindings to OpenGL, because they are standalone languages, and nobody has ever managed to get any significant commercial software using them to work. As you're on Windows, F# + XNA is a far more logical choice, because XNA is tried and tested and F# has a safe high-level interface to it. A Google fight gives you a good idea of what a pioneer you'll be: +haskell +opengl gives 437 hits on Google, while +ocaml +opengl gives only 347.
