GLEW crashing in Xcode

I'm trying to run a simple OpenGL program using GLFW (version 3.0.2) and GLEW (version 1.10.0) in Xcode (version 4.6.3) on OS X 10.8.4. The entire code is shown below.
#include <GL/glew.h>
#include <GLFW/glfw3.h>
#include <OpenGL/OpenGL.h>
#include <iostream>
using namespace std;

void RenderScene()
{
    GLfloat color[] = {1.0f, 0.0f, 0.0f, 1.0f};
    glClearBufferfv(GL_COLOR, 0, color);
}

void InitGL()
{
    glClearColor(1, 0, 0, 1);
}

void ErrorFunc(int code, const char *msg)
{
    cerr << "Error " << code << ": " << msg << endl;
}

int main(void)
{
    GLFWwindow* window;

    /* Report errors */
    glfwSetErrorCallback(ErrorFunc);

    /* Initialize the library */
    if (!glfwInit())
        return -1;

    /* Window hints */
    glfwWindowHint (GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint (GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint (GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
    glfwWindowHint (GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    /* Create a windowed mode window and its OpenGL context */
    window = glfwCreateWindow(640, 480, "Hello World", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }

    /* Make the window's context current */
    glfwMakeContextCurrent(window);

    /* Initialize GLEW */
    glewExperimental = GL_TRUE;
    if (glewInit() != GLEW_OK)
        cerr << "GLEW initialization failed" << endl;

    /* Initialize OpenGL */
    InitGL();

    /* Loop until the user closes the window */
    while (!glfwWindowShouldClose(window))
    {
        /* Render here */
        RenderScene();

        /* Swap front and back buffers */
        glfwSwapBuffers(window);

        /* Poll for and process events */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
Most of this came straight from GLFW's documentation; only the rendering function and the GLEW initialization are mine. I have added frameworks for OpenGL, Cocoa and IOKit and linked against libGLEW.a and libglfw3.a. The program compiles successfully but appears to crash when attempting to execute functions GLEW was supposed to take care of. Here, the program crashes on glClearBufferfv. If I comment that out, I get a window with a black background. My guess is that GLEW is secretly not working, since it reports no errors but doesn't seem to be doing its job at all.
The exact error message Xcode throws at me is error: address doesn't contain a section that points to a section in a object file, with an error code of EXC_BAD_ACCESS. If I replace glClearBufferfv with glClearColor, the program doesn't crash, but the background is still black when it should be red. When queried, OpenGL returns the version string 2.1 NVIDIA-8.12.47 310.40.00.05f01, which explains why calls to newer functions aren't working; but shouldn't GLEW have set up the correct OpenGL context? Moreover, GLFW's documentation says that it has been able to create OpenGL 3+ contexts since GLFW 2.7.2. I really don't know what to do.

glClearBuffer (...) is an OpenGL 3.0 function; it is not implemented in all versions of OS X (some only implement OpenGL 2.1). Because OS X does not use runtime extensions, GLEW is not going to fix this problem for you.
You will have to resort to the traditional method for clearing buffers on older versions of OS X (10.6 or older). This means setting the "clear color" and then clearing the color buffer as a two-step process. Instead of a single function call that can clear a specific buffer to a specific value, use this:
#define USE_GL3 // This code requires OpenGL 3.0, comment out if unavailable

void RenderScene()
{
    GLfloat color[] = {1.0f, 0.0f, 0.0f, 1.0f};
#ifdef USE_GL3 // Any system that implements OpenGL 3.0+
    glClearBufferfv (GL_COLOR, 0, color);
#else          // Any other system
    glClearColor (color [0], color [1], color [2], color [3]);
    glClear      (GL_COLOR_BUFFER_BIT);
#endif
}
This is not ideal, however. There is no point in setting the clear color every frame. You should set the clear color once when you initialize the application and replace the !USE_GL3 branch of the code with a plain glClear (GL_COLOR_BUFFER_BIT);
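A minimal sketch of that restructuring, reusing the red clear color from the question:
void InitGL()
{
    /* Set the clear color once, at initialization */
    glClearColor (1.0f, 0.0f, 0.0f, 1.0f);
}

void RenderScene()
{
    /* One call per frame, works on any OpenGL version */
    glClear (GL_COLOR_BUFFER_BIT);
}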
Now, because you mentioned you are using Mac OS X 10.8, you can ignore a lot of what I wrote above. OS X 10.8 actually implements OpenGL 3.2 if you do things correctly.
You need two things for glClearBuffer (...) to work on OS X:
Mac OS X 10.7+ (which you have)
Tell GLFW to create an OpenGL 3.2 core context
Before you create your window in GLFW, add the following code:
glfwWindowHint (GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint (GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint (GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
glfwWindowHint (GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
Once you have an OpenGL 3.2 core context, you can also eliminate the whole !USE_GL3 preprocessor branch from your code. That branch was a provision to allow your code to work on OS X implementations that do not support OpenGL 3.2.

GLEW doesn't really work on OS X unless you enable the experimental option. Set it after configuring everything in GLFW, but before calling glewInit():
glewExperimental = GL_TRUE;
Edit:
You also need to request an OpenGL core context, which with the older GLFW 2.x API looks like this:
glfwOpenWindowHint( GLFW_OPENGL_VERSION_MAJOR, 3 );
glfwOpenWindowHint( GLFW_OPENGL_VERSION_MINOR, 2 );
Slightly different from yours.
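Putting both answers together, the initialization order would look roughly like this (a sketch using the GLFW 3 API from the question; the error check is an addition for illustration):
/* ... window hints and glfwCreateWindow as in the question ... */
glfwMakeContextCurrent(window);

glewExperimental = GL_TRUE;   /* must be set before glewInit() */
GLenum err = glewInit();
if (err != GLEW_OK)
    cerr << "glewInit failed: " << glewGetErrorString(err) << endl;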

Related

How to use Cmake for OpenGL + Qt 5.8 in OS X Sierra?

I just tried to set up CMake for a project using Qt 5.8 to do OpenGL work on OS X Sierra. Here is my main CMakeLists.txt:
cmake_minimum_required (VERSION 3.8)
set (PROJECT_NAME "FluidEngine")
project (${PROJECT_NAME})
set (CMAKE_PREFIX_PATH "/Users/BRabbit27/Qt/5.8/clang_64")
set (CMAKE_AUTOMOC ON)
find_package (Qt5Widgets)
find_package (Qt5Gui)
find_package (Qt5OpenGL)
set (CPP_SOURCES "")
set (HPP_SOURCES "")
set (INCLUDE_PATHS "")
add_subdirectory (src)
include_directories (${INCLUDE_PATHS} ${OPENGL_INCLUDE_DIRS})
add_executable (${PROJECT_NAME} ${CPP_SOURCES} ${HPP_SOURCES})
target_link_libraries (${PROJECT_NAME} Qt5::Widgets Qt5::Gui Qt5::OpenGL )
It configures the Xcode project perfectly, with no errors.
Then my code for rendering a basic triangle looks like:
GLWindow::GLWindow(QWidget* parent) : QOpenGLWidget(parent)
{}

void GLWindow::initializeGL()
{
    initializeOpenGLFunctions();
    GLfloat verts[] =
    {
         0.f,  1.f,
        -1.f, -1.f,
         1.f, -1.f
    };
    GLuint myBufferID;
    glGenBuffers(1, &myBufferID);
    glBindBuffer(GL_ARRAY_BUFFER, myBufferID);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, 0);
}

void GLWindow::resizeGL(int w, int h)
{
    glViewport(0, 0, w, h);
}

void GLWindow::paintGL()
{
    glDrawArrays(GL_TRIANGLES, 0, 3);
}
And what I get is a black window. I got this code from this video tutorial.
Am I missing something in my CMake file, or something subtle in OS X, to use OpenGL? Since OS X is now promoting Metal, perhaps something must be enabled, but I do not know what.
I already tried setting the OpenGL version in the main function:
int main(int argc, char** argv)
{
    QApplication app(argc, argv);

    QSurfaceFormat format;
    format.setVersion(4, 1);
    format.setProfile(QSurfaceFormat::CoreProfile);
    QSurfaceFormat::setDefaultFormat(format);

    GLWindow glwindow;
    glwindow.show();

    return app.exec();
}
Any idea?
UPDATE
Added the project to GitHub; you can clone it here to test on your machine.
The way I build the project is just cmake -GXcode .. assuming I'm in the /path/to/project/build_xcode directory.
Hope this helps you reproduce it and perhaps gives me a clue about what I could be doing wrong.
I checked all the error messages and from there made my way to the solution. Fortunately, someone had already answered a related problem on SO. You can find the complete answer here: OpenGL: INVALID_OPERATION following glEnableVertexAttribArray
In short:
You're seeing this error on OS X because it only supports the OpenGL Core Profile if you're using OpenGL 3.x or higher. Your code is not Core Profile compliant. You were most likely using the Compatibility Profile on Windows.
Specifically, the Core Profile requires a Vertex Array Object (VAO) to be bound for all vertex related calls. So before calling glEnableVertexAttribArray(), or other similar functions, you will need to create and bind a VAO.
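Applied to the Qt code above, a minimal sketch of that fix using Qt's QOpenGLVertexArrayObject wrapper (the m_vao member is a hypothetical addition to the GLWindow class declaration):
#include <QOpenGLVertexArrayObject>

void GLWindow::initializeGL()
{
    initializeOpenGLFunctions();

    /* Create and bind a VAO before any vertex-attribute calls */
    m_vao.create();  // QOpenGLVertexArrayObject m_vao; declared in GLWindow
    m_vao.bind();

    /* ... glGenBuffers / glBindBuffer / glBufferData as before ... */
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, 0);
}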

Why can't I create an OpenGL ES 3.0 context using SDL2?

I'm using SDL2 2.0.2 on Debian stable, and I'm trying to acquire an OpenGL ES 3.0 context with it. This works if I request an OpenGL ES 2.0 context, but not if I directly request an OpenGL ES 3.0 context.
Consider the following program:
#include <GLES3/gl3.h>
#include <SDL2/SDL.h>
#include <stdio.h>
#include <stdlib.h>

int main(
    int argc,
    char **argv
) {
    SDL_GLContext context;
    int rc;
    const GLubyte *version;
    SDL_Window *window;

    rc = SDL_Init(SDL_INIT_VIDEO);
    if (rc < 0) {
        return EXIT_FAILURE;
    }
    atexit(SDL_Quit);

    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_ES);
#ifdef THIS_SHOULD_WORK_TOO
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 0);
#else
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 0);
#endif

    window = SDL_CreateWindow("OpenGL", 100, 100, 800, 600, SDL_WINDOW_OPENGL);
    context = SDL_GL_CreateContext(window);

    version = glGetString(GL_VERSION);
    if (version == 0) {
        printf(
            "Unable to get OpenGL ES version string: %d\n",
            glGetError()
        );
        return EXIT_FAILURE;
    }
    printf("Version string: %s\n", version);

    SDL_GL_DeleteContext(context);
    return EXIT_SUCCESS;
}
When I compile this normally, it requests an OpenGL ES 2.0 context, and the program receives an OpenGL ES 3.0 context:
$ c99 example.c -lSDL2 -lGLESv2
$ ./a.out
Version string: OpenGL ES 3.0 Mesa 10.3.2
However, when I request an OpenGL ES 3.0 context, this program fails without a clear error message:
$ c99 -DTHIS_SHOULD_WORK_TOO example.c -lSDL2 -lGLESv2
$ ./a.out
Unable to get OpenGL ES version string: 0
Why is this?
I doubt it is because of -lGLESv2, mostly because the OpenGL ES context reports that it is version 3.0, and because platforms that do ship a libGLESv3.so ship it as a symlink to libGLESv2.so (Android does this, for example).
This is a bug in SDL2:
SDL's EGL code (used for OpenGL ES context creation on Windows / Linux / Android / etc.) only seems to properly support OpenGL ES 1 and 2 contexts.
In particular, the current code sets the EGL_RENDERABLE_TYPE config attribute to EGL_OPENGL_ES2_BIT if GLES 2 is requested and to EGL_OPENGL_ES_BIT in all other cases. It never uses EGL_OPENGL_ES3_BIT / EGL_OPENGL_ES3_BIT_KHR, even when GLES 3.0+ is requested. It also doesn't use EGL_CONTEXT_MINOR_VERSION_KHR in SDL_EGL_CreateContext to request version 3.1 rather than 3.0 when 3.1 is asked for.
This means that a request for OpenGL ES 3.0 is translated into a request for OpenGL ES 1.0. Since OpenGL ES 3.0 is backwards incompatible with OpenGL ES 1.0, the request eventually fails (I think).
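For illustration, the attribute selection the EGL path would need looks roughly like this (a hedged sketch, not SDL's actual code; the variable names are hypothetical):
/* Choose the EGL_RENDERABLE_TYPE bit from the requested major version */
EGLint renderable_type = EGL_OPENGL_ES_BIT;    /* GLES 1.x, the current fallback */
if (major_version == 2)
    renderable_type = EGL_OPENGL_ES2_BIT;
else if (major_version >= 3)
    renderable_type = EGL_OPENGL_ES3_BIT_KHR;  /* the case SDL 2.0.2 never selects */

const EGLint config_attribs[] = {
    EGL_RENDERABLE_TYPE, renderable_type,
    EGL_NONE
};
/* config_attribs would then be passed to eglChooseConfig() */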
The fix has been merged into the master branch, and is scheduled to be released as part of SDL2 version 2.0.4:
Added EGL_KHR_create_context support to allow OpenGL ES version selection on some platforms

Test Program Using GLEW Crashing on Launch

I am trying to work with OpenGL on Windows in Eclipse using MinGW.
So far I've been able to set up GLFW correctly and now I am trying to load OpenGL pointers with GLEW.
I downloaded the GLEW source and built it using the instructions at Building glew on windows with mingw and that didn't appear to cause any problems.
I have a test program that I am trying to run. The program runs and opens the window if I comment out the calls to GLEW (lines 32 to 40)
/*USE OUR OPENING LIBRARY GLEW TO LOAD OPENGL FUNCTIONS*/
glewExperimental = GL_TRUE;
glewInit();
/*Test that GLEW was loaded by calling the OpenGL function glGenBuffers*/
GLuint vertexBuffer;
glGenBuffers(1, &vertexBuffer);
printf("%u\n", vertexBuffer);
However, when I make any call to GLEW, either setting glewExperimental or calling glewInit(), the program builds fine but crashes at runtime with the error message "HelloGLEW.exe has stopped working". This is the full error message; there are no other error messages.
Can anyone explain/theorise as to the cause of the problem and suggest how to fix them? Or possibly someone had the same problem and can explain how they solved it.
This is the full program code:
//Opening Toolkit: GLEW (OpenGL Extension Wrangler)
#define GLEW_STATIC
#include <GL/glew.h>
//Window Toolkit: GLFW
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    /* USE GLFW TO CREATE OUR CONTEXT AND WINDOW */
    GLFWwindow* window;

    /* Initialize the library */
    if (!glfwInit())
        return -1;

    /* Create a windowed mode window and its OpenGL context */
    window = glfwCreateWindow(640, 480, "Hello GLEW", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }

    /* Make the window's context current */
    glfwMakeContextCurrent(window);

    /* USE OUR OPENING LIBRARY GLEW TO LOAD OPENGL FUNCTIONS */
    glewExperimental = GL_TRUE;
    glewInit();

    /* Test that GLEW was loaded by calling the OpenGL function glGenBuffers */
    GLuint vertexBuffer;
    glGenBuffers(1, &vertexBuffer);
    printf("%u\n", vertexBuffer);

    /* Loop until the user closes the window */
    while (!glfwWindowShouldClose(window))
    {
        /* Render here */

        /* Swap front and back buffers */
        glfwSwapBuffers(window);

        /* Poll for and process events */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
I've linked it with glew32, glu32, glfw3, opengl32 and gdi32 in that order. Here are the compilation commands:
g++ "-IC:\\Libraries\\GLFW3\\include" "-IC:\\Libraries\\GLEW\\include" -O0 -g3 -Wall -c -fmessage-length=0 -o main.o "..\\main.cpp"
g++ "-LC:\\Libraries\\GLFW3\\i386\\lib-mingw" "-LC:\\Libraries\\GLEW\\lib" -o HelloGLEW.exe main.o -lglew32 -lglu32 -lglfw3 -lopengl32 -lgdi32

Xcode executable cannot find glsl files

This is the first time I have tried to learn OpenGL; I'm following the examples in a book. I'm working under OS X 10.8 with Xcode. The code is the following:
#include "Angel.h"

const int numPoints = 5000;
typedef vec2 point2;

void init(){
    point2 points[numPoints];
    point2 vertices[3] = {
        point2(-1.0, -1.0), point2(0.0, 1.0), point2(1.0, -1.0)
    };
    points[0] = point2(0.25, 0.5);
    for (int k = 1; k < numPoints; k++) {
        int j = rand()%3;
        points[k] = (points[k-1]+vertices[j])/2.0;
    }
    GLuint program = InitShader("vertex.glsl", "fragment.glsl");
    glUseProgram(program);
    GLuint abuffer;
    glGenVertexArraysAPPLE(1, &abuffer);
    glBindVertexArrayAPPLE(abuffer);
    GLuint buffer;
    glGenBuffers(1, &buffer);
    glBindBuffer(GL_ARRAY_BUFFER, buffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(points), points, GL_STATIC_DRAW);
    GLuint location = glGetAttribLocation(program, "vPosition");
    glEnableVertexAttribArray(location);
    glVertexAttribPointer(location, 2, GL_FLOAT, GL_FALSE, 0, BUFFER_OFFSET(0));
    glClearColor(1.0, 1.0, 1.0, 1.0);
}

void display(){
    glClear(GL_COLOR_BUFFER_BIT);
    glDrawArrays(GL_POINTS, 0, numPoints);
    glFlush();
}

int main(int argc, char** argv){
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA);
    glutInitWindowSize(640, 480);
    glutCreateWindow("Sierpinski Gasket");
    init();
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
It compiles, but when I try to execute it the window does not appear. The problem arises when I call the init() function: without it, the window appears with a black background; with it, there's no window. The code can be found here.
UPDATE
Apparently the program is exiting at the line GLuint program = InitShader("vertex.glsl", "fragment.glsl"); because it's not finding the shader files. How can I tell the program where to find them? I have the .glsl files in the same folder as the .h and .cpp files, but when Xcode builds the project the executable is not in the same place as the .glsl files. How do I solve this within Xcode?
The GLSL files are loaded at runtime, so it's not Xcode that can't find the files, but your program. The most likely cause is that you used a relative path for the files (as in the code snippet you provided), but started your program with a working directory that doesn't match the hardcoded file locations. Usually your program binary is built into a dedicated build directory.
A quick fix is copying the GLSL files into the same directory as the binary. The proper solution would be to place the files in a well-known location. On Mac OS X you can use application bundles for this. See the Mac OS X developer docs for how to place application resources into the application bundle and how to access them. Xcode also provides tools to automatically copy files into the generated bundle.
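For example, with CoreFoundation you can resolve a resource inside the bundle at runtime. This is a minimal sketch; the helper name resourcePath is hypothetical, and it assumes the .glsl files were added to the bundle's Copy Bundle Resources phase:
#include <CoreFoundation/CoreFoundation.h>
#include <limits.h>
#include <string>

std::string resourcePath(const char *name, const char *ext)
{
    char path[PATH_MAX] = {0};
    CFStringRef cfName = CFStringCreateWithCString(NULL, name, kCFStringEncodingUTF8);
    CFStringRef cfExt  = CFStringCreateWithCString(NULL, ext, kCFStringEncodingUTF8);
    /* Look the file up inside the application bundle */
    CFURLRef url = CFBundleCopyResourceURL(CFBundleGetMainBundle(), cfName, cfExt, NULL);
    if (url) {
        CFURLGetFileSystemRepresentation(url, true, (UInt8 *)path, sizeof(path));
        CFRelease(url);
    }
    CFRelease(cfName);
    CFRelease(cfExt);
    return std::string(path);
}

/* Usage: InitShader(resourcePath("vertex", "glsl").c_str(),
                     resourcePath("fragment", "glsl").c_str()); */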
Follow the steps below:
Select the project on the left panel.
Select the target and then select Build Phases.
There you should find a button called Add Build Phase.
A box will appear where you have to select the files (there's a little + sign). Be sure you select Destination: Products Directory.
Build the project and run it; now it should work!
If Xcode isn't importing the files, check whether it's adding them to the resources folder: go to your project name in the file chooser, then Build Phases, then Copy Bundle Resources, and make sure your two files are in there.

OpenGL 3.x context creation using SDL2 on OSX (Macbook Air 2012)

As far as I'm aware, the MacBook Air 2012 supports OpenGL 3.2. When using SDL 2.0 to create the OpenGL context, however, the context is only OpenGL version 2.1.
Is it possible for SDL 2.0 to create a GL 3.2 context?
For everyone who still has this problem as of Wednesday, Oct 10th, SDL2 allows you to create an OpenGL 3.2 context on Macs running Mac OS X 10.7 (Lion) and up.
You just need to add this code:
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
If you're running OS X 10.10, or the solution above still does not resolve the problem, changing the second line of the above example from
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
to
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
may work for you.
EDIT: See https://stackoverflow.com/a/13095742/123387 -- SDL2 should support this natively now. The note below is for versions before the latest SDL2 release.
The current version (as of Wed Aug 15 21:00:33 2012 -0400; 6398:c294faf5fce5) does not support the 10.7 series. However, there is a way to add support if you're willing to run an unstable SDL for kicks.
Give this a shot:
src/video/cocoa/SDL_cocoaopengl.m +90 (Cocoa_GL_CreateContext)
if (_this->gl_config.major_version == 3 &&
    _this->gl_config.minor_version == 2)
{
    attr[i++] = NSOpenGLPFAOpenGLProfile;
    attr[i++] = NSOpenGLProfileVersion3_2Core;
}
Then in your application, something along these lines.
SDL_Init(SDL_INIT_EVERYTHING);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
window = SDL_CreateWindow("3.2", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                          640, 480, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
context = SDL_GL_CreateContext(window);
I am running 10.7.4 on my Mac Air from 2011, and when I run a few GL diagnostics from a 3.2 SDL-enabled application I get:
Driver   : cocoa
Renderer : Intel HD Graphics 3000 OpenGL Engine
Vendor   : Intel Inc.
Version  : 3.2 INTEL-7.18.18
GLSL     : 1.50
I haven't tested much outside of this, but hopefully someone else can give it a stab and have a bit more success.
EDIT: If you're using GLEW you will want to enable glewExperimental. Note that there is a bug in the 3.2 core profile when you query GL_EXTENSIONS: it will report error 1280 (GL_INVALID_ENUM, as if it weren't supported). However, that shouldn't keep you from using 1.50 shaders and so on.
That should be possible:
#include <stdio.h>
#include <stdlib.h>
/* If using gl3.h */
/* Ensure we are using opengl's core profile only */
#define GL3_PROTOTYPES 1
#include <GL3/gl3.h>
#include <SDL.h>

#define PROGRAM_NAME "Tutorial1"

/* A simple function that prints a message, the error code returned by SDL,
 * and quits the application */
void sdldie(const char *msg)
{
    printf("%s: %s\n", msg, SDL_GetError());
    SDL_Quit();
    exit(1);
}

void checkSDLError(int line = -1)
{
#ifndef NDEBUG
    const char *error = SDL_GetError();
    if (*error != '\0')
    {
        printf("SDL Error: %s\n", error);
        if (line != -1)
            printf(" + line: %i\n", line);
        SDL_ClearError();
    }
#endif
}

/* Our program's entry point */
int main(int argc, char *argv[])
{
    SDL_Window *mainwindow;    /* Our window handle */
    SDL_GLContext maincontext; /* Our opengl context handle */

    if (SDL_Init(SDL_INIT_VIDEO) < 0) /* Initialize SDL's Video subsystem */
        sdldie("Unable to initialize SDL"); /* Or die on error */

    /* Request opengl 3.2 context.
     * SDL doesn't have the ability to choose which profile at this time of writing,
     * but it should default to the core profile */
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);

    /* Turn on double buffering with a 24bit Z buffer.
     * You may need to change this to 16 or 32 for your system */
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);

    /* Create our window centered at 512x512 resolution */
    mainwindow = SDL_CreateWindow(PROGRAM_NAME, SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                  512, 512, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
    if (!mainwindow) /* Die if creation failed */
        sdldie("Unable to create window");
    checkSDLError(__LINE__);

    /* Create our opengl context and attach it to our window */
    maincontext = SDL_GL_CreateContext(mainwindow);
    checkSDLError(__LINE__);

    /* This makes our buffer swap syncronized with the monitor's vertical refresh */
    SDL_GL_SetSwapInterval(1);

    /* Clear our buffer with a red background */
    glClearColor ( 1.0, 0.0, 0.0, 1.0 );
    glClear ( GL_COLOR_BUFFER_BIT );
    /* Swap our back buffer to the front */
    SDL_GL_SwapWindow(mainwindow);
    /* Wait 2 seconds */
    SDL_Delay(2000);

    /* Same as above, but green */
    glClearColor ( 0.0, 1.0, 0.0, 1.0 );
    glClear ( GL_COLOR_BUFFER_BIT );
    SDL_GL_SwapWindow(mainwindow);
    SDL_Delay(2000);

    /* Same as above, but blue */
    glClearColor ( 0.0, 0.0, 1.0, 1.0 );
    glClear ( GL_COLOR_BUFFER_BIT );
    SDL_GL_SwapWindow(mainwindow);
    SDL_Delay(2000);

    /* Delete our opengl context, destroy our window, and shutdown SDL */
    SDL_GL_DeleteContext(maincontext);
    SDL_DestroyWindow(mainwindow);
    SDL_Quit();
    return 0;
}
