I am using Unity3D to develop a game, and I want to save the content of each frame to an mp4 file with AVFoundation. But I ran into a problem while processing the snapshots: after I use glReadPixels to obtain the data stored in the render buffer, a vertex shader and fragment shader are used to turn the pixel content upside down. After flipping, though, the quality of each frame has decreased a lot. Has anyone met this kind of case before?
Here is the related code. First, the snapshot part:
- (void *)snapshot
{
    // NSLog(@"snapshot used here");
    GLint backingWidth1, backingHeight1;
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, mainDisplaySurface->systemColorRB);

    // Get the size of the backing CAEAGLLayer
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth1);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight1);

    NSInteger x = 0, y = 0, width = backingWidth1, height = backingHeight1;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte *)malloc(dataLength * sizeof(GLubyte));

    // Read pixel data from the framebuffer
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

    if (transformData == NULL)
    {
        NSLog(@"transformData initial");
        transformData = new loadDataFromeX();
        transformData->setupOGL(backingWidth1, backingHeight1);
    }
    NSLog(@"data %d, %d, %d", (int)data[0], (int)data[1], (int)data[2]);
    transformData->drawingOGL(data);
    return data;
}
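(For reference, the vertical flip itself can also be done on the CPU without a second render pass; a minimal sketch, assuming tightly packed RGBA bytes. This is not what the code here does, which flips on the GPU instead.)

// Flip an RGBA8 image vertically in place by swapping rows top-to-bottom.
static void FlipRGBAImageVertically(GLubyte *pixels, int width, int height)
{
    const size_t rowBytes = (size_t)width * 4;
    GLubyte *tmpRow = (GLubyte *)malloc(rowBytes);
    if (tmpRow == NULL) return;
    for (int y = 0; y < height / 2; y++)
    {
        GLubyte *top    = pixels + (size_t)y * rowBytes;
        GLubyte *bottom = pixels + (size_t)(height - 1 - y) * rowBytes;
        memcpy(tmpRow, top, rowBytes);
        memcpy(top, bottom, rowBytes);
        memcpy(bottom, tmpRow, rowBytes);
    }
    free(tmpRow);
}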
Here, transformData is a C++ class that does the flipping for me.
In the function setupOGL(), all the textures and framebuffers are constructed.
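setupOGL() itself is not shown; for context, a rough sketch of what such a setup typically looks like in ES 2.0 (hypothetical: the renderTexture member and the shader-compilation step are assumptions, not the actual implementation):

void loadDataFromeX::setupOGL(int width, int height)
{
    imageWidth  = width;
    imageHeight = height;

    // Texture that receives the glReadPixels data each frame (sampled by the flip shader).
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    // Offscreen colour attachment that the flipped image is rendered into.
    glGenTextures(1, &renderTexture);
    glBindTexture(GL_TEXTURE_2D, renderTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glGenFramebuffers(1, &transFBO.frameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, transFBO.frameBuffer);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, renderTexture, 0);

    // Compile and link the flip shaders here and cache gl_program_id,
    // gl_attribute_position, gl_attribute_texture_coordinate and mvpLocation.
}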
In the function drawingOGL(), the flipping is done by passing the data through the vertex and fragment shaders. The details of this function are listed below:
int loadDataFromeX::drawingOGL(unsigned char* data)
{
    // Load data to the texture
    glDisable(GL_DEPTH_TEST);
    glBindFramebuffer(GL_FRAMEBUFFER, transFBO.frameBuffer);
    glClear(GL_COLOR_BUFFER_BIT);
    glClearColor(1., 0., 0., 1.);
    glViewport(0, 0, imageWidth, imageHeight);

    GLfloat vertex_postions[] = {
        -1.0f, -1.0f, -10.0f,
         1.0f, -1.0f, -10.0f,
        -1.0f,  1.0f, -10.0f,
         1.0f,  1.0f, -10.0f
    };

    GLfloat texture_coords[] = { // left up corner is (0,0)
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 0.0f,
    };

    glUseProgram(gl_program_id);

    glVertexAttribPointer(gl_attribute_position, 3, GL_FLOAT, GL_FALSE, 0, vertex_postions);
    glEnableVertexAttribArray(gl_attribute_position);
    glVertexAttribPointer(gl_attribute_texture_coordinate, 2, GL_FLOAT, GL_FALSE, 0, texture_coords);
    glEnableVertexAttribArray(gl_attribute_texture_coordinate);

    // Load textures
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texture);
    if (flag)
    {
        flag = false;
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, imageWidth, imageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
    }
    else
    {
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, imageWidth, imageHeight, GL_RGBA, GL_UNSIGNED_BYTE, data);
    }
    glUniform1i(glGetUniformLocation(gl_program_id, "inputImageTexture"), 0);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glUniformMatrix4fv(mvpLocation, 1, 0, gComputeMVP);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(0, 0, imageWidth, imageHeight, GL_RGBA, GL_UNSIGNED_BYTE, data);
    glFinish();

    cout << "data: " << (int)data[0] << "; " << (int)data[1] << ", " << (int)data[2] << endl;

    return 1;
}
The vertex shader and fragment shader are provided below:
static float l = -1.f, r = 1.f;
static float b = -1.f, t = 1.f;
static float n = 0.1f, f = 100.f;
static float gComputeMVP[16] = {
2.0f/(r-l), 0.0f, 0.0f, 0.0f,
0.0f, 2.0f/(t-b), 0.0f, 0.0f,
0.0f, 0.0f, -2.0f/(f-n), 0.0f,
-(r+l)/(r-l), -(t+b)/(t-b), -(f+n)/(f-n), 1.0f
};
// Shader sources
const GLchar* vertex_shader_str =
"attribute vec4 position;\n"
"attribute vec4 inputTextureCoordinate;\n"
"varying mediump vec2 textureCoordinate;\n"
"uniform mat4 mvpMatrix;\n"
"void main()\n"
"{\n"
" gl_Position = position;\n"
" gl_Position = mvpMatrix * position;\n"
" textureCoordinate = inputTextureCoordinate.xy;\n"
"}";
const char* fragment_shader_str = ""
" varying mediump vec2 textureCoordinate;\n"
"\n"
" uniform sampler2D inputImageTexture;\n"
" \n"
" void main()\n"
" {\n"
" mediump vec4 Color = texture2D(inputImageTexture, textureCoordinate);\n"
" gl_FragColor = vec4(Color.z, Color.y, Color.x, Color.w);\n"
" }";
I don't know why the quality of each frame has decreased. Also, when I compare the output of the variable data before and after drawingOGL, using these two lines:
cout<<"data: "<<(int)data[0]<<"; "<<(int)data[1]<<", "<<(int)data[2]<<endl;
NSLog(@"data %d, %d, %d", (int)data[0], (int)data[1], (int)data[2]);
the first line gives the right pixel values, but the second line always gives ZERO. It's really strange, right?
I have found the reason for this strange problem: it was caused by Unity3D's OpenGL ES context. I'm not familiar with Unity3D, so maybe only someone like me would do something like this. The context belonging to Unity3D carries some special OpenGL ES settings, so in order to finish the snapshot task, a new context has to be established and made current only while the snapshot work is running.
To solve the problem, I construct an individual context (EAGLContext *) for the snapshot task like this:
- (void *)snapshot
{
    // NSLog(@"snapshot used here");
    GLint backingWidth1, backingHeight1;
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, mainDisplaySurface->systemColorRB);

    // Get the size of the backing CAEAGLLayer
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth1);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight1);

    NSInteger x = 0, y = 0, width = backingWidth1, height = backingHeight1;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte *)malloc(dataLength * sizeof(GLubyte));

    // Read pixel data from the framebuffer
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(x, y, width, height, GL_BGRA, GL_UNSIGNED_BYTE, data);
    NSLog(@"data %d, %d, %d", (int)data[0], (int)data[1], (int)data[2]);
    NSLog(@"backingWidth1 : %d, backingHeight1: %d", backingWidth1, backingHeight1);

    if (transformData == NULL)
    {
        // Create the snapshot's own context the first time through.
        mycontext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        [EAGLContext setCurrentContext: mycontext];
        NSLog(@"transformData initial");
        transformData = new loadDataFromeX();
        transformData->setupOGL(backingWidth1, backingHeight1);
        [EAGLContext setCurrentContext: mainDisplaySurface->context];
    }
    {
        // Switch to the snapshot context only while flipping, then restore Unity's context.
        [EAGLContext setCurrentContext: mycontext];
        transformData->drawingOGL(data);
        [EAGLContext setCurrentContext: mainDisplaySurface->context];
    }
    return data;
}
When the resources used for the snapshot are to be released, the code looks like this:
if (transformData != NULL)
{
    {
        [EAGLContext setCurrentContext: mycontext];
        transformData->destroy();
        delete transformData;
        transformData = NULL;
        [EAGLContext setCurrentContext: mainDisplaySurface->context];
    }
    [mycontext release];
    mycontext = nil;
}
I have used legacy OpenGL with Cocoa for years, but I'm now struggling to make the transition to OpenGL 3.2. There are several examples on the internet, but they are all too complex (and many don't even compile any more under Xcode 5.1). Could someone write an example of the simplest, most minimal Cocoa code that just draws a red triangle into an NSOpenGLView? (No fancy shaders, no display callbacks; the fewer the code lines, the better.)
Here's an answer based on the code in https://github.com/beelsebob/Cocoa-GL-Tutorial
I changed these things: (1) the OpenGL context is created in a custom NSOpenGLView rather than attached directly to the window; (2) I put all the initialisation into one single function; (3) I deleted all the error-verification code. The last point is not something you should do in a product, but I find the code easier to understand with less clutter (look at Cocoa-GL-Tutorial for proper error handling).
The steps (tested with Xcode 5.1):
Make a new Cocoa application
Add a Custom View to the app window in Interface Builder
Add an Objective-C class subclassing NSOpenGLView; I called it MyOpenGLView
In Interface Builder, select the Custom View, open the Identity Inspector (one of the icons top-right), and under Custom Class select MyOpenGLView
Now, this is the code for MyOpenGLView.h:
#import <Cocoa/Cocoa.h>
#import <OpenGL/OpenGL.h>
#import <OpenGL/gl3.h>

@interface MyOpenGLView : NSOpenGLView
{
    GLuint shaderProgram;
    GLuint vertexArrayObject;
    GLuint vertexBuffer;

    GLint positionUniform;
    GLint colourAttribute;
    GLint positionAttribute;
}
@end
And this is the code for MyOpenGLView.m:
#import "MyOpenGLView.h"
@implementation MyOpenGLView
- (id)initWithFrame:(NSRect)frame
{
// 1. Create a context with opengl pixel format
NSOpenGLPixelFormatAttribute pixelFormatAttributes[] =
{
NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
NSOpenGLPFAColorSize , 24 ,
NSOpenGLPFAAlphaSize , 8 ,
NSOpenGLPFADoubleBuffer ,
NSOpenGLPFAAccelerated ,
NSOpenGLPFANoRecovery ,
0
};
NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:pixelFormatAttributes];
self = [super initWithFrame:frame pixelFormat:pixelFormat];
// 2. Make the context current
[[self openGLContext] makeCurrentContext];
// 3. Define and compile vertex and fragment shaders
GLuint vs;
GLuint fs;
const char *vss="#version 150\n\
uniform vec2 p;\
in vec4 position;\
in vec4 colour;\
out vec4 colourV;\
void main (void)\
{\
colourV = colour;\
gl_Position = vec4(p, 0.0, 0.0) + position;\
}";
const char *fss="#version 150\n\
in vec4 colourV;\
out vec4 fragColour;\
void main(void)\
{\
fragColour = colourV;\
}";
vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, 1, &vss, NULL);
glCompileShader(vs);
fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, 1, &fss, NULL);
glCompileShader(fs);
printf("vs: %i, fs: %i\n",vs,fs);
// 4. Attach the shaders
shaderProgram = glCreateProgram();
glAttachShader(shaderProgram, vs);
glAttachShader(shaderProgram, fs);
glBindFragDataLocation(shaderProgram, 0, "fragColour");
glLinkProgram(shaderProgram);
// 5. Get pointers to uniforms and attributes
positionUniform = glGetUniformLocation(shaderProgram, "p");
colourAttribute = glGetAttribLocation(shaderProgram, "colour");
positionAttribute = glGetAttribLocation(shaderProgram, "position");
glDeleteShader(vs);
glDeleteShader(fs);
printf("positionUniform: %i, colourAttribute: %i, positionAttribute: %i\n",positionUniform,colourAttribute,positionAttribute);
// 6. Upload vertices (1st four values in a row) and colours (following four values)
GLfloat vertexData[]= { -0.5,-0.5,0.0,1.0, 1.0,0.0,0.0,1.0,
-0.5, 0.5,0.0,1.0, 0.0,1.0,0.0,1.0,
0.5, 0.5,0.0,1.0, 0.0,0.0,1.0,1.0,
0.5,-0.5,0.0,1.0, 1.0,1.0,1.0,1.0};
glGenVertexArrays(1, &vertexArrayObject);
glBindVertexArray(vertexArrayObject);
glGenBuffers(1, &vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glBufferData(GL_ARRAY_BUFFER, 4*8*sizeof(GLfloat), vertexData, GL_STATIC_DRAW);
glEnableVertexAttribArray((GLuint)positionAttribute);
glEnableVertexAttribArray((GLuint)colourAttribute );
glVertexAttribPointer((GLuint)positionAttribute, 4, GL_FLOAT, GL_FALSE, 8*sizeof(GLfloat), 0);
glVertexAttribPointer((GLuint)colourAttribute , 4, GL_FLOAT, GL_FALSE, 8*sizeof(GLfloat), (char*)0+4*sizeof(GLfloat));
return self;
}
- (void)drawRect:(NSRect)dirtyRect
{
[super drawRect:dirtyRect];
glClearColor(0.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);
glUseProgram(shaderProgram);
GLfloat p[]={0,0};
glUniform2fv(positionUniform, 1, (const GLfloat *)&p);
glDrawArrays(GL_TRIANGLE_FAN, 0, 4);
[[self openGLContext] flushBuffer];
}
@end
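If you want the stripped-out error checks back, a typical compile-status check looks roughly like this (a sketch with a hypothetical helper name; the link status is checked the same way with glGetProgramiv after glLinkProgram):

static void CheckShaderCompiled(GLuint shader)
{
    // Query the compile status and, on failure, print the driver's info log.
    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    if (status != GL_TRUE)
    {
        GLint logLength = 0;
        glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &logLength);
        if (logLength > 0)
        {
            GLchar *log = (GLchar *)malloc(logLength);
            glGetShaderInfoLog(shader, logLength, NULL, log);
            NSLog(@"Shader compile log:\n%s", log);
            free(log);
        }
    }
}

You would call it right after each glCompileShader(vs) and glCompileShader(fs).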
https://stackoverflow.com/a/22502999/4946861
In Xcode 6.3.2 I got the example running after replacing
- (id)initWithFrame:(NSRect)frame
with
- (void)awakeFromNib
and replacing
self = [super initWithFrame:frame pixelFormat:pixelFormat];
with
super.pixelFormat = pixelFormat;
and deleting
return self;
or written out in detail:
#import "MyOpenGLView.h"
@implementation MyOpenGLView
- (void)awakeFromNib
{
NSLog(#"...was here");
// 1. Create a context with opengl pixel format
NSOpenGLPixelFormatAttribute pixelFormatAttributes[] =
{
NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
NSOpenGLPFAColorSize , 24 ,
NSOpenGLPFAAlphaSize , 8 ,
NSOpenGLPFADoubleBuffer ,
NSOpenGLPFAAccelerated ,
NSOpenGLPFANoRecovery ,
0
};
NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:pixelFormatAttributes];
super.pixelFormat=pixelFormat;
// 2. Make the context current
[[self openGLContext] makeCurrentContext];
// 3. Define and compile vertex and fragment shaders
GLuint vs;
GLuint fs;
const char *vss="#version 150\n\
uniform vec2 p;\
in vec4 position;\
in vec4 colour;\
out vec4 colourV;\
void main (void)\
{\
colourV = colour;\
gl_Position = vec4(p, 0.0, 0.0) + position;\
}";
const char *fss="#version 150\n\
in vec4 colourV;\
out vec4 fragColour;\
void main(void)\
{\
fragColour = colourV;\
}";
vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, 1, &vss, NULL);
glCompileShader(vs);
fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, 1, &fss, NULL);
glCompileShader(fs);
printf("vs: %i, fs: %i\n",vs,fs);
// 4. Attach the shaders
shaderProgram = glCreateProgram();
glAttachShader(shaderProgram, vs);
glAttachShader(shaderProgram, fs);
glBindFragDataLocation(shaderProgram, 0, "fragColour");
glLinkProgram(shaderProgram);
// 5. Get pointers to uniforms and attributes
positionUniform = glGetUniformLocation(shaderProgram, "p");
colourAttribute = glGetAttribLocation(shaderProgram, "colour");
positionAttribute = glGetAttribLocation(shaderProgram, "position");
glDeleteShader(vs);
glDeleteShader(fs);
printf("positionUniform: %i, colourAttribute: %i, positionAttribute: %i\n",positionUniform,colourAttribute,positionAttribute);
// 6. Upload vertices (1st four values in a row) and colours (following four values)
GLfloat vertexData[]= { -0.5,-0.5,0.0,1.0, 1.0,0.0,0.0,1.0,
-0.5, 0.5,0.0,1.0, 0.0,1.0,0.0,1.0,
0.5, 0.5,0.0,1.0, 0.0,0.0,1.0,1.0,
0.5,-0.5,0.0,1.0, 1.0,1.0,1.0,1.0};
glGenVertexArrays(1, &vertexArrayObject);
glBindVertexArray(vertexArrayObject);
glGenBuffers(1, &vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glBufferData(GL_ARRAY_BUFFER, 4*8*sizeof(GLfloat), vertexData, GL_STATIC_DRAW);
glEnableVertexAttribArray((GLuint)positionAttribute);
glEnableVertexAttribArray((GLuint)colourAttribute );
glVertexAttribPointer((GLuint)positionAttribute, 4, GL_FLOAT, GL_FALSE, 8*sizeof(GLfloat), 0);
glVertexAttribPointer((GLuint)colourAttribute , 4, GL_FLOAT, GL_FALSE, 8*sizeof(GLfloat), (char*)0+4*sizeof(GLfloat));
}
- (void)drawRect:(NSRect)dirtyRect {
[super drawRect:dirtyRect];
// Drawing code here.
glClearColor(0.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);
glUseProgram(shaderProgram);
GLfloat p[]={0,0};
glUniform2fv(positionUniform, 1, (const GLfloat *)&p);
glDrawArrays(GL_TRIANGLE_FAN, 0, 4);
[[self openGLContext] flushBuffer];
}
@end
I am trying to draw a very simple blue square on a red background in OpenGL, using an NSOpenGLView on Mountain Lion. The code is simple and should work; I'm assuming the problem is with how I set up the context.
Here is my GLView interface:
#import <Cocoa/Cocoa.h>
#import <OpenGL/gl.h>
#import <GLKit/GLKit.h>
typedef struct {
char *Name;
GLint Location;
} Uniform;
@interface MyOpenGLView : NSOpenGLView {
Uniform *_uniformArray;
int _uniformArraySize;
GLKMatrix4 _projectionMatrix;
GLKMatrix4 _modelViewMatrix;
IBOutlet NSWindow *window;
int height, width;
}
-(void)drawRect:(NSRect)bounds;
@end
The implementation:
GLfloat square[] = {
-0.5, -0.5,
0.5, -0.5,
-0.5, 0.5,
0.5, 0.5
};
- (id)initWithFrame:(NSRect)frame {
self = [super initWithFrame:frame];
if (self) {
}
return self;
}
-(void)awakeFromNib {
NSString *vertexShaderSource = [NSString stringWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"VertexShader" ofType:@"vsh"] encoding:NSUTF8StringEncoding error:nil];
const char *vertexShaderSourceCString = [vertexShaderSource cStringUsingEncoding:NSUTF8StringEncoding];
NSLog(#"%s",vertexShaderSourceCString);
NSString *fragmentShaderSource = [NSString stringWithContentsOfFile:[[NSBundle mainBundle]pathForResource:#"FragmentShader" ofType:#"fsh"] encoding:NSUTF8StringEncoding error:nil];
const char *fragmentShaderSourceCString = [fragmentShaderSource cStringUsingEncoding:NSUTF8StringEncoding];
NSLog(#"%s",fragmentShaderSourceCString);
NSOpenGLContext *glContext = [self openGLContext];
[glContext makeCurrentContext];
GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fragmentShader, 1, &fragmentShaderSourceCString, NULL);
glCompileShader(fragmentShader);
GLint compileSuccess;
glGetShaderiv(fragmentShader, GL_COMPILE_STATUS, &compileSuccess);
if (compileSuccess == GL_FALSE) {
GLint logLength;
glGetShaderiv(fragmentShader, GL_INFO_LOG_LENGTH, &logLength);
if(logLength > 0) {
GLchar *log = (GLchar *)malloc(logLength);
glGetShaderInfoLog(fragmentShader, logLength, &logLength, log);
NSLog(#"Shader compile log:\n%s", log);
free(log);
}
exit(1);
}
GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vertexShader, 1, &vertexShaderSourceCString, NULL);
glCompileShader(vertexShader);
GLuint program = glCreateProgram();
glAttachShader(program, fragmentShader);
glAttachShader(program, vertexShader);
glLinkProgram(program);
glUseProgram(program);
const char *aPositionCString = [@"a_position" cStringUsingEncoding:NSUTF8StringEncoding];
GLuint aPosition = glGetAttribLocation(program, aPositionCString);
glVertexAttribPointer(aPosition, 2, GL_FLOAT, GL_FALSE, 0, square);
glEnable(aPosition);
GLint maxUniformLength;
GLint numberOfUniforms;
char *uniformName;
glGetProgramiv(program, GL_ACTIVE_UNIFORMS, &numberOfUniforms);
glGetProgramiv(program, GL_ACTIVE_UNIFORM_MAX_LENGTH, &maxUniformLength);
_uniformArray = malloc(numberOfUniforms * sizeof(Uniform));
_uniformArraySize = numberOfUniforms;
for (int i =0; i <numberOfUniforms; i++) {
GLint size;
GLenum type;
GLint location;
uniformName = malloc(sizeof(char*)*maxUniformLength);
glGetActiveUniform(program, i, maxUniformLength, NULL, &size, &type, uniformName);
_uniformArray[i].Name = uniformName;
location = glGetUniformLocation(program, uniformName);
_uniformArray[i].Location = location;
}
_modelViewMatrix = GLKMatrix4MakeTranslation(0.0f, 0.0f, 0.0f);
_projectionMatrix = GLKMatrix4MakeOrtho(-1, 1, -1.5, 1.5, -1, 1);
}
- (void)drawRect:(NSRect)bounds {
glClearColor(1.0, 0.0, 0.0, 0.0);
glClear(GL_COLOR_BUFFER_BIT);
glViewport(0, 0, 480, 360);
for (int i = 0; i <_uniformArraySize; i++) {
if (strcmp(_uniformArray[i].Name, "ModelViewProjectionMatrix")==0) {
// Multiply the transformation matrices together
GLKMatrix4 modelViewProjectionMatrix = GLKMatrix4Multiply(_projectionMatrix, _modelViewMatrix);
glUniformMatrix4fv(_uniformArray[i].Location, 1, GL_FALSE, modelViewProjectionMatrix.m);
}
}
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
glFlush();
}
My very simple fragment shader:
void main() {
gl_FragColor = vec4(0.0, 0.0, 1.0, 1.0);
}
simple vertex shader:
attribute vec4 a_position;
uniform mat4 ModelViewProjectionMatrix;
void main() {
gl_Position = a_position * ModelViewProjectionMatrix;
}
I get no compilation errors, just a red screen and no blue square. Could someone please help me figure out what's wrong? I do get a warning, though, stating that gl.h and gl3.h are both included. I'd like to be using OpenGL 2.
Where is your view matrix (AKA your look-at matrix)? For my NSOpenGLView, I have at least three matrices: the model-view matrix (the multiplicative result of the rotation and translation matrices), the projection matrix (which takes into account your frustum scale, aspect ratio, near plane, and far plane), and the view matrix (which takes into account the viewer's eye position in the "world", the point of focus, and the direction of the up vector). I will say big ups on placing the code in awakeFromNib; that taught me something...
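For illustration, the three matrices I mean can be built with GLKit roughly like this (a sketch with made-up eye, angle and plane values, not code from the question):

static GLKMatrix4 BuildModelViewProjection(float angleRadians, float viewWidth, float viewHeight)
{
    // Model-view: the model's rotation followed by a translation.
    GLKMatrix4 modelViewMatrix = GLKMatrix4Multiply(
        GLKMatrix4MakeTranslation(0.0f, 0.0f, -2.0f),
        GLKMatrix4MakeRotation(angleRadians, 0.0f, 1.0f, 0.0f));

    // View (look-at): eye position, point of focus, and up vector.
    GLKMatrix4 viewMatrix = GLKMatrix4MakeLookAt(0.0f, 0.0f, 5.0f,   // eye position
                                                 0.0f, 0.0f, 0.0f,   // point of focus
                                                 0.0f, 1.0f, 0.0f);  // up direction

    // Projection: vertical field of view, aspect ratio, near plane, far plane.
    GLKMatrix4 projectionMatrix = GLKMatrix4MakePerspective(
        GLKMathDegreesToRadians(60.0f), viewWidth / viewHeight, 0.1f, 100.0f);

    // Combined matrix, applied in projection * view * model order.
    return GLKMatrix4Multiply(projectionMatrix,
                              GLKMatrix4Multiply(viewMatrix, modelViewMatrix));
}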
I've created 3 buffers to separate vertex position, colour and index data.
The vertices correctly render as a square but it's white instead of the colour defined in the array dynamicVertexData.
I'm using OpenGL ES 2.0, but I assume I'm making a general OpenGL mistake.
Can anyone spot it?
typedef struct _vertexStatic
{
GLfloat position[2];
} vertexStatic;
typedef struct _vertexDynamic
{
GLubyte color[4];
} vertexDynamic;
enum {
ATTRIB_POSITION,
ATTRIB_COLOR,
NUM_ATTRIBUTES
};
// Separate buffers for static and dynamic data.
GLuint staticBuffer;
GLuint dynamicBuffer;
GLuint indexBuffer;
const vertexStatic staticVertexData[] = {
{0, 0},
{50, 0},
{50, 50},
{0, 50},
};
vertexDynamic dynamicVertexData[] = {
{0, 0, 255, 255},
{0, 0, 255, 255},
{0, 0, 255, 255},
{0, 0, 255, 255},
};
const GLubyte indices[] = {
0, 1, 2,
2, 3, 0,
};
- (void)setupGL {
CGSize screenSize = [UIApplication currentSize];
CGSize screenSizeHalved = CGSizeMake(screenSize.width/2, screenSize.height/2);
numIndices = sizeof(indices)/sizeof(indices[0]);
[EAGLContext setCurrentContext:self.context];
glEnable(GL_CULL_FACE); // Improves performance
self.effect = [[GLKBaseEffect alloc] init];
// The near and far plane are measured in units from the eye
self.effect.transform.projectionMatrix = GLKMatrix4MakeOrtho(-screenSizeHalved.width,
screenSizeHalved.width,
-screenSizeHalved.height,
screenSizeHalved.height,
0.0f, 1.0f);
self.preferredFramesPerSecond = 30;
CreateBuffers();
}
void CreateBuffers()
{
// Static position data
glGenBuffers(1, &staticBuffer);
glBindBuffer(GL_ARRAY_BUFFER, staticBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(staticVertexData), staticVertexData, GL_STATIC_DRAW);
// Dynamic color data
// While not shown here, the expectation is that the data in this buffer changes between frames.
glGenBuffers(1, &dynamicBuffer);
glBindBuffer(GL_ARRAY_BUFFER, dynamicBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(dynamicVertexData), dynamicVertexData, GL_DYNAMIC_DRAW);
// Static index data
glGenBuffers(1, &indexBuffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
}
void DrawModelUsingMultipleVertexBuffers()
{
glBindBuffer(GL_ARRAY_BUFFER, staticBuffer);
glVertexAttribPointer(ATTRIB_POSITION, 2, GL_FLOAT, GL_FALSE, sizeof(vertexStatic), 0);
glEnableVertexAttribArray(ATTRIB_POSITION);
glBindBuffer(GL_ARRAY_BUFFER, dynamicBuffer);
glVertexAttribPointer(ATTRIB_COLOR, 4, GL_UNSIGNED_BYTE, GL_TRUE, sizeof(vertexDynamic), 0);
glEnableVertexAttribArray(ATTRIB_COLOR);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
glDrawElements(GL_TRIANGLES, sizeof(indices)/sizeof(GLubyte), GL_UNSIGNED_BYTE, (void*)0);
}
- (void)tearDownGL {
[EAGLContext setCurrentContext:self.context];
glDeleteBuffers(1, &_vertexBuffer);
glDeleteBuffers(1, &_indexBuffer);
//glDeleteVertexArraysOES(1, &_vertexArray);
self.effect = nil;
}
- (void)viewDidLoad
{
[super viewDidLoad];
self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
if (!self.context) {
NSLog(#"Failed to create ES context");
}
GLKView *view = (GLKView *)self.view;
view.context = self.context;
// view.drawableMultisample = GLKViewDrawableMultisample4X; // Smoothes jagged lines. More processing/memory
view.drawableColorFormat = GLKViewDrawableColorFormatRGB565; // Lower colour range. Less processing/memory
[self setupGL];
}
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
glClearColor(0.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);
[self.effect prepareToDraw];
DrawModelUsingMultipleVertexBuffers();
}
@end
You've enabled and bound the vertex buffers to your ATTRIB_POSITION and ATTRIB_COLOR binding points, using the glVertexAttribPointer and glEnableVertexAttribArray entry points, but you haven't specified what to do with them.
OpenGL ES 2.0 removed most of the fixed-functionality rendering pipeline, so you will need to write a vertex shader that uses those vertex attributes. In 1.X you would have been able to use the glColorPointer entry point to feed vertex colours to the fixed-functionality pipeline.
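For reference, a minimal ES 2.0 shader pair that consumes the two attribute arrays could look like this (a sketch; the a_position, a_color and u_mvpMatrix names are made up, and the attributes would be bound to ATTRIB_POSITION and ATTRIB_COLOR with glBindAttribLocation before linking):

// Vertex shader: transform the 2D position and pass the per-vertex colour through.
attribute vec2 a_position;
attribute vec4 a_color;
uniform mat4 u_mvpMatrix;
varying lowp vec4 v_color;
void main()
{
    v_color = a_color;
    gl_Position = u_mvpMatrix * vec4(a_position, 0.0, 1.0);
}

// Fragment shader: output the interpolated colour.
varying lowp vec4 v_color;
void main()
{
    gl_FragColor = v_color;
}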
When you manage to get OpenGL ES 2.0 running (which can be hard when you are starting out) but you don't get the drawing you want, I definitely recommend running on a device, which enables extra features in Xcode for debugging OpenGL.
Then you can:
Go step by step through your cycle and draw calls, and see the color / depth buffer images refresh
See all the bound GL objects
See the contents of VAOs (you can see the actual data they point to, useful for finding missing data / pointers)
Edit your shader programs live on the GPU (GL bound objects -> program: double-click!), useful for polishing your shaders
This can also be very useful, if you're curious, for getting an insight into GLKit's GLKBaseEffect inner workings: it really just generates an OpenGL program whose vertex and fragment shader code depends on which properties you set...
The property you forgot is GLKBaseEffect's colorMaterialEnabled.
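That is, something along these lines in setupGL, before prepareToDraw is called (a sketch):

// Enable colour-material so the effect uses the per-vertex colour attribute
// instead of its constant / material colour.
self.effect = [[GLKBaseEffect alloc] init];
self.effect.colorMaterialEnabled = GL_TRUE;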
I am trying to create a GLSL example on the Mac, setting up one color attribute per vertex. However, when the program runs, I just get a purple screen (the purple comes from glClearColor). I have posted the relevant code snippets below.
-(void)drawRect:(NSRect)dirtyRect
{
// get program ID for shader program
GLuint programID = [self loadShaders];
// get new dimensions
NSSize dim = [self frame].size;
// clear the background with color
glClearColor(0.4f, 0.1f, 0.7f, 1.0f);
glDepthRange(1.0, -1.0);
glViewport(0, 0, dim.width, dim.height);
glClear(GL_COLOR_BUFFER_BIT);
// generate a buffer for our triangle
glGenBuffers(2, vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer[0]);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer[1]);
glBufferData(GL_ARRAY_BUFFER, sizeof(colors), colors, GL_STATIC_DRAW);
glUseProgram(programID);
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer[0]);
glEnableVertexAttribArray(0);
glVertexAttribPointer(VERTEX_POS_INDEX, VERTEX_POS_SIZE, GL_FLOAT, GL_FALSE, VERTEX_POS_SIZE*sizeof(vertices), vertices);
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer[1]);
glEnableVertexAttribArray(1);
glVertexAttribPointer(VERTEX_COLOR_INDEX, VERTEX_COLOR_SIZE, GL_FLOAT, GL_FALSE, VERTEX_POS_SIZE*sizeof(colors), colors);
glDrawArrays(GL_TRIANGLES, 0, 3);
glDisableVertexAttribArray(0);
glDisableVertexAttribArray(1);
glUseProgram(0);
// flush buffer
glFlush();
[[self openGLContext] flushBuffer];
}
And then the shader-loading code:
-(GLuint)loadShaders
{
printf("GLSL version %s\n", glGetString(GL_SHADING_LANGUAGE_VERSION));
printf("GL Version: %s", glGetString(GL_VERSION));
// Create the shaders
GLuint vertexShaderID = glCreateShader(GL_VERTEX_SHADER);
GLuint fragmentShaderID = glCreateShader(GL_FRAGMENT_SHADER);
// get handle for app bundle
NSBundle *appBundle = [NSBundle mainBundle];
// get the path for the vertex shader file
NSString *vertexFilePath = [appBundle pathForResource:vertexShaderFile ofType:nil];
// get the path for the fragment shader file
NSString *fragmentFilePath = [appBundle pathForResource:fragmentShaderFile ofType:nil];
// get the contents of the vertex shader file into a string
NSString *vertexFileContents = [NSString stringWithContentsOfFile:vertexFilePath encoding:NSUTF8StringEncoding error:NULL];
NSLog(#"%#", vertexFileContents);
// get the contents of the fragment shader file into a string
NSString *fragmentFileContents = [NSString stringWithContentsOfFile:fragmentFilePath encoding:NSUTF8StringEncoding error:NULL];
NSLog(#"%#", fragmentFileContents);
GLint Result = GL_FALSE;
int infoLogLength;
// get a pointer the vertex shader program source, compile shader program
const char *vertexSourcePointer = [vertexFileContents UTF8String];
glShaderSource(vertexShaderID, 1, &vertexSourcePointer, NULL);
glCompileShader(vertexShaderID);
// check the vertex shader
glGetShaderiv(vertexShaderID, GL_COMPILE_STATUS, &Result);
glGetShaderiv(vertexShaderID, GL_INFO_LOG_LENGTH, &infoLogLength);
char vertexShaderErrorMessage[infoLogLength];
glGetShaderInfoLog(vertexShaderID, infoLogLength, NULL, vertexShaderErrorMessage);
// print error message
NSLog(#"%#", [NSString stringWithUTF8String:vertexShaderErrorMessage]);
// get a pointer to the fragment shader program source, compile shader program
const char *fragmentSourcePointer = [fragmentFileContents UTF8String];
glShaderSource(fragmentShaderID, 1, &fragmentSourcePointer, NULL);
glCompileShader(fragmentShaderID);
// check the fragment shader
glGetShaderiv(fragmentShaderID, GL_COMPILE_STATUS, &Result);
glGetShaderiv(fragmentShaderID, GL_INFO_LOG_LENGTH, &infoLogLength);
char fragmentShaderErrorMessage[infoLogLength];
glGetShaderInfoLog(fragmentShaderID, infoLogLength, NULL, fragmentShaderErrorMessage);
// print error message
NSLog(#"%#", [NSString stringWithUTF8String:fragmentShaderErrorMessage]);
// link the program
NSLog(#"Linking program...");
GLuint programID = glCreateProgram();
glAttachShader(programID, vertexShaderID);
glAttachShader(programID, fragmentShaderID);
glBindAttribLocation(programID, 0, "position");
glBindAttribLocation(programID, 1, "inColor");
glLinkProgram(programID);
// check the program
glGetProgramiv(programID, GL_LINK_STATUS, &Result);
glGetProgramiv(programID, GL_INFO_LOG_LENGTH, &infoLogLength);
char shaderProgramErrorMessage[max(infoLogLength, (int)1)];
glGetProgramInfoLog(programID, infoLogLength, NULL, shaderProgramErrorMessage);
// print error message
NSLog(@"%@", [NSString stringWithUTF8String:shaderProgramErrorMessage]);
// delete shaders
glDeleteShader(vertexShaderID);
glDeleteShader(fragmentShaderID);
return programID;
}
And finally the shaders:
#version 120
attribute vec4 position;
attribute vec4 inColor;
varying vec4 outColor;
void main()
{
gl_Position = position;
outColor = inColor;
}
And
#version 120
varying vec4 outColor;
void main()
{
gl_FragColor = outColor;
}
Looks fairly good, though I do see one mistake.
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer[0]);
glEnableVertexAttribArray(0);
glVertexAttribPointer(VERTEX_POS_INDEX,
VERTEX_POS_SIZE,
GL_FLOAT,
GL_FALSE,
VERTEX_POS_SIZE*sizeof(vertices),
vertices);
If you are using a VBO, then the final argument to glVertexAttribPointer should be the offset from the start of the currently bound buffer.
If you were using client-side vertex arrays (no glBindBuffer), then you would point to vertices with that final argument.
However, since you've already uploaded vertices to a buffer and bound it, the final argument of glVertexAttribPointer should be the offset from the start of that buffer, which in your case is 0.
The same mistake applies to the colors buffer.
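In other words, something along these lines (a sketch keeping the question's constants; stride 0 works because each attribute lives in its own tightly packed buffer, and VERTEX_POS_INDEX / VERTEX_COLOR_INDEX are assumed to be the indices 0 and 1 bound with glBindAttribLocation):

// The positions already live in vertexBuffer[0], so pass a byte offset (0),
// not the client-memory pointer `vertices`.
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer[0]);
glEnableVertexAttribArray(VERTEX_POS_INDEX);
glVertexAttribPointer(VERTEX_POS_INDEX, VERTEX_POS_SIZE, GL_FLOAT, GL_FALSE, 0, (const GLvoid *)0);

// Same idea for the colour buffer.
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer[1]);
glEnableVertexAttribArray(VERTEX_COLOR_INDEX);
glVertexAttribPointer(VERTEX_COLOR_INDEX, VERTEX_COLOR_SIZE, GL_FLOAT, GL_FALSE, 0, (const GLvoid *)0);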