OpenGL + VC++ - lighting randomly messed up in release configuration - visual-studio

So, I'm almost finished with my little program. The problem is that the game should look like this:
...but it sometimes looks like this:
This never happens in Debug configuration, only in Release. I'm using VS 2015.
I set up my lights like this:
GLfloat lightPos[] = { 0, 20, 0 };
glEnable(GL_NORMALIZE);
glLightfv(GL_LIGHT0, GL_POSITION, lightPos);
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);
The ball and the playing field are located at (0, 0, 0) and (0, -1, 0), respectively. Does anyone know what's causing this? Does the game retain something from the last run that messes with the settings?
The whole project is pretty large by now, so I didn't include all of the code, but I can provide more information if you need it.

GL_POSITION takes homogeneous coordinates: glLightfv reads four floats, so with a three-element array it reads past the end of lightPos, which is undefined behavior (Debug and Release builds initialize the surrounding memory differently, hence the inconsistent results). It should be:
GLfloat lightPos[] = { 0, 20, 0, 1 };
or
GLfloat lightPos[] = { 0, 20, 0, 0 };
depending on whether you want a point light (w = 1) or a directional light (w = 0).
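For intuition, the effect of the w component can be sketched outside of GL entirely. The helper below is hypothetical (it is not part of any OpenGL API); it just mimics how the fixed-function pipeline interprets the fourth component of GL_POSITION:

```javascript
// Hypothetical helper illustrating how fixed-function GL interprets
// the fourth (w) component of GL_POSITION. Not a real GL call.
function interpretLightPos(lightPos) {
  const [x, y, z, w] = lightPos;
  if (w === 0) {
    // w == 0: a directional light; (x, y, z) is the direction toward the light
    return { type: "directional", dir: [x, y, z] };
  }
  // w != 0: a point light at the homogeneous position (x/w, y/w, z/w)
  return { type: "point", pos: [x / w, y / w, z / w] };
}

console.log(interpretLightPos([0, 20, 0, 1])); // point light at (0, 20, 0)
console.log(interpretLightPos([0, 20, 0, 0])); // directional light along +Y
```

With only three floats in the array, w is whatever happens to sit in memory after the array, so the light flips unpredictably between these two interpretations.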

Related

Orbiting a cube in WebGL with glMatrix

https://jsfiddle.net/sepoto/Ln7qvv7w/2/
I have a base set up to display a cube with different colored faces. What I am trying to do is set up a camera and apply combined X-axis and Y-axis rotations so that the cube spins around both axes concurrently. There seem to be some problems with the matrices I set up, as I can see the blue face doesn't look quite right. There are examples of how this is done using older versions of glMatrix, but the code in those examples no longer works because of changes to vec4 in the glMatrix library. Does anyone know how this can be done using the latest version of glMatrix? I have attached a CDN to the fiddle.
Thank you!
function drawScene() {
gl.viewport(0,0,gl.viewportWidth, gl.viewportHeight);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
mat4.ortho( mOrtho, -5, 5, 5, -5, 2, -200);
mat4.identity(mMove);
var rotMatrix = mat4.create();
mat4.identity(rotMatrix);
rotMatrix = mat4.fromYRotation(rotMatrix, yRot,rotMatrix);
rotMatrix = mat4.fromXRotation(rotMatrix, xRot,rotMatrix);
mat4.multiply(mMove, rotMatrix, mMove);
setMatrixUniforms();
gl.bindBuffer(gl.ARRAY_BUFFER, triangleVertexPositionBuffer);
gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, triangleVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, triangleColorBuffer);
gl.vertexAttribPointer(shaderProgram.vertexColorAttribute, triangleColorBuffer.itemSize, gl.FLOAT, false, 0, 0);
gl.drawArrays(gl.TRIANGLES, 0, triangleVertexPositionBuffer.numItems);
yRot += 0.01;
xRot += 0.01;
}
As the name says, fromYRotation() initializes a matrix to a given rotation. Hence, you need two temporary matrices for the partial rotations, which you can then combine:
var rotMatrix = mat4.create();
var rotMatrixX = mat4.create();
var rotMatrixY = mat4.create();
mat4.fromYRotation(rotMatrixY, yRot);
mat4.fromXRotation(rotMatrixX, xRot);
mat4.multiply(rotMatrix, rotMatrixY, rotMatrixX);
And the reason your blue face was behaving strangely is the missing depth test. Enable it in your initialization method:
gl.enable(gl.DEPTH_TEST);
You don't need three matrices:
// do allocations outside of the render loop
var rotMat = mat4.create();
// no need to set the matrix to identity:
// fromYRotation resets rotMat's contents anyway
mat4.fromYRotation(rotMat, yRot);
mat4.rotateX(rotMat, rotMat, xRot);
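The composition itself is plain matrix math: fromYRotation followed by rotateX builds M = Ry * Rx, which applies the X rotation first when transforming a point. A dependency-free sketch (not using glMatrix; function names are ours):

```javascript
// Minimal sketch of composing a Y rotation with an X rotation, as
// fromYRotation + rotateX does. Right-handed rotations, angles in radians.
function rotY([x, y, z], a) {
  const c = Math.cos(a), s = Math.sin(a);
  return [x * c + z * s, y, -x * s + z * c];
}
function rotX([x, y, z], a) {
  const c = Math.cos(a), s = Math.sin(a);
  return [x, y * c - z * s, y * s + z * c];
}
// M = Ry * Rx, so the X rotation is applied to the point first.
function rotateYX(v, yRot, xRot) {
  return rotY(rotX(v, xRot), yRot);
}

rotateYX([0, 0, 1], Math.PI / 2, 0); // ~ [1, 0, 0]
```

Both answers above build this same product, one via two temporaries and an explicit multiply, the other in place.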

using cache in a complex structure

I'm using easeljs to build a certain structure.
Inside that structure, there are many containers and shapes.
I ran across a problem where I needed to change the color of a certain element when the user hovers over it with the mouse. I managed to do it; however, there is a considerable delay before the color is drawn and before it returns to its original color, because the stage redraws itself.
I saw that I could use caching for this purpose, so I followed the example in the docs like this:
myShape.cache(150, 150, 100, 100, 1); However, nothing happens and I don't see the shape.
I have to say that the shape resides inside a container which is added to the stage.
Here's the relevant code:
var g = curShape.graphics.clone().clear();
g.beginFill("#2aa4eb");
g.drawRoundRect(0, 0, curShape.width, curShape.height, 1.5);
//g.drawRect(0, 0, curShape.width + 2, curShape.height + 2);
g.endFill();
g.endStroke();
var newShape= new createjs.Shape(g);
newShape.cache(150, 150, 100, 100, 2);
Any help would be appreciated
You are caching at x:150 and y:150, but you are drawing your shapes at 0,0. If your shape is smaller than 150x150, then it will be caching nothing. Change your cache to 0,0, and it should be fine.
Additionally, you are not providing the 5th parameter (corner radius) to the drawRoundRect call, which will make it fail. Here is a quick sample with a modified version of your code.
http://jsfiddle.net/LNXVg/
var stage = new createjs.Stage("canvas");
var g = new createjs.Graphics();
g.beginFill("#2aa4eb");
g.drawRoundRect(0, 0, 300, 200, 5);
var newShape = new createjs.Shape(g);
//newShape.cache(150, 150, 100, 100, 2);
newShape.cache(0, 0, 100, 100, 2);
stage.addChild(newShape);
stage.update();
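The cache rectangle just has to cover the shape's own drawing, in the shape's local coordinates. A plain helper (our own name, not part of EaselJS) that derives the cache() arguments from the drawn bounds plus optional padding makes the relationship explicit:

```javascript
// Hypothetical helper: compute arguments for EaselJS Shape.cache() from the
// local bounds of what was drawn, with optional padding around the edges.
function cacheArgsFor(bounds, scale = 1, pad = 0) {
  return [
    bounds.x - pad,       // cache x
    bounds.y - pad,       // cache y
    bounds.width + 2 * pad,  // cache width
    bounds.height + 2 * pad, // cache height
    scale,
  ];
}

// A round rect drawn at (0, 0) sized 300x200 must be cached from (0, 0),
// not from (150, 150):
const args = cacheArgsFor({ x: 0, y: 0, width: 300, height: 200 }, 2);
// myShape.cache(...args);
```

Caching at (150, 150) when the graphics sit at (0, 0) captures an empty region, which is why the shape disappeared.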

How to use a fragment shader to draw a sphere illusion in OpenGL ES?

I am using this simple function to draw a quad in 3D space that is facing the camera. Now, I want to use a fragment shader to draw the illusion of a sphere inside it. The problem is that I'm new to OpenGL ES, so I don't know how.
void draw_sphere(view_t view) {
set_gl_options(COURSE);
glPushMatrix();
{
glTranslatef(view.plyr_pos.x, view.plyr_pos.y, view.plyr_pos.z - 1.9);
#ifdef __APPLE__
#undef glEnableClientState
#undef glDisableClientState
#undef glVertexPointer
#undef glTexCoordPointer
#undef glDrawArrays
static const GLfloat vertices []=
{
0, 0, 0,
1, 0, 0,
1, 1, 0,
0, 1, 0,
0, 0, 0,
1, 1, 0
};
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertices);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 6);
glDisableClientState(GL_VERTEX_ARRAY);
#else
#endif
}
glPopMatrix();
}
More exactly, I want to achieve this:
There might be quite a few things you need to do to achieve this... The sphere drawn in the last image you posted is a result of lighting, shine and color. In general you need a shader that can process all that and works for any shape.
This specific case (like some others that can be described mathematically) can be drawn with a single quad, without even needing to push normal coordinates to the program. What you need to do is create the normal in the fragment shader: if you receive the vectors sphereCenter and fragmentPosition and a float sphereRadius, then sphereNormal is a vector such that
sphereNormal = (fragmentPosition-sphereCenter)/sphereRadius; //fragmentPosition and sphereCenter both have .z == .0 here
sphereNormal.z = -sqrt(1.0 - dot(sphereNormal, sphereNormal)); //only if(length(sphereNormal) < 1.0), i.e. the fragment falls inside the sphere's silhouette
and real sphere position:
spherePosition = sphereCenter + sphereNormal*sphereRadius;
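The reconstruction above can be checked outside a shader with plain JavaScript (vector math written out by hand; the names follow the pseudocode):

```javascript
// Reconstruct a sphere normal and surface position from a fragment position
// on the quad (z assumed 0), following the formulas above.
function sphereSurface(fragmentPosition, sphereCenter, sphereRadius) {
  const nx = (fragmentPosition[0] - sphereCenter[0]) / sphereRadius;
  const ny = (fragmentPosition[1] - sphereCenter[1]) / sphereRadius;
  const d2 = nx * nx + ny * ny;
  if (d2 >= 1.0) return null; // fragment lies outside the sphere's silhouette
  const nz = -Math.sqrt(1.0 - d2); // sphere bulges toward the viewer (-z)
  const sphereNormal = [nx, ny, nz];
  const spherePosition = [
    sphereCenter[0] + nx * sphereRadius,
    sphereCenter[1] + ny * sphereRadius,
    sphereCenter[2] + nz * sphereRadius,
  ];
  return { sphereNormal, spherePosition };
}

// At the center of the quad the normal points straight at the camera.
sphereSurface([0, 0], [0, 0, 0], 1); // normal [0, 0, -1], position [0, 0, -1]
```

Note that the reconstructed normal is always unit length, which is exactly what the lighting below requires.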
Now all you need to do is add your lighting. Static or not, it is most common to use some ambient factor, linear and square distance factors, and a shine factor:
color = ambient*materialColor; //apply ambient
vector fragmentToLight = lightPosition-spherePosition;
float lightDistance = length(fragmentToLight);
fragmentToLight = normalize(fragmentToLight); //can also just divide with light distance
float dotFactor = dot(sphereNormal, fragmentToLight); //the dot factor takes into account the angle between the light and the surface normal
if(dotFactor > .0) {
color += (materialColor*dotFactor)/(1.0 + lightDistance*linearFactor + lightDistance*lightDistance*squareFactor); //apply dot factor and distance factors (in many cases the distance factors are 0)
}
vector shineVector = (sphereNormal*(2.0*dotFactor)) - fragmentToLight; //this is a vector that is mirrored through the normal, it is a reflection vector
float shineFactor = dot(shineVector, normalize(cameraPosition-spherePosition)); //factor represents how strong is the light reflection towards the viewer
if(shineFactor > .0) {
color += materialColor*(shineFactor*shineFactor * shine); //or some other power than 2 (shineFactor*shineFactor)
}
This pattern for creating lights in a fragment shader is one of very many. If you don't like it or you can't make it work, I suggest you find another one on the web; otherwise I hope you will understand it and be able to play around with it.
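To experiment with the lighting terms above outside a shader, the same pattern can be written in plain JavaScript with a scalar material color. All helper names and factor values below are our own test choices, not part of any API:

```javascript
// Scalar-color sketch of the ambient + diffuse + specular pattern above.
const dot3 = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const sub3 = (a, b) => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const scale3 = (a, s) => [a[0] * s, a[1] * s, a[2] * s];
const norm3 = (a) => scale3(a, 1 / Math.sqrt(dot3(a, a)));

function shade(spherePosition, sphereNormal, lightPosition, cameraPosition,
               { materialColor, ambient, linearFactor, squareFactor, shine }) {
  let color = ambient * materialColor; // ambient term
  let toLight = sub3(lightPosition, spherePosition);
  const lightDistance = Math.sqrt(dot3(toLight, toLight));
  toLight = norm3(toLight);
  const dotFactor = dot3(sphereNormal, toLight); // angle between light and normal
  if (dotFactor > 0) {
    // diffuse term with linear and square distance attenuation
    color += (materialColor * dotFactor) /
      (1 + lightDistance * linearFactor + lightDistance ** 2 * squareFactor);
    // light direction mirrored through the normal (the reflection vector)
    const shineVector = sub3(scale3(sphereNormal, 2 * dotFactor), toLight);
    const shineFactor = dot3(shineVector, norm3(sub3(cameraPosition, spherePosition)));
    if (shineFactor > 0) color += materialColor * shineFactor ** 2 * shine;
  }
  return color;
}
```

For a surface point facing both the light and the camera head-on, the diffuse term is at its maximum and the specular term adds the full shine factor on top of the ambient base.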

Getting a Blank Screen when Setting a variable in Vertex Shader [closed]

Closed. This question needs debugging details. It is not currently accepting answers.
Edit the question to include desired behavior, a specific problem or error, and the shortest code necessary to reproduce the problem. This will help others answer the question.
Closed 3 years ago.
I've just finished creating a simple rectangle in OpenGL 3.2, and now I want to add lighting support. However, whenever I try to pass my normals to the fragment shader, nothing appears. If I comment out that line, it works perfectly again. What could be causing this? Nothing shows up in the error log.
Vertex Shader:
#version 150
in vec4 position;
in vec3 inNormal;
out vec3 varNormal;
uniform mat4 modelViewProjectionMatrix;
void main()
{
//varNormal = inNormal; //If I uncomment this line, nothing shows up
gl_Position = modelViewProjectionMatrix * position;
}
Fragment Shader:
#version 150
in vec3 varNormal;
out vec4 fragColor;
void main()
{
fragColor = vec4(1, 1, 1, 1);
}
And passing the normals:
GLuint posAttrib = 0;
GLuint normalAttrib = 1;
glBindAttribLocation(program, posAttrib, "position");
glBindAttribLocation(program, normalAttrib, "normalAttrib");
//Building the VAO's/VBO's
GLfloat posCoords[] =
{
-10, 0.0, -10,
-10, 0.0, 10,
10, 0.0, 10,
10, 0.0, -10,
};
GLfloat normalCoords[] =
{
0, 0, 1,
0, 0, 1,
0, 0, 1,
0, 0, 1
};
glGenVertexArrays(1, &vaoName);
glBindVertexArray(vaoName);
GLuint posBuffer;
glGenBuffers(1, &posBuffer);
glBindBuffer(GL_ARRAY_BUFFER, posBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(posCoords), posCoords, GL_STATIC_DRAW);
glEnableVertexAttribArray(posAttrib);
glVertexAttribPointer(posAttrib, 3, GL_FLOAT, GL_FALSE, 0, 0);
GLuint normalBuffer;
glGenBuffers(1, &normalBuffer);
glBindBuffer(GL_ARRAY_BUFFER, normalBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(normalCoords), normalCoords, GL_STATIC_DRAW);
glEnableVertexAttribArray(normalAttrib);
glVertexAttribPointer(normalAttrib, 3, GL_FLOAT, GL_FALSE, 0, 0);
I haven't tried putting all of my position and normal coords in a single VBO, but I'd prefer to not resort to that method.
Not sure if that's your actual code or a cut-and-paste, but calling glBindAttribLocation only takes effect after the next call to glLinkProgram.
If you're not linking the program after calling glBindAttribLocation, the bindings won't take effect, and your attributes may be given the wrong indices. That could explain why you get different behavior after uncommenting the normal line.
Probably the most bizarre fix for this, but it works.
First of all, make sure you know how the OpenGL Profiler works; there's a tutorial in the Apple docs.
Set a breakpoint before/after glDrawElements (or glDrawArray depending on what you're using)
Then look at your program's vertex attributes and make sure the locations are in order.
If they aren't, rearrange them.
From (for example):
enum
{
POSITION_ATTR = 0,
TEXTURE_ATTR=1,
NORMAL_ATTR=2,
};
To:
enum
{
NORMAL_ATTR = 0,
TEXTURE_ATTR=2,
POSITION_ATTR=1,
};
No idea how and why this is happening, but this is the solution to the problem.

Making a flat OpenGL surface to apply a texture to: strange behavior with my code (pics added for reference)

I have a very simple AR application in which I have a marker that I detect, then on this marker, I create an openGL flat surface and apply a texture to it. I'm using a simple square at the moment. The code looks like this:
#define NUM_SURFACE_OBJECT_VERTEX 12
#define NUM_SURFACE_OBJECT_INDEX 12
static const float surfaceVertices[] =
{
20, -20, 5,
20, 20, 5,
-20, 20, 5,
-20, -20, 5
};
static const float surfaceTexCoords[] =
{
1, 0,
1, 1,
0, 1,
0, 0
};
static const float surfaceNormals[] =
{
};
static const unsigned short surfaceIndices[] =
{
0, 1, 2,
2, 3, 0
};
and the code to draw it looks like this:
QCAR::Matrix44F modelViewProjection;
ShaderUtils::translatePoseMatrix(
0.0f, 0.0f, kObjectScale, &modelViewMatrix.data[0]);
ShaderUtils::scalePoseMatrix(
kObjectScale,
kObjectScale,
kObjectScale,
&modelViewMatrix.data[0]);
ShaderUtils::multiplyMatrix(
&projectionMatrix.data[0],
&modelViewMatrix.data[0],
&modelViewProjection.data[0]);
glUseProgram(shaderProgramID);
glVertexAttribPointer(
vertexHandle,
3,
GL_FLOAT,
GL_FALSE,
0,
(const GLvoid*)&surfaceVertices[0]);
glVertexAttribPointer(
textureCoordHandle,
2,
GL_FLOAT,
GL_FALSE,
0,
(const GLvoid*)&surfaceTexCoords[0]);
glEnableVertexAttribArray(vertexHandle);
glEnableVertexAttribArray(textureCoordHandle);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, [thisTexture textureID]);
glUniformMatrix4fv(
mvpMatrixHandle, 1, GL_FALSE, (const GLfloat*)&modelViewProjection.data[0]);
glDrawElements(
GL_TRIANGLES,
NUM_SURFACE_OBJECT_INDEX,
GL_UNSIGNED_SHORT,
(const GLvoid*)&surfaceIndices[0]);
ShaderUtils::checkGlError("EAGLView renderFrameQCAR");
And here is an array of test textures that I'm rotating through (actually, all the PNGs are the same; I just copied the file over multiple times. Each is 512x512 px):
const char* textureFilenames[] = {
"test.png", "test2.png", "test3.png", "test4.png", "test5.png",
"test6.png", "test7.png", "test8.png", "test8.png", "test10.png"}
My apologies if these aren't the correct code segments to show, but this is the draw section of the program. Essentially, this is the sample code I'm trying to work from; I've just replaced the teapot model that they use with a flat surface and textures.
So, some points to note:
For some reason, when I use the above code with only one to about five textures, everything works out fine. I can see the texture normally and can move the camera around to see different perspectives. I have a function that changes the texture every 10 seconds, and when using just a few textures, everything is still normal.
When using more than that, say rotating between 10 or more different textures, I begin to get strange behavior, like a plane jutting outwards from the screen.
As you may notice in my code above, I commented out anything that uses the normals. I actually don't know what the normals array would look like for a flat surface, so I just tried commenting all that code out.
I've uploaded some pics for reference:
This is a shot of the marker in the background (some wood chips) and the texture applied to the surface. Everything seems OK.
This is a shot taken at an angle. At this point, I'm only rotating between 5 textures in the code. Everything ok here too.
Now, here I'm trying to rotate 10 textures every 10 seconds. Even on the first texture, I get a strange plane that juts out like so:
Why might this be the case? The sample code isn't giving me any warnings when loading the textures, and I've even tried making the textures really small, but I still get the same behavior. Is there something wrong with how I've declared the vertices and indices? Is it because I've left out the normals?
Thank you!
Clarification
These pictures are taken from my iPad 2, which my AR application is running on. I am using my MacBook Air to display the marker of the wood chips, so my iPad recognizes the marker image being displayed on the MacBook Air and applies a texture on top of it.
It seems to be fixed by adding in the normal vectors!
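For reference, a flat surface uses the same unit normal at every vertex, and the normals array must have one entry per vertex of surfaceVertices. A sketch of generating such an array (the helper name is ours; the quad above faces the viewer along +z):

```javascript
// Hypothetical helper: repeat one unit normal for each vertex of a flat
// surface, producing an array shaped like the vertex array (3 floats each).
function flatNormals(normal, vertexCount) {
  const out = [];
  for (let i = 0; i < vertexCount; i++) out.push(...normal);
  return out;
}

// 4 vertices in surfaceVertices -> 12 floats, all (0, 0, 1)
const surfaceNormals = flatNormals([0, 0, 1], 4);
```

Leaving the normals array empty while the shader still reads a normal attribute means the attribute pointer reads from an undersized (here, zero-length) buffer, which plausibly explains the garbage geometry.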
