I am rendering a simple torus in WebGL. Rotating the vertices works fine, but I have a problem with the normals. When rotated around a single axis they keep the correct direction, but as the rotation around a second axis increases, the normals start rotating the wrong way, until one of the rotations reaches 180°; at that point the normals rotate in the complete opposite direction of what they should.
I assume the problem lies with the quaternion used for rotation, but I have not been able to determine what is wrong.
Here is a (slightly modified, but it still shows the problem) jsfiddle of my project: https://jsfiddle.net/dt509x8h/1/
In the HTML part of the fiddle there is a div containing all the data from the OBJ file I am reading to generate the torus (although a lower-resolution one).
vertex shader:
attribute vec4 aVertexPosition;
attribute vec3 aNormalDirection;
uniform mat4 uMVPMatrix;
uniform mat3 uNMatrix;
varying vec3 nrm;
void main(void) {
    gl_Position = uMVPMatrix * aVertexPosition;
    nrm = aNormalDirection * uNMatrix;
}
fragment shader:
varying vec3 nrm;
void main(void) {
    gl_FragColor = vec4(nrm, 1.0);
}
Updating the matrices (run when there has been input):
mat4.perspective(pMatrix, Math.PI*0.25, width/height, clipNear, clipFar); //This is actually not run on input, it is just here to show the creation of the perspective matrix
mat4.fromRotationTranslation(mvMatrix, rotation, position);
mat3.normalFromMat4(nMatrix, mvMatrix);
mat4.multiply(mvpMatrix, pMatrix, mvMatrix);
var uMVPMatrix = gl.getUniformLocation(shaderProgram, "uMVPMatrix");
var uNMatrix = gl.getUniformLocation(shaderProgram, "uNMatrix");
gl.uniformMatrix4fv(uMVPMatrix, false, mvpMatrix);
gl.uniformMatrix3fv(uNMatrix, false, nMatrix);
Creating the rotation quaternion (called when mouse has moved):
var d = vec3.fromValues(lastmousex-mousex, mousey-lastmousey, 0.0);
var l = vec3.length(d);
vec3.normalize(d,d);
var axis = vec3.cross(vec3.create(), d, [0,0,1]);
vec3.normalize(axis, axis);
var q = quat.setAxisAngle(quat.create(), axis, l*scale);
quat.multiply(rotation, q, rotation);
Rotating the torus only around the Y-axis, the normals point in the right directions.
Rotating the torus around two axes, the normals point all over the place.
I am using glMatrix v2.3.2 for all matrix and quaternion operations.
Update:
It seems that rotating only around the Z axis (by setting the input axis for quat.setAxisAngle explicitly to [0,0,1], or by using quat.rotateZ) also causes the normals to rotate in the opposite direction.
Zeroing the z-component of the axis does not help.
Update 2:
Rotating with quat.rotateX(q, q, l*scale); quat.rotateY(q, q, l*scale); quat.multiply(rotation, q, rotation); seems correct, but as soon as rotation around Z is introduced, the Z normals start to move around.
Using the difference in x or y mouse-values instead of l causes all normals to move, and so does using largely different scale-values for x and y.
Update 3: Changing the order of multiplication in the shader to uNMatrix * aNormalDirection causes the normals to always rotate the wrong way.
In my case, the problem was with how I loaded the data from the .obj file: I had inverted the z-position of all vertices, but the normals were generated from the non-inverted vertices.
Using non-inverted z-positions and flipping the normal-matrix multiplication fixed the issue.
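For reference, a minimal sketch of the corrected vertex shader under that fix, assuming the OBJ loader now keeps the original (non-inverted) z-positions:
attribute vec4 aVertexPosition;
attribute vec3 aNormalDirection;
uniform mat4 uMVPMatrix;
uniform mat3 uNMatrix;
varying vec3 nrm;
void main(void) {
    gl_Position = uMVPMatrix * aVertexPosition;
    // normal matrix applied from the left (the flipped order)
    nrm = uNMatrix * aNormalDirection;
}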
In a scenario where vertices are displaced in the vertex shader, how to retrieve their transformed positions in WebGL / Three.js?
Other questions here suggest writing the positions to a texture and then reading the pixels back, but the resulting values don't seem to be correct.
In the example below the position is passed to the fragment shader without any transformations:
// vertex shader
varying vec4 vOut;
void main() {
    gl_Position = vec4(position, 1.0);
    vOut = vec4(position, 1.0);
}
// fragment shader
varying vec4 vOut;
void main() {
    gl_FragColor = vOut;
}
When reading the output texture back, I would expect pixel[0].r to be identical to positions[0].x, but that is not the case.
Here is a jsfiddle showing the problem:
https://jsfiddle.net/brunoimbrizi/m0z8v25d/2/
What am I missing?
Solved. Quite a few things were wrong with the jsfiddle mentioned in the question.
width * height should be equal to the vertex count. A PlaneBufferGeometry with 4 by 4 segments results in 25 vertices. 3 by 3 results in 16. Always (w + 1) * (h + 1).
The positions in the vertex shader need a nudge of 1.0 / width.
The vertex shader needs to know about width and height; they can be passed in as uniforms.
Each vertex needs an attribute with its index so it can be correctly mapped.
Each position should be one pixel in the resulting texture.
The resulting texture should be drawn as gl.POINTS with gl_PointSize = 1.0.
Working jsfiddle: https://jsfiddle.net/brunoimbrizi/m0z8v25d/13/
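Putting those points together, here is a minimal vertex-shader sketch of the idea. The attribute and uniform names (aIndex, uWidth, uHeight) are assumptions rather than names from the fiddle, position is the attribute Three.js injects into a ShaderMaterial, and the nudge is written here as a half-pixel offset that centres each point in its own pixel:
attribute float aIndex;   // per-vertex index, 0 .. vertexCount - 1
uniform float uWidth;     // render-target width in pixels
uniform float uHeight;    // render-target height in pixels
varying vec4 vOut;

void main() {
    // which pixel of the output texture this vertex should land in
    float col = mod(aIndex, uWidth);
    float row = floor(aIndex / uWidth);

    // pixel centre in 0..1, then mapped to clip space -1..1
    vec2 pixel = (vec2(col, row) + 0.5) / vec2(uWidth, uHeight);
    gl_Position = vec4(pixel * 2.0 - 1.0, 0.0, 1.0);
    gl_PointSize = 1.0;

    // the value to store: here the (possibly displaced) vertex position
    vOut = vec4(position, 1.0);
}
The fragment shader from the question can stay as it is; it simply writes vOut into the render target, one pixel per vertex.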
You're not writing the vertices out correctly.
https://jsfiddle.net/ogawzpxL/
First off, you're clipping the geometry, so your vertices actually end up outside the view, and you see the middle of the quad without any vertices.
You can use the uv attribute to render the entire quad in the view.
gl_Position = vec4(uv * 2. - 1., 0., 1.);
Everything in the buffer represents some point on the quad. What seems to be tricky is that when you render, the pixel will sample right next to your vertex. In the fiddle I've applied an offset in world space by the amount it would be in pixel space, and it didn't really work.
The reason why it seems to work with points is that this is all probably wrong :) If you want to transform only the vertices, then you need to store them properly in the texture. You can use points for this, but ideally they wouldn't be spaced out so much. In your scenario, they would fill the first couple of rows of the texture (since it's much larger than it could be).
You might start running into problems as soon as you try to apply this to something other than PlaneGeometry, in which case the problem has to be broken down.
I draw the particles in my game as capsules (two GL_POINTS and two GL_TRIANGLES each). Everything is nicely batched so that I draw the triangles first, then the points (two draw calls total).
My problem is that in OpenGL ES you have to round GL_POINTS yourself, and I have been doing it like this in the fragment shader:
precision highp float;
varying float outColor;
vec3 hsv2rgb(vec3 c)
{
    vec4 K = vec4(1.0, 2.0 / 3.0, 1.0 / 3.0, 3.0);
    vec3 p = abs(fract(c.xxx + K.xyz) * 6.0 - K.www);
    return c.z * mix(K.xxx, clamp(p - K.xxx, 0.0, 1.0), c.y);
}
void main()
{
    vec2 circCoord = 2.0 * gl_PointCoord - 1.0;
    gl_FragColor = vec4(hsv2rgb(vec3(outColor / 360.0, 1.0, 1.0)), step(dot(circCoord, circCoord), 1.0));
}
The problem is that I also need depth buffering: because a particle is drawn in two separate draw calls, the z ordering is sometimes not correct from draw order alone.
Now that I have the depth buffer going, it and the rounded points are not mixing well: instead of rounded particles I get a black area around them. Any ideas?
Some extra notes:
I am on iOS OpenGL ES (tile-based deferred rendering, I believe).
Each particle is initially defined as two points: its current location and its location in the last frame. These two points are drawn with GL_POINTS later. The rectangle part is then made of two triangles, constructed by finding a vector perpendicular to the vector between the two points.
Also, the particles are already sorted in front-to-back order. Technically their z position is arbitrary; I just need them to be intact.
"Also the particles are already sorted in front to back order. Technically their z position is arbitrary, I just need them to be intact."
I suspect that's your problem.
Points are square. You can fiddle the blending to make them appear round, but the geometry (and hence the depth value) is still square. Things behind the point are being Z-failed by the corners which are outside of the coloured round region.
The only fix to this without changing your algorithm completely is either to use a triangle mesh rather than a point (so the actual geometry is round), or to discard fragments in the fragment shader for point pixels that are outside of the round region you want to keep.
Note that using discard in shaders can be relatively expensive, so check the performance of that approach ...
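A minimal sketch of the discard variant, based on the fragment shader from the question: the alpha-based rounding is replaced by an explicit discard, so the square corner fragments no longer write colour or depth.
precision highp float;
varying float outColor;

// hsv2rgb unchanged from the question's shader
vec3 hsv2rgb(vec3 c)
{
    vec4 K = vec4(1.0, 2.0 / 3.0, 1.0 / 3.0, 3.0);
    vec3 p = abs(fract(c.xxx + K.xyz) * 6.0 - K.www);
    return c.z * mix(K.xxx, clamp(p - K.xxx, 0.0, 1.0), c.y);
}

void main()
{
    vec2 circCoord = 2.0 * gl_PointCoord - 1.0;
    // corner fragments are thrown away entirely, so they no longer
    // Z-fail whatever is drawn behind the point
    if (dot(circCoord, circCoord) > 1.0) {
        discard;
    }
    gl_FragColor = vec4(hsv2rgb(vec3(outColor / 360.0, 1.0, 1.0)), 1.0);
}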
I'm trying to project an image into a cylindrical panorama. But first I need to get the pixel (or the color of the pixel) I'm going to draw, then do some math in the shaders with polar coordinates to get the new position of the pixel, and finally draw the pixel.
This way I'll be able to change the shape of the image from a polygon to whatever I want.
But I cannot find anything about this method (get the pixel first, then do the math and get the new position for the pixel).
Is there something like this, please?
OpenGL historically doesn't work that way around; it forward renders — from geometry to pixels — rather than backwards — from pixel to geometry.
The most natural way to achieve what you want to do is to calculate texture coordinates based on geometry, then render as usual. For a cylindrical mapping:
establish a mapping from cylindrical coordinates to texture coordinates;
with your actual geometry, imagine it placed within the cylinder, then from each vertex proceed along the normal until you intersect the cylinder. Use that location to determine the texture coordinate for the original vertex.
The latter is most easily and conveniently done within your vertex shader; it's a simple ray intersection test, with the attributes therefore being only vertex location and vertex normal, and the texture coordinate being a varying that is calculated purely from the location and normal.
Extemporaneously, something like:
// get intersection as if ray hits the circular region of the cylinder,
// i.e. where |(position + n*normal).xy| = 1
float planarLengthOfPosition = length(position.xy);
float planarLengthOfNormal = length(normal.xy);
float planarDistanceToPerimeter = 1.0 - planarLengthOfPosition;
vec3 circularIntersection = position +
(planarDistanceToPerimeter/planarLengthOfNormal)*normal;
// get intersection as if ray hits the bottom or top of the cylinder,
// i.e. where |(position + n*normal).z| = 1
float linearLengthOfPosition = abs(position.z);
float linearLengthOfNormal = abs(normal.z);
float linearDistanceToEdge = 1.0 - linearLengthOfPosition;
vec3 endIntersection = position +
(linearDistanceToEdge/linearLengthOfNormal)*normal;
// pick whichever of those was lesser
vec3 cylindricalIntersection = mix(circularIntersection,
endIntersection,
step(linearDistanceToEdge,
planarDistanceToPerimeter));
// ... do something to map cylindrical intersection to texture coordinates ...
textureCoordinateVarying =
coordinateFromCylindricalPosition(cylindricalIntersection);
A common implementation of coordinateFromCylindricalPosition might be as simple as return vec2(atan(cylindricalIntersection.y, cylindricalIntersection.x) / 6.28318530717959 + 0.5, cylindricalIntersection.z * 0.5 + 0.5);, which maps the angle and the height into the 0..1 range.
I want to map a texture in the form of a lower-right Euclidean triangle to a hyperbolic triangle on the Poincaré disk.
The texture (its top-left triangle is transparent and unused) is part of Escher's Circle Limit I.
My polygon is centred at the origin, which means that two of its edges are straight lines; in general, however, all three edges will be curves.
The centre of the polygon is the incentre of the Euclidean triangle formed by its vertices. I'm UV mapping the texture using its incentre, dividing the texture into the same number of faces as the polygon has and mapping each face onto the corresponding polygon face. However, the result does not look right.
If anybody thinks this is solvable using UV mapping I'd be happy to provide some example code; however, I'm beginning to think this might not be possible and I'll have to write my own shader functions.
UV mapping is a method of mapping a texture onto an OpenGL polygon. The texture is always sampled in Euclidean space using xy coordinates in the range of (0, 1).
To overlay your texture onto a triangle on a Poincaré disc, keep hold of the Euclidean coordinates in your vertices, and use these to sample the texture.
The following code is valid for OpenGL ES 3.0.
Vertex shader:
#version 300 es
//these should go from 0.0 to 1.0
in vec2 euclideanCoords;
in vec2 hyperbolicCoords;
out vec2 uv;
void main() {
    //set z = 0.0 and w = 1.0
    gl_Position = vec4(hyperbolicCoords, 0.0, 1.0);
    uv = euclideanCoords;
}
Fragment shader:
#version 300 es
uniform sampler2D escherImage;
in vec2 uv;
out vec4 colour;
void main() {
    colour = texture(escherImage, uv);
}
I'm trying to create a shader that converts FFT data (passed as a texture) to a bar graph and then maps it onto a circle in the center of the screen. Here is an image of what I'm trying to achieve: link to image
I experimented a bit with Shadertoy and came up with this shader: link to shadertoy
With all the complex shaders I saw on Shadertoy, I thought this should be doable with maths somehow.
Can anybody here give me a hint on how to do it?
It’s very doable — you just have to think about the ranges you’re sampling in. In your Shadertoy example, you have the following:
float r = length(uv);
float t = atan(uv.y, uv.x);
fragColor = vec4(texture2D(iChannel0, vec2(r, 0.1)));
So r is going to vary roughly from 0 to 1 (extending past 1 in the corners), and t, the angle of the uv vector, is going to vary from -π to π.
Currently, you're sampling your texture at (r, 0.1); in other words, every pixel of your output will come from the V position 10% down your source texture, varying only across it. The angle you're calculating for t isn't being used at all. What you want is for changes in the angle (t) to move across your texture in the U direction, and for changes in the distance from center (r) to move across the texture in the V direction. In other words, this:
float r = length(uv);
float t = atan(uv.y, uv.x) / 6.283 + 0.5; // remap atan's -pi..pi range to [0,1] - 6.283 = 2*pi
fragColor = vec4(texture2D(iChannel0, vec2(t, r)));
For the source texture you provided above, you may find your image appearing “inside out”, in which case you can subtract r from 1.0 to flip it.
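For example, a minimal sketch of that flip applied to the sampling line above:
fragColor = vec4(texture2D(iChannel0, vec2(t, 1.0 - r)));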