Depth Map is white - webgl - opengl-es

I am using shaders to draw a depth map of my scene. Here is my shader code:
Vertex shader:
// Attribute/uniform/varying declarations inferred from usage (omitted in the original)
attribute vec3 aVertexPosition;
attribute vec4 aVertexColor;
attribute vec2 aTextureCoord;
attribute float aPointSize, aHasTexture, aisdepth;
uniform mat4 uPMatrix, uMVMatrix;
varying vec4 vColor;
varying vec2 vTextureCoord;
varying float vHasTexture, visdepth;
void main(void) {
    gl_PointSize = aPointSize;
    gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
    vColor = aVertexColor;
    visdepth = aisdepth;
    vHasTexture = aHasTexture;
    if (aHasTexture > 0.5)
        vTextureCoord = aTextureCoord;
}
Fragment shader:
precision mediump float;
// Declarations inferred from usage (omitted in the original)
varying vec4 vColor;
varying vec2 vTextureCoord;
varying float vHasTexture, visdepth;
uniform sampler2D uTexture;
uniform float uTextureAlpha;
void main(void) {
    if (vHasTexture < 0.5 && visdepth < 0.5)
        gl_FragColor = vColor;
    if (vHasTexture > 0.5) {
        vec4 textureColor = texture2D(uTexture, vec2(vTextureCoord.s, vTextureCoord.t));
        gl_FragColor = vec4(textureColor.rgb, textureColor.a * uTextureAlpha);
    }
    if (visdepth > 0.5) {
        float ndcDepth = (2.0 * gl_FragCoord.z - gl_DepthRange.near - gl_DepthRange.far) /
                         (gl_DepthRange.far - gl_DepthRange.near);
        float clipDepth = ndcDepth / gl_FragCoord.w;
        gl_FragColor = vec4((clipDepth * 0.5) + 0.5);
    }
}
I used the following link as a reference for my calculations: draw the depth value in opengl using shaders
I am getting all white values, as shown below:
From the two images above, it is clearly seen that the points to the far right of the image are behind the others. This is not reflected in the image I downloaded. After calling drawArrays, I use the canvas toDataURL function to download the canvas data; the images are the result of that download. Does anyone know of any possible reasons for this?

For anyone who seeks an answer to this question, here's a little hint:
if you want to view the depth map, you have to linearize it.
float linearize_Z(float depth, float zNear, float zFar) {
    return (2.0 * zNear) / (zFar + zNear - depth * (zFar - zNear));
}
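For example, here is a minimal sketch of how that hint could be wired into the depth branch of the shader from the question. The uZNear and uZFar uniforms are assumptions (they must match the near and far planes of the projection matrix), and gl_FragCoord.z is remapped from window space [0, 1] to NDC [-1, 1] before linearizing:
precision mediump float;
uniform float uZNear; // assumed uniform: camera near plane
uniform float uZFar;  // assumed uniform: camera far plane

float linearize_Z(float depth, float zNear, float zFar) {
    return (2.0 * zNear) / (zFar + zNear - depth * (zFar - zNear));
}

void main(void) {
    float ndcDepth = gl_FragCoord.z * 2.0 - 1.0; // window-space depth -> NDC depth
    float linearDepth = linearize_Z(ndcDepth, uZNear, uZFar);
    gl_FragColor = vec4(vec3(linearDepth), 1.0);
}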

Related

Aspect Fit and Aspect Fill content mode with OpenGL ES 2.0

I need to add two new content modes to display my textures with OpenGL ES 2.0: "Aspect Fit" and "Aspect Fill".
Here's an image explaining the different content modes:
I already have the "Scale to fill" content mode, which I guess is the default behavior.
Here's my basic vertex shader code for textures:
precision highp float;
attribute vec4 vPosition;
attribute vec2 aTexCoordinate;
varying vec2 vTexCoordinate;
uniform mat4 uMVPMatrix;
void main() {
    vTexCoordinate = aTexCoordinate;
    gl_Position = uMVPMatrix * vPosition;
}
And here's my fragment shader for textures:
precision mediump float;
uniform vec4 vColor;
uniform sampler2D uTexture;
varying vec2 vTexCoordinate;
void main() {
    // premultiplied alpha
    vec4 texColor = texture2D(uTexture, vTexCoordinate);
    // Scale the texture RGB by the vertex color
    texColor.rgb *= vColor.rgb;
    // Scale the texture RGBA by the vertex alpha to reinstate premultiplication
    gl_FragColor = texColor * vColor.a;
    //gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}
For the "Aspect Fill" mode, I thought I could play with the texture coordinates to crop the image. But for the "Aspect Fit" mode, I don't have a clear idea of how to do it in the shaders, ideally with a red background color like in the screenshot.
You have to scale the texture coordinates. I recommend adding a uniform for scaling the texture:
uniform vec2 textureScale;
void main()
{
    vTexCoordinate = textureScale * (aTexCoordinate - 0.5) + 0.5;
    // [...]
}
Compute the aspect ratio of the texture:
aspect = textureHeight / textureWidth
Then textureScale has to be set depending on the mode:
"Scale To Fill": vec2(1.0, 1.0).
landscape "Aspect Fit": vec2(1.0, aspect).
landscape "Aspect Fill": vec2(1.0/aspect, 1.0).
portrait "Aspect Fit": vec2(1.0/aspect, 1.0).
portrait "Aspect Fill": vec2(1.0, aspect).
You have to set the texture wrap parameters (GL_TEXTURE_WRAP_S, GL_TEXTURE_WRAP_T) to GL_CLAMP_TO_EDGE. See glTexParameter.
Alternatively you can discard superfluous fragments in the fragment shader:
void main() {
    if (vTexCoordinate.x < 0.0 || vTexCoordinate.x > 1.0 ||
        vTexCoordinate.y < 0.0 || vTexCoordinate.y > 1.0) {
        discard;
    }
    // [...]
}
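Since the question also asks for a red background in "Aspect Fit" mode, a small variation of the discard approach is to output a solid color for the out-of-range fragments instead of discarding them. This is a sketch, not part of the original answer, reusing the uTexture and vTexCoordinate names from the question:
precision mediump float;
uniform sampler2D uTexture;
varying vec2 vTexCoordinate;

void main() {
    if (vTexCoordinate.x < 0.0 || vTexCoordinate.x > 1.0 ||
        vTexCoordinate.y < 0.0 || vTexCoordinate.y > 1.0) {
        // letterbox/pillarbox area: fill with red instead of discarding
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
    } else {
        gl_FragColor = texture2D(uTexture, vTexCoordinate);
    }
}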
Thanks to @Rabbid76 and his answer here, I managed to do it after adding this part to his answer:
float textureAspect = (float)textureSize.width / (float)textureSize.height;
float frameAspect = (float)frameSize.width / (float)frameSize.height;
float scaleX = 1, scaleY = 1;
float textureFrameRatio = textureAspect / frameAspect;
BOOL portraitTexture = textureAspect < 1;
BOOL portraitFrame = frameAspect < 1;

if (contentMode == AspectFill) {
    if (portraitFrame)
        scaleX = 1.f / textureFrameRatio;
    else
        scaleY = textureFrameRatio;
}
else if (contentMode == AspectFit) {
    if (portraitFrame)
        scaleY = textureFrameRatio;
    else
        scaleX = 1.f / textureFrameRatio;
}
Then I pass vec2(scaleX, scaleY) as the textureScale uniform.
A Metal fragment shader for making a texture aspect fill/fit within an expected size:
fragment float4 fragment_aspect_fitfill(
    VertexOut vertexIn [[stage_in]],
    texture2d<float, access::sample> sourceTexture [[texture(0)]],
    sampler sourceSampler [[sampler(0)]],
    constant float2 &size [[buffer(0)]],
    constant float &contentMode [[buffer(1)]])
{
    float2 uv = vertexIn.textureCoordinate;

    // Calculate the aspect ratio of both the source texture and the expected output
    float textureAspect = (float)sourceTexture.get_width() / (float)sourceTexture.get_height();
    float frameAspect = (float)size.x / (float)size.y;
    float scaleX = 1, scaleY = 1;
    float textureFrameRatio = textureAspect / frameAspect;
    bool portraitTexture = textureAspect < 1;
    bool portraitFrame = frameAspect < 1;

    // Content mode 0 is Aspect Fill, 1 is Aspect Fit
    if (contentMode == 0.0) {
        if (portraitFrame)
            scaleX = 1.f / textureFrameRatio;
        else
            scaleY = textureFrameRatio;
    } else if (contentMode == 1.0) {
        if (portraitFrame)
            scaleY = textureFrameRatio;
        else
            scaleX = 1.f / textureFrameRatio;
    }

    float2 textureScale = float2(scaleX, scaleY);
    float2 vTexCoordinate = textureScale * (uv - 0.5) + 0.5;
    return sourceTexture.sample(sourceSampler, vTexCoordinate);
}
Note: see MetalPetal for a better understanding of structs like VertexOut.
This fragment shader algorithm may not be perfect, but it works and has worked for me.

Showing Point Cloud Structure using Lighting in Three.js

I am generating a point cloud representing a rock using Three.js, but am facing a problem with visualizing its structure clearly. In the second screenshot below I would like to denote the topography of the rock more explicitly, such as the corner of the structure (shown better in the third screenshot), because I want to be able to maneuver around the rock and select different points. Some of my rocks are sparser (the structure is hard to see because the points are very far apart) and some are denser (the structure is hard to see from afar because the points are all mashed together, like the first screenshot, even when closer to the rock), and finding a generalized way to approach this problem has been difficult.
I posted about this problem before here, thinking that representing the 'depth' of the rock into the screen would suffice, but after attempting the proposed solution I still could not find a good way to represent the topography. Is there a way to add a light source that my shaders can pick up on? I want to see whether I can color the points differently based on their orientation to the source. Using different software, a friend was able to produce the image below; is there a way to simulate this in Three.js?
For context, I am using Points with a BufferGeometry and ShaderMaterial. Below is the shader code I currently have:
Vertex:
precision mediump float;
varying vec3 vColor;
attribute float alpha;
varying float vAlpha;
uniform float scale;
void main() {
    vAlpha = alpha;
    vColor = color;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    #ifdef USE_SIZEATTENUATION
        //bool isPerspective = ( projectionMatrix[ 2 ][ 3 ] == - 1.0 );
        //if ( isPerspective ) gl_PointSize *= ( scale / -mvPosition.z );
    #endif
    gl_PointSize = 2.0;
    gl_Position = projectionMatrix * mvPosition;
}
and
Fragment:
#ifdef GL_OES_standard_derivatives
#extension GL_OES_standard_derivatives : enable
#endif
precision mediump float;
varying vec3 vColor;
varying float vAlpha;
uniform vec2 u_depthRange;
float LinearizeDepth(float depth, float near, float far)
{
    float z = depth * 2.0 - 1.0; // back to NDC
    return (2.0 * near * far / (far + near - z * (far - near)) - near) / (far - near);
}
void main() {
    float r = 0.0, delta = 0.0, alpha = 1.0;
    vec2 cxy = 2.0 * gl_PointCoord.xy - 1.0;
    r = dot(cxy, cxy);
    float lineardepth = LinearizeDepth(gl_FragCoord.z, u_depthRange[0], u_depthRange[1]);
    if (r > 1.0) {
        discard;
    }
    // Reset back to 1.0 instead of using the linear depth computed above
    gl_FragColor = vec4(vColor, 1.0);
}
Thank you so much for your help!
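A common way to get shading like the friend's render without real surface normals is to treat each point as a tiny sphere impostor: derive a fake normal from gl_PointCoord and apply a simple Lambert term to the point color. The fragment shader below is only a sketch of that idea, not code from the question; the uLightDir uniform is an assumption:
precision mediump float;
varying vec3 vColor;
uniform vec3 uLightDir; // assumed uniform: light direction in view space

void main() {
    // map gl_PointCoord to [-1, 1] and keep only the unit disc, as in the question
    vec2 cxy = 2.0 * gl_PointCoord.xy - 1.0;
    float r2 = dot(cxy, cxy);
    if (r2 > 1.0) {
        discard;
    }
    // fake a sphere normal from the point-sprite coordinates
    vec3 normal = vec3(cxy.x, -cxy.y, sqrt(1.0 - r2));
    // Lambert term plus a small ambient floor so unlit sides stay visible
    float diffuse = max(dot(normal, normalize(uLightDir)), 0.0);
    gl_FragColor = vec4(vColor * (0.3 + 0.7 * diffuse), 1.0);
}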

GLSL sparking vertex shader

I am trying to tweak this ShaderToy example for vertices to create 'sparks' out of them. I have tried to play with gl_PointCoord and gl_FragCoord without any results. Maybe someone here could help me?
I need an effect similar to this animated gif:
uniform float time;
uniform vec2 mouse;
uniform vec2 resolution;

#define M_PI 3.1415926535897932384626433832795

float rand(vec2 co)
{
    return fract(sin(dot(co.xy, vec2(12.9898, 78.233))) * 43758.5453);
}

void main() {
    float size = 30.0;
    float prob = 0.95;
    vec2 pos = floor(1.0 / size * gl_FragCoord.xy);
    float color = 0.0;
    float starValue = rand(pos);
    if (starValue > prob)
    {
        vec2 center = size * pos + vec2(size, size) * 0.5;
        float t = 0.9 + sin(time + (starValue - prob) / (1.0 - prob) * 45.0);
        color = 1.0 - distance(gl_FragCoord.xy, center) / (0.5 * size);
        color = color * t / (abs(gl_FragCoord.y - center.y)) * t / (abs(gl_FragCoord.x - center.x));
    }
    else if (rand(gl_FragCoord.xy / resolution.xy) > 0.996)
    {
        float r = rand(gl_FragCoord.xy);
        color = r * (0.25 * sin(time * (r * 5.0) + 720.0 * r) + 0.75);
    }
    gl_FragColor = vec4(vec3(color), 1.0);
}
As I understand it, I have to play with vec2 pos, setting it to a vertex position.
You don't need to play with pos. Since the vertex shader runs once per vertex, there is no way to process per-pixel values there. However, you can do per-pixel processing using gl_PointCoord.
I can think of only two ways of changing the scale of a texture:
gl_PointSize in the vertex shader in OpenGL ES.
In the fragment shader, you can change the texture UV value, for example:
vec4 color = texture(texture0, ((gl_PointCoord - 0.5) * factor) + vec2(0.5));
If you don't want to use any texture but only per-pixel processing in the fragment shader, you can set the UV to ((gl_PointCoord - 0.5) * factor) + vec2(0.5) instead of the uv that is normally set as fragCoord.xy / iResolution.xy in Shadertoy.
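To make that concrete, here is a sketch of a point-sprite fragment shader that applies the twinkle branch of the ShaderToy code to each point, with gl_PointCoord standing in for the fragCoord.xy / iResolution.xy normalization. Only rand() and the twinkle formula come from the original; the rest is assumed scaffolding:
precision mediump float;
uniform float time; // same uniform as in the question's shader

float rand(vec2 co)
{
    return fract(sin(dot(co.xy, vec2(12.9898, 78.233))) * 43758.5453);
}

void main() {
    // gl_PointCoord is already normalized to [0, 1] across the sprite;
    // scale around the center with ((gl_PointCoord - 0.5) * factor) + 0.5 if needed
    vec2 uv = gl_PointCoord;
    float color = 0.0;
    float r = rand(uv);
    if (r > 0.996) {
        color = r * (0.25 * sin(time * (r * 5.0) + 720.0 * r) + 0.75);
    }
    gl_FragColor = vec4(vec3(color), 1.0);
}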

Three.js/Webgl vertex.y does not update

In an effort to learn vertex/fragment shaders, I decided to create a simple rain effect by updating the y position of a point in the vertex shader and resetting it back to animate through again, using a Three.js PointCloud. I got it to animate across the screen once, but it gets stuck after resetting the y position.
uniform float size;
uniform float delta;
attribute float opacity; // assumed custom attribute (used below but not declared in the original)
attribute float texture; // assumed custom attribute (used below but not declared in the original)
varying float vOpacity;
varying float vTexture;
void main() {
    vOpacity = opacity;
    vTexture = texture;
    gl_PointSize = 164.0;
    vec3 p = position;
    p.y -= delta * 50.0;
    vec4 mvPosition = modelViewMatrix * vec4(1.0 * p, 1.0);
    vec4 nPos = projectionMatrix * mvPosition;
    if (nPos.y < -200.0) {
        nPos.y = 100.0;
    }
    gl_Position = nPos;
}
Any ideas? Thanks
A shader does not change the vertex position permanently. That means
gl_Position = nPos;
will not propagate back to the position attribute of your geometry. The shader only runs on the graphics card and has no access to the browser's memory.
You can change your code to this:
nPos.y = mod(nPos.y, 300.0) - 200.0;
Now the y coordinate should change as you want it to (going from 100 to -200 and then back to 100).
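For completeness, a sketch of the question's vertex shader with the wrap in place of the if-reset (position, modelViewMatrix and projectionMatrix are provided by Three.js; opacity and texture are the question's custom attributes):
uniform float delta;
attribute float opacity;
attribute float texture;
varying float vOpacity;
varying float vTexture;

void main() {
    vOpacity = opacity;
    vTexture = texture;
    gl_PointSize = 164.0;
    vec3 p = position;
    p.y -= delta * 50.0;
    vec4 nPos = projectionMatrix * modelViewMatrix * vec4(p, 1.0);
    // wrap y so it cycles from 100 down to -200 and back, continuously
    nPos.y = mod(nPos.y, 300.0) - 200.0;
    gl_Position = nPos;
}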

2D topdown Water Ripple Effect (Fragment Shader)

So here's the code I found:
RippleSprite.cpp
void RippleEffectSprite::update(float delta) { // called per frame
    updateRippleParams();
    // TODO: improve
    float rippleSpeed = 0.25f;
    float maxRippleDistance = 1;
    m_rippleDistance += rippleSpeed * delta;
    m_rippleRange = (1 - m_rippleDistance / maxRippleDistance) * 0.02f;
    if (m_rippleDistance > maxRippleDistance) {
        updateRippleParams();
        unscheduleUpdate(); // stop updating
    }
}

void RippleEffectSprite::updateRippleParams() {
    getGLProgramState()->setUniformFloat("u_rippleDistance", m_rippleDistance);
    getGLProgramState()->setUniformFloat("u_rippleRange", m_rippleRange);
}
Fragment Shader
varying vec4 v_fragmentColor;
varying vec2 v_texCoord;
uniform float u_rippleDistance;
uniform float u_rippleRange;

float waveHeight(vec2 p) {
    float ampFactor = 2.0;
    float distFactor = 2.0;
    float dist = length(p);
    float delta = abs(u_rippleDistance - dist);
    if (delta <= u_rippleRange) {
        return cos((u_rippleDistance - dist) * distFactor) * (u_rippleRange - delta) * ampFactor;
    }
    else {
        return 0.0;
    }
}

void main() {
    vec2 p = v_texCoord - vec2(0.5, 0.5);
    vec2 normal = normalize(p);
    // offset texcoord along the radial direction
    vec2 v_texCoord2 = v_texCoord + normal * waveHeight(p);
    gl_FragColor = texture2D(CC_Texture0, v_texCoord2) * v_fragmentColor;
}
Now I'll try my best to describe it in English: when run, this creates a small circle (well, not really a circle, more like an oval) at the middle of the sprite, which then slowly expands outward, and the texture below gets distorted a bit, like a wave.
I've been reading about shaders for a week now and I understand how they work in general, but I don't understand this algorithm. Can anyone explain how it creates an oval and makes it expand slowly and evenly?
Here's the link to the tutorial: http://www.cocos.com/doc/tutorial/show?id=2121
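A hint at what the quoted code does: waveHeight() is non-zero only in the thin ring where abs(u_rippleDistance - length(p)) <= u_rippleRange, and update() grows u_rippleDistance every frame, so that ring slowly expands outward from the sprite's center. The ring reads as an oval rather than a circle because p is measured in texture coordinates, which do not account for the sprite's aspect ratio. Below is a sketch of an aspect-corrected main(), assuming a u_aspect uniform (sprite width / height) that the original does not have:
uniform float u_aspect; // assumed uniform: sprite width / height

void main() {
    vec2 p = v_texCoord - vec2(0.5, 0.5);
    p.x *= u_aspect; // measure radial distance in equal units on both axes
    vec2 normal = normalize(p);
    vec2 v_texCoord2 = v_texCoord + normal * waveHeight(p);
    gl_FragColor = texture2D(CC_Texture0, v_texCoord2) * v_fragmentColor;
}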
