Vertex Displacement Shader has radial distortion. How do I fix it? - opengl-es

I'm writing a vertex displacement shader. I successfully mapped the vertices of a plane to the brightness values of a video with a GLSL shader and Three.js, but the shader is displacing the vertices in a radial fashion, which might be appropriate for a sphere, but not for this plane: there is radial distortion coming from the center outward. How do I fix this radial distortion?
RuttEtraShader = {
    uniforms: {
        'tDiffuse': { type: 't', value: null },
        'opacity': { type: 'f', value: 1.0 }
    },
    vertexShader: [
        'uniform sampler2D tDiffuse;',
        'varying vec3 vColor;',
        'varying vec2 vUv;',
        'void main() {',
        '    vec4 newVertexPos;',
        '    vec4 dv;',
        '    float df;',
        '    vUv = uv;',
        '    dv = texture2D( tDiffuse, vUv.xy );',
        '    df = 1.33*dv.x + 1.33*dv.y + 16.33*dv.z;',
        '    newVertexPos = vec4( normalize(position) * df * 10.3, 0.0 ) + vec4( position, 1.0 );',
        '    vColor = vec3( dv.x, dv.y, dv.z );',
        '    gl_Position = projectionMatrix * modelViewMatrix * newVertexPos;',
        '}'
    ].join("\n"),
    fragmentShader: [
        'varying vec3 vColor;',
        'void main() {',
        '    gl_FragColor = vec4( vColor.rgb, 1.0 );',
        '}'
    ].join("\n")
};
texture = new THREE.Texture( video );
texture.minFilter = THREE.LinearFilter;
texture.magFilter = THREE.LinearFilter;
texture.format = THREE.RGBFormat;
texture.generateMipmaps = true;
videoMaterial = new THREE.ShaderMaterial( {
    uniforms: {
        'tDiffuse': { type: 't', value: texture }
    },
    vertexShader: RuttEtraShader.vertexShader,
    fragmentShader: RuttEtraShader.fragmentShader,
    depthWrite: false,
    depthTest: true,
    transparent: true,
    overdraw: false
} );
videoMaterial.renderToScreen = true;
geometry = new THREE.PlaneGeometry(720, 480, 720, 480);
geometry.overdraw = false;
geometry.dynamic = true;
mesh = new THREE.Mesh( geometry, videoMaterial );
mesh.position.x = 0;
mesh.position.y = 0;
mesh.position.z = 0;
mesh.visible = true;
scene.add( mesh );

Your displacement is radiating from the center of the screen because you are displacing along a direction vector taken relative to the point (0,0,0) in Normalized Device Coordinates (NDC).
Without redefining any of your matrices, you might be able to solve this simply by using newVertexPos = vec4( normalize(position - origin) * df * 10.3, 0.0 ) + vec4( position, 1.0 );, where origin is the point you want to radiate from. In NDC, vec3 origin = vec3(-1.0, -1.0, 0.0) would have everything radiate from the lower-left corner.
You will still have a very noticeable radial displacement if you only make this change, however. Another thing you might consider is non-uniform scaling of your displacement: instead of * df * 10.3, use * df * vec3(1.0, 1.0, 10.3). This makes the Z displacement much more pronounced.
You can mix and match both approaches to find what looks best. I suspect that increasing the scale of the Z displacement by itself would produce the results you are looking for, but it is still helpful to understand why the displacement radiates from the center of the screen.
I have a feeling your plane may be situated at Z=0; in that case normalize(position) has a zero Z component, so the Z displacement will always be 0 unless you move the plane backward or forward. Adding 1.0 to position.z moves it to the far plane, and subtracting 1.0 moves it to the near plane.
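To see what the two suggested changes actually do, here is the same math evaluated on the CPU in plain JavaScript. This is only an illustrative sketch of the GLSL line, not shader code; the function name and sample vectors are made up.

```javascript
// Mirrors: newVertexPos = vec4( normalize(position - origin) * df * scale, 0.0 ) + vec4( position, 1.0 )
function displace(position, df, origin, scale) {
  const d = position.map((p, i) => p - origin[i]); // direction from the chosen origin
  const len = Math.hypot(d[0], d[1], d[2]) || 1.0; // guard against a zero-length vector
  const dir = d.map(c => c / len);                 // normalize()
  return position.map((p, i) => p + dir[i] * df * scale[i]);
}

// Uniform scale (the original * df * 10.3) pushes the vertex outward in X/Y as well:
displace([100, 50, 0], 0.5, [0, 0, 0], [10.3, 10.3, 10.3]);
// Non-uniform scale vec3(1.0, 1.0, 10.3) leaves X/Y nearly untouched:
displace([100, 50, 0], 0.5, [0, 0, 0], [1.0, 1.0, 10.3]);
```

Moving `origin` changes which point the displacement radiates from, and the non-uniform scale confines most of the motion to the Z axis, which is the classic Rutt-Etra look.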

Related

Forge Viewer Autodesk v7 issues with recolouring THREE.BufferGeometry when using THREE.ShaderMaterial

EDIT: The Forge Viewer I'm using ships a customized version of Three.js release r71 (source), which is why I'm using outdated code. The current release of Three.js is r121.
I've created a THREE.Group() that contains various THREE.PointCloud(geometry, material) objects. One of the point clouds is composed of a THREE.BufferGeometry() and a THREE.ShaderMaterial().
When I add a colour attribute to the BufferGeometry, only red (1,0,0), white (1,1,1), or yellow (1,1,0) seem to work. This image is when I set the colour to (1,0,0). This image is when I set the colour to blue (0,0,1).
My question is: how do I resolve this? Is the issue in the shaders? Is the issue with how I build the BufferGeometry? Is it a bug? Thanks.
My shaders:
var vShader = `uniform float size;
varying vec3 vColor;
void main() {
    vColor = color;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_PointSize = size * ( size / (length(mvPosition.xyz) + 0.00001) );
    gl_Position = projectionMatrix * mvPosition;
}`
var fShader = `varying vec3 vColor;
uniform sampler2D sprite;
void main() {
    gl_FragColor = vec4( vColor, 1.0 ) * texture2D( sprite, gl_PointCoord );
    if (gl_FragColor.x < 0.2) discard;
}`
My material:
var materialForBuffers = new THREE.ShaderMaterial( {
    uniforms: {
        size: { type: 'f', value: this.pointSize },
        sprite: { type: 't', value: THREE.ImageUtils.loadTexture("../data/white.png") }
    },
    vertexShader: vShader,
    fragmentShader: fShader,
    transparent: true,
    vertexColors: true
} );
How the color is added:
const colors = new Float32Array( [ 1.0, 0.0, 0.0 ] );
geometryForBuffers.addAttribute('color', new THREE.BufferAttribute( colors, 3 ));
Link to code
It looks like you may already be using parts of that sample code but if not, please refer to https://github.com/petrbroz/forge-point-clouds/blob/develop/public/scripts/extensions/pointcloud.js (live demo https://forge-point-clouds.autodesk.io). This sample code uses the color geometry attribute already to specify colors of individual points.
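Independent of the Forge sample, two things in the posted code are worth checking. First, a BufferGeometry color attribute needs one RGB triple per vertex, not a single triple for the whole cloud. Second, the posted fragment shader discards any fragment whose red channel ends up below 0.2 (`if (gl_FragColor.x < 0.2) discard;`), and since `gl_FragColor = vec4(vColor, 1.0) * texture2D(sprite, ...)`, a pure blue vertex color (0,0,1) has red = 0 everywhere, so every fragment is discarded; red, white, and yellow are exactly the colors with a full red channel. A sketch of both points on the CPU (the point coordinates are made up for illustration):

```javascript
// One RGB triple per point, matching the length of the position attribute.
const positions = new Float32Array([0, 0, 0, 1, 0, 0, 0, 1, 0]); // 3 points
const pointCount = positions.length / 3;
const colors = new Float32Array(pointCount * 3);
for (let i = 0; i < pointCount; i++) {
  colors[i * 3 + 0] = 0.0; // r
  colors[i * 3 + 1] = 0.0; // g
  colors[i * 3 + 2] = 1.0; // b (blue)
}

// The discard test from the posted fragment shader, for a given sprite texel:
function isDiscarded(vertexColor, spriteTexel) {
  const red = vertexColor[0] * spriteTexel[0]; // gl_FragColor.x
  return red < 0.2;
}
```

If the discard is meant to clip the sprite's transparent corners, testing the texture's alpha (or luminance) instead of the tinted red channel would let non-red vertex colors through.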

Setting texture to geometry in Three.js

I have this geometry: Picture
I want to add the same effect, where the higher mountain areas get a snow texture and so on:
Texture splatting with Three.js
A little background on what information I pass to the shaders from Three.js:
// importing grass and snow textures:
var grassTexture = THREE.ImageUtils.loadTexture( 'images/grass-512.jpg' );
grassTexture.wrapS = grassTexture.wrapT = THREE.RepeatWrapping;
var snowTexture = THREE.ImageUtils.loadTexture( 'images/snow-512.jpg' );
snowTexture.wrapS = snowTexture.wrapT = THREE.RepeatWrapping;
this.customUniforms = {
    grassTexture: { value: grassTexture },
    snowTexture: { value: snowTexture },
};
var customMaterial = new THREE.ShaderMaterial({
    uniforms: this.customUniforms,
    side: THREE.DoubleSide,
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
});
// creating the mesh; geometry is the model in the picture.
mesh = new THREE.Mesh(geometry, customMaterial);
Vertex and fragment shaders:
// vertexShader:
varying vec2 vUV;
void main() {
    vUV = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
I get a fully red model with this:
// fragmentShader:
void main() {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
I want the higher areas textured with snowTexture and the lower areas with grassTexture.
uniform sampler2D grassTexture;
uniform sampler2D snowTexture;
varying vec2 vUV;
// Something like this?:
void main() {
    vec4 grass = texture2D( grassTexture, vUV );
    vec4 snow = texture2D( snowTexture, vUV );
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0) + grass + snow;
}
This is really not that hard to understand; let me walk you through the logic.
In your case you don't want to use a displacement map, so you need a varying in your vertex shader that maps each vertex's up-coordinate into [0, 1] and passes it to your fragment shader.
// vertexShader:
varying vec2 vUV;
varying float height;
void main() {
    vUV = uv;
    float maxPosition = 30.0; // this is an example value.
    height = max( 0.0, min( 1.0, position.y / maxPosition ) ); // assuming +y is up
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
Now you can access height from your fragmentShader and use that information to select where you want your transitions to occur.
uniform sampler2D grassTexture;
uniform sampler2D snowTexture;
varying vec2 vUV;
varying float height;
void main() {
    vec4 grass = (1.0 - smoothstep( 0.48, 0.52, height )) * texture2D( grassTexture, vUV );
    vec4 snow = (smoothstep( 0.48, 0.52, height ) - 0.0) * texture2D( snowTexture, vUV );
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0) + grass + snow;
}
The link provided uses the smoothstep function to make a gradual transition between the textures. We can create transitions using the following pattern: ( a - b ) * textureColor.
Here, a controls where the texture starts contributing to the fragment color, and b controls where it stops contributing.
In other words, your grass texture should already be contributing at the lowest heights, so we set a to the constant 1.0. It stops contributing around height 0.5, so we make b a smoothstep that fades in as the height approaches 0.5.
Your snow texture, on the other hand, only starts contributing around 0.5, so we make a a smoothstep that fades in as the height approaches 0.5. It never stops contributing above that, so we set b to 0.0.
Hope this clears things up for you.
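For intuition, here are the two blend weights from that fragment shader evaluated on the CPU. smoothstep is reimplemented in JavaScript with the same definition GLSL uses; this is a sketch for inspection, not shader code.

```javascript
// GLSL's smoothstep: clamp then cubic Hermite interpolation.
function smoothstep(e0, e1, x) {
  const t = Math.min(Math.max((x - e0) / (e1 - e0), 0.0), 1.0);
  return t * t * (3.0 - 2.0 * t);
}

// grass weight = 1.0 - smoothstep(0.48, 0.52, h); snow weight = smoothstep(0.48, 0.52, h) - 0.0
function blendWeights(height) {
  const snow = smoothstep(0.48, 0.52, height);
  return { grass: 1.0 - snow, snow: snow };
}

blendWeights(0.0); // all grass at the bottom
blendWeights(1.0); // all snow at the top
blendWeights(0.5); // an even mix in the middle of the 0.48-0.52 band
```

Because the two weights always sum to 1, the blended color never over- or under-saturates at the transition.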

Render only alpha values from texture

What I want to do is load a texture with only alpha values from a PNG picture, while using the color of the material to render the RGB. To give you some context, I use this for GPU picking to find sprites that are clicked on. This way I can know whether a sprite was clicked on or whether the user clicked on the transparent part of the sprite.
I tried using THREE.AlphaFormat as the format and I tried all the types, but what I get is a sprite with correct alpha where the color from the texture is combined with the color of the material.
Here is the code I have tried so far:
var type = THREE.UnsignedByteType;
var spriteMap = new THREE.TextureLoader().load( url );
spriteMap.format = THREE.AlphaFormat;
spriteMap.type = type;
var spriteMaterial = new THREE.SpriteMaterial( { map: spriteMap, color: idcolor.getHex() } );
var sprite = new THREE.Sprite( spriteMaterial );
sprite.position.set( this.position.x , this.position.y , this.position.z );
sprite.scale.set( this.scale.x , this.scale.y , this.scale.z );
Selection.GpuPicking.pickingScene.add( sprite );
Any ideas on how to achieve this?
three.js r.91
I didn't manage to do what I wanted by combining a texture and a material. My solution was to create a plane and add my own custom shaders reproducing the Sprite functionality. I copied the sprite shaders from the three.js library and removed the code I didn't need, since I only needed the correct alpha and a single visible color.
My code for creating a sprite with the color from the material and the alpha values from the texture:
// Create the position and scale you want for your sprite and provide the url to your texture
var spriteMap = new THREE.TextureLoader().load( url, function( texture ) {
    var geometry = new THREE.PlaneGeometry( 1.0, 1.0 );
    uniforms = {
        color: { type: "v3", value: color },
        map: { value: texture },
        opacity: { type: "f", value: 1.0 },
        alphaTest: { type: "f", value: 0.0 },
        scale: { type: "v3", value: scale }
    };
    var material = new THREE.ShaderMaterial( {
        uniforms: uniforms,
        vertexShader: vertexShader,     // input the custom shader here
        fragmentShader: fragmentShader, // input the custom shader here
        transparent: true,
    } );
    var mesh = new THREE.Mesh( geometry, material );
    mesh.position.set( position.x, position.y, position.z );
    mesh.scale.set( scale.x, scale.y, scale.z );
    scene.add( mesh );
} );
This is my vertex shader:
uniform vec3 scale;
varying vec2 vUV;
void main() {
    float rotation = 0.0;
    vUV = uv;
    vec3 alignedPosition = position * scale;
    vec2 rotatedPosition;
    rotatedPosition.x = cos( rotation ) * alignedPosition.x - sin( rotation ) * alignedPosition.y;
    rotatedPosition.y = sin( rotation ) * alignedPosition.x + cos( rotation ) * alignedPosition.y;
    vec4 mvPosition;
    mvPosition = modelViewMatrix * vec4( 0.0, 0.0, 0.0, 1.0 );
    mvPosition.xy += rotatedPosition;
    gl_Position = projectionMatrix * mvPosition;
}
And my fragment shader:
varying vec2 vUV;
uniform vec3 color;
uniform sampler2D map;
uniform float opacity;
uniform float alphaTest;
void main() {
    vec4 texel = texture2D( map, vUV );
    // show the color from the material, but use the alpha from the texture
    gl_FragColor = vec4( color, texel.a * opacity );
    if ( gl_FragColor.a < alphaTest ) discard;
}
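The key line is the gl_FragColor assignment: RGB comes only from the material's color uniform, and alpha comes only from the texture. A CPU sketch of that composite, with `null` standing in for GLSL's discard (names are illustrative):

```javascript
// Returns the final RGBA, or null to represent a discarded fragment.
function composite(materialColor, texel, opacity, alphaTest) {
  const alpha = texel[3] * opacity; // only the texture's alpha is used
  if (alpha < alphaTest) return null; // discard
  return [materialColor[0], materialColor[1], materialColor[2], alpha];
}

// Opaque texel: the picking color survives regardless of the texel's RGB.
composite([1, 0, 0], [0.2, 0.7, 0.3, 1.0], 1.0, 0.1);
// Fully transparent texel: discarded, so GPU picking ignores it.
composite([1, 0, 0], [0.2, 0.7, 0.3, 0.0], 1.0, 0.1);
```

This is exactly what GPU picking needs: the ID color is never tinted by the texture, and clicks on transparent pixels read the background instead of the sprite.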

How to increase line thickness in three.js edges geometry using shaders?

I'm trying to replicate the effect shown in this Three.js example, but instead of showing the wireframe on an opaque box, I'd like to show just the edges without any faces (like what is shown when using THREE.EdgesGeometry). I know that setting the linewidth property doesn't work and that using shaders is necessary, but I'm not really sure where to begin. For reference, these are the shaders being used in the Three.js example above:
Vertex Shader:
attribute vec3 center;
varying vec3 vCenter;
void main() {
    vCenter = center;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
Fragment Shader:
varying vec3 vCenter;
float edgeFactorTri() {
    vec3 d = fwidth( vCenter.xyz );
    vec3 a3 = smoothstep( vec3( 0.0 ), d * 1.5, vCenter.xyz );
    return min( min( a3.x, a3.y ), a3.z );
}
void main() {
    gl_FragColor.rgb = mix( vec3( 1.0 ), vec3( 0.2 ), edgeFactorTri() );
    gl_FragColor.a = 1.0;
}
I've gotten as far as figuring out that the factor d gets multiplied by (1.5 in the example) determines the thickness of the line. However, I'm completely lost as to how the vCenter variable is actually used (it's a vec3 that is either [1, 0, 0], [0, 1, 0] or [0, 0, 1]), and as to what I could use to make the THREE.EdgesGeometry render with thicker lines like in the example.
Here is what happens when I try rendering the edges geometry with these shaders:
<script type="x-shader/x-vertex" id="vertexShader">
    attribute vec3 center;
    varying vec3 vCenter;
    void main() {
        vCenter = center;
        gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }
</script>
<script type="x-shader/x-fragment" id="fragmentShader">
    varying vec3 vCenter;
    uniform float lineWidth;
    float edgeFactorTri() {
        float newWidth = lineWidth + 0.5;
        vec3 d = fwidth( vCenter.xyz );
        vec3 a3 = smoothstep( vec3( 0.0 ), d * newWidth, vCenter.xyz );
        return min( min( a3.x, a3.y ), a3.z );
    }
    void main() {
        gl_FragColor.rgb = mix( vec3( 1.0 ), vec3( 0.2 ), edgeFactorTri() );
        gl_FragColor.a = 1.0;
    }
</script>
Javascript:
size = 150
geometry = new THREE.BoxGeometry(size, size, size);
material = new THREE.MeshBasicMaterial({ wireframe: true });
mesh = new THREE.Mesh(geometry, material);
mesh.position.x = -150;
scene.add(mesh);
//
// geometry = new THREE.BufferGeometry().fromGeometry(new THREE.BoxGeometry(size, size, size));
geometry = new THREE.EdgesGeometry(new THREE.BoxGeometry(size, size, size));
setupAttributes(geometry);
material = new THREE.ShaderMaterial({
    uniforms: { lineWidth: { value: 10 } },
    vertexShader: document.getElementById("vertexShader").textContent,
    fragmentShader: document.getElementById("fragmentShader").textContent
});
material.extensions.derivatives = true;
mesh = new THREE.Mesh(geometry, material);
mesh.position.x = 150;
scene.add(mesh);
//
geometry = new THREE.BufferGeometry().fromGeometry(new THREE.SphereGeometry(size / 2, 32, 16));
setupAttributes(geometry);
material = new THREE.ShaderMaterial({
    uniforms: { lineWidth: { value: 1 } },
    vertexShader: document.getElementById("vertexShader").textContent,
    fragmentShader: document.getElementById("fragmentShader").textContent
});
material.extensions.derivatives = true;
mesh = new THREE.Mesh(geometry, material);
mesh.position.x = -150;
scene.add(mesh);
jsFiddle
As you can see in the fiddle, this is not what I'm looking for, but I don't have a good enough grasp on how the shaders work to know where I'm going wrong or if this approach would work for what I want.
I've looked into this answer but I'm not sure how to use it as a ShaderMaterial and I can't use it as a shader pass (here are the shaders he uses for his answer.)
I've also looked into THREE.MeshLine and this issue doesn't seem to have been resolved.
Any guidance would be greatly appreciated!
You want to modify this three.js example so the mesh is rendered as a thick wireframe.
The solution is to modify the shader and discard fragments in the center portion of each face -- that is, discard fragments not close to an edge.
You can do that like so:
void main() {
    float factor = edgeFactorTri();
    if ( factor > 0.8 ) discard; // cutoff value is somewhat arbitrary
    gl_FragColor.rgb = mix( vec3( 1.0 ), vec3( 0.2 ), factor );
    gl_FragColor.a = 1.0;
}
You can also set material.side = THREE.DoubleSide if you want.
updated fiddle: https://jsfiddle.net/vy0we5wb/4
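To demystify vCenter: it is set to (1,0,0), (0,1,0) or (0,0,1) at the three corners of each triangle, so after interpolation each component measures how close a fragment is to the edge opposite that corner. Here is edgeFactorTri evaluated on the CPU as a sketch, with fwidth() replaced by an illustrative constant, since screen-space derivatives only exist on the GPU:

```javascript
// GLSL's smoothstep: clamp then cubic Hermite interpolation.
function smoothstep(e0, e1, x) {
  const t = Math.min(Math.max((x - e0) / (e1 - e0), 0.0), 1.0);
  return t * t * (3.0 - 2.0 * t);
}

// d stands in for fwidth(vCenter); width is the 1.5 factor from the example.
function edgeFactor(vCenter, d = 0.05, width = 1.5) {
  const a3 = vCenter.map(c => smoothstep(0.0, d * width, c));
  return Math.min(a3[0], a3[1], a3[2]);
}

edgeFactor([0.0, 0.5, 0.5]);       // on an edge -> 0, drawn in the line color
edgeFactor([1 / 3, 1 / 3, 1 / 3]); // face center -> 1, cut away by the discard
```

A larger width widens the band of fragments whose minimum component is still inside the smoothstep ramp, which is why it controls the apparent line thickness.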
three.js r.89

Shadow won't update when geometry is changed using VertexShader

I'm using a custom shader to curve a plane. My custom shader extends the Lambert shader so it supports lights and shadows. It all works as expected, but when the vertexShader changes the geometry of the plane, the shadow doesn't update. Is there anything I'm missing to flag that the geometry has updated in my vertexShader and the shadow needs to change?
[Here is a screenshot of the problem. The plane is curved with a vertexShader, but the shadow doesn't update][1]
[1]: http://i.stack.imgur.com/6kfCF.png
Here is the demo/code: http://dev.cartelle.nl/curve/
If you drag the "bendAngle" slider you can see that the shadow doesn't update.
One work-around I considered was to get the bounding box of my curved plane, then use those points to create a new Mesh/Box and use that object to cast the shadow. But I wasn't sure how to get the coordinates of the new curved geometry: when I checked geometry.boundingBox after the shader was applied, it just gave me the original coordinates every time.
Thanks
Johnny
If you are modifying the geometry positions in the vertex shader, and you are casting shadows, you need to specify a custom depth material so the shadows will respond to the modified positions.
In your custom depth material's vertex shader, you modify the vertex positions in the same way you modified them in the material's vertex shader.
An example of a custom depth material can be seen in this three.js example (although the vertices are not modified in the vertex shader in that example; they are modified on the CPU).
In your case, you would create a vertex shader for the custom depth material using a pattern like so:
<script type="x-shader/x-vertex" id="vertexShaderDepth">
    uniform float bendAngle;
    uniform vec2 bounds;
    uniform float bendOffset;
    uniform float bendAxisAngle;
    vec3 bendIt( vec3 ip, float ba, vec2 b, float o, float a ) {
        // your code here
        return ip;
    }
    void main() {
        vec3 p = bendIt( position, bendAngle, bounds, bendOffset, bendAxisAngle );
        vec4 mvPosition = modelViewMatrix * vec4( p, 1.0 );
        gl_Position = projectionMatrix * mvPosition;
    }
</script>
And fragment shader like this:
<script type="x-shader/x-fragment" id="fragmentShaderDepth">
    vec4 pack_depth( const in float depth ) {
        const vec4 bit_shift = vec4( 256.0 * 256.0 * 256.0, 256.0 * 256.0, 256.0, 1.0 );
        const vec4 bit_mask = vec4( 0.0, 1.0 / 256.0, 1.0 / 256.0, 1.0 / 256.0 );
        vec4 res = fract( depth * bit_shift );
        res -= res.xxyz * bit_mask;
        return res;
    }
    void main() {
        gl_FragData[ 0 ] = pack_depth( gl_FragCoord.z );
    }
</script>
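pack_depth spreads a single [0, 1) depth value across the four 8-bit RGBA channels, the standard trick when the render target has no float support. Here is a CPU sketch of the packing, together with the usual matching unpack, shown only to illustrate that the scheme round-trips (the function names are my own):

```javascript
// Mirrors the GLSL: res = fract( depth * bit_shift ); res -= res.xxyz * bit_mask;
function packDepth(depth) {
  const shift = [256 * 256 * 256, 256 * 256, 256, 1];
  const res = shift.map(s => (depth * s) % 1); // fract()
  // subtract the bits already carried by the higher channel
  return res.map((r, i) => (i === 0 ? r : r - res[i - 1] / 256));
}

// The counterpart: a dot product with the inverse shifts recovers the depth.
function unpackDepth(rgba) {
  const w = [1 / (256 * 256 * 256), 1 / (256 * 256), 1 / 256, 1];
  return rgba.reduce((sum, v, i) => sum + v * w[i], 0);
}

unpackDepth(packDepth(0.75)); // round-trips the original depth
```

The subtraction telescopes exactly against the unpack weights, which is why the depth survives the trip through four clamped channels.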
Then in your javascript, you specify the custom depth material:
uniforms = {};
uniforms.bendAngle = { type: "f", value: properties.bendAngle };
uniforms.bendOffset = { type: "f", value: properties.offset };
uniforms.bendAxisAngle = { type: "f", value: properties.bendAxisAngle };
uniforms.bounds = { type: "v2", value: new THREE.Vector2( -8, 16 ) };
var vertexShader = document.getElementById( 'vertexShaderDepth' ).textContent;
var fragmentShader = document.getElementById( 'fragmentShaderDepth' ).textContent;
myObject.customDepthMaterial = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: vertexShader,
    fragmentShader: fragmentShader
} );
three.js r.74