Setting up WebGL shaders on Mac, Three.js, & vite-plugin-glsl - macos

I am working on learning shaders through Three.js, but I am having a bit of trouble getting the setup to work. I am using vite-plugin-glsl for my shaders, which I have set up. At first I tried following along with some more advanced videos, but the glsl/frag/vert files didn't seem to work, so I found a video that brought it down to the basics. Thankfully I can get the shader to render and change color, but it looks like my vertex shader does not want to work. Originally I placed the shaders in separate GLSL files, but that was giving me more problems, so I opted to embed them inside my JS file. Here is my current basic project:
import * as THREE from 'three';
import { OrbitControls } from "three/addons/controls/OrbitControls.js";
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight)
camera.position.set(-10, 10, -1)
const renderer = new THREE.WebGLRenderer();
renderer.setPixelRatio(window.devicePixelRatio);
renderer.setSize(window.innerWidth, window.innerHeight);
const canvas = document.body;
canvas.appendChild(renderer.domElement);
const controls = new OrbitControls(camera, canvas);
// Lights
const ambientLights = new THREE.AmbientLight(0xffffff, 0.6);
scene.add(ambientLights)
const directionalLight = new THREE.DirectionalLight('#ffffff', 1);
directionalLight.castShadow = true;
directionalLight.receiveShadow = true;
directionalLight.shadow.mapSize.set(window.innerWidth, window.innerHeight);
directionalLight.shadow.camera.far = 0.01;
directionalLight.shadow.normalBias = 1.05;
directionalLight.position.set(200, 400, 10.25);
scene.add(directionalLight);
const shaderMaterial = new THREE.RawShaderMaterial({
vertexShader: `
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
uniform float uFrequency;
uniform float uAmplitude;
attribute vec3 position;
void main() {
vec4 modelPosition = modelMatrix * vec4(position, 1.0);
// anything I place in here doesn't update
modelPosition.y += sin(modelPosition.x * uFrequency) * uAmplitude;
vec4 viewPosition = viewMatrix * modelPosition;
gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
}
`,
fragmentShader: `
precision mediump float;
void main() {
gl_FragColor = vec4(0.0, 0.5, 0.5, 1.0);
}
`,
wireframe: true,
side: THREE.DoubleSide,
uniforms: {
uFrequency: { value: 10.0 },
uAmplitude: { value: 0.1 }
}
})
const plane = new THREE.Mesh(
new THREE.PlaneGeometry(10, 10, 10, 10),
shaderMaterial
)
plane.rotation.set(-Math.PI / 2, 0, 0);
plane.castShadow = true;
plane.receiveShadow = true;
scene.add(plane);
const animate = () => {
controls.update();
renderer.render(scene, camera)
requestAnimationFrame(animate);
}
animate();
The fragment shader works because I see the change in color; however, my vertex shader is where the issue lies. The plane doesn't disappear, but nothing changes, and it also doesn't throw any errors to debug. Whenever I try updating my vertex shader, nothing happens. So it's semi-working, but I can't really do anything past setting it up like this. I understand that OpenGL (which WebGL is based on) was deprecated on macOS, but I always see videos of Mac users still using it, so I figure it has to work; there must be something I'm just missing.

The issue with your vertex shader is that you're assigning values to modelPosition but then never using them. When you assign the final output to gl_Position, you ignore all the calculations you performed beforehand:
void main() {
// You create and modify modelPosition...
vec4 modelPosition = modelMatrix * vec4(position, 1.0);
modelPosition.y += sin(modelPosition.x * uFrequency) * uAmplitude;
vec4 viewPosition = viewMatrix * modelPosition;
// ...but then you don't use it on your final output!
gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
}
Instead of redoing all the matrix multiplications from scratch, just make sure the final output uses the position you've modified:
void main() {
vec4 modelPosition = modelMatrix * vec4(position, 1.0);
modelPosition.y += sin(modelPosition.x * uFrequency) * uAmplitude;
vec4 viewPosition = viewMatrix * modelPosition;
// Make sure you use viewPosition in your final output
gl_Position = projectionMatrix * viewPosition;
}
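As a sanity check on the corrected shader: the displacement it applies is just `y += sin(x * uFrequency) * uAmplitude`. Evaluating the same formula in plain JavaScript (using the question's uniform values) shows what the GPU does per vertex:

```javascript
// Sanity check of the vertex displacement the corrected shader applies:
// y += sin(x * uFrequency) * uAmplitude, with the question's uniform values.
const uFrequency = 10.0;
const uAmplitude = 0.1;

function displaceY(x, y) {
  return y + Math.sin(x * uFrequency) * uAmplitude;
}

// Sample a few vertices of the flat plane (y = 0 before displacement).
const xs = [-5, -2.5, 0, 2.5, 5];
const displaced = xs.map((x) => displaceY(x, 0));

// The wave never exceeds the amplitude, and the centre vertex stays put.
console.log(displaced.every((y) => Math.abs(y) <= uAmplitude)); // true
console.log(displaceY(0, 0)); // 0
```

With uAmplitude at 0.1 the ripple is subtle on a 10x10 plane, so raising it temporarily is an easy way to confirm the displacement is actually happening.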

Related

How can I make waves from the center of a Plane Geometry in Three.JS using the vertex shader?

I've been learning Three.js and I can't seem to wrap my head around shaders. I have an idea of what I want, and I know the mathematical tools within the GLSL language and what they do in simple terms, but I don't understand how they work together.
I have a plane geometry with a shader material, I want to be able to create waves from the center of the vertex shader, but I am unsure how to accomplish this.
Also, if there is a course or documentation you can provide that could explain simple concepts regarding vertex and fragment shaders that would be great!
This is what I have done so far:
varying vec2 vUv;
varying float vuTime;
varying float vElevation;
uniform float uTime;
void main(){
vec4 modelPosition = modelMatrix * vec4(position, 1.0);
float elevation = sin(modelPosition.x * 10.0 - uTime) * 0.1;
modelPosition.y += elevation;
vec4 viewPosition = viewMatrix * modelPosition;
vec4 projectedPosition = projectionMatrix * viewPosition;
gl_Position = projectedPosition;
vuTime = uTime;
vUv = uv;
vElevation = elevation;
}
I have set up a simple animation using the sin function and a time variable passed to the shader which creates a simple wave effect without the use of noise. I am trying to create a circular wave stemming from the center of the plane geometry.
What I THINK I have to do is use PI to offset the position away from the center while the wave is moving with uTime. To get to the center of the Plane geometry I need to offset the position with 0.5 float.
That is my understanding right now and I would love to know if I'm correct in my thinking or what a correct way is of accomplishing this.
I also am passing the varying variable to the fragment shader to control the color at the elevation.
Thanks for any help you guys provide; I appreciate it!
In your shader code, try changing this line
float elevation = sin(modelPosition.x * 10.0 - uTime) * 0.1;
to this
float elevation = sin(length(modelPosition.xz) * 10.0 - uTime) * 0.1;
You can use either UV coords or position.
let scene = new THREE.Scene();
let camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 1, 1000);
camera.position.set(0, 10, 10).setLength(10);
let renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);
let controls = new THREE.OrbitControls(camera, renderer.domElement);
scene.add(new THREE.GridHelper(10, 10, "magenta", "yellow"));
let g = new THREE.PlaneGeometry(10, 10, 50, 50);
let m = new THREE.ShaderMaterial({
wireframe: true,
uniforms: {
time: {value: 0},
color: {value: new THREE.Color("aqua")}
},
vertexShader:`
#define PI 3.1415926
#define PI2 PI*2.
uniform float time;
void main(){
vec3 pos = position;
pos.z = sin((length(uv - 0.5) - time) * 6. * PI2);
gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.);
}
`,
fragmentShader:`
uniform vec3 color;
void main(){
gl_FragColor = vec4(color, 1.);
}
`
});
let o = new THREE.Mesh(g, m);
o.rotation.x = -Math.PI * 0.5;
scene.add(o);
let clock = new THREE.Clock();
renderer.setAnimationLoop(() => {
let t = clock.getElapsedTime();
m.uniforms.time.value = t * 0.1;
renderer.render(scene, camera);
});
body{
overflow: hidden;
margin: 0;
}
<script src="https://threejs.org/build/three.min.js"></script>
<script src="https://threejs.org/examples/js/controls/OrbitControls.js"></script>
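The `length(uv - 0.5)` term is what makes the wave radial: every vertex at the same distance from the plane's centre gets the same elevation. A quick CPU-side check of the same formula (mirroring the snippet's constants) confirms this:

```javascript
// CPU-side check of the radial wave from the snippet's vertex shader:
// pos.z = sin((length(uv - 0.5) - time) * 6 * PI2). Vertices at equal
// distance from the plane's centre (uv = 0.5, 0.5) get the same height.
const PI2 = Math.PI * 2;

function elevation(u, v, time) {
  const dist = Math.hypot(u - 0.5, v - 0.5);
  return Math.sin((dist - time) * 6.0 * PI2);
}

const t = 0.25;
const a = elevation(0.7, 0.5, t); // 0.2 to the right of centre
const b = elevation(0.5, 0.3, t); // 0.2 below centre
console.log(Math.abs(a - b) < 1e-12); // true: same radius, same height
```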

InstancedBufferGeometry with spritesheet doesn't show up from three.js v117

I'm using InstancedBufferGeometry to draw many .gltf objects, and a spritesheet like https://imgur.com/a/G9CBldQ to apply many kinds of textures.
To use the spritesheet, I'm using a material based on MeshLambertMaterial via the onBeforeCompile function, and up to three.js v116 it was working perfectly.
But after upgrading three.js and GLTFLoader to v117, nothing is displayed.
I'm implementing onBeforeCompile like this:
class InstancedLambertMaterial extends THREE.MeshLambertMaterial {
constructor(params) {
super(params);
this.spriteGrids = params?.spriteGrids;
this.userData = {
uniforms: {
vUvScale: { value: 1 / Math.sqrt(params?.spriteGrids) }
}
};
}
onBeforeCompile(shader) {
Object.assign(shader.uniforms, this.userData.uniforms);
shader.vertexShader = `#define USE_INSTANCING_CUSTOM\n${shader.vertexShader}`;
const instancedAttributes = `
attribute vec3 translation;
attribute vec4 orientation;
attribute vec3 scale;
attribute vec2 vUvOffsets;
varying vec2 v_vUvOffsets;
uniform float vUvScale;
`;
shader.vertexShader = shader.vertexShader.replace('#include <common>', `${instancedAttributes}\n#include <common>`);
const replacedProjectVertex = `
vec4 mvPosition = vec4( transformed, 1.0 );
#ifdef USE_INSTANCING
mvPosition = instanceMatrix * mvPosition;
#endif
#ifdef USE_INSTANCING_CUSTOM
vUv = uv;
transformed *= scale;
vec3 vcV = cross(orientation.xyz, transformed);
transformed = vcV * (2.0 * orientation.w) + (cross(orientation.xyz, vcV) * 2.0 + transformed);
mvPosition = vec4(translation + transformed, 1.0);
#endif
mvPosition = modelViewMatrix * mvPosition;
gl_Position = projectionMatrix * mvPosition;
#ifdef USE_INSTANCING_CUSTOM
v_vUvOffsets = vUvOffsets;
#endif
`;
shader.vertexShader = shader.vertexShader.replace('#include <project_vertex>', replacedProjectVertex);
shader.fragmentShader = `#define USE_SPRITESHEET\n${shader.fragmentShader}`;
const spriteSheetUniforms = `
#include <map_pars_fragment>
#ifdef USE_SPRITESHEET
uniform float vUvScale;
varying vec2 v_vUvOffsets;
#endif
`;
shader.fragmentShader = shader.fragmentShader.replace('#include <map_pars_fragment>', spriteSheetUniforms);
const spriteSheetTexelColorBranch = `
#ifdef USE_SPRITESHEET
vec4 texelColor = texture2D( map, (vUv * vUvScale) + (v_vUvOffsets * vUvScale) );
texelColor = mapTexelToLinear( texelColor );
diffuseColor *= texelColor;
#endif
`;
shader.fragmentShader = shader.fragmentShader.replace('#include <map_fragment>', spriteSheetTexelColorBranch);
this.userData = shader;
}
}
and I prepare each transformation attribute like this and apply them:
const scales = new THREE.InstancedBufferAttribute(new Float32Array(instances * 3), 3, false);
const translations = new THREE.InstancedBufferAttribute(new Float32Array(instances * 3), 3, false);
const orientations = new THREE.InstancedBufferAttribute(new Float32Array(instances * 4), 4, false);
const tex_vec = new THREE.InstancedBufferAttribute(new Float32Array(instances * 2), 2, false);
I checked the shader output (by deliberately raising a shader error) and it looks like nothing changed related to drawing my objects.
I looked into the release notes of v117, but nothing there seems related to my project either.
I want to be able to run this code on the newest version of three.js.
I made a working example. Both fiddles contain the same code, except for the versions of three.js and GLTFLoader.
This is the result with v116:
https://jsfiddle.net/maemaemae3/o9t1wxrm/1/
and with v117:
https://jsfiddle.net/maemaemae3/2cgym7n3/1/
The problem is this line:
const igeo = new THREE.InstancedBufferGeometry().copy(geometry);
If you do this, properties of InstancedBufferGeometry become undefined since they do not exist in BufferGeometry. A refactoring in r117 made this error visible.
I've fixed your second fiddle by restoring the instanceCount property: https://jsfiddle.net/9k4oqerc/
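The failure mode is easy to reproduce outside three.js. The toy classes below are hypothetical stand-ins (not the real three.js implementation) that mimic what happens when an r117-style copy() reads instancing state from a plain BufferGeometry that doesn't carry it:

```javascript
// Toy reproduction of the r117 regression. copy() mirrors state from its
// source; a plain BufferGeometry has no instanceCount, so the instanced
// geometry ends up with instanceCount === undefined and draws nothing.
class BufferGeometry {
  constructor() {
    this.attributes = {};
  }
  copy(source) {
    this.attributes = { ...source.attributes };
    this.instanceCount = source.instanceCount; // undefined on the base class
    return this;
  }
}

class InstancedBufferGeometry extends BufferGeometry {
  constructor() {
    super();
    this.instanceCount = Infinity; // the default before copy() clobbers it
  }
}

const base = new BufferGeometry();
const broken = new InstancedBufferGeometry().copy(base);
console.log(broken.instanceCount); // undefined

// The fix from the answer: restore instanceCount after copying.
const fixed = new InstancedBufferGeometry().copy(base);
fixed.instanceCount = 1000; // however many instances you draw
console.log(fixed.instanceCount); // 1000
```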

How to add opacity map to ShaderMaterial

I've applied a ShaderMaterial to a glb model that has an opacity map (the model is a human body, and the opacity map is used to create hair and eyelashes); the reference for the model material was this -
So as you can see, the material is some sort of glow effect. I managed to find this example, which is pretty much what I need. The problem is that I can't figure out how to apply the model's opacity map: if you look closely at the difference between my result (left picture) and the right picture, you'll see that the hair doesn't look as it should, since the opacity map is not applied... I wonder whether ShaderMaterial is right for this look, or whether I should use another kind of shader.
Here is my material code:
let m = new THREE.MeshStandardMaterial({
roughness: 0.25,
metalness: 0.75,
opacity: 0.3,
map: new THREE.TextureLoader().load(
"/maps/opacity.jpg",
(tex) => {
tex.wrapS = THREE.RepeatWrapping;
tex.wrapT = THREE.RepeatWrapping;
tex.repeat.set(16, 1);
}
),
onBeforeCompile: (shader) => {
shader.uniforms.s = uniforms.s;
shader.uniforms.b = uniforms.b;
shader.uniforms.p = uniforms.p;
shader.uniforms.glowColor = uniforms.glowColor;
shader.vertexShader = document.getElementById("vertexShader").textContent;
shader.fragmentShader = document.getElementById(
"fragmentShader"
).textContent;
shader.side = THREE.FrontSide;
shader.transparent = true;
// shader.uniforms['alphaMap'].value.needsUpdate = true;
console.log(shader.vertexShader);
console.log(shader.fragmentShader);
},
});
Shader setting:
<script id="vertexShader" type="x-shader/x-vertex">
varying vec3 vNormal;
varying vec3 vPositionNormal;
void main()
{
vNormal = normalize( normalMatrix * normal ); // transform to view space
vPositionNormal = normalize(( modelViewMatrix * vec4(position, 1.0) ).xyz);
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
</script>
<!-- fragment shader a.k.a. pixel shader -->
<script id="fragmentShader" type="x-shader/x-fragment">
uniform vec3 glowColor;
uniform float b;
uniform float p;
uniform float s;
varying vec3 vNormal;
varying vec3 vPositionNormal;
void main()
{
float a = pow( b + s * abs(dot(vNormal, vPositionNormal)), p );
gl_FragColor = vec4( mix(vec3(0), glowColor, a), 1. );
}
</script>
You're creating a MeshStandardMaterial, but then you're overriding all of its shader code when you assign new vertex and fragment shaders, which makes using the Standard material pointless. You should stick to ShaderMaterial like the demo you linked; it would make your code cleaner:
// Get shader code
let vertShader = document.getElementById("vertexShader").textContent;
let fragShader = document.getElementById("fragmentShader").textContent;
// Build texture
let alphaTex = new THREE.TextureLoader().load("/maps/opacity.jpg");
alphaTex.wrapS = THREE.RepeatWrapping;
alphaTex.wrapT = THREE.RepeatWrapping;
// alphaTex.repeat.set(16, 1); <- repeat won't work in a custom shader
// Build material
let m = new THREE.ShaderMaterial({
transparent: true,
// side: THREE.FrontSide, <- this is already default. Not needed
uniforms: {
s: {value: 1},
b: {value: 2},
p: {value: 3},
alphaMap: {value: alphaTex},
glowColor: {value: new THREE.Color(0x0099ff)},
// we create a Vec2 to manually handle repeat
repeat: {value: new THREE.Vector2(16, 1)}
},
vertexShader: vertShader,
fragmentShader: fragShader
});
This builds your material in a cleaner way, since you're using ShaderMaterial's native mechanism without having to override anything. Then you can sample the alphaMap texture in your fragment shader:
uniform float s;
uniform float b;
uniform float p;
uniform vec3 glowColor;
uniform vec2 repeat;
// Declare the alphaMap uniform if we're gonna use it
uniform sampler2D alphaMap;
// Don't forget to declare UV coordinates
varying vec2 vUv;
varying vec3 vNormal;
varying vec3 vPositionNormal;
void main()
{
float a = pow( b + s * abs(dot(vNormal, vPositionNormal)), p );
// Sample map with UV coordinates. Multiply by uniform to get repeat
float a2 = texture2D(alphaMap, vUv * repeat).r;
// Combine both alphas
float opacity = a * a2;
gl_FragColor = vec4( mix(vec3(0), glowColor, opacity), 1. );
}
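To see how the two alpha terms interact, the same arithmetic can be traced in plain JavaScript (the values below are illustrative, not taken from the question):

```javascript
// Tracing the fragment shader's alpha combination on the CPU.
// a is the fresnel-style glow term, a2 the red channel of the alpha map;
// the final colour is mix(black, glowColor, a * a2).
function mix(colorA, colorB, t) {
  return colorA.map((v, i) => v + (colorB[i] - v) * t);
}

function glow(b, s, p, cosAngle) {
  return Math.pow(b + s * Math.abs(cosAngle), p);
}

const glowColor = [0.0, 0.6, 1.0];
const a = glow(0.0, 1.0, 2.0, 0.5); // glow term: (0 + 1 * 0.5)^2 = 0.25
const a2 = 0.5;                     // sampled alpha-map red channel
const result = mix([0, 0, 0], glowColor, a * a2);
console.log(result); // [0, 0.075, 0.125]
```

Where the alpha map is black (a2 = 0) the product is 0 and the pixel stays black, which is what cuts out the hair strands.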
Also, don't forget to carry over the UVs from your vertex shader:
// Don't forget to declare UV coordinates
varying vec2 vUv;
varying vec3 vNormal;
varying vec3 vPositionNormal;
void main()
{
// convert uv attribute to vUv varying
vUv = uv;
vNormal = normalize( normalMatrix * normal ); // transform to view space
vPositionNormal = normalize(( modelViewMatrix * vec4(position, 1.0) ).xyz);
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
Update
The error
'=' : cannot convert from 'lowp 4-component vector of float' to 'highp float'
means I made a mistake when taking the texture2D() sample in the fragment shader. It should have been texture2D().r, so we only read the red channel to get a float, instead of cramming all four RGBA channels (a vec4) into a float. See the following snippet for the final result:
var container, scene, camera, renderer, controls, torusKnot;
init()
function init() {
initBase()
initObject()
render()
}
function initBase () {
container = document.getElementById( 'ThreeJS' )
// SCENE
scene = new THREE.Scene();
// CAMERA
var SCREEN_WIDTH = window.innerWidth, SCREEN_HEIGHT = window.innerHeight
var VIEW_ANGLE = 45, ASPECT = SCREEN_WIDTH / SCREEN_HEIGHT, NEAR = 0.1, FAR = 20000
camera = new THREE.PerspectiveCamera( VIEW_ANGLE, ASPECT, NEAR, FAR)
camera.position.set(0,0,50)
camera.lookAt(scene.position)
// RENDERER
renderer = new THREE.WebGLRenderer( {antialias:true} )
renderer.setSize(SCREEN_WIDTH, SCREEN_HEIGHT)
renderer.setClearColor(0x333333)
container.appendChild( renderer.domElement )
// CONTROLS
controls = new THREE.OrbitControls( camera, renderer.domElement )
// Resize
window.addEventListener("resize", onWindowResize);
}
function onWindowResize() {
var w = window.innerWidth;
var h = window.innerHeight;
renderer.setSize(w, h);
camera.aspect = w / h;
camera.updateProjectionMatrix();
}
function initObject () {
let vertShader = document.getElementById("vertexShader").textContent;
let fragShader = document.getElementById("fragmentShader").textContent;
// Build texture
let alphaTex = new THREE.TextureLoader().load("https://threejs.org/examples/textures/floors/FloorsCheckerboard_S_Diffuse.jpg");
alphaTex.wrapS = THREE.RepeatWrapping;
alphaTex.wrapT = THREE.RepeatWrapping;
var customMaterial = new THREE.ShaderMaterial({
uniforms: {
s: {value: -1},
b: {value: 1},
p: {value: 2},
alphaMap: {value: alphaTex},
glowColor: {value: new THREE.Color(0x00ffff)},
// we create a Vec2 to manually handle repeat
repeat: {value: new THREE.Vector2(16, 1)}
},
vertexShader: vertShader,
fragmentShader: fragShader
})
var geometry = new THREE.TorusKnotBufferGeometry( 10, 3, 100, 32 )
torusKnot = new THREE.Mesh( geometry, customMaterial )
scene.add( torusKnot )
}
function render() {
torusKnot.rotation.y += 0.01;
renderer.render( scene, camera );
requestAnimationFrame(render);
}
body{
overflow: hidden;
margin: 0;
}
<script src="https://threejs.org/build/three.js"></script>
<script src="https://threejs.org/examples/js/controls/OrbitControls.js"></script>
<!-- vertex shader -->
<script id="vertexShader" type="x-shader/x-vertex">
varying vec2 vUv;
varying vec3 vNormal;
varying vec3 vPositionNormal;
void main()
{
// convert uv attribute to vUv varying
vUv = uv;
vNormal = normalize( normalMatrix * normal ); // transform to view space
vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
vPositionNormal = normalize(( mvPosition ).xyz);
gl_Position = projectionMatrix * mvPosition;
}
</script>
<!-- fragment shader a.k.a. pixel shader -->
<script id="fragmentShader" type="x-shader/x-fragment">
uniform float s;
uniform float b;
uniform float p;
uniform vec3 glowColor;
uniform vec2 repeat;
// Declare the alphaMap uniform if we're gonna use it
uniform sampler2D alphaMap;
// Don't forget to declare UV coordinates
varying vec2 vUv;
varying vec3 vNormal;
varying vec3 vPositionNormal;
void main()
{
float a = pow( b + s * abs(dot(vNormal, vPositionNormal)), p );
// Sample map with UV coordinates. Multiply by uniform to get repeat
float a2 = texture2D(alphaMap, vUv * repeat).r;
// Combine both alphas
float opacity = a * a2;
gl_FragColor = vec4( mix(vec3(0), glowColor, opacity), 1. );
}
</script>
<div id="ThreeJS" style="position: absolute; left:0px; top:0px"></div>

WebGL - Problem with a static directional light following a rotating object and/or the camera movement

I am struggling to set up a day-night cycle with a directional light in an Earth model using custom shaders. The night and day maps as well as the light are fine as long as I do not touch the camera, i.e., the Earth rotates while the light source remains still and nights and days are updated correctly. However, when I rotate the camera using the mouse, the light appears to follow the camera, so you always see an illuminated part of the Earth.
This is how I set the light source:
var light = new THREE.DirectionalLight(0xffffff, 1);
light.position.set(5,3,5);
scene.add(light);
This is how I pass the parameters to the shader:
uniforms_earth = {
sunPosition: { type: "v3", value: light.position },
dayTexture: { type: "t", value: THREE.ImageUtils.loadTexture( "daymap.jpg" ) },
nightTexture: { type: "t", value: THREE.ImageUtils.loadTexture( "images/nightmap.jpg" ) }
};
This is the vertex shader:
varying vec2 v_Uv;
varying vec3 v_Normal;
uniform vec3 sunPosition;
varying vec3 v_vertToLight;
void main() {
v_Uv = uv;
v_Normal = normalMatrix * normal;
vec4 worldPosition = modelViewMatrix * vec4(position, 1.0);
v_vertToLight = normalize(sunPosition - worldPosition.xyz);
gl_Position = projectionMatrix * worldPosition;
}
And this the fragment shader:
uniform sampler2D dayTexture;
uniform sampler2D nightTexture;
varying vec2 v_Uv;
varying vec3 v_Normal;
varying vec3 v_vertToLight;
void main( void ) {
vec3 dayColor = texture2D(dayTexture, v_Uv).rgb;
vec3 nightColor = texture2D(nightTexture, v_Uv).rgb;
vec3 fragToLight = normalize(v_vertToLight);
float cosineAngleSunToNormal = dot(normalize(v_Normal), fragToLight);
cosineAngleSunToNormal = clamp(cosineAngleSunToNormal * 10.0, -1.0, 1.0);
float mixAmount = cosineAngleSunToNormal * 0.5 + 0.5;
vec3 color = mix(nightColor, dayColor, mixAmount);
gl_FragColor = vec4( color, 1.0 );
}
Finally, I use the THREE library for the camera controls:
var controls = new THREE.TrackballControls(camera);
And I update the Earth rotation inside the render function as:
function render() {
controls.update();
earth.rotation.y += rotation_speed;
requestAnimationFrame(render);
renderer.render(scene, camera);
}
I have already tried to change how I compute v_vertToLight so that both the vertex and the light position are in the same world as:
v_vertToLight = normalize((modelViewMatrix*vec4(sunPosition, 1.0)).xyz - worldPosition.xyz);
This stops the light from moving when I change the camera, but then, the night-day shadow remains always in the exact same place as the light appears to start rotating with the Earth itself.
I feel that I am close to solving this, so any hint or help would be much appreciated. Thank you for your time.
What you call worldPosition is not a position in world space; it is a position in view space. Rename the misnamed variable:
vec4 worldPosition = modelViewMatrix * vec4(position, 1.0);
vec4 viewPosition = modelViewMatrix * vec4(position, 1.0);
sunPosition is a position in world space. It has to be transformed to view space before it can be used to calculate the view-space light vector. This has to be done with the viewMatrix rather than the modelViewMatrix. Note that the modelViewMatrix transforms from model space to view space, while the viewMatrix transforms from world space to view space (see three.js - WebGLProgram):
vec4 viewSunPos = viewMatrix * vec4(sunPosition, 1.0);
v_vertToLight = normalize(viewSunPos.xyz - viewPosition.xyz);
Note that v_vertToLight and v_Normal both have to be either view-space vectors or world-space vectors; they have to share the same reference system. Otherwise it would not make sense to calculate the dot product of the two vectors.
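This "same reference system" requirement can be checked numerically: because the view transform is applied to the sun position, the vertex, and the normal alike, the diffuse term is unchanged by any camera rotation. A small JavaScript sketch (a pure Y rotation standing in for the view matrix) demonstrates it:

```javascript
// Check that lighting is rotation-invariant when the light position,
// vertex and normal are all moved to view space together.
function rotY(angle, [x, y, z]) {
  const c = Math.cos(angle), s = Math.sin(angle);
  return [c * x + s * z, y, -s * x + c * z];
}

const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const normalize = (v) => { const l = Math.hypot(...v); return v.map((x) => x / l); };

const sunWorld = [5, 3, 5];    // the question's light position
const vertexWorld = [1, 0, 0]; // a point on the globe
const normalWorld = [1, 0, 0]; // its outward normal

// Diffuse term computed entirely in world space:
const ldWorld = normalize(sunWorld.map((s, i) => s - vertexWorld[i]));
const kdWorld = dot(normalWorld, ldWorld);

// Same computation with every vector rotated into "view space":
const view = (v) => rotY(0.7, v);
const sunView = view(sunWorld), vertexView = view(vertexWorld);
const ldView = normalize(sunView.map((s, i) => s - vertexView[i]));
const kdView = dot(view(normalWorld), ldView);

console.log(Math.abs(kdWorld - kdView) < 1e-12); // true: shading unchanged
```

The original bug mixed spaces (world-space sunPosition against a view-space vertex), which is exactly why the terminator chased the camera.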
Vertex shader:
varying vec2 v_Uv;
varying vec3 v_Normal;
uniform vec3 sunPosition;
varying vec3 v_vertToLight;
void main() {
vec4 viewPosition = modelViewMatrix * vec4(position, 1.0);
vec4 viewSunPos = viewMatrix * vec4(sunPosition, 1.0);
v_Uv = uv;
v_Normal = normalMatrix * normal;
v_vertToLight = normalize(viewSunPos.xyz - viewPosition.xyz);
gl_Position = projectionMatrix * viewPosition;
}
See the very simple example, which uses the vertex shader:
(function onLoad() {
var loader, camera, scene, renderer, orbitControls, mesh;
init();
animate();
function init() {
renderer = new THREE.WebGLRenderer({
antialias: true,
alpha: true
});
renderer.setPixelRatio(window.devicePixelRatio);
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.shadowMap.enabled = true;
document.body.appendChild(renderer.domElement);
camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 1, 100);
camera.position.set(0, 1, -4);
//camera.lookAt( -1, 0, 0 );
loader = new THREE.TextureLoader();
loader.setCrossOrigin("");
scene = new THREE.Scene();
scene.background = new THREE.Color(0xffffff);
scene.add(camera);
window.onresize = resize;
var ambientLight = new THREE.AmbientLight(0x404040);
scene.add(ambientLight);
var directionalLight = new THREE.DirectionalLight( 0xffffff, 0.5 );
directionalLight.position.set(1,2,1.5);
scene.add( directionalLight );
orbitControls = new THREE.OrbitControls(camera, renderer.domElement);
addGridHelper();
createModel();
}
function createModel() {
var uniforms = {
u_time : {type:'f', value:0.0},
u_resolution: {type: 'v2', value: {x:2048.,y:1024.}},
u_color : {type: 'v3', value: {x:1.0, y:0.0, z:0.0} },
sunPosition : {type: 'v3', value: {x:5.0, y:5.0, z:5.0} }
};
var material = new THREE.ShaderMaterial({
uniforms: uniforms,
vertexShader: document.getElementById('vertex-shader').textContent,
fragmentShader: document.getElementById('fragment-shader').textContent,
});
var geometry = new THREE.BoxGeometry( 1, 1, 1 );
mesh = new THREE.Mesh(geometry, material);
mesh.position.set(0, 0, -1);
scene.add(mesh);
}
function addGridHelper() {
var helper = new THREE.GridHelper(100, 100);
helper.material.opacity = 0.25;
helper.material.transparent = true;
scene.add(helper);
var axis = new THREE.AxesHelper(1000);
scene.add(axis);
}
function resize() {
var aspect = window.innerWidth / window.innerHeight;
renderer.setSize(window.innerWidth, window.innerHeight);
camera.aspect = aspect;
camera.updateProjectionMatrix();
}
function animate() {
requestAnimationFrame(animate);
orbitControls.update();
render();
}
function render() {
mesh.rotation.y += 0.01;
renderer.render(scene, camera);
}
})();
<script src="https://cdn.jsdelivr.net/npm/three@0.131/build/three.js"></script>
<script src="https://cdn.jsdelivr.net/npm/three@0.131/examples/js/controls/OrbitControls.js"></script>
<script type='x-shader/x-vertex' id='vertex-shader'>
varying vec2 v_Uv;
varying vec3 v_Normal;
uniform vec3 sunPosition;
varying vec3 v_vertToLight;
void main() {
vec4 viewPosition = modelViewMatrix * vec4(position, 1.0);
vec4 viewSunPos = viewMatrix * vec4(sunPosition, 1.0);
v_Uv = uv;
v_Normal = normalMatrix * normal;
v_vertToLight = normalize(viewSunPos.xyz - viewPosition.xyz);
gl_Position = projectionMatrix * viewPosition;
}
</script>
<script type='x-shader/x-fragment' id='fragment-shader'>
precision highp float;
uniform float u_time;
uniform vec2 u_resolution;
varying vec2 v_Uv;
varying vec3 v_Normal;
varying vec3 v_vertToLight;
uniform vec3 u_color;
void main(){
float kd = max(0.0, dot(v_vertToLight, v_Normal));
gl_FragColor = vec4(u_color.rgb * kd + 0.1, 1.0);
}
</script>

How to mix / blend between 2 vertex positions based on distance from camera?

I'm trying to mix / blend between 2 different vertex positions depending on the distance from the camera. Specifically, I'm trying to create an effect that blends between a horizontal plane closer to the camera and a vertical plane in the distance. The result should be a curved plane going away and up from the current camera position.
I want to blend from this (a plane flat on the ground):
To this (the same plane, just rotated 90 degrees):
The implementation I have so far feels close but I just can't put my finger on what pieces I need to finish it. I took an approach from a similar Tangram demo (shader code), however I'm unable to get results anywhere near this. The Tangram example is also using a complete different setup to what I'm using in Three.js so I've not been able to replicate everything.
This is what I have so far: https://jsfiddle.net/robhawkes/a97tu864/
varying float distance;
mat4 rotateX(float rotationX) {
return mat4(
vec4(1.0,0.0,0.0,0.0),
vec4(0.0,cos(rotationX),-sin(rotationX),0.0),
vec4(0.0,sin(rotationX),cos(rotationX),0.0),
vec4(0.0,0.0,0.0,1.0)
);
}
void main()
{
vec4 vPosition = vec4(position, 1.0);
vec4 modelViewPosition = modelViewMatrix * vPosition;
float bend = radians(-90.0);
vec4 newPos = rotateX(bend) * vPosition;
distance = -modelViewPosition.z;
// Show bent position
//gl_Position = projectionMatrix * modelViewMatrix * newPos;
float factor = 0.0;
//if (vPosition.x > 0.0) {
// factor = 1.0;
//}
//factor = clamp(0.0, 1.0, distance / 2000.0);
vPosition = mix(vPosition, newPos, factor);
gl_Position = projectionMatrix * modelViewMatrix * vPosition;
}
I'm doing the following:
Calculate the rotated position of the vertex (the vertical version)
Find the distance from the vertex to the camera
Use mix to blend between the horizontal position and vertical position depending on the distance
I've tried multiple approaches and I just can't seem to get it to work correctly.
Any ideas? Even pointing me down the right path will be immensely helpful as my shader/matrix knowledge is limited.
The major issue is that you tessellate the THREE.PlaneBufferGeometry in width segments, but not in height segments:
groundGeometry = new THREE.PlaneBufferGeometry(
1000, 10000,
100, // <----- widthSegments
100 ); // <----- heightSegments is missing
Now you can use the z coordinate of the view space for the interpolation:
float factor = -modelViewPosition.z / 2000.0;
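Evaluated on the CPU, that factor is just a linear ramp over view-space depth, so each vertex blends between its flat and rotated positions by how far away it is (a sketch assuming the 2000-unit range used above; viewZ is negative in front of the camera):

```javascript
// The blend factor is a linear ramp over view-space depth: vertices at the
// camera keep the flat position, vertices 2000 units away take the rotated
// one.
const mix = (a, b, t) => a + (b - a) * t;

function blendY(flatY, bentY, viewZ) {
  const factor = -viewZ / 2000.0; // viewZ is negative in front of the camera
  return mix(flatY, bentY, factor);
}

console.log(blendY(0, 100, 0));     // 0   -> fully flat at the camera
console.log(blendY(0, 100, -1000)); // 50  -> halfway blended
console.log(blendY(0, 100, -2000)); // 100 -> fully rotated
```

Clamping the factor to [0, 1] (GLSL's clamp(factor, 0.0, 1.0)) is worth adding so vertices beyond 2000 units don't over-rotate.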
var camera, controls, scene, renderer;
var groundGeometry, groundMaterial, groundMesh;
var ambientLight;
init();
initLight();
initGround();
animate();
function init() {
camera = new THREE.PerspectiveCamera( 70, window.innerWidth / window.innerHeight, 0.01, 10000 );
camera.position.y = 500;
camera.position.z = 1000;
controls = new THREE.MapControls( camera );
controls.maxPolarAngle = Math.PI / 2;
scene = new THREE.Scene();
scene.add(camera);
var axesHelper = new THREE.AxesHelper( 500 );
scene.add( axesHelper );
renderer = new THREE.WebGLRenderer( { antialias: true } );
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );
}
function initLight() {
ambientLight = new THREE.AmbientLight( 0x404040 );
scene.add( ambientLight );
}
function initGround() {
groundMaterial = new THREE.ShaderMaterial({
vertexShader: document.getElementById( 'vertexShader' ).textContent,
fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
transparent: true
});
groundGeometry = new THREE.PlaneBufferGeometry( 1000, 10000, 100, 100 );
groundMesh = new THREE.Mesh( groundGeometry, groundMaterial );
groundMesh.position.z = -3000;
groundMesh.position.y = -100;
groundMesh.rotateX(-Math.PI / 2)
scene.add( groundMesh );
}
function animate() {
requestAnimationFrame( animate );
controls.update();
renderer.render( scene, camera );
}
<script type="x-shader/x-vertex" id="vertexShader">
varying float distance;
mat4 rotateX(float rotationX) {
return mat4(
vec4(1.0,0.0,0.0,0.0),
vec4(0.0,cos(rotationX),-sin(rotationX),0.0),
vec4(0.0,sin(rotationX),cos(rotationX),0.0),
vec4(0.0,0.0,0.0,1.0)
);
}
void main()
{
vec4 vPosition = vec4(position, 1.0);
vec4 modelViewPosition = modelViewMatrix * vPosition;
float bend = radians(-90.0);
vec4 newPos = rotateX(bend) * vPosition;
distance = -modelViewPosition.z;
float factor = -modelViewPosition.z / 2000.0;
vPosition = mix(vPosition, newPos, factor);
gl_Position = projectionMatrix * modelViewMatrix * vPosition;
}
</script>
<script type="x-shader/x-fragment" id="fragmentShader">
varying float distance;
void main() {
if (distance < 3000.0) {
gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
} else {
gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
}
}
</script>
<script src="https://threejs.org/build/three.min.js"></script>
<script src="https://rawgit.com/mrdoob/three.js/dev/examples/js/controls/MapControls.js"></script>
