InstancedBufferGeometry with spritesheet doesn't show up from three.js r117 - three.js

I'm using InstancedBufferGeometry to draw many .gltf objects, and a spritesheet like https://imgur.com/a/G9CBldQ to apply many kinds of textures.
To use the spritesheet, I'm using a material based on MeshLambertMaterial, extended via the onBeforeCompile function, and up to three.js r116 it worked perfectly.
But after upgrading three.js and GLTFLoader to r117, nothing is displayed.
I'm implementing onBeforeCompile like this:
class InstancedLambertMaterial extends THREE.MeshLambertMaterial {
  constructor(params) {
    super(params);
    this.spriteGrids = params?.spriteGrids;
    // Scale factor that maps the full UV range onto one cell of the sprite grid.
    this.userData = {
      uniforms: {
        vUvScale: { value: 1 / Math.sqrt(params?.spriteGrids) }
      }
    };
  }

  onBeforeCompile(shader) {
    Object.assign(shader.uniforms, this.userData.uniforms);

    // Declare the per-instance transform and sprite-offset attributes.
    shader.vertexShader = `#define USE_INSTANCING_CUSTOM\n${shader.vertexShader}`;
    const instancedAttributes = `
      attribute vec3 translation;
      attribute vec4 orientation;
      attribute vec3 scale;
      attribute vec2 vUvOffsets;
      varying vec2 v_vUvOffsets;
      uniform float vUvScale;
    `;
    shader.vertexShader = shader.vertexShader.replace('#include <common>', `${instancedAttributes}\n#include <common>`);

    // Apply per-instance scale, quaternion orientation and translation.
    const replacedProjectVertex = `
      vec4 mvPosition = vec4( transformed, 1.0 );
      #ifdef USE_INSTANCING
        mvPosition = instanceMatrix * mvPosition;
      #endif
      #ifdef USE_INSTANCING_CUSTOM
        vUv = uv;
        transformed *= scale;
        vec3 vcV = cross(orientation.xyz, transformed);
        transformed = vcV * (2.0 * orientation.w) + (cross(orientation.xyz, vcV) * 2.0 + transformed);
        mvPosition = vec4(translation + transformed, 1.0);
      #endif
      mvPosition = modelViewMatrix * mvPosition;
      gl_Position = projectionMatrix * mvPosition;
      #ifdef USE_INSTANCING_CUSTOM
        v_vUvOffsets = vUvOffsets;
      #endif
    `;
    shader.vertexShader = shader.vertexShader.replace('#include <project_vertex>', replacedProjectVertex);

    // Sample only the instance's sprite cell instead of the whole map.
    shader.fragmentShader = `#define USE_SPRITESHEET\n${shader.fragmentShader}`;
    const spriteSheetUniforms = `
      #include <map_pars_fragment>
      #ifdef USE_SPRITESHEET
        uniform float vUvScale;
        varying vec2 v_vUvOffsets;
      #endif
    `;
    shader.fragmentShader = shader.fragmentShader.replace('#include <map_pars_fragment>', spriteSheetUniforms);
    const spriteSheetTexelColorBranch = `
      #ifdef USE_SPRITESHEET
        vec4 texelColor = texture2D( map, (vUv * vUvScale) + (v_vUvOffsets * vUvScale) );
        texelColor = mapTexelToLinear( texelColor );
        diffuseColor *= texelColor;
      #endif
    `;
    shader.fragmentShader = shader.fragmentShader.replace('#include <map_fragment>', spriteSheetTexelColorBranch);

    // Keep a reference to the compiled shader so uniforms can be updated later.
    this.userData = shader;
  }
}
and I prepare the per-instance transformation attributes like this before applying them:
const scales = new THREE.InstancedBufferAttribute(new Float32Array(instances * 3), 3, false);
const translations = new THREE.InstancedBufferAttribute(new Float32Array(instances * 3), 3, false);
const orientations = new THREE.InstancedBufferAttribute(new Float32Array(instances * 4), 4, false);
const tex_vec = new THREE.InstancedBufferAttribute(new Float32Array(instances * 2), 2, false);
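The attribute names attached to the geometry match the ones declared in the shader. Roughly like this (a sketch; the actual per-instance values come from my scene setup, and igeo is the InstancedBufferGeometry):
for (let i = 0; i < instances; i++) {
  translations.setXYZ(i, x, y, z);             // instance position
  orientations.setXYZW(i, q.x, q.y, q.z, q.w); // instance rotation as a quaternion
  scales.setXYZ(i, s, s, s);                   // instance scale
  tex_vec.setXY(i, col, row);                  // sprite cell in the sheet
}
igeo.setAttribute('translation', translations);
igeo.setAttribute('orientation', orientations);
igeo.setAttribute('scale', scales);
igeo.setAttribute('vUvOffsets', tex_vec);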
I checked the generated shader output (by deliberately raising a shader error), and nothing related to drawing my objects seems to have changed.
I also looked into the release notes of r117, but nothing there seems related to my project.
I want this code to work with the newest version of three.js.
I made working examples. Both use the same code, except for the version of three.js and GLTFLoader.
This is the result with r116:
https://jsfiddle.net/maemaemae3/o9t1wxrm/1/
and with r117:
https://jsfiddle.net/maemaemae3/2cgym7n3/1/

The problem is this line:
const igeo = new THREE.InstancedBufferGeometry().copy(geometry);
If you do this, properties specific to InstancedBufferGeometry become undefined, since they do not exist on BufferGeometry and copy() does not carry them over. A refactoring in r117 made this error visible.
I've fixed your second fiddle by restoring the instanceCount property: https://jsfiddle.net/9k4oqerc/
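In other words, a minimal sketch of the fix (Infinity is the property's default, meaning "draw as many instances as the attributes allow"):
const igeo = new THREE.InstancedBufferGeometry().copy(geometry);
// BufferGeometry.copy() knows nothing about instanceCount,
// so it ends up undefined on the copy; restore it explicitly.
igeo.instanceCount = Infinity; // or the exact number of instances you draw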

Related

Setting up WebGL shaders on Mac, Three.js, & vite-plugin-glsl

I am working on learning shaders through Three.js, but I am having a bit of trouble getting the setup to work. I am using vite-plugin-glsl for my shaders. At first I tried following along with some more advanced videos, but the glsl/frag/vert files didn't seem to work, so I found a video that brought it down to the basics. Thankfully I can get the shader to render and change color, but my vertex shader does not seem to work. Originally I placed the shaders in separate GLSL files, but that gave me more problems, so I opted to embed them inside my JS file. Here is my current basic project:
import * as THREE from 'three';
import { OrbitControls } from "three/addons/controls/OrbitControls.js";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight);
camera.position.set(-10, 10, -1);

const renderer = new THREE.WebGLRenderer();
renderer.setPixelRatio(window.devicePixelRatio);
renderer.setSize(window.innerWidth, window.innerHeight);
const canvas = document.body;
canvas.appendChild(renderer.domElement);
const controls = new OrbitControls(camera, canvas);

// Lights
const ambientLights = new THREE.AmbientLight(0xffffff, 0.6);
scene.add(ambientLights);
const directionalLight = new THREE.DirectionalLight('#ffffff', 1);
directionalLight.castShadow = true;
directionalLight.receiveShadow = true;
directionalLight.shadow.mapSize.set(window.innerWidth, window.innerHeight);
directionalLight.shadow.camera.far = 0.01;
directionalLight.shadow.normalBias = 1.05;
directionalLight.position.set(200, 400, 10.25);
scene.add(directionalLight);

const shaderMaterial = new THREE.RawShaderMaterial({
  vertexShader: `
    uniform mat4 projectionMatrix;
    uniform mat4 viewMatrix;
    uniform mat4 modelMatrix;
    uniform float uFrequency;
    uniform float uAmplitude;
    attribute vec3 position;
    void main() {
      vec4 modelPosition = modelMatrix * vec4(position, 1.0);
      // anything I place in here doesn't update
      modelPosition.y += sin(modelPosition.x * uFrequency) * uAmplitude;
      vec4 viewPosition = viewMatrix * modelPosition;
      gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    precision mediump float;
    void main() {
      gl_FragColor = vec4(0.0, 0.5, 0.5, 1.0);
    }
  `,
  wireframe: true,
  side: THREE.DoubleSide,
  uniforms: {
    uFrequency: { value: 10.0 },
    uAmplitude: { value: 0.1 }
  }
});

const plane = new THREE.Mesh(
  new THREE.PlaneGeometry(10, 10, 10, 10),
  shaderMaterial
);
plane.rotation.set(-Math.PI / 2, 0, 0);
plane.castShadow = true;
plane.receiveShadow = true;
scene.add(plane);

const animate = () => {
  controls.update();
  renderer.render(scene, camera);
  requestAnimationFrame(animate);
};
animate();
The fragment shader works, because I see the change in color; the vertex shader is where the issue lies. The plane doesn't disappear, but nothing changes, and it also doesn't throw any errors to debug. Whenever I try updating my vertex shader, nothing happens. So it's semi-working, but I can't really do anything past setting it up like this. I understand that WebGL was deprecated on Macs, but I notice plenty of videos online with Mac users still using it, so I figure it has to work; there must be something I'm just missing.
The issue with your vertex shader is that you're assigning values to modelPosition, but then you're not using them. When you assign the final output to gl_Position, you ignore all the calculations you performed before:
void main() {
  // You create and modify modelPosition...
  vec4 modelPosition = modelMatrix * vec4(position, 1.0);
  modelPosition.y += sin(modelPosition.x * uFrequency) * uAmplitude;
  vec4 viewPosition = viewMatrix * modelPosition;
  // ...but then you don't use it in your final output!
  gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
}
Instead of doing all the matrix multiplications again from scratch, just make sure you use the position you've already modified:
void main() {
  vec4 modelPosition = modelMatrix * vec4(position, 1.0);
  modelPosition.y += sin(modelPosition.x * uFrequency) * uAmplitude;
  vec4 viewPosition = viewMatrix * modelPosition;
  // Make sure you use viewPosition in your final output
  gl_Position = projectionMatrix * viewPosition;
}
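If you then also want the waves to move over time (a guess at where you're headed, not something your current code attempts), the usual pattern is a uTime uniform that is folded into the sine argument and updated every frame:
// assumes `uTime: { value: 0 }` is added to the material's uniforms and
// `uniform float uTime;` is declared in the vertex shader, e.g.
// modelPosition.y += sin(modelPosition.x * uFrequency + uTime) * uAmplitude;
const clock = new THREE.Clock();
const animate = () => {
  shaderMaterial.uniforms.uTime.value = clock.getElapsedTime();
  controls.update();
  renderer.render(scene, camera);
  requestAnimationFrame(animate);
};
animate();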

Using WebGLRenderer with the logarithmicDepthBuffer option set to true together with ShaderMaterial

I use the following ShaderMaterial for the objects in my scene. The code below works. However, if I set the WebGLRenderer option logarithmicDepthBuffer to true, the material defined below is no longer displayed correctly.
new THREE.ShaderMaterial({
  uniforms: {
    color1: {
      value: new THREE.Color('#3a0000')
    },
    color2: {
      value: new THREE.Color('#ffa9b0')
    }
  },
  vertexShader: `
    varying vec3 vNormal;
    void main(void) {
      vNormal = normalMatrix * normalize(normal);
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }`,
  fragmentShader: `
    uniform vec3 color1;
    uniform vec3 color2;
    varying vec3 vNormal;
    void main(void) {
      vec3 view_nv = normalize(vNormal);
      vec3 nv_color = view_nv * 0.5 + 0.5;
      vec3 c = mix(color1, color2, nv_color.r);
      gl_FragColor = vec4(c, 1.0);
    }`,
  side: THREE.DoubleSide,
});
After looking for a solution to this problem, I found the following SO answer. Summarizing, the solution is to add four pieces of code to the vertexShader and fragmentShader.
Where exactly do I have to integrate the provided code snippets, i.e. the vertex shader body and the fragment shader body?
I tried various "positions", but I always got WebGL errors:
THREE.WebGLProgram: shader error: 0 gl.VALIDATE_STATUS false gl.getProgramInfoLog Must have a compiled vertex shader attached. ERROR: 0:63: 'EPSILON' : undeclared identifier
UPDATE: added a playground: https://codepen.io/anon/pen/gQoaye
If you add the logarithmicDepthBuffer option to the constructor, you will see that the ShaderMaterial no longer works:
var renderer = new THREE.WebGLRenderer({ logarithmicDepthBuffer: true });
Where exactly do I have to integrate the provided code snippets, i.e. Vertex shader body and Fragment shader body?
In the vertex shader you have to define EPSILON.
After adding the code snippets logdepthbuf_pars_vertex.glsl and logdepthbuf_vertex.glsl, the final vertex shader is:
#ifdef USE_LOGDEPTHBUF
  #define EPSILON 1e-6
  #ifdef USE_LOGDEPTHBUF_EXT
    varying float vFragDepth;
  #endif
  uniform float logDepthBufFC;
#endif
varying vec3 vNormal;
void main(void) {
  vNormal = normalMatrix * normalize(normal);
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  #ifdef USE_LOGDEPTHBUF
    gl_Position.z = log2(max( EPSILON, gl_Position.w + 1.0 )) * logDepthBufFC;
    #ifdef USE_LOGDEPTHBUF_EXT
      vFragDepth = 1.0 + gl_Position.w;
    #else
      gl_Position.z = (gl_Position.z - 1.0) * gl_Position.w;
    #endif
  #endif
}
After adding the code snippets, the final fragment shader is:
#ifdef USE_LOGDEPTHBUF
  uniform float logDepthBufFC;
  #ifdef USE_LOGDEPTHBUF_EXT
    #extension GL_EXT_frag_depth : enable
    varying float vFragDepth;
  #endif
#endif
uniform vec3 color1;
uniform vec3 color2;
varying vec3 vNormal;
void main(void) {
  vec3 view_nv = normalize(vNormal);
  vec3 nv_color = view_nv * 0.5 + 0.5;
  vec3 c = mix(color1, color2, nv_color.r);
  gl_FragColor = vec4(c, 1.0);
  #if defined(USE_LOGDEPTHBUF) && defined(USE_LOGDEPTHBUF_EXT)
    gl_FragDepthEXT = log2(vFragDepth) * logDepthBufFC * 0.5;
  #endif
}
See the example:
(function onLoad() {
  var container, camera, scene, renderer, orbitControls;

  function createModel() {
    var material = new THREE.ShaderMaterial({
      uniforms: {
        color1: {
          value: new THREE.Color('#3a0000')
        },
        color2: {
          value: new THREE.Color('#ffa9b0')
        }
      },
      vertexShader: `
        #ifdef USE_LOGDEPTHBUF
          #define EPSILON 1e-6
          #ifdef USE_LOGDEPTHBUF_EXT
            varying float vFragDepth;
          #endif
          uniform float logDepthBufFC;
        #endif
        varying vec3 vNormal;
        void main(void) {
          vNormal = normalMatrix * normalize(normal);
          gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
          #ifdef USE_LOGDEPTHBUF
            gl_Position.z = log2(max( EPSILON, gl_Position.w + 1.0 )) * logDepthBufFC;
            #ifdef USE_LOGDEPTHBUF_EXT
              vFragDepth = 1.0 + gl_Position.w;
            #else
              gl_Position.z = (gl_Position.z - 1.0) * gl_Position.w;
            #endif
          #endif
        }`,
      fragmentShader: `
        #ifdef USE_LOGDEPTHBUF
          #ifdef USE_LOGDEPTHBUF_EXT
            #extension GL_EXT_frag_depth : enable
            varying float vFragDepth;
          #endif
          uniform float logDepthBufFC;
        #endif
        uniform vec3 color1;
        uniform vec3 color2;
        varying vec3 vNormal;
        void main(void) {
          vec3 view_nv = normalize(vNormal);
          vec3 nv_color = view_nv * 0.5 + 0.5;
          vec3 c = mix(color1, color2, nv_color.r);
          gl_FragColor = vec4(c, 1.0);
          #if defined(USE_LOGDEPTHBUF) && defined(USE_LOGDEPTHBUF_EXT)
            gl_FragDepthEXT = log2(vFragDepth) * logDepthBufFC * 0.5;
          #endif
        }`,
      side: THREE.DoubleSide,
    });
    //var material = new THREE.MeshPhongMaterial({color:'#b090b0'});
    var geometry = new THREE.BoxGeometry(1, 1, 1);
    var mesh = new THREE.Mesh(geometry, material);
    scene.add(mesh);
  }

  function init() {
    container = document.getElementById('container');
    renderer = new THREE.WebGLRenderer({
      antialias: true,
      logarithmicDepthBuffer: true
    });
    renderer.setPixelRatio(window.devicePixelRatio);
    renderer.setSize(window.innerWidth, window.innerHeight);
    renderer.shadowMap.enabled = true;
    container.appendChild(renderer.domElement);
    camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 1, 100);
    camera.position.set(0, 1, -2);
    scene = new THREE.Scene();
    scene.background = new THREE.Color(0xffffff);
    scene.add(camera);
    window.onresize = resize;
    orbitControls = new THREE.OrbitControls(camera, container);
    var helper = new THREE.GridHelper(100, 100);
    helper.material.opacity = 0.25;
    helper.material.transparent = true;
    scene.add(helper);
    var axis = new THREE.AxesHelper(1000);
    scene.add(axis);
    createModel();
  }

  function resize() {
    var aspect = window.innerWidth / window.innerHeight;
    renderer.setSize(window.innerWidth, window.innerHeight);
    camera.aspect = aspect;
    camera.updateProjectionMatrix();
  }

  function animate() {
    requestAnimationFrame(animate);
    orbitControls.update();
    render();
  }

  function render() {
    renderer.render(scene, camera);
  }

  init();
  animate();
})();
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/110/three.min.js"></script>
<script src="https://threejs.org/examples/js/controls/OrbitControls.js"></script>
<div id="container"></div>
I assume you have tried inserting that code in your two shaders? To my understanding, that should be correct.
The error appears to be complaining about the shader not compiling, due to a reference to EPSILON in the vertex shader body, although EPSILON was never declared.
Try defining EPSILON, e.g. using a macro in the shader itself:
#define EPSILON 1e-6
or pass it to the shader as a uniform. (Notice that this is just an example value; you may want to research what a suitable value for EPSILON might be in your particular case.)
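A sketch of the uniform variant (keeping the name EPSILON from the snippet above; any float uniform works):
// JavaScript side: add the uniform to the ShaderMaterial
material.uniforms.EPSILON = { value: 1e-6 };
and in the vertex shader, instead of the #define:
uniform float EPSILON;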

Storing data as a texture for use in Vertex Shader for Instanced Geometry (THREE JS / GLSL)

I'm using a THREE.InstancedBufferGeometry, and I wish to access data in the vertex shader, encoded into a texture.
What I want to do is create a DataTexture with one pixel per instance, which will store position data for each instance (at a later stage I can then update the texture from a flow-field simulation to animate the positions).
I'm struggling to access the data from the texture in the vertex shader.
const INSTANCES_COUNT = 5000;

// FOR EVERY INSTANCE, GIVE IT A RANDOM X, Y, Z OFFSET, AND SAVE IT IN A DATA TEXTURE
const data = new Float32Array(4 * INSTANCES_COUNT); // Float32Array to match THREE.FloatType
for (let i = 0; i < INSTANCES_COUNT; i++) {
  const stride = i * 4;
  data[stride] = (Math.random() - 0.5);
  data[stride + 1] = (Math.random() - 0.5);
  data[stride + 2] = (Math.random() - 0.5);
  data[stride + 3] = 0.0;
}
const offsetTexture = new THREE.DataTexture( data, INSTANCES_COUNT, 1, THREE.RGBAFormat, THREE.FloatType );
offsetTexture.minFilter = THREE.NearestFilter;
offsetTexture.magFilter = THREE.NearestFilter;
offsetTexture.generateMipmaps = false;
offsetTexture.needsUpdate = true;

// CREATE MY INSTANCED GEOMETRY
const geometry = new THREE.InstancedBufferGeometry();
geometry.maxInstancedCount = INSTANCES_COUNT;
geometry.addAttribute( 'position', new THREE.Float32BufferAttribute([5, -5, 0, -5, 5, 0, 0, 0, 5], 3 )); // SIMPLE TRIANGLE

const vertexShader = `
  precision highp float;
  uniform vec3 color;
  uniform sampler2D offsetTexture;
  uniform mat4 modelViewMatrix;
  uniform mat4 projectionMatrix;
  attribute vec3 position;
  varying vec3 vPosition;
  varying vec3 vColor;
  void main(){
    vPosition = position;
    vec4 orientation = vec4(.0, .0, .0, .0);
    vec3 vcV = cross( orientation.xyz, vPosition );
    vPosition = vcV * ( 2.0 * orientation.w ) + ( cross( orientation.xyz, vcV ) * 2.0 + vPosition );
    vec2 uv = position.xy;
    vec4 data = texture2D( offsetTexture, uv );
    vec3 particlePosition = data.xyz * 1000.0;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( vPosition + particlePosition, 1.0 );
  }
`;

const fragmentShader = `
  precision highp float;
  varying vec3 vColor;
  void main() {
    gl_FragColor = vec4(vColor, 1.0);
  }
`;

const uniforms = {
  size: { value: 1.0 },
  color: {
    type: 'c',
    value: new THREE.Color(0x3db230),
  },
  offsetTexture: {
    type: 't',
    value: offsetTexture,
  },
};

// CREATE MY MATERIAL
const material = new THREE.RawShaderMaterial({
  uniforms,
  vertexShader,
  fragmentShader,
  side: THREE.DoubleSide,
  transparent: false,
});

scene.add(new THREE.Mesh(geometry, material));
At the moment it seems the data from the texture isn't accessible in the vertex shader (if I just set the uv to vec2(1.0, 0.0), for example, and change the offset positions, nothing changes), and I'm also not sure how to make sure each instance references the correct texel in the texture.
So, my two issues are:
1) How to correctly set up the data texture, and access that data in the vertex shader.
2) How to correctly reference the texel storing the data for each particular instance (e.g. instance 1000 should use vec2(1000, 1), etc.).
Also, do I have to normalize the data (0.0-1.0, 0-255, or -1 to +1)?
Thanks
You need to compute some kind of index into the texture per instance.
Meaning, you need an attribute whose value is shared by all vertices of an instance.
If your triangle is
[a,b,c]
your index attribute should be
[0,0,0]
Let's say you have 1024 instances and a 1024x1 px texture:
attribute float aIndex;
vec2 myIndex = vec2( (aIndex + 0.5) / 1024., 0.5 ); // texel center of the single row
vec4 myRes = texture2D( mySampler, myIndex );
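On the JavaScript side, that index is just one float per instance in an InstancedBufferAttribute (a sketch; geometry is your InstancedBufferGeometry and 1024 is the instance count from the example):
const indices = new Float32Array(1024);
for (let i = 0; i < 1024; i++) indices[i] = i; // instance i reads texel i
geometry.addAttribute('aIndex', new THREE.InstancedBufferAttribute(indices, 1));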

Per instance UV texture mapping in Three.js InstancedBufferGeometry

I have an InstancedBufferGeometry made up of a single plane:
const plane = new THREE.PlaneBufferGeometry(100, 100, 1, 1);
const geometry = new THREE.InstancedBufferGeometry();
geometry.maxInstancedCount = 100;
geometry.attributes.position = plane.attributes.position;
geometry.index = plane.index;
geometry.attributes.uv = plane.attributes.uv;
geometry.addAttribute( 'offset', new THREE.InstancedBufferAttribute( new Float32Array( offsets ), 3 ) ); // an offset position
I am applying a texture to each plane, which works as expected; however, I wish to apply a different region of the texture to each instance, and I'm not sure about the correct approach.
At the moment I have tried to build up UVs per instance, based on the structure of the UVs for a single plane:
let uvs = [];
for (let i = 0; i < 100; i++) {
  const tl = [0, 1];
  const tr = [1, 1];
  const bl = [0, 0];
  const br = [1, 0];
  uvs = uvs.concat(tl, tr, bl, br);
}
...
geometry.addAttribute( 'uv', new THREE.InstancedBufferAttribute( new Float32Array( uvs ), 2) );
When I do this, I don't get any errors, but every instance is just a single colour (all instances are the same colour). I have tried changing the instance size, and also the meshes per attribute (which I don't fully understand; I'm struggling to find a good explanation in the docs).
I feel like I'm close, but I'm missing something, so a point in the right direction would be fantastic!
(For reference, here are my shaders):
const vertexShader = `
  precision mediump float;
  uniform vec3 color;
  uniform sampler2D tPositions;
  uniform mat4 modelViewMatrix;
  uniform mat4 projectionMatrix;
  attribute vec2 uv;
  attribute vec2 dataUv;
  attribute vec3 position;
  attribute vec3 offset;
  attribute vec3 particlePosition;
  attribute vec4 orientationStart;
  attribute vec4 orientationEnd;
  varying vec3 vPosition;
  varying vec3 vColor;
  varying vec2 vUv;
  void main(){
    vPosition = position;
    vec4 orientation = normalize( orientationStart );
    vec3 vcV = cross( orientation.xyz, vPosition );
    vPosition = vcV * ( 2.0 * orientation.w ) + ( cross( orientation.xyz, vcV ) * 2.0 + vPosition );
    vec4 data = texture2D( tPositions, vec2(dataUv.x, 0.0));
    vec3 particlePosition = (data.xyz - 0.5) * 1000.0;
    vUv = uv;
    vColor = data.xyz;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position + particlePosition + offset, 1.0 );
  }
`;

const fragmentShader = `
  precision mediump float;
  uniform sampler2D map;
  varying vec3 vPosition;
  varying vec3 vColor;
  varying vec2 vUv;
  void main() {
    vec3 color = texture2D(map, vUv).xyz;
    gl_FragColor = vec4(color, 1.0);
  }
`;
As all my instances need to sample the same-size rectangular area, just offset (like a sprite sheet), I have added a per-instance UV offset attribute plus a UV scale uniform, and use these to define which area of the map each instance uses:
const uvOffsets = [];
for (let i = 0; i < INSTANCES; i++) {
  const u = i % textureWidthHeight;
  const v = ~~(i / textureWidthHeight);
  uvOffsets.push(u, v);
}
...
geometry.attributes.uv = plane.attributes.uv;
geometry.addAttribute( 'uvOffsets', new THREE.InstancedBufferAttribute( new Float32Array( uvOffsets ), 2 ) );
uniforms: {
  ...
  uUvScale: { value: 1 / textureWidthHeight }
}
And in the fragment shader:
void main() {
  vec4 color = texture2D(map, (vUv * uUvScale) + (vUvOffsets * uUvScale));
  gl_FragColor = vec4(1.0, 1.0, 1.0, color.a);
}
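For this to work, the offset has to travel from the per-instance attribute through a varying (a sketch; names match the code above, and the fragment shader additionally declares uniform float uUvScale plus the two varyings):
// vertex shader
attribute vec2 uvOffsets;
varying vec2 vUv;
varying vec2 vUvOffsets;
void main() {
  vUv = uv;
  vUvOffsets = uvOffsets;
  // ... position computation as before ...
}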
\o/

How can textures with transparent spots be correctly applied to multiple stacked plane instances in three.js?

I'm creating 512 instances of the same 1x1 plane, with a texture that has transparent areas. The planes are randomly spread around the origin, like the image below.
How can the planes in front be drawn after the planes behind, so that the transparency of the planes in front takes into account the output of the planes behind?
(with depthTest disabled)
(with depthTest normal)
For reference, the transparency-disabled version of the instanced geometry proves that the planes are correctly positioned.
Update:
Adding code as asked:
import {
  Mesh,
  ShaderMaterial,
  Vector3,
  PlaneBufferGeometry,
  EdgesGeometry,
  LineBasicMaterial,
  LineSegments,
  InstancedBufferAttribute,
  UniformsLib,
  BufferAttribute,
  TextureLoader,
  InstancedBufferGeometry,
  DoubleSide,
} from 'three'
import path from 'path'
import fs from 'fs'
import {
  randomValueBetween,
} from '../../utils'

const vertexShader = fs.readFileSync(path.resolve(__dirname, './assets/vertex.glsl'), 'utf8')
const fragmentShader = fs.readFileSync(path.resolve(__dirname, './assets/fragment.glsl'), 'utf8')

const createInstancedAttributes = (geometry, instanceCount) => {
  const startseed = new InstancedBufferAttribute(new Float32Array(instanceCount * 1), 1)
  const scale = new InstancedBufferAttribute(new Float32Array(instanceCount * 3), 3)
  const offset = new InstancedBufferAttribute(new Float32Array(instanceCount * 2), 2)
  const orientationY = new InstancedBufferAttribute(new Float32Array(instanceCount), 1)
  const baseScale = 0.5
  for (let i = 0; i < instanceCount; i += 1) {
    scale.setXYZ(i,
      baseScale * randomValueBetween(0.8, 1.3, 1),
      baseScale * randomValueBetween(0.8, 1.3, 1),
      baseScale * randomValueBetween(0.8, 1.3, 1),
    )
    orientationY.setX(i, randomValueBetween(0.0, 1.0, 3))
    startseed.setX(i, randomValueBetween(1, 3, 1))
  }
  for (let i = 0; i < instanceCount / 4; i += 4) {
    const randomX = randomValueBetween(-3.5, 3.5, 1)
    const randomY = randomValueBetween(-3.5, 3.5, 1)
    offset.setXY(i, randomX, randomY)
  }
  geometry.addAttribute('scale', scale)
  geometry.addAttribute('offset', offset)
  geometry.addAttribute('startseed', startseed)
  geometry.addAttribute('orientationY', orientationY)
  return { scale, offset }
}

const createInstancedGeometry = (instancePerUnitCount) => {
  const geometry = new InstancedBufferGeometry()
  geometry.maxInstancedCount = instancePerUnitCount
  const shape = new PlaneBufferGeometry(1, 1, 1, 3)
  const data = shape.attributes
  geometry.addAttribute('position', new BufferAttribute(new Float32Array(data.position.array), 3))
  geometry.addAttribute('uv', new BufferAttribute(new Float32Array(data.uv.array), 2))
  geometry.addAttribute('normal', new BufferAttribute(new Float32Array(data.normal.array), 3))
  geometry.setIndex(new BufferAttribute(new Uint16Array(shape.index.array), 1))
  shape.dispose()
  createInstancedAttributes(geometry, instancePerUnitCount)
  return geometry
}
export default class GrassDeform extends Mesh {
  constructor() {
    const geometry = createInstancedGeometry(8 * 256)
    const uniforms = {
      uTime: {
        type: 'f',
        value: 0,
      },
      uMap: {
        type: 't',
        value: null,
      },
    }
    const textureLoader = new TextureLoader()
    textureLoader.load(path.resolve(__dirname, './assets/grass-texture-01.png'), (t) => {
      uniforms.uMap.value = t
    })
    const material = new ShaderMaterial({
      uniforms: Object.assign({},
        UniformsLib.ambient,
        UniformsLib.lights,
        uniforms,
      ),
      vertexShader,
      fragmentShader,
      lights: true,
      transparent: true,
      side: DoubleSide,
    })
    super(geometry, material)
    this.geometry = geometry
    this.material = material
    this.up = new Vector3(0, 0, 1)
    const lineGeo = new EdgesGeometry(geometry) // or WireframeGeometry
    const mat = new LineBasicMaterial({ color: 0xffffff, linewidth: 2 })
    const wireframe = new LineSegments(lineGeo, mat)
    this.add(wireframe)
    this.frustumCulled = false
  }

  update({ ellapsedTime }) {
    this.material.uniforms.uTime.value = ellapsedTime
  }
}
And the object is added to the scene like this:
const grass2 = new GrassDeform()
grass2.position.set(-1, 0, 0.50)
grass2.rotateX(Math.PI / 2)
scene.add(grass2)
dirLight.target = grass2

const animate = (ellapsedTime = 0) => {
  stats.begin()
  grass2.update({ ellapsedTime })
  // other scene stuff
  renderer.render(scene, playerController.camera)
  requestAnimationFrame(animate)
}
animate()
The vertex shader:
#if NUM_DIR_LIGHTS > 0
  struct DirectionalLight {
    vec3 direction;
    vec3 color;
    int shadow;
    float shadowBias;
    float shadowRadius;
    vec2 shadowMapSize;
  };
  uniform DirectionalLight directionalLights[ NUM_DIR_LIGHTS ];
#endif
uniform float uTime;
attribute vec2 offset;
attribute vec3 scale;
attribute float startseed;
attribute float orientationY;
varying vec2 vUv;
varying vec3 vPosition;
varying vec3 vDirectionalLightDirection;
varying vec3 vDirectionalLightColor;
varying vec3 uNormal;
void main() {
  vec3 pos = position * scale;
  pos.x += offset.x;
  pos.z += offset.y;
  pos.y += (scale.y - 1.0) * 0.5;
  pos.y = orientationY;
  vPosition = pos;
  uNormal = normal;
  vUv = uv;
  vDirectionalLightDirection = directionalLights[0].direction;
  vDirectionalLightColor = directionalLights[0].color;
  float variation = startseed + uTime * 0.002;
  float pass = (0.5 + pos.y) * 0.05;
  pos.x += sin(pass + variation) * pass;
  pos.z += cos(pass + variation + 0.01) * pass;
  pos.y += sin(pass + variation - 0.01) * pass;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
}
And the fragment shader (it has some extra stuff for lighting, not used for now):
uniform sampler2D uMap;
varying vec2 vUv;
varying vec3 vPosition;
varying vec3 vDirectionalLightDirection;
varying vec3 vDirectionalLightColor;
varying vec3 uNormal;
void main() {
  vec4 map = texture2D(uMap, vUv);
  vec3 lightVector = normalize((vDirectionalLightDirection) - vPosition);
  float dotNL = dot( uNormal, lightVector );
  vec3 baseColor = map.rgb;
  vec3 lightedColor = vDirectionalLightColor * 0.6 * dotNL;
  if ( map.a < 0.5 ) discard; //!!! THIS WAS THE LINE NEEDED TO SOLVE THE ISSUE
  gl_FragColor = vec4( map.rgb , 1 );
}
After applying that change, the final result looks right!
You can solve your problem with alpha testing. Use a pattern like the following in your fragment shader:
vec4 texelColor = texture2D( map, vUv );
if ( texelColor.a < 0.5 ) discard;
Your material will no longer need to have transparent = true, since you appear to be using a cut-out in which the texture alpha is either 0 or 1.
three.js r.88
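As an aside, the built-in materials expose the same cut-out behaviour through the standard alphaTest property, with no custom shader needed (a sketch; the 0.5 threshold matches the discard above):
var material = new THREE.MeshLambertMaterial({
  map: texture,    // texture whose alpha is effectively 0 or 1
  alphaTest: 0.5,  // fragments with alpha below 0.5 are discarded
  side: THREE.DoubleSide
});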
