Three.js - repositioning vertices in a 'particle' mesh

I have a basic three.js game working and I'd like to add particles. I've been searching online, including multiple questions here, and the closest I've come to getting a 'particle system' working is using a THREE.BufferGeometry, a THREE.BufferAttribute and a THREE.Points mesh. I set it up like this:
const particleMaterial = new THREE.PointsMaterial( { size: 10, map: particleTexture, blending: THREE.AdditiveBlending, transparent: true } );
const particlesGeometry = new THREE.BufferGeometry();
const particlesCount = 300;
const posArray = new Float32Array(particlesCount * 3);
// Fill every x/y/z component: particlesCount * 3 values in total.
for (let i = 0; i < particlesCount * 3; i++) {
  posArray[i] = Math.random() * 10;
}
const particleBufferAttribute = new THREE.BufferAttribute(posArray, 3);
particlesGeometry.setAttribute( 'position', particleBufferAttribute );
const particlesMesh = new THREE.Points(particlesGeometry, particleMaterial);
particlesMesh.counter = 0;
scene.add(particlesMesh);
This part works and displays the particles fine, at their initial positions, but of course I'd like to move them.
I have tried all manner of things in my 'animate' function, but I haven't hit upon the right combination. I'd like to move the particles, ideally one vertex per frame.
The current thing I'm doing in the animate function - which does not work! - is this:
particleBufferAttribute.setXYZ( particlesMesh.counter, objects[0].position.x, objects[0].position.y, objects[0].position.z );
particlesGeometry.setAttribute( 'position', particleBufferAttribute );
//posArray[particlesMesh.counter] = objects[0].position;
particlesMesh.counter++;
if (particlesMesh.counter >= particlesCount) { // wrap back to the first particle
  particlesMesh.counter = 0;
}
If anyone has any pointers about how to move Points mesh vertices, that would be great.
Alternatively, if this is not at all the right approach, please let me know.
I did find Stemkoski's ShaderParticleEngine, but I could not find any information about how to make it work (the docs are very minimal and do not seem to include examples).

You don't need to re-set the attribute, but you do need to tell the renderer that the attribute has changed.
particleBufferAttribute.setXYZ( particlesMesh.counter, objects[0].position.x, objects[0].position.y, objects[0].position.z );
particleBufferAttribute.needsUpdate = true; // This is the kicker!
By setting needsUpdate to true, the renderer knows to re-upload that attribute to the GPU.
This might not be a concern for you, but just know that moving particles this way is expensive: you re-upload the position attribute every single frame, including all the position data for every particle you aren't moving.
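For reference, a minimal sketch of how this could look inside the animate loop, assuming the variable names from the question and that renderer, scene and camera already exist; the DynamicDrawUsage hint is optional, but appropriate for an attribute that changes every frame:
// Hint that this attribute will be updated frequently (optional).
particleBufferAttribute.setUsage( THREE.DynamicDrawUsage );

function animate() {
  requestAnimationFrame( animate );

  // Move one particle per frame to the tracked object's position.
  particleBufferAttribute.setXYZ(
    particlesMesh.counter,
    objects[0].position.x,
    objects[0].position.y,
    objects[0].position.z
  );
  particleBufferAttribute.needsUpdate = true; // re-upload to the GPU

  particlesMesh.counter = ( particlesMesh.counter + 1 ) % particlesCount;

  renderer.render( scene, camera );
}
animate();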

Related

Three js texture offset not updating

I am currently trying to create a mesh that is colored using a DataTexture. My initial coloring shows up just fine, but my next goal is to offset the texture along the y axis, very similar to this example:
http://math.hws.edu/graphicsbook/demos/c5/textures.html
How I create my texture / mesh:
this.colorTexture = new DataTexture(colors, this.frameWidth, frameCount, RGBFormat, FloatType, UVMapping, RepeatWrapping, RepeatWrapping);
const material = new MeshBasicMaterial({
  side: FrontSide,
  vertexColors: true,
  wireframe: false,
  map: this.colorTexture
});
this.mesh = new Mesh(geometry, material);
How I attempt to animate the texture using offset:
this.mesh.material.map.offset.y -= 0.001;
this.mesh.material.map.needsUpdate = true;
this.mesh.material.needsUpdate = true;
this.mesh.needsUpdate = true;
I have confirmed that the function I'm using to apply the offset is called on each animation frame; however, the visualization itself is not animating or showing any changes apart from the initial positioning of the colors I wrote to the texture.
Any help is greatly appreciated :)
The uv transformation matrix of a texture is updated automatically as long as Texture.matrixAutoUpdate is set to true (which is also the default value). You can simply modulate Texture.offset. There is no need to set any needsUpdate flags (Mesh.needsUpdate does not exist anyway).
It's best if you strictly stick to the code from the webgl_materials_texture_rotation example. If this code does not work, please demonstrate the issue with a live example.
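As a minimal, self-contained sketch of that pattern (the texture URL is a placeholder and a standard renderer, scene and camera are assumed):
const texture = new THREE.TextureLoader().load( 'water.jpg' ); // placeholder asset
texture.wrapS = THREE.RepeatWrapping;
texture.wrapT = THREE.RepeatWrapping; // offsetting only wraps around if the texture repeats

const plane = new THREE.Mesh(
  new THREE.PlaneGeometry( 10, 10 ),
  new THREE.MeshBasicMaterial( { map: texture } )
);
scene.add( plane );

function animate() {
  requestAnimationFrame( animate );
  // No needsUpdate flags required: the uv transform matrix is rebuilt
  // automatically because texture.matrixAutoUpdate is true by default.
  texture.offset.y -= 0.001;
  renderer.render( scene, camera );
}
animate();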

calling object3D children for unique styles / animations

I'm wondering how I would go about calling the individual children of cube (mesh0, mesh1) so I'm able to set different styles / animations to them both.
AFRAME.registerComponent('multi_box', {
  schema: {},
  update: function () {
    for (var i = 0; i < 2; i++) {
      var material = new THREE.MeshBasicMaterial({color: "blue"});
      var geometry = new THREE.BoxGeometry(1, 1, 1);
      var cube = new THREE.Mesh(geometry, material);
      cube.position.x = i == 0 ? -1 : 1;
      cube.position.y = 0.5;
      cube.position.z = -5;
      this.el.setObject3D("mesh" + i, cube); //unique name for each object
    }
    console.log(this.el.object3DMap);
  }
});
Codepen link: https://codepen.io/ubermario/pen/wrwjVG
I can console.log them both and see that they are unique objects to each other but I'm having trouble calling them:
var meshtest = this.el.getObject3D('mesh0')
console.log(meshtest.position)
I've tried this method but with no luck: aframe get object3d children
Any help is appreciated :)
Instancing
In your for loop, you create
a new geometry instance,
a new material instance,
a new mesh that connects the previous two.
Each new keyword creates a unique instance that is independent of the others. In your case mesh0 and mesh1 are fully independent of each other. If you change any property of one instance, such as the material color or position, the other(s) will not be affected. In fact you already rely on that by assigning a different x position to each cube.
Storage
Your component holds a map of 3D objects. Each is identified by a unique name. You generate this name by concatenating the prefix mesh with the iteration number (value of i).
You can later access the cubes just the same way you created them. Either by name
this.el.getObject3D('mesh0')
this.el.getObject3D('mesh1')
//etc.
or by index
this.el.object3D.children[0]
this.el.object3D.children[1]
and you can manipulate them further. For example, you can put this on line 19 in your Codepen:
this.el.getObject3D('mesh0').position.y = 2;
this.el.object3D.children[1].position.z = -3;
Just for completeness: if you omitted the +i at the end, you would overwrite the same key again and again, so mesh would reference only the last cube and the others would be lost.
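As a small, hypothetical example of how that access could drive per-cube animation, a separate component with a tick handler could look like this (the component name, rotation speed and bobbing values are just illustrative):
AFRAME.registerComponent('multi_box_anim', {
  tick: function (time, timeDelta) {
    const cube0 = this.el.getObject3D('mesh0');
    const cube1 = this.el.getObject3D('mesh1');
    if (!cube0 || !cube1) { return; } // the meshes may not exist yet

    // Give each cube its own animation.
    cube0.rotation.y += 0.001 * timeDelta;
    cube1.position.y = 0.5 + Math.sin(time / 500) * 0.25;
  }
});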
Changing properties
Three.js has a nice API, but in JavaScript you always need to think about what happens behind the scenes. You will see that while learning new things. For example:
this.el.object3D.children[1].position.z = -3;
this.el.object3D.children[1].material.color.set('#ffff00');
this.el.object3D.children[1].material.color = [1, 1, 0];
As you can see, position can be changed directly, but color sometimes needs a setter. Things get more complicated with vectors, where you need to watch which methods change the current instance and which produce a new one. You could easily forget to clone one. Say you had:
var pos = new THREE.Vector3(-1, 0.5, -5);
for (var i = 0; i < 2; i++) {
  var material = new THREE.MeshBasicMaterial({color: "blue"});
  var geometry = new THREE.BoxGeometry(1, 1, 1);
  var cube = new THREE.Mesh(geometry, material);
  //wrong, since it would assign the object reference,
  //i.e. always the same object.
  //Thus, all cubes would end up at one position in the end
  cube.position = pos;
  //right: copy the values instead of the reference
  cube.position.copy(pos);
  pos.x += 1;
  this.el.setObject3D("mesh" + i, cube); //unique name for each object
}
The difference comes down to the broader topic of reference and value types. You will find many tutorials on that, for example this small gist I just found.
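Just as a tiny illustration of the difference with THREE.Vector3 (the values are arbitrary):
var a = new THREE.Vector3(1, 2, 3);
var byReference = a;        // same object
var byValue = a.clone();    // independent copy

a.x = 99;
console.log(byReference.x); // 99 - follows the change
console.log(byValue.x);     // 1  - unaffected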
I hope this was helpful and wish you good progress in learning!

Set target of directional light in THREE.js

I have a model far away from the origin, and I want a directional light to hit the model like sunlight would.
I set a position and a target for my DirectionalLight:
export const dirLight = getDirectional();

function getDirectional() {
  const dirLight = new DirectionalLight( 0xffffff, 1 );
  dirLight.position.set( 585000 + 10000, 6135000 + 10000, -500 + 5000 );
  return dirLight;
}

const helper = new THREE.DirectionalLightHelper( dirLight, 1000 );
let t = new THREE.Object3D();
t.translateX(585000);
t.translateY(6135000);
t.translateZ(1000);
dirLight.target = t;
scene.add(dirLight);
scene.add(dirLight.target);
scene.add(t);
helper.update();
scene.add( helper );
I would expect the light direction to be parallel to the vector between the light position and the light target, but apparently the light direction is still towards the origin of the scene. What am I doing wrong?
A running example can be seen here
The documentation states that the target needs to be added to the scene so that its world coordinates are calculated. However, that does not seem to work.
So instead I tried manually updating the world coordinates, and that worked. This will probably only work with a static target.
In your case that would be adding
dirLight.target.updateMatrixWorld();
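For completeness, a minimal sketch of the whole setup using the light's built-in target Object3D rather than a separate one (coordinates taken from the question; the helper is optional):
const dirLight = new THREE.DirectionalLight( 0xffffff, 1 );
dirLight.position.set( 585000 + 10000, 6135000 + 10000, -500 + 5000 );

// Aim the light at a point near the model instead of the scene origin.
dirLight.target.position.set( 585000, 6135000, 1000 );

scene.add( dirLight );
scene.add( dirLight.target );

// With a static target, updating its world matrix once is enough.
dirLight.target.updateMatrixWorld();

const helper = new THREE.DirectionalLightHelper( dirLight, 1000 );
scene.add( helper );
helper.update();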

Is it possible to let Fog interact with the material's opacity?

I am working on a project that displays buildings. The requirement is to let the buildings gradually fade out (become transparent) based on the distance between the camera and the buildings. Also, this effect has to follow the camera's movement.
I considered using THREE.Fog(), but the fog seems to be able to change only the material's color.
Above is a picture of the building with white fog.
The buildings are in tiles, each tile is one single geometry (I merged all the buildings into one) using
var bigGeometry = new THREE.Geometry();
bigGeometry.merge(smallGeometry);
The purple/blue area is the ground, which has ground.material.fog = false; so the ground doesn't interact with the fog.
My question is:
Is it possible to let the fog affect the building material's opacity instead of its color? (more white translates to more transparent)
Or should I use a shader to control the material's opacity based on the distance to the camera? But I have no idea how to do this.
I also considered adding an alphaMap. If so, each building tile would need its own alphaMap, and all of these alphaMaps would have to react to the camera's movement. That would be a ton of work.
So any suggestions?
Best Regards,
Arthur
NOTE: I suspect there are probably easier/prettier ways to solve this than opacity. In particular, note that partially-opaque buildings will show other buildings behind them. To address that, consider using a gradient or some other scene background, and choosing a fog color to match that, rather than using opacity. But for the sake of trying it...
Here's how to alter an object's opacity based on its distance. This doesn't actually require THREE.Fog; I'm not sure how you would use the fog data directly. Instead I'll use THREE.NodeMaterial, which (as of three.js r96) is fairly experimental. The alternative would be to write a custom shader with THREE.ShaderMaterial, which is also fine.
const material = new THREE.StandardNodeMaterial();
material.transparent = true;
material.color = new THREE.ColorNode( 0xeeeeee );

// Calculate alpha of each fragment roughly as:
//   alpha = 1.0 - saturate( distance / cutoff )
//
// Technically this is distance from the origin, for the demo, but
// distance from a custom THREE.Vector3Node would work just as well.
const distance = new THREE.Math2Node(
  new THREE.PositionNode( THREE.PositionNode.WORLD ),
  new THREE.PositionNode( THREE.PositionNode.WORLD ),
  THREE.Math2Node.DOT
);
const normalizedDistance = new THREE.Math1Node(
  new THREE.OperatorNode(
    distance,
    new THREE.FloatNode( 50 * 50 ),
    THREE.OperatorNode.DIV
  ),
  THREE.Math1Node.SAT
);
material.alpha = new THREE.OperatorNode(
  new THREE.FloatNode( 1.0 ),
  normalizedDistance,
  THREE.OperatorNode.SUB
);
Demo: https://jsfiddle.net/donmccurdy/1L4s9e0c/
I am the OP. After spending some time reading about how to use three.js's ShaderMaterial, I got some code that works as desired.
Here's the code: https://jsfiddle.net/yingcai/4dxnysvq/
The basic idea is:
Create a uniform that contains controls.target (a Vector3 position).
Pass the vertex position as a varying from the vertex shader so that the fragment shader can access it.
Get the distance between each vertex position and controls.target, and calculate an alpha value based on that distance.
Assign the alpha value to the vertex color.
Another important thing: because the fade-out mask should follow the camera's movement, don't forget to update the control value in the uniforms every frame.
// Create a uniform that contains the controls' target position.
uniforms = {
  texture: {
    value: new THREE.TextureLoader().load("https://threejs.org/examples/textures/water.jpg")
  },
  control: {
    value: controls.target
  }
};

// In the render() method, update the uniform value every frame.
uniforms.control.value = controls.target;
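For orientation, a rough, self-contained sketch of what shaders following those steps could look like; the uniform names and the fade radius below are illustrative and not the exact code from the fiddle:
const fadeUniforms = {
  control: { value: controls.target },  // fade center (the camera's orbit target)
  fadeRadius: { value: 50.0 }           // distance at which alpha reaches 0
};

const vertexShader = `
  varying vec3 vWorldPosition;
  void main() {
    // Pass the world-space vertex position on to the fragment shader.
    vWorldPosition = ( modelMatrix * vec4( position, 1.0 ) ).xyz;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
  }
`;

const fragmentShader = `
  uniform vec3 control;
  uniform float fadeRadius;
  varying vec3 vWorldPosition;
  void main() {
    // Fade out with distance from the control point.
    float d = distance( vWorldPosition, control );
    float alpha = 1.0 - clamp( d / fadeRadius, 0.0, 1.0 );
    gl_FragColor = vec4( 0.8, 0.8, 0.8, alpha );
  }
`;

const fadeMaterial = new THREE.ShaderMaterial({
  uniforms: fadeUniforms,
  vertexShader: vertexShader,
  fragmentShader: fragmentShader,
  transparent: true
});

// In the render loop, keep the fade center in sync with the camera target:
// fadeUniforms.control.value.copy( controls.target );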
I had the same issue, a few years later, and solved it with the .onBeforeCompile function, which is maybe more convenient to use.
There is a great tutorial here
The code itself is simple and could easily be adapted for other materials. It just uses the fogFactor as the alpha value of the material.
Here is the material function:
alphaFog() {
  const material = new THREE.MeshPhysicalMaterial();
  material.onBeforeCompile = function (shader) {
    const alphaFog = `
      #ifdef USE_FOG
        #ifdef FOG_EXP2
          float fogFactor = 1.0 - exp( - fogDensity * fogDensity * vFogDepth * vFogDepth );
        #else
          float fogFactor = smoothstep( fogNear, fogFar, vFogDepth );
        #endif
        gl_FragColor.a = saturate( 1.0 - fogFactor );
      #endif
    `;
    shader.fragmentShader = shader.fragmentShader.replace(
      '#include <fog_fragment>',
      alphaFog
    );
    material.userData.shader = shader;
  };
  material.transparent = true;
  return material;
}
and afterwards you can use it like
const cube = new THREE.Mesh(geometry, this.alphaFog());
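One thing to keep in mind: the injected #ifdef USE_FOG block is only compiled in when the scene actually has fog and the material's fog option is enabled (which it is by default). A minimal usage sketch, where buildingGeometry is a placeholder for your own geometry:
scene.fog = new THREE.Fog( 0xffffff, 10, 100 ); // color, near, far; without this, USE_FOG is never defined
const building = new THREE.Mesh( buildingGeometry, this.alphaFog() );
scene.add( building );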

Is there a way I can create a Path or Curve to use for TubeGeomety(path,...) from an existing geometry's points/vertices array?

I'm very new to both three.js & to js in general.
First I select a polyhedron geometry with a dat.GUI checkbox,
which renders, say, a tetrahedron. These selections work.
I also have a dat.GUI checkbox to render either a Phong fill or a wireframe.
I initially wanted just a wireframe-type mesh, but without all of the internal triangles. I found the EdgesGeometry() function, which draws pretty much what I want (hard edges only). There is, however, a known issue with linewidth no longer working on Windows; all lines are drawn with a width of 1.
I'd like to use TubeGeometry() to draw tubes of whatever radius, as opposed to width-1 lines. I know I'll have to draw something such as a sphere at/over the connecting vertices so it doesn't look ridiculous.
geo = new THREE.TetrahedronBufferGeometry(controls0.Radius,controls0.Detail);
...
egeo = new THREE.EdgesGeometry( geo );
lmat = new THREE.LineBasicMaterial({ color: 0x0099ff, linewidth: 4 });
ph = new THREE.LineSegments( egeo, lmat );
scene.add(ph);
....
Playing around in the console, I found some geometry/BufferGeometry arrays that are likely the vertices/indices of my selected X-hedron, as their sizes change with the type (tetra/icosa, etc.) selection and with detail increase/decrease:
//p = dome.geometry.attributes.uv.array;
p = egeo.attributes.position.array
//p = geo.attributes.uv.array
...
var path = new THREE.Curve();
path.getPoint = function (t) {
  // trace the arc as t ranges from 0 to 1
  var segment = (0 - Math.PI * 2) * t;
  return new THREE.Vector3( Math.cos(segment), Math.sin(segment), 0 );
};
var geomet = new THREE.TubeBufferGeometry( path, 10, 0.2, 12, false );
var mesh = new THREE.Mesh( geomet, mat );
scene.add( mesh );
From the above, the TubeGeometry draws fine on its own as well, but only with the "path" made by that curve example. How can I use the vertices from, for example, my tetrahedron to create such a "path" to pass to TubeGeometry()?
Maybe a function that creates "segment vectors" from the vertices?
I think it needs other properties of the curve/path as well?
I'm quite stuck at this point.
Any help, suggestions or examples would be greatly appreciated!
Thanks.
You can try to create a TubeGeometry for each edge. Generate a LineCurve3 as the input path. Use the vertices of the edge as the start and end vector for the line.
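A rough sketch of that idea, reading the edge endpoints in pairs out of the EdgesGeometry position attribute (variable names follow the question's code; the radius and segment counts are arbitrary):
const positions = egeo.attributes.position.array; // two consecutive vertices per edge
const tubeMat = new THREE.MeshPhongMaterial({ color: 0x0099ff });

for (let i = 0; i < positions.length; i += 6) {
  const start = new THREE.Vector3( positions[i], positions[i + 1], positions[i + 2] );
  const end = new THREE.Vector3( positions[i + 3], positions[i + 4], positions[i + 5] );

  // A straight line between the two edge vertices serves as the tube's path.
  const linePath = new THREE.LineCurve3( start, end );
  const tubeGeo = new THREE.TubeBufferGeometry( linePath, 1, 0.2, 8, false );
  scene.add( new THREE.Mesh( tubeGeo, tubeMat ) );

  // Optional: a sphere at the joint so the corners don't look broken.
  const joint = new THREE.Mesh( new THREE.SphereBufferGeometry( 0.2, 8, 8 ), tubeMat );
  joint.position.copy( start );
  scene.add( joint );
}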
Consider using something like "triangulated lines" as an alternative in order to visualize the wireframe of a mesh with a line width greater than 1. With the next release of three.js (R91) there are new line primitives for this. Demo:
https://rawgit.com/mrdoob/three.js/dev/examples/webgl_lines_fat.html
This approach is much more performant than drawing a bunch of meshes with a TubeGeometry.
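If you go that route, a rough sketch could look like the following; note that, depending on the three.js version, these classes live in examples/js/lines or examples/jsm/lines, so the exact way to pull them in differs:
// Assumes LineSegmentsGeometry, LineMaterial and LineSegments2 from the three.js "lines" examples.
const edgePositions = Array.from( egeo.attributes.position.array ); // flat x,y,z list, two points per edge

const segGeometry = new LineSegmentsGeometry();
segGeometry.setPositions( edgePositions );

const segMaterial = new LineMaterial({ color: 0x0099ff, linewidth: 4 }); // width in pixels
segMaterial.resolution.set( window.innerWidth, window.innerHeight ); // must be updated on resize

scene.add( new LineSegments2( segGeometry, segMaterial ) );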
