Three js Explode Modifier with Material - three.js

Trying to mimic the behaviour of this site.
http://experience.mausoleodiaugusto.it/it/chapter-02
I can successfully load a DAE model with a texture and display it in the three.js scene. I have also experimented with the tessellate and explode modifiers with reasonable success. However, the method I am using takes the DAE geometry, creates faces from it, assigns each face a colour, turns them into a new mesh, and adds that to the scene alongside my original DAE model. So I end up with two objects: my original DAE model with its texture, and the new exploded mesh with coloured faces.
How would I map the texture from the DAE file to each individual face in the new mesh, so that when it explodes it looks accurate as per the example? Or am I going about this the wrong way? Perhaps each face needs its individual texture already mapped in the DAE file?
Any suggestions would be welcome.
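One possible approach (a sketch, not a tested solution): when building the exploded mesh, copy each face's UV triple from the source geometry into the per-face geometry and reuse the original textured material, so every fragment keeps its piece of the texture. `faceUVs` below is a hypothetical helper, assuming a non-indexed geometry whose `uv` attribute is a flat array of 2 floats per vertex:

```javascript
// Hypothetical helper: extract the three (u, v) pairs belonging to
// face `f` from a non-indexed uv attribute array (2 floats per vertex,
// 3 vertices per face).
function faceUVs(uvArray, f) {
  var out = [];
  for (var v = 0; v < 3; v++) {
    var i = (f * 3 + v) * 2;
    out.push([uvArray[i], uvArray[i + 1]]);
  }
  return out;
}

// Sketched usage with three.js (assumed names, untested):
// var geo = new THREE.BufferGeometry();
// geo.setAttribute('position', facePositions);
// geo.setAttribute('uv', new THREE.Float32BufferAttribute(
//   faceUVs(sourceUVArray, f).flat(), 2));
// scene.add(new THREE.Mesh(geo, originalTexturedMaterial));
```

Because each fragment carries the original UVs, the texture stays visually continuous before the explosion starts, which appears to be what the linked example does.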

Related

Collada model loaded in Black

I copied the example from the official three.js repo showing how to load a DAE Collada model, modified it to remove the animations, and loaded my own Collada model.
The problem is that the model loads in black, as you can see in this codesandbox, and I wasn't able to change the material color!
You can see that the same model file (after I converted it from Collada to glTF) is parsed with a blue color!
Can you please tell me how I can load the color correctly? Thanks in advance.
I looked at your geometry with console.log(child.geometry) and noticed that it doesn't have any normals data. It only has position, uv, and uv2 attributes, but you need normals to calculate how light reflects off your faces.
There are 3 things you can do to solve this:
1 Add your own normals:
Re-export your Collada file with normals data from your 3D editor software.
2 Flat shading:
Use flat-shading so every face is flat, without smooth or rounded edges. You don't need normals data for this because the shader just assumes every triangle's normal is pointing perpendicular to its surface.
avatar.traverse(function (child) {
  if (child.isMesh) {
    child.material.flatShading = true;
  }
});
3 Calculate normals:
You could ask Three.js to calculate the normals for you. Just be aware that it does this by averaging connected vertices, so faces might accidentally look too smooth or too hard in places you don't expect.
avatar.traverse(function (child) {
  if (child.isMesh) {
    child.geometry.computeVertexNormals();
  }
});

Importing mesh from blender causes distortion

I'm trying to load a skinned mesh from Blender into a THREE.js scene, but it... looks a little odd:
There is a skeleton loaded from the animation only, the mesh modified for the game, and a small one loaded directly with the three.js ObjectLoader.
It's supposed to look more like this (from blender):
I found it!
When exporting from Blender, I changed the number of bone influences to 4. It appears that some vertices were being influenced by more than two bones, so when only two influencing bones were exported, the mesh distorted.

Unity 3D Does Not Apply Texture To Object Properly

I have a problem with texturing an object in Unity 3D. I made a simple object in 3ds Max and imported it into Unity, then tried to apply an image as a texture, but it doesn't apply the texture; it only changes the color of the object! This is the print screen:
As you can see, I have two models. One was made in 3ds Max and doesn't take the texture; the other was made in Unity (it's a cube) and gets the texture correctly.
So what's going wrong here? Note that I also changed the tiling and offset settings of the model's shader, but still nothing changed at all! :(
You didn't UV-unwrap the 3D object before exporting it to Unity3D.
To apply a texture to a 3D mesh, any engine needs texture coordinates for the vertices; this is called UV mapping.
WIKIPEDIA - UV_mapping
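To illustrate the idea only (in JavaScript, to match the other snippets on this page, not the Unity workflow): UV mapping means every vertex carries a (u, v) coordinate into the texture image. A crude planar projection shows what those coordinates are; real assets get their UVs from unwrapping in the 3D editor, which is what the model here is missing:

```javascript
// Illustration of UV mapping: a simple planar projection that assigns
// each vertex a (u, v) in [0, 1] derived from its (x, y) position.
// `positions` is a flat [x, y, z, x, y, z, ...] array.
function planarUV(positions) {
  var minX = Infinity, maxX = -Infinity, minY = Infinity, maxY = -Infinity;
  for (var i = 0; i < positions.length; i += 3) {
    minX = Math.min(minX, positions[i]);     maxX = Math.max(maxX, positions[i]);
    minY = Math.min(minY, positions[i + 1]); maxY = Math.max(maxY, positions[i + 1]);
  }
  var uvs = []; // flat [u, v, u, v, ...] array
  for (var j = 0; j < positions.length; j += 3) {
    uvs.push((positions[j] - minX) / (maxX - minX),
             (positions[j + 1] - minY) / (maxY - minY));
  }
  return uvs;
}
```

A projection like this only works for roughly flat surfaces; anything else needs a proper unwrap in the editor so the engine knows which texel belongs to which vertex.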

Convert THREE.js scene to CANNON.js world

I have a basic THREE.js scene, created in Blender, including cubes and rotated planes. Is there any way that I can automatically convert this THREE.js scene into a CANNON.js world ?
Thanks
Looking at the three.js Blender exporter, it appears to export only mesh data, with no information about the mathematical shapes (boxes, planes, spheres, etc.) that Cannon.js needs to work. You could try to import your meshes directly into Cannon.js using its Trimesh class, but sadly that only works for collisions against spheres and planes.
What you need to feed Cannon.js is mathematical geometry data: which triangles in your mesh represent a box (or plane), and where its center of mass is.
A common (manual) workflow for creating 3D WebGL physics is to import the 3D models into a WebGL-enabled game engine (such as Unity, Goo Create, or PlayCanvas). In the game engine you can add collider shapes (boxes, planes, spheres, etc.) to your models so the physics engine can work efficiently. From there you can preview your physics simulation and export a complete WebGL experience.
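For simple box-shaped meshes, the manual route above can be sketched as: take the mesh's axis-aligned bounding box and build a CANNON.Box from its half extents. `halfExtents` is a hypothetical helper; the commented cannon.js calls follow its commonly documented API and may need adapting to your version:

```javascript
// Hypothetical helper: half extents of a box from its min/max corners,
// given as [x, y, z] arrays. A CANNON.Box is defined by these.
function halfExtents(min, max) {
  return [(max[0] - min[0]) / 2,
          (max[1] - min[1]) / 2,
          (max[2] - min[2]) / 2];
}

// Sketched usage with three.js + cannon.js (assumed, untested):
// mesh.geometry.computeBoundingBox();
// var bb = mesh.geometry.boundingBox;
// var he = halfExtents(bb.min.toArray(), bb.max.toArray());
// var body = new CANNON.Body({
//   mass: 1,
//   shape: new CANNON.Box(new CANNON.Vec3(he[0], he[1], he[2]))
// });
// body.position.copy(mesh.position);
// world.addBody(body);
```

This is essentially what the mesh2shape helper mentioned below automates, one object at a time.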
Going to post another answer since there are a few new options to consider here...
I wrote a simple mesh2shape(...) helper that can convert (one object at a time) from THREE.Mesh to CANNON.Shape objects. It doesn't support certain features, such as heightmaps/terrain.
Example:
var shape = mesh2shape(object3D, { type: mesh2shape.Type.BOX });
There is an (experimental!) BLENDER_physics extension for the glTF format that includes physics data with a model. You could add physics data in Blender, export to glTF, and then modify THREE.GLTFLoader to pass the physics data along to your application, helping you construct CANNON.js objects.

three.js create texture from cubecamera

When using a cube camera one normally sets the envMap of the material to the cubeCamera.renderTarget, e.g.:
var myMaterial = new THREE.MeshBasicMaterial({
  color: 0xffffff,
  envMap: myCubeCamera.renderTarget,
  side: THREE.DoubleSide
});
This works great for meshes that are meant to reflect or refract what the cube camera sees. However, I'd like to simply create a texture and apply that to my mesh. In other words, I don't want my object to reflect or refract. I want the face normals to be ignored.
I tried using a THREE.WebGLRenderTarget, but it won't handle a cube camera. And using a single perspective camera with a WebGLRenderTarget obviously does not give me a 360° texture.
Finally, simply assigning the cubeCamera.renderTarget to the 'map' property of the material doesn't work either.
Is it possible to do what I want?
r73.
Edit: this is not what the author of the question is looking for, I'll keep my answer below for other people
Your envMap is already a texture, so there's no need to apply it as a map. Also, cubemaps and plain textures are structurally different, so they can't simply be swapped; even if you managed it, the result would not be what you expect.
If I understand correctly, you want a static envMap instead of one that updates each frame. If that's the case, simply don't call myCubeCamera.updateCubeMap() in your render function. Instead, call it once at the end of your scene initialization, with the cube camera at your desired position; the envMap will then show only that frame.
See examples below:
Dynamic Cubemap Example
Static Cubemap Example
The answer is: set refractionRatio on the material to 1.0. Face normals are then ignored, since no refraction occurs.
In a normal situation where the Cube Camera is in the same scene as the mesh, this would be pointless because the mesh would be invisible. But in cases where the Cube Camera is looking at a different scene, then this is a useful feature.
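Putting the accepted answer together, a sketch against the r73-era API mentioned in the question (names may differ in later releases):

```javascript
// Render the cube camera once at init (not per frame) against the
// *other* scene whose view you want baked into the texture:
// myCubeCamera.updateCubeMap(renderer, otherScene);

var myMaterial = new THREE.MeshBasicMaterial({
  envMap: myCubeCamera.renderTarget, // cube render target as env map
  refractionRatio: 1.0,              // 1.0: normals no longer bend the lookup
  side: THREE.DoubleSide
});
```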
