I'm using BufferGeometry to draw triangles.
I can use a mesh geometry, specifying 3 indices in the index attribute for every triangle. I'm using a basic material without wireframe. I suppose I'll be able to use raycasting.
I have also seen the LineSegments approach for wireframes. Interesting.
OK, my problem: I'd like to see my triangles as a wireframe and I also need raycasting. So... is the solution to create my own shader?
Thanks
You don't have to create a custom shader. You can have a mesh with a wireframe material, and the ray should still "hit" the object:
var mesh = new THREE.Mesh(geometry,new THREE.MeshBasicMaterial({wireframe : true}));
If for some reason it does not hit, or you want a LineSegments object, you can keep track of all transformations that affected the object and apply them onto a mesh you won't add to the scene:
var segmentObject = new THREE.LineSegments(geometry,lineMaterial);
scene.add(segmentObject);
var meshNotInScene = new THREE.Mesh(geometry,dummyMaterial);
You then use the mesh object to determine whether the raycast hit the object.
This way you can also give an object a different hitbox: for example, if you have a flying donut, pairing it with a circle mesh lets you select it even if you click inside its hole.
Keep in mind that materials have sides; if you don't care which side is which, set side to THREE.DoubleSide.
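For completeness, here is a minimal sketch of the hidden-mesh approach, reusing the segmentObject and meshNotInScene variables from above (mouseNdc and camera are assumed to exist elsewhere in your app):
var raycaster = new THREE.Raycaster();

function pick(mouseNdc) {
    // mirror the visible wireframe's transform onto the hidden mesh
    meshNotInScene.position.copy(segmentObject.position);
    meshNotInScene.quaternion.copy(segmentObject.quaternion);
    meshNotInScene.scale.copy(segmentObject.scale);
    meshNotInScene.updateMatrixWorld(true); // raycasting works off matrixWorld

    raycaster.setFromCamera(mouseNdc, camera);
    return raycaster.intersectObject(meshNotInScene);
}

var hits = pick(new THREE.Vector2(0, 0)); // e.g. the centre of the screen
if (hits.length > 0) console.log('hit at', hits[0].point);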
I have a model with only one mesh, and the mesh's material is an array of 4 ShaderMaterials.
I want to click the mesh to get the material at the clicked position. I use a Raycaster to get the intersected object, and I want to find the corresponding material through face.materialIndex. How do I get the right material?
If you have a specific materialIndex, you can get the corresponding material object like so:
const material = mesh.material[ materialIndex ];
As stated in the documentation, when using multiple materials per mesh, Mesh.material becomes an array of materials.
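For example, a small sketch of the lookup inside a click handler (assuming mesh, camera, and the pointer's normalized device coordinates pointerNdc are already set up):
const raycaster = new THREE.Raycaster();
raycaster.setFromCamera(pointerNdc, camera);

const intersects = raycaster.intersectObject(mesh);
if (intersects.length > 0) {
    const face = intersects[0].face;                    // the triangle that was hit
    const material = mesh.material[face.materialIndex]; // pick the matching material
    console.log('clicked material index:', face.materialIndex, material);
}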
I'm learning webgl, threejs and glsl shaders. The scene you see below is my attempt at working with instanced geometry. I have 3 patches of "grass". The grass is actually made of instanced cones around the central position of the mesh I'm building.
As you can see, if the central position (marked with a white wireframe cone) of the mesh is no longer in the camera view, all instances disappear.
How can I prevent this?
And to be even more specific: are all the grass patches, or all fire instances, or all particles of the same type supposed to be instanced at once and placed around the scene as we see fit? My assumption is that they should be. Right?
If you are using InstancedBufferGeometry, there are two ways you can deal with frustum culling.
One way is to turn frustum culling off for the mesh:
mesh.frustumCulled = false;
The other option is to set the bounding sphere of the mesh's geometry manually, if it is known in your case:
geometry.boundingSphere = new THREE.Sphere( new THREE.Vector3(), radius );
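If you go the manual bounding-sphere route, one way to derive the radius is from the instance offsets themselves. A rough sketch, assuming a hypothetical InstancedBufferAttribute named 'offset' that holds each instance's position relative to the mesh origin:
var offsets = geometry.getAttribute('offset'); // hypothetical attribute name
var baseRadius = 1.0;                          // rough radius of a single cone instance
var maxDist = 0;

for (var i = 0; i < offsets.count; i++) {
    var d = Math.sqrt(
        offsets.getX(i) * offsets.getX(i) +
        offsets.getY(i) * offsets.getY(i) +
        offsets.getZ(i) * offsets.getZ(i)
    );
    if (d > maxDist) maxDist = d;
}

geometry.boundingSphere = new THREE.Sphere(new THREE.Vector3(), maxDist + baseRadius);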
three.js r.88
I want to place a circle on an object when you click on it.
I updated this example to work with the WebGL renderer (I basically changed THREE.SpriteCanvasMaterial to THREE.SpriteMaterial), but not only is the circle now a square, it also glitches with the surface of the object. Here's a JSFiddle that demonstrates my problem (click an object to test).
I found some similar questions on stackoverflow, but I can't seem to make it work in my example. Any tips?
You are adding sprites into a moving 3D scene. An easy way to do what you want is to add spheres instead:
var particle = new THREE.Mesh(
    new THREE.SphereGeometry(10, 10, 10),
    new THREE.MeshBasicMaterial({ color: 0xff0000 })
);
particle.position.copy( intersects[ 0 ].point );
scene.add( particle );
If you want real flat circles you will have to create a circular geometry and align it correctly to the object so it sits atop the rectangular wall; and if you want circles that do not deform when the object is turned, you will need some custom shader work.
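A rough sketch of the flat-circle idea, aligning a CircleGeometry to the hit face (the 0.1 offset and 32 segments are arbitrary choices):
var hit = intersects[0];

var circle = new THREE.Mesh(
    new THREE.CircleGeometry(10, 32),
    new THREE.MeshBasicMaterial({ color: 0xff0000, side: THREE.DoubleSide })
);

// face normals are in object space; bring the normal into world space first
var normal = hit.face.normal.clone()
    .applyMatrix3(new THREE.Matrix3().getNormalMatrix(hit.object.matrixWorld))
    .normalize();

// sit the circle on the surface, nudged slightly outward to avoid z-fighting
circle.position.copy(hit.point).add(normal.clone().multiplyScalar(0.1));
circle.lookAt(circle.position.clone().add(normal)); // face outward along the normal

scene.add(circle);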
When using a cube camera one normally sets the envMap of the material to the cubeCamera.renderTarget, e.g.:
var myMaterial = new THREE.MeshBasicMaterial({color:0xffffff,
envMap: myCubeCamera.renderTarget,
side: THREE.DoubleSide});
This works great for meshes that are meant to reflect or refract what the cube camera sees. However, I'd like to simply create a texture and apply that to my mesh. In other words, I don't want my object to reflect or refract. I want the face normals to be ignored.
I tried using a THREE.WebGLRenderTarget, but it won't handle a cube camera. And using a single perspective camera with a WebGLRenderTarget obviously does not give me a 360-degree texture.
Finally, simply assigning the cubeCamera.renderTarget to the 'map' property of the material doesn't work either.
Is it possible to do what I want?
r73.
Edit: this is not what the author of the question is looking for; I'll keep my answer below for other people.
Your envMap is already a texture, so there's no need to apply it as a map. Also, cube maps and regular textures are structurally different, so you can't simply swap them; and if you did manage to, the result would probably not be what you expect.
From what you're asking, I understand you want a static envMap instead of one that is updated each frame. If that's the case, simply don't call myCubeCamera.updateCubeMap() in your render function. Instead, place it at the end of your scene initialization with your desired cube camera position; your envMap will then show only that frame.
See examples below:
Dynamic Cubemap Example
Static Cubemap Example
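For the static case, a minimal sketch for an r7x-era build (where the method was still called updateCubeMap; the position and resolution below are placeholders):
var cubeCamera = new THREE.CubeCamera(0.1, 1000, 256); // near, far, cube resolution
cubeCamera.position.set(0, 10, 0);                     // wherever the capture should happen
scene.add(cubeCamera);

// capture once at init instead of every frame -> static envMap
cubeCamera.updateCubeMap(renderer, scene);

var material = new THREE.MeshBasicMaterial({ envMap: cubeCamera.renderTarget });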
The answer is: Set the refractionRatio on the material to 1.0. Then face normals are ignored since no refraction is occurring.
In a normal situation where the Cube Camera is in the same scene as the mesh, this would be pointless because the mesh would be invisible. But in cases where the Cube Camera is looking at a different scene, then this is a useful feature.
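Applied to the material from the question, the trick might look something like this (note that refractionRatio only takes effect when the env map is sampled with a refraction mapping, e.g. THREE.CubeRefractionMapping, which may need to be set on the render target depending on your revision):
var myMaterial = new THREE.MeshBasicMaterial({
    color: 0xffffff,
    envMap: myCubeCamera.renderTarget,
    refractionRatio: 1.0, // normals no longer bend the lookup
    side: THREE.DoubleSide
});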
I have a THREE.js scene with a plane geometry in the middle of the scene. The plane geometry has a camera added to it. I am also using this example http://mrdoob.github.io/three.js/examples/webgl_materials_lightmap.html to add a lightmap, and I am adding it to my plane geometry.
Pseudocode:
planeGeometry.add(camera);
planeGeometry.add(sphereGeometryLightMap);
The problem is that when I try to rotate the sphere geometry on any axis, nothing happens. I tried using .rotation and setting the Matrix4 directly. Why can't I rotate this sphere object when it is added to another object? How can I work around this?
You don't need to add the camera to the planeGeometry.
Also, I assume that sphereGeometryLightMap is of type THREE.Mesh, in which case you need to add it to the scene, not to the planeGeometry.
If you want to position the sphereGeometry relative to the planeGeometry, you can do:
var object = new THREE.Object3D();
object.add (planeGeometry);
object.add (sphereGeometry);
Before you add the geometries to the object, position them relative to each other as you want.
Then, instead of modifying the planeGeometry, you modify the object.
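A compact sketch of that, with hypothetical planeMesh / sphereMesh variables standing in for the two meshes:
var group = new THREE.Object3D();

planeMesh.position.set(0, 0, 0);
sphereMesh.position.set(0, 5, 0); // place the sphere relative to the plane

group.add(planeMesh);
group.add(sphereMesh);
scene.add(group);

// in the render loop: rotate the whole arrangement...
group.rotation.y += 0.01;

// ...or spin the sphere on its own, independently of the plane
sphereMesh.rotation.x += 0.02;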