I'm trying to get a sprite sheet texture in THREE.PointsMaterial to use offsets, so point particles can have different visuals.
Apparently this should be possible using THREE.ShaderMaterial, instead of PointsMaterial.
This answer seems to get so close to my solution, but the fiddle is broken and I'm having a hard time reconstructing it.
Shaders are new to me and I could use some help achieving the goal: get point particles to use different sections of a sprite sheet.
With a single image texture it's easy, like this, but trying to use a sprite sheet with offsets is killing me ;)
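For what it's worth, this is the rough direction I think that answer was going in (a sketch only; the attribute/uniform names and the 4x4 sheet layout are my own guesses, assuming a recent three.js):

```javascript
// Each point gets a per-vertex "texOffset" pointing at its tile's corner in
// the sprite sheet; the fragment shader then samples inside that tile only.
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.Float32BufferAttribute(positions, 3));
geometry.setAttribute('texOffset', new THREE.Float32BufferAttribute(offsets, 2));

const material = new THREE.ShaderMaterial({
  uniforms: {
    spriteSheet: { value: spriteSheetTexture },           // the whole sheet
    tileSize:    { value: new THREE.Vector2(0.25, 0.25) } // assuming a 4x4 sheet
  },
  vertexShader: `
    attribute vec2 texOffset;
    varying vec2 vTexOffset;
    void main() {
      vTexOffset = texOffset;
      gl_PointSize = 32.0;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform sampler2D spriteSheet;
    uniform vec2 tileSize;
    varying vec2 vTexOffset;
    void main() {
      // gl_PointCoord runs 0..1 over the point; flip y so it matches texture UVs
      vec2 uv = vTexOffset + vec2(gl_PointCoord.x, 1.0 - gl_PointCoord.y) * tileSize;
      gl_FragColor = texture2D(spriteSheet, uv);
    }
  `,
  transparent: true
});

scene.add(new THREE.Points(geometry, material));
```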
I want to render a cube similar to the one pictured.
My problem is how to render the face projections.
I tried using Reflector, but it is tricky to size and position so it captures just the face that I want, and also shows the sides.
I also saw that I can render to a separate canvas (I imagine using an orthographic camera), but I would like everything to stay in the same canvas. I saw an example with multiple views, but it seems they cannot be positioned behind the main view.
So, is there a way to achieve this?
One possible approach to solve the issue:
Set up an orthographic camera such that its frustum encloses the cube. You can then position the camera in front of each side of the cube, use lookAt( cube.position ) to orient it properly, and then render the scene into a render target. You need one render target per side. You can then use each render target as a texture for the respective plane mesh.
There is an official live example that demonstrates how RTT (render-to-texture) is done with three.js. Try to use it as a code template for your own app.
https://threejs.org/examples/#webgl_rtt
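As a rough illustration of that approach for a single face (a sketch only, assuming a `scene`, `renderer`, `cube` mesh and `mainCamera` already exist; repeat it with its own camera position and render target for each side):

```javascript
// Render the cube's +Z face into a render target with an orthographic camera,
// then show that target on a plane placed behind the cube.
const faceSize = 1; // assumed cube edge length
const rt = new THREE.WebGLRenderTarget(512, 512);

const orthoCam = new THREE.OrthographicCamera(
  -faceSize / 2, faceSize / 2,   // left, right
   faceSize / 2, -faceSize / 2,  // top, bottom
   0.1, 10
);
orthoCam.position.copy(cube.position).add(new THREE.Vector3(0, 0, 2)); // in front of the +Z face
orthoCam.lookAt(cube.position);

const projectionPlane = new THREE.Mesh(
  new THREE.PlaneGeometry(faceSize, faceSize),
  new THREE.MeshBasicMaterial({ map: rt.texture })
);
projectionPlane.position.copy(cube.position).add(new THREE.Vector3(0, 0, -3)); // behind the cube
scene.add(projectionPlane);

function render() {
  projectionPlane.visible = false;      // don't capture the projection plane itself
  renderer.setRenderTarget(rt);
  renderer.render(scene, orthoCam);
  renderer.setRenderTarget(null);

  projectionPlane.visible = true;
  renderer.render(scene, mainCamera);
  requestAnimationFrame(render);
}
render();
```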
Is it possible to change the width of an FBX model in 3D without ruining its realistic look, so that after changing its dimensions the model does not look stretched?
If two objects are placed beside each other, I need to increase the size of one object and reposition the other object relative to the first.
Thanks in advance.
This breaks down into two problems. If you scale an object in just one dimension it will always stretch; take your table, for example:
While the board looks fine, the legs will get stretched and look unrealistic.
Now the question is what can you do?
It depends on your model.
First of all, does your model consist of a single mesh, or does every component have its own mesh?
Preferably you want each component to be an independent mesh object. For your table it would be something like this:
This way you can scale only the board and then move the legs so that they fit the new board size.
If you have only one mesh there is not a lot you can do in Unity. For that you would need to go into Blender or any other 3D modeling tool and split the components manually.
Now, if you stretched the board and your model has a texture, you will notice that the texture looks stretched.
What can you do about that?
Go to your texture and first check the wrap mode.
In this case we want it set to Repeat. After that we need to change the material settings.
Since we stretched the geometry we need to change the tiling. Before, it was y = 1, but we scaled the y dimension, so now we need to adapt this number as well so the texture repeats. For a table this is doable; if we are working with more complex textures that have specific parts, this will not work and you will have to change the texture manually.
Now the texture looks better, but you will probably see abrupt color changes where the texture repeats; I "circled" it in the picture. To fix this you have to edit the texture in an image editing program and make it seamless.
I hope this helped a bit. I know this only covers the basics, and to get a perfect texture you have to put in a bit more work, but for that I would highly recommend reading a tutorial.
Hello, I'm trying to achieve the effect in the image below (something like a light shine, but only on top of the RawImage).
Unfortunately I cannot figure out how to do it. I tried some shaders and assets from the Asset Store, but so far none of them has worked, and I don't know much about shaders.
The RawImage is a UI element, and it displays a render texture that is being captured by a camera.
I'm totally lost here; any kind of help will be appreciated. How can I create that effect?
Fresnel shaders use the difference between the surface normal and the view vector to detect which pixels are facing the viewer and which aren't. A UI plane will always face the user, so no luck there.
Solving this with shaders can be done in a couple of ways: either you bake a normal map of the imagined "curvature" of the outer edge (example), or you create a signed distance field (example) or some similar map of the distance to the edge. A normal map would probably allow for the most complex effects, and I am sure that some Fresnel shaders could work with it too. It does, however, require you to make a model of the shape and bake the normals from that.
A signed distance field, on the other hand, can be generated by a script from an image, so if you have a lot of images it might be the fastest approach. Computing the edge distance in real time inside the shader would not really work, since you'd have to sample a very large number of neighboring pixels, which might make the shader 10-20 times slower depending on how thick you need the edge to be.
If you don't need the image to be that dynamic, then maybe just creating an inner glow black/white texture in Photoshop and overlaying it using an additive shader would work better for you. If you don't know how to write shaders, then maybe the two above approaches are a bit of a tall order.
I am visualizing a graph using Three.js and for each node of the graph I add a label using TextGeometry. It is a pretty small graph but when I add text my application gets really slow. What should I do about it?
TextGeometry is more suitable for cases where you are really interested in rendering the text in 3D. It creates complex geometry that will surely slow your app down, especially when there is a lot of text or you use CanvasRenderer.
For labels, it is generally better to use 2D labels, which are much faster to render. There are many different approaches to this: the labels can go on top of the Three.js rendering canvas, on a separate canvas, or even be normal HTML nodes positioned with CSS. Alternatively, you can dynamically draw your label texts onto small canvases and use them as sprite textures that always face the camera; this might be the easiest way, as the labels become part of the 3D scene like your other objects (see the sketch after the links below). For a separate-layer approach, you need to use unprojectVector or similar to compute screen XY coordinates that match your 3D scene positions.
See these SO posts for example:
- Dynamically create 2D text in three.js
- Canvas and SpriteMaterial
- How do I add a tag/label to appear on top of several objects so that the tag always faces the camera when the user clicks the object?
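Here is a rough sketch of the canvas-as-sprite-texture approach (sizes, font and the `node` object are placeholders, assuming a reasonably recent three.js that has `CanvasTexture`):

```javascript
// Draw the label text on a small canvas and use it as a sprite texture,
// so the label always faces the camera and lives in the 3D scene.
function makeLabelSprite(text) {
  const canvas = document.createElement('canvas');
  canvas.width = 256;
  canvas.height = 64;
  const ctx = canvas.getContext('2d');
  ctx.font = '32px sans-serif';
  ctx.fillStyle = 'white';
  ctx.textAlign = 'center';
  ctx.textBaseline = 'middle';
  ctx.fillText(text, canvas.width / 2, canvas.height / 2);

  const texture = new THREE.CanvasTexture(canvas);
  const material = new THREE.SpriteMaterial({ map: texture, transparent: true });
  const sprite = new THREE.Sprite(material);
  sprite.scale.set(2, 0.5, 1); // world-space size of the label
  return sprite;
}

// Usage: attach a label slightly above each graph node.
const label = makeLabelSprite('node 42');
label.position.copy(node.position).add(new THREE.Vector3(0, 0.6, 0));
scene.add(label);
```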
I'm struggling with a visualization I'm working on that involves a stream of repeated images. I have it working with a single sprite with a ParticleSystem, but I can only apply a single material to the system. Since I want to choose between textures I tried creating a pool of Particle objects so that I could choose the materials individually, but I can't get an individual Particle to show up with the WebGL renderer.
This is my first foray into WebGL/Three.js, so I'm probably doing something bone-headed, but I thought it would be worth asking what the proper way to go about this is. I'm seeing three possibilities:
I'm using Particle wrong (initializing with a mapped material, adding to the scene, setting position) and I need to fix what I'm doing.
I need a ParticleSystem for each sprite I want to display.
What I'm doing doesn't fit into particles at all and I really should be using another object type.
All the examples I see using the canvas renderer use Particle directly, but I can't find an example using the WebGL renderer that doesn't use ParticleSystem. Any hints?
OK, I am going from what I have read elsewhere, on this GitHub issues page; you should start by reading it. It seems that Particle is only for the CanvasRenderer, and it will become Sprite in a future edition of Three.js. ParticleSystem, however, is not going to fulfill your needs either, it seems. I don't think these classes are going to help you accomplish this in WebGL in 3D. Depending on what you are doing, you might be better off with the CanvasRenderer anyway. ParticleSystem will only let you apply a single material, which serves as the material for every particle in the system, as you suggested.
Short answer:
You can render THREE.Particle using THREE.CanvasRenderer only.
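If you do stay with the WebGLRenderer, your option 2 (one system per texture) is the usual workaround. A rough sketch, assuming an old three.js revision where `Geometry`, `ParticleBasicMaterial` and `ParticleSystem` still exist (in later revisions these became `BufferGeometry`, `PointsMaterial` and `Points`):

```javascript
// One ParticleSystem per sprite texture, since each system can only use one material.
function makeSystem(texture, positions) {
  const geometry = new THREE.Geometry();
  positions.forEach(function (p) {
    geometry.vertices.push(new THREE.Vector3(p.x, p.y, p.z));
  });
  const material = new THREE.ParticleBasicMaterial({
    map: texture,
    size: 10,
    transparent: true
  });
  return new THREE.ParticleSystem(geometry, material);
}

// Group your particle positions by which image they should show.
scene.add(makeSystem(textureA, positionsUsingA));
scene.add(makeSystem(textureB, positionsUsingB));
```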