apply morphTargets to ParticleSystem or PointCloud - three.js

I am using Three.js version 65.
I am displaying a set of points at time t=0 in 3D space using ParticleSystem, and I also have the next set of points at time t=1. Now I want to animate between the two sets, like the morphTarget animation in JSONLoader. Could anybody suggest the best way to achieve this?
(or)
Should I use WebGL shader programming for this instead? Please advise.
Thanks in advance.

Yes, you can do that with shaders. You would, for example, create a custom shader for your particle system with the attributes vec3 position and vec3 nextPosition, plus a uniform float scale that goes from 0 to 1.
Then you add some logic to the shader that computes the new position, e.g. vec3 pos = position * (1.0 - scale) + nextPosition * scale (along with the usual billboard / GL_Point code, of course). Once scale reaches 1 you swap position with nextPosition, fill nextPosition with the next point in the sequence, and reset scale to 0.
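For illustration, here is a minimal sketch of that idea using the current three.js API (THREE.Points / BufferGeometry rather than the r65 ParticleSystem; the names nextPosition, uScale, positionsA and positionsB are placeholders, not anything three.js provides):

// Sketch only: morph a THREE.Points cloud between two point sets.
// positionsA / positionsB are assumed to be Float32Arrays of xyz triplets.
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positionsA, 3));
geometry.setAttribute('nextPosition', new THREE.BufferAttribute(positionsB, 3));

const material = new THREE.ShaderMaterial({
  uniforms: { uScale: { value: 0.0 } },
  vertexShader: `
    attribute vec3 nextPosition;   // 'position' is declared by three.js
    uniform float uScale;          // 0.0 = current set, 1.0 = next set
    void main() {
      vec3 pos = mix(position, nextPosition, uScale);  // linear interpolation
      gl_PointSize = 4.0;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
    }`,
  fragmentShader: `
    void main() { gl_FragColor = vec4(1.0); }`
});
const points = new THREE.Points(geometry, material);  // add 'points' to your scene

// In the render loop: move uScale towards 1; once it reaches 1, copy the
// 'nextPosition' array into 'position', load the following state into
// 'nextPosition', mark both attributes needsUpdate, and reset uScale to 0.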
Good luck have fun :)
PS: The code mentioned above only does linear interpolation. In your case you might consider other interpolations; you could even add another two attribute vectors holding the preceding and the following point, so the new position could be calculated along a Bézier curve.
Lastly, you'll have to give some thought to performance sooner or later. If you have 10k particles and 1k "states", you might run into performance issues.

Related

webgl - model coordinates to screen coordinates

I'm having an issue with calculating the screen coordinates of a vertex. This is not specifically a WebGL issue, more of a general 3D graphics issue.
The sequence of matrix transformations that I'm using is:
result_vec4 = perspective_matrix * camera_matrix * model_matrix * vertex_coords_vec4
model_matrix being the transformation of a vertex from its local coordinate system into the global scene coordinate system.
So my understanding is that the final result_vec4 is in clip space, which should then be in the [-1,1] range? That is not what I'm getting... result_vec4 just ends up containing values for the coords that don't correspond to the correct screen position of the vertex.
Does anyone have any ideas as to what might be the issue here?
Thank you very much for any thoughts.
result_vec4 is already in clip space. To get normalized device coordinates you need to project it onto the hyperplane w=1 using:
result_vec4 /= result_vec4.w
After applying this perspective division, result_vec4.xyz will be in [-1,1].
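As a concrete sketch of that step plus the usual viewport mapping (plain JavaScript; viewportWidth and viewportHeight are assumptions for your output size):

// result_vec4 = [x, y, z, w] after perspective * camera * model * vertex.
function clipToScreen(result_vec4, viewportWidth, viewportHeight) {
  const [x, y, z, w] = result_vec4;
  // Perspective division: project onto the hyperplane w = 1.
  const ndcX = x / w, ndcY = y / w, ndcZ = z / w;   // each now in [-1, 1]
  // Viewport transform: NDC -> pixels (y flipped so 0 is the top of the screen).
  return {
    x: (ndcX * 0.5 + 0.5) * viewportWidth,
    y: (1.0 - (ndcY * 0.5 + 0.5)) * viewportHeight,
    depth: ndcZ * 0.5 + 0.5   // [0, 1], like the default depth range
  };
}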

three.js - Adjusting opacity of individual particles

I am trying to vary the opacity of particles as a function of their distance from a plane.
This issue describes my problem, and the answer a year ago was essentially "you can't". Opacity is apparently a parameter of the material, not of an individual particle, and hence per-particle opacity is not possible.
Has anything changed? Is there any way I could achieve this? If individual particle colouring is possible, I imagine this isn't out of reach.
Cheers
EDIT - This answer shows how to set per-point opacity using a custom ShaderMaterial. See https://stackoverflow.com/a/67892506/1461008 for an approach using PointsMaterial.
ParticleSystem has been renamed to PointCloud and then to Points.
Yes, you can create a Point Cloud and vary the alpha value of each particle's color dynamically.
In three.js, you can do this by setting the Point Cloud's material to be a ShaderMaterial having an attribute equal to the desired alpha value for each particle.
If ShaderMaterials, vertex shaders and fragment shaders are new to you, here is a really simple Fiddle that implements a Point Cloud with dynamic alphas: https://jsfiddle.net/9Lvrnpwc/.
EDIT: Updated fiddle
three.js r.148
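Independently of the fiddle, the core of the approach looks roughly like this (a sketch against the current BufferGeometry/Points API; the attribute name 'alpha' and the arrays positions/alphas are assumptions):

// Sketch only: per-point opacity via a custom 'alpha' attribute.
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setAttribute('alpha', new THREE.BufferAttribute(alphas, 1));  // one float per point

const material = new THREE.ShaderMaterial({
  transparent: true,
  vertexShader: `
    attribute float alpha;
    varying float vAlpha;
    void main() {
      vAlpha = alpha;
      gl_PointSize = 8.0;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }`,
  fragmentShader: `
    varying float vAlpha;
    void main() {
      gl_FragColor = vec4(1.0, 0.8, 0.2, vAlpha);  // alpha varies per point
    }`
});
const points = new THREE.Points(geometry, material);
// Recompute 'alphas' (e.g. from each point's distance to the plane) and set
// geometry.attributes.alpha.needsUpdate = true whenever they change.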
Not sure why, but the proposed solution didn't work for me. I use somewhat tricky shading to make the points round and blurry at the edges, so the corners of the points were supposed to be transparent, but they appeared black: http://jsfiddle.net/5kz64ero/1/
Relevant part of my fragment shader:
// Distance from the center of the point, in the range 0.0 to 0.5
float d = distance(gl_PointCoord, vec2(0.5, 0.5));
// Apply a sigmoid to smooth the edge
float opacity = 1.0 / (1.0 + exp(16.0 * (d - 0.25)));
gl_FragColor = vec4(opacity * vColor, opacity);
I figured that traditionally this is solved by depth-sorting (with the farthest points drawn first), and I found some evidence that older implementations of ParticleSystem in three.js had a sortParticles attribute. But it's not there anymore, and in my case sorting would mean redoing it every time the camera position changes. Instead I set depthWrite: false on the material, and that seems to solve the issue.
The result: http://jsfiddle.net/5kz64ero/6/
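For reference, the material change amounts to something like this (a sketch; 'material' stands for whatever ShaderMaterial the points use):

// Transparent points that don't write depth, so they can't blacken each other;
// they are still depth-tested against opaque geometry.
material.transparent = true;
material.depthWrite = false;  // don't write point fragments into the depth buffer
material.depthTest = true;    // but still test against what's already there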

3D sprites, writing correct depth buffer information

I am writing a particle engine for iOS using MonoTouch and OpenTK. My approach is to project the coordinates of each particle and then draw a correctly scaled, textured rectangle at that screen location.
It works fine, but I have trouble calculating the correct depth value so that the sprite will correctly overdraw, and be overdrawn by, 3D objects in the scene.
This is the code I am using today:
// d = distance to the projection plane
float d = (float)(1.0 / Math.Tan(MathHelper.DegreesToRadians(fovy / 2f)));
Vector3 screenPos;
Vector3.Transform(ref objPos, ref viewMatrix, out screenPos);
float depth = 1 - d / -screenPos.Z;
Then I draw a triangle strip at the screen coordinate, using the depth value calculated above as the z coordinate.
The results are almost correct, but not quite. I guess I need to take the near and far clipping planes into account somehow (near is 1 and far is 10000 in my case), but I am not sure how. I tried various approaches and algorithms without getting accurate results.
I'd appreciate some help on this one.
What you really want to do is take your source position and pass it through the modelview and projection matrices (or whatever you've set up instead, if you're not using the fixed pipeline). Supposing you've used one of the standard calls to set up the stack, such as glFrustum, and otherwise left things at identity, you can take the relevant formula directly from the man page. Reading directly from it, you'd transform as:
z_clip = -((far + near) / (far - near)) * z_eye - ((2 * far * near) / (far - near))
w_clip = -z_eye
Then, finally:
z_device = z_clip / w_clip;
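In code form that might look like this (a sketch; zEye is the eye-space z, which is negative in front of the camera, and near/far match your projection setup):

// Sketch: eye-space z -> normalized device depth, following the formula above.
function eyeZToDeviceDepth(zEye, near, far) {
  const zClip = -((far + near) / (far - near)) * zEye
              - ((2.0 * far * near) / (far - near));
  const wClip = -zEye;
  return zClip / wClip;  // -1 at the near plane, +1 at the far plane
}
// With the question's planes: eyeZToDeviceDepth(-1, 1, 10000) is about -1,
// eyeZToDeviceDepth(-10000, 1, 10000) is about +1.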
EDIT: as you're working in ES 2.0, you can actually avoid the issue entirely. Supply your geometry for rendering as GL_POINTS, perform a normal transform in your vertex shader, and set gl_PointSize to the size in pixels that you want each point to be.
In your fragment shader you can then read gl_PointCoord to get a texture coordinate for each fragment that is part of your point, allowing you to draw a textured point sprite rather than just a single colour.
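A rough sketch of that point-sprite path in ES 2.0-style GLSL (shown here as shader-source strings; uMVPMatrix, uPointSize, uTexture and aPosition are assumed names):

const spriteVert = `
  uniform mat4 uMVPMatrix;
  uniform float uPointSize;       // desired sprite size in pixels
  attribute vec4 aPosition;
  void main() {
    gl_Position = uMVPMatrix * aPosition;  // depth comes out of the normal transform
    gl_PointSize = uPointSize;
  }`;
const spriteFrag = `
  precision mediump float;
  uniform sampler2D uTexture;
  void main() {
    // gl_PointCoord runs from (0,0) to (1,1) across the point sprite.
    gl_FragColor = texture2D(uTexture, gl_PointCoord);
  }`;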

Render depth buffer to texture

Quite new at shaders, so please bear with me if I am doing something silly here. :)
I am trying to render the depth buffer of a scene to a texture using OpenGL ES 2.0 on iOS, but I do not seem to get entirely accurate results unless the models have a relatively high density of polygons showing on the display.
So, for example, if I render a large plane consisting of only four vertices, I get very inaccurate results, but if I subdivide this plane the results get more accurate with each subdivision, and ultimately I get a correctly rendered depth buffer.
This reminds me a lot of affine versus perspective-correct texture mapping issues, and I guess I need to play around with the ".w" component somehow to fix this. But I thought the "varying" variables should take this into account already, so I am a bit at a loss here.
This is my vertex and fragment shader:
[vert]
uniform mat4 uMVPMatrix;
attribute vec4 aPosition;
varying float objectDepth;
void main()
{
    gl_Position = uMVPMatrix * aPosition;
    objectDepth = gl_Position.z;
}
[frag]
precision mediump float;
varying float objectDepth;
void main()
{
    // Divide by the scene clip range, set to a constant 200 here
    float grayscale = objectDepth / 200.0;
    gl_FragColor = vec4(grayscale, grayscale, grayscale, 1.0);
}
Please note that this shader is simplified a lot just to highlight the method I am using. Although to the naked eye it seems to work well in most cases, I am in fact rendering to 32-bit textures (by packing a float into ARGB), and I need very high accuracy for later processing or I get noticeable artifacts.
I can achieve pretty high precision by cranking up the polygon count, but that drives my framerate down a lot, so, is there a better way?
You need to divide z by the w component.
This is very simple: depth is not linear, so you cannot use linear interpolation for z ... you will solve it very easily if you interpolate 1/z instead of z. You can also do some math with w, exactly as suggested by rasmus.
You can read more about coordinate interpolation at http://www.luki.webzdarma.cz/eng_05_en.htm (a page about implementing a simple software renderer).
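Putting both answers together, one way to write it is to pass clip-space z and w to the fragment shader and divide there. A sketch based on the shaders above, with the GLSL held in string constants for presentation and the float-to-ARGB packing omitted:

const depthVert = `
  uniform mat4 uMVPMatrix;
  attribute vec4 aPosition;
  varying vec2 vZW;                         // clip-space z and w
  void main() {
    gl_Position = uMVPMatrix * aPosition;
    vZW = gl_Position.zw;
  }`;
const depthFrag = `
  precision mediump float;
  varying vec2 vZW;
  void main() {
    float ndcZ = vZW.x / vZW.y;             // divide z by w per fragment
    float grayscale = ndcZ * 0.5 + 0.5;     // map [-1,1] to [0,1]
    gl_FragColor = vec4(grayscale, grayscale, grayscale, 1.0);
  }`;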

How do I use a vertex shader to multiply my vertex data by a uniform?

This is a question that came from an earlier problem I had. Basically, I was trying to implement orthographic scaling in my shader by modifying the scale components of my projection matrix, but that wasn't possible. What I actually had to do was scale the verts before "sending them in" to my shader via a draw call. That works like a charm...
But, of course, the issue is that in software I'm now responsible for scaling all my verts before handing them off to the shader. This makes me wonder whether it would be possible to have a vertex shader do this. I imagine it is, but I can't figure out how.
What I'm doing is just going through all 4 of my verts (held in float vertices[8]) and doing *= scale;. To be slightly more accurate, I multiply the X and Y components separately by scaleX and scaleY.
How can I do this same thing in a vertex shader?
Replace gl_Vertex with (gl_Vertex * scale) everywhere in your vertex shader. Or, if you're using a user-defined input for your coordinate, put * scale on that.
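With a user-defined input it could look something like this (a sketch; aPosition, uScale and uMVPMatrix are assumed names, and the GLSL is shown as a string constant):

const scaleVert = `
  uniform mat4 uMVPMatrix;
  uniform vec2 uScale;                 // (scaleX, scaleY), set once per draw
  attribute vec4 aPosition;
  void main() {
    vec4 scaled = vec4(aPosition.xy * uScale, aPosition.zw);
    gl_Position = uMVPMatrix * scaled;
  }`;
// On the CPU side, upload the unscaled quad once and just update the uniform
// before each draw (glUniform2f / gl.uniform2f with scaleX and scaleY).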
