I'm accustomed to UV coordinates in the range [0, 1]. However, when inspecting an imported glTF SkinnedMesh with BufferGeometry, I noticed that the uv attribute is stored as 16-bit unsigned integers (Uint16Array):
What's additionally confusing to me is that the attribute's normalized flag is false. According to the docs, it seems like this should be set to true if these 16-bit values are meant to be mapped into the range [0, 1]. And yet the UV-mapped texture renders fine.
How does this UV map work? What is its range?
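For reference, this is roughly how I'd expect the normalized flag to behave based on the docs. This is only a sketch of my understanding, not what GLTFLoader actually does internally; uvComponent is a helper I made up, and mesh stands in for the imported SkinnedMesh.
// Sketch: how a normalized vs. non-normalized Uint16 attribute would be interpreted.
const uvAttribute = mesh.geometry.attributes.uv; // BufferAttribute backed by a Uint16Array
function uvComponent(attribute, index) {
  const value = attribute.array[index];
  // normalized === true: unsigned 16-bit values map to [0, 1] by dividing by 65535
  // normalized === false: the raw integer value is used as-is
  return attribute.normalized ? value / 65535 : value;
}
console.log(uvComponent(uvAttribute, 0), uvComponent(uvAttribute, 1)); // first vertex's (u, v)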
EDIT: Here is a screenshot of the UV Map in Blender:
Related
How are colors clamped in WebGL when rendering into a framebuffer with a FLOAT color format? Can I have color channels below 0.0 and above 1.0? How can I control how colors are clamped?
In OpenGL, values stored in float-format framebuffers are not clamped; you can store any value, though with precision limited by the format.
From: https://www.khronos.org/registry/webgl/extensions/WEBGL_color_buffer_float/ "NOTE: fragment shaders outputs gl_FragColor and gl_FragData[0] will only be clamped and converted when the color buffer is fixed-point"
If you read these values using a sampler then you should get the value stored in the buffer.
Note: you can verify this by storing values outside that range and then using the data in a shader in a way that makes the result obvious. For example, fill a float texture with values covering a large range (e.g. 0–1000), then render that texture to the screen while dividing by 1000. If the data is not clamped you will see the expected gradient; if it is clamped you will see the loss of data.
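A rough sketch of that check with three.js (my own setup, not from the spec; it assumes float render targets are available on the device): write a 0–1000 gradient into a FloatType render target, then draw that texture to the screen divided by 1000.
// Sketch only: pass 1 writes values far outside [0, 1]; pass 2 rescales them for display.
const renderer = new THREE.WebGLRenderer();
renderer.setSize(512, 512);
document.body.appendChild(renderer.domElement);
const target = new THREE.WebGLRenderTarget(512, 512, {
  type: THREE.FloatType,
  minFilter: THREE.NearestFilter,
  magFilter: THREE.NearestFilter
});
const camera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
const quad = new THREE.PlaneGeometry(2, 2);
const fullscreenVertex = 'void main() { gl_Position = vec4(position, 1.0); }';
// Pass 1: the fragment shader outputs a 0..1000 gradient into the float target.
const writeScene = new THREE.Scene();
writeScene.add(new THREE.Mesh(quad, new THREE.ShaderMaterial({
  vertexShader: fullscreenVertex,
  fragmentShader: 'void main() { gl_FragColor = vec4(vec3(gl_FragCoord.x / 512.0 * 1000.0), 1.0); }'
})));
// Pass 2: read the float texture back and rescale into a displayable range.
const readScene = new THREE.Scene();
readScene.add(new THREE.Mesh(quad, new THREE.ShaderMaterial({
  vertexShader: fullscreenVertex,
  uniforms: { map: { value: target.texture } },
  fragmentShader: [
    'uniform sampler2D map;',
    'void main() {',
    '  float v = texture2D(map, gl_FragCoord.xy / 512.0).r;',
    '  // A full black-to-white gradient means the stored values were not clamped;',
    '  // an almost entirely black screen means they were clamped to [0, 1].',
    '  gl_FragColor = vec4(vec3(v / 1000.0), 1.0);',
    '}'
  ].join('\n')
})));
renderer.setRenderTarget(target);
renderer.render(writeScene, camera);
renderer.setRenderTarget(null);
renderer.render(readScene, camera);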
I've got a Cinema 4D-created geometry/material with two different UV sets/meshes for the material. The normal map and light map use a different UV set than the tileable color shading.
Is there any possibility to pass two different UV sets? As far as I can see, faceVertexUvs is itself an array, but what would the shader look like?
Thanks for an answer!
My problem is that after exporting a 3D model from Blender to JSON with 3 UV sets and 3 different textures (diffuse map, normal map and light map), it looks like the normal map is using the same UV set as the diffuse map.
I've been wondering whether the normalMap in THREE.MeshPhongMaterial can use a separate UV set, just like the lightMap, or whether it only ever uses the same UV set as the diffuse map.
With THREE.MeshPhongMaterial, all the maps share the primary UV set, with the exception of the lightMap and the aoMap, which share the 2nd set of UVs.
If you want different behavior, you will have to create a custom ShaderMaterial.
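For reference, a minimal sketch of the built-in behavior (r.7x-era Geometry API; the texture and UV-array variables are placeholders):
// UV layer 0 feeds map/normalMap/specularMap; layer 1 feeds lightMap and aoMap.
geometry.faceVertexUvs[0] = primaryUvs;   // used by map, normalMap, specularMap, ...
geometry.faceVertexUvs[1] = secondaryUvs; // used by lightMap and aoMap
var material = new THREE.MeshPhongMaterial({
  map: diffuseTexture,
  normalMap: normalTexture, // shares the first UV set with map
  lightMap: lightTexture    // reads the second UV set
});
var mesh = new THREE.Mesh(geometry, material);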
three.js r.71
I'm using the normal shader in three.js r.58, which I understand requires a normal map. However, I'm using a dynamic displacement map, so a pre-computed normal map won't work in this situation.
All the examples I've found of lit displacement maps either use flat shading or pre-computed normal maps. Is it possible to calculate the normals dynamically based on the displaced vertices instead?
Edit: I've posted a demo of a sphere with a displacement map showing flat normals:
Here's a link to the github repo with all of my examples illustrating this problem, and the solutions I eventually found:
https://github.com/meetar/three.js-normal-map-0
This answer is based on your comments above.
You can do what you want, but it is quite sophisticated, and you will of course have to modify the three.js 'normal' shader.
Have a look at http://alteredqualia.com/three/examples/webgl_cubes_indexed.html. Look at the fragment shader, and you will see
vec3 normal = normalize( cross( dFdx( vViewPosition ), dFdy( vViewPosition ) ) );
Alteredqualia is using a derivative normal in the fragment shader (instead of an attribute normal) because the vertex positions are changing in the vertex shader, so the normal is not known.
What he is doing is calculating the normal using the cross product of the x and y screen-space derivatives of the fragment position.
This will set the normal as the face normal. It will be discontinuous at hard edges.
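As a rough sketch of the idea (my own names and a trivial visualization, not the shader from that example; it assumes vertex-texture-fetch support for sampling the displacement map, and on WebGL1-era three.js the derivatives also need the OES_standard_derivatives extension, enabled on the first line of the fragment shader):
// Sketch: displace vertices in the vertex shader, derive a flat normal per fragment.
var material = new THREE.ShaderMaterial({
  uniforms: {
    displacementMap: { type: 't', value: displacementTexture }, // placeholder texture
    scale: { type: 'f', value: 0.5 }
  },
  vertexShader: [
    'uniform sampler2D displacementMap;',
    'uniform float scale;',
    'varying vec3 vViewPosition;',
    'void main() {',
    '  float h = texture2D(displacementMap, uv).r;',
    '  vec3 displaced = position + normal * h * scale;',
    '  vec4 mvPosition = modelViewMatrix * vec4(displaced, 1.0);',
    '  vViewPosition = mvPosition.xyz;',
    '  gl_Position = projectionMatrix * mvPosition;',
    '}'
  ].join('\n'),
  fragmentShader: [
    '#extension GL_OES_standard_derivatives : enable',
    'varying vec3 vViewPosition;',
    'void main() {',
    '  // face normal from the cross product of the screen-space derivatives',
    '  vec3 normal = normalize(cross(dFdx(vViewPosition), dFdy(vViewPosition)));',
    '  // trivial shading just to visualize the normal',
    '  gl_FragColor = vec4(normal * 0.5 + 0.5, 1.0);',
    '}'
  ].join('\n')
});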
three.js r.58
What I was describing above is called a "bump map", and it's built into the three.js phong shader by default. I combined the normalmap shader with the chunks of the phong shader responsible for bump mapping:
http://meetar.github.io/three.js-normal-map-0/bump.html
Though the normals are a bit noisy, they are basically correct.
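For comparison, the same bump-mapping machinery is also exposed directly on MeshPhongMaterial; a minimal sketch, with placeholder texture names:
// Sketch: let the phong shader derive perturbed normals from the height map via bump mapping.
var material = new THREE.MeshPhongMaterial({
  map: colorTexture,
  bumpMap: displacementTexture, // the same height map used for the displacement
  bumpScale: 0.5                // how strongly the height differences perturb the normals
});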
You can also calculate a normal map from the displacement map with JavaScript. This results in smooth normals, and is a good option if your displacement map isn't changing too often.
This method uses the code found in this demo: http://mrdoob.com/lab/javascript/height2normal/
Demo here:
http://meetar.github.io/three.js-normal-map-0/index14.html
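The gist of that technique is a central-difference pass over the height image; here is a rough sketch (my own code, not the demo's exact implementation; strength is a made-up tuning parameter):
// Sketch: convert a grayscale height image (canvas ImageData) into a tangent-space normal map.
function heightToNormal(imageData, strength) {
  var w = imageData.width, h = imageData.height;
  var src = imageData.data;
  var out = new Uint8ClampedArray(src.length);
  function heightAt(x, y) {
    x = Math.min(w - 1, Math.max(0, x));
    y = Math.min(h - 1, Math.max(0, y));
    return src[(y * w + x) * 4] / 255; // red channel as height
  }
  for (var y = 0; y < h; y++) {
    for (var x = 0; x < w; x++) {
      var dx = (heightAt(x + 1, y) - heightAt(x - 1, y)) * strength;
      var dy = (heightAt(x, y + 1) - heightAt(x, y - 1)) * strength;
      var len = Math.sqrt(dx * dx + dy * dy + 1);
      var i = (y * w + x) * 4;
      out[i]     = (-dx / len * 0.5 + 0.5) * 255; // x -> R
      out[i + 1] = (-dy / len * 0.5 + 0.5) * 255; // y -> G
      out[i + 2] = ( 1  / len * 0.5 + 0.5) * 255; // z -> B
      out[i + 3] = 255;
    }
  }
  return new ImageData(out, w, h);
}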
I have an image that is a combination of the RGB and depth data from a Kinect camera.
I'd like to do two things, both in WebGL if possible:
Create a 3D model from the depth data.
Project the RGB image onto the model as a texture.
Which WebGL JavaScript engine should I look at? Are there any similar examples using image data to construct a 3D model?
(First question asked!)
Found that it is easy with 3D tools in Photoshop (3D > New Mesh From Grayscale): http://www.flickr.com/photos/forresto/5508400121/
I am not aware of any WebGL framework that resolves your problem specifically. I think you could potentially create a grid with your depth data, starting from a rectangular uniform grid and moving each vertex to the back or to the front (Z-axis) depending on the depth value.
Once you have this, you need to generate the texture-coordinate array. From the image you posted on Flickr, I would infer that there is a one-to-one mapping between the depth image and the texture, so generating the texture-coordinate array should be straightforward: you just map the corresponding (s, t) coordinate on the texture to each vertex. So for every vertex you have two coordinates in the texture-coordinate array. Then you bind it.
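A sketch of those two parts with three.js (all names are placeholders; it assumes the depth image is available as grayscale ImageData the same size as the color image): a PlaneGeometry already provides the uniform grid and one-to-one UVs, so each vertex only needs to be pushed along Z by its depth sample.
// Sketch: displace a regular grid by the depth values and texture it with the RGB image.
// depthData (grayscale ImageData) and colorTexture (THREE.Texture) are placeholders.
var w = depthData.width, h = depthData.height;
var geometry = new THREE.PlaneGeometry(4, 3, w - 1, h - 1); // one vertex per depth pixel
var position = geometry.attributes.position;
for (var i = 0; i < position.count; i++) {
  var depth = depthData.data[i * 4] / 255; // red channel, 0..1
  position.setZ(i, depth);                 // push the vertex along Z; rescale or flip the sign to taste
}
position.needsUpdate = true;
geometry.computeVertexNormals(); // only needed if you later switch to a lit material
// PlaneGeometry's built-in UVs already map the grid one-to-one onto the image.
var mesh = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: colorTexture }));
scene.add(mesh); // `scene` is assumed to exist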
Finally, you need to make sure that you are using the texture to color your fragments. This is a two-step process:
First step: pass the texture coordinates as an "attribute vec2" to the vertex shader and save them to a varying vec2.
Second step: in the fragment shader, read the varying vec2 that you created in step one and use it to generate the gl_FragColor.
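Those two steps, sketched as GLSL kept in JavaScript strings (the attribute and uniform names are mine):
// Sketch of the two steps as WebGL1-style GLSL.
var vertexShaderSource = [
  'attribute vec3 aPosition;',
  'attribute vec2 aTexCoord;  // step one: texture coordinates come in as an attribute',
  'uniform mat4 uModelViewMatrix;',
  'uniform mat4 uProjectionMatrix;',
  'varying vec2 vTexCoord;    // ...and are handed to the fragment shader as a varying',
  'void main() {',
  '  vTexCoord = aTexCoord;',
  '  gl_Position = uProjectionMatrix * uModelViewMatrix * vec4(aPosition, 1.0);',
  '}'
].join('\n');
var fragmentShaderSource = [
  'precision mediump float;',
  'uniform sampler2D uColorMap; // the Kinect RGB image',
  'varying vec2 vTexCoord;      // step two: read the varying written by the vertex shader',
  'void main() {',
  '  gl_FragColor = texture2D(uColorMap, vTexCoord);',
  '}'
].join('\n');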
I hope it helps.