I want to load an image in THREE.Points, and I'm using a BufferGeometry without a uv attribute and a ShaderMaterial.
Can anyone explain what uv is and why I need it? I have a 2D world (all points have z = 0), so can I use a simple vec2 instead of storing uv coordinates in an attribute?
thanks in advance
I'm creating a terrain with three.js, texturing it with a grass texture I found, and applying flat shading so it looks low-poly. The shading only changes the lighting, though, and I can still see the texture; I need each face to be a single flat color instead of the texture, like in this picture:
http://i.ytimg.com/vi/9WFBnc_gPMo/maxresdefault.jpg
Unity's PolyWorld asset does this: starting from a terrain with, for example, a grass texture, it applies flat shading and also replaces the texture with a flat color per face (the predominant color?).
What you seem to want is inefficient, but one way to accomplish it is to make sure that all three faceVertexUvs values for each triangle face are the same: that is, for a triangle ABC the UV coordinates are all, say, (.4,.6), (.4,.6), (.4,.6).
This means that every pixel of the rendered triangle will have that one uniform UV, so you'll always get the same texture color across the triangle (some filtering notwithstanding in very extreme foreshortening cases).
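For example, with the legacy THREE.Geometry API from this era of three.js, a minimal sketch of collapsing each face's UVs to a single value could look like this (the (0.4, 0.6) value and the choice of a fixed coordinate per face are just for illustration):
// assumes `geometry` is a THREE.Geometry whose faces are already built
geometry.faceVertexUvs[0] = geometry.faces.map(function (face) {
    // one UV per face; every corner gets the same value, so the whole
    // triangle samples a single spot of the texture
    var uv = new THREE.Vector2(0.4, 0.6);
    return [uv.clone(), uv.clone(), uv.clone()];
});
geometry.uvsNeedUpdate = true;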
Is it possible to apply a texture to a mesh without specifying UVs in the geometry in three.js?
There are classes such as THREE.CubeGeometry, THREE.SphereGeometry, etc. that automatically generate the UV coordinates for you. However, if you are creating your own geometry from scratch (i.e., specifying vertex locations, creating faces, etc.) then the answer is no. Either you need to set the UV coordinates manually when creating the geometry, or you need to write a custom shader which determines the UV coordinates for any given point. Think about it this way: if you don't specify UV coordinates, the points on your geometry have no idea which point on your texture they should display.
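As an illustration of the custom-shader route (and of the 2D question above): if your geometry lies in a known plane, you can derive the UV from the vertex position instead of storing a uv attribute. A minimal sketch with THREE.Points and a ShaderMaterial, assuming a recent three.js and a geometry that spans 0..worldSize in X and Y (worldSize and the point size are placeholders for your own values):
// texture: a THREE.Texture you have already loaded
var material = new THREE.ShaderMaterial({
    uniforms: {
        map: { value: texture },
        worldSize: { value: 500.0 } // extent of the point cloud in world units
    },
    vertexShader: [
        'uniform float worldSize;',
        'varying vec2 vUv;',
        'void main() {',
        '    vUv = position.xy / worldSize; // the position doubles as the UV source',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
        '    gl_PointSize = 4.0;',
        '}'
    ].join('\n'),
    fragmentShader: [
        'uniform sampler2D map;',
        'varying vec2 vUv;',
        'void main() {',
        '    gl_FragColor = texture2D( map, vUv );',
        '}'
    ].join('\n')
});
var points = new THREE.Points(geometry, material);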
I'm using the normal shader in three.js r.58, which I understand requires a normal map. However, I'm using a dynamic displacement map, so a pre-computed normal map won't work in this situation.
All the examples I've found of lit displacement maps either use flat shading or pre-computed normal maps. Is it possible to calculate the normals dynamically based on the displaced vertices instead?
Edit: I've posted a demo of a sphere with a displacement map showing flat normals:
Here's a link to the github repo with all of my examples illustrating this problem, and the solutions I eventually found:
https://github.com/meetar/three.js-normal-map-0
This answer is based on your comments above.
You can do what you want, but it is quite sophisticated, and you will of course have to modify the three.js 'normal' shader.
Have a look at http://alteredqualia.com/three/examples/webgl_cubes_indexed.html. Look at the fragment shader, and you will see
vec3 normal = normalize( cross( dFdx( vViewPosition ), dFdy( vViewPosition ) ) );
Alteredqualia is using a derivative normal in the fragment shader ( instead of an attribute normal ) because the vertex positions are changing in the vertex shader, and the normal is not known.
What he is doing is calculating the normal using the cross product of the x and y screen-space derivatives of the fragment position.
This will set the normal as the face normal. It will be discontinuous at hard edges.
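Here is a minimal self-contained sketch of that idea as a ShaderMaterial (the lighting is just a headlight-style diffuse term to visualize the normal; with WebGL1 you would also need the standard-derivatives extension, while WebGL2, the three.js default nowadays, has dFdx/dFdy built in):
var material = new THREE.ShaderMaterial({
    vertexShader: [
        'varying vec3 vViewPosition;',
        'void main() {',
        '    // displace the vertex however you like here; the normal is recovered later',
        '    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );',
        '    vViewPosition = mvPosition.xyz;',
        '    gl_Position = projectionMatrix * mvPosition;',
        '}'
    ].join('\n'),
    fragmentShader: [
        'varying vec3 vViewPosition;',
        'void main() {',
        '    // face normal from the screen-space derivatives of the view-space position',
        '    vec3 normal = normalize( cross( dFdx( vViewPosition ), dFdy( vViewPosition ) ) );',
        '    // simple headlight-style diffuse term, just to visualize the normal',
        '    float diffuse = max( dot( normal, vec3( 0.0, 0.0, 1.0 ) ), 0.0 );',
        '    gl_FragColor = vec4( vec3( diffuse ), 1.0 );',
        '}'
    ].join('\n')
});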
three.js r.58
What I was describing above is called a "bump map" and it comes as a default with the three.js phong shader. I combined the normalmap shader with chunks of the phong shader responsible for bump mapping:
http://meetar.github.io/three.js-normal-map-0/bump.html
Though the normals are a bit noisy, they are basically correct.
You can also calculate a normal map from the displacement map with JavaScript. This results in smooth normals, and is a good option if your displacement map isn't changing too often.
This method uses the code found in this demo: http://mrdoob.com/lab/javascript/height2normal/
Demo here:
http://meetar.github.io/three.js-normal-map-0/index14.html
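For illustration, here is a rough JavaScript sketch of that height-to-normal conversion, using central differences on the red channel of the height image (the function name and the strength parameter are placeholders, not the demo's actual code):
// heightImage: a loaded Image of your displacement map
function heightToNormalMap(heightImage, strength) {
    var w = heightImage.width, h = heightImage.height;
    var canvas = document.createElement('canvas');
    canvas.width = w; canvas.height = h;
    var ctx = canvas.getContext('2d');
    ctx.drawImage(heightImage, 0, 0);
    var src = ctx.getImageData(0, 0, w, h);
    var dst = ctx.createImageData(w, h);
    function height(x, y) {
        x = Math.min(w - 1, Math.max(0, x));
        y = Math.min(h - 1, Math.max(0, y));
        return src.data[(y * w + x) * 4] / 255; // red channel as height
    }
    for (var y = 0; y < h; y++) {
        for (var x = 0; x < w; x++) {
            // central differences give the slope in x and y
            var dx = (height(x + 1, y) - height(x - 1, y)) * strength;
            var dy = (height(x, y + 1) - height(x, y - 1)) * strength;
            var len = Math.sqrt(dx * dx + dy * dy + 1);
            var i = (y * w + x) * 4;
            dst.data[i]     = (-dx / len * 0.5 + 0.5) * 255; // normal.x
            dst.data[i + 1] = (-dy / len * 0.5 + 0.5) * 255; // normal.y (flip if your convention differs)
            dst.data[i + 2] = ( 1  / len * 0.5 + 0.5) * 255; // normal.z
            dst.data[i + 3] = 255;
        }
    }
    ctx.putImageData(dst, 0, 0);
    var tex = new THREE.Texture(canvas);
    tex.needsUpdate = true;
    return tex;
}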
I'm trying to draw to a subrect of a texture-based FBO, but am having difficulty. The FBO has dimensions of, say, 500x500, and I am trying to have the fragment shader redraw only, say, a 20x20-pixel subrect. Modifying the full texture works without difficulty.
At first I tried setting glViewport to the needed subrect, but it doesn't look to be that simple. I suspect that the vertex attributes affecting gl_Position and the varying texture coordinates are involved, but I can't figure out how.
Turns out that I was trying to modify the texture coordinate attributes, but it was easier to just modify the viewport using glViewport and use gl_FragCoord within the shader.
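For anyone hitting the same problem, here is a minimal sketch of that approach in raw WebGL (fbo, subX, subY and drawFullScreenQuad stand in for your own setup). Inside the fragment shader, gl_FragCoord still reports pixel coordinates relative to the full 500x500 target, so the shader can tell which part of the texture it is writing to:
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);   // the 500x500 texture-backed FBO
// restrict rasterization of the full-screen quad to the subrect
gl.viewport(subX, subY, 20, 20);
// optional: also scissor, so clears don't touch the rest of the texture
gl.enable(gl.SCISSOR_TEST);
gl.scissor(subX, subY, 20, 20);
drawFullScreenQuad();                      // your usual quad draw call
gl.disable(gl.SCISSOR_TEST);
gl.viewport(0, 0, 500, 500);               // restore for full-texture passes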
I have an image that is a combination of the RGB and depth data from a Kinect camera.
I'd like to do two things, both in WebGL if possible:
Create 3D model from the depth data.
Project RGB image onto model as texture.
Which WebGL JavaScript engine should I look at? Are there any similar examples, using image data to construct a 3D model?
(First question asked!)
Found that it is easy with 3D tools in Photoshop (3D > New Mesh From Grayscale): http://www.flickr.com/photos/forresto/5508400121/
I am not aware of any WebGL framework that solves your problem specifically. I think you could create a grid from your depth data, starting from a rectangular uniform grid and moving each vertex backward or forward along the Z axis depending on its depth value.
Once you have this, you need to generate the texture coordinate array. From the image you posted on Flickr, I would infer that there is a one-to-one mapping between the depth image and the texture, so generating it should be straightforward: you just map the corresponding (s,t) coordinate on the texture to the respective vertex, so every vertex has two coordinates in the texture coordinate array. Then you bind it (a concrete sketch follows at the end of this answer).
Finally you need to make sure that you are using the texture to color your mesh. This is a two-step process:
First step: pass the texture coordinates as an "attribute vec2" to the vertex shader and save them to a varying vec2.
Second step: in the fragment shader, read the varying vec2 that you created in step one and use it to generate gl_FragColor.
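To make the grid part concrete, here is a rough three.js sketch, assuming a recent three.js, a Float32Array called depth laid out like the image, and an illustrative depthScale factor. PlaneGeometry already provides the one-to-one uv attribute, and a built-in material handles the attribute/varying plumbing from the two steps above for you:
var width = 640, height = 480;   // Kinect-style resolution
var geometry = new THREE.PlaneGeometry(width, height, width - 1, height - 1);
var position = geometry.attributes.position;
// push each grid vertex along Z by its depth sample
// (PlaneGeometry rows start at the top; flip rows if your depth array is ordered differently)
for (var i = 0; i < position.count; i++) {
    position.setZ(i, depth[i] * depthScale);
}
position.needsUpdate = true;
geometry.computeVertexNormals();
// the RGB image maps straight onto the grid through the built-in uv attribute
var texture = new THREE.TextureLoader().load('kinect-rgb.png');
var mesh = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));
scene.add(mesh);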
I hope it helps.