I am using a wood texture image in my model. By default the texture is stretched across the model; you can see this on the woodark part. When I change the repeat, the texture stretches even more, and I don't understand why. I have searched for a basic explanation of how to use texture mapping correctly in my model, but I have only found examples using plain colored pixels.
Thanks in advance for any answers.
You should make sure your textures have power-of-two dimensions (i.e. 256x256, 512x512, etc.). Textures of arbitrary dimensions (NPOT) bring all kinds of mapping trouble in WebGL.
If you are unable to resize textures server-side, you can do it client-side. This link has some sample JavaScript code, as well as other relevant information: http://www.khronos.org/webgl/wiki/WebGL_and_OpenGL_Differences
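For example, here is a minimal client-side sketch (the helper names are my own) that stretches an image onto a canvas with the next power-of-two dimensions before it is used as a texture:

function nextPowerOfTwo(n) {
    return Math.pow(2, Math.ceil(Math.log(n) / Math.LN2));
}

function makePowerOfTwo(image) {
    var canvas = document.createElement('canvas');
    canvas.width = nextPowerOfTwo(image.width);
    canvas.height = nextPowerOfTwo(image.height);
    // Stretch the source onto the POT canvas; repeat wrapping and
    // mipmapping will then work as expected in WebGL.
    canvas.getContext('2d').drawImage(image, 0, 0, canvas.width, canvas.height);
    return canvas;
}

You can pass the returned canvas anywhere an image is accepted, e.g. as the image of a THREE.Texture.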
Hello, I'm trying to achieve the effect in the image below (something like a shine of light, but only on top of the raw image).
Unfortunately I can't figure out how to do it. I've tried some shaders and assets from the Asset Store, but so far none of them have worked, and I don't know much about shaders.
The Raw Image is a UI element, and it renders a render texture that is being captured by a camera.
I'm totally lost here; any kind of help will be appreciated. How can I make that effect?
Fresnel shaders use the angle between the surface normal and the view vector to detect which pixels are facing the viewer and which aren't. A UI plane will always face the user, so no luck there.
Solving this with shaders can be done in a couple of ways: either you bake a normal map of the imagined "curvature" of the outer edge (example), or you create a signed distance field (example) or use some similar method that maps the distance to the edge. A normal map would probably allow for the most complex effects, and I am sure that some fresnel shaders could work with that too. It does, however, require you to make a model of the shape and bake the normals from that.
A signed distance field, on the other hand, can be generated with a script from an image, so if you have a lot of images, it might be the fastest approach. Getting the edge distance in real time inside the shader would not really work, since you'd have to sample a very large number of neighboring pixels, which might make the shader 10-20 times slower depending on how thick you need the edge to be.
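Here is a rough brute-force sketch of the idea (in JavaScript purely for illustration; the technique is language-agnostic, and real tools use a proper distance transform rather than this per-pixel neighborhood search):

function edgeDistanceField(imageData, radius) {
    var w = imageData.width, h = imageData.height, data = imageData.data;
    function inside(x, y) { return data[(y * w + x) * 4 + 3] > 127; } // alpha > 50%
    var field = new Float32Array(w * h);
    for (var y = 0; y < h; y++) {
        for (var x = 0; x < w; x++) {
            var here = inside(x, y), best = radius;
            // Search a small neighborhood for the nearest pixel on the
            // other side of the edge.
            for (var dy = -radius; dy <= radius; dy++) {
                for (var dx = -radius; dx <= radius; dx++) {
                    var nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                    if (inside(nx, ny) !== here) {
                        best = Math.min(best, Math.sqrt(dx * dx + dy * dy));
                    }
                }
            }
            field[y * w + x] = here ? best : -best; // signed: + inside, - outside
        }
    }
    return field;
}

The field can then be packed into a grayscale texture and thresholded or smoothstepped in the shader to draw the edge glow.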
If you don't need the image to be that dynamic, then simply creating an inner-glow black/white texture in Photoshop and overlaying it using an additive shader might work better for you. If you don't know how to write shaders, the two approaches above are a bit of a tall order.
What are good ways / best practices to improve texture quality in three.js?
I have a scene containing planes (cards) with 512x512px textures. You can see how it looks in the images below. My problem is that the textures look blurred. I have tried changing the filters and the anisotropy value, and it helps, but only a little; the textures are still blurred. The only way I have found to make the textures look the way I want is to double the render size while keeping the canvas size the same. That is a bad approach because of the performance cost, but I haven't found another way to get good texture quality.
The best quality - render size x2
Normal quality - magFilter = minFilter = THREE.LinearMipMapLinearFilter /anisotropy = 16
Bad quality - no filters
Any help would be appreciated. Thanks in advance!
You can hardly do better than trilinear filtering with 16x anisotropic filtering (and not all hardware can achieve 16x anisotropic filtering).
However, you say your textures are 512x512, while (if your screenshots are actual size) it appears that:
They are rendered much smaller than 512x512. This means a lower mipmap level, generated by WebGL, is currently being used to render your cards.
Your cards are rectangular while your textures are square. Depending on how you mapped the texture onto the shape, the aspect ratio may change, so the sampler needs to do more interpolation (i.e. more filtering, meaning more blur).
So what you can try to do is the following (a short sketch follows this list):
Use a smaller base texture, 256x256 for example, which you prepare yourself with the best sharpness you can, so that little minification is needed when WebGL samples the texture.
Adapt the mesh's texture coordinates to your texture, or vice versa, to avoid aspect-ratio changes during texture sampling.
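A minimal sketch of both suggestions, assuming a 2:1 card and a hypothetical card.png (the exact repeat/offset values depend on your texture layout):

var texture = new THREE.TextureLoader().load('card.png'); // hypothetical asset
texture.minFilter = THREE.LinearMipMapLinearFilter; // trilinear filtering
texture.magFilter = THREE.LinearFilter;
texture.anisotropy = renderer.capabilities.getMaxAnisotropy();

// For a 2:1 card with a square texture, sample only a 2:1 band of the
// texture instead of letting the sampler squash the whole square:
texture.wrapS = texture.wrapT = THREE.ClampToEdgeWrapping;
texture.repeat.set(1, 0.5);
texture.offset.set(0, 0.25);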
I am struggling with the common transparency-sorting issue. I know there are ways around it (like manually sorting the objects, or order-independent transparency), but all of that can become quite fiddly. I'd be OK if there were a way to have objects that are partly opaque and partly 100% transparent and have them intersect correctly.
In theory this should be possible: opaque pixels would be rendered to the color buffer and z-buffer in the standard way, and transparent pixels would simply be left out.
What I'm looking for is something like the indexed transparency used in GIF files, where, for instance, all pixels of an object that have the color #FF00FF are not rendered.
I just don't know if and how this is possible using three.js. Also, I want to be able to use it with custom shaders.
EDIT: Thanks for your comments so far, and sorry for the confusion. This is more of a conceptual question than a specific problem with my code. It's just that I am often faced with the issue that parts of transparent objects cut out parts of other transparent objects that should be in front of them. Also, transparent objects do not intersect correctly; one always covers the other. I understand why this happens and that it is inherent to the way transparency is handled. But often I only need parts of an object to be completely transparent, not partial-shine-through alpha transparency. That could work if there were a way to leave out certain pixels of an object and render the rest like a normal opaque object.
Let's assume I want a metal chain where each segment is a PlaneGeometry with a texture that shows the shape of an O (with the rest transparent). The chain should then be shown with correct interlinkage, so to speak.
Any help welcome!
Cheers!
If you are rendering a three.js scene, and your texture maps contain no partially-transparent pixels -- that is, each pixel is either 100% opaque or 100% transparent, then you can achieve a proper rendering by setting
material.alphaTest = 0.5;
//material.transparent = true; // likely not needed
The same holds true if you are using a binary alpha-map.
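Put together, a minimal sketch for the chain-link example (the texture file name is a placeholder) might look like this:

var texture = new THREE.TextureLoader().load('chain-link.png'); // hypothetical
var material = new THREE.MeshPhongMaterial({
    map: texture,
    alphaTest: 0.5 // fragments with alpha below 0.5 are discarded
});
// Depth is written for the surviving opaque pixels, so the links
// intersect correctly without any render-order sorting.
var link = new THREE.Mesh(new THREE.PlaneGeometry(2, 1), material);
scene.add(link);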
If you are writing a custom shader, then you can achieve the same effect by using a pattern like the following in your fragment shader:
if ( texelColor.a < 0.5 ) discard;
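For example, a minimal ShaderMaterial sketch (the uniform name map is my own) showing that pattern in a complete fragment shader:

var material = new THREE.ShaderMaterial({
    uniforms: {
        map: { value: texture }
    },
    vertexShader: [
        'varying vec2 vUv;',
        'void main() {',
        '    vUv = uv;',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
        '}'
    ].join('\n'),
    fragmentShader: [
        'uniform sampler2D map;',
        'varying vec2 vUv;',
        'void main() {',
        '    vec4 texelColor = texture2D( map, vUv );',
        '    if ( texelColor.a < 0.5 ) discard; // skip fully transparent pixels',
        '    gl_FragColor = texelColor;',
        '}'
    ].join('\n')
});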
three.js r.84
I'm wrapping a 3D model with a texture in three.js, but in some areas the image is stretched even though my texture has a good resolution.
This is a UV-mapping issue, not a three.js issue. Ask over in the 3D modelling or game development communities.
I'm trying, and failing, to work out how to achieve a quad-tree of materials (images) on a single plane, much like Google Maps-style zoomable tiles that get more accurate the closer you get.
In short, I want to be able to have a 1x1 image texture (covering a plane that is 256 units wide and tall) that can then be replaced with a 2x2 texture, that can then be replaced with a 4x4 texture, and so on.
Like the image example below…
Ideally, I want to avoid having to create a different plane for each zoom level / number of segments. A perfect solution would allow me to break a single plane into 8x8 segments (highest zoom) and update the number of textures on the fly. So it would start with a 1x1 texture across all 64 (8x8) segments, then change into a 2x2 texture with each texture covering 4x4 segments, and so on.
Unfortunately, I can't work out how to do this. I explored setting the materialIndex for each face, but you aren't able to update those after the first render, so that wouldn't work. I've tried looking into UV coordinates, but I don't understand how they would work in this situation, nor how to actually implement them in three.js; there is little in the way of documentation or examples for this specific case.
A vertex shader is another option that came up in my research, but again I don't know enough to understand how to construct one.
I'd appreciate any and all help with this; I'm sure it will be a technique that proves valuable for other three.js users too.
Not 100% sure what you are trying to do, i.e. whether you are talking about texture atlasing (looking up different textures based on the current setting/zoom), but if you are looking for quad-tree-based texturing that increases in detail as you zoom in, then this is essentially what mipmapping is and does.
(It can also be used to do all sorts of weird things because of that, but that's another adventure entirely.)
Generally, mipmapping is automatic based on the filtering you use; however, it sounds like you need more control over it.
I created an example hidden away in the three.js source tree which may help:
http://mrdoob.github.com/three.js/examples/webgl_materials_texture_manualmipmap.html
It shows how to load each mipmap level in manually, rather than having it be generated automatically.
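The gist of it is something like this (the file names are placeholders; each level must be half the size of the previous one, ideally all the way down to 1x1 for a complete chain):

function loadImage(url) {
    return new Promise(function (resolve) {
        var image = new Image();
        image.onload = function () { resolve(image); };
        image.src = url;
    });
}

Promise.all([
    loadImage('tiles/level0-256.png'), // 256x256
    loadImage('tiles/level1-128.png'), // 128x128
    loadImage('tiles/level2-64.png')   // 64x64, and so on down the chain
]).then(function (images) {
    var texture = new THREE.Texture(images[0]);
    texture.mipmaps = images;          // supply the chain by hand
    texture.generateMipmaps = false;   // keep WebGL from regenerating it
    texture.minFilter = THREE.LinearMipMapLinearFilter;
    texture.needsUpdate = true;
});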
HTH