I have a simple three.js application. I'm trying to texture some geometry, and my hardware is VERY memory constrained. Conveniently, my texture is grayscale, so I figured I could save the texture in an 8-bit format and then load it with AlphaFormat as the texture's pixel format. I was hoping this would help keep my video memory usage down. Does anyone have experience doing something like that? It's just wasteful for my application to load three channels' worth of color for a grayscale image. Any suggestions would be great.
Thanks!
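For reference, a minimal sketch of what I'm attempting. The file name and 256x256 size are placeholders, and the exact format constant depends on your three.js version (LuminanceFormat replicates the single channel into RGB when sampled; AlphaFormat/RedFormat expose it as alpha/red only):

    // Upload a single-channel 8-bit buffer as a one-byte-per-pixel texture.
    const width = 256, height = 256;           // placeholder dimensions
    fetch('gray.bin')                          // placeholder: raw 8-bit grayscale bytes
      .then(r => r.arrayBuffer())
      .then(buf => {
        const data = new Uint8Array(buf);      // one byte per pixel
        const texture = new THREE.DataTexture(
          data, width, height,
          THREE.LuminanceFormat,               // or THREE.AlphaFormat / THREE.RedFormat
          THREE.UnsignedByteType
        );
        texture.needsUpdate = true;
        // LuminanceFormat samples the single channel into RGB, so the
        // texture can be used directly as a color map.
        const material = new THREE.MeshBasicMaterial({ map: texture });
      });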
What are good ways/best practices to improve texture quality in THREE.js?
I have a scene with planes (cards) that use 512x512px textures; you can see how it looks in the images below. My problem is that the textures look blurred. I have tried changing the filters and the anisotropy value, and that helps a little, but the textures are still blurred. The only way I've found to get the look I want is to double the render size while keeping the canvas size the same. That's a bad approach because of performance issues, but I haven't found another way to get good texture quality.
The best quality - render size x2
Normal quality - magFilter = minFilter = THREE.LinearMipMapLinearFilter, anisotropy = 16
Bad quality - no filters
Any help is appreciated; thanks in advance.
You can hardly do better than trilinear filtering with 16x anisotropy (and not all hardware can achieve 16x anisotropic filtering).
However, you say your textures are 512x512, while (if your snapshots are actual size) it appears that:
They are rendered much smaller than 512x512. That means a lower mipmap level, generated by WebGL, is currently being used to render your cards.
Your cards are rectangular while your textures are square. Depending on how you mapped the texture onto your shape, this could mean the aspect ratio changes, so the sampler needs to do more interpolation (more filtering, meaning more blur).
So what you can try to do is:
Use a smaller base texture, 256x256 for example, downscaled yourself with the best sharpness you can get, so that no minification filtering is needed when WebGL samples the texture.
Adapt the mesh texture coordinates to your texture, or vice versa, to avoid aspect-ratio changes during texture sampling.
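For completeness, the filtering setup discussed above looks like this in three.js (the anisotropy accessor depends on your version):

    const texture = new THREE.TextureLoader().load('card.png'); // placeholder file
    texture.minFilter = THREE.LinearMipMapLinearFilter; // trilinear minification
    texture.magFilter = THREE.LinearFilter;             // mag filter can only be nearest/linear
    // Clamp to what the hardware supports (older versions: renderer.getMaxAnisotropy()).
    texture.anisotropy = renderer.capabilities.getMaxAnisotropy();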
I'm trying to get DDS textures to work with three.js. I have a model in JSON (converted from .obj + .mtl using the three.js converter) using baked textures in JPG/PNG format. I've created a DDS texture (DXT1 with mipmaps). When I load the model (using JSONLoader) with the DDS texture, the UV map doesn't seem to be applied; I get no mapping at all.
For example plane with jpg texture:
And by switching to DDS I am getting this:
Is this expected behavior? Or maybe DDS textures don't support UV maps? Or maybe it's some sort of bug in three.js?
I could really use some help, guys.
As explained in https://github.com/mrdoob/three.js/issues/4316, DDS textures can't be flipped on upload like normal JPG/PNG images, so they appear upside down. The solution is to flip the source image before compressing it, or to make the shader aware of this and flip the UV coordinates.
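If you go the UV route, something along these lines should work for a BufferGeometry (assumes a standard 'uv' attribute):

    // Flip the V coordinate on the geometry to compensate for the
    // un-flipped DDS data.
    const uv = geometry.attributes.uv;
    for (let i = 0; i < uv.count; i++) {
      uv.setY(i, 1 - uv.getY(i));
    }
    uv.needsUpdate = true;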
I use a wood texture image in my model. By default the texture is stretched across the model, as you can see on the dark wood. When I changed the repeat, the texture stretched even more, and I don't understand why. I've searched for a basic explanation of how to map a texture onto my model correctly, but I have only found examples using colored pixels.
Thanks for any answers.
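One thing worth checking: .repeat only tiles the texture if wrapping is enabled; by default three.js textures clamp to the edge, so changing .repeat alone just rescales/stretches the image. A minimal sketch (wood.jpg is a placeholder):

    const texture = new THREE.TextureLoader().load('wood.jpg');
    texture.wrapS = THREE.RepeatWrapping;   // tile horizontally
    texture.wrapT = THREE.RepeatWrapping;   // tile vertically
    texture.repeat.set(4, 4);               // 4x4 tiles; adjust to taste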
You should make sure your textures have power-of-two dimensions (i.e. 256x256, 512x512, etc.). Textures of arbitrary dimensions (NPOT) bring all kinds of mapping trouble in WebGL.
If you are unable to resize textures server-side, you can do it client-side. This link has some sample javascript code, as well as other relevant information: http://www.khronos.org/webgl/wiki/WebGL_and_OpenGL_Differences
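The client-side resize can be as simple as drawing into a power-of-two canvas, along the lines of the sample code behind that link:

    // Resize an arbitrary image to the next power of two via a canvas.
    function nextPowerOfTwo(n) {
      return Math.pow(2, Math.ceil(Math.log2(n)));
    }

    function resizeToPOT(image) {
      const canvas = document.createElement('canvas');
      canvas.width = nextPowerOfTwo(image.width);
      canvas.height = nextPowerOfTwo(image.height);
      canvas.getContext('2d').drawImage(image, 0, 0, canvas.width, canvas.height);
      return canvas; // usable as the source of a THREE.Texture
    }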
I'm having trouble maintaining texture quality when zooming in on a 3D model textured with a two-dimensional image in OpenGL ES, so I've decided to ask for help!
I load the model from an .obj-format text file and texture it with a 1024x1024 image. I can arbitrarily rotate and scale the textured 3D model, but past a certain zoom level the texture looks low quality.
How can I ensure high texture quality?
What is a good method to solve this problem?
Any pointers would be appreciated.
Thanks!
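For what it's worth, the baseline sampling setup looks like this (written as WebGL here; the OpenGL ES calls have the same names, glTexParameteri and glGenerateMipmap). Beyond this, quality under heavy zoom-in is limited by the resolution of the source image itself:

    gl.bindTexture(gl.TEXTURE_2D, texture);
    // Trilinear minification and linear magnification.
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.generateMipmap(gl.TEXTURE_2D);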
I'm writing a drawing application, and the drawing canvas is an OpenGL texture. When you draw onto the canvas, it determines which region of the canvas texture has been changed, and copies that pixel data out (using glReadPixels) before applying the changes you made.
To undo, I want to simply revert to the previous texture state using that pixel data that was copied out. However, OpenGL ES doesn't provide a glDrawPixels command. What's the best way to do it?
I've considered two options, but I'm not sure either is that great:
Create a temporary texture using the pixels I copied out and draw that in. (However, the copied region is not a power of two!)
Unbind the large canvas texture completely, manually alter the bytes of the texture, and then put it back into OpenGL. I'm not using any sort of compression, so this might not be that bad. But it seems like a hack? (A sketch of roughly what I mean follows.)
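A sketch of option 2, written as WebGL for brevity (OpenGL ES has the same glTexSubImage2D entry point; variable names are placeholders):

    // Overwrite just the changed rectangle with the pixel data saved
    // earlier by readPixels, instead of rebuilding the whole texture.
    // The sub-region does not need power-of-two dimensions.
    gl.bindTexture(gl.TEXTURE_2D, canvasTexture);  // the drawing-canvas texture
    gl.texSubImage2D(
      gl.TEXTURE_2D, 0,          // target, mip level
      regionX, regionY,          // offset of the changed region
      regionWidth, regionHeight, // size of the changed region
      gl.RGBA, gl.UNSIGNED_BYTE,
      savedPixels                // Uint8Array captured before the change
    );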
Anybody have any ideas? I'd really appreciate it!
In case anyone stumbles across this while trying to do something similar, I've come up with a solution that seems to work well.
Grab an image of the current texture by binding it to the framebuffer and then writing the framebuffer to a CGImageRef.
Create a new CGContext and draw in the existing texture CGImageRef. Then draw the old texture data into the portion that the user changed, effectively "undoing" that change to the image.
Destroy old OpenGL texture and create a texture from the CGContext.
I think this is a pretty slow way of going about things, but I don't need huge performance - my real concern was limiting the amount of data being kept to represent the "old" texture.
If you need help with this (there's quite a bit of code) feel free to email me.
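For anyone doing this in WebGL rather than Core Graphics, a rough sketch of the same three steps (the undoRegion helper and the RGBA/no-Y-flip assumptions are mine, not the original code):

    // 1) Read the texture back through a framebuffer, 2) overwrite the
    // changed region with the saved pixels, 3) re-upload the result.
    // Ignores Y-flip bookkeeping for brevity.
    function undoRegion(gl, texture, texW, texH, region, savedPixels) {
      const fb = gl.createFramebuffer();
      gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
      gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                              gl.TEXTURE_2D, texture, 0);
      const pixels = new Uint8Array(texW * texH * 4);
      gl.readPixels(0, 0, texW, texH, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
      gl.bindFramebuffer(gl.FRAMEBUFFER, null);
      gl.deleteFramebuffer(fb);

      // Patch the changed rows with the previously saved data.
      for (let row = 0; row < region.h; row++) {
        const dst = ((region.y + row) * texW + region.x) * 4;
        const src = row * region.w * 4;
        pixels.set(savedPixels.subarray(src, src + region.w * 4), dst);
      }

      // Upload the restored image back into the texture.
      gl.bindTexture(gl.TEXTURE_2D, texture);
      gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, texW, texH, 0,
                    gl.RGBA, gl.UNSIGNED_BYTE, pixels);
    }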