cracks at edges of shader material in THREE.js - animation

I'm using a shader material to wrap a rectangular texture around a torus in the expected way, gluing edge to edge. The problem is that at the edges the texture doesn't quite wrap around, and there's a little glowing crack that looks like a scan line.
I've isolated the problem to my fragment shader. I need to be able to shift the uv values freely in my actual program, so the fragment shader has a few lines that normalize the uv values to keep them between 0 and 1. However, those lines cause the texture cracks, and I'm having trouble understanding why: shouldn't the uv coordinates be between 0 and 1 to begin with? Even with no rotation going on, these lines cause the cracks; remove them and the cracks disappear. But I can't remove them, because with a rotation applied I then get a totally undesired effect. I'm hoping someone can explain how these lines in the fragment shader cause the cracks, and what I can do to get the same behavior without them. I'm a real novice in GLSL.
The problem occurs in the code below, in the if statements.
uniform sampler2D iChannel0;
uniform float rotX;
uniform float rotY;
varying vec2 vUv;

void main() {
    vec2 uv = vUv;
    // Shift the texture by the rotation offsets.
    uv.x = uv.x + rotX;
    uv.y = uv.y + rotY;
    // Wrap the coordinates back into [0, 1].
    if (uv.y > 1.)
        uv.y = uv.y - 1.;
    if (uv.y < 0.)
        uv.y = uv.y + 1.;
    if (uv.x > 1.)
        uv.x = uv.x - 1.;
    if (uv.x < 0.)
        uv.x = uv.x + 1.;
    gl_FragColor = texture2D(iChannel0, vec2(uv.x, uv.y));
}
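For reference, the four ifs can be collapsed into GLSL's built-in fract(), which wraps any offset into [0, 1), even offsets larger than one full period (which the if chain does not handle). Note that this sketch has the same value discontinuity at the seam as the if chain, so it reproduces the crack rather than fixing it:

vec2 uv = fract(vUv + vec2(rotX, rotY)); // wraps any offset into [0, 1)
gl_FragColor = texture2D(iChannel0, uv);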
Here's a very distilled jsfiddle reproducing the problem and showing a meridian texture rotation on the torus.

Related

How do you increase the space between pixels in a fragment shader?

I'm currently working on a shader for a very mundane effect I'd like to achieve. It's a little bit hard to explain, but the basic gist is that I'm trying to "pull apart" the pixels of a pixel art image.
You can see my current progress, however minor, at this jsfiddle:
https://jsfiddle.net/roomyrooms/Lh46z2gw/85/
I can distort the image easily, of course, and make it stretch the further it is from the center. But this distorts and warps it smoothly, and all the pixels remain connected (whether they're squished, smeared, etc.).
I would like to get an effect where the space between the pixels is stretched rather than the pixels themselves stretching. Sort of like if you were to swipe sand across a table. The grains of sand stay the same size, they just get further apart.
Any ideas are welcome! Thanks. Here's what I've got code-wise so far:
var fragmentShader = `
precision mediump float;

varying vec2 vTextureCoord;
uniform sampler2D uSampler;
uniform highp vec4 inputSize;
uniform float time;

// Map normalized texture coords to pixel coords.
vec2 mapCoord( vec2 coord )
{
    coord *= inputSize.xy;
    coord += inputSize.zw;
    return coord;
}

vec2 unmapCoord( vec2 coord )
{
    coord -= inputSize.zw;
    coord /= inputSize.xy;
    return coord;
}

void main(void)
{
    vec2 coord = mapCoord(vTextureCoord);
    // Push each column away from the vertical center line.
    float dist = distance(coord.x, inputSize.x / 2.);
    coord.x += dist / 4.;
    coord = unmapCoord(coord);
    gl_FragColor = texture2D(uSampler, coord);
}`
EDIT: Added an illustration of the effect I'm trying to achieve here:
I can get something along these lines with modulo, but it's discarding half of the image in the process.
You can:
- discard some fragments (that is slow on some mobile devices),
- use a stencil mask to draw only where you want,
- draw the pixels you don't want as transparent (alpha = 0),
- or, lastly, draw an array of points or squares and move them around.
As far as I know, the fragment shader runs on every pixel in your triangle, and you can only tell it what color to set that pixel to. In your example you're already duplicating columns of pixels, so you can discard some, hopefully without losing any of the source image's pixels: stretch the coord 2x, then discard every other column.
vec2 coord = mapCoord(vTextureCoord);
// Keep only every other one-pixel-wide column (limited here to coord.x < 100).
if (coord.x < 100.0 && floor(coord.x / 2.0) == floor((coord.x + 1.0) / 2.0))
    discard;
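A rough sketch of that stretch-then-discard idea, assuming the same mapCoord/unmapCoord helpers and uniforms as the question's shader, with this main() replacing the question's:

void main(void)
{
    vec2 coord = mapCoord(vTextureCoord);
    // Discard every other one-pixel-wide screen column...
    if (floor(coord.x / 2.0) == floor((coord.x + 1.0) / 2.0))
        discard;
    // ...and sample at half the rate, so the surviving columns together
    // still cover every source column: same-sized pixels, spread apart.
    coord.x = coord.x / 2.0;
    coord = unmapCoord(coord);
    gl_FragColor = texture2D(uSampler, coord);
}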

Finding the size of a screen pixel in UV coordinates for use by the fragment shader

I've got a very detailed texture with false-color information, which I render via a false-color lookup in the fragment shader. My problem is that sometimes the user will zoom far away from this texture, and the fine detail is lost: fine lines in the texture can't be seen. I would like to modify my code to make these lines pop out.
My thinking is that I can run a fast filter over neighboring texels and pick out the biggest/smallest/most interesting value to render. What I'm not sure about is how to find out if (and by how much) to do this: when the user is zoomed in on a triangle, I want the standard lookup; when they are zoomed out, a single pixel on the screen maps to many texture pixels.
How do I get an estimate of this? I am doing this with both orthographic and perspective cameras.
My thinking is that I could somehow use the vertex shader to get an estimate of how big one screen pixel is in UV space and pass that as a varying to the fragment shader, but I don't yet have a solid enough grasp of the transforms and spaces involved to work out the details.
My current vertex shader is quite simple:
varying vec2 vUv;
varying vec3 vPosition;
varying vec3 vNormal;
varying vec3 vViewDirection;

void main() {
    vUv = uv;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    vPosition = (modelMatrix * vec4( position, 1.0 )).xyz;
    gl_Position = projectionMatrix * mvPosition;
    vec3 transformedNormal = normalMatrix * vec3( normal );
    vNormal = normalize( transformedNormal );
    vViewDirection = normalize( mvPosition.xyz );
}
How do I get something like vDeltaUV, which gives the distance between screen pixels in UV units?
Constraints: I'm working in WebGL, inside three.js.
Here is an example image, where the user has zoomed in close to my texture with the perspective camera:
Here is the same example, but zoomed out; the feature above is a barely perceptible diagonal line near the center (see the coordinates to get a sense of scale). I want this line to pop out by rendering all pixels with the reddest color of the corresponding array of texels.
Addendum (re LJ's comment)...
No, I don't think mipmapping will do what I want here, for two reasons.
First, I'm not actually mapping the texture directly; that is, I'm doing something like this:
vec4 inputColor = texture2D(inputtexture, vUv);
gl_FragColor = texture2D(mappingtexture, vec2(inputColor.g, inputColor.r));
The user dynamically creates the mappingtexture, which allows me to vary the false-color map in real time. I think it's actually a very elegant solution for my application.
Second, I don't want to draw the AVERAGE value of neighboring pixels (i.e. smoothing); I want the most EXTREME value of neighboring pixels (i.e. something more akin to edge finding). "Extreme" in this case is technically defined by my encoding of the g/r color values in the input texture.
Solution:
Thanks to the answer below, I've now got a working solution.
In my JavaScript code, I had to add:
extensions: { derivatives: true }
to my declaration of the ShaderMaterial. Then, in my fragment shader:
float dUdx = dFdx(vUv.x); // difference in U between this pixel and the one to its right
float dUdy = dFdy(vUv.x); // difference in U between this pixel and the one above it
float dU = sqrt(dUdx * dUdx + dUdy * dUdy);
float pixel_ratio = dU * uInputTextureResolution;
This allows me to do things like this:
float x = ...; // the u coordinate in pixels in the input texture
float y = ...; // the v coordinate in pixels in the input texture
vec4 inc = get_encoded_adc_value(x, y);

// Extremum mapping:
if (pixel_ratio > 2.0) {
    inc = most_extreme_value(inc, get_encoded_adc_value(x + 1.0, y));
}
if (pixel_ratio > 3.0) {
    inc = most_extreme_value(inc, get_encoded_adc_value(x - 1.0, y));
}
The effect is subtle, but definitely there! The lines pop much more clearly.
Thanks for the help!
You can't do this in the vertex shader, as it runs pre-rasterization and is therefore agnostic of the output resolution. In the fragment shader, though, you can use dFdx, dFdy and fwidth (via the GL_OES_standard_derivatives extension, which is available pretty much everywhere) to estimate the sampling footprint.
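For illustration, a minimal standalone fragment shader using fwidth this way; uTextureResolution is an assumed uniform holding the texture size in texels:

#extension GL_OES_standard_derivatives : enable
precision mediump float;

varying vec2 vUv;
uniform vec2 uTextureResolution; // assumed uniform: texture size in texels

void main() {
    // fwidth(vUv) = abs(dFdx(vUv)) + abs(dFdy(vUv)):
    // the approximate UV footprint of one screen pixel.
    vec2 texelsPerPixel = fwidth(vUv) * uTextureResolution;
    float pixelRatio = max(texelsPerPixel.x, texelsPerPixel.y);
    // pixelRatio > 1.0 means one screen pixel covers several texels,
    // i.e. the camera is zoomed out enough to widen the filter.
    gl_FragColor = vec4(vec3(pixelRatio), 1.0); // visualize the estimate
}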
If you're not updating the texture in realtime, a simpler and more efficient solution would be to generate custom mip levels for it on the CPU.

Coloring a rectangle as a function of distance to the nearest edge produces a weird result on the diagonals

I'm trying to color a rectangle in ShaderToy/GLSL as a function of each pixel's distance to the nearest rectangle edge. However, a weird (darker) result can be seen on its diagonals:
I'm using the rectangle UV coordinates for it, with the following piece of code:
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    // Normalized pixel coordinates (from 0 to 1)
    vec2 uv = fragCoord / iResolution.xy;
    vec2 uvn = abs(uv - 0.5) * 2.0;
    // Chebyshev distance from the rectangle center (0 at center, 1 at the edges).
    float maxc = max(uvn.y, uvn.x);
    vec3 mate = vec3(maxc);
    fragColor = vec4(mate.xyz, 1);
}
As you can see, the error seems to come from the max(uvn.y, uvn.x) line, as it doesn't interpolate the color values smoothly, as one would expect. For comparison, these are the images obtained by sampling uvn.y and uvn.x instead of the maximum of the two:
You can play around with the shader at this URL:
https://www.shadertoy.com/view/ldcyWH
The effect that you can see is an optical illusion; the values are actually continuous across the diagonals. You can make this visible by grading the colors; see the answer to the Stack Overflow question Issue getting gradient square in glsl es 2.0, Gamemaker Studio 2.0.
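For example (a sketch, dropped into the question's mainImage in place of its last two lines): quantizing the gradient makes bands visible, and they stay straight and evenly spaced across the diagonal, showing there is no actual discontinuity there.

// Posterize the gradient into 10 discrete steps.
float bands = floor(maxc * 10.0) / 10.0;
fragColor = vec4(vec3(bands), 1.0);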
To achieve a better result, you can use a shader which smoothly changes the gradient, from a circular (or elliptical) gradient in the middle of the view to a square gradient at the borders of the view:
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    // Normalized pixel coordinates (from 0 to 1)
    vec2 uv = fragCoord / iResolution.xy;
    vec2 uvn = abs(uv - 0.5) * 2.0;

    vec2 distV = uvn;
    float maxDist = max(abs(distV.x), abs(distV.y));
    float circular = length(distV);
    float square = maxDist;

    vec3 color1 = vec3(0.0);
    vec3 color2 = vec3(1.0);
    // Blend from the circular gradient near the center to the square one at the border.
    vec3 mate = mix(color1, color2, mix(circular, square, maxDist));
    fragColor = vec4(mate.xyz, 1);
}
Preview:

My fragment shader in a WebGL program is setting all the colors from my texture to black

I have a simple game that uses three textures with transparent parts. I can see the silhouettes of my textures, but anywhere the alpha isn't zero renders black (0, 0, 0, 1).
Here's my fragment shader:
precision mediump float;

// our textures
uniform sampler2D u_image0;
uniform sampler2D u_image1;
uniform sampler2D u_image2;

// the texCoords passed in from the vertex shader
varying vec2 v_texCoord;

void main() {
    // Look up a color from each texture.
    vec4 textureColor = texture2D(u_image0, v_texCoord);
    if (textureColor.a < 0.5)
        discard;
    else
        gl_FragColor = vec4(textureColor.rgb, textureColor.a);

    vec4 textureColor1 = texture2D(u_image1, v_texCoord);
    if (textureColor1.a < 0.5)
        discard;
    else
        gl_FragColor = vec4(textureColor1.rgb, textureColor1.a);

    vec4 textureColor2 = texture2D(u_image2, v_texCoord);
    if (textureColor2.a < 0.5)
        discard;
    else
        gl_FragColor = vec4(textureColor2.rgb, textureColor2.a);
}
I got the alpha-test conditional from another question, where pixels with zero alpha were being set to white. It solved that problem, but I'm not sure it scales properly to multiple textures; I'm pretty sure I'm doing it wrong.
Thanks in advance, and let me know if I need to add more code (vertex shader, etc).
It is unclear to me what you are actually trying to achieve.
The way you wrote this code makes me think you do not know what the discard statement actually does: it completely discards the fragment, and the current invocation of the shader is aborted immediately.
So your shader just discards the whole fragment if any of the three textures has an alpha value below 0.5. The fact that you have written to gl_FragColor before the discard does not matter at all. If all of the textures have an alpha above 0.5, the final color will be that of u_image2.
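If the intent is to layer the three textures over each other, a sketch of one alternative (assuming u_image0 is the bottom layer and u_image2 the top) blends by alpha instead of discarding per texture:

precision mediump float;

uniform sampler2D u_image0;
uniform sampler2D u_image1;
uniform sampler2D u_image2;
varying vec2 v_texCoord;

void main() {
    vec4 c0 = texture2D(u_image0, v_texCoord);
    vec4 c1 = texture2D(u_image1, v_texCoord);
    vec4 c2 = texture2D(u_image2, v_texCoord);
    // Composite bottom to top: each layer covers what is below it
    // in proportion to its own alpha.
    vec3 rgb = mix(mix(c0.rgb, c1.rgb, c1.a), c2.rgb, c2.a);
    float a = max(c0.a, max(c1.a, c2.a));
    // Only discard where all three layers are (nearly) transparent.
    if (a < 0.5)
        discard;
    gl_FragColor = vec4(rgb, a);
}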

GLSL: simulating 3D texture with 2D texture

I came up with some code that simulates a 3D texture lookup using a big 2D texture that contains the tiles. The 3D texture is 128x128x64, and the big 2D texture is 1024x1024, divided into 64 tiles of 128x128.
The lookup code in the fragment shader looks like this:
#extension GL_EXT_gpu_shader4 : enable

varying float LightIntensity;
varying vec3 pos;
uniform sampler2D noisef;

vec4 flat_texture3D()
{
    vec3 p = pos;
    vec2 inimg = p.xy;
    // Pick the depth slice and locate its tile in the 8x8 atlas.
    int d = int(p.z * 128.0);
    float ix = float(d % 8);
    float iy = float(d / 8);
    vec2 oc = inimg + vec2(ix, iy);
    oc *= 0.125;
    return texture2D(noisef, oc);
}
void main (void)
{
vec4 noisevec = flat_texture3D();
gl_FragColor = noisevec;
}
The tiling logic seems to work OK, and there is only one problem with this code. It looks like this:
There are strange one-to-two-pixel-wide streaks between the layers of voxels.
The streaks appear just at the border where d changes.
I've been working on this for two days now, still without any idea of what's going on here.
This looks like a texture filtering issue. Think about it: when you come close to the tile border, the bilinear filter will consider the neighboring texels, in your case from another "depth layer".
To avoid this, you can clamp the texture coords so that they are never outside the rect defined by the outermost texel centers of the tile (similar to GL_CLAMP_TO_EDGE, but on a per-tile basis). But be aware that the problem will get worse when using mipmapping. Also be aware that you currently cannot filter in the z direction, as a real 3D texture would; you could simulate this manually in the shader, of course.
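A sketch of that per-tile clamp, assuming the 1024x1024 atlas with 128x128 tiles described in the question (one texel is 1.0/1024.0 in atlas coordinates; integer % again needs GL_EXT_gpu_shader4, which the shader above already enables):

vec4 flat_texture3D_clamped()
{
    vec3 p = pos;
    int d = int(p.z * 128.0);
    vec2 tileOrigin = vec2(float(d % 8), float(d / 8)) * 0.125;
    // Keep the coordinate between the centers of the tile's outermost
    // texels, so the bilinear filter never reads a neighboring layer.
    float halfTexel = 0.5 / 1024.0;
    vec2 inTile = clamp(p.xy * 0.125, vec2(halfTexel), vec2(0.125 - halfTexel));
    return texture2D(noisef, tileOrigin + inTile);
}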
But really: why not just use 3D textures? The hardware can do all of this for you, with much less overhead...
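For comparison, the equivalent lookup with a real 3D texture, as a sketch; this needs a sampler3D, i.e. desktop GL (the question's GL_EXT_gpu_shader4 already implies that), and noise3d is a hypothetical 3D texture replacing the atlas:

varying vec3 pos;
uniform sampler3D noise3d; // hypothetical 3D texture replacing the atlas

void main(void)
{
    // The hardware handles tiling, wrapping and filtering in all
    // three directions, including z.
    gl_FragColor = texture3D(noise3d, pos);
}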
