I need to remove all odd lines from a texture - this is part of a simple deinterlacer.
In the following code sample, instead of fetching RGB from the texture, I output white for odd lines and red for even lines, so I can visually check whether the result is what I expect.
_texcoord is passed in from the vertex shader and ranges over [0, 1] in both x and y.
uniform sampler2D sampler0; /* not used here because we directly output white or red */
varying highp vec2 _texcoord;

void main() {
    highp float height = 480.0; /* assume the texture has a height of 480 */
    highp float y = height * _texcoord.y;
    if (mod(y, 2.0) >= 1.0) {
        gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0); /* odd line: white */
    } else {
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); /* even line: red */
    }
}
When rendered to the screen, the output isn't what I expected. Vertically it's RRWWWRRWWW, but I am really expecting RWRWRWRW (i.e. alternating between red and white).
My code runs on iOS and targets GLES 2.0, so it should behave the same on Android with GLES 2.0.
Question: where did I go wrong?
EDIT
Yes, the texture height is correct.
I guess my question is: given a _texcoord.y, how can I tell whether it refers to an odd or even line of the texture?
uniform sampler2D tex;  /* declaration added: the input texture */
varying highp vec2 uv;  /* declaration added: texture coordinate from the vertex shader */

void main(void)
{
    /* gl_FragCoord is in window pixels, so its integer part is the pixel row/column */
    vec2 p = vec2(floor(gl_FragCoord.x), floor(gl_FragCoord.y));
    if (mod(p.y, 2.0) == 0.0)
        gl_FragColor = vec4(texture2D(tex, uv).xyz, 1.0);  /* even row: keep the texel */
    else
        gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);           /* odd row: black */
}
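To answer the edited question directly (deciding parity from _texcoord.y alone): a minimal sketch, assuming the texture height is known and that texel centers sit at (i + 0.5) / height, so floor() recovers the row index. If the render target is not the same size as the texture, the gl_FragCoord approach above is more robust.

varying highp vec2 _texcoord;

void main() {
    highp float height = 480.0;                    /* must match the real texture height */
    highp float row = floor(_texcoord.y * height); /* texel centers at (i + 0.5) / height */
    if (mod(row, 2.0) < 0.5) {
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);   /* even row: red */
    } else {
        gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);   /* odd row: white */
    }
}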
I'm drawing a 2D plan on the screen using WebGL. I would like to rotate the plan a bit to give a 3D impression.
current:
wanted:
My first approach was to use vanishing points, as when drawing in perspective, but I didn't know how to change the y coordinate and I didn't get to the end. Is there an easier way to rotate the output?
Here is my code:
uniform float scale;
uniform vec2 ratio;
uniform vec2 center;
in vec3 fillColor;
in vec2 position;
out vec3 color;

void main() {
    color = fillColor;
    gl_Position = vec4((position - center) * ratio, 0.0, scale);
}
If you want to build a whole game engine or a complex animation, you will need to dig into perspective projection matrices.
But if you just want to achieve this little effect and understand how it works, you can use the w coordinate of gl_Position. This coordinate tells the GPU how to interpolate varyings (UV texture coordinates, for example) in a perspective-correct way, and x, y and z are all divided by it.
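Concretely, after the vertex shader runs, the GPU performs the perspective divide: gl_Position = (x, y, z, w) becomes the normalized device coordinate (x/w, y/w, z/w). So a vertex with w = 2.0 lands at half its clip-space x and y, which is exactly the shrinking effect used below.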
So let's assume you want to display a rectangle. You will need two triangles.
4 vertices will suffice if you use TRIANGLE_STRIP mode. We could use only one attribute, but for the sake of the tutorial, I will use two:
Vertex # | attPos | attUV
-------- | ------ | -----
0        | -1, +1 | 0, 0
1        | -1, -1 | 0, 1
2        | +1, +1 | 1, 0
3        | +1, -1 | 1, 1
And all the logic will be in the vertex shader:
uniform float uniScale;
uniform float uniAspectRatio;
attribute vec2 attPos;
attribute vec2 attUV;
varying vec2 varUV;

void main() {
    varUV = attUV;
    gl_Position = vec4(
        attPos.x * uniScale,
        attPos.y * uniAspectRatio,
        1.0,
        attUV.y < 0.5 ? uniScale : 1.0
    );
}
The line attUV.y < 0.5 ? uniScale : 1.0 means
If attUV.y is 0, then use uniScale
Otherwise use 1.0
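Because the GPU divides x and y by w, the two vertices with w = uniScale (the top edge, where attUV.y is 0) are pulled toward the center whenever uniScale is greater than 1.0. The rectangle is therefore rasterized as a trapezoid, which reads as a plane tilted away from the viewer.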
The attUV attribute lets you use a texture if you want. In this example,
I just simulate a checkerboard with this fragment shader:
precision mediump float;
const float MARGIN = 0.1;
const float CELLS = 8.0;
const vec3 ORANGE = vec3(1.0, 0.5, 0.0);
const vec3 BLUE = vec3(0.0, 0.6, 1.0);
varying vec2 varUV;
void main() {
    float u = fract(varUV.x * CELLS);
    float v = fract(varUV.y * CELLS);
    if (u > MARGIN && v > MARGIN) gl_FragColor = vec4(BLUE, 1.0);
    else gl_FragColor = vec4(ORANGE, 1.0);
}
You can see all this in action in this CodePen:
https://codepen.io/tolokoban/full/oNpBRyO
Basically I'm offsetting a texture2D lookup from its original inputImageTexture coordinate with this code:
highp vec2 toprightcoord = textureCoordinate + 0.25;
highp vec4 tr = texture2D(inputImageTexture, toprightcoord);
It does what it's supposed to do; however, it leaves a stretched pixel color from the edge of the offset texture (like cheese pulled from a pizza slice).
How do I replace that with an arbitrary color, or with transparency?
I assume that you have set the texture wrap parameters to GL_CLAMP_TO_EDGE. See glTexParameter.
This causes the stretched pixels when the texture is accessed, via the texture lookup function texture2D, with coordinates out of the range [0.0, 1.0].
You can create a "tiled" texture with the wrap parameter GL_REPEAT.
But if you want to "replace it with any color or transparent", then you have to do a range check.
The following code sets the alpha channel to 0.0 if the limit of 1.0 is exceeded by either the x or the y coordinate. The variable inBounds is set to 1.0 if the texture coordinate is in bounds, and to 0.0 otherwise:
vec2 toprightcoord = textureCoordinate + 0.25;
vec4 tr = texture2D(inputImageTexture, toprightcoord);

// each component is 1.0 while toprightcoord <= 1.0, else 0.0
vec2 boundsTest = step(toprightcoord, vec2(1.0));
float inBounds = boundsTest.x * boundsTest.y;
tr.a *= inBounds;
You can extend this to a full range test in [0.0, 1.0]:
vec2 boundsTest = step(vec2(0.0), toprightcoord) * step(toprightcoord, vec2(1.0));
Note that the GLSL function step
genType step(genType edge, genType x);
returns 0.0 if x[i] < edge[i], and 1.0 otherwise.
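For example, step(vec2(0.5), vec2(0.3, 0.7)) returns vec2(0.0, 1.0): the first component is below the edge, the second is not.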
With the GLSL function mix, the color can be replaced by a different one:
vec4 red = vec4(1.0, 0.0, 0.0, 1.0);
tr = mix(red, tr, inBounds);
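Putting both pieces together, a minimal sketch of the complete lookup (uniform and varying names as in the question; the transparent fallback is one possible choice):

uniform sampler2D inputImageTexture;
varying highp vec2 textureCoordinate;

void main() {
    highp vec2 toprightcoord = textureCoordinate + 0.25;
    highp vec4 tr = texture2D(inputImageTexture, toprightcoord);

    // 1.0 only while both coordinates stay inside [0.0, 1.0]
    highp vec2 boundsTest = step(vec2(0.0), toprightcoord) * step(toprightcoord, vec2(1.0));
    highp float inBounds = boundsTest.x * boundsTest.y;

    // out-of-bounds samples become transparent black; swap in any color here
    gl_FragColor = mix(vec4(0.0), tr, inBounds);
}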
I found this great HSL picker page, and I'm wondering if there is a way to achieve a similar effect in WebGL. I'm passing some color to my fragment shader, for example #FF7400; what is the easiest way to convert it to HSL and change its luminosity, or to get a smooth transition to black (luminosity equal to 0)? I want to make clouds on my page whose color (luminosity) depends on how far they are from the sun. Thanks in advance for any help.
Thanks for the great links, but I think I found a much simpler way to make an easy color transition. All I need is the GLSL function T mix(T x, T y, float a), a linear blend of x and y.
This is the code I use in the Shadertoy editor:
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord.xy / iResolution.xy;
    vec4 orange = vec4(0.533, 0.25, 0.145, 1.0);
    vec4 blue   = vec4(0.18, 0.23, 0.27, 1.0);
    vec4 black  = vec4(0.0, 0.0, 0.0, 1.0);
    vec4 white  = vec4(1.0, 1.0, 1.0, 1.0);
    float ratio = iResolution.x / iResolution.y;
    float PI = 3.14159265359;
    vec4 mixC = mix(orange, blue, sin(ratio * uv.y));
    mixC = mix(mixC, black, cos(2.0 * PI * uv.x) / ratio);
    mixC = mix(mixC, black, cos(2.0 * PI * uv.y) / ratio);
    mixC = mix(mixC, white, 0.1);
    fragColor = mixC;
}
As you can see, I've made a transition between four colors with just a couple of lines of code, and the result looks like this:
I think of a fragment shader as a little Photoshop. Every Photoshop operation should be possible with WebGL.
If we are talking about a 2D image where the sun position is relative, and you just want some basic image processing, you can use the rgb2hsv and hsv2rgb functions from this answer. I think they should work with GLSL ES 1.0.
Then you can scale the saturation, luminosity or hue and convert the result back to RGB.
If that doesn't work, you might have to reimplement them from the common formula; see the wiki or this link: http://www.rapidtables.com/convert/color/rgb-to-hsl.htm
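For example, here is a minimal sketch of the darkening step, assuming you have pasted the rgb2hsv/hsv2rgb helpers from the linked answer into your shader (the sunDistance uniform is hypothetical: 0.0 at the sun, 1.0 far away):

uniform float sunDistance; // hypothetical distance factor, 0.0 .. 1.0

// rgb2hsv() / hsv2rgb() are the helpers from the linked answer (not repeated here)
vec3 darkenByDistance(vec3 rgb) {
    vec3 hsv = rgb2hsv(rgb);
    hsv.z *= 1.0 - sunDistance; // scale brightness toward 0.0 (black)
    return hsv2rgb(hsv);
}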
If you want to do some more image processing where neighbouring pixels are needed, I suggest this great tutorial, where you can easily do edge sharpening, blur, etc.: http://webglfundamentals.org/webgl/lessons/webgl-image-processing.html
original image
Hi guys, I think I found an unorthodox but easy way to desaturate an RGB image. We just need the average color of the pixel,
average_color = (R + G + B) / 3, keeping the alpha...
vec4(average_color, average_color, average_color, Alpha);
void main()
{
    // read the color of the current fragment from the texture
    lowp vec4 color_of_pixel = texture2D(texture_sampler, var_texcoord0.xy);
    // average red, green and blue; this keeps the "force" of the color
    lowp float average_color = (color_of_pixel.r + color_of_pixel.g + color_of_pixel.b) / 3.0;
    lowp vec4 color_of_pixel_final = vec4(average_color, average_color, average_color, color_of_pixel.a);
    gl_FragColor = color_of_pixel_final; // write the result to the output
}
Desaturated image
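If the evenly weighted average looks off (greens come out too dark, blues too bright), a common alternative, sketched here under the same shader setup, is a luma-weighted dot product:

// Rec. 709 luma weights: green contributes most to perceived brightness
lowp float luma = dot(color_of_pixel.rgb, vec3(0.2126, 0.7152, 0.0722));
gl_FragColor = vec4(luma, luma, luma, color_of_pixel.a);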
I've seen lots of tutorials on vignette shaders, just like these, but none of them say how to change the color of the vignette; they only talk about applying sepia or grey shaders to the whole composite image.
For example, the video above gives the code below for the vignette shader. How do I change the color of the vignette, so it's not black but red or orange, while the part of the image in the interior of the vignette remains unmodified?
const float outerRadius = .65, innerRadius = .4, intensity = .6;

void main() {
    vec4 color = texture2D(u_sampler2D, v_texCoord0) * v_color;
    vec2 relativePosition = gl_FragCoord.xy / u_resolution - .5;
    relativePosition.y *= u_resolution.x / u_resolution.y;
    float len = length(relativePosition);
    float vignette = smoothstep(outerRadius, innerRadius, len);
    color.rgb = mix(color.rgb, color.rgb * vignette, intensity);
    gl_FragColor = color;
}
In the shader you posted, the vignette value is basically a darkness value that's blended over the image, so in the line with the mix function it's just multiplied by the texture color.
To make this work with an arbitrary color, you need to change it into an opacity value (invert it). Now that it's an opacity, you can multiply it by intensity to simplify the next calculation, and finally blend toward the vignette color you choose.
So first declare the color you want before the main function.
const vec3 vignetteColor = vec3(1.0, 0.0, 0.0); //red
//this could be a uniform if you want to dynamically change it.
Then the two lines before gl_FragColor change to the following.
float vignetteOpacity = smoothstep(innerRadius, outerRadius, len) * intensity; //note inner and outer swapped to switch darkness to opacity
color.rgb = mix(color.rgb, vignetteColor, vignetteOpacity);
By the way, "intensity = .6f" will cause the shader not to compile on mobile, so remove the f. (Unless you target OpenGL ES 3.0, but that's not fully supported by libgdx yet.)
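Putting it all together, the modified shader might look like this (a sketch; the uniform and varying declarations stay exactly as in your original setup):

const float outerRadius = .65, innerRadius = .4, intensity = .6;
const vec3 vignetteColor = vec3(1.0, 0.0, 0.0); // red

void main() {
    vec4 color = texture2D(u_sampler2D, v_texCoord0) * v_color;
    vec2 relativePosition = gl_FragCoord.xy / u_resolution - .5;
    relativePosition.y *= u_resolution.x / u_resolution.y;
    float len = length(relativePosition);
    // 0.0 inside innerRadius, rising to intensity at outerRadius
    float vignetteOpacity = smoothstep(innerRadius, outerRadius, len) * intensity;
    color.rgb = mix(color.rgb, vignetteColor, vignetteOpacity);
    gl_FragColor = color;
}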
I'm trying to reduce the number of post-process textures I have to draw in my scene. The end goal is to support an SSAO shader. The shader requires depth, position and normal data. Currently I am storing the depth and normals in one float texture and the positions in another.
I've been doing some reading, and it seems possible to recover the position by simply using the depth stored in the normal texture: you unproject the x and y and multiply them by the depth value. I can't seem to get this right, however, and it's probably due to my lack of understanding...
So currently my positions are drawn to a position texture. This is what it looks like (this is currently working correctly):
Here is my new method. I pass the normal texture, which stores the normal's x, y and z in the RGB channels and the depth in the w channel. In the SSAO shader I need to get the position back, and this is how I'm doing it:
// viewport is a vec2 holding the viewport width and height
// invProj is the inverse projection matrix:
//     camera.projectionMatrixInverse.getInverse( camera.projectionMatrix )
vec3 get_eye_normal()
{
    vec2 frag_coord = gl_FragCoord.xy / viewport;
    frag_coord = (frag_coord - 0.5) * 2.0;           // to NDC, range [-1, 1]
    vec4 device_normal = vec4(frag_coord, 0.0, 1.0);
    return normalize((invProj * device_normal).xyz);
}
...
float srcDepth = texture2D(tNormalsTex, vUv).w;
vec3 eye_ray = get_eye_normal();
vec3 srcPosition = vec3( eye_ray.x * srcDepth , eye_ray.y * srcDepth , eye_ray.z * srcDepth );
//Previously was doing this:
//vec3 srcPosition = texture2D(tPositionTex, vUv).xyz;
However, when I render out the positions, it looks like this:
The SSAO looks very messed up using the new method. Any help would be greatly appreciated.
I was able to find a solution to this. You need to multiply the ray by the camera's far - near range (I was using the normalized depth value, but you need the world-space depth value).
I created a function to extract the position from the normal/depth texture like so:
First, in the depth capture pass (fragment shader):
float ld = length(vPosition) / linearDepth; //linearDepth is cam.far - cam.near
gl_FragColor = vec4( normalize( vNormal ).xyz, ld );
And now in the shader trying to extract the position...
// Gets the 3D world-space position from the normal texture,
// which stores depth in its w component.
vec3 get_world_pos( vec2 uv )
{
    vec2 frag_coord = uv;
    float depth = texture2D(tNormals, frag_coord).w;
    // scale the stored depth back up by linearDepth (cam.far - cam.near)
    float unprojDepth = depth * linearDepth - 1.0;
    frag_coord = (frag_coord - 0.5) * 2.0;           // to NDC, range [-1, 1]
    vec4 device_normal = vec4(frag_coord, 0.0, 1.0);
    vec3 eye_ray = normalize((invProj * device_normal).xyz);
    vec3 pos = eye_ray * unprojDepth;
    return pos;
}
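With that helper in place, the position lookup in the SSAO shader becomes a one-liner, replacing the old read from tPositionTex:

vec3 srcPosition = get_world_pos( vUv );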