What is "uniform samplerXX iChannel0..3" in this shader? - opengl-es

I happened to see this shader on shader toy.
https://www.shadertoy.com/view/ldf3W8
I wanted to know what:
uniform samplerXX iChannel0..3;
is?
I tried looking at the vertex shader, but I didn't find anything there.
Also how can I convert audio waves to a texture? (which is being done here)

I wanted to know what uniform samplerXX iChannel0..3; is?
Uniforms are externally set variables that keep the same value across all invocations of the shader during a primitive draw (a vertex shader gets called for each vertex a primitive consists of, a fragment shader for each fragment, which roughly translates to pixels, drawn to by the primitive).
Samplers are OpenGL's way of binding texture units to a shader. In the host OpenGL program you load the texture using glGenTextures, glActiveTexture, glBindTexture and glTexImage2D (and a bunch of other functions, but those are the important ones) and bind the texture unit selected with glActiveTexture to a sampler uniform. The samplerXX in Shadertoy's boilerplate is just a placeholder: XX stands for the actual sampler type, 2D or Cube, depending on what kind of input is bound to each of iChannel0 through iChannel3.
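A minimal sketch of how such a sampler uniform is declared and used in a Shadertoy-style fragment shader, assuming a 2D texture has been bound to texture unit 0 (the names iChannel0 and iResolution follow Shadertoy's conventions):

    precision mediump float;

    uniform sampler2D iChannel0;  // "samplerXX" resolved to sampler2D for a 2D input
    uniform vec3 iResolution;     // viewport resolution, supplied by the host program

    void main() {
        vec2 uv = gl_FragCoord.xy / iResolution.xy;  // normalized coordinates in [0, 1]
        gl_FragColor = texture2D(iChannel0, uv);     // fetch from the bound texture unit
    }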
Also how can I convert audio waves to a texture?
Textures are just interpolated lookup tables. You can place any data you like in a LUT. Most of the time a texture holds image data, but you can load PCM samples into it just as well. So you simply fetch PCM data from the audio API and upload it into a texture.
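Shadertoy exposes its audio input exactly this way, as a small texture. A hedged sketch of reading it in GLSL, assuming Shadertoy's 512x2 layout (first row holds the frequency spectrum, second row holds the PCM waveform):

    precision mediump float;

    uniform sampler2D iChannel0;  // audio input bound as a 512x2 texture (assumed layout)
    uniform vec3 iResolution;

    void main() {
        vec2 uv = gl_FragCoord.xy / iResolution.xy;
        // Sample the waveform row (y ~ 0.75 falls in the second of the two rows).
        float wave = texture2D(iChannel0, vec2(uv.x, 0.75)).x;
        // Draw a simple oscilloscope trace.
        float line = 1.0 - smoothstep(0.0, 0.02, abs(uv.y - wave));
        gl_FragColor = vec4(vec3(line), 1.0);
    }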

Related

How to store and access per fragment attributes in WebGL

I am doing a particle system in WebGL using Three.js, and I want to do all the computation of the particles in the shaders. To achieve that, the positions (for example) of the particles are stored in a texture which is sampled by the vertex shader of each particle (POINT primitive).
The position texture is in fact two render targets which are swapped each frame after being updated off screen. Each pixel of this texture represents a particle.
To update a position, I read one of the render targets (texture2D), do some computation, and write to the other render target (fragment output).
To perform the "do some computation" step, I need some per-particle attributes, like velocity (and a lot of others). Since this step is done in the fragment shader, I can't use the vertex attribute buffers, so I have to store these properties in separate textures and sample each of them in the fragment shader.
It works, but sampling textures is slow as far as I know, and I wonder if there are better ways to do this, like having one vertex per particle, each rendering a single fragment of the position texture.
I know that OpenGL 4 has some alternative ways to deal with this, like UBOs or SSBOs, but I'm not sure about WebGL.
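For context, a minimal sketch of the ping-pong update pass described above; all names (uPositions, uVelocities, uDeltaTime, vUv) are hypothetical:

    precision highp float;

    uniform sampler2D uPositions;   // previous frame's position render target
    uniform sampler2D uVelocities;  // per-particle attribute stored as a texture
    uniform float uDeltaTime;
    varying vec2 vUv;               // addresses one texel (= one particle)

    void main() {
        vec3 pos = texture2D(uPositions, vUv).xyz;
        vec3 vel = texture2D(uVelocities, vUv).xyz;
        // Integrate and write into the *other* render target; the host
        // program swaps the two targets after each update.
        gl_FragColor = vec4(pos + vel * uDeltaTime, 1.0);
    }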

different OpenGL ES vertex color interpolation modes?

I'm trying to implement a simple linear gradient like in Photoshop. The color interpolation between vertices seems to go by (additive?) numerical value rather than what you would expect from "paint blending." Here is a visual example with green and red:
The one on the left is roughly what I get, and I want to achieve the one on the right.
Is there any easy way to achieve this?
As @Andon commented, using the texture system is a good way to do this. Here's what you need:
Assign one (or more, but you only need one for this trick) texture coordinate(s) to each vertex when you set up attributes in your vertex buffer.
In your vertex shader, read that attribute and write it to a varying so it gets interpolated for use in the fragment shader.
In your fragment shader, read the varying -- this tells you how far along the gradient ramp you should be in the current fragment; i.e. a blending factor.
At this point, you have two choices:
Use a texture image that is a one-texel-tall gradient ramp (OpenGL ES has no true 1D textures, so an Nx1 2D texture does the job), and look it up with the texture2D shader function and the varying texture coordinate you got. This will fetch the corresponding texel color so you can output it to gl_FragColor.
Calculate the color blend in the fragment shader. If you pass in the endpoint colors as uniforms, you combine them according to the blending factor using whatever math you can do in GLSL (including things like Photoshop blend modes); see the sketch after this list.
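A minimal sketch of the second option, assuming hypothetical attribute/uniform names (aRamp, uColorA, uColorB):

    // Vertex shader
    attribute vec4 aPosition;
    attribute float aRamp;     // 0.0 at one end of the gradient, 1.0 at the other
    varying float vRamp;

    void main() {
        vRamp = aRamp;         // interpolated across the primitive
        gl_Position = aPosition;
    }

    // Fragment shader
    precision mediump float;

    uniform vec3 uColorA;      // gradient endpoint colors
    uniform vec3 uColorB;
    varying float vRamp;

    void main() {
        // mix() blends linearly; swap in any other blend math here.
        gl_FragColor = vec4(mix(uColorA, uColorB, vRamp), 1.0);
    }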

Is a varying a pixel?

I am writing a fragment shader for an OpenGL ES application, and I'm trying to clarify the difference between a pixel and a varying.
A varying in OpenGL ES is an optional, user-defined output passed from the vertex shader to the fragment shader and interpolated across the primitive (e.g. a surface normal when doing per-pixel lighting). It is used to calculate the final fragment color (gl_FragColor) within the fragment shader. While a final color can be computed in the vertex shader (e.g. per-vertex lighting) and passed along as a varying, this is not the norm and depends on your desired shader behaviour.
A pixel is simply the smallest measured unit of an image or screen. The OpenGL ES pipeline produces fragments (raw data) which are then converted (or not) to pixels, depending on their visibility, depth, stencil, colour, etc.
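A minimal sketch of a varying carrying a normal from the vertex to the fragment shader, assuming hypothetical attribute names (aPosition, aNormal) and a fixed light direction:

    // Vertex shader: the varying is written once per vertex...
    attribute vec4 aPosition;
    attribute vec3 aNormal;
    varying vec3 vNormal;

    void main() {
        vNormal = aNormal;
        gl_Position = aPosition;
    }

    // Fragment shader: ...and read once per fragment, interpolated in between.
    precision mediump float;
    varying vec3 vNormal;

    void main() {
        // Simple per-pixel diffuse term from the interpolated normal.
        float diffuse = max(dot(normalize(vNormal), vec3(0.0, 0.0, 1.0)), 0.0);
        gl_FragColor = vec4(vec3(diffuse), 1.0);
    }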

Relation between shader and UV texture mapping

I'm using OpenGL ES + GLKit. I've never been this low-level before in my life, so I still have to learn a lot of things. I've developed Unity games before, where you just give it a .obj file and a corresponding texture and it's done. (The UV mapping happens to be inside the .obj file?)
I want to develop a kind of special Toon Shader with some different characteristics for use with 3D model. So I need to write a vertex shader (.vsh) and fragment shader (.fsh) right?
However, as far as I know, in order to apply a texture to a model with the correct UV coordinates, you have to do this in the shader? (Am I right?) With a "texture shader".
So, If I want to both apply the texture with UV mapping then apply my special Toon Shader, I have to write both in the same shader? There is no way I can create a plug-and-play Toon shader so I can use it with anything?
As a side question, in what format are UV coordinates stored, and how can I get them into a shader program? What kind of attribute variable?
So I need to write a vertex shader (.vsh) and fragment shader (.fsh)
right?
Yes.
However, I just know that in order to apply a texture to a model with
correct UV coordinate
True
There is no way I can create a plug-and-play Toon shader so I can use
it with anything?
Check Uber-Shaders
and how can I take that in to a shader program? What kind of attribute
variable?
You define your attributes in the shader yourself. Check this GLSL tutorial
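A hedged sketch of what a combined texture + toon pass might look like, with hypothetical names (aTexCoord, uMvpMatrix, uTexture); the UV lookup and the toon effect simply live in the same fragment shader:

    // Vertex shader (.vsh)
    attribute vec4 aPosition;
    attribute vec2 aTexCoord;   // UV coordinates loaded from the model data
    uniform mat4 uMvpMatrix;
    varying vec2 vTexCoord;

    void main() {
        vTexCoord = aTexCoord;
        gl_Position = uMvpMatrix * aPosition;
    }

    // Fragment shader (.fsh)
    precision mediump float;

    uniform sampler2D uTexture;
    varying vec2 vTexCoord;

    void main() {
        vec4 texel = texture2D(uTexture, vTexCoord);
        // Toon effect: quantize the sampled color into a few discrete bands.
        vec3 toon = floor(texel.rgb * 4.0) / 4.0;
        gl_FragColor = vec4(toon, texel.a);
    }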

Opengl-es 1.1 change pixel data on the FBO

Is there a way to get the FBO pixel data, greyscale it fast, and put the image back into that FBO?
If you're using the fixed-function pipeline (ES 1.1), you can use glReadPixels to pull pixel data off the GPU so you can process it directly. Then you'd need to create a texture from that result, and render a quad mapped to the new texture. But this is a fairly inefficient way of accomplishing the result.
If you're using shaders (ES 2.0), you can do this on the GPU directly, which is faster. That means doing the greyscaling in a fragment shader in one of a few ways:
If your rendering is simple to begin with, you can add the greyscale math in your normal fragment shader, and perhaps toggle it with a boolean uniform variable.
If you don't want to mess with greyscale in your normal pipeline, you can render normally to an offscreen FBO (texture), and then render the contents of that texture to the screen's FBO using a special greyscale texturing shader that does the math on sampled texels.
Here's the greyscale math if you need it: https://web.archive.org/web/20141230145627/http://bobpowell.net/grayscale.aspx Essentially, plug the RGB values into that formula, and use the resulting luminance value in all your channels.
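A minimal sketch of the greyscale texturing shader from the second approach, assuming hypothetical names (uScene, vTexCoord) and the commonly used Rec. 601 luma weights:

    precision mediump float;

    uniform sampler2D uScene;   // the offscreen FBO's color texture
    varying vec2 vTexCoord;

    void main() {
        vec4 texel = texture2D(uScene, vTexCoord);
        // Weighted sum of R, G and B approximates perceived luminance.
        float luma = dot(texel.rgb, vec3(0.299, 0.587, 0.114));
        gl_FragColor = vec4(vec3(luma), texel.a);
    }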
