Fragment shader for 3D-Textures - opengl-es

I want to use a fragment shader to write back into a 3D sampler: an array of images is given as input (a uniform sampler3D) to my OpenGL ES application, and I want to perform some per-pixel operations on it. Can I use a fragment shader to calculate values for a 3D texture?

Related

Is there a way to access a vector by coordinates as you would a texture in GLSL?

I am implementing a feature extraction algorithm with OpenGL ES 3.0 (given an input texture with some 1's and mostly 0's, produce an output texture that has feature regions labeled). The problem I face in my fragment shader is how to perform a “lookup” on an intermediate vec or float rather than a sampler.
Conceptually every vec or float is just a texel-valued function, so there ought to be a way to get its value given texel coordinates, something like textureLikeLookup(texel_valued, v_color) - but I haven’t found anything that parallels the texture* functions.
The options I see are:
Render my vector to a framebuffer and pass that as a texture into another shader/rendering pass - undesirable because I have many such passes in a loop, and I want to avoid interleaving CPU calls;
Switch to ES 3.1 and take advantage of imageStore (https://www.khronos.org/registry/OpenGL-Refpages/es3.1/html/imageStore.xhtml) - it seems clear that if I can update an intermediate image within my shader then I can achieve this within the fragment shader (cf. https://www.reddit.com/r/opengl/comments/279fc7/glsl_frag_update_values_to_a_texturemap_within/; see the sketch at the end of this question), but I would rather stick to 3.0 if possible.
Is there a better/natural way to deal with this problem? In other words, do something like this
// works, because tex_sampler is a sampler2D
vec4 texel_valued = texture(tex_sampler, v_color);
when the data is not a sampler2D but a vec:
// doesn't work, because texel_valued is not a sampler but a vec4
vec4 oops = texture(texel_valued, v_color);
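
For reference, the ES 3.1 imageStore route from the second option might look roughly like the sketch below. The image and uniform names (u_scratch, u_scratchSize) are made up here, and fragment-shader image stores are an optional feature in ES 3.1 (GL_MAX_FRAGMENT_IMAGE_UNIFORMS may be 0), so treat this as a sketch rather than a drop-in solution:

#version 310 es
precision highp float;
// Writable image playing the role of the intermediate "texel-valued" data;
// its size is passed in separately to keep the sketch simple.
layout(rgba32f, binding = 0) writeonly uniform highp image2D u_scratch;
uniform ivec2 u_scratchSize;
in vec2 v_color;
out vec4 fragColor;

void main() {
    vec4 texel_valued = vec4(1.0);                       // some computed intermediate value
    ivec2 coord = ivec2(v_color * vec2(u_scratchSize));  // texel coordinates to write to
    imageStore(u_scratch, coord, texel_valued);          // store it so a later pass can read it
    fragColor = texel_valued;
}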

Three.js - get image dimensions and pass to fragment shader

I'm sending a bitmap as a uniform to a fragment shader with THREE.ImageUtils.loadTexture(). What's the simplest way for the fragment shader to access the size of the image in pixels?
Pass the values you want to the shader as uniforms.
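
On the GLSL side that could look something like the sketch below; the uniform and varying names (u_image, u_imageSize, v_uv) are made up here, and the JavaScript side just sets u_imageSize to the image's width and height once the texture has loaded:

precision mediump float;
uniform sampler2D u_image;   // the texture loaded with THREE.ImageUtils.loadTexture()
uniform vec2 u_imageSize;    // image width/height in pixels, set from JavaScript
varying vec2 v_uv;

void main() {
    vec2 onePixel = 1.0 / u_imageSize;                              // size of one texel in UV space
    vec4 here  = texture2D(u_image, v_uv);
    vec4 right = texture2D(u_image, v_uv + vec2(onePixel.x, 0.0));  // the neighbouring pixel
    gl_FragColor = mix(here, right, 0.5);
}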

What is "uniform samplerXX iChannel0..3" in this shader?

I happened to see this shader on Shadertoy:
https://www.shadertoy.com/view/ldf3W8
I wanted to know what:
uniform samplerXX iChannel0..3;
is?
I tried to look at the vertex shader, but I didn't find anything there.
Also how can I convert audio waves to a texture? (which is being done here)
I wanted to know what uniform samplerXX iChannel0..3; is?
Uniforms are externally set variables that have the same value across all invocations of the shader during a primitive draw: a vertex shader is invoked for each vertex the primitive consists of, and a fragment shader for each fragment (roughly, each pixel) the primitive draws to.
Samplers are OpenGL's way of binding texture units to a shader. In the actual OpenGL program you load the texture using glGenTextures, glActiveTexture, glBindTexture and glTexImage (and a bunch of other functions, but those are the important ones), and you bind the texture unit selected with glActiveTexture to a sampler uniform.
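In other words, the samplerXX in the question is just a placeholder for a concrete sampler type. On Shadertoy each channel is declared for you with something like the following (sampler2D for image, buffer and audio inputs, samplerCube for cubemap inputs; the exact type depends on what you bind to the channel):

uniform sampler2D iChannel0;   // e.g. an image or audio texture bound to channel 0
uniform sampler2D iChannel1;
uniform samplerCube iChannel2; // e.g. a cubemap bound to channel 2
uniform sampler2D iChannel3;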
Also how can I convert audio waves to a texture?
Textures are just interpolated lookup tables. You can place any data you like in a LUT. Most of the time a texture holds image data, but you can load PCM samples into it just as well. So you simply fetch PCM data from the audio API and upload it to a texture.
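As a rough sketch of the shader side, once the PCM (or FFT) data has been uploaded as a one-row texture you read it back like any other texture; the name u_audio is illustrative, and on Shadertoy the equivalent lookup goes through one of the iChannel samplers:

precision mediump float;
uniform sampler2D u_audio;   // e.g. a 512x1 texture whose red channel holds PCM samples
varying vec2 v_uv;

void main() {
    // x selects which sample to read; y is fixed because the texture is one row high
    float amplitude = texture2D(u_audio, vec2(v_uv.x, 0.5)).r;
    gl_FragColor = vec4(vec3(amplitude), 1.0);   // visualise the waveform as brightness
}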

different OpenGL ES vertex color interpolation modes?

I'm trying to implement a simple linear gradient like in Photoshop. The color interpolation between vertices seems to go by (additive?) numerical value rather than what you would expect from "paint blending." Here is a visual example with green and red:
The one on the left is roughly what I get, and I want to achieve the one on the right.
Is there any easy way to achieve this?
As @Andon commented, using the texture system is a good way to do this. Here's what you need:
Assign one (or more, but you only need one for this trick) texture coordinate(s) to each vertex when you set up attributes in your vertex buffer.
In your vertex shader, read that attribute and write it to a varying so it gets interpolated for use in the fragment shader.
In your fragment shader, read the varying -- this tells you how far along the gradient ramp you should be in the current fragment; i.e. a blending factor.
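Put together, the first three steps might look roughly like this on the vertex side (the attribute, uniform and varying names are made up for the sketch):

attribute vec4 a_position;
attribute float a_gradient;   // 0.0 at one end of the gradient, 1.0 at the other
uniform mat4 u_mvp;
varying float v_gradient;     // interpolated across the triangle for the fragment shader

void main() {
    v_gradient = a_gradient;
    gl_Position = u_mvp * a_position;
}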
At this point, you have two choices:
Use a 1D gradient image (in ES this is just an N×1 2D texture, since there are no true 1D textures) that looks like the gradient you want, and look it up with the texture2D shader function and the varying texture coordinate you got. This will fetch the corresponding texel color so you can output it to gl_FragColor.
Calculate the color blend in the fragment shader. If you pass the endpoint colors into your shader as uniforms, you can combine them according to the blending factor using whatever math you can do in GLSL (including things like Photoshop blend modes).
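The second choice might look roughly like this on the fragment side (uniform names are illustrative; swap the mix() for whatever blend math you prefer):

precision mediump float;
uniform vec4 u_colorA;        // gradient start colour
uniform vec4 u_colorB;        // gradient end colour
varying float v_gradient;     // the interpolated blending factor from the vertex shader

void main() {
    // Plain linear blend; for the first choice you would instead sample an
    // Nx1 gradient texture, e.g. texture2D(u_gradientTex, vec2(v_gradient, 0.5)).
    gl_FragColor = mix(u_colorA, u_colorB, v_gradient);
}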

Is a varying a pixel?

I am writing a fragment shader for an OpenGL ES application, and I'm trying to clarify the difference between a pixel and a varying.
A varying in OpenGL ES is an optional, user-defined output passed from the vertex shader to the fragment shader (e.g. a surface normal when doing per-pixel lighting). It is used to calculate the final fragment color (gl_FragColor) within the fragment shader. While a final color can be output from the vertex shader as a varying (e.g. when doing per-vertex lighting), that is not the norm and depends on your desired shader behaviour.
A pixel is simply the smallest measured unit of an image or screen. The OpenGL ES pipeline produces fragments (raw data) which are then converted (or not) to pixels, depending on their visibility, depth, stencil, colour, etc.
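A minimal sketch of that relationship, using the surface-normal example from above (the attribute, uniform and varying names are illustrative):

// Vertex shader: writes the varying once per vertex.
attribute vec4 a_position;
attribute vec3 a_normal;
uniform mat4 u_mvp;
varying vec3 v_normal;

void main() {
    v_normal = a_normal;
    gl_Position = u_mvp * a_position;
}

// Fragment shader: receives the varying interpolated for each fragment.
precision mediump float;
varying vec3 v_normal;

void main() {
    float light = max(dot(normalize(v_normal), vec3(0.0, 0.0, 1.0)), 0.0);
    gl_FragColor = vec4(vec3(light), 1.0);   // becomes a pixel only if the fragment survives depth/stencil/blending
}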
