Android OpenGL ES 1.x camera preview to SurfaceTexture

I am trying to send the camera preview to a SurfaceTexture object and render it on a square. I have working code for GLES20, but couldn't find anything for 1.x.
Basically it should work like this, right?
// setup texture
gl.glActiveTexture(GL10.GL_TEXTURE0);
gl.glGenTextures(1, textures, 0);
gl.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[0]);
gl.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, ...);
...
// setup surfacetexture object
surface = new SurfaceTexture(textures[0]);
surface.setOnFrameAvailableListener(this);
// setup camera
mCamera = Camera.open(0);
Camera.Parameters param = mCamera.getParameters();
List<Size> psize = param.getSupportedPreviewSizes();
//find previewsize to match glsurface from renderer
param.setPreviewSize(psize.get(i).width, psize.get(i).height);
mCamera.setParameters(param);
// set the texture and start preview
mCamera.setPreviewTexture(surface);
mCamera.startPreview();
// in the "onFrameAvailable" handler, i switch a flag to mark a new frame
updateSurface = true;
// and in the renderloop i update and redraw
if (updateSurface) {
surface.updateTexImage();
updateSurface = false;
}
gl.glActiveTexture(GL10.GL_TEXTURE0);
gl.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[0]);
// Draw square
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBufferFloor);
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length / 3);
The square gets drawn but is completely white. I don't get any GL errors or other exceptions, and the onFrameAvailable handler does get called.
If I upload a loaded bitmap with glTexImage2D instead, it is drawn correctly on the square.
Any ideas? Thank you!

I'm facing the same problem. Maybe I'm wrong, but it seems that SurfaceTexture is not compatible with GLES10: SurfaceTexture uses GL_TEXTURE_EXTERNAL_OES, which normally requires a custom fragment shader that can sample this texture ("#extension GL_OES_EGL_image_external : require").
Since glUseProgram(...) etc. are not available in GLES10, we cannot use custom shaders.
As I said, maybe I'm wrong... Good luck.
EDIT: I finally got it to work. In GLES 1.x you have to enable the external texture target before drawing: gl.glEnable(GLES11Ext.GL_TEXTURE_EXTERNAL_OES);

Related

Three.js - Stencil only on certain objects

I'm looking into making a kind of portal effect using Three.js.
The main idea is to be able to see another scene through multiple windows.
Exactly like in this example:
https://www.ronja-tutorials.com/post/022-stencil-buffers/ (see the gif in the article, too big to upload here)
I found an exact example of what I'm trying to do using three.js.
Here is the link:
https://sites.google.com/site/cgwith3js/home/masking-with-stencil
The fiddle was not working, but I changed it to make it work:
http://jsfiddle.net/yzhreu6p/23/
scene = new THREE.Scene();
sceneStencil = new THREE.Scene();
...
scene.add(box); // red one
...
function animate() {
    requestAnimationFrame(animate);
    renderer.clear();
    // animate the box
    box.position.x = Math.cos(clock.getElapsedTime()) * 10;
    var gl = renderer.getContext();
    // enable stencil test
    gl.enable(gl.STENCIL_TEST);
    //renderer.state.setStencilTest( true );
    // config the stencil buffer to collect data for testing
    gl.stencilFunc(gl.ALWAYS, 1, 0xff);
    gl.stencilOp(gl.REPLACE, gl.REPLACE, gl.REPLACE);
    // render shape for stencil test
    renderer.render(sceneStencil, camera);
    // set stencil buffer for testing
    gl.stencilFunc(gl.EQUAL, 1, 0xff);
    gl.stencilOp(gl.KEEP, gl.KEEP, gl.KEEP);
    // render actual scene
    renderer.render(scene, camera);
    // disable stencil test
    gl.disable(gl.STENCIL_TEST);
}
Then I reused the code in my project, where I have a kind of town with buildings, and the hidden scene has a nyan-cat-textured box.
The problem I'm having is that the planes disappear when a building is behind them, and, more seriously, when there are buildings behind the nyan cat texture, the texture shows through the buildings.
To better explain, here is an image.
I'm looking for a solution where the images are not visible inside the buildings. I found people talking about stencilMask, but it's new to me.
Do I need to create another scene to make it work independently?
If someone can help me, thank you for reading.
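One possible render ordering that should keep the hidden scene from showing through foreground geometry is sketched below (untested, and not taken from the fiddle; townScene, windowScene and hiddenScene are placeholder names, and the renderer is assumed to be created with a stencil buffer and autoClear = false): draw the town first so its depth is in the buffer, write the stencil from the window planes while the depth test is still active, then draw the hidden scene only where the stencil passed.
function render() {
    requestAnimationFrame(render);
    var gl = renderer.getContext();
    renderer.clear(true, true, true);            // color, depth, stencil

    // 1) draw the town normally so its depth ends up in the depth buffer
    renderer.render(townScene, camera);

    // 2) draw only the window planes: write 1 into the stencil where a plane is
    //    actually visible (the depth test against the buildings is still active),
    //    but don't touch the color buffer
    gl.enable(gl.STENCIL_TEST);
    gl.stencilFunc(gl.ALWAYS, 1, 0xff);
    gl.stencilOp(gl.KEEP, gl.KEEP, gl.REPLACE);  // REPLACE only where the depth test passes
    gl.colorMask(false, false, false, false);
    renderer.render(windowScene, camera);
    gl.colorMask(true, true, true, true);

    // 3) draw the hidden scene only where the stencil is 1, i.e. through the
    //    visible parts of the windows; clear depth so it can use its own
    gl.clear(gl.DEPTH_BUFFER_BIT);
    gl.stencilFunc(gl.EQUAL, 1, 0xff);
    gl.stencilOp(gl.KEEP, gl.KEEP, gl.KEEP);
    renderer.render(hiddenScene, camera);

    gl.disable(gl.STENCIL_TEST);
}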

Reading Pixels in WebGL 2 as Float values

I need to read the pixels of my framebuffer as float values.
My goal is to get a fast transfer of lots of particles between CPU and GPU and process them in realtime. For that I store the particle properties in a floating point texture.
Whenever a new particle is added, I want to get the current particle array back from the texture, add the new particle properties and then fit it back into the texture (this is the only way I could think of to dynamically add particles and process them GPU-wise).
I am using WebGL 2 since it supports reading back pixels to a PIXEL_PACK_BUFFER target. I test my code in Firefox Nightly. The code in question looks like this:
// Initialize the WebGLBuffer
this.m_particlePosBuffer = gl.createBuffer();
gl.bindBuffer(gl.PIXEL_PACK_BUFFER, this.m_particlePosBuffer);
gl.bindBuffer(gl.PIXEL_PACK_BUFFER, null);
...
// In the renderloop, bind the buffer and read back the pixels
gl.bindBuffer(gl.PIXEL_PACK_BUFFER, this.m_particlePosBuffer);
gl.readBuffer(gl.COLOR_ATTACHMENT0); // Framebuffer texture is bound to this attachment
gl.readPixels(0, 0, _texSize, _texSize, gl.RGBA, gl.FLOAT, 0);
I get this error in my console:
TypeError: Argument 7 of WebGLRenderingContext.readPixels could not be converted to any of: ArrayBufferView, SharedArrayBufferView.
But looking at the current WebGL 2 Specification, this function call should be possible. Using the type gl.UNSIGNED_BYTE also returns this error.
When I try to read the pixels in an ArrayBufferView (which I want to avoid since it seems to be way slower) it works with the format/type combination of gl.RGBA and gl.UNSIGNED_BYTE for a Uint8Array() but not with gl.RGBA and gl.FLOAT for a Float32Array() - this is as expected since it's documented in the WebGL Specification.
I am thankful for any suggestions on how to get my float pixel values from my framebuffer or on how to otherwise get this particle pipeline going.
Did you try using this extension?
var ext = gl.getExtension('EXT_color_buffer_float');
The gl you have is WebGL 1, not WebGL 2. Try:
var gl = document.getElementById("canvas").getContext('webgl2');
In WebGL 2 the signature for readPixels is:
void gl.readPixels(x, y, width, height, format, type, ArrayBufferView pixels, GLuint dstOffset);
so
let data = new Uint8Array(gl.drawingBufferWidth * gl.drawingBufferHeight * 4);
gl.readPixels(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight, gl.RGBA, gl.UNSIGNED_BYTE, data, 0);
https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/readPixels
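Putting the three suggestions together, a rough WebGL 2 sketch of reading float pixels back through a PIXEL_PACK_BUFFER could look like this (texSize and the render-target setup are placeholders, not the asker's code; EXT_color_buffer_float is needed to render to and read from a float attachment):
var gl = document.getElementById("canvas").getContext("webgl2");
gl.getExtension("EXT_color_buffer_float");   // makes RGBA32F attachments renderable/readable

// float texture used as the framebuffer's color attachment
var texSize = 256;
var tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texStorage2D(gl.TEXTURE_2D, 1, gl.RGBA32F, texSize, texSize);
var fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);

// ... render the particle data into the framebuffer here ...

// read the attachment into a PIXEL_PACK_BUFFER (the data stays on the GPU so far)
var packBuffer = gl.createBuffer();
gl.bindBuffer(gl.PIXEL_PACK_BUFFER, packBuffer);
gl.bufferData(gl.PIXEL_PACK_BUFFER, texSize * texSize * 4 * 4, gl.STREAM_READ); // RGBA * 4 bytes
gl.readBuffer(gl.COLOR_ATTACHMENT0);
gl.readPixels(0, 0, texSize, texSize, gl.RGBA, gl.FLOAT, 0);  // 0 = byte offset into the pack buffer

// only when the values are actually needed on the CPU:
var pixels = new Float32Array(texSize * texSize * 4);
gl.getBufferSubData(gl.PIXEL_PACK_BUFFER, 0, pixels);
gl.bindBuffer(gl.PIXEL_PACK_BUFFER, null);
Note that gl.getBufferSubData() is what finally copies the data to the CPU, so calling it immediately still stalls; deferring it behind a fence (gl.fenceSync(gl.SYNC_GPU_COMMANDS_COMPLETE, 0)) avoids blocking on the GPU.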

Threejs - Applying simple texture on a shader material

Using Three.js (r67) with the WebGL renderer, I can't seem to get a plane with a shader material to display its texture. No matter what I do, the material just stays black.
My code at the moment looks quite basic:
var grassT = new Three.Texture(grass); // grass is an already loaded image.
grassT.wrapS = grassT.wrapT = Three.ClampToEdgeWrapping;
grassT.flipY = false;
grassT.minFilter = Three.NearestFilter;
grassT.magFilter = Three.NearestFilter;
grassT.needsUpdate = true;
var terrainUniforms = {
    grassTexture: { type: "t", value: grassT }
};
Then I just have this relevant part in the vertex shader:
vUv = uv;
And on the fragment shader side:
gl_FragColor = texture2D(grassTexture, vUv);
This results in:
Black material.
No error in console.
gl_FragColor value is always (0.0, 0.0, 0.0, 1.0).
What I tried / checked:
Everything works fine if I just apply custom plain colors.
All is OK if I use vertexColors with plain colors too.
My texture width / height is indeed a power of 2.
The image is on the same server as the code.
Tested other images, with the same result.
The image is actually loading in the browser debugger.
UVs for the mesh are correct.
Played around with wrapT, wrapS, minFilter, magFilter.
Adapted the mesh size so the texture has a 1:1 ratio.
Preloaded the image with the RequireJS image plugin and created the texture with new THREE.Texture() instead of using THREE.ImageUtils.loadTexture().
Played around with needsUpdate: true.
Tried to add defines['USE_MAP'] during material instantiation.
Tried to add material.dynamic = true.
I have a correct rendering loop (interaction with the terrain is working).
What I still wonder:
It's a multiplayer game using a custom port with Express + socket.io. Am I hitting any WebGL security policy?
I have no lighting logic at the moment; is that a problem?
Maybe the shader material needs other "defines" at instantiation?
I guess I'm overlooking something simple; that's why I'm asking...
Thanks.
I am applying various effects on the same shader. I have a custom API that merges the uniforms of all the different effects simply by using Three.UniformsUtils.merge(). However, this function calls the clone() method on the texture, and that resets needsUpdate to false before the texture reaches the renderer.
It appears that you should set the texture's needsUpdate property to true at the material level. If you set it at the texture level and the uniform later gets merged, and therefore cloned, it loses its needsUpdate flag.
The issue is also detailled here: https://github.com/mrdoob/three.js/issues/3393
In my case the following wasn't working (grassT is my texture):
grassT.needsUpdate = true
while the following works perfectly later on in the code:
material.uniforms.grassTexture.value.needsUpdate = true;
Image loading is asynchronous. Most likely, you are rendering your scene before the texture image loads.
You must set the texture.needsUpdate flag to true after the image loads. three.js has a utility that will do that for you:
var texture = THREE.ImageUtils.loadTexture( "texture.jpg" );
Once rendered, the renderer sets the texture.needsUpdate flag back to false.
three.js r.68
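For reference, a minimal r67-era setup that puts the pieces from both answers together might look like this (a sketch only; the shader strings and the "grass.jpg" path are illustrative, not the asker's actual code, and scene is assumed to exist):
var grassTexture = THREE.ImageUtils.loadTexture("grass.jpg"); // sets needsUpdate once the image has loaded

var material = new THREE.ShaderMaterial({
    uniforms: {
        grassTexture: { type: "t", value: grassTexture }
    },
    vertexShader: [
        "varying vec2 vUv;",
        "void main() {",
        "    vUv = uv;",
        "    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
        "}"
    ].join("\n"),
    fragmentShader: [
        "uniform sampler2D grassTexture;",
        "varying vec2 vUv;",
        "void main() {",
        "    gl_FragColor = texture2D( grassTexture, vUv );",
        "}"
    ].join("\n")
});

var plane = new THREE.Mesh(new THREE.PlaneGeometry(10, 10), material);
scene.add(plane);
If the uniforms later go through THREE.UniformsUtils.merge(), set material.uniforms.grassTexture.value.needsUpdate = true afterwards, as described above.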

Does WebGL drawArrays() empty/discard the buffers?

Trying to speed up the display of many near-identical objects in WebGL, I tried (naively, I guess) to re-use the buffer contents. In the drawing routine of each object, I have (somewhat simplified):
if (!dataBuffered) {
    dataBuffered = true;
    // ... gl stuff here: texture loading, buffer binding and filling ...
}
// set projection and model-view matrices
gl.uniformMatrix4fv (shaderProgram.uPMatrix, false, pMatrix);
gl.uniformMatrix4fv (shaderProgram.uMVMatrix, false, mvMatrix);
// draw rectangle filled with texture
gl.drawArrays(gl.TRIANGLE_STRIP, 0, starVertexPositionBuffer.numItems);
My idea was that the texture, vertex and texture-coordinate buffers stay the same, and only the model-view matrix changes (the same object in different places). But, alas, nothing shows up. When I comment out the dataBuffered = true line, everything is visible.
So my question is: does drawArrays() discard or empty the buffers? What else is happening? (I'm working through the lessons at learningwebgl.com, if that matters.)
The short answer is: yes, you can reuse all the state you've set up for more than one gl.drawArrays() call.
http://omino.com/experiments/webgl/simplestWebGlReuseBuffers.html is a little example where it just changes one uniform float (Y-scale) and redraws, twice for every tick.
(In this case there are no textures, but other state stays sticky as well.)
Hope that helps!
uniformSetFloat(gl,prog,"scaleY",1.0);
gl.drawArrays(gl.TRIANGLES, 0, posPoints.length / 3);
uniformSetFloat(gl,prog,"scaleY",0.2);
gl.drawArrays(gl.TRIANGLES, 0, posPoints.length / 3);
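Applied to the question's setup, the buffers and the texture can stay bound while only the uniforms change between draw calls. A sketch, reusing the question's variable names and the old glMatrix-0.9-style mat4 helpers used by the learningwebgl lessons:
// buffers, attribute pointers and the texture were set up and bound once, earlier
gl.uniformMatrix4fv(shaderProgram.uPMatrix, false, pMatrix);

mat4.identity(mvMatrix);
mat4.translate(mvMatrix, [-2.0, 0.0, -10.0]);   // first instance
gl.uniformMatrix4fv(shaderProgram.uMVMatrix, false, mvMatrix);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, starVertexPositionBuffer.numItems);

mat4.identity(mvMatrix);
mat4.translate(mvMatrix, [2.0, 0.0, -10.0]);    // same buffers, different place
gl.uniformMatrix4fv(shaderProgram.uMVMatrix, false, mvMatrix);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, starVertexPositionBuffer.numItems);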

ThreeJS object outlines and masking

I have a problem with masking in Three.js.
I want to have an outline around an object, and I did it using this tutorial:
http://www.codeproject.com/Articles/8499/Generating-Outlines-in-OpenGL
I wrote this code:
renderer.autoClear = false;
...
renderer.render(scene, camera);
...
var gl = this.world.renderer.domElement.getContext('webgl') || this.world.renderer.domElement.getContext('experimental-webgl');
gl.clearStencil(0);
gl.clear(gl.STENCIL_BUFFER_BIT);
gl.enable(gl.STENCIL_TEST);
gl.stencilFunc(gl.ALWAYS, 1, 1);
gl.stencilOp(gl.KEEP, gl.REPLACE, gl.REPLACE);
gl.colorMask(0, 0, 0, 0);
renderer.render(sceneMask, camera);
gl.colorMask(1, 1, 1, 1);
gl.stencilFunc(gl.NOTEQUAL, 1, 1);
gl.stencilOp(gl.KEEP, gl.REPLACE, gl.REPLACE);
renderer.render(sceneOutlines, camera);
gl.disable(gl.STENCIL_TEST);
and it works like a charm.
But I want the outline to be thicker. On Windows, web browsers use ANGLE and DirectX, so I can't render thicker lines there.
(I know that I can use an object scaled along its vertex normals, but that way the outline would be thicker in some places and thinner in others.)
Then I got the idea to blur the outline.
I found this tutorial:
http://stemkoski.blogspot.com/2013/03/using-shaders-and-selective-glow.html
and I added a MaskPass before rendering the scene with the objects that should be blurred.
What happened then? Nothing.
I tried inverting the mask and disabling buffer clears for the mask and render passes, but overall I don't know what I'm doing.
Here is a jsFiddle with an example that I made:
http://jsfiddle.net/9MtGR/15/
It looks like the outline works, but I'm using an additive shader, and the green cube (that should act as the outline) is added on top of the red cube (that should receive the outline).
Is it possible to use Three.js masking in such a way that the red cube gets a green blurred outline?
Or maybe is there another way to get the same effect without using Three.js methods?
P.S. This is a matter of life and death so it's not a joke.
When I was working on an animation that required Star Wars-like lasers, this is what helped me in the end: http://bkcore.com/blog/3d/webgl-three-js-animated-selective-glow.html
Especially this example: http://demo.bkcore.com/threejs/webgl_tron_iso.html
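For reference, the selective-glow technique from those links boils down to roughly the following (a sketch only; it assumes the composer scripts from three.js examples/js are included — EffectComposer, RenderPass, ShaderPass, MaskPass, CopyShader, HorizontalBlurShader, VerticalBlurShader — and sceneGlow is a placeholder scene containing the green outline mesh plus black copies of the occluding objects, so the glow doesn't shine through them):
// additive blend of the normal render (tDiffuse) and the blurred glow (tGlow)
var additiveBlendShader = {
    uniforms: {
        tDiffuse: { type: "t", value: null },
        tGlow: { type: "t", value: null }
    },
    vertexShader: [
        "varying vec2 vUv;",
        "void main() {",
        "    vUv = uv;",
        "    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
        "}"
    ].join("\n"),
    fragmentShader: [
        "uniform sampler2D tDiffuse;",
        "uniform sampler2D tGlow;",
        "varying vec2 vUv;",
        "void main() {",
        "    gl_FragColor = texture2D( tDiffuse, vUv ) + texture2D( tGlow, vUv );",
        "}"
    ].join("\n")
};

// composer 1: render the glow scene, then blur it horizontally and vertically
var glowComposer = new THREE.EffectComposer(renderer);
glowComposer.addPass(new THREE.RenderPass(sceneGlow, camera));
var hBlur = new THREE.ShaderPass(THREE.HorizontalBlurShader);
var vBlur = new THREE.ShaderPass(THREE.VerticalBlurShader);
hBlur.uniforms.h.value = 2 / window.innerWidth;   // blur spread, tweak to taste
vBlur.uniforms.v.value = 2 / window.innerHeight;
glowComposer.addPass(hBlur);
glowComposer.addPass(vBlur);

// composer 2: render the real scene, then add the blurred glow on top
var blendPass = new THREE.ShaderPass(additiveBlendShader);
blendPass.uniforms.tGlow.value = glowComposer.renderTarget2; // holds the blurred result
blendPass.renderToScreen = true;
var finalComposer = new THREE.EffectComposer(renderer);
finalComposer.addPass(new THREE.RenderPass(scene, camera));
finalComposer.addPass(blendPass);

// render loop: fill the glow target first, then composite to the screen
function animate() {
    requestAnimationFrame(animate);
    glowComposer.render();
    finalComposer.render();
}
animate();
The glow composer is rendered at full screen resolution here; rendering it at a lower resolution is a common optimization, since the result gets blurred anyway.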
