OpenGL ES 2.0 draw a square - opengl-es

I'm trying to follow the Android Developers "Displaying graphics with OpenGL ES" tutorial.
In the first part I had to draw a triangle, and that worked fine. Now I should draw a square, using a Square class that can be defined by following the tutorial.
When I try to draw it (using a draw() method that I mostly copied from the Triangle class), the screen shows only a triangle that is half of the defined square (I have, of course, commented out the triangle.draw() line...).
Here's my code; does anyone have a hint on how to resolve this?
Square.java:
public class Square {

    private FloatBuffer vertexBuffer;
    private ShortBuffer drawListBuffer;
    private final int mProgram;

    private final String vertexShaderCode =
            "attribute vec4 vPosition;" +
            "void main() {" +
            " gl_Position = vPosition;" +
            "}";

    private final String fragmentShaderCode =
            "precision mediump float;" +
            "uniform vec4 vColor;" +
            "void main() {" +
            " gl_FragColor = vColor;" +
            "}";

    // number of coordinates per vertex in this array
    static final int COORDS_PER_VERTEX = 3;
    static float squareCoords[] = {
            -0.5f,  0.5f, 0.0f,   // top left
            -0.5f, -0.5f, 0.0f,   // bottom left
             0.5f, -0.5f, 0.0f,   // bottom right
             0.5f,  0.5f, 0.0f }; // top right

    // Set color with red, green, blue and alpha (opacity) values
    float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };

    private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; // order to draw vertices

    public Square() {
        // initialize vertex byte buffer for shape coordinates
        ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length * 4);
        bb.order(ByteOrder.nativeOrder());
        vertexBuffer = bb.asFloatBuffer();
        vertexBuffer.put(squareCoords);
        vertexBuffer.position(0);

        // initialize byte buffer for the draw list
        ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
        dlb.order(ByteOrder.nativeOrder());
        drawListBuffer = dlb.asShortBuffer();
        drawListBuffer.put(drawOrder);
        drawListBuffer.position(0);

        int vertexShader = MyGLRenderer.loadShader(GLES20.GL_VERTEX_SHADER,
                vertexShaderCode);
        int fragmentShader = MyGLRenderer.loadShader(GLES20.GL_FRAGMENT_SHADER,
                fragmentShaderCode);

        // create empty OpenGL ES Program
        mProgram = GLES20.glCreateProgram();
        // add the vertex shader to program
        GLES20.glAttachShader(mProgram, vertexShader);
        // add the fragment shader to program
        GLES20.glAttachShader(mProgram, fragmentShader);
        // creates OpenGL ES program executables
        GLES20.glLinkProgram(mProgram);
    }

    private int mPositionHandle;
    private int mColorHandle;

    private final int vertexCount = squareCoords.length / COORDS_PER_VERTEX;
    private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex

    public void draw() {
        // Add program to OpenGL ES environment
        GLES20.glUseProgram(mProgram);

        // get handle to vertex shader's vPosition member
        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
        // Enable a handle to the triangle vertices
        GLES20.glEnableVertexAttribArray(mPositionHandle);
        // Prepare the triangle coordinate data
        GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false,
                vertexStride, vertexBuffer);

        // get handle to fragment shader's vColor member
        mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
        // Set color for drawing the triangle
        GLES20.glUniform4fv(mColorHandle, 1, color, 0);

        // Draw the square
        GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);

        // Disable vertex array
        GLES20.glDisableVertexAttribArray(mPositionHandle);
    }
}

See Triangle primitives
You have 4 vertex coordinates, which have the following arrangement:
static float squareCoords[] = {
        -0.5f,  0.5f, 0.0f,   // top left
        -0.5f, -0.5f, 0.0f,   // bottom left
         0.5f, -0.5f, 0.0f,   // bottom right
         0.5f,  0.5f, 0.0f }; // top right
0       3
 x     x
 |     |
 |     |
 x-----x
1       2
This pattern matches the primitive type GL_TRIANGLE_FAN, but not the primitive type GL_TRIANGLES.
Change the primitive type to solve your issue:
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, vertexCount);
Note for the primitive type GL_TRIANGLES you would need 6 vertex coordinates, e.g. with the following arrangement:
0          3     5
 x         x-----x
 | \        \    |
 |   \        \  |
 x-----x         x
1       2         4
The primitive type GL_TRIANGLES matches your index buffer:
private short drawOrder[] = { 0, 1, 2, 0, 2, 3 };
If you want to draw the primitives from an index list, then you have to use glDrawElements:
GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
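For reference, here is a minimal sketch of how the draw() method from the question could look when switched to indexed drawing; it reuses the buffers, handles and shader program already set up in the Square class above:

    public void draw() {
        GLES20.glUseProgram(mProgram);

        // Position attribute, fed from the 4-vertex buffer
        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
        GLES20.glEnableVertexAttribArray(mPositionHandle);
        GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);

        // Solid color uniform
        mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
        GLES20.glUniform4fv(mColorHandle, 1, color, 0);

        // Two triangles (0,1,2) and (0,2,3) taken from the index buffer
        GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length,
                GLES20.GL_UNSIGNED_SHORT, drawListBuffer);

        GLES20.glDisableVertexAttribArray(mPositionHandle);
    }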

Related

Is it possible to use a texture in OpenGL ES without a matrix?

I am learning OpenGL ES and am using the book "Learn OpenGL ES for Mobile Game and Graphics Development" written by Mehta. He first shows how to build a simple project, then adds an orthographic matrix, then 3D, and in the end he explains how to use all of this together with a texture. I am wondering if it is possible to use a texture without a matrix. I must say I tried it; the program did not crash, but the texture was completely scattered and distorted, as if the vertexData was wrong. So my first question is whether it is theoretically possible.
UPDATE 1
The code below represents an application that shows the camera preview on a texture. With a matrix it works fine, but without a matrix the result is the screen shown in the picture.
These are the texture vertex shader and the texture fragment shader:
attribute vec4 a_Position;
attribute vec2 a_TextureCoordinates;
varying vec2 v_TextureCoordinates;
void main()
{
v_TextureCoordinates = a_TextureCoordinates;
gl_Position = a_Position;
}
#extension GL_OES_EGL_image_external : require
precision mediump float;
uniform samplerExternalOES u_TextureUnit;
varying vec2 v_TextureCoordinates;
void main()
{
gl_FragColor = texture2D(u_TextureUnit, v_TextureCoordinates);
}
The texture loader
public class TextureHelper {
public static final int GL_TEXTURE_EXTERNAL_OES = 0x8D65;
public static int loadTexture() {
final int[] textureObjectsId = new int[1];
GLES20.glGenTextures(1, textureObjectsId, 0);
if (textureObjectsId[0] == 0 ) {
return 0;
}
GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, textureObjectsId[0]);
GLES20.glTexParameterf(GLES20.GL_TEXTURE, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, 0);
return textureObjectsId[0];
}
}
The locations and uniforms
public class TextureShaderProgram extends ShaderProgram {
private final int uTextureUnitLocation;
// Attribute locations
private final int aPositionLocation;
private final int aTextureCoordinatesLocation;
public TextureShaderProgram(Context context) {
super(context, R.raw.texture_vertex_shader, R.raw.texture_fragment_shader);
uTextureUnitLocation = GLES20.glGetUniformLocation(program, U_TEXTURE_UNIT);
aPositionLocation = GLES20.glGetAttribLocation(program, A_POSITION);
aTextureCoordinatesLocation = GLES20.glGetAttribLocation(program, A_TEXTURE_COORDINATES);
}
public void setUniforms(int textureId) {
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, textureId);
GLES20.glUniform1i(uTextureUnitLocation, 0);
}
public int getPositionAttributeLocation() {
return aPositionLocation;
}
public int getTextureCoordinatesAttributeLocation() {
return aTextureCoordinatesLocation;
}
}
The vertexData (triangle fan)
public class Table {
private static final int POSITION_COMPONENT_COUNT = 2;
private static final int TEXTURE_COORDINATES_COMPONENT_COUNT = 2;
private static final int STRIDE = (POSITION_COMPONENT_COUNT + TEXTURE_COORDINATES_COMPONENT_COUNT) * BYTES_PER_FLOAT;
private static final float[] VERTEXDATA = {
0f, 0f, 0.5f, 0.5f, //middle
-0.5f, -0.8f, 1f, 0.1f,//bottom left
0.5f, -0.8f, 1f, 0.9f, //bottom right
0.5f, 0.8f, 0f, 0.9f,//top right
-0.5f, 0.8f, 0f, 0.1f, // top left
-0.5f, -0.8f, 1f, 0.1f, // bottom left
};
Do you need more? Leave a comment and I can post the rest if needed.
You will normally need some form of transform to animate the vertices in your scene. This is nearly always a matrix, because they can efficiently represent any desirable transform, but it doesn't have to be.
It is rare for the texture coordinate itself to be modified by a matrix - normally the texture is "fixed" on the model so the coordinate is just passed through the vertex shader without modification.
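As an illustration only (this is not code from the book or the question), a pass-through setup works when the vertex positions are already given in normalized device coordinates, so no matrix is needed and the texture coordinates simply ride along. The attribute names here are placeholders:

    // Hypothetical pass-through shaders: no matrix at all.
    // Positions must already be in clip/NDC space (-1..1); otherwise the
    // geometry, and with it the texture, appears distorted.
    private static final String NO_MATRIX_VERTEX_SHADER =
            "attribute vec4 a_Position;                       \n" +
            "attribute vec2 a_TextureCoordinates;             \n" +
            "varying vec2 v_TextureCoordinates;               \n" +
            "void main() {                                    \n" +
            "    v_TextureCoordinates = a_TextureCoordinates; \n" +
            "    gl_Position = a_Position;                    \n" +
            "}                                                \n";

    // Full-screen quad in NDC, drawn as a triangle fan: x, y, s, t per vertex
    private static final float[] FULLSCREEN_QUAD = {
            -1f, -1f, 0f, 1f,   // bottom left
             1f, -1f, 1f, 1f,   // bottom right
             1f,  1f, 1f, 0f,   // top right
            -1f,  1f, 0f, 0f,   // top left
    };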

Textured Android sample

I am working on OpenGL ES and I am trying to add a textured square to my project.
I designed a simple textured square class and I am trying to display it. Its draw() is called from a Renderer class. For the moment the TexturedSquare class just displays an empty bitmap.
package com.example.nativecpp;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLES20;
import android.opengl.GLUtils;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;
public class TexturedSquare {
private static int[] textureHandle;
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;\n" +
"attribute vec2 aPosition;\n" +
"attribute vec2 aTexPos;\n" +
"varying vec2 vTexPos;\n" +
"void main() {\n" +
" vTexPos = aTexPos;\n" +
" gl_Position = uMVPMatrix * vec4(aPosition.xy, 0.0, 1.0);\n" +
"}";
private final String fragmentShaderCode =
"precision mediump float;\n"+
"uniform sampler2D uTexture;\n" +
"varying vec2 vTexPos;\n" +
"void main(void)\n" +
"{\n" +
" gl_FragColor = texture2D(uTexture, vTexPos);\n" +
"}";
private final FloatBuffer vertexBuffer;
private final ShortBuffer drawListBuffer;
private final int mProgram;
private int mPositionHandle;
private int mTexturePosHandle;
private int mColorHandle;
private int mMVPMatrixHandle;
// number of coordinates per vertex in this array
final int COORDS_PER_VERTEX = 4;
float squareCoords[] = {
-1.0f,-1.0f, 0.0f, 0.0f, // vertex 3
-1.0f, 1.0f, 0.0f, 1.0f, // vertex 1
1.0f,-1.0f, 1.0f, 0.0f, // vertex 2
1.0f, 1.0f, 1.0f, 1.0f, // vertex 0
}; // top right
private final short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; // order to draw vertices
private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex
float color[] = { 0.2f, 0.709803922f, 0.898039216f, 1.0f };
/**
* Sets up the drawing object data for use in an OpenGL ES context.
*/
public TexturedSquare() {
// initialize vertex byte buffer for shape coordinates
ByteBuffer bb = ByteBuffer.allocateDirect(
// (# of coordinate values * 4 bytes per float)
squareCoords.length * 4);
bb.order(ByteOrder.nativeOrder());
vertexBuffer = bb.asFloatBuffer();
vertexBuffer.put(squareCoords);
vertexBuffer.position(0);
// initialize byte buffer for the draw list
ByteBuffer dlb = ByteBuffer.allocateDirect(
// (# of coordinate values * 2 bytes per short)
drawOrder.length * 2);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
// prepare shaders and OpenGL program
int vertexShader = MyGLRenderer.loadShader(
GLES20.GL_VERTEX_SHADER,
vertexShaderCode);
int fragmentShader = MyGLRenderer.loadShader(
GLES20.GL_FRAGMENT_SHADER,
fragmentShaderCode);
mProgram = GLES20.glCreateProgram(); // create empty OpenGL Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader to program
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
GLES20.glLinkProgram(mProgram); // create OpenGL program executables
}
public int loadTexture()
{
textureHandle = new int[1];
if (textureHandle[0] != -1)
{
GLES20.glGenTextures(1, textureHandle, 0);
int uTexture = GLES20.glGetAttribLocation(mProgram, "uTexture");
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // No pre-scaling
// Read in the resource
Bitmap.Config conf = Bitmap.Config.ARGB_8888; // see other conf types
Bitmap bitmap = Bitmap.createBitmap(640, 480, conf); // this creates a MUTABLE bitmap
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
GLES20.glUniform1i(uTexture, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
}
if (textureHandle[0] == -1)
{
throw new RuntimeException("Error loading texture.");
}
return textureHandle[0];
}
/**
* Encapsulates the OpenGL ES instructions for drawing this shape.
*
* @param mvpMatrix - The Model View Project matrix in which to draw
* this shape.
*/
public void draw(float[] mvpMatrix) {
// Add program to OpenGL environment
GLES20.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "aPosition");
mTexturePosHandle = GLES20.glGetAttribLocation(mProgram, "aTexPos");
// Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
MyGLRenderer.checkGlError("glGetUniformLocation");
// Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(
mPositionHandle, 2,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
GLES20.glEnableVertexAttribArray(mTexturePosHandle);
GLES20.glVertexAttribPointer(
mPositionHandle, 2,
GLES20.GL_FLOAT, false,
vertexStride, (vertexBuffer.position(2)));
// get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
MyGLRenderer.checkGlError("glGetUniformLocation");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
MyGLRenderer.checkGlError("glUniformMatrix4fv");
// Draw the square
GLES20.glDrawElements(
GLES20.GL_TRIANGLES, drawOrder.length,
GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
}
My program crashes with this error:
FATAL EXCEPTION: GLThread 6633
Process: com.example.nativecpp, PID: 3281
java.lang.RuntimeException: glGetUniformLocation: glError 1282
at com.example.nativecpp.MyGLRenderer.checkGlError(MyGLRenderer.java:198)
at com.example.nativecpp.TexturedSquare.draw(TexturedSquare.java:179)
at com.example.nativecpp.MyGLRenderer.onDrawFrame(MyGLRenderer.java:118)
at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1553)
at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1253)
It seems there is an error in my index for the vertex position, but I can't resolve it.
Edit:
I still have the same error:
MyGLRenderer: glGetUniformLocation: glError 1282
The error appears after this:
GLES20.glVertexAttribPointer(
mPositionHandle, 2,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle);
MyGLRenderer.checkGlError("glGetUniformLocation"); ####ERROR!!!
Edit 2:
On this line:
uTexture = GLES20.glGetAttribLocation(mProgram, "uTexture");
uTexture is -1, which I don't think is normal... so I changed it to
int uTexture = GLES20.glGetUniformLocation(mProgram, "uTexture");
Now uTexture is equal to 1, but the program still gets stuck on GLES20.glUniform1i(uTexture, 0); // glGetUniformLocation: glError 1282
Edit 3:
When I put these lines in the draw function:
int uTexture = GLES20.glGetUniformLocation(mProgram, "uTexture");
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
MyGLRenderer.checkGlError("glGetUniformLocation");
GLES20.glUniform1i(uTexture, 0);
MyGLRenderer.checkGlError("glGetUniformLocation");
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Draw the square
There is no error... why? Does the GPU not have enough time to put the data in memory?
You accidentally load the texture coordinates into the position attribute. Change
GLES20.glVertexAttribPointer(
mPositionHandle, 2,
GLES20.GL_FLOAT, false,
vertexStride, (vertexBuffer.position(2)));
To
GLES20.glVertexAttribPointer(
mTexturePosHandle, 2,
GLES20.GL_FLOAT, false,
vertexStride, (vertexBuffer.position(2)));
A few other notable mistakes:
You should only query the attribute and uniform locations once, not every frame, because these are costly calls (see the sketch below).
You don't do anything with a texture if it has the id 0, while that's still valid; you should check for the id -1 instead.
There may be a problem when you link a shader program; you should check whether its link status is bad and log the output of glGetProgramInfoLog if it is.
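A minimal sketch of those last two points, using the same GLES20 calls and handle names as the TexturedSquare class above (the log tag is illustrative):

    // Query locations once, right after linking, instead of in every draw() call.
    mProgram = GLES20.glCreateProgram();
    GLES20.glAttachShader(mProgram, vertexShader);
    GLES20.glAttachShader(mProgram, fragmentShader);
    GLES20.glLinkProgram(mProgram);

    // Check the link status and log the info log on failure.
    int[] linkStatus = new int[1];
    GLES20.glGetProgramiv(mProgram, GLES20.GL_LINK_STATUS, linkStatus, 0);
    if (linkStatus[0] == 0) {
        android.util.Log.e("TexturedSquare", "Program link failed: "
                + GLES20.glGetProgramInfoLog(mProgram));
        GLES20.glDeleteProgram(mProgram);
    }

    // Cache the handles; they do not change for the lifetime of the program.
    mPositionHandle   = GLES20.glGetAttribLocation(mProgram, "aPosition");
    mTexturePosHandle = GLES20.glGetAttribLocation(mProgram, "aTexPos");
    mMVPMatrixHandle  = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");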

OpenGL ES 2.0 GLSL Barrel Distortion Shader not working

I took the code from the OpenGL ES 2.0 tutorial at github.com/learnopengles/Learn-OpenGLES-Tutorials
and customized the class "LessonOneRenderer" to test a barrel distortion shader like the one described on this site:
www.geeks3d.com/20140213/glsl-shader-library-fish-eye-and-dome-and-barrel-distortion-post-processing-filters/2/
This is the "untouched" result without calling the Distort() function:
http://fs1.directupload.net/images/150125/hw746rcn.png
and the result when calling the function:
http://fs2.directupload.net/images/150125/f84arxoj.png
/**
* This class implements our custom renderer. Note that the GL10 parameter passed in is unused for OpenGL ES 2.0
* renderers -- the static class GLES20 is used instead.
*/
public class LessonOneRenderer implements GLSurfaceView.Renderer
{
/**
* Store the model matrix. This matrix is used to move models from object space (where each model can be thought
* of being located at the center of the universe) to world space.
*/
private float[] mModelMatrix = new float[16];
/**
* Store the view matrix. This can be thought of as our camera. This matrix transforms world space to eye space;
* it positions things relative to our eye.
*/
private float[] mViewMatrix = new float[16];
/** Store the projection matrix. This is used to project the scene onto a 2D viewport. */
private float[] mProjectionMatrix = new float[16];
/** Allocate storage for the final combined matrix. This will be passed into the shader program. */
private float[] mMVPMatrix = new float[16];
/** Store our model data in a float buffer. */
private final FloatBuffer mTriangle1Vertices;
/** This will be used to pass in the transformation matrix. */
private int mMVPMatrixHandle;
/** This will be used to pass in model position information. */
private int mPositionHandle;
/** This will be used to pass in model color information. */
private int mColorHandle;
/** How many bytes per float. */
private final int mBytesPerFloat = 4;
/** How many elements per vertex. */
private final int mStrideBytes = 7 * mBytesPerFloat;
/** Offset of the position data. */
private final int mPositionOffset = 0;
/** Size of the position data in elements. */
private final int mPositionDataSize = 3;
/** Offset of the color data. */
private final int mColorOffset = 3;
/** Size of the color data in elements. */
private final int mColorDataSize = 4;
private FloatBuffer vertexBuffer;
private ShortBuffer drawListBuffer;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 3;
// X Y Z
static float squareCoords[] = { -0.5f, 0.5f, 0.0f, // top left
-0.5f, -0.5f, 0.0f, // bottom left
0.5f, -0.5f, 0.0f, // bottom right
0.5f, 0.5f, 0.0f }; // top right
private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; // order to draw vertices
float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
static final int vertexStride = COORDS_PER_VERTEX * 4;
static final int vertexCount = 4;
private ByteBuffer dlb;
/**
* Initialize the model data.
*/
public LessonOneRenderer()
{
ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length * 4); // (# of coordinate values * 4 bytes per float)
bb.order(ByteOrder.nativeOrder());
vertexBuffer = bb.asFloatBuffer();
vertexBuffer.put(squareCoords);
vertexBuffer.position(0);
// initialize byte buffer for the draw list
ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2); // (# of coordinate values * 2 bytes per short)
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
// Define points for equilateral triangles.
// This triangle is red, green, and blue.
final float[] triangle1VerticesData = {
// X, Y, Z,
// R, G, B, A
-0.5f, 0.5f, 0.0f, // top left
1.0f, 0.0f, 0.0f, 1.0f,
-0.5f, -0.5f, 0.0f,// bottom left
0.0f, 0.0f, 1.0f, 1.0f,
0.5f, -0.5f, 0.0f, // bottom right
0.0f, 1.0f, 0.0f, 1.0f,
0.5f, 0.5f, 0.0f, // top right
0.0f, 1.0f, 0.0f, 1.0f
};
// Initialize the buffers.
mTriangle1Vertices = ByteBuffer.allocateDirect(triangle1VerticesData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mTriangle1Vertices.put(triangle1VerticesData).position(0);
// initialize byte buffer for the draw list
dlb = ByteBuffer.allocateDirect(drawOrder.length * 2); // (# of coordinate values * 2 bytes per short)
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
}
@Override
public void onSurfaceCreated(GL10 glUnused, EGLConfig config)
{
// Set the background clear color to gray.
GLES20.glClearColor(0.5f, 0.5f, 0.5f, 0.5f);
// Position the eye behind the origin.
final float eyeX = 0.0f;
final float eyeY = 0.0f;
final float eyeZ = 1.5f;
// We are looking toward the distance
final float lookX = 0.0f;
final float lookY = 0.0f;
final float lookZ = -5.0f;
// Set our up vector. This is where our head would be pointing were we holding the camera.
final float upX = 0.0f;
final float upY = 1.0f;
final float upZ = 0.0f;
// Set the view matrix. This matrix can be said to represent the camera position.
// NOTE: In OpenGL 1, a ModelView matrix is used, which is a combination of a model and
// view matrix. In OpenGL 2, we can keep track of these matrices separately if we choose.
Matrix.setLookAtM(mViewMatrix, 0, eyeX, eyeY, eyeZ, lookX, lookY, lookZ, upX, upY, upZ);
final String vertexShader =
"uniform mat4 u_MVPMatrix; \n" // A constant representing the combined model/view/projection matrix.
+ "attribute vec4 a_Position; \n" // Per-vertex position information we will pass in.
+ "attribute vec4 a_Color; \n" // Per-vertex color information we will pass in.
+ "varying vec4 a_Pos; \n"
+ "varying vec4 v_Color; \n" // This will be passed into the fragment shader.
+" "
+"vec4 Distort(vec4 p){ \n"
+" float BarrelPower = 0.4; \n"
+" vec2 v = p.xy / p.w; \n"
+" float radius = length(v); \n"
+" if (radius > 0.0){ \n"
+" float theta = atan(v.y,v.x); \n"
+" radius = pow(radius, BarrelPower);\n"
+" v.x = radius * cos(theta); \n"
+" v.y = radius * sin(theta); \n"
+" p.xy = v.xy * p.w; \n"
+" }"
+" \n"
+" return p; \n"
+" } \n"
+ "void main() \n" // The entry point for our vertex shader.
+ "{ \n"
+ " v_Color = a_Color; \n" // Pass the color through to the fragment shader.
+ " vec4 P = u_MVPMatrix * a_Position;" // It will be interpolated across the triangle.
+ " gl_Position = Distort(P); \n" // gl_Position is a special variable used to store the final position.
+ " \n" // Multiply the vertex by the matrix to get the final point in
+ "} \n"; // normalized screen coordinates.
final String fragmentShader =
"precision mediump float; \n" // Set the default precision to medium. We don't need as high of a
// precision in the fragment shader.
+ "varying vec4 a_Pos; \n"
+ "varying vec4 v_Color; \n" // This is the color from the vertex shader interpolated across the
// triangle per fragment.
+ "void main() \n" // The entry point for our fragment shader.
+ "{ vec4 c = vec4(1.0); \n"
+ " gl_FragColor = v_Color; \n" // Pass the color directly through the pipeline.
+ "} \n";
// Load in the vertex shader.
int vertexShaderHandle = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
if (vertexShaderHandle != 0)
{
// Pass in the shader source.
GLES20.glShaderSource(vertexShaderHandle, vertexShader);
// Compile the shader.
GLES20.glCompileShader(vertexShaderHandle);
// Get the compilation status.
final int[] compileStatus = new int[1];
GLES20.glGetShaderiv(vertexShaderHandle, GLES20.GL_COMPILE_STATUS, compileStatus, 0);
// If the compilation failed, delete the shader.
if (compileStatus[0] == 0)
{
GLES20.glDeleteShader(vertexShaderHandle);
vertexShaderHandle = 0;
}
}
if (vertexShaderHandle == 0)
{
throw new RuntimeException("Error creating vertex shader.");
}
// Load in the fragment shader shader.
int fragmentShaderHandle = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
if (fragmentShaderHandle != 0)
{
// Pass in the shader source.
GLES20.glShaderSource(fragmentShaderHandle, fragmentShader);
// Compile the shader.
GLES20.glCompileShader(fragmentShaderHandle);
// Get the compilation status.
final int[] compileStatus = new int[1];
GLES20.glGetShaderiv(fragmentShaderHandle, GLES20.GL_COMPILE_STATUS, compileStatus, 0);
// If the compilation failed, delete the shader.
if (compileStatus[0] == 0)
{
GLES20.glDeleteShader(fragmentShaderHandle);
fragmentShaderHandle = 0;
}
}
if (fragmentShaderHandle == 0)
{
throw new RuntimeException("Error creating fragment shader.");
}
// Create a program object and store the handle to it.
int programHandle = GLES20.glCreateProgram();
if (programHandle != 0)
{
// Bind the vertex shader to the program.
GLES20.glAttachShader(programHandle, vertexShaderHandle);
// Bind the fragment shader to the program.
GLES20.glAttachShader(programHandle, fragmentShaderHandle);
// Bind attributes
GLES20.glBindAttribLocation(programHandle, 0, "a_Position");
GLES20.glBindAttribLocation(programHandle, 1, "a_Color");
// Link the two shaders together into a program.
GLES20.glLinkProgram(programHandle);
// Get the link status.
final int[] linkStatus = new int[1];
GLES20.glGetProgramiv(programHandle, GLES20.GL_LINK_STATUS, linkStatus, 0);
// If the link failed, delete the program.
if (linkStatus[0] == 0)
{
GLES20.glDeleteProgram(programHandle);
programHandle = 0;
}
}
if (programHandle == 0)
{
throw new RuntimeException("Error creating program.");
}
// Set program handles. These will later be used to pass in values to the program.
mMVPMatrixHandle = GLES20.glGetUniformLocation(programHandle, "u_MVPMatrix");
mPositionHandle = GLES20.glGetAttribLocation(programHandle, "a_Position");
mColorHandle = GLES20.glGetAttribLocation(programHandle, "a_Color");
// Tell OpenGL to use this program when rendering.
GLES20.glUseProgram(programHandle);
}
@Override
public void onSurfaceChanged(GL10 glUnused, int width, int height)
{
// Set the OpenGL viewport to the same size as the surface.
GLES20.glViewport(0, 0, width, height);
// Create a new perspective projection matrix. The height will stay the same
// while the width will vary as per aspect ratio.
final float ratio = (float) width / height;
final float left = -ratio;
final float right = ratio;
final float bottom = -1.0f;
final float top = 1.0f;
final float near = 1.0f;
final float far = 10.0f;
Matrix.frustumM(mProjectionMatrix, 0, left, right, bottom, top, near, far);
}
@Override
public void onDrawFrame(GL10 glUnused)
{
GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
// Do a complete rotation every 10 seconds.
long time = SystemClock.uptimeMillis() % 10000L;
float angleInDegrees = (360.0f / 10000.0f) * ((int) time);
// Draw the triangle facing straight on.
//Matrix.setIdentityM(mModelMatrix, 0);
//Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
//drawTriangle(mTriangle1Vertices);
//Draw Square
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, 0.0f, -0.7f, 0.0f);
//Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
this.drawSquare(mTriangle1Vertices);
//Draw Square
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, 0.0f, 0.7f, 0.0f);
//Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
this.drawSquare(mTriangle1Vertices);
// Draw one translated a bit down and rotated to be flat on the ground.
//Matrix.setIdentityM(mModelMatrix, 0);
//Matrix.translateM(mModelMatrix, 0, 0.0f, -1.0f, 0.0f);
//Matrix.rotateM(mModelMatrix, 0, 90.0f, 1.0f, 0.0f, 0.0f);
//Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
//drawTriangle(mTriangle2Vertices);
// Draw one translated a bit to the right and rotated to be facing to the left.
//Matrix.setIdentityM(mModelMatrix, 0);
//Matrix.translateM(mModelMatrix, 0, 1.0f, 0.0f, 0.0f);
//Matrix.rotateM(mModelMatrix, 0, 90.0f, 0.0f, 1.0f, 0.0f);
//Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
//drawTriangle(mTriangle3Vertices);
// test.draw();
}
/**
* Draws a triangle from the given vertex data.
*
* @param aTriangleBuffer The buffer containing the vertex data.
*/
private void drawTriangle(final FloatBuffer aTriangleBuffer)
{
// Pass in the position information
aTriangleBuffer.position(mPositionOffset);
GLES20.glVertexAttribPointer(mPositionHandle, mPositionDataSize, GLES20.GL_FLOAT, false,
mStrideBytes, aTriangleBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Pass in the color information
aTriangleBuffer.position(mColorOffset);
GLES20.glVertexAttribPointer(mColorHandle, mColorDataSize, GLES20.GL_FLOAT, false,
mStrideBytes, aTriangleBuffer);
GLES20.glEnableVertexAttribArray(mColorHandle);
// This multiplies the view matrix by the model matrix, and stores the result in the MVP matrix
// (which currently contains model * view).
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// This multiplies the modelview matrix by the projection matrix, and stores the result in the MVP matrix
// (which now contains model * view * projection).
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 3);
}
private void drawSquare(final FloatBuffer aTriangleBuffer)
{
// Pass in the position information
aTriangleBuffer.position(mPositionOffset);
GLES20.glVertexAttribPointer(mPositionHandle, mPositionDataSize, GLES20.GL_FLOAT, false,
mStrideBytes, aTriangleBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Pass in the color information
aTriangleBuffer.position(mColorOffset);
GLES20.glVertexAttribPointer(mColorHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false,
vertexStride, aTriangleBuffer);
GLES20.glEnableVertexAttribArray(mColorHandle);
// This multiplies the view matrix by the model matrix, and stores the result in the MVP matrix
// (which currently contains model * view).
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// This multiplies the modelview matrix by the projection matrix, and stores the result in the MVP matrix
// (which now contains model * view * projection).
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, 4);
}
}
Does anybody have an idea what I'm missing to distort the squares like in the example?
You apply the distortion effect in the vertex shader, so all you are doing is moving the 4 corners of those squares. You can't achieve the desired effect that way.
There are different options. You could theoretically use a higher tessellation for your squares: create a 2D grid of vertices, which can then be moved individually according to the distortion. If the grid is fine enough, the piecewise-linear nature of that approximation is no longer visible. A small sketch of such a grid follows.
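As an illustration only (not part of the tutorial code), a minimal sketch of generating such a grid as a triangle list in the XY plane; the resolution n and the extent are arbitrary:

    // Build an n x n grid of vertices covering [-0.5, 0.5] x [-0.5, 0.5],
    // emitted as two triangles per cell, so the vertex shader's Distort()
    // has many vertices to bend instead of just the four corners.
    static float[] buildGrid(int n) {
        float[] verts = new float[n * n * 6 * 3]; // 2 triangles * 3 vertices * xyz per cell
        int k = 0;
        float step = 1.0f / n;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                float x0 = -0.5f + j * step, x1 = x0 + step;
                float y0 = -0.5f + i * step, y1 = y0 + step;
                // triangle 1
                k = put(verts, k, x0, y0); k = put(verts, k, x1, y0); k = put(verts, k, x1, y1);
                // triangle 2
                k = put(verts, k, x0, y0); k = put(verts, k, x1, y1); k = put(verts, k, x0, y1);
            }
        }
        return verts;
    }

    static int put(float[] v, int k, float x, float y) {
        v[k++] = x; v[k++] = y; v[k++] = 0.0f; // z = 0, matching the flat square
        return k;
    }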
However, such effects are typically done differently. One usually does not tune the geometric models for specific effects. That kind of distortion is usually applied in screen space, as a post-processing effect (and the link you posted does exactly this). The idea is that you render the whole scene into a texture and draw a single textured rectangle filling the whole screen as the final pass. In that pass, you apply the distortion to the texture coordinates in the fragment shader, just as in the original example.
All of that can be done in OpenGL ES, too. The keywords to look for are RTT (render to texture) and FBOs (framebuffer objects). In OpenGL ES 2.0, framebuffer objects are part of the core API (on ES 1.x they were available via the OES_framebuffer_object extension).
However, using that is definitely a bit more advanced than the typical "lesson 1" of a tutorial; you might want to read some other lessons first before trying it... ;)
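For orientation, a render-to-texture skeleton with the core GLES20 FBO calls might look like this sketch; width, height and the draw calls are placeholders, and error/completeness checks are omitted:

    // Create a texture that will receive the rendered scene.
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height,
            0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

    // Create a framebuffer object and attach the texture as its color buffer.
    int[] fbo = new int[1];
    GLES20.glGenFramebuffers(1, fbo, 0);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
            GLES20.GL_TEXTURE_2D, tex[0], 0);

    // Pass 1: render the scene into the FBO (i.e. into tex[0]).
    // ... draw the squares here ...

    // Pass 2: bind the default framebuffer again and draw a full-screen quad
    // that samples tex[0], applying the barrel distortion to the texture
    // coordinates in the fragment shader.
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);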

How to get only the bitmap image from an OpenGL SurfaceView, not the background, in Android?

Please can anyone help me with getting a bitmap from a SurfaceView? Here is my GLSurfaceView renderer class. I am getting the bitmap image, but it has a black background; I want only the image and to remove the black background color.
public class GLLayer implements GLSurfaceView.Renderer{
/**
* This class implements our custom renderer. Note that the GL10 parameter
* passed in is unused for OpenGL ES 2.0 renderers -- the static class
* GLES20 is used instead.
*/
public Context mActivityContext;
/**
* Store the model matrix. This matrix is used to move models from object
* space (where each model can be thought of being located at the center of
* the universe) to world space.
*/
private float[] mModelMatrix = new float[16];
/**
* Store the view matrix. This can be thought of as our camera. This matrix
* transforms world space to eye space; it positions things relative to our
* eye.
*/
private float[] mViewMatrix = new float[16];
/**
* Store the projection matrix. This is used to project the scene onto a 2D
* viewport.
*/
private float[] mProjectionMatrix = new float[16];
/**
* Allocate storage for the final combined matrix. This will be passed into
* the shader program.
*/
private float[] mMVPMatrix = new float[16];
/** Store our model data in a float buffer. */
private final FloatBuffer mCubePositions;
private final FloatBuffer mCubeColors;
private final FloatBuffer mCubeTextureCoordinates;
/** This will be used to pass in the transformation matrix. */
private int mMVPMatrixHandle;
/** This will be used to pass in the texture. */
private int mTextureUniformHandle0;
private int mTextureUniformHandle1;
private int mTextureUniformHandle2;
private int mTextureUniformHandle3;
private int mTextureUniformHandle4;
/** This will be used to pass in model position information. */
private int mPositionHandle;
/** This will be used to pass in model color information. */
// private int mColorHandle;
/** This will be used to pass in model texture coordinate information. */
private int mTextureCoordinateHandle;
/** How many bytes per float. */
private final int mBytesPerFloat = 4;
/** Size of the position data in elements. */
private final int mPositionDataSize = 3;
/** Size of the color data in elements. */
// private final int mColorDataSize = 4;
/** Size of the texture coordinate data in elements. */
private final int mTextureCoordinateDataSize = 2;
/** This is a handle to our cube shading program. */
private int mProgramHandle;
/** This is a handle to our texture data. */
private int mTextureDataHandle0;
private int mTextureDataHandle1;
private int mTextureDataHandle2;
private int mTextureDataHandle3;
private int mTextureDataHandle4;
/**
* Shader Titles
*/
static public int shader_selection = 0;
static public final int AMARO = 1;
static public final int EARLYBIRD = 2;
static public final int HEFE = 3;
static public final int HUDSON = 4;
static public final int MAYFAIR = 5;
static public final int RISE = 6;
static public final int TOASTER = 7;
static public final int VALENCIA = 8;
static public final int WILLOW = 9;
static public final int XPRO = 10;
private Bitmap mBitmap;
public static int width_surface ;
public static int height_surface;
public int width_bitmap =0;
public int height_bitmap=0;
public static boolean printOptionEnable = false;
// and more ...
/**
* Initialize the model data.
*/
public GLLayer(final Context activityContext, Bitmap mBitmap) {
mActivityContext = activityContext;
this.mBitmap = mBitmap;
this.width_bitmap = mBitmap.getWidth();
this.height_bitmap = mBitmap.getHeight();
// Define points for a cube.
final float[] cubePositionData = {
// Front face
-1.0f, 1.0f, 1.0f, -1.0f, -1.0f, 1.0f, 1.0f, 1.0f, 1.0f, -1.0f,
-1.0f, 1.0f, 1.0f, -1.0f, 1.0f, 1.0f, 1.0f, 1.0f };
// R, G, B, A
final float[] cubeColorData = {
// Front face (red)
1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 0.0f,
0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f };
final float[] cubeTextureCoordinateData = {
// Front face
0.0f, 0.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f,
1.0f, 0.0f };
// Initialize the buffers.
mCubePositions = ByteBuffer.allocateDirect(cubePositionData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mCubePositions.put(cubePositionData).position(0);
mCubeColors = ByteBuffer.allocateDirect(cubeColorData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mCubeColors.put(cubeColorData).position(0);
mCubeTextureCoordinates = ByteBuffer.allocateDirect(cubeTextureCoordinateData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mCubeTextureCoordinates.put(cubeTextureCoordinateData).position(0);
}
protected String getVertexShader() {
return RawResourceReader.readTextFileFromRawResource(mActivityContext,R.raw._vertex_shader);
}
protected String getFragmentShader() {
int id;
switch (shader_selection) {
case AMARO:
id = R.raw.amaro_filter_shader;
break;
case EARLYBIRD:
id = R.raw.earlybird_filter_shader;
break;
case HEFE:
id = R.raw.hefe_filter_shader;
break;
case HUDSON:
id = R.raw.hudson_filter_shader;
break;
case MAYFAIR:
id = R.raw.mayfair_filter_shader;
break;
case RISE:
id = R.raw.rise_filter_shader;
break;
case TOASTER:
id = R.raw.toaster_filter_shader;
break;
case WILLOW:
id = R.raw.willow_filter_shader;
break;
case XPRO:
id = R.raw.xpro_filter_shader;
break;
default:
id = R.raw._fragment_shader;
break;
}
return RawResourceReader.readTextFileFromRawResource(mActivityContext,id);
}
@Override
public void onSurfaceCreated(GL10 glUnused, EGLConfig config) {
// Set the background clear color to black.
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
// Use culling to remove back faces.
GLES20.glEnable(GLES20.GL_CULL_FACE);
// Enable depth testing
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
final float eyeX = 0.0f;
final float eyeY = 0.0f;
final float eyeZ = -0.5f;
// We are looking toward the distance
final float lookX = 0.0f;
final float lookY = 0.0f;
final float lookZ = -5.0f;
// Set our up vector. This is where our head would be pointing were we
// holding the camera.
final float upX = 0.0f;
final float upY = 1.0f;
final float upZ = 0.0f;
Matrix.setLookAtM(mViewMatrix, 0, eyeX, eyeY, eyeZ, lookX, lookY,lookZ, upX, upY, upZ);
// // Load the texture
mTextureDataHandle0 = TextureHelper.loadTexture(mBitmap);
// Load the texture
mTextureDataHandle1 = TextureHelper.loadTexture(mActivityContext,R.drawable.filter2);
// Load the texture
mTextureDataHandle2 = TextureHelper.loadTexture(mActivityContext,R.drawable.hefe);
// Load the texture
mTextureDataHandle3 = TextureHelper.loadTexture(mActivityContext,R.drawable.hudson);
// Load the texture
mTextureDataHandle4 = TextureHelper.loadTexture(mActivityContext,R.drawable.toaster);
}
@Override
public void onSurfaceChanged(GL10 glUnused, int width, int height) {
GLES20.glViewport(0, 0, width, height);
final float ratio = (float) width / height;
final float left = -ratio;
final float right = ratio;
final float bottom = -1.0f;
final float top = 1.0f;
final float near = 1.0f;
final float far = 10.0f;
width_surface = width ;
height_surface = height ;
Matrix.frustumM(mProjectionMatrix, 0, left, right, bottom, top, near,far);
}
@Override
public void onDrawFrame(GL10 glUnused) {
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
final String vertexShader = getVertexShader();
final String fragmentShader = getFragmentShader();
final int vertexShaderHandle = ShaderHelper.compileShader(GLES20.GL_VERTEX_SHADER, vertexShader);
final int fragmentShaderHandle = ShaderHelper.compileShader(GLES20.GL_FRAGMENT_SHADER, fragmentShader);
mProgramHandle = ShaderHelper.createAndLinkProgram(vertexShaderHandle,
fragmentShaderHandle, new String[] { "a_Position","a_TexCoordinate" });
// Set our per-vertex lighting program.
GLES20.glUseProgram(mProgramHandle);
// Set program handles for cube drawing.
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgramHandle,"u_MVPMatrix");
mTextureUniformHandle0 = GLES20.glGetUniformLocation(mProgramHandle,"u_Texture0");
mTextureUniformHandle1 = GLES20.glGetUniformLocation(mProgramHandle,"u_Texture1");
mTextureUniformHandle2 = GLES20.glGetUniformLocation(mProgramHandle,"u_Texture2");
mTextureUniformHandle3 = GLES20.glGetUniformLocation(mProgramHandle,"u_Texture3");
mTextureUniformHandle4 = GLES20.glGetUniformLocation(mProgramHandle,"u_Texture4");
mPositionHandle = GLES20.glGetAttribLocation(mProgramHandle,"a_Position");
mTextureCoordinateHandle = GLES20.glGetAttribLocation(mProgramHandle,"a_TexCoordinate");
/**
* First texture map
*/
// Set the active texture0 unit to texture unit 0.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
// Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle0);
// Tell the texture uniform sampler to use this texture in the shader by
// binding to texture unit 0.
GLES20.glUniform1i(mTextureUniformHandle0, 0);
/**
* Second texture map filter
*/
// Set the active texture1 unit to texture unit 1.
GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
// Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle1);
// Tell the texture uniform sampler to use this texture in the shader by
// binding to texture unit 1.
GLES20.glUniform1i(mTextureUniformHandle1, 1);
/**
* Third texture map filter hefe
*/
// Set the active texture1 unit to texture unit 1.
GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
// Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle2);
// Tell the texture uniform sampler to use this texture in the shader by
// binding to texture unit 1.
GLES20.glUniform1i(mTextureUniformHandle2, 2);
/**
* Fouth texture map filter hudson
*/
// Set the active texture1 unit to texture unit 1.
GLES20.glActiveTexture(GLES20.GL_TEXTURE3);
// Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle3);
// Tell the texture uniform sampler to use this texture in the shader by
// binding to texture unit 1.
GLES20.glUniform1i(mTextureUniformHandle3, 3);
/**
* Fifth texture map filter toaster
*/
// Set the active texture1 unit to texture unit 1.
GLES20.glActiveTexture(GLES20.GL_TEXTURE4);
// Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle4);
// Tell the texture uniform sampler to use this texture in the shader by
// binding to texture unit 1.
GLES20.glUniform1i(mTextureUniformHandle4, 4);
// Draw some cubes.
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, 0.0f, 0.0f, -3.2f);
Matrix.rotateM(mModelMatrix, 0, 0.0f, 1.0f, 1.0f, 0.0f);
drawCube();
createBitmap();
}
/**
* Draws a cube.
*/
private void drawCube() {
// Pass in the position information
mCubePositions.position(0);
GLES20.glVertexAttribPointer(mPositionHandle, mPositionDataSize,
GLES20.GL_FLOAT, false, 0, mCubePositions);
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Pass in the texture coordinate information
mCubeTextureCoordinates.position(0);
GLES20.glVertexAttribPointer(mTextureCoordinateHandle,
mTextureCoordinateDataSize, GLES20.GL_FLOAT, false, 0,
mCubeTextureCoordinates);
GLES20.glEnableVertexAttribArray(mTextureCoordinateHandle);
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
// Pass in the combined matrix.
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
// Draw the cube.
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 6);
}
public void createBitmap(){
try {
if (printOptionEnable) {
printOptionEnable = false ;
int w = width_surface ;
int h = height_surface ;
int b[]=new int[(int) (w*h)];
int bt[]=new int[(int) (w*h)];
IntBuffer buffer=IntBuffer.wrap(b);
buffer.position(0);
GLES20.glReadPixels(0,0,w,h,GLES20.GL_RGBA,GLES20.GL_UNSIGNED_BYTE, buffer);
for(int i=0; i<h; i++){
//remember, that OpenGL bitmap is incompatible with Android bitmap
//and so, some correction need.
for(int j=0; j<w; j++){
int pix=b[i*w+j];
int pb=(pix>>16)&0xff;
int pr=(pix<<16)&0x00ff0000;
int pix1=(pix&0xff00ff00) | pr | pb;
bt[(h-i-1)*w+j]=pix1;
}
}
Bitmap inBitmap = null ;
if (inBitmap == null || !inBitmap.isMutable()
|| inBitmap.getWidth() != w || inBitmap.getHeight() != h) {
inBitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
}
//Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
inBitmap.copyPixelsFromBuffer(buffer);
//return inBitmap ;
// return Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);
inBitmap = Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);
ByteArrayOutputStream bos = new ByteArrayOutputStream();
inBitmap.compress(CompressFormat.JPEG, 90, bos);
byte[] bitmapdata = bos.toByteArray();
ByteArrayInputStream fis = new ByteArrayInputStream(bitmapdata);
final Calendar c=Calendar.getInstance();
long mytimestamp=c.getTimeInMillis();
String timeStamp=String.valueOf(mytimestamp);
String myfile="vijay"+timeStamp+".jpeg";
File dir_image=new File(Environment.getExternalStorageDirectory().toString());
dir_image.mkdirs();
try {
File tmpFile = new File(dir_image,myfile);
FileOutputStream fos = new FileOutputStream(tmpFile);
byte[] buf = new byte[1024];
int len;
while ((len = fis.read(buf)) > 0) {
fos.write(buf, 0, len);
}
fis.close();
fos.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}catch(Exception e) {
e.printStackTrace() ;
}
}
}
Make your GLSurfaceView translucent, like in this example.
Code snippet from the same example:
// Create our Preview view and set it as the content of our
// Activity
mGLSurfaceView = new GLSurfaceView(this);
// We want an 8888 pixel format because that's required for
// a translucent window.
// And we want a depth buffer.
mGLSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
// Tell the cube renderer that we want to render a translucent version
// of the cube:
mGLSurfaceView.setRenderer(new CubeRenderer(true));
// Use a surface format with an Alpha channel:
mGLSurfaceView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
and then save the image as .png, not .jpeg, so as to preserve transparency (see the sketch below).
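A minimal sketch of the PNG save, as a drop-in replacement for the JPEG compression in the question's createBitmap() (the file name is illustrative, exception handling omitted):

    // PNG keeps the alpha channel; JPEG flattens transparent pixels to an opaque background.
    Bitmap outBitmap = Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);
    File outFile = new File(Environment.getExternalStorageDirectory(),
            "capture_" + System.currentTimeMillis() + ".png");
    FileOutputStream fos = new FileOutputStream(outFile);
    outBitmap.compress(Bitmap.CompressFormat.PNG, 100, fos); // quality is ignored for PNG
    fos.close();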

Only half a triangle shows when rotating in OpenGL ES 2.0

I'm trying to rotate a triangle around the Y axis. When I rotate it about the Z axis, everything is fine. But when I try rotating about the Y axis, all I get is a half triangle, rotating about the Y axis. I'm using PowerVR's OpenGL ES 2.0 SDK. My Init and Draw functions are below.
int Init(ESContext* esContext)
{
UserData* userData = (UserData *)esContext->userData;
const char *vShaderStr =
"attribute vec4 vPosition; \n"
"uniform mat4 MVPMatrix;"
"void main() \n"
"{ \n"
" gl_Position = MVPMatrix * vPosition;\n"
"} \n";
const char *fShaderStr =
"precision mediump float; \n"
"void main() \n"
"{ \n"
" gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); \n"
"} \n";
GLuint vertexShader;
GLuint fragmentShader;
GLuint programObject;
GLint linked;
GLfloat ratio = 320.0f/240.0f;
vertexShader = LoadShader(GL_VERTEX_SHADER, vShaderStr);
fragmentShader = LoadShader(GL_FRAGMENT_SHADER, fShaderStr);
programObject = glCreateProgram();
if (programObject == 0)
return 0;
glAttachShader(programObject, vertexShader);
glAttachShader(programObject, fragmentShader);
glBindAttribLocation(programObject, 0, "vPosition");
glLinkProgram(programObject);
glGetProgramiv(programObject, GL_INFO_LOG_LENGTH, &linked);
if (!linked)
{
GLint infoLen = 0;
glGetProgramiv(programObject, GL_INFO_LOG_LENGTH, &infoLen);
if (infoLen > 1)
{
char* infoLog = (char *)malloc(sizeof(char) * infoLen);
glGetProgramInfoLog(programObject, infoLen, NULL, infoLog);
free(infoLog);
}
glDeleteProgram(programObject);
return FALSE;
}
userData->programObject = programObject;
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glViewport(0, 0, esContext->width, esContext->height);
glClear(GL_COLOR_BUFFER_BIT);
glUseProgram(userData->programObject);
userData->angle = 0.0f;
userData->start = time(NULL);
userData->ProjMatrix = PVRTMat4::Perspective(ratio*2.0f, 2.0f, 3.0f, 7.0f, PVRTMat4::eClipspace::OGL, false, false);
userData->ViewMatrix = PVRTMat4::LookAtLH(PVRTVec3(0.0f, 0.0f, -3.0f), PVRTVec3(0.0f, 0.0f, 0.0f), PVRTVec3(0.0f, 1.0f, 0.0f));
return TRUE;
}
void Draw(ESContext *esContext)
{
GLfloat vVertices[] = {0.0f, 0.5f, 0.0f,
-0.5f, -0.5f, 0.0f,
0.5f, -0.5f, 0.0f};
GLint MVPHandle;
double timelapse;
PVRTMat4 MVPMatrix = PVRTMat4::Identity();
UserData* userData = (UserData *)esContext->userData;
timelapse = difftime(time(NULL), userData->start) * 1000;
if(timelapse > 16.0f) //Maintain approx 60FPS
{
if (userData->angle > 360.0f)
{
userData->angle = 0.0f;
}
else
{
userData->angle += 0.1f;
}
}
userData->ModelMatrix = PVRTMat4::RotationY(userData->angle);
MVPMatrix = userData->ViewMatrix * userData->ModelMatrix;
MVPMatrix = userData->ProjMatrix * MVPMatrix;
MVPHandle = glGetUniformLocation(userData->programObject, "MVPMatrix");
glUniformMatrix4fv(MVPHandle, 1, FALSE, MVPMatrix.ptr());
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, vVertices);
glEnableVertexAttribArray(0);
glClear(GL_COLOR_BUFFER_BIT);
glDrawArrays(GL_TRIANGLES, 0, 3);
eglSwapBuffers(esContext->eglDisplay, esContext->eglSurface);
}
PVRTMat4::Perspective(ratio*2.0f, 2.0f, 3.0f, 7.0f, PVRTMat4::eClipspace::OGL, false, false); puts the near clipping plane at 3.0f units away from the camera (via the third argument).
PVRTMat4::LookAtLH(PVRTVec3(0.0f, 0.0f, -3.0f), PVRTVec3(0.0f, 0.0f, 0.0f), PVRTVec3(0.0f, 1.0f, 0.0f)); places the camera at (0, 0, -3), looking back at (0, 0, 0).
You generate a model matrix directly using PVRTMat4::RotationY(userData->angle); so that matrix does no translation. The triangle you're drawing remains positioned on (0, 0, 0) as per its geometry.
So what's happening is that the parts of the triangle that get closer to the camera than 3 units are being clipped by the near clipping plane. The purpose of the near clipping plane is effectively to place the lens of the camera relative to where the image will be perceived. Or it's like specifying the distance from user to screen, if you prefer.
You therefore want either to bring the near clip plane closer to the camera or to move the camera further away from the triangle.
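Expressed with the Android Matrix helpers used elsewhere on this page rather than the PowerVR SDK (the values are illustrative, not taken from the question), the two fixes look roughly like this:

    // Option 1: bring the near plane closer, so the rotating triangle
    // never crosses it (the question effectively uses a near plane 3 units out).
    Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, -1.0f, 1.0f,
            /* near = */ 0.5f, /* far = */ 10.0f);

    // Option 2: keep the near plane, but move the eye further back
    // (e.g. z = -6 instead of z = -3), so the geometry stays beyond
    // the near plane while it rotates.
    Matrix.setLookAtM(mViewMatrix, 0,
            0.0f, 0.0f, -6.0f,   // eye
            0.0f, 0.0f,  0.0f,   // center
            0.0f, 1.0f,  0.0f);  // up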
