Solar System using OpenGL ES 2.0 and GLSL 1.0

I am trying to implement the classic solar system (Sun and Earth only, cubes in place of spheres) using OpenGL ES 2.0 and GLSL 1.0. I don't understand how to write the translation and rotation matrices to get the Earth cube revolving around the Sun, or what the order of the matrix multiplications should be.
I am doing all the matrix operations in the vertex shader and have the two cubes rotating about the x and y axes respectively,
but I can't get the Earth cube to revolve around the Sun cube :-(

First you have to understand the matrices involved (OpenGL ES 1.x is easier if you don't know them well yet). The matrices below are written in the row-vector convention, with the translation in the bottom row, which matches OpenGL's column-major memory layout.
1. The translation matrix is
1 0 0 0
0 1 0 0
0 0 1 0
x y z 1
Vary the x and z values to move the Earth along its orbit.
2. The rotation matrix (here about the z axis, with c = cos(angle), s = sin(angle)) is
 c s 0 0
-s c 0 0
 0 0 1 0
 0 0 0 1
which mixes the x and y axes.
Then do the matrix operations in your application code (not in the shader code) and just pass the resulting matrix to each object's shader uniforms.
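A minimal NumPy sketch of this composition (the names and angles are illustrative; it uses the column-vector convention, v' = M @ v, so the translation sits in the last column rather than the bottom row):

```python
import numpy as np

def translation(x, y, z):
    """4x4 translation matrix (column-vector convention: v' = M @ v)."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

def rotation_y(angle):
    """4x4 rotation about the y axis, angle in radians."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[ c, 0, s, 0],
                     [ 0, 1, 0, 0],
                     [-s, 0, c, 0],
                     [ 0, 0, 0, 1]])

# Earth's model matrix: spin the cube on its own axis first, then push it
# out to the orbit radius, then rotate the whole thing around the Sun.
orbit_angle = np.radians(90.0)
spin_angle  = np.radians(45.0)
radius      = 5.0

earth_model = rotation_y(orbit_angle) @ translation(radius, 0, 0) @ rotation_y(spin_angle)

# The Earth's origin ends up on a circle of the orbit radius around the Sun:
pos = earth_model @ np.array([0.0, 0.0, 0.0, 1.0])
```

Upload `earth_model` to the shader's model-matrix uniform each frame, advancing `orbit_angle` and `spin_angle` over time.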

Related

threejs: How to perform rotation of object around its local axes?

I am trying to rotate an object on its local axes using absolute rotation values. I am facing two issues:
1. It looks like my object is rotating around the world axes. For example, set X, Y and Z to 45 degrees and try changing X.
2. If Y is set to 90 degrees, I run into the gimbal lock issue.
I am setting the rotation with a quaternion as follows:
cube.quaternion.setFromEuler(new THREE.Euler(x, y, z));
A demo snippet is on JSFiddle.
I came across the article ThreeJS - rotation around object's own axis, but that didn't help me.
I have been trying different solutions using Euler angles and matrices but still haven't been able to make it work.
I'm looking for help figuring out a solution to this problem.
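The world-axis versus local-axis behavior comes down to which side of the current orientation the delta rotation is multiplied on. A hedged NumPy sketch with hand-rolled quaternion math ([w, x, y, z] order; the names and angles are illustrative):

```python
import numpy as np

def quat_from_axis_angle(axis, angle):
    """Unit quaternion [w, x, y, z] for a rotation of `angle` radians about `axis`."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    return np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))

def quat_mul(a, b):
    """Hamilton product a * b, both in [w, x, y, z] order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

q  = quat_from_axis_angle([0, 1, 0], np.radians(45))   # current orientation
dq = quat_from_axis_angle([1, 0, 0], np.radians(10))   # extra rotation about x

world = quat_mul(dq, q)  # rotates about the *world* x axis
local = quat_mul(q, dq)  # rotates about the object's *local* x axis
```

Because quaternion multiplication is not commutative, the two results differ; in three.js the analogous calls would be premultiplying versus postmultiplying the object's quaternion.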

obtaining Q matrix from matlab stereoparams

I am working on 3D image reconstruction using a stereo camera. I started with OpenCV 3.2 and Visual Studio. I was unable to correctly register two point clouds from two scenes with an overlap, so I suspect the Q matrix obtained from the camera calibration process. I therefore redid the calibration using the MATLAB calibrator app. I want to manually create the Q matrix from the calibration parameters obtained from MATLAB and then use it in OpenCV. I found from this post how to create a Q matrix. The problem is that I don't know which focal length to use in this matrix. MATLAB provides the calibration parameters in a stereoParams object, which contains camera parameters for each sensor separately, so I have fx and fy from camera 1 and fx and fy from camera 2. How do I obtain a single focal length for the stereo camera?
As reported here, fx and fy are expressed in pixels.
F, the focal length in world units (typically millimeters), can be computed as
F = fx * px or F = fy * py,
where px and py are the size of a pixel along x and y, respectively, in world units per pixel:
px = image sensor width [mm] / image width [pixel]
py = image sensor height [mm] / image height [pixel].
This is the Q matrix, a.k.a. the reprojection matrix.
Since you have the camera intrinsic and extrinsic matrices, just fill it in accordingly.
Take note that Tx should be in mm.
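As a sketch, the Q matrix can be assembled by hand like this, following the layout OpenCV's `stereoRectify` produces. All numbers below are illustrative placeholders, not values from your calibration; the form assumes rectified cameras, where a single f (in pixels) is shared by both views and Tx is the baseline:

```python
import numpy as np

# Illustrative calibration numbers -- substitute your own from stereoParams.
f  = 800.0              # focal length in pixels (rectified cameras share one value)
cx, cy = 320.0, 240.0   # principal point of the left camera, in pixels
cx2 = 325.0             # principal point x of the right camera (often cx2 == cx)
Tx = -60.0              # x translation between the cameras, in mm

Q = np.array([
    [1, 0, 0,       -cx],
    [0, 1, 0,       -cy],
    [0, 0, 0,         f],
    [0, 0, -1/Tx, (cx - cx2)/Tx],
])

# Reprojection: [X Y Z W]^T = Q @ [x y disparity 1]^T, 3-D point = (X/W, Y/W, Z/W)
x, y, d = 400.0, 260.0, 16.0
X, Y, Z, W = Q @ np.array([x, y, d, 1.0])
point = (X / W, Y / W, Z / W)   # in the same world units as Tx (mm here)
```

Before rectification the two cameras generally have different fx/fy values; a common pragmatic choice is to average them, but the clean route is to let the rectification step compute the shared focal length for you.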

Gimbal lock at y axis 90 degrees

I've run into a problem with rotations.
Using the transform controls helper, if I rotate a mesh on the Y axis, when I reach 90 degrees everything is flipped by -180 degrees.
I think this is the software avoiding gimbal lock; how can I avoid this behavior?
That is, I would like the x and z angles to remain 0 degrees in the display.
I tried even on the threejs editor (https://threejs.org/editor/) and it occurs even there.
Please help me :)!
What you are describing has nothing to do with gimbal lock.
three.js is quaternion-based. An equivalent Euler angle representation is provided for convenience.
Euler angles are not unique; there are many Euler angles that represent the same orientation. See this answer for info on how Euler angles work in three.js.
If you want to rotate an object on the y-axis only, and have object.rotation.y be continuous, you can do so by changing the rotation order like so:
object.rotation.order = 'YXZ';
three.js r.87
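To see why the readout flips even though the orientation changes smoothly, here is a small NumPy check showing two different XYZ Euler triples producing the same rotation matrix (the rotation-matrix helpers are standard definitions, not three.js code):

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

pi = np.pi
# Euler angles (180, 0, 180) and (0, 180, 0), both composed in XYZ order:
R1 = rot_x(pi) @ rot_y(0.0) @ rot_z(pi)
R2 = rot_x(0.0) @ rot_y(pi) @ rot_z(0.0)
# R1 and R2 are the same orientation, written as two different Euler triples.
```

When the internal quaternion is converted back to Euler angles for display, either triple is a valid answer, which is why the x and z readouts can jump by 180 degrees.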

webgl - model coordinates to screen coordinates

I'm having an issue with calculating the screen coordinates of a vertex. This is not specifically a WebGL issue, more a general 3D graphics issue.
The sequence of matrix transformations I'm using is:
result_vec4 = perspective_matrix * camera_matrix * model_matrix * vertex_coords_vec4
with model_matrix being the transformation of a vertex from its local coordinate system into the global scene coordinate system.
My understanding is that the final result_vec4 is in clip space, so it should be in the [-1, 1] range. That is not what I'm getting: result_vec4 just ends up containing values that don't correspond to the correct screen position of the vertex.
Does anyone have any ideas as to what might be the issue here?
Thank you very much for any thoughts.
result_vec4 is in clip space; to get normalized device coordinates you need to project it onto the hyperplane w = 1 using:
result_vec4 /= result_vec4.w
After this perspective division, result_vec4.xyz will be in [-1, 1].
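Continuing from the division above, a small Python sketch of the full clip-space-to-pixel path (the window size and the top-left-origin y flip are assumptions; GL's own window coordinates are bottom-up):

```python
import numpy as np

def clip_to_screen(clip, width, height):
    """Perspective divide, then a viewport transform to pixel coordinates."""
    ndc = clip[:3] / clip[3]                      # each component now in [-1, 1]
    sx = (ndc[0] * 0.5 + 0.5) * width             # NDC x -> pixel column
    sy = (1.0 - (ndc[1] * 0.5 + 0.5)) * height    # flip y for a top-left origin
    return sx, sy, ndc[2]                         # ndc z is the depth value

# Example clip-space result from the shader's matrix chain:
clip = np.array([2.0, -1.0, 0.5, 2.0])
sx, sy, depth = clip_to_screen(clip, 800, 600)
```

If result_vec4 is far outside [-1, 1] even after dividing by w, the usual suspects are a wrong multiplication order or a perspective matrix built with swapped near/far or aspect parameters.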

Algorithm for converting Latitude and Longitude to World Coordinates (OpenGL)

I have a set of latitude and longitude co-ordinates which are to be rendered in a GL program I'm working on. I have information such as the number of units per latitude and longitude degree (I have the co-ordinates in decimal degrees, eg, 27.1234) - for example, 15 units per longitude, and 10 units per latitude. However, I've had problems with rendering.
This calculation gives me locations I can use for rendering, but it's not perfect. I initially tested only with co-ordinates such as S026.33.01.806 E148.46.27.009, and when I switched to co-ordinates such as N039.52.19.030 W075.14.28.107 the rendering ended up upside-down and horizontally flipped.
It may be a fundamental lack of understanding of OpenGL and how it interprets co-ordinates, or perhaps I'm approaching the problem the wrong way. I'm using Python and PyOpenGL, but I presume this algorithm is something that can be done without a specific language requirement.
EDIT: I have uploaded the code that seems to be most relevant to http://slexy.org/view/s21LKiD9tj.
Erm, the number of units per longitude degree is not constant (it shrinks toward the poles)? What strange kind of conversion function are you using?
Assuming the earth to be spherical with radius r, centered in the root of the coordinate system, with z-axis pointing north, x-axis pointing towards longitude 0, and y-axis pointing towards longitude 90, you can get cartesian coordinates as follows:
x = r * cos(latitude) * cos(longitude);
y = r * cos(latitude) * sin(longitude);
z = r * sin(latitude);
Note: If your language's trigonometric functions expect the arguments to be specified in radians rather than degrees, be sure to convert them first.
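A small Python version of these formulas, with the degree-to-radian conversion included:

```python
import math

def latlon_to_cartesian(lat_deg, lon_deg, r=1.0):
    """Spherical earth -> cartesian; z toward the north pole, x toward lon 0."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    x = r * math.cos(lat) * math.cos(lon)
    y = r * math.cos(lat) * math.sin(lon)
    z = r * math.sin(lat)
    return x, y, z

origin = latlon_to_cartesian(0, 0)    # equator at longitude 0 -> (1, 0, 0)
pole = latlon_to_cartesian(90, 0)     # north pole -> (~0, 0, 1)
```

Southern latitudes and western longitudes are simply negative degrees, which is what keeps the rendering from flipping between hemispheres.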
