Blender to Unity rotations screwed up

I managed to print rotations to the console in Blender, but when I try to apply them in Unity, the result is just wrong. I am using Quaternion.Set
to set the desired rotation. I know that Blender uses (W, X, Y, Z) quaternion order, but even when I take these values and pass them to Unity3D in (X, Y, Z, W) order, I get nonsense rotations.
http://pastebin.com/bKzUVCih
Here is the link to my script. Please help me figure out what is wrong there.
P.S.: Euler rotations are not an option, because they're lossy as far as I know...

I have solved this problem to my satisfaction. The problem is that the XYZ rotations of Unity are different from those of Blender. If you wish to convert the positioning and rotation of an object perfectly from Blender to Unity, use the following steps:
1. Rotate the object -90 degrees on the X axis.
2. Calculate the Blender quaternion. Remember that the Blender quaternion comes out in WXYZ order, while a Unity quaternion uses XYZW order.
3. Rotate your object back 90 degrees on the X axis, then export to FBX using the experimental "apply transform" checkbox.
4. Translate your object in Unity using the translation from Blender, BUT use the equivalent of (-X, Z, -Y) for your translation.
If you're looking for a Python script to get the quaternion out of Blender, I've put together one and put most of it here.
I might as well put together a YouTube clip on this; it's simple but ridiculously hard to figure out.
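For illustration only, here is a minimal sketch of the Blender side of those steps, meant to be run in Blender's Python console. It is an assumption-laden sketch, not the script linked above, and it assumes the object you care about is the active object (Blender 2.8+ uses @ for quaternion multiplication; older versions use *):

    import math
    import bpy
    from mathutils import Quaternion

    obj = bpy.context.active_object      # assumes your object is active
    obj.rotation_mode = 'QUATERNION'

    # Step 1: rotate the object -90 degrees on the X axis.
    x_minus_90 = Quaternion((1.0, 0.0, 0.0), math.radians(-90.0))
    obj.rotation_quaternion = x_minus_90 @ obj.rotation_quaternion

    # Step 2: read Blender's (W, X, Y, Z) quaternion and reorder it to
    # Unity's (X, Y, Z, W) for Quaternion.Set on the Unity side.
    w, x, y, z = obj.rotation_quaternion
    print("Unity Quaternion.Set(%f, %f, %f, %f)" % (x, y, z, w))

    # Step 4's translation mapping, (-X, Z, -Y):
    bx, by, bz = obj.location
    print("Unity position: (%f, %f, %f)" % (-bx, bz, -by))

    # Step 3: rotate back +90 degrees on X before exporting to FBX with
    # the experimental "apply transform" checkbox.
    obj.rotation_quaternion = x_minus_90.inverted() @ obj.rotation_quaternion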

Related

Object Rotation in Three.js / Threemap

I am trying to create a 3D Visualization of an RC airplane in Threebox. The RC plane sends live telemetry, including:
GPS Coordinates
Gyro sensor data, showing the pitch, roll and heading of the plane.
I have loaded a model of an airplane in Threebox with no problems.
My problem comes down to the rotation of the plane. I want the plane object to represent the current orientation of the RC plane. Since I have live telemetry from the flight controller, this should be possible.
In the documentation, I found this method, which seemed like exactly what I needed:
plane.setRotation({x: roll, y: pitch, z: yaw/heading})
And it basically works. I can rotate the plane around its axes, but things get messed up when I combine the rotations.
For example: when I just update the roll axis, the object behaves just like I want it to. However, when I change the heading of the plane by 90 degrees, the roll axis suddenly becomes the pitch axis. It seems to me that the axes of the plane object don't rotate with the plane itself.
I've prepared a recreation of the issue on jsfiddle. You can change the heading of the plane using the slider in the bottom right.
I've been stuck on this for days, would be super happy for any help!
There are lots of issues with your jsfiddle that prevent it from running. To isolate an issue and make it easier to test, you should eliminate as many variables as possible - you're using two third-party libraries that play a big hand in how transformations behave, particularly Threebox.
I would recommend sticking with three.js's built-in transformation tools unless you specifically need lat/lng transformations, or other transformations to move between a local Cartesian space and a global coordinate system. In this case, a very basic plane.setRotationFromEuler(new THREE.Euler(yaw, pitch, roll)) should do the trick. Be aware of how much the order of Euler rotations can affect the outcome, and that three.js uses radians for all its rotations, not degrees.
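Just to illustrate that point about ordering (this is Python with SciPy standing in for the three.js math, not Threebox code): the same three angles produce different rotations depending on the order in which they are applied, and everything is in radians.

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    yaw, pitch, roll = np.radians([90.0, 30.0, 10.0])  # radians, not degrees

    # The same angles applied in two different orders.
    zyx = R.from_euler("ZYX", [yaw, pitch, roll])  # yaw, then pitch, then roll
    xyz = R.from_euler("XYZ", [roll, pitch, yaw])  # roll, then pitch, then yaw

    # The resulting rotations differ, which is why the roll axis seems to
    # "become" the pitch axis after a 90-degree heading change.
    print(np.allclose(zyx.as_matrix(), xyz.as_matrix()))  # False: order matters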

Animation rotation in Unity3D is not as accurate as it is in Maya

I have a turn-on-spot animation that turns the player 61 degrees in Maya. But when I import the animation into Unity and apply it through an animator controller to a character, it turns the character 56 degrees. Why are the turn angles different?
Maya and some other 3D packages can export their models with the Z axis facing upward. Standard scripts in Unity assume the Y axis represents up in your world. It is better to fix the rotation in Unity than to modify the scripts to make things fit.
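As a toy illustration of the axis mismatch (Python for the math; the simple swap below is one common convention and an assumption, not the exact behavior of Unity's FBX importer, which also has to account for handedness):

    import numpy as np

    def z_up_to_y_up(v):
        # Map a Z-up vector (x, y, z) to Y-up as (x, z, y).
        x, y, z = v
        return np.array([x, z, y])

    # The Z-up "up" direction becomes Unity's Y-up "up":
    print(z_up_to_y_up(np.array([0.0, 0.0, 1.0])))  # [0. 1. 0.]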
How do I fix the rotation of an imported model?
The same problem applies to orientation in Unity:
Rotation and Orientation in Unity
See also Euler vs Quaternion and The order of transformation in Maya.

Compose two rotations in D3 geo projection?

I'm having fun with the D3 geo orthographic projection, building an interactive globe based on all the great examples I found.
You can see my simple mockup at http://bl.ocks.org/patricksurry/5721459
I want the user to manipulate the globe like a trackball (http://www.opengl.org/wiki/Trackball). I started with one of Mike's examples (http://mbostock.github.io/d3/talk/20111018/azimuthal.html) and improved it slightly to use canvas coordinates and to express the mouse locations in 'trackball coordinates' (i.e. rotation around the canvas's horizontal and vertical axes), so that a fixed mouse movement gives more rotation near the edges of the globe (and works outside the globe if you use the hyperbolic extension explained in the link above), rather than Mike's one-to-one correspondence.
It works nicely when the globe starts in an unrotated position (north pole vertical), but when the globe is already rotated (manipulate the example so the north pole faces out of the page), the trackball controls become non-intuitive, because you can't simply express a change in trackball coordinates as a delta in the d3.geo.rotate lat/lon coordinates. D3's 3-axis rotation applies a longitude rotation (spin around the north pole), then a latitude rotation (spin around a horizontal axis in the canvas plane), and then a 'yaw' rotation (spin around an axis perpendicular to the plane) - see http://bl.ocks.org/mbostock/4282586.
I guess what I need is a method for composing my two rotation matrices (the one currently in the projection, and a new one to rotate the trackball slightly), but I can't see a way to do that in D3, other than digging into the source (https://github.com/mbostock/d3/blob/master/src/geo/rotation.js) and trying to do the math to derive the rotation matrix. The code looks elegant but is comment-free, and I'm not sure I can correctly decipher the closures with the orthographic projection instance.
On the last point, if someone knows the rotation matrix form of d3.geo.projection, that would probably solve my problem too.
Any ideas?
There is an alternative to patricksurry's solution, using quaternion representations, as inspired by Jason Davies. I, too, thought D3 would already support this composition natively, and hoped Jason Davies had posted his code...
It took some time to figure out the math. A demo is uploaded here, with an attempt to explain the math too: http://bl.ocks.org/ivyywang/7c94cb5a3accd9913263
With my limited math knowledge, I think one of the advantages of quaternions over Euler angles is the ability to compound multiple rotations over and over without worrying about coordinate references. So it always works, no matter where your north pole faces and no matter how many rotations you apply. (Someone please correct me if I got this wrong.)
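As a sketch of why that works (NumPy, illustrative only, not the demo's code): composing two rotations is just quaternion multiplication, and nothing in it depends on where the north pole currently points.

    import numpy as np

    def quat_mul(a, b):
        # Hamilton product of quaternions in (w, x, y, z) order.
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return np.array([
            aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw,
        ])

    def quat_from_axis_angle(axis, angle):
        axis = np.asarray(axis, dtype=float)
        axis /= np.linalg.norm(axis)
        return np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])

    # The globe's current orientation, then a small trackball drag,
    # compounded into a single rotation.
    current = quat_from_axis_angle([0, 0, 1], np.radians(45))  # already spun 45 degrees
    drag = quat_from_axis_angle([1, 0, 0], np.radians(5))      # small trackball delta
    combined = quat_mul(drag, current)
    print(combined)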
I decided that solving for the combined rotation matrix might not be so hard. I got http://sagemath.org to do most of the hard work, so that I could express the composition of the original projection's rotate() orientation plus a trackball rotation as a single equivalent rotate().
This gives much more natural behavior regardless of the orientation of the globe.
I updated the mockup with the improved version - see http://bl.ocks.org/patricksurry/5721459, which also includes an explanation of the math. It's cool that you can use proper Greek letters in JavaScript for almost-readable math source code!
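For a feel of the matrix version, here is a NumPy sketch; the Z-Y-X convention below is my assumption standing in for D3's [lambda, phi, gamma] order, not a reading of D3's source:

    import numpy as np

    def rot_x(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

    def rot_y(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

    def rot_z(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

    def compose(a, b):
        # Combine two (lambda, phi, gamma) triples, in degrees, into one
        # equivalent triple by multiplying their rotation matrices and
        # decomposing the product (standard Z-Y-X Euler extraction).
        def matrix(lam, phi, gam):
            return rot_z(lam) @ rot_y(phi) @ rot_x(gam)
        m = matrix(*np.radians(b)) @ matrix(*np.radians(a))
        lam = np.arctan2(m[1, 0], m[0, 0])
        phi = np.arcsin(-m[2, 0])
        gam = np.arctan2(m[2, 1], m[2, 2])
        return np.degrees([lam, phi, gam])

    # Current orientation plus a small trackball drag, as one rotate() triple.
    print(compose([45, 30, 0], [0, 5, 0]))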
It would still be good if D3 supported composition of rotate operations natively (or maybe it does already?!)

The Big Rotation / Orientation Quest in Unity

I'm working on a custom Blender-to-Unity exporter/importer (a school project), and I have a big problem: the rotation.
I can get the Euler rotations and/or quaternions from Blender, but when I apply them to one of Unity's objects, it totally messes up the rotation.
I've already tried swapping the Y and Z coordinates, but that doesn't seem to work either.
I tried applying a rotation matrix, but that isn't directly possible in Unity... if we can somehow figure it out, maybe that would be the solution?
I need to find a universal solution as soon as possible.
Thanks.
... for a further description, please read the comments below.

Basic approach to pupil constriction/dilation of eye model in OpenGL

I'm new to OpenGL ES and looking for the best approach for creating a realistic model of an eye whose pupil can dilate and constrict, so that I have a plan in mind while running through tutorials.
I've made a mesh in Blender that is basically a sphere with a hole (the 'pole', or central vertex, is removed, along with a couple of the surrounding circular edge loops).
I plan to add an iris texture directly to the sphere's polys surrounding the hole.
To change the pupil size, do I just need a function that repositions the vertices around the hole so that it dilates or contracts?
I'm going to use OpenGL within an Objective-C app. I have Jeff Lamarche's Objective-C export script. Is it standard to export only the mesh from Blender and add textures in code later in Xcode? Or is it easier/better to set up the textures on the meshes in Blender first and export the more finished product's data to Xcode?
Your question is a bit old, so I'm not sure how much progress you've made, but as I've been climbing up the learning curve myself I thought I'd take a shot at answering.
If you want to animate the individual vertices of your model, I believe the method you'll want is vertex skinning. I can't say much on that front as I haven't yet had reason to experiment with it, although it's a technique only available in OpenGL ES 2.0. (That's probably where you want to start anyway; the increased flexibility over 1.1 is more than worth any added steepness in the learning curve.)
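Setting skinning aside, here is a toy sketch of the simpler CPU-side idea the question describes - repositioning the ring of vertices around the hole radially about the pupil's center (Python for the math; the data layout and names are assumptions):

    import numpy as np

    def set_pupil_size(vertices, ring_indices, center, scale):
        # Move the hole's ring vertices radially about the pupil center.
        # scale > 1.0 dilates the pupil; scale < 1.0 constricts it.
        out = vertices.copy()
        ring = out[ring_indices]
        out[ring_indices] = center + (ring - center) * scale
        return out

    # Example: four vertices ringing a pupil centered at (0, 0, 1).
    verts = np.array([[0.1, 0.0, 1.0], [0.0, 0.1, 1.0],
                      [-0.1, 0.0, 1.0], [0.0, -0.1, 1.0]])
    print(set_pupil_size(verts, [0, 1, 2, 3], np.array([0.0, 0.0, 1.0]), 1.5))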
The answer to your texturing question is somewhat mixed. You'll need to actually apply the texture in OpenGL, but what Blender can do for you is determine the texture coordinates. Each vertex of your mesh will have a texture coordinate associated with it: an (X, Y) pair that maps to a location on the texture image. The coordinates range from 0.0 to 1.0 - so, since your image texture is a rectangle, the texture coordinate {0, 0} maps to the bottom-left corner, {1, 1} maps to the top-right corner, and {0.5, 0.5} maps to the exact center of the image.
So in Blender, you'd want to go ahead and texture the object with UV mappings. When you export, although the exported mesh won't contain any of the image content, it will retain the texture coordinates that map into your image. This lets you apply the texture in OpenGL so that it appears the same way it did in Blender.
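As a tiny sketch of the kind of data that comes across (the layout below is illustrative, not Lamarche's actual format): each vertex carries a position plus the 0-to-1 UV coordinates described above.

    # Per-vertex positions with 0..1 UV texture coordinates, the data an
    # exporter carries over from Blender's UV mapping.
    quad_vertices = [
        # (x, y, z)          (u, v)
        ((-1.0, -1.0, 0.0), (0.0, 0.0)),  # bottom-left of the image
        (( 1.0, -1.0, 0.0), (1.0, 0.0)),  # bottom-right
        (( 1.0,  1.0, 0.0), (1.0, 1.0)),  # top-right
        ((-1.0,  1.0, 0.0), (0.0, 1.0)),  # top-left
    ]
    for pos, uv in quad_vertices:
        print(pos, "->", uv)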
I've personally had some trouble getting Jeff Lamarche's script to spit out the texture coordinates, as the Blender API seems to change significantly with each release. I've had more success with an .obj converter, so I've been exporting from Blender to .obj and using a command-line tool to go from .obj to a C header file.
If you encounter similar problems with Lamarche's script, this post might help solve it: http://38leinad.wordpress.com/2012/05/29/blender-2-6-exporting-uv-texture-coordinates/
And this is a good resource for a .obj to .h script:
http://heikobehrens.net/2009/08/27/obj2opengl/