I made a simple cubic character and an armature for it in Blender. Then I made some animations (a walk and three aim animations) to use in a blend tree. In Blender everything looks good, but when I import these animations into Unity, I notice that the rotation of the arms is different. Let me show my problem with some screenshots. This is how the model looks in Blender:
And when I import this model into Unity it looks like this:
I believe you can see the rotation difference between the upper arms. I couldn't find what is wrong. Does anyone know what I am doing wrong?
It's possible that the problem is an animation keyframe setting in Blender. I would scroll through the Blender timeline on that animation and see if anything appears wrong.
Also, this Unity script could help with rotation conversion between the Blender and Unity coordinate systems (just in case that is the problem): http://wiki.unity3d.com/index.php/FixBlenderImportRotation
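In case that wiki page is unavailable, here is a minimal sketch of the same idea, assuming all you need is a one-time corrective rotation on the imported object at startup; the component name and the -90° X angle are my assumptions, not the wiki script itself:

```csharp
using UnityEngine;

// Minimal sketch (not the wiki script): applies a one-time corrective rotation
// to an object imported from Blender. Blender is Z-up/right-handed and Unity is
// Y-up/left-handed, so imported objects often show up rotated about the X axis.
public class FixBlenderImportRotation : MonoBehaviour
{
    // Assumed correction; tweak the angle/sign to match what you see in the editor.
    public Vector3 correctionEuler = new Vector3(-90f, 0f, 0f);

    void Awake()
    {
        // Apply the correction in world space on top of the imported rotation.
        transform.rotation = Quaternion.Euler(correctionEuler) * transform.rotation;
    }
}
```

Attach it to the root of the imported model and check whether the arms line up again.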
I have been trying to figure out how to solve this problem for a week.
I am kind of new to Unity and I'm trying to curve an image around a round shape.
In my example, the player imports an image from their hard drive, and this image is then curved to cover a bottle I created in Blender. The image should cover the widest part of the bottle, like the bottom part.
Obviously, the image's texture type is changed to 2D and UI.
I have tried several things, such as downloading a script that curves a UI Image, but I later realized the image can't be a UI element, because it needs to appear in the scene and be manipulated later by the player. I would also like to add text that "covers" the bottle in a curved way, but that is another problem.
So if anyone knows how I could curve an image in Unity, thank you in advance.
PS: I haven't posted any code because I haven't done anything in the project yet other than the camera movement, so there is nothing interesting to show. I am just trying to figure out how to do this before continuing with anything else.
When I import my model (.fbx) into Substance Painter to paint a texture and export it to the .gltf format, I find that the metal I painted is gone and replaced by shiny black. Then, when I load the exported model in the three.js/examples/webgl_load_gltf.html example, I find that my model is dark; I need to add a directional light, and even then I can only see a small highlight, so the effect is not very good. Inside Substance Painter, however, the model does not have either problem.
There it looks as good as the model in the original example.
I would like to know what causes these two problems.
The problem is that your demo has only a single instance of HemisphereLight. The materials of your meshes are instances of MeshStandardMaterial with a metalness value of 1.0. Since hemisphere lights are sources of indirect diffuse light, they are not reflected by pure metals. There is more information about this topic at: https://github.com/mrdoob/three.js/issues/9228
You should use a different lighting setup, e.g. add a directional light, or try to fix the material settings in your model.
DUDE I JUST WENT THROUGH THIS, it was a nightmare. I had a very shiny gold car with windows and all types of shit going on. Looked great as a USDZ! Then tried to export GLTF. Black.
Turned off all channels except normal and base color, looked fine.
Then I went and turned on metallic, and again: black.
Then I realized, "Oh, I didn't bake my mesh maps," like an idiot. Boom: metallic, roughness and opacity all suddenly worked.
Always ALWAYS bake your mesh maps. Substance is pointless without it.
I know this is 2 years old, but apparently all the early adopters of Substance figured this out long ago, so this is for you, random person just learning this stuff. AR is only going to become more in demand; thank God I know how to make it cross-platform now.
I’m a student using Maya for the first time to try to make a ‘proof of concept’ for my future research work. I’m a newbie and I’ve been struggling to figure this out, so any help would be really appreciated.
I’m trying to animate an effect similar to the ‘space-time’ images you often see (Fig 1). I’d like to have a plane, and when a sphere (planet) is animated across it, the plane bends as shown in Fig 1. I’ve gone through all the deformers in Maya and I just can’t see how to do this. I found iCollision, which can do something similar, but it's not quite right (it's more of a footprint than a curve on the plane). The soft manipulation deformer in Maya can make something like Fig 2, which would work great, but I’m not sure you can animate it (if I try, it just moves the apex of the deformation, not the whole deformation along with the sphere). Ideally the deformation depth should be defined by the size or some other attribute of the sphere as it moves along.
Could anyone help or suggest how I can do this?
Thank you!
Fig 1
Fig 2
Sorry guys! I am super new to this, and I'm trying to teach myself Unity. I have very limited knowledge of Maya from a 3D modeling course I took a year ago in college, and I have made a very simple object and animated it using keyframes (2 keyframes, to be exact) in Maya, but I can't seem to figure out how to get that animation into Unity.
I've been playing around with this for hours now, and all the tutorials I find online are for really complex objects like people, with joints and such going on. Are joints required for an animation to work in Unity?
I've saved my Maya file into my Assets folder, but I don't see any animations in the import settings in Unity.
Here is my object:
And it animates to this position:
I'm not sure, but I think Unity won't import animations from Maya. Lights are not imported either. I tested a comparable scenario a couple of weeks ago with a camera and a motion path, but the animation was not imported.
Unity is a game engine and not a 3D editing program.
Correct me if I'm wrong.
I managed to print the rotations to the console in Blender, but when I try to apply them in Unity, the result is just very wrong. I am using Quaternion.Set
to set the desired rotation. I know that Blender uses (W, X, Y, Z) quaternion order, but when I take these values and set them properly into Unity3D's (X, Y, Z, W) order, I get nonsense rotations.
http://pastebin.com/bKzUVCih
Here is the link to my script. Please help me figure out what is wrong there.
P.S.: Euler rotations are not an option, because they're lossy as far as I know...
I have solved this problem to my satisfaction. The problem is that Unity's axes and rotation conventions are different from Blender's. If you wish to convert the position and rotation of an object perfectly from Blender to Unity, use the following steps (a minimal code sketch follows the list):
Rotate the object -90 degrees on the X axis.
Calculate the Blender quaternion. Remember that Blender stores its quaternions in WXYZ order, while Unity uses XYZW order.
Rotate your object back 90 degrees on the X axis, then export to FBX with the experimental "Apply Transform" checkbox enabled.
Position your object in Unity using the translation from Blender, BUT remap it to the equivalent of (-X, Z, -Y).
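To make steps 2 and 4 concrete, here is a minimal C# sketch; the class and method names are mine, not from the linked script. The translation remap is exactly step 4, and the quaternion remap shown is the one consistent with that same (-X, Z, -Y) axis change, though the exact signs can vary with your export settings:

```csharp
using UnityEngine;

// Minimal sketch of the Blender -> Unity remapping described in the steps above.
// Treat the signs as a starting point; they depend on your FBX export settings.
public static class BlenderToUnity
{
    // Step 4: a Blender translation (x, y, z) becomes the Unity position (-x, z, -y).
    public static Vector3 ConvertTranslation(float x, float y, float z)
    {
        return new Vector3(-x, z, -y);
    }

    // Step 2: Blender prints quaternions as (w, x, y, z); Unity's constructor takes
    // (x, y, z, w). Under the same (-X, Z, -Y) axis change, the vector part remaps
    // as (x, y, z) -> (x, -z, y) and w stays the same.
    public static Quaternion ConvertRotation(float w, float x, float y, float z)
    {
        return new Quaternion(x, -z, y, w);
    }
}
```

With this remap, for example, a 90° rotation about Blender's Z (up) axis, printed as (w=0.707, x=0, y=0, z=0.707), becomes new Quaternion(0f, -0.707f, 0f, 0.707f), i.e. a rotation about Unity's up axis.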
If you're looking for a Python script to get the quaternion out of Blender, I've put together one and put most of it here
I might as well put a YouTube clip together on this; it's simple, but ridiculously hard to figure out.