Image on Surface of a .obj File in Unity

I'm new to Unity and I want to develop a small racing game. I started to draw the basic parts of the track in an external 3D tool. I saved my designed parts as an .obj file and imported them into Unity. An example looks like this.
What the .obj looks like
The curve has three parts: both walls and the ground between them; the ground is a surface object (it comes from the .obj file). Now I tried to get something like this:
Desired Result
But the only way I found to achieve this look was to create a new GameObject->Image and place it just above the ground; there must be an easier way to get this look without an extra image. I created a material from an image of asphalt and applied it to the surface, but the surface just got darker (it seems to take on the average color of the picture) and didn't show the texture.
When I create a GameObject->Plane and drop the same material onto the plane, it looks just like the picture, which is perfect. So why isn't it working on the surface from my .obj file? Can you please show me the correct way to get this done?

Related

How to fix image texture not transferring to Unity from Blender

I textured an object in Blender because it wouldn't texture properly in Unity, and then imported the object and texture to Unity.
I don't know how to fix this, I'll put both pictures here.
Blender Texture Before Import
Object In Unity
Okay, so based on your screenshots:
You're going to select everything, then add a modifier called "Solidify" and set its thickness to something very small, like 0.03. (Unity doesn't like objects that are just flat planes.)
Double-check that all your normals are facing outward. Let me know if you don't know how to do this.
Go into Edit Mode, select all edges, then right-click and choose "Mark Seam".
Open a UV Editor window (split-screen with Edit Mode on one side and the UV view on the other). On the edit side, select all, then go to the UV dropdown menu and click "Unwrap". You should then see your object unfolded into flat faces over on the UV window side. There are different unwrap options, like Smart UV Project, etc. Plain "Unwrap" has worked for me, but play around; another option may show your object's shapes in a less distorted way.
At this point, since your pattern is basically repeating: if you export the OBJ file, take it into Unity, and add the image file (make sure its dimensions are perfectly square), the model should receive the image as an Albedo texture much better than in your screenshots. You can play with the Tiling and X/Y Offset values until it looks right (you might face issues with rotation, though).
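If you prefer to script the Blender side of this (the Solidify, normals, seam, and unwrap steps above) instead of clicking through the UI, here is a minimal Blender Python sketch. The object name "Track" and the 0.03 thickness are assumptions; adapt them to your scene.

```python
import bpy

# Minimal sketch of the steps above: thin Solidify modifier, outward normals,
# seams, and a plain unwrap. "Track" is a placeholder object name.
obj = bpy.data.objects["Track"]
bpy.context.view_layer.objects.active = obj
obj.select_set(True)

# Add a thin Solidify modifier so Unity doesn't receive a zero-thickness plane
solidify = obj.modifiers.new(name="Solidify", type='SOLIDIFY')
solidify.thickness = 0.03

# Recalculate normals so they all face outward
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.normals_make_consistent(inside=False)

# Mark the selected edges as seams, then unwrap
bpy.ops.mesh.mark_seam(clear=False)
bpy.ops.uv.unwrap()   # plain "Unwrap"; Smart UV Project is an alternative

bpy.ops.object.mode_set(mode='OBJECT')
```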
BUT
If you want to line it up very specifically, you can export the UV layout as a PNG from the UV window in Blender. Then use Photoshop or another image editor to change, rotate, and arrange your texture so the sides line up properly. In Blender's edit window (assuming you still have both the UV view and the editor open), selecting a face highlights the corresponding flat face in the UV window; based on this you should be able to figure out what should be rotated up/down, etc. Then, when you change that 2D image and drag it into Unity, it will adjust and wrap around the object.
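The UV layout export mentioned above can also be scripted; a rough Blender Python sketch, where the output path and image size are placeholders:

```python
import bpy

# Export the current UV layout as a PNG so it can be painted over in an image
# editor. The path ("//uv_layout.png", relative to the .blend file) and the
# size are placeholders.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.export_layout(filepath="//uv_layout.png", mode='PNG',
                         size=(1024, 1024), opacity=0.25)
```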
I'm pretty new to both, but the advice I've been given is not to do the texturing in Blender, but instead to do it in Unity.
This is a 10-month-old post, but in case anyone is curious or struggling with the same thing: Blender exports models with a scale of 100, so you need to scale up the material's tiling (in the material settings) to see the texture.
This is a bad solution, however, because then you are not working with objects at a scale of 1, so what you actually want is to check "Apply Transform" when exporting the FBX model from Blender.
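For reference, a hedged Blender Python sketch of that export; the file path is a placeholder, and the exact option names can vary slightly between Blender versions:

```python
import bpy

# Export to FBX with the scale/transform baked in, so the model arrives in
# Unity at a scale of 1. "//track.fbx" is a placeholder path.
bpy.ops.export_scene.fbx(
    filepath="//track.fbx",
    use_selection=True,                    # export only the selected objects
    apply_scale_options='FBX_SCALE_ALL',   # bake the unit scale into the file
    bake_space_transform=True,             # the "Apply Transform" checkbox
)
```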

Missing textures when loading .obj and .mtl with ThreeJS

I'm working on a simple proof of concept. This involves exporting a simple textured model from Blender into an .obj model file, an .mtl material file, and some .jpg texture images. My model is a single plane with a grass texture on the top of it.
The problem I'm having is that the grass texture isn't loading. I'm getting console messages, and the JS code does appear to load the image file, but it's not showing up. The plane is a green colour, which I don't understand, since I haven't defined colours anywhere. I'm not sure if this is a rendering problem.
I had a look at the examples ThreeJS provides, but they're slightly different from the files I'm getting from Blender. The obj loader example doesn't include any .mtl files. The obj + mtl loader example uses some .dds files for the models, which I don't have. I had a quick look into it, and it has something to do with DirectX.
My example is up at https://www.raydowe.com/three/
Does anyone have an example of how to load .obj, .mtl, and .jpg assets into ThreeJS?
So, this is easy: you need texture coordinates to map the texture onto the vertices. In the *.obj file they are represented by "vt" records.
Here is a working sample: https://github.com/koolkap/Obj-Mtl-Loader
And this is what the required *.obj looks like: https://github.com/koolkap/Obj-Mtl-Loader/blob/master/obj/scene.obj
Feel free to ask if anything is unclear.
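As a quick sanity check that an exported .obj actually contains texture coordinates, you can count its record types; a small Python sketch, where "scene.obj" is a placeholder file name:

```python
from collections import Counter

# Count the record types in an exported .obj. If there are no "vt" lines,
# the mesh has no UVs and no texture can be mapped onto it.
counts = Counter()
with open("scene.obj") as f:
    for line in f:
        parts = line.split()
        if parts:
            counts[parts[0]] += 1

print("vertices:", counts["v"], "uvs (vt):", counts["vt"], "faces:", counts["f"])
if counts["vt"] == 0:
    print("No 'vt' records found: re-export the model with UVs.")
```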

How do I attach text to the vertices of a cube in three.js? Also, can I add text at any point inside the cube?

I am trying to create a cube using three.js for a project. I need to add text to vertices and at different points inside the cube. Any idea how this can be done?
For some basic code examples of using Sprite objects in Three.js, check out:
http://stemkoski.github.com/Three.js/Sprites.html
And for an easy way to create images that contain text to use as your sprite textures, check out the sample code at:
http://stemkoski.github.com/Three.js/Texture-From-Canvas.html
I think a combination of these two ideas will achieve what you are looking to do.
If you want to have label-style text, so that the text begins at a specific point but is always oriented toward the camera and easily readable no matter the camera position, you can use sprites. (Example of canvas-created text label sprites: http://i.imgur.com/e9I68xD.jpg - here they are rendered in a separate pass so that they are never obscured by the scene, but you can do it in the same pass.)
If that's what you are looking for, I'd suggest first checking the sprite examples and learning to attach a static image as a sprite at the correct position in the scene. After you get that working, modify the code so that you draw the text onto a canvas using the standard JavaScript Canvas functions, and use that image as the sprite's texture.

How to make a skeletal animation in Blender and play it in three.js

What is the correct process for creating a model in Blender and playing it back in three.js?
I'm talking about skeletal animation: what to export, and which functions to call in order to play the animation in three.js. Also, what should a trivial exported JSON file look like in order to work in three.js?
What exactly should the pipeline look like for correct results?
When exporting your animated model, you need to have "Skinning", "Bones" and "Skeletal animation" options checked in the exporter. The resulting json file should contain, among other things, non-empty skinWeights, bones and animation properties, like in this file: http://mrdoob.github.com/three.js/examples/obj/buffalo/buffalo.js
For playing the animation with Three.js, start with this example: http://mrdoob.github.com/three.js/examples/webgl_animation_skinning.html (available in the three.js source package).

Maya Mel Scripted Animation Not Animating as FBX

I've acquired a great quad model. I skinned and animated it to a rig built by a MEL script. It works great as far as editing the animation using sliders and parameters in Maya goes. But when I export the file as an FBX to Unity3D, it does not animate. Is something being lost in the translation from MEL to the rig? Unity needs a boned rig; is this procedural rig not the equivalent of a rig built and animated with the skeleton tools in Maya? I've checked that I have a 'Reference' folder, I've set keys, and I've changed the root name to "Hips".
Thanks for any insight on this question.
dDuane
If you are transferring the file with no errors and there is no animation, then there are three issues to look at.
First, you may have accidentally not exported the keyframes. Make sure the box is checked to export animation on the FBX export UI.
Second, it's possible that the object that contains the actual keyframes is not being exported. When you animate using the MEL-scripted GUI, find out where the actual keyframes live on the rig and make sure that object is exported with the rest of the character.
Third, the object might be transferring fine but depending on the rig setup the connections/constraints/whatever might not be working or supported in Unity. You might consider baking the animation to the skeleton before transferring to the engine. To do this, select the skeleton, click [Edit -> Keys -> Bake Simulation].
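If you'd rather script that bake, the same thing can be done from Maya's Python command line; a minimal sketch, assuming the skeleton's joints are selected and the animation runs from frame 1 to 100 (both are placeholders):

```python
import maya.cmds as cmds

# Bake the animation onto the skeleton itself before exporting to FBX, so that
# nothing depends on scripted constraints/connections Unity can't evaluate.
joints = cmds.ls(selection=True, type='joint')
cmds.bakeResults(joints,
                 simulation=True,          # evaluate the whole rig while baking
                 time=(1, 100),            # placeholder frame range
                 sampleBy=1,
                 preserveOutsideKeys=True)
```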
I don't know which version of Maya you are using, but I've always used 2010. This is the workflow that we used for a small Unity3D game project:
Export all of the animations in one scene as a .fbx. Be sure you just select the geometry (it usually helps to have it all grouped, but if you can't for some reason, that's okay) and hit Export Selected.
These FBX export options should be checked:
Geometry:
Edge smoothing,
Tangents and Bi-normals
Animation:
Animation,
Bake Animations,
(range of animation),
step = 1
Deformed Models:
Deformed Models,
Skins,
Blend Shapes (if using these),
Curve Filters,
Resample as Euler Interpolation,
Input Connections,
Instances to Objects,
Referenced Containers Content (if using any references),
FBX File Format:
Binary
FBX200900
When you bring this into Unity, set animation generation to "Store in Root". If all of your animation is in this one file (which it should be), the "Split Animations" box should be checked, and you define the names and frame ranges of the clips in the chart below it. When you eventually create an animation blending script, drag and drop it onto the animation object within the player prefab, not onto the prefab itself.
