Unity 3D Does Not Apply Texture To Object Properly - unityscript

I have a problem with texturing an object in Unity 3D. I made a simple object in 3ds Max and imported it into Unity, then tried to apply an image as a texture, but the texture is not applied; it only changes the color of the object! This is the screenshot:
As you can see, I have two models. One was made in 3ds Max and does not take the texture; the other one is a cube made in Unity, and it gets the texture correctly.
So what's going wrong here? Note that I also changed the tiling and offset settings of the model's shader, but still nothing changed at all! :(

You didn't UV unwrap the 3D object before exporting it to Unity3D.
To apply a texture to a 3D mesh, any engine needs to know the texture coordinates of each vertex, and this is called UV mapping.
WIKIPEDIA - UV_mapping
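Purely for illustration (this is three.js-style JavaScript, as used in the related questions below, not Unity code), here is roughly what per-vertex UVs look like on a hand-built quad:

```javascript
import * as THREE from 'three';

// A minimal sketch of UV mapping, not Unity code: each vertex gets a (u, v)
// pair into the texture image. Without these, an engine can only tint the
// surface with a flat color, which is exactly the symptom described above.
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(new Float32Array([
  -1, -1, 0,   1, -1, 0,   1, 1, 0,   -1, 1, 0,  // a simple quad
]), 3));
geometry.setAttribute('uv', new THREE.BufferAttribute(new Float32Array([
   0, 0,   1, 0,   1, 1,   0, 1,                 // texture corners to quad corners
]), 2));
geometry.setIndex([0, 1, 2, 0, 2, 3]);            // two triangles
```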

Related

Texture is not rendering correctly with MeshStandardMaterial in three.js

I'm trying to render a texture on a material, but it isn't rendered correctly. Maybe my texture is wrong, but when I add the texture in Blender it renders correctly. I don't know why.
Sorry, I have only just learned three.js, but the current project needs 3D rendering, so I came here to ask if anyone has a solution. Please help me.
Here is my CodeSandbox:
https://codesandbox.io/s/hero-kdhox?file=/src/App.js
The texture I added in Blender looks something like this:
https://imgur.com/dYkD5u8
You must flip your textures vertically. three.js will flip them by default when you import them using 'useLoader'; there is a flipY property for textures, but it hasn't been working reliably.
Your best bet is to flip them vertically manually before importing.
Alternatively, pack the textures into your GLB in Blender itself.
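A minimal sketch of the flipY route, assuming a plain TextureLoader (the path is a placeholder): textures loaded this way default to flipY = true, while glTF/GLB materials expect flipY = false.

```javascript
import * as THREE from 'three';

// Placeholder path; TextureLoader textures default to flipY = true,
// while glTF/GLB materials expect flipY = false.
const texture = new THREE.TextureLoader().load('/textures/diffuse.png');
texture.flipY = false;        // match the glTF UV convention
texture.needsUpdate = true;   // re-upload if the texture was already in use
```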

Three.js Merge objects and textures

My question is related to this article:
http://blog.wolfire.com/2009/06/how-to-project-decals/
If my understanding is correct, a mesh made from the intersection of the original mesh and a cube is added to the scene to make a decal appear.
I need to save the final texture. So I was wondering if there is a way to 'merge' the texture of the original mesh and the added decal mesh?
You'd need to do some tricky stuff to convert from the model geometry space into UV coordinate space so you could draw the new pixels into the texture map. If you want to be able to use more than one material that way, you'd also probably need to implement some kind of "material map" similar to how some deferred rendering systems work. Otherwise you're limited to at most one material per face, which wouldn't work for detailed decals with alpha.
I guess you could copy the UV coordinates from the original mesh into the decal mesh, and then use that information to reproject the decal texture into the original texture.
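A rough sketch of that reprojection idea, with heavy simplifications: it assumes the decal lands on an axis-aligned rectangle in UV space (real decals span arbitrary triangles), and every name here is hypothetical.

```javascript
import * as THREE from 'three';

// Bake a decal into the base texture by drawing in UV space on a canvas.
// uvRect = { u, v, width, height } is the hypothetical UV-space rectangle
// the decal should cover.
function bakeDecal(baseTexture, decalImage, uvRect) {
  const canvas = document.createElement('canvas');
  canvas.width = baseTexture.image.width;
  canvas.height = baseTexture.image.height;
  const ctx = canvas.getContext('2d');

  ctx.drawImage(baseTexture.image, 0, 0); // start from the existing pixels

  // UV space has v = 0 at the bottom; canvas y = 0 is at the top, so flip.
  const x = uvRect.u * canvas.width;
  const y = (1 - uvRect.v - uvRect.height) * canvas.height;
  ctx.drawImage(decalImage, x, y,
                uvRect.width * canvas.width,
                uvRect.height * canvas.height);

  return new THREE.CanvasTexture(canvas); // the 'merged' texture to save
}
```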

Changing scale of mesh changes lighting brightness of scene?

I needed to refactor my custom mesh creation a bit
from:
create meshes of unified size (SIZE, SIZE, SIZE), then scale them as needed (setting the scale for each axis)
to:
create meshes at the correct size, with no scaling afterwards
Meshes are custom generated (vertices, faces, normals, UVs); nothing in this process was altered, and it worked like a charm before.
=> resulting meshes are the same size, position, etc.
The whole scene setup stays the same: lights, shadowing, materials. Yet when using the second approach, the whole lighting is very bright and super reflective. Is that a known issue?
The material used is MeshPhongMaterial with map, bumpMap, specularMap, and envMap.
Using three.js r68; there are no errors/warnings in the console.
before:
https://cloud.githubusercontent.com/assets/3647854/3876053/76b8f260-2158-11e4-9e96-c8de55eaec9a.png
after:
https://cloud.githubusercontent.com/assets/3647854/3876052/76b7fa86-2158-11e4-9393-8f3eece04c0b.png
Did you rescale the normals in the mesh?
The mesh format probably needs normalized normals, in which case the new normals are now incorrect, but they would have been correct if you hadn't rescaled.
Alternatively, you say the lights haven't been changed; maybe they need to be appropriately redirected in the scene. (Assuming you're applying different scaling factors on each axis.)
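If it is the normals, here is a minimal sketch of the fix (current three.js API; r68 had equivalents on THREE.Geometry, so treat the calls as illustrative):

```javascript
import * as THREE from 'three';

// Bake the per-axis scale into the vertex positions, then re-derive the
// normals instead of scaling them too; scaled, non-unit normals are a
// classic cause of too-bright Phong lighting.
function scaleGeometry(geometry, sx, sy, sz) {
  geometry.applyMatrix4(new THREE.Matrix4().makeScale(sx, sy, sz));
  geometry.computeVertexNormals(); // restore unit-length normals
  return geometry;
}
```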

How to use part of a texture?

How do I create a three.js material/geometry which uses part of a texture?
I am first rendering a scene to a texture. This texture is used for a Mesh with a CubeGeometry and a MeshLambertMaterial. What I would like to do now is have only a part of the texture displayed on the cube face (like a window into the texture).
I've done this before using OpenGL ES directly, with shaders, but I don't see what parameters might make it possible using the standard three.js library.
That is what texture coordinates are for. See this question and answer:
http://stackoverflow.com/questions/19366991/drawing-part-of-open-gl-texture/19367844#19367844
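In stock three.js there is also a texture-transform route that avoids editing the geometry's UVs: a texture's repeat scales the visible window and offset positions it. A minimal sketch (the values and path are placeholders):

```javascript
import * as THREE from 'three';

// Show only a sub-window of the texture: repeat scales the window,
// offset picks where it starts (both in 0..1 UV space).
const texture = new THREE.TextureLoader().load('/rendered-scene.png');
texture.wrapS = THREE.ClampToEdgeWrapping;
texture.wrapT = THREE.ClampToEdgeWrapping;
texture.repeat.set(0.25, 0.25); // a quarter-width, quarter-height window
texture.offset.set(0.5, 0.5);   // starting from the middle of the image

const material = new THREE.MeshLambertMaterial({ map: texture });
```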

Basic approach to pupil constriction/dilation of eye model in OpenGL

I'm new to OpenGL ES and looking for the best approach for creating a realistic model of an eye whose pupil can dilate and constrict, so that I have a plan in mind while running through tutorials.
I've made a mesh in Blender that is basically a sphere with a hole (the 'pole', or central vertex, is removed, along with a couple of surrounding circular edge loops).
I plan to add an iris texture directly to the sphere's polys surrounding the hole.
To change pupil size, do I just need a function to reposition the vertices of the hole so the hole dilates or contracts?
I'm going to use OpenGL within an Objective-C app. I have Jeff Lamarche's Objective-C export script. Is it standard to export only the mesh from Blender and add textures in code later in Xcode? Or is it easier/better to set up the textures on the meshes in Blender first and export the more finished product's data to Xcode?
Your question is a bit old, so I'm not sure how much progress you've made, but as I've been climbing up the learning curve myself I thought I'd take a shot at answering.
If you want to animate the individual vertices of your model, I believe the method you'll want is Vertex Skinning. I can't speak much on that front as I haven't yet had reason to experiment with it, although it's a technique only available in OpenGL ES 2.0. (Which is probably where you want to start anyway, the increased flexibility over 1.1 is more than worth any additional incline to the learning curve.)
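That said, the simpler "reposition the hole's vertices" idea from the question also works for this case. A sketch in JavaScript for brevity (the mesh structure and names are hypothetical; the same loop translates directly to Objective-C):

```javascript
// Dilate or constrict the pupil by scaling the hole's ring vertices
// radially around the pupil's central (z) axis. 'ringVertices' is a
// hypothetical array of the vertices bordering the hole.
function setPupilSize(ringVertices, scale) {
  for (const v of ringVertices) {
    v.x *= scale; // move toward/away from the central axis
    v.y *= scale;
  }
}
```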
The answer to your texturing question is somewhat mixed. You'll need to actually apply the texture in OpenGL, but what Blender can do for you is determine the texture coordinates. Each vertex of your mesh will have a texture coordinate associated with it: an X, Y pair which maps to a location on the texture image. The coordinates are in the range 0.0 to 1.0, so, since your image texture is a rectangle, the texture coordinate {0, 0} maps to the bottom-left corner, {1, 1} maps to the top-right corner, and {0.5, 0.5} maps to the exact center of the image.
So in Blender, you'd want to go ahead and texture the object with UV mappings. When you export, although your exported mesh won't contain any of the image content, it will retain the texture coordinates which map to your image content. This will allow you to apply the texture in OpenGL so that it is applied the same way it appeared in Blender.
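Since WebGL mirrors OpenGL ES 2.0, a JavaScript sketch can show what "apply the texture in OpenGL" means once the exported UVs are available ('gl', 'program', and 'uvs' are assumed to exist here; 'aTexCoord' is a hypothetical shader attribute):

```javascript
// Upload the exported texture coordinates and wire them to the vertex
// shader; each vertex consumes two floats (u, v).
const uvBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, uvBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(uvs), gl.STATIC_DRAW);

const loc = gl.getAttribLocation(program, 'aTexCoord');
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
```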
I've personally had some trouble getting Jeff Lamarche's script to spit out the texture coordinates, as the Blender API seems to change significantly with each release. I've had more success with an .obj converter, so I've been exporting from Blender to .obj and using a command-line tool to go from .obj to a C header file.
If you encounter similar problems with Lamarche's script, this post might help solve it: http://38leinad.wordpress.com/2012/05/29/blender-2-6-exporting-uv-texture-coordinates/
And this is a good resource for a .obj to .h script:
http://heikobehrens.net/2009/08/27/obj2opengl/
