3ds Max material shows in render but not in VRML export

I created a model of a bottle in 3ds Max and used a Raytrace material (in Materials -> Standard -> Raytrace, not the Raytrace map). Everything looks great in the 3ds Max render, but when exported to VRML, all that shows up is the default blue colour, as if no material had been assigned at all!
I have exported to VRML using Standard materials with no problems, so is this material type unsupported in VRML, or is there actually a way to get it to show up?
Thanks
Edit: I'm in 3ds Max 2013, in case that's relevant at all.

It is unsupported. Exporters usually only handle certain materials (typically Standard materials with bitmap textures).
The Raytrace material is also not something normally used in realtime applications, so there is little reason for exporters to support it.
Also, the render output and the export output are two totally different things, so you cannot compare them: the render output is produced by the render engine you use, while the export output depends on both the file format and the software used to display the exported geometry.
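For reference, the VRML97 Material node only carries a handful of simple surface fields, so there is nothing a Raytrace material's reflections and refractions could map onto. A rough sketch of the kind of node a Standard material typically becomes on export (the colour values and texture name here are made up for illustration):

Shape {
  appearance Appearance {
    material Material {
      # These simple Phong-style fields are all VRML97 can describe
      diffuseColor  0.8 0.6 0.2
      specularColor 0.5 0.5 0.5
      shininess     0.4
      transparency  0.0
    }
    # Bitmap textures come through as ImageTexture nodes
    texture ImageTexture { url "bottle_label.jpg" }
  }
  geometry IndexedFaceSet { }
}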

Related

Exporting blender file for use in three.js

I am trying to figure out how to use a .blend model in my three.js code.
My code looks like the following:
const loader = new THREE.JSONLoader();
loader.load( "models/test.blend", function ( geometry ) {
    let material = new THREE.MeshLambertMaterial( { color: 0x55B663 } );
    mesh = new THREE.Mesh( geometry, material );
    scene.add( mesh );
} );
Nothing is showing. Every tutorial I can find directs me here, which is now deprecated, and I cannot find anything in the docs.
I have also tried using a dae file and followed the answer here, but this didn't work either. I used the new THREE.ColladaLoader(); to try and load this file.
Read this; specifically, it addresses a tool:
https://github.com/KhronosGroup/glTF-Blender-Exporter
Loading 3D models
3D models are available in hundreds of file formats, each with different purposes, assorted features, and varying complexity. Although three.js provides many loaders, choosing the right format and workflow will save time and frustration later on. Some formats are difficult to work with, inefficient for realtime experiences, or simply not fully supported at this time.
This guide provides a workflow recommended for most users, and suggestions for what to try if things don't go as expected.
Before we start
If you're new to running a local server, begin with how to run things locally first. Many common errors viewing 3D models can be avoided by hosting files correctly.
Recommended workflow
Where possible, we recommend using glTF (GL Transmission Format). Both .GLB and .GLTF versions of the format are well supported. Because glTF is focused on runtime asset delivery, it is compact to transmit and fast to load. Features include meshes, materials, textures, skins, skeletons, morph targets, animations, lights, and cameras.
This is from the link above and the three.js documentation. It explains that they deprecated it to improve workflow productivity, which means it wasn't working very well anyway...
The link you provided has substitute resources for exporting Blender models as glTF, which is recommended for transmission due to its compact size and loading speed.
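As a sketch of the glTF route (assuming you have exported the Blender scene to a hypothetical models/test.glb and included the GLTFLoader script from the three.js examples), loading looks roughly like this:

const loader = new THREE.GLTFLoader();
loader.load( "models/test.glb", function ( gltf ) {
    // The loader returns a full scene graph with materials already applied,
    // so there is no need to build a Mesh by hand as with JSONLoader.
    scene.add( gltf.scene );
}, undefined, function ( error ) {
    console.error( error );
} );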

Blender export to three.js. All materials have the same grey color

I'm playing around with some open source 3D models, and when I try to export them to the three.js JSON format, all materials are exported in the same grey color.
How the model looks in Blender (you can see that all materials have different colors):
How the result looks:
So inside the exported JSON file, all materials have the same color:
[{
    ...
    "colorSpecular": [0.5, 0.5, 0.5],
    "colorDiffuse": [0.64, 0.64, 0.64],
    "colorEmissive": [0, 0, 0],
    ...
    "DbgName": "HullPlain.002"
}, {
    ...
    "colorSpecular": [0.5, 0.5, 0.5],
    "colorDiffuse": [0.64, 0.64, 0.64],
    "colorEmissive": [0, 0, 0],
    ...
    "DbgName": "HullColor.002"
}, ...]
Here are my export settings:
I tried different models and I'm getting the same material settings, "colorSpecular":[0.5,0.5,0.5], "colorDiffuse":[0.64,0.64,0.64], for all of them.
Does anyone know what could cause this issue? Thanks!
The three.js JSON exporter does not support Cycles node materials, and generally that exporter is no longer recommended.
I would suggest using KhronosGroup/glTF-Blender-Exporter and THREE.GLTFLoader instead. The exporter supports Cycles Render, but only if you use the provided PBR nodes; for an easier setup process, use the default Blender Render materials. A quick conversion (I was not careful about getting the right colors) shows this working for your models:
^Note that you will need to apply modifiers before export, or in the export settings.
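As a quick sanity check after switching to GLTFLoader (the file name below is just a placeholder), you can walk the loaded scene and confirm that each mesh keeps its own material colour rather than a single grey value:

const loader = new THREE.GLTFLoader();
loader.load( "ship.glb", function ( gltf ) {
    gltf.scene.traverse( function ( child ) {
        if ( child.isMesh ) {
            // Each mesh should log its own material name and diffuse colour.
            console.log( child.material.name, child.material.color.getHexString() );
        }
    } );
    scene.add( gltf.scene );
} );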
three.js r92

Is there a way to convert gltf to dae?

I can make glTF files with the COLLADA-to-glTF converter.
But is there a way to reverse this?
UPDATE - October 2019 - Blender 2.80 has shipped with full glTF 2.0 import/export capability. It also has COLLADA import/export capability, so can be used to convert one to the other.
UPDATE - November 2018: Rewrote this answer for glTF version 2.0, which has almost completely replaced 1.0 in the time since this question was originally asked.
glTF 2.0 can be processed by a variety of tools, many of which are listed on the glTF Tools section of the official Khronos glTF README.
Older glTF 1.0
While there are numerous command-line utilities for converting to glTF, the options are much more limited going the other way, from glTF to COLLADA or anything else. One thing to understand about this is that glTF is intended to be a runtime delivery format, not an interchange format like COLLADA. glTF strives to store its internal data in as close to GPU-ready form as possible, with mesh data organized into data structures that can be used directly as vertex attributes, and so on. Khronos has a tagline that glTF is "the JPG of 3D", meaning it is meant for wide distribution to rendering engines of all kinds.
So, importing a glTF into a 3D editing package is something like loading a JPG into a paint program. You can do it, but after the import you want to avoid any unnecessary round-trips to and from the delivery format. So you would use the paint program's native save format (.psd or .xcf etc), or the 3D modeler's native save format, to keep your own editable copy of your work, and ship the exported JPG or PNG or glTF for wide distribution.
Even so, I do expect more importers to become available as time goes on. glTF version 1.0 had an internal structure that made this quite difficult (its vertex shaders would use swaths of attribute data without explicitly marking them as positions or normals, etc.). glTF 2.0 replaced those custom shaders with a modern PBR material pipeline and clearly marked mesh position, normal, and other data, opening the door for future import tools and utilities of all kinds.

exporting Bump and Specular maps

I have seen the new MeshPhongMaterial Bump and Specular highlights, and can't wait to get them into my game engine.
Currently I am using the Python converter to convert an OBJ file into a .js file. However, the release 51 exporter doesn't seem to handle these materials.
I am also concerned that most of my meshes have 2 or more materials, and are using MeshFaceMaterial.
Will changing to MeshPhongMaterial break the multiple textures?
Should I use a different exporter to achieve this?
What is the best workflow to convert from .3ds files with Bump and Specular maps?
Should I wait a while for this topic to settle down?
I'm assuming you already use a diffuse texture (Lambert material perhaps?) and as such have also exported texture coordinates.
You can add mapBump and mapSpecular properties manually to the materials in question in the .js model file. They are strings pointing to the textures just like mapDiffuse. Also change the shading property to "Phong", and you should be good to go, though you might also want to tweak specularCoef and colorSpecular material properties.
Simply switching material type won't break the face materials.
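As an illustration (the texture file names here are made up), a material entry in the .js model file might end up looking something like this after the manual edit:

{
    "DbgName"       : "hull_phong",
    "shading"       : "Phong",
    "colorDiffuse"  : [0.64, 0.64, 0.64],
    "colorSpecular" : [0.5, 0.5, 0.5],
    "specularCoef"  : 50,
    "mapDiffuse"    : "hull_diffuse.jpg",
    "mapBump"       : "hull_bump.jpg",
    "mapSpecular"   : "hull_specular.jpg"
}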

How to import Blender 3D animation to iPhone OpenGL ES?

I am trying to do animations on iPhone using OpenGL ES. I am able to do the animation in Blender. I can export a .obj file from Blender and it works on the iPhone.
But I am not able to export my animation work from Blender to OpenGL. Can anyone please help me solve this?
If you have a look at this article by Jeff LaMarche, you'll find a blender script that will output a 3D model to a C header file. There's also a followup article that improves upon the aforementioned script.
After you've run the script, it's as simple as including the header in your source, and passing the array of vertices through your drawing function. Ideally you'd want a method of loading arbitrary model files at runtime, but for prototyping this method is the simplest to implement.
Seeing as you already have a method of importing models (obj) then the above may not apply. However, the advantage of using a blender script is that you can then modify the script to suit your own needs, perhaps also exporting bone information or model keyframes.
Well first off, I wouldn't recommend .obj for this purpose since the obj file format doesn't support animation, only static 3D models. So you'll need to export the animation data as a separate file that you load at the same time as the obj.
Which file format I would recommend depends on what exactly your animations are. I don't remember off the top of my head what file formats Blender supports, but as I recall it does not export Collada files with animation, which would be the most general recommendation. Other options would be md2 for character animations, or 3ds for simple "rigid objects moving around" animations. I think Blender's FBX exporter will work, although that file format may be too complicated for your needs.
That said, and assuming you only need simple rigid-object movements, you could use .obj for the 3D model shapes and then write a simple Python script that exports a file from Blender listing the keyframes, with the frame number, position, and rotation for each one. Then load that data in your code and play back those keyframes on the 3D model.
This is an old question, and since then some new iOS frameworks have been released, such as GLKit. I recommend relying on them as much as possible, since they take care of many inherent conversions like this, though I haven't researched the specifics. Also, while not on iOS, the new scene graph technology for OS X (which will likely arrive on iOS in the future) takes all this quite a bit further, and a crafty individual could do some conversions with that tool and then take the output to iOS.
Also have a look at SIO2.
I haven't used recent versions of Blender, but my understanding is that it supports exporting mesh animation as a sequence of .obj files. If you can already display a single .obj in your app, then displaying several of them one after another will achieve what you want.
Now, note that this is not the most efficient format in which to export this type of animation, since each .obj file will contain a lot of duplicated information. If your mesh topology stays fixed over time (i.e. only the vertices move, with the polygon structure, UV coords, etc. all unchanged), then you can import the entire first .obj and, for the rest, just read the vertex array.
If you wanted to optimize this even more, you could compress the vertex arrays so that you only store the differences from the previous frame of the animation.
Edit: I see that Blender 2.59 has export to COLLADA. According to the Blender manual, you can export object transformations, and you can also export baked animation for rigged objects. The benefit for you in supporting the COLLADA format in your iPhone app is that you are free to switch between animation tools, since most of them export this format.
