I'm used to working with SolidWorks and CATIA for 3D objects and basic animations, but I have a personal project where I need to build a laser manufacturing machine and make it look like it does in real life: zoom inside the machine, see how the light gets polarized as it passes through a crystal, see how the laser beam hits the surface and particles fly off the hit surface, etc.
I thought about the Unity engine, but I don't know much about this area of 3D or which program to make the models in before importing them into Unity. Can you guys help me with better solutions?
Thanks,
Adrian
If you want to use Unity, you can import the following file formats: .FBX, .dae, .3DS, .dxf, .obj (http://docs.unity3d.com/Manual/HOWTO-importObject.html)
From SolidWorks, you can export the model as an STL, then import the STL into a 3D application (e.g. Blender), and from Blender you can export the model as an FBX.
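If you want to script that Blender step rather than do it by hand, something along these lines should work. This is a minimal sketch using Blender's Python API (operator names as in Blender 2.7x/2.8x); the file paths are placeholders, and the bundled STL and FBX add-ons need to be enabled:

```python
# Run inside Blender:  blender --background --python stl_to_fbx.py
# Minimal sketch of the STL -> FBX conversion step; paths are placeholders.
import bpy

STL_IN = "machine_part.stl"   # exported from SolidWorks
FBX_OUT = "machine_part.fbx"  # to be imported into Unity

# Start from an empty scene so only the imported part gets exported.
bpy.ops.wm.read_factory_settings(use_empty=True)

# Import the STL (requires the bundled STL add-on to be enabled).
bpy.ops.import_mesh.stl(filepath=STL_IN)

# Export everything in the scene as FBX for Unity.
bpy.ops.export_scene.fbx(filepath=FBX_OUT)
```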
I think you're able to achieve more in terms of animation with Unity. You can design your models in SolidWorks or CATIA and save them in IGES (a neutral format).
Related
I'm trying to create a custom 3D viewer that can show a rigged model created with Blender. In particular, the model will be a mechanical one, so I need to move it with an IK solver: I'll move/rotate a single bone (which will be the motor axis) and all other components must follow it.
I've already done the Blender part (see the link below), but now I'm stuck on how to make a stand-alone viewer (ideally a single executable file) that can communicate with another application/process (maybe via a TCP socket?) and move the 3D model based on the information it sends. I'm working on Windows 10.
Does anyone have any ideas?
(Sorry for my bad English; I hope the explanation was clear enough.)
https://youtu.be/5rR7BKrGzFg
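Not an answer, but to make the TCP idea concrete, here is a minimal sketch of the receiving side, assuming a hypothetical newline-delimited text protocol in which the controlling application sends one "bone_name angle" command per line. The host, port, protocol, and the apply_bone_rotation stub are all placeholders, not an existing API:

```python
# Minimal sketch of the receiving end of the proposed TCP link (protocol is hypothetical).
# The controlling application sends lines like "motor_axis 42.5\n";
# the viewer parses them and rotates the named bone by that angle.
import socket

HOST, PORT = "127.0.0.1", 9100  # placeholder address


def apply_bone_rotation(bone_name: str, angle_deg: float) -> None:
    # Placeholder: in the real viewer this would drive the IK root bone.
    print(f"rotate {bone_name} to {angle_deg} degrees")


with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.bind((HOST, PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    with conn, conn.makefile("r") as stream:
        for line in stream:          # one "bone angle" command per line
            parts = line.split()
            if len(parts) != 2:
                continue             # ignore malformed or blank lines
            name, value = parts
            apply_bone_rotation(name, float(value))
```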
I'm trying to convert this model to the three.js model format:
http://tf3dm.com/3d-model/ninja-48864.html
Here's what I've tried so far:
I imported the ms3d file into Blender using the default add-on. In Blender, the animations and mesh look correct; however, bones are only rendered as lines. Then I exported it to JS using the three.js exporter. This results in a correct mesh, but the animation is not exported correctly: only bone positions are exported (which are rarely used in this specific model), and no rotations at all (except for a few identity quaternions).
It seems I have to modify the model in Blender somehow, but since I'm a complete novice in 3D modelling, I'm kind of lost. I've also looked at other questions regarding Blender + three.js, but none of the tips (apply location/rotation/scale, etc.) made a difference. It might also be a bug in the three.js exporter.
Can anybody help me do the conversion, one way or the other?
A nice Python utility is available for converting ms3d format to JSON format.
The link is: https://github.com/pyalot/parse-3d-files/blob/master/ms3d/convert.py
You can load this JSON model with THREE.JSONLoader() and render it in three.js.
Thanks.
I have a rigged (skeleton and soft bind) model in Maya. The model is one seamless low-poly mesh with a single JPEG texture mapped. There is simple animation of the skeleton (joint rotations). I need to get it to work with three.js (WebGL).
Do I try to export an OBJ with morph targets somehow? I can do OBJ, but how do I get the morph targets? Can the developer I am working with read Maya's baked animation file (.MC or .XML) in WebGL? Or do I export a Collada DAE?
Any help that can steer us in the right direction would be greatly appreciated.
Thanks
THREE.js comes with an exporter for Maya, but it only works for static models. I have created an updated version that also supports exporting rigged and animated models. It doesn't require any intermediate steps: it just outputs straight to a .JS file. We have a pull request to integrate the updated exporter with the THREE trunk, but if you want to get the new and improved exporter immediately you can get it from this repository: https://github.com/BlackTowerEntertainment/three.js/tree/maya_animation_exporter. The exporter files are in utils/exporters/maya.
Hope this helps.
It is best to export a Collada DAE file from Maya in order to get your data into three.js. You can preview and share your data via http://Clara.io (an online 3D modeling and animation editor), which imports Collada DAEs and uses three.js for display.
You should read the FAQ, as there is plenty of info there: https://github.com/mrdoob/three.js/wiki. Most probably you need to export to Collada, as Wavefront OBJ files do not support animation.
I'm trying to get my Maya animated walk cycle into three.js. I exported the animation with the model to the .dae format and changed the path to my model in the example. My model is loaded, but it doesn't play any animation. What could be the problem? My main goal is to create a character who walks with WASD while his walk cycle is being played.
Any suggestions where should I start?
If you are using the Collada loader, the animation should work without any problem. I have used the Collada loader to animate one of my models in three.js, and it works like a charm.
A good example to take a cue from on how to make it work is webgl_loader_collada_keyframe.html.
Converting to the DAE format and then to JS is rife with problems, and it rarely works for animations. THREE.js comes with an exporter for Maya, but it only works for static models.
As noted in my answer above, I have created an updated version of the Maya exporter that also supports rigged and animated models; it outputs straight to a .JS file with no intermediate steps. Until the pull request to integrate it into the three.js trunk is merged, you can get it from this repository: https://github.com/BlackTowerEntertainment/three.js/tree/maya_animation_exporter (the exporter files are in utils/exporters/maya).
Hope this helps.
I am trying to do animations on the iPhone using OpenGL ES. I am able to do the animation in Blender. I can export a .obj file from Blender to OpenGL and it works on the iPhone.
But I am not able to export my animation work from Blender to OpenGL. Can anyone please help me solve this?
If you have a look at this article by Jeff LaMarche, you'll find a Blender script that will output a 3D model to a C header file. There's also a follow-up article that improves upon the aforementioned script.
After you've run the script, it's as simple as including the header in your source, and passing the array of vertices through your drawing function. Ideally you'd want a method of loading arbitrary model files at runtime, but for prototyping this method is the simplest to implement.
Seeing as you already have a method of importing models (OBJ), the above may not apply. However, the advantage of using a Blender script is that you can then modify the script to suit your own needs, perhaps also exporting bone information or model keyframes.
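To illustrate the idea, here is a rough sketch of what such an export script can look like with the current Blender Python API (2.8+). This is my own minimal example, not LaMarche's script; it only dumps triangle vertex positions as a C float array, and the header layout is arbitrary:

```python
# Run inside Blender; writes the active object's triangles to a C header.
# A rough sketch only -- a real exporter would also handle normals, UVs, etc.
import bpy

obj = bpy.context.active_object
mesh = obj.data
mesh.calc_loop_triangles()  # make triangulated loop data available

lines = []
for tri in mesh.loop_triangles:
    for vi in tri.vertices:
        x, y, z = mesh.vertices[vi].co
        lines.append(f"    {x:.6f}f, {y:.6f}f, {z:.6f}f,")

with open("model.h", "w") as out:
    out.write(f"#define MODEL_VERTEX_COUNT {len(lines)}\n")
    out.write("static const float modelVertices[] = {\n")
    out.write("\n".join(lines))
    out.write("\n};\n")
```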
Well first off, I wouldn't recommend .obj for this purpose since the obj file format doesn't support animation, only static 3D models. So you'll need to export the animation data as a separate file that you load at the same time as the obj.
Which file format I would recommend depends on what exactly your animations are. I don't remember off the top of my head what file formats Blender supports, but as I recall it does not export Collada files with animation, which would be the most general recommendation. Other options would be md2 for character animations, or 3ds for simple "rigid objects moving around" animations. I think Blender's FBX exporter will work, although that file format may be too complicated for your needs.
That said, and assuming you only need simple rigid-object movements, you could use .obj for the 3D model shapes and then write a simple Python script to export a file from Blender that lists the keyframes, with the frame number, position, and rotation for each keyframe. Then load that data in your code and play back those keyframes on the 3D model.
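A sketch of what that keyframe-dump script could look like (run inside Blender; the plain-text output format here is just an example, not a standard):

```python
# Run inside Blender: writes "frame x y z rx ry rz" lines for the active object.
# Sketch of the keyframe-dump idea above; the output format is arbitrary.
import bpy

obj = bpy.context.active_object
scene = bpy.context.scene

# Collect the frames where the object actually has keyframes.
frames = set()
if obj.animation_data and obj.animation_data.action:
    for fcurve in obj.animation_data.action.fcurves:
        for kp in fcurve.keyframe_points:
            frames.add(int(kp.co.x))

with open("keyframes.txt", "w") as out:
    for frame in sorted(frames):
        scene.frame_set(frame)  # evaluate the scene at this frame
        loc = obj.location
        rot = obj.rotation_euler
        out.write(f"{frame} {loc.x} {loc.y} {loc.z} {rot.x} {rot.y} {rot.z}\n")
```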
This is an old question, and since then some new iOS frameworks have been released, such as GLKit. I recommend relying on them as much as possible when you can, since they take care of many inherent conversions like this, though I haven't researched the specifics. Also, while not on iOS, the new scene graph technology for OS X (which will likely arrive on iOS in the future) takes all this quite a bit further, and a crafty individual could do some conversions with that tool and then take the output to iOS.
Also have a look at SIO2.
I haven't used recent versions of Blender, but my understanding is that it supports exporting mesh animation as a sequence of .obj files. If you can already display a single .obj in your app, then displaying several of them one after another will achieve what you want.
Now, note that this is not the most efficient way to export this type of animation, since each .obj file will contain a lot of duplicated info. If your mesh stays fixed over time (i.e. only the vertices move, with the polygon structure, UV coords, etc. all fixed), then you can import the entire first .obj and, from the rest, just read the vertex arrays.
If you wanted to optimize this even more, you could compress the vertex arrays so that you only store the differences from the previous frame of the animation.
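To make the delta idea concrete, here is a small offline sketch in plain Python (not tied to any engine): it reads the vertex lines from a sequence of .obj files and, for every frame after the first, stores only the per-vertex differences. The file-name pattern and the final print are placeholders:

```python
# Offline sketch: keep frame 0's vertices, store later frames as per-vertex deltas.
# Assumes every frame_*.obj has the same vertex count and order (fixed topology).
import glob

def read_vertices(path):
    verts = []
    with open(path) as f:
        for line in f:
            if line.startswith("v "):  # vertex position lines only
                _, x, y, z = line.split()
                verts.append((float(x), float(y), float(z)))
    return verts

frames = [read_vertices(p) for p in sorted(glob.glob("frame_*.obj"))]

base = frames[0]
deltas = []
for prev, curr in zip(frames, frames[1:]):
    deltas.append([(cx - px, cy - py, cz - pz)
                   for (px, py, pz), (cx, cy, cz) in zip(prev, curr)])

# At playback time: frame N = frame N-1 + deltas[N-1], starting from `base`.
print(len(base), "vertices,", len(deltas), "delta frames")
```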
Edit: I see that Blender 2.59 has export to COLLADA. According to the Blender manual, you can export object transformations, and you can also export baked animation for rigged objects. The benefit for you in supporting the COLLADA format in your iPhone app is that you are free to switch between animation tools, since most of them export this format.