I am trying to rotate, scale, or move a 3D drone FBX object in Spark AR,
but I am unable to connect the patches in the correct sequence.
Please help, as I am a beginner in the AR domain.
I have attached the screenshots for reference.
Open this sample project and change the 3D model to yours.
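If connecting the patches keeps tripping you up, the same rotate/scale/move can also be driven from a script. Here is a minimal sketch using Spark AR's scripting modules, assuming a recent Spark AR Studio and that the drone object in the Scene panel is named 'drone3d' (rename to match your project):

// Minimal sketch: rotate, scale and move a 3D object from a Spark AR script.
// Assumes the object is named 'drone3d' in the Scene panel.
const Scene = require('Scene');
const Time = require('Time');
const Reactive = require('Reactive');

(async function () {
  const drone = await Scene.root.findFirst('drone3d');

  // Spin continuously around the Y axis (rotation values are in radians).
  drone.transform.rotationY = Time.ms.div(1000);

  // Uniform scale.
  drone.transform.scaleX = Reactive.val(0.5);
  drone.transform.scaleY = Reactive.val(0.5);
  drone.transform.scaleZ = Reactive.val(0.5);

  // Offset 10 cm along X (scene units are metres).
  drone.transform.x = Reactive.val(0.1);
})();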
Related
I'm trying to create a custom 3D viewer that can display a rigged model created with Blender. In particular, the model is a mechanical one, so I need to move it with an IK solver: I'll move/rotate a single bone (which will be the motor axis) and all the other components must follow it.
I've already done the Blender part (see the link below), but now I'm stuck on how to make a standalone viewer (ideally a single executable file) that can communicate with another application/process (maybe over a TCP socket?) and move the 3D model based on the information that process sends. I'm working on Windows 10.
Does anyone have any ideas?
(Sorry for my bad English; I hope the explanation was clear.)
https://youtu.be/5rR7BKrGzFg
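One possible approach (a sketch, not a full solution): export the rig from Blender (glTF works well), display it with three.js (which can be wrapped into a single executable with something like Electron), and drive the bone over a WebSocket rather than a raw TCP socket, since a browser context cannot open raw TCP connections. A rough sketch of the receiving side, where the model path, port and bone name are placeholders:

// Hedged sketch: a three.js viewer that rotates a named bone from WebSocket messages.
// 'model.glb', 'ws://localhost:8080' and the bone name below are placeholders.
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, innerWidth / innerHeight, 0.1, 100);
camera.position.set(0, 1, 3);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);
scene.add(new THREE.AmbientLight(0xffffff));

let model;
new GLTFLoader().load('model.glb', (gltf) => {
  model = gltf.scene;
  scene.add(model);
});

// The controlling process sends JSON such as {"bone": "MotorAxis", "x": 0, "y": 0, "z": 1.57}.
const socket = new WebSocket('ws://localhost:8080');
socket.onmessage = (event) => {
  const msg = JSON.parse(event.data);
  const bone = model && model.getObjectByName(msg.bone);
  if (bone) bone.rotation.set(msg.x, msg.y, msg.z);
};

(function animate() {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
})();

Note that, as far as I know, Blender's IK constraints don't survive the export, so you would either bake the motion in Blender or recompute it at runtime, for example with three.js's CCDIKSolver.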
I'm working on a project that takes photos with a DJI drone. I rotate the camera to point at the ground, and I can take the photo using this method:
rotation = [DJIGimbalRotation gimbalRotationWithPitchValue:[NSNumber numberWithInt:-90]
                                                 rollValue:nil
                                                  yawValue:nil
                                                      time:0.01f
                                                      mode:DJIGimbalRotationModeAbsoluteAngle];
But when I use this method on a DJI Spark, I get an error telling me it failed. How can I point the camera at the ground on the Spark? Can anyone help?
The gimbal pitch range for the Spark is [0, -85], so it isn't possible to set it to -90 and point the camera straight down on the Spark using the DJI Mobile SDK. You can check the capabilities property to get the pitch rotation range info: https://developer.dji.com/api-reference/ios-api/Components/Gimbal/DJIGimbal.html?search=capabilities&i=0&#djigimbal_gimbalcapabilities_inline
I'm using Autodesk Fusion 360 to create some assemblies and I'd like to be able to display them using a three.js app.
Fusion can export any of the following formats: F3D (its own format), Inventor 2014, IGES, SAT, SMT, STEP, DWG, DXF, STL (binary) and FBX.
My current workflow is Fusion 360 -> STL -> MeshLab -> OBJ & MTL -> three.js
This gives the following results:
In Fusion 360: external faces are solid red, internal ones are white (removed this because I can only post two links)
In MeshLab: faces are solid blue (actually, I didn't think STL carried any colour information, so I'm not sure how it has worked out that outside faces are one colour and internal faces are another), but otherwise it is an accurate representation of what I see in Fusion.
In three.js: face colours are now shaded following the triangles, giving a faceted look.
Based on answers to other questions, I've tried using object.geometry.computeVertexNormals() and mergeVertices() in the onLoad function, but I get the error "cannot read property ... of undefined", which seems to mean there is no geometry object.
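For what it's worth, THREE.OBJLoader passes the onLoad callback a Group rather than a single mesh, so object.geometry is undefined; the geometries live on the child meshes. A hedged sketch (assuming a recent three.js release) that merges the duplicated vertices an STL-derived OBJ typically has and then recomputes smooth normals:

// Hedged sketch, assuming a recent three.js release.
// 'scene' is your existing THREE.Scene; 'model.obj' is a placeholder path.
import { OBJLoader } from 'three/examples/jsm/loaders/OBJLoader.js';
import { mergeVertices } from 'three/examples/jsm/utils/BufferGeometryUtils.js';

new OBJLoader().load('model.obj', (object) => {
  object.traverse((child) => {
    if (child.isMesh) {
      // STL-derived meshes duplicate vertices per triangle; merging them lets
      // computeVertexNormals() average normals across neighbouring faces.
      child.geometry = mergeVertices(child.geometry);
      child.geometry.computeVertexNormals();
    }
  });
  scene.add(object);
});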
So, my questions are:
Is this the best workflow?
Is there a way to get real material info from Fusion to three.js?
How can I smooth out the faces?
Thanks.
Depending on your needs, you might be able to use the JavaScript viewer available here:
https://developer.autodesk.com/api/view-and-data-api/
It's extensible, uses ThreeJS, and provides full support for viewing Fusion models.
I'm used to working with SolidWorks and CATIA for 3D objects and basic animations, but I have a personal project where I need to build a laser manufacturing machine and make it look like it does in real life: zoom inside the machine, see how the light gets polarized as it passes through a crystal, see how the laser beam hits the surface and particles fly off the hit surface, etc.
I thought about the Unity engine, but I don't know much about this area of 3D or which program to use for the models before importing into Unity. Can you guys help me with better solutions?
Thanks,
Adrian
If you want to use Unity, you can import the following file formats: .FBX, .dae, .3DS, .dxf, .obj (http://docs.unity3d.com/Manual/HOWTO-importObject.html)
From SolidWorks, you can export an STL, import the STL into a 3D application (e.g. Blender), and from Blender export the model as an FBX.
I think you're able to achieve more in terms of animation with Unity. You can design your models in SolidWorks or CATIA and save them in IGES (a neutral format).
I have a rigged (skeleton and soft bind) model in Maya. The model is a single seamless low-poly mesh with one JPEG texture mapped. There is a simple animation of the skeleton (joint rotation). I need to get it to work with Three.js (WebGL).
Do I try to export an OBJ with morph targets somehow? I can do OBJ, but how do I get the morph targets? Can the developer I'm working with read Maya's baked animation file (.MC or .XML) in WebGL? Do I export a Collada DAE?
Any help that can steer us in the right direction would be greatly appreciated.
Thanks
THREE.js comes with an exporter for Maya, but it only works for static models. I have created an updated version that also supports exporting rigged and animated models. It doesn't require any intermediate steps: it just outputs straight to a .JS file. We have a pull request to integrate the updated exporter with the THREE trunk, but if you want to get the new and improved exporter immediately you can get it from this repository: https://github.com/BlackTowerEntertainment/three.js/tree/maya_animation_exporter. The exporter files are in utils/exporters/maya.
Hope this helps.
It is best to export a Collada DAE file from Maya in order to get your data into ThreeJS. You can preview and share your data via http://Clara.io (an online 3D editor, modeler, and animation tool), which imports Collada DAEs and uses ThreeJS for display.
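For reference, here is a minimal sketch of loading such a DAE in a recent three.js (the file name is a placeholder, and older three.js releases used a different animation API):

// Hedged sketch: load a Collada export and play its baked skeletal animation.
// 'scene' is your existing THREE.Scene; 'character.dae' is a placeholder path.
import * as THREE from 'three';
import { ColladaLoader } from 'three/examples/jsm/loaders/ColladaLoader.js';

let mixer;
const clock = new THREE.Clock();

new ColladaLoader().load('character.dae', (collada) => {
  scene.add(collada.scene);
  const clips = collada.scene.animations || [];
  if (clips.length) {
    mixer = new THREE.AnimationMixer(collada.scene);
    mixer.clipAction(clips[0]).play();
  }
});

// Call this from the render loop so the joints keep updating.
function update() {
  if (mixer) mixer.update(clock.getDelta());
}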
You should have read the FAQ, as there is plenty of info there: https://github.com/mrdoob/three.js/wiki. Most probably you need to export to Collada, as Wavefront OBJs do not support animation.