I want to do some work with ARKit, but when I add a .dae or .scn file to the art.scnassets group in my Xcode project, nothing shows up in the Xcode 10.1 editor (see the picture), so I can't edit the model. Adding a light or other things doesn't help.
Does anyone know how to solve it?
Suppose you've successfully imported your .dae model into Xcode's scene graph (and the model doesn't have flipped polygon normals), but you can't see it because it has no texture.
Here's a simple example of an ostensibly empty scene:
But this scene isn't empty. To see what it contains, you need to click the Show Scene Graph View button at the bottom of Xcode's UI and choose the geometry group in the list.
If you still can't see your whole .dae model in the scene graph, just press F on the keyboard to frame it. Now you're able to assign a JPEG or PNG texture to your mesh via the Diffuse slot of the material's properties.
In your case the 3D object is called maze. Just select it, press F on your keyboard to frame your model in the scene graph, and then assign a texture via the Diffuse slot in the Utility area (press Cmd-Alt-0 to activate the Utility area).
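You can also assign the texture in code rather than in the editor. Here's a minimal Swift sketch; the scene file name and "texture.png" are placeholders, while the node name "maze" matches your model:
import SceneKit
import UIKit
// Load the scene containing the imported model (placeholder file name)
let scene = SCNScene(named: "art.scnassets/maze.scn")!
// Find the model's node by name ("maze" in your case)
if let mazeNode = scene.rootNode.childNode(withName: "maze", recursively: true) {
    // Assign an image to the Diffuse slot of the node's first material
    mazeNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "texture.png")
}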
Related
I textured an object in Blender because it wouldn't texture properly in Unity, and then imported the object and texture to Unity.
I don't know how to fix this; I'll put both pictures here.
Blender Texture Before Import
Object In Unity
Okay, so based on your screenshots:
You're going to select everything, then add a modifier called "Solidify" and set it to something very small, like 0.03. (Unity doesn't like objects that are just planes.)
Double-check that all your normals are facing out. Let me know if you don't know how to do this.
Go into Edit Mode, select all edges, then right-click and select "Mark Seam".
Open the UV editor window (it should be a split screen with Edit Mode on one side and the UV view on the other). On the Edit side, select all, then go to the UV dropdown menu and click "Unwrap". You should then see your object unfolded into flat planes over on the UV window side. There are different unwrap options, like Smart UV Project, etc. Plain "Unwrap" has worked for me, but play around; there may be an option that shows your object's shapes in a less distorted way.
At this point, since your pattern is basically repeating: if you export the OBJ file, take it into Unity, and add the image file (make sure its dimensions are perfectly square), it should receive the image as an Albedo texture much better than in your screenshots. You can play with the Tiling and X/Y Offset values until it looks right (you might face issues with rotation, though).
BUT
If you want to line it up very specifically, you can export the UV layout as a PNG from the UV window in Blender. Then use Photoshop or another photo editor to change/rotate and arrange your texture so the sides line up properly. In Blender's Edit window (assuming you still have both the UV and Edit views open), when you select a side it will highlight the corresponding flat plane in the UV window; based on this you should be able to figure out what should be rotated up/down, etc. Then, when you change that 2D image and drag it into Unity, it will adjust and wrap around the object.
I'm pretty new to both, but the advice I've been given is not to do the texturing in Blender, but to do it in Unity instead.
This is a 10-month-old post, but in case anyone is curious or struggling with the same issue: Blender exports models with a scale of 100, so you need to scale up the tiling of the material (in the material settings) to see it.
This is a bad solution, however, because then you are not working with objects at a scale of 1. What you actually want is to check "Apply Transformations" when exporting the FBX model in Blender.
I've bought a 3D model from TurboSquid, and I need to scale it down to an exact size. I'm a complete newbie when it comes to 3D modelling and Blender, but I've managed to find enough tutorials to do this and export the file.
The problem is that I can only get it to export with a view from the corner of the object, which is problematic, as I want to line up several of the objects together as part of my ARKit app.
I'm exporting as a Collada .dae file, selecting "Selection Only" as I only need the model, no lights or camera (is this right?).
Is there any specific way it needs to point in Blender? Would aligning it with the camera help (even though I'm exporting the object only)? Can I select a side to be the "front"?
I can angle it somewhat correctly in the SceneKit Editor but I'd prefer to do it in Blender.
I can't remember where I got this, or I would happily give credit, but it's how I got my TurboSquid models to work. I needed to flip mine around and fix the Y-up axis, but you might be able to fix your rotation with this.
Make sure everything is selected and press R (for rotate). Just as with scaling, you'll find that mouse movement rotates the selected objects. But we want a particular rotation: 180 degrees around Blender's Z axis. To do that quickly, assuming you've already pressed R, press Z, then type 180, and press Enter. In my experience, doing this rotation will correct the orientation of your model.
Just as with scale, to apply rotation permanently, press Ctrl+A. In the menu that pops up, click “Rotation”.
Set Y-up for SceneKit if you need it:
/Applications/Xcode.app/Contents/Developer/usr/bin/scntool --convert fighter0.dae --format c3d --output out.dae --force-y-up --force-interleaved --look-for-pvrtc-image
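If you'd rather not re-export at all, the same fix can be applied at load time in SceneKit. This is just a sketch; the scene file and node lookup are placeholders:
import SceneKit
// Load the exported Collada scene (placeholder file name)
let scene = SCNScene(named: "art.scnassets/model.dae")!
// Rotate the model 180 degrees around Z, mirroring the Blender fix above
if let modelNode = scene.rootNode.childNodes.first {
    modelNode.eulerAngles.z = Float.pi
}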
Please forgive my ignorance in the area, but does anyone know how to export a Maya fluid so that it can be imported into three.js?
I'm new to Maya and I just made my first model for a game I'm making; see below:
The afterburner effect from the engine is created with Maya fluids. I have no problem exporting the aircraft and weapons, but I have no idea how to use the fluid in a three.js game. I've done hours of research trying to find any information on this topic, to no avail. Any references or assistance would be greatly appreciated!
EDIT: Here is an update on the steps I'm taking and the errors I'm receiving. I really hope this is meaningful and solvable. I can't find any good information online in relation to these errors.
Step 1: Modify > Convert > Fluid to Polygons (all works as expected)
Step 2: Select the created geometry > Edit > Keys > Bake Simulation; screenshot of the options below. (all works as expected)
Step 3: Select the geometry generated in the Fluid to Polygons step, select my fluidShape2 in the Hypershade, and perform Hypershade Edit > Convert to File Texture (Maya Software) (does not work as expected; I get an error)
Here is a screenshot of the Hypershade and polygons:
Here is the error Maya spits out:
// Error: file: /Applications/Autodesk/maya2017/Maya.app/Contents/scripts/others/copyConvertSolidTx.mel line 93: Cannot convert a texture connected to initialShadingGroup
To me it seems like the fluidShape2 node in the Hypershade is not actually a shader, and the created polygons are really assigned lambert1. I'm not really sure, though, considering I started using Maya last week.
EDIT #2:
Ok so here is a screenshot showing the fluidShape set as the surface material.
Here is a screenshot of the fluid shading attributes.
As you can see, the color is black and the only illumination for this object is what it produces itself. It is not lit from outside sources at all.
Cast / Receive shadows are unchecked in Render Stats as well.
Question: Is there any way to bake incandescence?
EDIT #3:
After following your 10-step program, I am having the same results I did previously. Here are screenshots showing before and after. A green plane has been added behind the flame to show it more easily in its second state.
As a fluid:
As polygons with 10 steps applied:
Here is the Attribute Editor:
And here is the material applied through the Hypershade:
I repeated the 10 steps you gave me several times with the same results each time.
Before exporting, you need to convert the fluid to polygons.
Here's a MEL script for testing. Run it in the Script Editor.
// opening nuke explosion example
file -import -type "mayaAscii" -ignoreVersion -ra true -mergeNamespacesOnClash false -namespace "Nuke" -options "v=0;" -pr "/Applications/Autodesk/maya2016.5/Maya.app/Contents/Examples/FX/Fluid_Examples/Explosions/Nuke.ma" ;
// selecting fluid container
select -r Nuke:fluid ;
// converting fluid to polygons
fluidToPoly ;
// changing mesh resolution under shapeNode/outputMesh tab in Attribute Editor
setAttr "Nuke:fluidShape.meshResolution" 3 ;
To convert fluids to polygons via the menu, select the fluid container and choose:
Modify – Convert – Fluid to Polygons
then select the shape node's tab in the Attribute Editor and change Mesh Resolution in the Output Mesh area.
Take a look at this useful information on exporting: Three.js Maya Export
When the conversion is done, the default Maya shader (lambert1) is assigned to the new poly mesh. You have to reassign your fluid texture to the new poly mesh. To do this, open the Hypershade and MMB-drag-and-drop the fluid texture to the Surface Material slot of the poly mesh.
Remember: if you've got animation in your scene, you need to bake it before export using:
Edit – Keys – Bake Simulation
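// bake a key on every frame from 1 to 100 for the selected object (here, nurbsSphere1)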
bakeResults -simulation true -t "1:100" -sampleBy 1 -disableImplicitControl true -preserveOutsideKeys true -sparseAnimCurveBake false -removeBakedAttributeFromLayer false -removeBakedAnimFromLayer false -bakeOnOverrideLayer false -minimizeRotation true -controlPoints false -shape true {"nurbsSphere1"} ;
You don't need any type of curve interpolation (linear or Bezier), just a baked key on every frame. You can export the animation via the FBX or ABC file formats. There's an additional method to export an animated mesh as an OBJ sequence: Free OBJ Sequences Import/Export.
Also, if you have any problems with exporting the light, bake the light and its shadows into the object's texture. The better approach for you is to export animation and textures on a per-vertex basis.
STEP-BY-STEP INSTRUCTIONS:
1. Select a fluid container and apply Modify – Convert – Fluid to Polygons. You'll see a polygonal object with a lambert1 shader in the Viewport.
2. Select the fluid object in the Outliner, then in the AE (Attribute Editor) change Mesh Resolution and Mesh Smoothing Iterations (for example, 0.5 and 2 respectively) in the fluidShape – Output Mesh area.
3. After that, select the poly object in the Outliner and choose the tab with its initialShadingGroup in the AE.
4. Open Windows – Rendering Editors – Hypershade, select the fluid texture, and assign it via MMB drag-and-drop to the Surface Material slot in the initialShadingGroup of the poly object in the AE.
5. Create a spot light in the scene. Run a test render. You'll see a lit poly.
6. To change Transparency, Opacity, or Incandescence, just select the poly object in the Outliner, and in the AE go to the fluidShape tab and choose the Shading area.
7. Bake the light (and shadows, if needed) using the Arnold renderer properties in Maya 2017, or the Mental Ray properties in Maya 2016 and earlier.
8. To output a poly mesh from Maya, you need to select the mesh and apply the command Edit – Keys – Bake Simulation.
9. To export a sequence of textures for every frame, you need to select the fluid texture in the Hypershade and apply Edit – Render Texture Range to it (you have to assign a frame range for the exported texture sequence, for example 1-200, as well as a file format, for example TIFF).
10. Export the OBJ sequence.
I am new to SceneKit and haven't been able to get the hang of the Xcode scene editor. I am doing some research on iOS ARKit, which was introduced recently. I created a sample project with the SceneKit template. It comes with a default ship.scn.
I deleted this ship.scn and created a new .scn file, then dragged and dropped a sample .dae model into it. I am able to view the object that I placed in the .scn file clearly, with all its textures.
But when I run it on the device, the object appears on top of the camera, zoomed in, and not positioned properly in the camera frame.
I had to give the object a position of x: 0, y: -60, z: -60 in the Xcode scene editor to make it appear in the center of the camera frame. But if I do this, the object will always sit at that fixed offset from the camera, and I will not be able to move/resize it correctly with touch gestures.
Can anyone please help me with how to make the object appear at the center of the camera frame in such a way that it also supports rotation/resizing/movement, etc.?
I created the scene as shown below and added it to the scene view:
let scene = SCNScene(named: "art.scnassets/Lowpoly_Notebook_2.scn")!
I have used Apple's link as reference for moving/resizing objects.
As orangenkopf noted, my problem was with the model size. It was huge, and hence didn't get rendered properly. When I changed the scale of the object in the Xcode scene editor to a minimal value, I was able to view the object correctly and place it anywhere needed.
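The same scale fix can be done in code instead of the editor. A minimal sketch, assuming the notebook model is the scene's first child node and using an arbitrary example factor of 0.01:
import SceneKit
let scene = SCNScene(named: "art.scnassets/Lowpoly_Notebook_2.scn")!
// Shrink the oversized model; 0.01 is an example factor, tune it for your asset
if let modelNode = scene.rootNode.childNodes.first {
    modelNode.scale = SCNVector3(0.01, 0.01, 0.01)
    // Place it half a meter in front of the initial camera position
    modelNode.position = SCNVector3(0, 0, -0.5)
}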
The best way is to add the .dae file to the project and then use Editor -> Convert to SceneKit scene file format (.scn).
↪️ I want to embed a street view in an app, so I created a new project
↪️ Added Street View To Skybox from the Asset Store
↪️ Imported a street view from Google Maps into a skybox
↪️ Dragged and dropped the skybox into the scene
↪️ How do I move the camera in response to touch and mouse movement, like a panorama, as in this example?
1. Unity 3D documentation
2. Unity Touch Swipe Camera Rotation - Touch FPS Controller
3. Street View Convert
4. Street view converter unity 3D panorama
5. Unity3D Tutorial: How to make Skybox 360 single texture
I'd do the following:
Create an empty GameObject called "RotorY", which will be responsible for the y rotation of your camera (looking around).
Create an empty GameObject called "RotorX" as a child of "RotorY". This one will control the x rotation (looking up and down).
Make the camera a child of RotorX. (Also make sure RotorX and the camera have a (0,0,0) position in local space.)
Add this script to any GameObject and assign rotorY and rotorX to the slots (while experimenting, I ended up writing this script, lol).
EDIT January 2017: added 2 lines to the code to make it work with Unity 5.5.