I can't make any of the USD primitives (sphere, cylinder, cube) show up in UE.
They open correctly and I can add them to the level, but they don't show up.
If I have a very simple USD primitive:
#usda 1.0
def Cube "box" {
double size = 4.0
}
I can import it into Unreal Engine, but it doesn't render.
Did they really not think about instantiating a UE sphere asset in its place?
Is there a tool that can tessellate USD primitives into static meshes?
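In case it helps frame the question: authoring the same 4-unit cube as an explicit Mesh prim (which I assume UE does import as a renderable static mesh) is the kind of result I'm after. A minimal Python (pxr) sketch, with a placeholder file name and prim path:

from pxr import Usd, UsdGeom

# placeholder output file
stage = Usd.Stage.CreateNew("box_mesh.usda")
mesh = UsdGeom.Mesh.Define(stage, "/box")

# eight corners of a cube with size 4 (extents -2..2), matching the Cube prim above
mesh.CreatePointsAttr([(-2, -2, -2), (2, -2, -2), (2, 2, -2), (-2, 2, -2),
                       (-2, -2, 2), (2, -2, 2), (2, 2, 2), (-2, 2, 2)])
mesh.CreateFaceVertexCountsAttr([4, 4, 4, 4, 4, 4])
mesh.CreateFaceVertexIndicesAttr([0, 3, 2, 1,  4, 5, 6, 7,  0, 1, 5, 4,
                                  1, 2, 6, 5,  2, 3, 7, 6,  3, 0, 4, 7])
mesh.CreateExtentAttr([(-2, -2, -2), (2, 2, 2)])

stage.GetRootLayer().Save()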
Related
I textured an object in Blender because it wouldn't texture properly in Unity, and then imported the object and texture into Unity.
I don't know how to fix this, so I'll put both pictures here.
Blender Texture Before Import
Object In Unity
Okay, so based on your screenshots..
You're going to select everything, then add a modifier called "Solidify" and set it to something very small, like 0.03 (Unity doesn't like objects that are just planes).
Double-check that all your normals are facing out. Let me know if you don't know how to do this...
Go into Edit Mode, select all edges, then right-click and select "Mark Seam".
Open a UV Editor window (it should be a split screen with Edit Mode on one side and the UV view on the other). On the Edit side, select all, then go to the UV dropdown menu and click "Unwrap". You should then see your object unfolded into flat planes over on the UV window side. There are different unwrap options, like Smart UV Project, etc. I think plain "Unwrap" has worked for me, but play around and there may be an option that shows your object's shapes in a less distorted way.
At this point, since your pattern is basically repeating: if you export the OBJ file and take it into Unity, and you add the image file (make sure its dimensions are perfectly square), it should receive the image as an Albedo texture much better than in your screenshots. You can play with the tiling and X/Y offset until it looks right (you might face issues with rotation, though).
BUT
If you want to line it up very specifically, you can export the UV layout as a PNG from the UV window in Blender. Then use Photoshop or another photo editor to change/rotate and arrange your texture so the sides line up properly. In Blender, in the Edit window (assuming you still have both the UV and Edit views open), when you select a face it will highlight the corresponding flat plane in the UV window; based on this you should be able to figure out what should be rotated up/down, etc. Then, when you change that 2D image and drag it into Unity, it will adjust and wrap around the object.
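If you'd rather script the Blender side of the steps above, here's a rough Python (bpy) sketch; it assumes your model is the active object, the file path is a placeholder, and the exact operator options can differ between Blender versions.

import bpy

obj = bpy.context.active_object  # assumes your model is the active object

# add a thin Solidify modifier so Unity isn't dealing with bare planes
solidify = obj.modifiers.new(name="Solidify", type='SOLIDIFY')
solidify.thickness = 0.03

# recalculate normals so they face outward, then unwrap
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.normals_make_consistent(inside=False)
bpy.ops.uv.unwrap()  # or bpy.ops.uv.smart_project() for the Smart UV option
bpy.ops.object.mode_set(mode='OBJECT')

# export to OBJ for Unity (placeholder path)
bpy.ops.export_scene.obj(filepath="/tmp/model.obj", use_selection=True)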
I'm pretty new to both, but the advice I've been given is to not do the texturing in Blender, but instead to do it in Unity.
This is a 10-month-old post, but in case anyone is curious or struggling with the same thing: Blender exports models with a scale of 100, so you need to scale up the tiling of the material (in the material settings) to see it.
This is a bad solution, however, because then you are not working with objects at a scale of 1, so you actually want to check "Apply Transformations" when exporting the FBX model from Blender.
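If you'd rather do it from a script, something like this rough Python (bpy) sketch should leave the exported object at a scale of 1; the path is a placeholder and the FBX option names can vary between Blender versions.

import bpy

# make the selected object's scale genuinely 1 before exporting
bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)

# export FBX; bake_space_transform corresponds to the experimental
# "Apply Transform" checkbox in the exporter (placeholder path)
bpy.ops.export_scene.fbx(filepath="/tmp/model.fbx",
                         use_selection=True,
                         apply_unit_scale=True,
                         bake_space_transform=True)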
Please forgive my ignorance in the area, but does anyone know how to go about exporting a Maya fluid so that it is importable into three.js?
I'm new to Maya and I just made my first model for a game I'm making; see below:
The afterburner effect from the engine is created with Maya fluids. I have no problem exporting the aircraft and weapons, but I have no idea how to use the fluid in a three.js game. I've done hours of research trying to find any information on this topic, to no avail. Any references or assistance would be greatly appreciated!
EDIT: Here is an update to the steps I'm taking and the errors I'm receiving. Really hope this is meaningful and solvable. I can't find any good information online in relation to these errors.
Step 1: Modify > Convert > Fluid to Polygons ( all works as expected )
Step 2: Select created geometry > Edit > Keys > Bake Simulation, screenshot of options below. ( all works as expected )
Step 3: Select the generated geometry from the Fluid to Polygons step, select my fluidShape2 in the Hypershade and perform Hypershade Edit > Convert to File Texture (Maya Software) ( does not work as expected, I get an error )
Here is a screenshot of the Hypershade and the polygons:
Here is the error Maya spits out :
// Error: file: /Applications/Autodesk/maya2017/Maya.app/Contents/scripts/others/copyConvertSolidTx.mel line 93: Cannot convert a texture connected to initialShadingGroup
To me it seems like the fluidShape2 node in the Hypershade is not actually a shader and the created polygons are really assigned lambert1. I'm not really sure though, considering I started using Maya last week.
EDIT #2:
Ok so here is a screenshot showing the fluidShape set as the surface material.
Here is a screenshot of the fluid shading attributes.
As you can see the color is black and the only illumination for this object is that which it produces. It is not lit from outside sources at all.
Cast / Receive shadows are unchecked in Render Stats as well.
Question: Is there any way to bake incandescence?
EDIT #3:
After following your 10-step program, I am having the same results I did previously. Here are screenshots to show before and after. A green plane has been added behind the flame to show it more easily in its second state.
As a fluid:
As polygons with 10 steps applied:
Here is attribute editor:
And here is the material applied through the Hypershade:
I repeated the 10 steps you gave me several times with the same results each time.
Before exporting, you need to convert the fluid to polygons.
Here's a MEL script for testing. Run it in the Script Editor.
// opening nuke explosion example
file -import -type "mayaAscii" -ignoreVersion -ra true -mergeNamespacesOnClash false -namespace "Nuke" -options "v=0;" -pr "/Applications/Autodesk/maya2016.5/Maya.app/Contents/Examples/FX/Fluid_Examples/Explosions/Nuke.ma" ;
// selecting fluid container
select -r Nuke:fluid ;
// converting fluid to polygons
fluidToPoly ;
// changing mesh resolution under shapeNode/outputMesh tab in Attribute Editor
setAttr "Nuke:fluidShape.meshResolution" 3 ;
To convert fluids to polygons via the menu, select the fluid container and choose:
Modify – Convert – Fluid to Polygons
then select the shape node's tab in the Attribute Editor and change Mesh Resolution in the Output Mesh area.
Have a look at this useful information on exporting: Three.js Maya Export
When the conversion is done, the default Maya shader (lambert1) is assigned to the new poly mesh. You have to reassign your fluid texture to the new poly mesh. To do this, open the Hypershade and MMB-drag-and-drop the fluid texture onto the Surface Material slot of the poly mesh.
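If the drag-and-drop is fiddly, the same assignment can be scripted. Here's a rough Python (maya.cmds) sketch; the node names are just assumptions for whatever your scene contains.

import maya.cmds as cmds

poly_mesh = 'fluidPolyMesh'    # placeholder: the mesh created by Fluid to Polygons
fluid_shape = 'fluidShape1'    # placeholder: your fluid shape node

# create a shading group for the converted mesh and assign the mesh to it
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name='fluidPolySG')
cmds.sets(poly_mesh, edit=True, forceElement=sg)

# drive the shading group's surface material with the fluid shape itself,
# which is the scripted equivalent of the MMB drag-and-drop above
cmds.connectAttr(fluid_shape + '.outColor', sg + '.surfaceShader', force=True)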
Remember: if you've got animation in your scene, you need to bake it before export using:
Edit – Keys – Bake Simulation
// example: bake every frame from 1 to 100 for an object called nurbsSphere1
bakeResults -simulation true -t "1:100" -sampleBy 1 -disableImplicitControl true -preserveOutsideKeys true -sparseAnimCurveBake false -removeBakedAttributeFromLayer false -removeBakedAnimFromLayer false -bakeOnOverrideLayer false -minimizeRotation true -controlPoints false -shape true {"nurbsSphere1"} ;
You don't need any type of curve interpolation (linear or bezier), just a baked key on every frame. You can export the animation via the FBX or ABC file formats. There's an additional method to export an animated mesh as an OBJ sequence: Free OBJ Sequences Import/Export.
Also, if you have any problems with exporting the light, bake the light and its shadows into the object's texture. The better way for you is to export animation and textures on a per-vertex basis.
STEP-BY-STEP INSTRUCTIONS (a scripted sketch of some of these steps follows the list):
1. Select a fluid container and apply Modify – Convert – Fluid to Polygons. You'll see in the Viewport a polygonal object with the lambert1 shader.
2. Select the fluid object in the Outliner, then in the AE (Attribute Editor) change Mesh Resolution and Mesh Smoothing Iterations (for example 0.5 and 2, respectively) in the fluidShape – Output Mesh area.
3. After that, select the poly object in the Outliner and choose the tab with its initialShadingGroup in the AE.
4. Open Windows – Rendering Editors – Hypershade, select the fluid texture and assign it using MMB-drag-and-drop to the Surface Material slot in the initialShadingGroup of the poly object in the AE.
5. Create a Spot Light in the scene. Run a test render. You'll see a lit poly object.
6. To change Transparency, Opacity or Incandescence, just select the poly object in the Outliner, and in the AE go to the fluidShape tab and open the Shading area.
7. Bake the light (and shadows if needed) using the Arnold Renderer properties in Maya 2017, or the Mental Ray properties in Maya 2016 and earlier.
8. To output the poly mesh animation from Maya, select the mesh and apply the command Edit – Keys – Bake Simulation.
9. To export a sequence of textures for every frame, select the fluid texture in the Hypershade and apply Edit – Render Texture Range to it (you have to assign a frame range for the exported texture sequence, for example 1-200, as well as a file format, for example TIFF).
10. Export an OBJ sequence.
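And here's the scripted sketch mentioned above: a rough Python (maya.cmds) version of steps 1, 2 and 8. The node names are placeholders, the attribute names are my reading of the Attribute Editor labels, and the frame range is just an example.

import maya.cmds as cmds
import maya.mel as mel

fluid_shape = 'fluidShape1'    # placeholder: your fluid shape node
poly_mesh = 'fluidPolyMesh'    # placeholder: the mesh created by the conversion

# step 1: convert the fluid to polygons (same MEL command as the script above)
cmds.select(fluid_shape, replace=True)
mel.eval('fluidToPoly')

# step 2: adjust the output mesh on the fluid shape
cmds.setAttr(fluid_shape + '.meshResolution', 0.5)
cmds.setAttr(fluid_shape + '.meshSmoothingIterations', 2)

# step 8: bake the converted mesh over an example frame range before export
cmds.bakeResults(poly_mesh, simulation=True, t=(1, 100), sampleBy=1, shape=True)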
I am attempting to modify the three.js materials example seen here, and have been fairly successful so far in reverse engineering it into my own minimalist demo.
The problem comes when I attempt to modify the materials.
I have changed mlib["Orange metal"] to the following:
"Orange metal": new THREE.MeshLambertMaterial( {
map: carTexture,
envMap: skyBox,
combine: THREE.MultiplyOperation
} )
carTexture is a reference to the following:
var carTexture = THREE.ImageUtils.loadTexture('texture/blue.jpg');
carTexture.wrapS = carTexture.wrapT = THREE.RepeatWrapping;
carTexture.repeat.set(3, 3);
And while this has changed my final output, the detail in the texture is missing.
For reference: here is the texture file:
Which clearly has a metallic flake texture.
Meanwhile my final product looks like this:
Completely smooth.
Yet, if I add a torus with the exact same texture, I can see the details quite clearly:
I have played around with the dimensions of the texture file (up to several thousand percent) and the reflectivity of the MeshLambertMaterial, but have not been able to see any change at all.
Your model needs UV coordinates.
On a model like this, it will need to be done in 3D software such as 3ds Max, Maya, etc.
If you could get your hands on a version of the model which already has the correct UV coordinates set, it would save you all the hassle.
Setting up UV coordinates is not so easy on a model like this if you have never done it before.
An alternative may be to generate your paint flake in your shader (without using UVs) rather than in a texture (I will soon be attempting this myself for a personal project).
HERE are some YouTube videos on UV unwrapping in 3ds Max.
When I import a model made in Blender to Unity, the preview image rotates; when I drag it into the editor, its rotation is fine. It only happens, though, when the model consists of more than one object and I join them with CTRL+J in Blender. Let me show an example. I made this house in a few clicks; for the "separated" one I left the roof as an individual object, and for the "joined" one I hit CTRL+J so that they became one object. Here is the image. The "separated" one shows up fine while the "joined" one rotates in the preview. I tried to import it straight into Unity as a .blend file and also exported it as an FBX from Blender, but the result is the same. For more complex models which consist of a lot of objects I like to join them so they appear as one mesh. I haven't found any fix for this; I hope someone can help. I use Unity 5.3.2 and Blender 2.76b.
Maybe you haven't noticed that the up axis for Unity and Blender is different.
In Unity it is 'Y' and in Blender it is 'Z'.
So when you import any model from Blender, Unity sets Rotation X = 90.
It's not a big deal; just group it to fix it.
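If you'd rather handle it on the Blender side at export time instead, here's a rough Python (bpy) sketch; the path is a placeholder and the option names can vary between Blender versions (bake_space_transform is the experimental "Apply Transform" checkbox).

import bpy

# export with Unity's axis convention baked into the mesh data, so the
# import shouldn't need the compensating Rotation X = 90 (placeholder path)
bpy.ops.export_scene.fbx(filepath="/tmp/house.fbx",
                         use_selection=True,
                         axis_forward='-Z',
                         axis_up='Y',
                         bake_space_transform=True)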
I wrote a PHP script to get a static image from the Street View Static Image API after giving the script a usual Google Maps URL.
But when I set the position of Street View toward the ground and run my script, I get an image of the sky, and vice versa.
Here is an example.
Usual Google Maps url:
https://maps.google.com/?ll=54.899267,23.884749&spn=0.022086,0.062485&t=m&z=15&layer=c&cbll=54.898264,23.885077&panoid=eu75VjoUqNejdSOUJEoCdA&cbp=12,17.61,,0,36.53 (<- pitch = 36.53)
And here is static image from API:
http://maps.googleapis.com/maps/api/streetview?size=640x400&location=54.898264,23.885077&heading=17.61&pitch=36.53&fov=70&sensor=false
As you can see, the pitch is the same value, but the picture shows the sky.
If you invert the pitch (-36.53), then everything is OK. (I can't show it because of my reputation; no more than 2 links.)
Is this some kind of bug, or what? I can't find any information about this.
It really appears that the values are inverted, but there is no bug.
The parameters for Google Maps AFAIK are not officially documented, so the mistake here is relying on those parameters.
But the parameters for the Street View Image API are documented:
pitch (default is 0) specifies the up or down angle of the camera relative to the Street View vehicle. This is often, but not always, flat horizontal. Positive values angle the camera up (with 90 degrees indicating straight up); negative values angle the camera down (with -90 indicating straight down).
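So in practice, just negate the pitch you read from the Maps URL before building the Street View Image API request. A quick Python sketch of the idea, using the values from the question (API key omitted):

import urllib.parse

maps_pitch = 36.53  # pitch taken from the cbp parameter of the Maps URL

params = {
    "size": "640x400",
    "location": "54.898264,23.885077",
    "heading": 17.61,
    "pitch": -maps_pitch,  # inverted for the Street View Image API
    "fov": 70,
}
print("https://maps.googleapis.com/maps/api/streetview?" + urllib.parse.urlencode(params))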