Rendering error in 3ds Max

Hello, I made a 3D environment, but when I render it parts of the image come out red.
I tried changing the materials, but nothing changed.
If I delete the reflection the error goes away, but why does rendering produce the red parts in the first place?
There are no red objects in the scene.
Info: 3ds Max 2015, V-Ray 3.
Thanks

This happens when you have unsupported materials in the scene or when invalid colors are generated.
Make sure you're using V-Ray materials, and that the reflected objects use them as well.

Related

Fix Three.js model when scale hasn't been applied in Blender

I built a .glb viewer with Three.js. Whenever I want to display a model from Blender where the scale has not been applied before exporting (e.g. scale at 0.5 instead of 1), I get sizing issues in Three.js. It usually manifests as the model appearing smaller than its bounding box.
Is there a way to get consistent scale with Three.js?
Downloadable files:
This is the file that causes the problem:
https://firebasestorage.googleapis.com/v0/b/fotura3d-dev.appspot.com/o/cassette.glb?alt=media&token=27e08f59-68b4-4011-89a4-04bdc56365db
This is the file with transforms applied in Blender which works as expected in Three.js:
https://firebasestorage.googleapis.com/v0/b/fotura3d-dev.appspot.com/o/cassetteApplied.glb?alt=media&token=fc6decb1-aa36-4ab2-8582-9ce21111585b
Instead of adding your model to the scene with scene.add(yourModel), try scene.attach(yourModel). attach() adds the object while keeping its current world transform, whereas add() reinterprets the object's local transform under the new parent.
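For reference, here is a minimal sketch of that approach; the GLTFLoader setup, the callback structure and the bounding-box normalization are assumptions added for illustration, not taken from the answer.

import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
const loader = new GLTFLoader();

// 'cassette.glb' stands in for whatever URL you load the model from.
loader.load('cassette.glb', (gltf) => {
  const model = gltf.scene;

  // attach() keeps the object's current world transform when re-parenting,
  // while add() reinterprets its local transform under the new parent.
  scene.attach(model);

  // Alternatively, normalize the size explicitly from the bounding box,
  // so an unapplied Blender scale no longer matters.
  const box = new THREE.Box3().setFromObject(model);
  const size = box.getSize(new THREE.Vector3()).length();
  model.scale.multiplyScalar(1 / size);
});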

I can't get the aoMap showing in three.js using a glb/gltf asset

I'm having a hard time getting an aoMap working in three.js…
I have a glb asset with an aoMap on the red channel or something. When I bring it into the Babylon viewer, I can see the AO just fine, but it won't show up in the three.js viewer or in my project. I think this has something to do with a second set of UVs, but I can't find a resource that explains doing that on top of using the glTF loader… I really don't know what to do here. Any response would be greatly appreciated!
Here is my code (I'm using an html-canvas as the texture).
I get the model's geometry and diffuse texture (all white) as desired, but the aoMap isn't showing…
code
babylon viewer
three.js viewer
working application with shadows included in diffuse
not working, diffuse is just white, and aoMap is not showing
You're right about needing a second set of UVs. The reason behind this is that diffuse textures often repeat (think of a brick wall, or a checkered t-shirt). AO shading, however, is more likely to be unique on each part of the geometry, so it's almost never repetitive. Since this often calls for a different UV mapping, the default is to use a second set of UVs.
You could do one of two things:
Re-export your glTF asset with a duplicate set of UVs.
Duplicate the existing UVs in Three.js by creating a new BufferAttribute on your geometry:
// Get the existing `uv` data array
const uv1Array = mesh.geometry.getAttribute('uv').array;

// Use this array to create a new attribute named `uv2`
mesh.geometry.setAttribute('uv2', new THREE.BufferAttribute(uv1Array, 2));
.getAttribute and .setAttribute are methods of BufferGeometry, in case you want to read more about them.
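If it helps, here is a hedged sketch of applying that across a whole loaded glTF scene. The gltf object is assumed to come from a GLTFLoader callback, aoTexture is a placeholder for however you load the AO map, and this targets Three.js versions where the aoMap samples the uv2 attribute.

gltf.scene.traverse((child) => {
  if (child.isMesh) {
    const geometry = child.geometry;
    // Reuse the first UV set as uv2 so the aoMap has coordinates to sample.
    if (geometry.getAttribute('uv') && !geometry.getAttribute('uv2')) {
      geometry.setAttribute('uv2', geometry.getAttribute('uv').clone());
    }
    // If the loader did not already assign the AO map, set it here;
    // 'aoTexture' is a placeholder for however you load that texture.
    if (!child.material.aoMap) child.material.aoMap = aoTexture;
    child.material.aoMapIntensity = 1.0;
  }
});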

Export Maya fluid so that it can be imported into a three.js game

Please forgive my ignorance in the area, but does anyone know how to go about exporting a Maya fluid so that it can be imported into three.js?
I'm new to Maya and I just made my first model for a game I'm making, see below:
The afterburner effect from the engine is created with Maya fluids. I have no problem exporting the aircraft and weapons, but I have no idea how to use the fluid in a three.js game. I've done hours of research trying to find any information on this topic, to no avail. Any references or assistance would be greatly appreciated!
EDIT: Here is an update to the steps I'm taking and the errors I'm receiving. Really hope this is meaningful and solvable. I can't find any good information online in relation to these errors.
Step 1: Modify > Convert > Fluid to Polygons ( all works as expected )
Step 2: Select created geometry > Edit > Keys > Bake Simulation, screenshot of options below. ( all works as expected )
Step 3: Select the generated geometry from the Fluid to Polygons step, select my fluidShape2 in the Hypershade and perform Hypershade Edit > Convert to File Texture (Maya Software) ( does not work as expected, I get an error )
Here is a screenshot of the Hypershade and the polygons:
Here is the error Maya spits out:
// Error: file: /Applications/Autodesk/maya2017/Maya.app/Contents/scripts/others/copyConvertSolidTx.mel line 93: Cannot convert a texture connected to initialShadingGroup
To me it seems like the fluidShape2 node in the Hypershade is not actually a shader, and the created polygons are really assigned lambert1. I'm not really sure, though, considering I started using Maya last week.
EDIT #2:
Ok so here is a screenshot showing the fluidShape set as the surface material.
Here is a screenshot of the fluid shading attributes.
As you can see the color is black and the only illumination for this object is that which it produces. It is not lit from outside sources at all.
Cast / Receive shadows are unchecked in Render Stats as well.
Question: Is there any way to bake incandescence?
EDIT #3:
After following your 10-step program, I am having the same results I did previously. Here are screenshots to show before and after. A green plane has been added behind the flame to make it easier to see in its second state.
As a fluid:
As polygons with 10 steps applied:
Here is attribute editor:
And here is the material applied through hypershader:
I repeated the 10 steps you gave me several times with the same results each time.
Before exporting, you need to convert the fluid to polygons.
Here's a MEL script for testing. Run it in the Script Editor.
// opening nuke explosion example
file -import -type "mayaAscii" -ignoreVersion -ra true -mergeNamespacesOnClash false -namespace "Nuke" -options "v=0;" -pr "/Applications/Autodesk/maya2016.5/Maya.app/Contents/Examples/FX/Fluid_Examples/Explosions/Nuke.ma" ;
// selecting fluid container
select -r Nuke:fluid ;
// converting fluid to polygons
fluidToPoly ;
// changing mesh resolution under shapeNode/outputMesh tab in Attribute Editor
setAttr "Nuke:fluidShape.meshResolution" 3 ;
To convert fluids to polygons via the menu, select the fluid container and choose:
Modify – Convert – Fluid to Polygons
then select the shape node's tab in the Attribute Editor and change Mesh Resolution in the Output Mesh area.
See this useful information on exporting: Three.js Maya Export
When the conversion is done, the default Maya shader (lambert1) is assigned to the new poly mesh. You have to reassign your fluid texture to the new poly mesh. For this, open the Hypershade and MMB-drag-and-drop the fluid texture onto the Surface Material slot of the poly mesh.
Remember: if you've got animation in your scene, you need to bake it before exporting, using:
Edit – Keys – Bake Simulation
bakeResults -simulation true -t "1:100" -sampleBy 1 -disableImplicitControl true -preserveOutsideKeys true -sparseAnimCurveBake false -removeBakedAttributeFromLayer false -removeBakedAnimFromLayer false -bakeOnOverrideLayer false -minimizeRotation true -controlPoints false -shape true {"nurbsSphere1"} ;
You don't need any type of curve interpolation (linear or Bezier), just a baked key on every frame. You can export the animation via the FBX or ABC file formats. There's an additional method to export an animated mesh as an OBJ sequence: Free OBJ Sequences Import/Export.
Also, if you have any problems exporting the light, bake the light and its shadows into the object's texture. An even better approach is to export the animation and textures on a per-vertex basis.
STEP-BY-STEP INSTRUCTIONS:
1. Select the fluid container and apply Modify – Convert – Fluid to Polygons. You'll see a polygonal object with the lambert1 shader in the viewport.
2. Select the FluidObject in the Outliner, then in the AE (Attribute Editor) change Mesh Resolution and Mesh Smoothing Iterations (for example 0.5 and 2 respectively) in the fluidShape – Output Mesh area.
3. After that, select the PolyObject in the Outliner and choose the tab with its initialShadingGroup in the AE.
4. Open Windows – Rendering Editors – Hypershade, select the FluidTexture and assign it using MMB-drag-and-drop to the Surface Material slot in the initialShadingGroup of the PolyObject in the AE.
5. Create a spot light in the scene. Run a test render. You'll see a lit poly mesh.
6. To change Transparency, Opacity or Incandescence, select the PolyObject in the Outliner, and in the AE go to the fluidShape tab and choose the Shading area.
7. Bake the light (and shadows if needed) using the Arnold renderer properties in Maya 2017, or the Mental Ray properties in Maya 2016 and earlier.
8. To output an animated poly mesh from Maya, select the mesh and apply Edit – Keys – Bake Simulation.
9. To export a sequence of textures for every frame, select the FluidTexture in the Hypershade and apply Edit – Render Texture Range (you have to assign a frame range for the exported texture sequence, for example 1-200, as well as a file format, for example TIFF).
10. Export an OBJ sequence (a minimal Three.js playback sketch follows below).
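On the Three.js side, playback of such an OBJ sequence might look roughly like the following sketch; the file naming pattern, frame count and OBJLoader usage are assumptions added for illustration, not part of the original answer.

import * as THREE from 'three';
import { OBJLoader } from 'three/examples/jsm/loaders/OBJLoader.js';

const scene = new THREE.Scene();
const loader = new OBJLoader();
const frames = [];
const frameCount = 100; // assumed length of the baked sequence

// Pre-load every frame; 'flame_0001.obj' etc. is a made-up naming scheme.
for (let i = 1; i <= frameCount; i++) {
  const name = `flame_${String(i).padStart(4, '0')}.obj`;
  loader.load(name, (obj) => {
    obj.visible = false;
    frames[i - 1] = obj;
    scene.add(obj);
  });
}

let current = 0;

// Call this once per tick from the render loop to show the next frame
// and hide all the others.
function advanceFrame() {
  frames.forEach((frame, i) => {
    if (frame) frame.visible = (i === current);
  });
  current = (current + 1) % frameCount;
}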

Strange lines ("seams") appearing when applying texture to threejs model

I am rendering a model imported from a .ctm file into three.js v71. I'm then adding a texture using a MeshBasicMaterial with a map.
The original model was made in Agisoft Photoscan, exported as .obj, and then converted to the OpenCTM format using the official OpenCTM viewer program. The .ctm model itself is here.
It looks correct, except that strange "seams" appear on the texture when I load the .ctm. The .obj loads fine in three.js with no seams. What are these, and how do I get rid of them?
Here's a screenshot:
These "seams" are not present in the texture file:
UPDATE: I noticed that the seams are also visible when viewing the .ctm in the CTM viewer, so this is probably an OpenCTM conversion problem rather than a three.js loading issue.
To my chagrin it seems this is a longstanding bug in OpenCTM.
The other answers must not be reproducing the situation described in the question.
Edit: I now fully understand this problem and have a workaround for it. The issue is that most programs (Photoscan, Blender) store UVs "per loop" (per face corner) rather than strictly "per vertex". This means that when a vertex is shared by two polygons, it can have multiple UV coordinates. CTM can only store one UV coordinate per vertex, and that is what causes the problem at texture seams.
My workaround in Blender is:
1. Seams from Islands
2. Select an edge on a seam, then Select Similar --> Seam. Now all seams should be selected.
3. Mesh --> Edges --> Edge Split
4. Export to .obj, then use ctmviewer.exe to import and export to .ctm.
The seams are still visible if you look closely, but they are no longer obvious multicolored bands.
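As an aside, the same per-vertex versus per-corner idea can be seen directly in Three.js. This small sketch (not from the original answers) compares an indexed geometry, where each shared vertex can only hold one UV, with its non-indexed copy, where every face corner gets its own vertex, which is roughly what Edge Split achieves in Blender.

import * as THREE from 'three';

// Indexed geometry: vertices are shared between triangles, so each shared
// vertex can only carry a single UV coordinate.
const indexed = new THREE.BoxGeometry();
console.log('indexed vertices:', indexed.getAttribute('position').count); // 24

// Non-indexed copy: every face corner becomes its own vertex, so corners
// meeting at a seam can carry different UVs.
const unindexed = indexed.toNonIndexed();
console.log('non-indexed vertices:', unindexed.getAttribute('position').count); // 36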
I had the same issue with my Agisoft Photoscan model/texture, so I opened the texture in Photoshop and noticed it had transparency between all the patches of texture. I filled all the gaps using Content-Aware Fill and saved the texture out as a .tif without layers. This solved the issue for me.
Or you can just remove the alpha channel from the texture image file (or use the JPG format during export).

Render scene onto custom mesh with three.js

After messing around with this demo of Three.js rendering a scene to a texture, I successfully replicated the essence of it in my project: amidst my main scene, there is now a sphere, and a secondary scene is drawn onto it via a THREE.WebGLRenderTarget buffer.
I don't really need a sphere, though, and that's where I've hit a huge brick wall. When trying to map the buffer onto my simple custom mesh, I get an infinite stream of the following errors:
three.js:23444 WebGL: INVALID_VALUE: pixelStorei: invalid parameter for alignment
three.js:23557 Uncaught TypeError: Cannot read property 'width' of undefined
My geometry, approximating an annular shape, is created using this code. I've successfully UV-mapped a canvas onto it by passing {map: new THREE.Texture(canvas)} into the material options, but if I use {map: myWebGLRenderTarget} I get the errors above.
A cursory look through the call stack makes it look like three.js is assuming the presence of the texture.image attribute on myWebGLRenderTarget and attempting to call clampToMaxSize on it.
Is this a bug in three.js, or am I simply doing something wrong? Since I only need flat rendering (with MeshBasicMaterial), one of the first things I did when adapting the render-to-texture demo above was remove all trace of the shaders, and it worked great with just the sphere. Do I need those shaders back in order to use UV mapping and a custom mesh?
For what it's worth, I was needlessly setting needsUpdate = true on my texture. (The handling of needsUpdate apparently assumes the presence of a <canvas> that the texture is based on.)
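For anyone reconstructing this, here is a minimal sketch of rendering a secondary scene onto a custom mesh. It is written against a more recent Three.js API (the material's map is renderTarget.texture and renderer.setRenderTarget is used), so the exact calls differ from the r71 code in the question, and the ring geometry simply stands in for the annular mesh.

import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Main scene and camera.
const mainScene = new THREE.Scene();
const mainCamera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 100);
mainCamera.position.z = 3;

// Secondary scene that gets rendered into the texture.
const bufferScene = new THREE.Scene();
const bufferCamera = new THREE.PerspectiveCamera(60, 1, 0.1, 100);
bufferCamera.position.z = 3;
bufferScene.add(new THREE.Mesh(
  new THREE.BoxGeometry(),
  new THREE.MeshBasicMaterial({ color: 0xff0000 })
));

const renderTarget = new THREE.WebGLRenderTarget(512, 512);

// Any UV-mapped geometry works; a ring stands in for the custom annular mesh.
// Note: no needsUpdate = true here; the render target's texture already lives on the GPU.
const annulus = new THREE.Mesh(
  new THREE.RingGeometry(0.5, 1, 64),
  new THREE.MeshBasicMaterial({ map: renderTarget.texture })
);
mainScene.add(annulus);

function animate() {
  requestAnimationFrame(animate);
  renderer.setRenderTarget(renderTarget);
  renderer.render(bufferScene, bufferCamera);
  renderer.setRenderTarget(null);
  renderer.render(mainScene, mainCamera);
}
animate();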
