I am using the filter to compute curvature principal directions on an imported OBJ file (MeshLab 2022.02).
Everything runs well. But is there an option to save the results of the curvature analysis as a texture, so that I can export the mesh again as an OBJ with texture and import it, for example, into Blender?
Right now I can save it, but the results are not carried along with the OBJ.
Is there an option? Any idea?
Thanks for helping.
I tried to use Texture Map Defragmentation, but MeshLab crashes there.
I am working in three.js to make terrain from heightmap data that I have in a JSON file. I now have chunks of mesh which add up to make a complete terrain, but multiple chunks mean multiple draw calls, which I believe is expensive and is making my system very slow.
This is my code:
https://github.com/pravinpoudel/three.js-boilerplate/blob/main/src/client/terrain.ts
This is my terrain after I apply the heightmap; it is of size 4104x1856, made up of chunks of size (256, 256).
I checked the documentation and examples of instanced rendering in three.js, but they all assume the same geometry, whereas in my case I have displaced the y value of each vertex according to the heightmap data from the elevation JSON file.
I believe I can make it perform better with instanced rendering, but I don't know how to implement that when each chunk's geometry has altered y values (one workaround is sketched after this post).
Moreover, is there anything more you can suggest?
I am learning a lot from this group, and I believe I can get more insight here.
Thank you everyone for being here and helping people like me!
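For reference, instancing assumes every instance shares one geometry, so chunks with pre-displaced vertices don't fit it directly. A common alternative is to merge the chunk geometries into a single BufferGeometry, which collapses the terrain into one draw call. Below is a minimal sketch, assuming each chunk is already a THREE.Mesh with its heights applied and all chunks share one material; the chunks array and import path are illustrative, and the helper was named mergeBufferGeometries in older three.js releases.

import * as THREE from 'three';
import * as BufferGeometryUtils from 'three/examples/jsm/utils/BufferGeometryUtils.js';

// Merge per-chunk geometries into one, so the whole terrain is one draw call.
// Requires every chunk geometry to have the same attribute layout.
function mergeTerrainChunks(chunks: THREE.Mesh[]): THREE.Mesh {
  const geometries = chunks.map((chunk) => {
    // Bake each chunk's world transform into a copy of its geometry,
    // so the merged mesh lines up exactly like the separate chunks did.
    chunk.updateMatrixWorld(true);
    const geometry = chunk.geometry.clone();
    geometry.applyMatrix4(chunk.matrixWorld);
    return geometry;
  });

  // Named mergeBufferGeometries in older three.js releases.
  const merged = BufferGeometryUtils.mergeGeometries(geometries);
  merged.computeVertexNormals();
  return new THREE.Mesh(merged, chunks[0].material as THREE.Material);
}

The trade-off is that you lose per-chunk frustum culling. If you still want instancing proper, the other route is to keep one flat plane geometry for every instance and displace y in the vertex shader by sampling the heightmap as a texture.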
I have a photo-realistic scene already created in 3ds Max. I want to render the scene on the web using WebGL and three.js. To get the realistic effects created in 3ds Max using the mental ray renderer, I tried to bake the light maps from 3ds Max to JPEG files and then map objects in three.js to the texture (exported JPEG) files. But the effects in three.js seem to be stretched out and not positioned properly. Is my approach correct in the first place? If yes, could it be a problem with the UV mapping from 3ds Max? Please provide some links, if possible, on how to map UVs properly in 3ds Max while baking, if that's the issue.
Also, do I need to use any custom shaders to get such effects? (I honestly know nothing about shaders, if this question seems silly.)
Thanks in advance.
I would highly recommend using the THREEjs exporter:
https://github.com/mrdoob/three.js/tree/dev/utils/exporters/max
I have had a lot of trouble with Maya and other programs using any of the built-in export options. Face winding, UVs and other stuff seem pretty iffy. The exporter helps.
Once you've done that, there's something else to keep in mind: THREEjs allows only two sets of UVs per piece of geometry. One for the map, bumpMap, displacementMap, etc., and another for the lightmap. So if those two UV sets end up different from one another, you might need to swap which one you assign as map and which as lightmap.
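To make that concrete, here is a minimal sketch of wiring up both UV sets; the texture file names are placeholders, and in three.js of this era the lightmap samples the second UV set, 'uv2'.

import * as THREE from 'three';

const loader = new THREE.TextureLoader();

// 'geometry' stands in for whatever the exporter produced, with a 'uv'
// attribute for the color map and ideally a 'uv2' attribute for the lightmap.
function applyBakedMaps(geometry: THREE.BufferGeometry): THREE.Mesh {
  // If only one UV set was exported, reuse it for the lightmap; swap the
  // attributes here if your lightmap was baked against the other set.
  if (!geometry.getAttribute('uv2')) {
    geometry.setAttribute('uv2', geometry.getAttribute('uv').clone());
  }
  const material = new THREE.MeshBasicMaterial({
    map: loader.load('diffuse.jpg'),       // sampled with 'uv'
    lightMap: loader.load('lightmap.jpg'), // sampled with 'uv2'
  });
  return new THREE.Mesh(geometry, material);
}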
Link a fiddle with what results you have so far and we might be able to help more. Without seeing what code you're using, the only thing I can recommend is the THREEjs exporter.
The exporter for 3ds Max has been dropped from the official three.js repo; you should use the glTF format instead. See this official page for the list of glTF-compatible Max exporters:
https://github.com/KhronosGroup/glTF
Has anyone got any ideas on how to load real terrain data into a three.js scene?
I would like to have a 3D model on the actual terrain, i.e. the elevations with overlaid satellite imagery.
Create scene: ok
Load and animate models: ok
Terrain and satellite imagery: ???
Thanks in advance.
Jon
Three.js has an example on how to make a terrain, so that one's covered.
Regarding the satellite imagery, you'll use that as a texture on your terrain. The only important thing is to get the texture coordinates right, and that may end up being tricky.
This blog post gives a good example and its code is available online, too.
If you somehow have, or are able to calculate, the elevation data of the points you need in a grid layout, you can use a plane geometry and a JavaScript XHR loader to load your data into the plane geometry's vertices.
Use any type of material you need for the plane and define its "map" attribute to add the image texture loaded with ImageLoader.
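A minimal sketch of that grid approach, assuming elevation.json holds a flat row-major array of heights, one per vertex, and satellite.jpg is the imagery; all file names and sizes here are made up.

import * as THREE from 'three';

// Build a terrain mesh from grid elevations and drape an image over it.
async function buildTerrain(widthSegments: number, heightSegments: number): Promise<THREE.Mesh> {
  const response = await fetch('elevation.json');
  // Expects (widthSegments + 1) * (heightSegments + 1) height values.
  const heights: number[] = await response.json();

  const geometry = new THREE.PlaneGeometry(100, 100, widthSegments, heightSegments);
  geometry.rotateX(-Math.PI / 2); // lay the plane flat so height goes into y

  const position = geometry.getAttribute('position');
  for (let i = 0; i < position.count; i++) {
    position.setY(i, heights[i]); // one elevation sample per vertex
  }
  position.needsUpdate = true;
  geometry.computeVertexNormals();

  // The satellite image stretches across the whole plane via the default UVs.
  const texture = new THREE.TextureLoader().load('satellite.jpg');
  const material = new THREE.MeshLambertMaterial({ map: texture }); // needs a light in the scene
  return new THREE.Mesh(geometry, material);
}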
If you have randomly placed elevation data, you can use Face3 or another type of three.js geometry and an algorithm that creates a TIN (triangulated irregular network) to visualize the terrain.
Also, you might want to take a look at the Cesium library and the Cesium.js documentation for the geospatial part of the question, and at terrain loading using this three.js method and this osg.js demo.
I'm working on a vector field over Perlin noise, and it was suggested that I speed it up using shaders. My graphics knowledge is still very basic, but I would like to ask whether my thinking on how to do it is correct.
Here is what I have (it is not the latest version with the 3rd dimension, but you will get the concept, I guess).
So I will pass time and the noise value as attributes to the vertex shader. Unfortunately, I'm using a noise function from a library which requires positions that would have to be calculated every frame in the shader. Is it possible to output from the shader a variable with the position calculated inside it for every particle?
I've also found something like "https://github.com/ashima/webgl-noise/wiki" for generating the noise inside the shader, but how do I update each particle's x, y, z position after moving it by the field value, and keep it for the next frame? GLSL shaders should also have built-in functions for noise generation, but I don't think you can use them with three.js?
Thank you in advance for any advice!
Have a look at this example: http://threejs.org/examples/#webgl_terrain_dynamic
It will give you some idea of noise creation with shaders and of updating positions dynamically.
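On the "keep it for the next frame" part: the usual trick is to store particle positions in a floating-point texture and ping-pong between two render targets, so each frame's output becomes the next frame's input. Here is a minimal sketch, assuming three.js's GPUComputationRenderer example helper; the field function is a stand-in for a real noise snippet such as ashima's.

import * as THREE from 'three';
import { GPUComputationRenderer } from 'three/examples/jsm/misc/GPUComputationRenderer.js';

const SIZE = 256; // SIZE * SIZE particles, one texel of position data each

// Compute shader: reads last frame's position, advances it along the field.
// 'resolution' and the 'texturePosition' sampler are injected by the helper.
const positionShader = `
  uniform float time;
  void main() {
    vec2 uv = gl_FragCoord.xy / resolution.xy;
    vec4 pos = texture2D(texturePosition, uv); // last frame's output
    // Stand-in field; replace with a real noise function (e.g. ashima's).
    vec3 velocity = vec3(sin(pos.y + time), cos(pos.z + time), sin(pos.x));
    gl_FragColor = vec4(pos.xyz + velocity * 0.01, 1.0);
  }
`;

function setupCompute(renderer: THREE.WebGLRenderer) {
  const gpuCompute = new GPUComputationRenderer(SIZE, SIZE, renderer);
  const seed = gpuCompute.createTexture(); // RGBA float texture; fill
  // seed.image.data here with the particles' starting positions.
  const variable = gpuCompute.addVariable('texturePosition', positionShader, seed);
  gpuCompute.setVariableDependencies(variable, [variable]); // reads itself
  variable.material.uniforms.time = { value: 0 };
  const error = gpuCompute.init();
  if (error !== null) console.error(error);
  return { gpuCompute, variable };
}

// Per frame: advance the simulation, then hand the freshly written position
// texture to the particles' own vertex shader as a uniform:
//   gpuCompute.compute();
//   particleUniforms.texturePosition.value =
//     gpuCompute.getCurrentRenderTarget(variable).texture;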
I'm new to OpenGL ES and looking for the best approach for creating a realistic model of an eye whose pupil can dilate and constrict, so that I have a plan in mind while running through tutorials.
I've made a mesh in Blender that is basically a sphere with a hole (the 'pole', or central vertex, is removed, along with a couple of the surrounding circular edge loops).
I plan to add an iris texture directly to the sphere's polys surrounding the hole.
To change pupil size, do I just need a function to reposition the vertices of the hole so the hole dilates or contracts?
I'm going to use OpenGL within an Objective-C app. I have Jeff Lamarche's Objective-C export script. Is it standard to export only the mesh from Blender and add textures in code later in Xcode? Or is it easier/better to set up the textures on the meshes in Blender first and export the more finished product's data to Xcode?
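For what it's worth, the repositioning described in the question amounts to a radial scale of the hole's ring vertices around the pupil's central axis. Here is a rough sketch of the math, written in TypeScript purely for illustration; every name is hypothetical, and in an Objective-C/OpenGL ES app you'd run the same arithmetic on your vertex array and then re-upload the buffer (e.g. with glBufferSubData).

// positions: flat x,y,z array; holeIndices: vertices ringing the pupil hole;
// center: a point on the pupil's central axis, in the same coordinate space.
function setPupilDilation(
  positions: Float32Array,
  holeIndices: number[],
  center: [number, number, number],
  scale: number // 1.0 = rest size, > 1 dilates, < 1 constricts
): void {
  for (const i of holeIndices) {
    const base = i * 3;
    for (let axis = 0; axis < 3; axis++) {
      // Move each ring vertex toward or away from the center along its offset.
      const offset = positions[base + axis] - center[axis];
      positions[base + axis] = center[axis] + offset * scale;
    }
  }
  // For a strongly curved cornea you may also want to project the moved
  // vertices back onto the sphere so the ring stays on the surface.
}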
Your question is a bit old, so I'm not sure how much progress you've made, but as I've been climbing up the learning curve myself I thought I'd take a shot at answering.
If you want to animate the individual vertices of your model, I believe the method you'll want is Vertex Skinning. I can't speak much on that front, as I haven't yet had reason to experiment with it, although it's a technique only available in OpenGL ES 2.0. (Which is probably where you want to start anyway; the increased flexibility over 1.1 is more than worth any extra steepness in the learning curve.)
The answer to your texturing question is somewhat mixed. You'll need to actually apply the texture in OpenGL, but what Blender can do for you is determine the texture coordinates. Each vertex of your mesh will have a texture coordinate associated with it: an X, Y pair which maps to a location on the texture image. The coordinates range from 0.0 to 1.0, so, since your image texture is a rectangle, the texture coordinate {0, 0} maps to the bottom-left corner, {1, 1} maps to the top-right corner, and {0.5, 0.5} maps to the exact center of the image.
So in Blender, you'd want to go ahead and texture the object with UV mappings. When you export, although your exported mesh won't contain any of the image content, it will retain the texture coordinates which map to your image content. This will allow you to apply the texture in OpenGL so that it is applied the same way it appeared in Blender.
I've personally had some trouble getting Jeff Lamarche's script to spit out the texture coordinates, as the Blender API seems to change significantly with each release. I've had more success with an .obj converter, so I've been exporting from Blender to .obj and using a command-line tool to go from .obj to a C header file.
If you encounter similar problems with Lamarche's script, this post might help solve it: http://38leinad.wordpress.com/2012/05/29/blender-2-6-exporting-uv-texture-coordinates/
And this is a good resource for a .obj to .h script:
http://heikobehrens.net/2009/08/27/obj2opengl/