Three.js loaded objects are pixelated (not having a high finish) - three.js

When I load objects into Three.js (.obj or .js), the edges are not rendered cleanly.
They come out with pixelated edges (not having a high finish or sharp edges).
I tried both THREE.JSONLoader() and THREE.OBJLoader(), and both give the same result.
I opened these objects in Blender to see what they look like with the Blender renderer. The Blender renderer shows highly finished, sharp edges as expected.
The following links contain a picture and a video that give more information about my problem.
In the video, the upper edges show the problem clearly.
Image, Video
My question is: how do I remove the pixelation and get sharp edges for objects loaded in Three.js?
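For reference, jagged ("pixelated") edges like this are usually an antialiasing issue rather than a loader issue. A minimal sketch, assuming a standard WebGLRenderer setup with an existing scene and camera; the model path is a placeholder:

// Antialiasing must be requested when the renderer is created; it cannot be enabled afterwards.
var renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setPixelRatio(window.devicePixelRatio); // render at the display's native resolution
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Loading the .obj with the same loader mentioned above; 'model.obj' is a placeholder path.
var loader = new THREE.OBJLoader();
loader.load('model.obj', function (object) {
    scene.add(object);
    renderer.render(scene, camera);
});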

Related

THREE.Shape to mimic context.clip()

Using THREE.Shape, I can create holes, but rather than holes, I wish to define a clip mask.
I ONLY want to render the part of the shape inside a mask, similar to the HTML canvas context's .clip().
Is there a way to do this using holes or some other method?
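For reference, this is the hole mechanism being referred to; a minimal sketch with the standard THREE.Shape / THREE.Path API (coordinates are arbitrary). It cuts a hole out of the shape, which is the opposite of the clip-mask behaviour wanted here:

// Outer square shape
var shape = new THREE.Shape();
shape.moveTo(0, 0);
shape.lineTo(10, 0);
shape.lineTo(10, 10);
shape.lineTo(0, 10);
shape.lineTo(0, 0);

// A smaller square path registered as a hole
var hole = new THREE.Path();
hole.moveTo(3, 3);
hole.lineTo(7, 3);
hole.lineTo(7, 7);
hole.lineTo(3, 7);
hole.lineTo(3, 3);
shape.holes.push(hole);

// Everything except the hole gets rendered; a clip mask would keep only the masked region instead.
var geometry = new THREE.ShapeGeometry(shape);
var mesh = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ color: 0xff0000 }));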
EDIT:
So, more background: I was using canvas to render segments and imported them into Three.js as planes.
The mouth was one canvas, and I was able to clip-mask the teeth and tongue onto the black part.
See the whole movie at http://zsenji.com (rendered using the old canvas method).
Anyway, now I'm updating everything to use three.js, with no more canvases rendered as planes.
I'm going to try ThreeCSG, which can hopefully intersect two geometries. https://stemkoski.github.io/Three.js/CSG.html
Then all I would have to do is extrude the black of the mouth and intersect it with the teeth/tongue. I will update.
It worked.
I used a very simple intersect, similar to https://github.com/chandlerprall/ThreeCSG/blob/master/examples.html
It's a little slow and there are still some other problems relating to overlapping paths, but for this issue, this was the fix.
All the different fills you see are THREE.Shape objects.
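For reference, the kind of "very simple intersect" described, following the API used in the linked ThreeCSG examples; this is a sketch only, and mouthMesh / teethTongueMesh are hypothetical meshes standing in for the extruded mouth shape and the teeth/tongue geometry:

// Wrap both meshes (hypothetical names) in BSP trees
var mouthBsp = new ThreeBSP(mouthMesh);
var teethTongueBsp = new ThreeBSP(teethTongueMesh);

// Keep only the volume common to both, i.e. the teeth/tongue clipped to the mouth
var clippedBsp = mouthBsp.intersect(teethTongueBsp);

// Convert back to a regular THREE.Mesh, reusing the teeth/tongue material
var clippedMesh = clippedBsp.toMesh(teethTongueMesh.material);
clippedMesh.geometry.computeVertexNormals();
scene.add(clippedMesh);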

Alpha Blending and face sorting using OpenGL and GLSL

I'm writing a little 3D engine. I've just added alpha blending functionality to my program, and I wonder one thing: do I have to sort all the primitives relative to the camera?
Let's take a simple example: a scene composed of 1 skybox and 1 tree with alpha-blended leaves!
Here's a screenshot of such a scene:
Up to here, everything seems correct concerning the alpha blending of the leaves relative to each other.
But if we get closer...
... we can see there is a little problem at the top right of the image (the area around the leaf forms a quad).
I think this bug comes from the fact that these two quads (primitives) should have been rendered after the ones behind them.
What do you think about my supposition?
PS: I want to point out that all the geometry for the leaves is rendered in just one draw call.
But if I'm right, it would mean that when I need to render an alpha-blended mesh like this tree, I have to update my VBO every time the camera moves, sorting all the primitives (triangles or quads) from the camera's point of view, so that the primitives at the back are rendered first...
What do you think of my idea?
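That back-to-front ordering is indeed the classic approach for alpha blending without order-independent transparency: whenever the camera moves, sort the transparent primitives by distance and refill the index buffer. A minimal sketch, assuming an indexed triangle list and a camera position expressed in the same space as the positions array; all names are hypothetical:

// positions: flat [x, y, z, ...] array; indices: triangle index list; cameraPos: {x, y, z}
function sortTrianglesBackToFront(positions, indices, cameraPos) {
    var triCount = indices.length / 3;
    var order = new Array(triCount);

    // Squared distance from the camera to each triangle's centroid
    for (var t = 0; t < triCount; t++) {
        var cx = 0, cy = 0, cz = 0;
        for (var k = 0; k < 3; k++) {
            var i = indices[t * 3 + k] * 3;
            cx += positions[i]; cy += positions[i + 1]; cz += positions[i + 2];
        }
        var dx = cx / 3 - cameraPos.x, dy = cy / 3 - cameraPos.y, dz = cz / 3 - cameraPos.z;
        order[t] = { tri: t, dist: dx * dx + dy * dy + dz * dz };
    }

    // Farthest first, so nearer leaves are blended over the ones already drawn behind them
    order.sort(function (a, b) { return b.dist - a.dist; });

    // Rebuild the index buffer in sorted order; upload it afterwards with gl.bufferData
    var sorted = new Uint16Array(indices.length);
    for (var s = 0; s < triCount; s++) {
        var src = order[s].tri * 3;
        sorted[s * 3] = indices[src];
        sorted[s * 3 + 1] = indices[src + 1];
        sorted[s * 3 + 2] = indices[src + 2];
    }
    return sorted;
}

Sorting by centroid distance is only an approximation (intersecting or very large primitives can still blend in the wrong order), but it is usually good enough for foliage-sized quads.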

Getting distortion in animated images in Unity2D?

I am new to Unity2D. I am creating an animation from a series of images of the player. The images are very clear, but when I add them to my animation they come out distorted.
I am following this tutorial:
http://www.41post.com/4742/programming/unity-animated-texture-from-image-sequence-part-2
and it works perfectly with its own images. Note: my images have empty (transparent) areas, as PNG pictures can, and only those parts of the images get distorted, while the tutorial images have no empty areas.
A screenshot of my problem: Image sample
I didn't go through the entire tutorial line by line. Judging just from the screenshot, though, what I suspect is that you are overlapping your textures.
Imagine you are drawing pictures on the only piece of paper you have at hand. Suppose there are images 1, 2, ..., k. You draw the first image i1 on the piece of paper. Now you want to draw the second one, i2. Since you have only one piece of paper, you have to rub out your previous drawing first. Without clearing the drawings on your paper, your new drawings will always overlap the old ones. Unless you use a new piece of paper every time, of course.
Back to the question. If the images used in the animation are fully opaque (without a single transparent pixel), then of course you will not notice the difference even if you draw the new images over the old ones. But in your case, there are many transparent areas in the images. If the canvas is not cleared every time before a new image is drawn, it is obvious that the result will be something similar to what you have in the screenshot.
The images used in the tutorial are fully opaque, I suppose.

WebGL- After rendering the mesh correctly some triangles disappear

My problem is the following. I have a canvas in which I am drawing a piece using WebGL.
When it renders, it is fine.
But then, two seconds later or so, without moving the camera or anything, some of the triangles disappear.
And after moving the camera or something, the triangles that are gone stay the same (I have read that in some cases this is due to the buffer and the distance to the object, so that by zooming in or out the set of missing triangles can change).
What could be the problem?
I am applying both a color and a texture to each element in order to draw black lines around each "square" (my texture is a square with a black border and white inside). That means the final color is computed in the fragment shader by multiplying the color by the texture. It also means that some of the nodes are duplicated, or more (to give a node a texture coordinate per element, I need a separate node for each element it belongs to). It is important to notice that when I create a mesh with a smaller number of nodes, the triangles do not disappear. Anyway, I have seen very complex WebGL examples on the net, and I may have just 1000 nodes, so I don't think it is a problem with my graphics hardware.
What do you think the problem could be? How would you solve it? If you need more info, just let me know. I didn't include code because it seems to render fine at the beginning, and I only have this problem with "big" meshes.
Thanks for the comment. Please find both images here:
First draw
A few seconds later.
EDITED: I'm going to give some more details in case it helps to find the problem. I will give the information for one of the squares (the rest of the squares follow the same scheme). Notice that they are defined in the code-behind as public variables and then passed to the HTML script:
Nodes for vertex buffer:
serverSideInitialCoordinates = {
-1.0,-1.0,0.0,
1.0,-1.0,0.0,
1.0,1.0,0.0,
-1.0,1.0,0.0,
0.0,-1.0,0.0,
1.0,0.0,0.0,
0.0,1.0,0.0,
-1.0,0.0,0.0,
0.0,0.0,0.0,
};
Connectivity to form triangles:
serverSideConnectivity = {
0,4,8,
0,8,7,
1,5,8,
1,8,4,
2,6,8,
2,8,5,
3,7,8,
3,8,6
};
Colors: not relevant.
TextureVertex:
{
0.0, 0.0,
1.0, 0.0,
1.0, 1.0,
0.0, 1.0,
0.5, 0.0,
1.0, 0.5,
0.5, 1.0,
0.0, 0.5,
0.5, 0.5
};
As I mentioned, I have an image which is white with just a few black pixels around the borders. So in the fragment shader I have something similar to this:
gl_FragColor = texture2D(u_texture, v_texcoord) * vColor;
Then I have a function that loads the image and creates the texture.
In the InitBuffers function I create the buffers and assign to them the vertex positions, the colors and the connectivity of the triangles.
Finally, in the Draw function I bind the buffers again (vertex positions, colors bound as the color attribute, texture coordinates bound as the TextureVertex attribute, and connectivity), then set the matrix uniforms and draw. I don't think the problem is here, because it works fine for smaller meshes, but I still don't know why it doesn't for larger ones. I thought maybe Firefox's performance was worse than other browsers', but then I ran demanding WebGL models I found on the web in Firefox and they work fine, with no missing triangles. If I draw the same objects without the texture (just colors), it works fine and no triangles are missing. Do you think it might take too much effort for the shader to compute the color every time by multiplying both things? Can you think of another way?
My idea was just to draw black lines between some nodes instead of using a complete texture, but I can't get it working: either I draw the triangles or I draw the lines, but it won't let me draw both at the same time. If I put code for both, only the last "elements" are drawn.
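For reference, a minimal sketch of the InitBuffers / Draw sequence described above, with hypothetical names (the color buffer is omitted, serverSideTextureVertex stands in for the TextureVertex data, and positionLocation / texCoordLocation are assumed attribute locations):

var positionBuffer, texCoordBuffer, indexBuffer, indexCount;

function initBuffers(gl) {
    // Vertex positions (the serverSideInitialCoordinates data)
    positionBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(serverSideInitialCoordinates), gl.STATIC_DRAW);

    // Texture coordinates (the TextureVertex data; hypothetical variable name)
    texCoordBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, texCoordBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(serverSideTextureVertex), gl.STATIC_DRAW);

    // Triangle connectivity (the serverSideConnectivity data);
    // the Uint16Array here corresponds to gl.UNSIGNED_SHORT in drawElements below
    indexBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
    gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(serverSideConnectivity), gl.STATIC_DRAW);
    indexCount = serverSideConnectivity.length;
}

function draw(gl) {
    gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
    gl.vertexAttribPointer(positionLocation, 3, gl.FLOAT, false, 0, 0);

    gl.bindBuffer(gl.ARRAY_BUFFER, texCoordBuffer);
    gl.vertexAttribPointer(texCoordLocation, 2, gl.FLOAT, false, 0, 0);

    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
    gl.drawElements(gl.TRIANGLES, indexCount, gl.UNSIGNED_SHORT, 0);
}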

Using multiple primitives in WebGL at the same time

I am actually trying to develop a web application that visualizes a finite element mesh. In order to do so, I am using WebGL. Right now I have a page with all the code necessary to draw the mesh in the viewport using triangles as primitives (each quad element of the mesh was split into two triangles to draw it). The problem is that, when using triangles, the whole piece looks "continuous" and you can't see the separation between triangles. In fact, what I would like to achieve is to add lines between the nodes so that around each quad element (formed by two triangles) we have these lines in black, and the mesh can actually be seen.
So I was able to define the lines in my page, but since one draw call can only use one type of primitive, if I add the code for the line buffers and bind them, it just shows the lines, not the elements (as they were the last buffers bound).
So the closest solution I have found is using multiple shaders and managing them with multiple programs, but that solution only lets me either plot the geometry with triangles or draw just the lines, depending on which program is currently selected.
Could any of you help me with how to approach this issue? I have seen a Windows application that shows FE meshes using OpenGL, and it is able to mix triangles with points and lines, apart from using different layers, illumination, etc. So I am aware that this may be complicated, but I assume that if it is somehow possible with OpenGL, it should be possible with WebGL as well.
Please, if you provide a solution, I would really appreciate it if it contained some code as an example, for instance drawing a single triangle with three black lines at its borders and maybe three points at the vertices.
setup()
{
    // <your current code here>
    // Additional step: unbind the previous textures, then upload and bind a 1x1 black pixel
    // as a texture. Let this texture object be borderID.
}

Draw loop()
{
    // 1. Unbind the previous textures, bind your normal textures, and draw the mesh as in your
    //    current setup. This fills the whole area with the different colours, without borders
    //    (the current case).
    // 2. Bind the borderID texture and draw the same vertices again, except this time use
    //    context.LINE_LOOP instead of context.TRIANGLES. This draws lines with the black texture,
    //    which appear as a border on top of the previously drawn colours of each triangle.
    //    You can have something like the following, where currDrawMode toggles between drawing
    //    the border and drawing the mesh fill:

    if (currDrawMode == 0)
        context3dStore.bindTexture(context3dStore.TEXTURE_2D, meshTextureObj[bindId]);
    else
        context3dStore.bindTexture(context3dStore.TEXTURE_2D, borderTexture1pixObj[bindId]);

    context3dStore.drawElements((currDrawMode == 0) ? context3dStore.TRIANGLES : context3dStore.LINE_LOOP,
        indicesCount[bindId], context3dStore.UNSIGNED_SHORT, 0);

    // Since the line texture appears as a border over the flat colours drawn earlier, this should solve your need.
}
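As a concrete variant of the approach above, here is the example the question asked for (a single triangle with black lines at its borders): a self-contained sketch that uses a colour uniform instead of the 1x1 black texture, drawing the same vertex buffer twice, first as TRIANGLES and then as LINE_LOOP. All names are illustrative, and 'glcanvas' is an assumed canvas id:

var canvas = document.getElementById('glcanvas');
var gl = canvas.getContext('webgl');

// Minimal shaders: positions passed through, one colour uniform per draw call
var vsSource =
    'attribute vec2 a_position;' +
    'void main() { gl_Position = vec4(a_position, 0.0, 1.0); }';
var fsSource =
    'precision mediump float;' +
    'uniform vec4 u_color;' +
    'void main() { gl_FragColor = u_color; }';

function compile(type, source) {
    var shader = gl.createShader(type);
    gl.shaderSource(shader, source);
    gl.compileShader(shader);
    return shader;
}

var program = gl.createProgram();
gl.attachShader(program, compile(gl.VERTEX_SHADER, vsSource));
gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fsSource));
gl.linkProgram(program);
gl.useProgram(program);

// One triangle, used for both the fill and the outline
var vertices = new Float32Array([-0.5, -0.5, 0.5, -0.5, 0.0, 0.5]);
var buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);

var aPosition = gl.getAttribLocation(program, 'a_position');
gl.enableVertexAttribArray(aPosition);
gl.vertexAttribPointer(aPosition, 2, gl.FLOAT, false, 0, 0);
var uColor = gl.getUniformLocation(program, 'u_color');

gl.clearColor(1, 1, 1, 1);
gl.clear(gl.COLOR_BUFFER_BIT);

// Pass 1: the fill, drawn as triangles
gl.uniform4f(uColor, 0.8, 0.2, 0.2, 1.0);
gl.drawArrays(gl.TRIANGLES, 0, 3);

// Pass 2: the black border, drawn as a closed line loop over the same vertices
gl.uniform4f(uColor, 0.0, 0.0, 0.0, 1.0);
gl.drawArrays(gl.LINE_LOOP, 0, 3);

// Points at the vertices could be added the same way with gl.POINTS
// (gl_PointSize would then need to be set in the vertex shader).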
