ThreeJS WebVR camera Settings

ThreeJS WebVR mode forgets the perspective camera's near and far frustum plane settings, so parts of my objects get hidden. I want a far frustum plane value of 10000, but when I enter VR mode the far plane defaults to 1000. This weirdness only happens in Google Chrome; when I use the Samsung Internet browser and enter VR, everything looks fine.

VR mode uses two different cameras, as opposed to the single camera in regular rendering mode. I'm wondering if there is a bug in the state synchronization between the two when VR mode starts, or maybe you're starting VR mode before the main camera state is initialized, and the VR cameras start up with a default state?
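One workaround worth trying, as a sketch only and not a confirmed fix: re-assert the desired far plane on the perspective camera every frame, right before rendering, so whatever state the VR path copies from the camera is always up to date. The render-loop structure below is an assumption, and scene and renderer stand in for the question's own objects.

// Sketch of a workaround (assumption, not a confirmed fix): keep forcing the
// desired far plane on the camera every frame so the VR path never picks up
// the 1000 default. 'scene' and 'renderer' come from the existing app.
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 10000);

renderer.setAnimationLoop(function () {
  if (camera.far !== 10000) {
    camera.far = 10000;                // restore the intended far plane
    camera.updateProjectionMatrix();   // push it into the projection matrix
  }
  renderer.render(scene, camera);
});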

Related

Porting deferred rendering WebGL pipeline to aframe.js

I have a WebGL 2 app that renders a bunch of point lights with a deferred pipeline. I would like to port this to A-Frame for use with the Oculus Rift S.
My questions relate only to rendering. I know next to nothing about VR-specific rendering, other than the fact that one image is rendered for each eye and then passed through some distortion filters. I see that there are components (last updated quite a while ago) that provide this functionality. My pipeline is written with a low-level WebGL library and I do not want to port it to some other component (for performance and compatibility reasons, plus my own vanity).
I would also like to avoid as much direct integration of this pipeline with three.js as possible. Right now I have a basic three.js scene with a full-screen quad textured with the output of my deferred renderer. I assume that leaving this as-is and dropping the scene into A-Frame wouldn't render properly on a Rift, so how would I go about rendering two full-screen quads, one for each eye, in A-Frame? Are the camera frustums and views for each eye easily exposed in A-Frame? Or is my thinking off entirely?
Thanks for any help. I've looked through the A-Frame Git repository for some time now and cannot find any clear place to start.
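For reference, here is a rough sketch of the non-VR setup the question describes: a plain three.js scene with a single full-screen quad textured with the deferred renderer's output. The names deferredOutputTexture and renderer are placeholders for the question's own pipeline objects.

// Rough sketch of the current non-VR setup described in the question.
// 'deferredOutputTexture' stands in for the deferred pipeline's output,
// and 'renderer' for the existing WebGLRenderer.
const quadScene = new THREE.Scene();
const quadCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);

const quad = new THREE.Mesh(
  new THREE.PlaneBufferGeometry(2, 2),
  new THREE.MeshBasicMaterial({ map: deferredOutputTexture })
);
quadScene.add(quad);

renderer.render(quadScene, quadCamera);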

Aframe orbit control camera pan and rotation ease animation when reaching the limit

I am trying to build a navigation system like Google Cloud infrastructure, like this:
google cloud infrastructure.
I want to do this using A-Frame rather than three.js, so I am now customising the A-Frame orbit controls by Kevin Ngo:
aframe orbit control.
The problem is this: I have succeeded in limiting the auto-rotation to a certain angle range, and the pan as well. But I have the following problems that I could not figure out after searching every possibility and trying myself:
how to achieve the effect of bouncing back smoothly after reaching the pan limit;
for some reason, if I pan and then release the mouse (mouseup), the camera still pans rather than rotates when the mouse moves afterwards. Why is that?
how to make the camera rotate slightly like in Google's example (I modified the original library to rotate the camera on mousemove rather than mousedown)?
Below is the Glitch link to my experiment:
aframe customized orbit control
What I customized (I annotated my changes with slashes and ADDITION text):
auto-rotates between set angles;
mouse click only pans; on mouse move, the camera rotates and auto-rotation stops;
pan can be limited.
This is a long question; I'd really appreciate it if anyone can help!
How to achieve the effect of bouncing back smoothly after reaching the pan limit?
One idea: on mouseup (i.e., on release of pan mode), check whether the camera target is outside of the camera's viewing frustum. If so, calculate a new target position, say halfway between the current target position and a point in line with the camera's z-axis (i.e., the center of the screen). Then make an animation that moves the camera target from the current location to the new location.
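A rough three.js-level sketch of that idea (illustrative only: controls.target is an assumption for however the customized component tracks its orbit target, and the animation itself is left as a comment):

// Sketch of the "bounce back" idea. 'controls.target' is a placeholder name.
const frustum = new THREE.Frustum();
const matrix = new THREE.Matrix4();

function onMouseUp() {
  matrix.multiplyMatrices(camera.projectionMatrix, camera.matrixWorldInverse);
  frustum.setFromMatrix(matrix); // setFromProjectionMatrix() in newer three.js

  if (!frustum.containsPoint(controls.target)) {
    // New target: halfway between the current target and a point straight
    // ahead of the camera (the screen center) at the same distance.
    const distance = camera.position.distanceTo(controls.target);
    const center = camera.getWorldDirection(new THREE.Vector3())
      .multiplyScalar(distance)
      .add(camera.position);
    const newTarget = controls.target.clone().lerp(center, 0.5);
    // Animate controls.target from its current value to newTarget,
    // e.g. with a small per-frame lerp or a tween/animation component.
  }
}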
For some reason, if I pan and then release the mouse (mouseup), the camera still pans rather than rotates when the mouse moves afterwards. Why is that?
It seems that the navigation mode (panning, orbiting, or zooming) does not change on mouseup. Make a new mouseup listener that forces the control back to a default mode (orbit?).
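A minimal sketch of that listener; the state names below are placeholders, since the customized component will have its own way of tracking the current mode:

// Sketch only: force the control mode back to a default on mouseup.
// 'controls.state' and 'STATE.ROTATE' are placeholder names.
window.addEventListener('mouseup', function () {
  controls.state = STATE.ROTATE; // or STATE.NONE, whichever should be the default
});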
How to make the camera rotate slightly like in Google's example (I modified the original library to rotate the camera on mousemove rather than mousedown)?
It looks like in the Google example, the orbit direction is determined by which side of center the cursor is on. The left side makes the auto-rotation go clockwise, and the right side counterclockwise. You will need to use the cursor component to detect this and change the orbit direction accordingly.
Also, it appears that in the Google version, orbiting is not driven by mousedown (i.e., dragging) but by the cursor's distance from the center, and this is added to the auto-rotation. It appears to be a buffer system: the distance from the center creates a value that alters the auto-orbit (by adding to or subtracting from the orbit amount), but that value is a buffer, meaning it decays to 0 over time (each frame the value is slowly reduced toward 0).
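A small sketch of that buffer-and-decay idea; all names and constants here are illustrative, not taken from Google's code or from the component:

// The cursor's horizontal offset from the screen center feeds an extra
// rotation value that decays back toward 0 every frame.
let rotationBuffer = 0;
const DECAY = 0.95;         // per-frame decay factor (illustrative value)
const SENSITIVITY = 0.0005; // how strongly the cursor offset feeds the buffer

window.addEventListener('mousemove', function (event) {
  const offsetFromCenter = event.clientX - window.innerWidth / 2;
  rotationBuffer += offsetFromCenter * SENSITIVITY;
});

function tick(autoRotateSpeed) {
  // Add the buffer to the base auto-rotation, then let it bleed off.
  const rotationThisFrame = autoRotateSpeed + rotationBuffer;
  rotationBuffer *= DECAY;
  return rotationThisFrame;
}

A decay factor closer to 1 makes the extra rotation bleed off more slowly, giving a longer, smoother ease-out.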

Different orientation for PlaneBufferGeometry in Chrome vs. Firefox

I've created a sample to show what I'm seeing: https://haddons.net/threejs/chrome_firefox.html
I have a basic model that loads a texture onto a plane and also pulls in some heightmap data. If I load it in Chrome the texture is oriented in one direction, and if I load it in Firefox it's rotated 180 degrees.
Am I doing something wrong, or is this a Firefox, Chrome, or three.js bug?
Unfortunately, this is a bug in the current version of three.js (R102). It will be fixed with the next release by this PR.
The problem was that textures were flipped whenever a resize was necessary, which always happens if you try to use mipmapping with an NPOT (non-power-of-two) texture. Workarounds are to configure your texture as follows, to use POT textures, or to use WebGL 2:
texture.minFilter = THREE.LinearFilter;
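For context, this is roughly where that line would go when the texture is loaded (a sketch only; the file name and material variable are placeholders):

// Illustrative context for the workaround: disable mipmapping on the loaded
// texture so three.js never has to resize the NPOT image.
const loader = new THREE.TextureLoader();
loader.load('texture.png', function (texture) { // placeholder file name
  texture.minFilter = THREE.LinearFilter;
  planeMaterial.map = texture;   // 'planeMaterial' is a placeholder
  planeMaterial.needsUpdate = true;
});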

How to set up two exactly identical 3D viewports

First of all, sorry for my English.
I have two 3D viewports, the first one for editing and the other for a live render. The problem is that I have to orbit each of them separately to get the same viewing angle. Is there an easier way to do it?
If you have both 3D viewports showing the camera view (Numpad 0), you can enable Lock Camera to View and both viewports will show the same movement. You can find the Lock Camera to View option in the View panel of the properties region (N). You only need to lock the one viewport that you will be using to move around in.
As this will actually move the location of the camera, you may want to have a second camera to use for this and switch the active camera between your viewport and rendering camera in the scene properties.

Screen tearing using OpenGL

I am encountering a screen-tearing issue using OpenGL on Windows 10. I am doing a ray-marching demo (rendering just two triangles and ray marching and shading terrain in the fragment shader). Rendering on a GTX 860M, the app has no problem staying at 60 FPS (at least at lower resolutions). Screen tearing is present, however (and the following video was rendered at an especially low resolution to make sure this has nothing to do with the complexity of the fragment shader code):
Screen tearing video at YouTube.
What I've tried:
I am creating the OpenGL context using the recommended settings.
I have tried manually turning VSync on using wglSwapIntervalEXT(1), although that should be the default behavior.
I have tried placing glFinish before the SwapBuffers call, i.e.:
while (isRunning)
{
    ProcessMessages(window, &context);  // pump window/input messages (application code)
    App::Update(&context);              // issue the GL draw calls for the frame (application code)
    glFinish();                         // block until the GPU has finished all submitted work
    SwapBuffers(windowHandle);          // present the back buffer
}
I have tried setting VSync to "Always on" in the NVIDIA Control Panel. I have also tried creating the window using GLFW instead of the Win32 API, and creating the window both in fullscreen and in resizable mode.
I haven't seen any improvement, though.
