Post-processing with VR effect - three.js

I'm working on a web VR project with Three.js (r79) and I'm wondering how I can use the EffectComposer to add post-processing on top of the VR effect (one render per eye).
As far as I know, Three.js provides two example scripts for VR rendering:
VREffect
StereoEffect
But neither of the two seems to support post-processing effects.
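Neither effect runs the composer for you, but you can drive the composer yourself once per eye. A sketch of the idea, not a tested drop-in for r79: it assumes the examples scripts (EffectComposer, RenderPass) are loaded and that a `THREE.StereoCamera` is available; depending on the release, the passes may also need the scissor set as shown.

```javascript
// Sketch: run the EffectComposer once per eye instead of using StereoEffect.
// renderPass is the composer's scene pass; its camera is swapped per eye.

// pure helper: split the canvas into left/right eye viewports
function eyeViewports(width, height) {
  return {
    left:  { x: 0,         y: 0, width: width / 2, height: height },
    right: { x: width / 2, y: 0, width: width / 2, height: height }
  };
}

function renderStereoComposed(renderer, composer, renderPass, stereoCamera, camera, width, height) {
  stereoCamera.update(camera);
  var vp = eyeViewports(width, height);

  renderer.clear();

  // left eye: point the scene pass at the left camera, restrict output to half the canvas
  renderPass.camera = stereoCamera.cameraL;
  renderer.setViewport(vp.left.x, vp.left.y, vp.left.width, vp.left.height);
  renderer.setScissor(vp.left.x, vp.left.y, vp.left.width, vp.left.height);
  renderer.setScissorTest(true);
  composer.render();

  // right eye
  renderPass.camera = stereoCamera.cameraR;
  renderer.setViewport(vp.right.x, vp.right.y, vp.right.width, vp.right.height);
  renderer.setScissor(vp.right.x, vp.right.y, vp.right.width, vp.right.height);
  composer.render();

  renderer.setScissorTest(false);
  renderer.setViewport(0, 0, width, height);
}
```

The renderer needs `autoClear = false` so the second composer pass does not wipe the first eye's output.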

Related

Selective Bloom in Three.js?

I am trying to make a neon sign using three.js, using BloomPass and an emissive texture to create the effect. I am primarily following this example, as I only want one of the models in my scene to glow: https://threejs.org/examples/webgl_postprocessing_unreal_bloom_selective.html
The issue I am running into is the following.
Some elements in my background model and my Pac-Man glb model are glowing, while the neon sign is barely glowing. When I try to change the color of the neon sign, nothing happens. It also makes my scene much darker than it normally is.
Here is another example with higher exposure and settings: https://i.stack.imgur.com/KUPW9.jpg
Before the exposure dropped so low, the issue was that the entire scene had the bloom effect and was very blurry.
Based on other examples, I wrote the code with the intention of rendering everything in the scene black, then rendering the neon sign with bloom, then restoring everything to its original colors; however, this is obviously not working.
GitHub
Don't worry about how messy the code is, I'm just trying everything here lol
VERY IMPORTANT
The current code is in the NeonSign.js file; that is where I have been doing my post-processing work. DO NOT use the code in postprocessing.js: it is just for reference, is not correct, and will not reproduce this error.
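For reference, the darken/restore trick that the selective-bloom example relies on looks roughly like this. This is a sketch of the pattern from the official example (`bloomLayer` and the material cache follow that example's naming); `darkMaterial` here is a stand-in object rather than a real `THREE.MeshBasicMaterial`.

```javascript
// Two-pass selective bloom: before the bloom pass, swap every non-glowing
// mesh's material for black, render the bloom composer, then restore the
// originals and composite with the final composer.

var materials = {};
var darkMaterial = { color: 0x000000 }; // stand-in for new THREE.MeshBasicMaterial({ color: 'black' })

function darkenNonBloomed(obj, bloomLayer) {
  if (obj.isMesh && bloomLayer.test(obj.layers) === false) {
    materials[obj.uuid] = obj.material; // remember the real material
    obj.material = darkMaterial;        // draw it black during the bloom pass
  }
}

function restoreMaterial(obj) {
  if (materials[obj.uuid]) {
    obj.material = materials[obj.uuid];
    delete materials[obj.uuid];
  }
}

// render loop:
// scene.traverse(function (o) { darkenNonBloomed(o, bloomLayer); });
// bloomComposer.render();           // only the neon sign contributes to bloom
// scene.traverse(restoreMaterial);
// finalComposer.render();           // composite bloom over the normal render
```

If background models are glowing, the usual culprit is that they were never swapped to the dark material (e.g. they are not `isMesh`, or they were accidentally enabled on the bloom layer).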

The best approach to use only ThreeJS for building interactive UI without HTML DOM overlays

Can I have a 2D layer for UI (text, buttons, etc.) over the 3D scene in Three.js?
Ideally something like the engine from PixiJS inside Three.js? I've seen that PixiJS offers some 3D features, so why not combine both libraries into something super-powerful? I just do not want to place any HTML DOM elements over the WebGL canvas, as this will probably hurt performance on mobile devices.
One way to solve this is to implement the UI as screen-space sprites, as demonstrated in the following official example (check how the red sprites are rendered):
https://threejs.org/examples/webgl_sprites
The idea is to render them with a separate orthographic camera and an additional call to WebGLRenderer.render(). Besides, instances of THREE.Sprite support raycasting, which is of course useful when implementing interaction.
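A minimal sketch of that setup; the helper names are mine, the pattern follows the webgl_sprites example: an orthographic camera whose units are screen pixels with the origin at the canvas centre, and a second render call on top of the 3D scene.

```javascript
// pure helper: map top-left-origin screen pixels to the HUD camera's
// centred coordinate system
function screenToHud(x, y, width, height) {
  return { x: x - width / 2, y: height / 2 - y };
}

function makeHudCamera(width, height) {
  // assumes THREE is loaded; frustum spans the canvas, 1 unit = 1 pixel
  var cam = new THREE.OrthographicCamera(-width / 2, width / 2, height / 2, -height / 2, 1, 10);
  cam.position.z = 10;
  return cam;
}

function renderWithHud(renderer, scene, camera, hudScene, hudCamera) {
  renderer.autoClear = false;
  renderer.clear();
  renderer.render(scene, camera);       // the 3D scene
  renderer.clearDepth();                // so the UI always draws on top
  renderer.render(hudScene, hudCamera); // the 2D UI layer
}
```

Clearing the depth buffer between the two render calls is what keeps the UI from being occluded by scene geometry.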
Building on Mugen87's answer, you can also use THREE.Shape to make visual containers adapted to the user's screen size:
https://threejs.org/docs/#api/en/extras/core/Shape
You can use THREE.Shape to make mesh-based text, as illustrated in this example:
https://threejs.org/examples/?q=text#webgl_geometry_text_shapes
You should also have a look at three-mesh-ui, an add-on for building mesh-based user interfaces with three.js:
https://github.com/felixmariotto/three-mesh-ui

Is it possible to export transparency effects from Blender to ThreeJS?

I'm looking at this particular example http://tf3dm.com/3d-model/glass-91748.html
I have exported this glass to JSON with Three.js' exporter, but when rendered in the browser it does not show the material transparency achieved in Blender. In Blender this glass effect is achieved via the Z-Transparency Alpha, Fresnel, and Blend settings; these settings seem to be the magic sauce. Without them, the glass appears in Blender as it does in the browser. While I might be able to correct this with a shader in Three.js, I'm trying to determine whether it is possible without that intervention.
ThreeJS 76
Blender 2.77
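The r7x exporter does not translate Blender's Z-Transparency/Fresnel/Blend settings, so a common workaround is to patch the materials after loading rather than write a full shader. A sketch under that assumption; the opacity value is a guess to be tuned by eye, and the loader usage mirrors the r76-era JSONLoader callback.

```javascript
// Patch an exported material to look like glass; the exporter drops Blender's
// Z-Transparency settings, so these have to be set by hand after loading.
function makeGlassy(material) {
  material.transparent = true;
  material.opacity = 0.25;      // rough stand-in for Blender's Alpha value
  material.depthWrite = false;  // avoids common sorting artifacts with glass
  // Fresnel-style edge brightening has no plain-material equivalent; an
  // environment map (material.envMap) is the usual cheap approximation.
  return material;
}

// usage with the r76-era JSONLoader (sketch):
// loader.load('glass.json', function (geometry, materials) {
//   materials.forEach(makeGlassy);
//   scene.add(new THREE.Mesh(geometry, new THREE.MultiMaterial(materials)));
// });
```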

Three.js Skeletal Animation Export Issue

I've been working with an animator to help with my game. The animations all work fine using morph targets, but the file size gets way too large. Skeletal animation is the answer. We've spent a week trying to get the animations exported from Blender correctly.
After reading many articles we were able to get basic animations working correctly. I make sure to set the armature to the rest pose, export on the first frame, and all that, but the more complicated animations are off.
You can see in this example here (click to cycle animations):
http://www.titansoftime.com/beta/animation2.html
My animator said the problems are related to bone constraints used by his controllers. He said his technique is called "inverse kinematics".
Anyone have any ideas?
I have found the answer. For one, you cannot scale the geometry in the JSON loader (you can, however, scale the mesh object once it is created).
The main thing is that my animator was using inverse kinematics, which three.js apparently does not play nice with.
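In code, that first point looks like this: compute whatever scale you need and apply it to the mesh, never to the geometry handed to the loader callback. A sketch; `fitScale` and the r7x JSONLoader usage are illustrative, not from the question.

```javascript
// pure helper: scale factor so a model's height matches a target world height
function fitScale(modelHeight, targetHeight) {
  return targetHeight / modelHeight;
}

// usage with the r7x JSONLoader (sketch): scaling the geometry itself breaks
// the exported skeletal animation, so scale the mesh object instead.
// loader.load('model.json', function (geometry, materials) {
//   var mesh = new THREE.SkinnedMesh(geometry, new THREE.MeshFaceMaterial(materials));
//   var s = fitScale(20, 2); // e.g. model is 20 units tall, we want 2
//   mesh.scale.set(s, s, s);
//   scene.add(mesh);
// });
```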

3d models animation in webgl

I am new to WebGL and am trying to animate objects, not simply rotating/moving them but complex motions. For example, how can we make the movements of a hand/leg in a human model (as if the person is walking)? Right now I am using Three.js to import the OBJ model.
I used Blender 2.64 and the Three.js exporter successfully to load an animated model into the scene. Here is a nice tutorial that explains everything from animation in Blender to exporting to Three.js (although it uses an older version of the exporter, so the things you make won't work in their test application):
http://www.kadrmasconcepts.com/blog/2012/01/24/from-blender-to-threefab-exporting-three-js-morph-animations/
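Note that the tutorial predates the overhaul of the animation system; in current three.js releases, exported clips (skeletal or morph) are played through THREE.AnimationMixer. A sketch of that playback, assuming the loaded object carries its clips in an `animations` array:

```javascript
// Play a named clip from a loaded model with the modern animation API.
function startAnimation(mesh, clipName) {
  var mixer = new THREE.AnimationMixer(mesh);
  var clip = THREE.AnimationClip.findByName(mesh.animations || [], clipName);
  if (clip) mixer.clipAction(clip).play();
  return mixer;
}

// the mixer must be advanced every frame in the render loop:
// var clock = new THREE.Clock();
// function animate() {
//   requestAnimationFrame(animate);
//   mixer.update(clock.getDelta());
//   renderer.render(scene, camera);
// }
```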
