WebVR - update scene independently for each eye - three.js

I'm trying to implement a scene where an object is updated differently for each eye (e.g. I want to show the opposite rotation of a box to each eye).
I have a demo application written in WebGL using Three.js, as described in the Google Developers tutorial.
But there is only one scene, containing one mesh, with a single update function. I can't find a way to split the update so that it runs separately for each eye (just as rendering does), and I wonder if it's even possible.
Does anybody have experience with a similar case?

Your use case is rather unusual (and, may I say, eye-watering), so basically the answer is no: Three.js has abstracted away the left/right eye dichotomy of VR. Internally it renders the scene using an array of two cameras, each with the correct per-eye settings.
Fortunately, every object has an onBeforeRender(renderer, scene, camera, ...) callback. If you hook that callback and find a way to distinguish the left-eye camera from the right-eye one, you should be able to modify the orientation just before each render.
A (perhaps too) simple way to distinguish the cameras would be to keep track of the index with a counter.
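For illustration, a minimal sketch of that counter idea. It assumes the hook fires exactly twice per frame in VR (left eye first) and that a THREE.Clock named clock exists; both are assumptions about your setup, not guaranteed behaviour:

var eyeIndex = 0; // assumed: 0 = left eye, 1 = right eye

box.onBeforeRender = function (renderer, scene, camera) {
    // Rotate in opposite directions for the two eyes.
    var direction = (eyeIndex === 0) ? 1 : -1;
    box.rotation.y = direction * clock.getElapsedTime();
    box.updateMatrixWorld(true); // push the new rotation into the world matrix
    eyeIndex = (eyeIndex + 1) % 2;
};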

Related

A-Frame camera rotation using lookAt()

Word up SO,
I'm trying to pull together something akin to an 'anchor look' component in A-Frame – the idea was to combine aframe-href-component and aframe-look-at-component, so that clicking an anchor link would make the camera "look at" the entity whose id="" matches the anchor.
I thought I had a working concept just by modifying the look-at component a bit, i.e. poll for hash updates and Object3D.lookAt() the anchor, but there seems to be a problem I wasn't accounting for that probably comes from my poor understanding of Euler/quaternion/etc:
When the camera's rotation gets updated by lookAt(), it seems to lose its previous rotational reference – dragging the camera has strange rotation results, and the results get stranger the more you've rotated the camera before calling lookAt().
I've set up a basic CodePen at http://codepen.io/wosevision/pen/JWRMyK containing my version of the component to demonstrate; what is causing this, and what is the proper way to do it?
The perspective camera is nested in a group: when you drag the mouse to change the rotation, the group's rotation changes, but the perspective camera's own rotation doesn't. The result looks strange whenever the perspective camera's rotation is not (0, 0, 0).
Setting the camera rotation directly is very hard. If you really want to do it, you need to look deep into the implementation of the camera controls and modify it.
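A hedged sketch of one possible workaround: after lookAt(), copy the resulting orientation back into look-controls' internal pitch/yaw objects so subsequent drags start from the new orientation. Note that pitchObject and yawObject are undocumented internals and may differ between A-Frame versions:

var camEl = document.querySelector('[camera]');
var targetEl = document.querySelector(window.location.hash);

camEl.object3D.lookAt(targetEl.object3D.position);

var look = camEl.components['look-controls'];
if (look) {
    // Re-seed the controls' internal state (assumed internals, may vary).
    look.pitchObject.rotation.x = camEl.object3D.rotation.x;
    look.yawObject.rotation.y = camEl.object3D.rotation.y;
}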

Create floor with dynamic soft reflections

Using three.js I am trying to create a floor that reflects the objects that sit upon it. Preferably the floor material should reflect not like a mirror but in a more 'matte' or diffused way.
To achieve this I looked to Jaume Sanchez Elias, who has made a great example using a cube camera; look for the "smooth material" example on this page:
http://www.clicktorelease.com/blog/making-of-cruciform
Here is my attempt using the same technique. But as you can see, the reflections are misplaced: they do not appear underneath the mountain objects as expected.
http://dev.udart.dk/stackoverflow_reflections/
I am looking to correct this or to use any other technique that will achieve a more correct diffused reflection.
There are three.js examples using the cube camera technique but they all create mirror-like effects not a soft reflection.
Vibber. Parallax-corrected cubemaps, the technique used in cru·ci·form, only work for closed volumes, like cubes. They work really well to simulate correct reflections inside a room, but not so much for outdoor or open/large scenes. They also can't reflect anything that's inside the cubemap volume; you'd have to split the volume into many sub-volumes.
I can think of a couple of solutions for what you want to achieve:
SSR: Screen-space reflections, you can find more info in many places on the internet. It's not the most trivial of effects to implement, and you might have to change the way you render your scene.
Simpler post-processing approach: since you have a flat floor, render the mountains vertically flipped into a framebuffer object, blur it, and render the regular scene on top. For extra effect, render the depth of the flipped mountains and use that value as the blur radius, to get diffuse reflections (a rough sketch of this follows below).
As always, there's a ton of ways to achieve the (un)expected result :)
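For what it's worth, a rough three.js sketch of the second approach, with plenty of assumptions: mountains is the reflected object, the floor plane lies at y = 0, the renderer supports the setRenderTarget API, and the blur pass is left out entirely. Hypothetical names, not a drop-in implementation:

var reflectionTarget = new THREE.WebGLRenderTarget(512, 512);

function renderReflection(renderer, scene, camera) {
    // Flip the mountains vertically about the floor plane (y = 0).
    mountains.scale.y = -1;

    renderer.setRenderTarget(reflectionTarget);
    renderer.render(scene, camera);
    renderer.setRenderTarget(null);

    mountains.scale.y = 1; // restore for the normal pass
}

// reflectionTarget.texture can then be blurred (e.g. with a blur ShaderPass)
// and used as the floor material's map. Note that a negative scale flips the
// face winding, so the mirrored pass may need material.side = THREE.DoubleSide.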

How to collide .fbx animation in Unity

I am new to Unity. I have two animations in .fbx format, and they can move. Now I want a sound to play when the two collide with each other. Any idea how I can do this? Thanks in advance.
I think you need to read about how physics works, and then how trigger events and collision detection are handled.
Read this here, and this. The first one gives you insight into how the Unity engine works. The latter provides a video tutorial on how to do collision detection.
If you don't want to do that and just want the code, I found this on a quick Google:
var crashSound : AudioClip; // set this to your sound in the inspector

function OnCollisionEnter (collision : Collision) {
    // the next line requires an AudioSource component on this gameobject
    audio.PlayOneShot(crashSound);
}
You can add a MeshCollider to the .fbx meshes. However, this is not a good idea, because mesh colliders will cause performance issues.
You can create an empty gameobject for each character and add to it the .fbx animation and a simple collider (a cube, sphere, capsule, etc.). Then, when you use a script for them, you attach it to the parent object and handle everything from there.
If you want the collider to follow specific parts of the animation (like a punch or a kick), you can ask your 3D animator/modeler to add a simple mesh at those points: for example, a sphere on one fist, which will move with the animation. Then, in Unity, you hide the sphere's mesh but add a collider to it. :)
Hope it helps!
Most of the time, if you apply an animation to an object you'll lose the physics reaction. Don't trust me? See here: http://www.youtube.com/watch?v=oINKQUJZc1Q
Obviously, animations are not part of Unity physics. Think about it... Unity physics decides the position and rotation of objects according to the laws of Newton and friends. How do you think those laws can accommodate an arbitrary keyframed animation? They can't: hence the crazy results you get when you try.
How to solve it? Use Unity physics for the animation too: learn to master rigidbody.AddForce and all the other stuff described here.
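A tiny UnityScript sketch of what driving motion through physics looks like (the thrust value and the forward direction are placeholders):

var thrust : float = 10.0;

function FixedUpdate () {
    // Let the physics engine integrate position and rotation from forces,
    // so collisions keep behaving correctly.
    rigidbody.AddForce (transform.forward * thrust);
}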
In general, you want to keep the physics and the animation separated; that's how you stay out of trouble.
If you really want to know: here's my personal experience on how to mediate physics with animation.
Sometimes, even binding one parameter to the physics and another to an animation (or to a script which mediates user input) can have catastrophic results. I once made a starship: rotation controlled by the user's mouse (by flagging "block rigidbody rotation"), direction and speed by physics. It sat inside a box collider. Imagine what happens when a cube, tilted by a few degrees, meets flat ground: it should fall and rotate until one of its faces lies completely on the ground. Here that was impossible, as I had blocked any physics interaction with the rotation of the body: as a result, the box wanted to lie flat on the ground but couldn't. That tension eventually made it move forward forever: something impossible in the real world. To fix this, I made the "block rotation" parameter change dynamically with the user input: while the user is steering, the rotation is controlled by the user, but as soon as the user stops controlling the ship, the rotation is handed back to the physics engine. Another solution would be to cast a ray down from the collider, check whether the ground is near, and avoid collisions if the ship is not moving (this is how the Banshee in Halo: Combat Evolved is controlled, I think). When playing video games, always have a look at how your input is mediated by the physics engine: you may discover things a normal player wouldn't notice.

Particles vs ParticleSystem in three.js

I'm struggling with a visualization I'm working on that involves a stream of repeated images. I have it working with a single sprite with a ParticleSystem, but I can only apply a single material to the system. Since I want to choose between textures I tried creating a pool of Particle objects so that I could choose the materials individually, but I can't get an individual Particle to show up with the WebGL renderer.
This is my first foray into WebGL/Three.js, so I'm probably doing something bone-headed, but I thought it would be worth asking what the proper way to go about this is. I'm seeing three possibilities:
I'm using Particle wrong (initializing with a mapped material, adding to the scene, setting position) and I need to fix what I'm doing.
I need a ParticleSystem for each sprite I want to display.
What I'm doing doesn't fit into particles at all and I really should be using another object type.
All the examples I see using the canvas renderer use Particle directly, but I can't find an example using the WebGL renderer that doesn't use ParticleSystem. Any hints?
OK, I am going from what I have read elsewhere on this GitHub issues page; you should start by reading it. It seems that Particle is simply for the CanvasRenderer, and it will become Sprite in a future edition of Three.js. ParticleSystem, however, is not going to fulfill your needs either, it seems. I don't think these classes are going to help you accomplish this in WebGL in 3D. Depending on what you are doing, you might be better off with the CanvasRenderer anyway. ParticleSystem will only allow you to apply a single material, which serves as the material for every particle in the system, as you suggested.
Short answer:
You can render THREE.Particle using THREE.CanvasRenderer only.
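If per-image materials under the WebGLRenderer are the real goal, one hedged alternative is a THREE.Sprite per image, each with its own material. This sketch assumes a three.js version in which THREE.Sprite and THREE.SpriteMaterial exist; textureA and textureB are placeholder preloaded textures:

var textures = [textureA, textureB]; // preloaded THREE.Texture objects

for (var i = 0; i < 100; i++) {
    // Each sprite gets its own material, so each can use its own texture.
    var material = new THREE.SpriteMaterial({ map: textures[i % textures.length] });
    var sprite = new THREE.Sprite(material);
    sprite.position.set(Math.random() * 200 - 100,
                        Math.random() * 200 - 100,
                        Math.random() * 200 - 100);
    scene.add(sprite);
}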

How do I animate clouds?

I have 3 nice and puffy clouds I made in Photoshop, all the same size. Now I would like to animate them so they appear to be moving in the background. I want this animation to be the base background in all my scenes (menu, settings, score, game).
I'm using cocos2d; I have set up the menus and the buttons so they work, but how do I accomplish this?
I was thinking of adding this as a layer; any other suggestions?
Can anyone show me some code for how to do this, please?
David H
A simple way to do it is with sine and cosine. Give each cloud slightly different parameters (period and amplitude) so the user doesn't realise (as easily) that they are programmatically animated.
You may also want to play with animating the opacity value. I'm not sure if layers have one; otherwise you'll have to add the clouds to separate nodes or images and apply it to them.
It's hard to be more specific without knowing what the images look like.
The simplest way to animate anything is to add the sprite to the scene, set the position and call something like...
[myClouds runAction:[CCMoveBy actionWithDuration:10 position:CGPointMake(200, 0)]];
This will slide the sprite 200px to the right over 10 seconds. As Srekel suggested, you can play around with some trig functions to get a more natural feel and motion paths, but you'll have to schedule a selector and iteratively reposition the elements.
The more difficult part of your question is about getting the animation in the background of all scenes. Keep in mind that when you switch scenes, you're unloading one node hierarchy and loading a new one. The background cannot be shared. You CAN, however, duplicate the sprites and animation in all scenes, but when you transition between them there will be a jump.
