Three.js: Sprite Material with file texture disappears when changing camera location

I am working with Three.js and I am seeing strange behavior: when I manipulate the camera location (after the user's gestures), these objects disappear and come back after additional manipulations. Sometimes they are not visible at first and only appear after relocating the camera.
The camera is defined this way:
camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 0.01, 100000);
The objects are defined this way:
new THREE.TextureLoader().load(imageUrl,
    function (texture) {
        texture.magFilter = THREE.LinearFilter;
        texture.minFilter = THREE.LinearMipMapLinearFilter;
        var material = new THREE.SpriteMaterial({ map: texture, useScreenCoordinates: true });
        var marker = new THREE.Sprite(material);
        scene.add(marker);
    });
Thanks!

Related

Manually specifying camera matrices in ThreeJS

I'm working on a project where I will draw 3D graphics on top of a video that was filmed with a real camera. I am provided with a static projection matrix and view matrix, and my task is to draw the graphics on top. I've got it working in pure WebGL, and now I'm trying to do it in ThreeJS. The problem is that I can't find a way to manually set the projection and view matrix for the camera in ThreeJS.
I've tried setting camera.matrixAutoUpdate = false and calling camera.projectionMatrix.set(matrix) and camera.matrixWorldInverse.set(matrix), but it doesn't seem to work. It is as if the actual values are never passed to the shader.
Does anyone know how this can be done?
This is what I've got so far:
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(
    75,
    window.innerWidth / window.innerHeight,
    0.1,
    1000
);
camera.matrixAutoUpdate = false;
camera.projectionMatrix.set(...)
camera.matrixWorldInverse.set(...)
var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
var geometry = new THREE.BoxGeometry();
var material = new THREE.MeshBasicMaterial({ color: 0x00ff00 });
var cube = new THREE.Mesh(geometry, material);
scene.add(cube);
var animate = function() {
    requestAnimationFrame(animate);
    renderer.render(scene, camera);
};
animate();
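For reference, a minimal sketch of one way the manual-matrix approach is commonly wired up. This is an assumption, not code from the original post: Matrix4.set() takes 16 row-major arguments, while matrices coming from a WebGL pipeline are usually flat column-major arrays, so fromArray() tends to be the safer loader, and many renderer versions rebuild matrixWorldInverse from matrixWorld during render(), which makes matrixWorld the more reliable thing to set. The names projectionArray, viewArray and viewInverseArray below are hypothetical.
// Sketch only; projectionArray / viewArray / viewInverseArray are hypothetical
// column-major arrays (the layout gl.uniformMatrix4fv expects in the pure-WebGL version).
camera.matrixAutoUpdate = false;

// Matrix4.set() expects row-major arguments, so use fromArray() for a flat column-major array.
camera.projectionMatrix.fromArray(projectionArray);

// The camera's world matrix is the inverse of the view matrix; many renderer
// versions derive matrixWorldInverse from matrixWorld inside render(), so
// setting only matrixWorldInverse can silently be overwritten.
camera.matrixWorld.fromArray(viewInverseArray);
camera.matrixWorldInverse.fromArray(viewArray);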

Sphere object deformation after changing position of sphere

I have two sphere objects in the scene. Both of them were made at the default position (scene center). There is no problem when the objects are in the middle; however, when I move one to the right and the other to the left, a strange deformation takes place. When the spheres move away from the center on the X axis they seem to get squeezed on the Y axis. It is a kind of "fish-eye" lens effect. Is it possible that some default camera value is interfering to produce such a result? Changing the FOV value does not solve it, and I did not find information about camera lens properties. What is wrong here?
I have tried using a Vector3 as a position provider and the spheres show the same result.
I have tried object.position.set(vector3) with no result.
object.position.copy(vector3) also gave the same result.
Translating the position without animation gave the same result.
Playing with the FOV parameter of the camera object also did not solve the problem.
// init
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(100, window.innerWidth/window.innerHeight, 0.1,10000);
var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
//creating sphere
var geometry = new THREE.SphereGeometry(6, 16, 16);
var material = new THREE.MeshBasicMaterial({color: 0xffffff, wireframe: true});
var sphere = new THREE.Mesh(geometry, material);
//earth
var geometry = new THREE.SphereGeometry(3, 16, 16);
var material = new THREE.MeshBasicMaterial({color: 0x0000ff, wireframe: true});
var earth = new THREE.Mesh(geometry, material);
scene.add(sphere, earth);
camera.position.z = 10;
var animation = function() {
    requestAnimationFrame(animation);
    update();
    renderer.render(scene, camera);
};
var update = function() {
    earth.rotation.y += 0.001;
    sphere.rotation.y -= 0.001;
    sphere.rotation.x -= 0.001;
    sphere.position.x += 0.001;
};
I expect to be able to move sphere.position.x -= 1; and earth.position.x += 1; without squeezing and deformation of the spheres on the Y axis.
Welcome to Stack Overflow. Thank you for taking the time to take the tour, and for including your code.
The fish-eye effect is likely being caused by your camera definition:
var camera = new THREE.PerspectiveCamera(100, window.innerWidth/window.innerHeight, 0.1,10000);
The 100 is the FOV (Field of View) for your camera, and is quite wide. In a PerspectiveCamera, the wider your FOV, the more distortion you'll see for objects closer to the edges. Try setting it lower to get a more natural effect.
Do some searches for "Perspective Distortion" and you should find a host of articles on why it happens and how to mitigate it. For starters, here's the Wikipedia page: https://en.wikipedia.org/wiki/Perspective_distortion_(photography) which has a nice animation of changing the FOV for an image of a house.
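For illustration, a minimal sketch of that suggestion, reusing the setup from the question; the exact FOV and camera distance below are only example values, not something prescribed by the answer:
// A narrower field of view greatly reduces the stretching near the screen edges.
var camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 0.1, 10000);
// With a smaller FOV the camera sees less of the scene, so move it back to keep both spheres in frame.
camera.position.z = 25;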

Three.VRControls initial rotation

I've been going through Unable to change camera position when using VRControls and Three.js - VRControls integration - How to move in the scene?, but they're not quite doing what I need.
I have a VR video app, and I've just switched to VRControls with the WebVR polyfill from something old and custom. This is working well; however, I'm really struggling to set the initial camera angle.
I.e. I want the camera to start pointing at a particular angle and then rotate with the controls; however, the controls always override this angle.
I've tried adding the camera to a dolly Group or PerspectiveCamera, and it seems like I can move the camera but not set the initial viewing angle.
Here is how the camera is set up:
container = document.getElementById('container');
camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 1, 1024);
scene = new THREE.Scene();
target = new THREE.Vector3();
renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
container.appendChild(renderer.domElement);
var vrEffect = new THREE.VREffect(renderer);
vrEffect.setSize(window.innerWidth, window.innerHeight);
var params = {
    hideButton: false,   // Default: false.
    isUndistorted: false // Default: false.
};
manager = new WebVRManager(renderer, vrEffect, params);
dolly = new THREE.PerspectiveCamera();
dolly.add( camera );
//scene.add( dolly );
controls = new THREE.VRControls(camera);// (I've tried using Dolly here)
controls.standing = true;
And I've tried various ways to rotate the camera, dolly, or scene:
camera.position.y = currentScene.pan * Math.PI/180;
//controls.resetPose();
//dolly.position.x = currentScene.tilt * Math.PI/180;
//camera.updateProjectionMatrix();
I can rotate the mesh, but then all the objects inside the mesh are in the wrong place. I could try moving them, but this seems like the wrong approach to pointing the camera where I want it.
The camera was getting re-added to the scene at a later time, overriding the dolly. It's working now.
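For reference, a minimal sketch of the dolly pattern this resolution implies, once the camera is no longer re-added directly to the scene. The initialPanDegrees variable is hypothetical and the exact setup is an assumption, not code from the original post:
// Sketch only: parent the camera to a dolly so an initial rotation survives VRControls updates.
dolly = new THREE.Group();
dolly.add(camera);
scene.add(dolly); // add the dolly to the scene, never the camera itself

controls = new THREE.VRControls(camera); // VRControls keeps overwriting the camera's own rotation...
controls.standing = true;

// ...so apply the initial viewing angle to the dolly, which the controls never touch.
dolly.rotation.y = initialPanDegrees * Math.PI / 180;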

Why does this ThreeJs plane appear to get a kink in it as the camera moves down the y-axis?

I have an instance of THREE.PlaneBufferGeometry that I apply an image texture to like this:
var camera, scene, renderer;
var geometry, material, mesh, light, floor;
scene = new THREE.Scene();
THREE.ImageUtils.loadTexture( "someImage.png", undefined, handleLoaded, handleError );
function handleLoaded(texture) {
    var geometry = new THREE.PlaneBufferGeometry(
        texture.image.naturalWidth,
        texture.image.naturalHeight,
        1,
        1
    );
    var material = new THREE.MeshBasicMaterial({
        map: texture,
        overdraw: true
    });
    floor = new THREE.Mesh( geometry, material );
    floor.material.side = THREE.DoubleSide;
    scene.add( floor );
    camera = new THREE.PerspectiveCamera( 75, window.innerWidth / window.innerHeight, 1, texture.image.naturalHeight * A_BUNCH );
    camera.position.z = texture.image.naturalWidth * 0.5;
    camera.position.y = SOME_INT;
    camera.lookAt(floor.position);
    renderer = new THREE.CanvasRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    appendToDom();
    animate();
}
function handleError() {
    console.log(arguments);
}
function appendToDom() {
    document.body.appendChild(renderer.domElement);
}
function animate() {
    requestAnimationFrame(animate);
    renderer.render(scene, camera);
}
Here's the code pen: http://codepen.io/anon/pen/qELxvj?editors=001
( Note: ThreeJs "pollutes" the global scope, to use a harsh term, and then decorates THREE using a decorator pattern, relying on scripts loading in the correct order without a module loader system. So, for brevity's sake, I simply copy-pasted the source code of a few required decorators into the code pen to ensure they load in the right order. You'll have to scroll down several thousand lines to the bottom of the code pen to play with the code that instantiates the plane, paints it and moves the camera. )
In the code pen, I simply lay the plane flat in the x-y plane, looking straight up the z-axis, as it were. Then I slowly pan the camera down along the y-axis, continuously pointing it at the plane.
As you can see in the code pen, as the camera moves along the y-axis in the negative direction, the texture on the plane appears to develop a kink in it around West Texas.
Why? How can I prevent this from happening?
I've seen similar behaviour, not in three.js or in a browser with WebGL, but with DirectX and vvvv; still, I think you'll just have to set the widthSegments/heightSegments of your PlaneBufferGeometry to a higher value (>4) and you're set!
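As an illustration, a minimal sketch of that change applied to the geometry from the question; the segment counts are only example values. The likely reason it helps is that CanvasRenderer interpolates the texture affinely across each triangle, so more triangles mean less visible distortion:
var geometry = new THREE.PlaneBufferGeometry(
    texture.image.naturalWidth,
    texture.image.naturalHeight,
    16,  // widthSegments  - was 1
    16   // heightSegments - was 1
);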

why can't orbit control work with a sphere instead of camera in three js?

I want to spin a sphere, so I wonder if orbit controls could work for that.
However, the code below doesn't work:
var geometry = new THREE.SphereGeometry(16, 16, 16);
var material = new THREE.MeshNormalMaterial();
var mesh = new THREE.Mesh( geometry, material );
scene.add( mesh );
var controls = new THREE.OrbitControls(mesh);
//then inside the animation loop
controls.update();
It seems like orbit controls only work when the argument is a camera. Why?
You must pass the camera to the OrbitControls.
Like this:
camera = new THREE.PerspectiveCamera( 45, window.innerWidth / window.innerHeight, 1, 10000 );
camera.position.set( 0, 1000, 1000 );
var controls = new THREE.OrbitControls( camera );
Does this help?
I don't think orbit controls work like that: the camera revolves around a point; it doesn't 'spin' that point. You can however change the target of the orbit controls to the sphere's position vector, allowing the camera to revolve around the sphere's position:
controls.target.copy(mesh.position);
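Putting the two answers together, a minimal sketch; passing the renderer's DOM element and calling update() each frame are the usual pattern, assumed here rather than spelled out in the answers above:
var controls = new THREE.OrbitControls(camera, renderer.domElement);
controls.target.copy(mesh.position); // orbit around the sphere, not the scene origin

// inside the animation loop
controls.update();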
