XNA 2D Camera Not Moving - matrix

I am creating a game engine and a test game, and I am currently working on cameras (using Matrix transforms). However, I am running into the strangest problem. I have a variable called CameraPosition (a Vector2) that sets the position of the camera. The camera moves fine when I change its position with something like
CameraPosition += new Vector2(0, 4);
(the camera will move up and all objects will appear to move down). However, when I do this:
CameraPosition = ship.Position;
nothing happens: the camera stays in the exact same spot, even though the ship's position is changing. I've checked, and the matrix is changing as well, but it's not affecting the drawing.
PS: The code I'm using to calculate the matrix is this:
CameraMatrix = Matrix.CreateTranslation(new Vector3(CameraPosition, 0));
and the code I'm using to draw is this:
spriteBatch.Begin(SpriteSortMode.Immediate, null, null, null, null, null, CameraMatrix);
//draw stuff here...
spriteBatch.End();
If anyone has any ideas on what's going on/needs to see more of the code, feel free to ask/tell!
EDIT: Rotation and scaling/zooming work fine: it's just the translations that are messing up.

Related

Three.js keep object static relative to outside container div - EDIT now with jsfiddle

So I have a basic setup with a three.js canvas rendered inside an HTML div. Inside the 3D world I want to position an object in such a way that it appears to be glued onto this outer div (it should never appear to move in any way). Currently I have this solution:
function render() {
    object.position.x = camera.position.x;
    object.position.y = camera.position.y;
    object.position.z = camera.position.z - 200; // keep the object 200 units in front of the camera
}
This works for panning the camera (I have rotation disabled since I don't need it). However, once I zoom in or out it obviously doesn't work any more, since zooming doesn't change the camera's position values. My approach was to incorporate the camera.zoom factor into the function above, but I couldn't get it to work properly. Is there an easy transformation function or something I can use?
Edit: I created a jsfiddle, hopefully this helps figuring out the solution. As long as you pan the camera with right mouse the yellow plane doesn't move at all (wanted behaviour). When you zoom in or out it starts to move (unwanted behaviour): https://jsfiddle.net/rdyLp7uc/2/
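One alternative sketch (not from the original post): parent the plane to the camera so it inherits every camera transform, then cancel out the zoom factor each frame. This assumes the zoom shows up as camera.zoom (e.g. an OrthographicCamera driven by OrbitControls); the plane geometry, material and the -200 offset are placeholders.
var plane = new THREE.Mesh(
    new THREE.PlaneGeometry(100, 100),
    new THREE.MeshBasicMaterial({ color: 0xffff00 })
);
plane.position.set(0, 0, -200); // fixed offset in camera space
camera.add(plane);              // the plane now follows camera position and rotation
scene.add(camera);              // the camera must be in the scene graph for its children to render

function render() {
    var s = 1 / camera.zoom;    // counteract the zoom's effect on the plane's apparent size
    plane.scale.set(s, s, s);
    renderer.render(scene, camera);
}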

Three.js OrbitControls set target breaking camera rotation using mouse/touch

I am developing a standard panorama viewer, where a 360 picture is placed inside a sphere and the user can look around using mouse and touch. I am using OrbitControls for this and it is working fine.
The user can also load a new 360 picture. After loading the picture, I try to set the camera direction so that the user is looking in a certain direction. As I am using OrbitControls, I use controls.target.set(x, y, z) to do so. However, that causes the camera to lock onto that point: if I use the mouse or touch to look around, the camera position changes and it revolves around that point, rather than looking around inside the sphere.
Has anyone else seen this kind of behavior? Do I need to do something differently?
The code is pretty simple.
controls.reset();
controls.target.set(window.newLookAt.x,window.newLookAt.y,window.newLookAt.z);
The purpose of controls.target.set(x, y, z) is to set the pivot point, so what you are seeing is the expected behavior.
Instead of setting the target (which has to stay at (0, 0, 0) in your case), why not put the camera inside a THREE.Object3D and rotate that object?
var camera = new THREE.PerspectiveCamera()
var container = new THREE.Object3D()
container.add( camera )
scene.add( container )
camera.position.set( 0, 0, 0.1 ) // small offset so OrbitControls has something to orbit
var controls = new THREE.OrbitControls( camera, renderer.domElement )
controls.target.set( 0, 0, 0 ) // optional, this is already the default
container.rotation.y = Math.PI / 2 // or whatever you want
So I ended up solving this myself. The issue was that my understanding of OrbitControls was slightly off. All I needed to do was set the target point in the same direction but much closer to the camera and, presto, the issue was solved; things are working fine now (see the sketch below).
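A minimal sketch of that workaround, assuming the desired view direction is still stored in window.newLookAt (the 0.1 offset is an arbitrary small value): place the orbit pivot a tiny distance in front of the camera, along the direction the user should face.
var dir = new THREE.Vector3()
    .subVectors(window.newLookAt, camera.position)
    .normalize();
controls.target.copy(camera.position).addScaledVector(dir, 0.1); // pivot just ahead of the camera
controls.update();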

Lensflare disappearing and rendering too slow

I'm trying to get a lensflare to work in ThreeJS.
It seems to work fine when there is some distance to the camera, but if the camera is moved to within about 50 units of the lensflare, the flare disappears. Why?
Update:
After further investigation I noticed that the lensflare works fine in the webgl_lensflares.html example. The problem is when I try to add it to the ThreeJS Editor. Adding it to the Editor causes three problems:
1. Rendering becomes painfully slow.
2. When I rotate the scene the lensflare rotates fine, but when I move the scene the lensflare moves in the opposite direction.
3. If I put the lensflare at (0, 0, 0) it disappears when I get too close to it, but if I put it somewhere away from the origin, such as (0, 10, 0), it doesn't have that problem.
Here is the code that I added to Editor in Viewport.js:
var textureLoader = new THREE.TextureLoader();
var textureFlare0 = textureLoader.load("textures/lensflare/lensflare0.png");
var flareColor = new THREE.Color(0xffffff);
flareColor.setHSL(0.55, 0.9, 0.5 + 0.5);
var lensFlare = new THREE.LensFlare(textureFlare0, 100, 1.0, THREE.AdditiveBlending, flareColor); // (texture, size, distance, blending, color)
lensFlare.position.set(0, 0, -10);
scene.add(lensFlare);
I figured out the answers to all three of my problems:
1. The NetBeans debugger was slowing down the rendering. Once I turned the debugger off it became much faster. I still notice that flare rendering slows things down a little, but it's at least usable now.
2. The reason the lensflare moved in the opposite direction was that I passed 1.0 as its third parameter (the distance). It should have been 0.0.
3. The reason I don't see the flare at (0, 0, 0) is that there is another shape located at that position. Apparently the flare is not visible if it is positioned inside another shape. I had wrongly assumed that the flare is rendered last and hence always visible.
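For reference, a sketch of the corrected call implied by points 2 and 3 (the (0, 10, 0) position is just the off-origin example from above): passing 0.0 as the distance argument keeps the flare anchored at its own position.
var lensFlare = new THREE.LensFlare(textureFlare0, 100, 0.0, THREE.AdditiveBlending, flareColor);
lensFlare.position.set(0, 10, 0); // away from the origin so no other mesh hides it
scene.add(lensFlare);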

Select the zone where to draw shadows with DirectionalLight

I use a directional light to cast a shadow on the ground of my scene. I only have a single object. I decided to use a very small shadow frustum to keep the shadow quality good. The problem is that I can't manage to shift the position of the shadow camera so that it stays on top of the object.
I tried several things:
1. Targeting my object. This works, but the center of the shadow camera doesn't change, only its angle. I don't want this behavior; I want to keep the same light direction.
2. Changing the shadowCameraRight of the light, but nothing changes.
3. Changing the shadowCameraRight of the shadowCamera; again nothing changes.
4. Changing the position of the shadowCamera. The "debugger" moves, but the shadow stops being drawn, as if I had not done anything.
I think there must be a pretty easy way of doing this but I could not find it.
Edit:
var light = new THREE.DirectionalLight(...);
var myObject = new THREE.Mesh(); // moving every frame
// here I just want to move the shadow camera so the object stays in the frustum
Best
Why not use an Object3D?
var light = new THREE.DirectionalLight(...);
var myObject = new THREE.Mesh();
var allTogether = new THREE.Object3D();
allTogether.add(light);
allTogether.add(myObject);
// here set the light position you want (relative to the group)
scene.add(allTogether);
In your loop, move the Object3D rather than your mesh, so everything is moved along with it, as sketched below.
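A minimal usage sketch of that idea (the speed value and the animate name are assumptions; adding light.target to the group is my own addition, to keep the light direction constant while the group moves):
allTogether.add(light.target); // the light keeps pointing the same way as the group moves
function animate() {
    requestAnimationFrame(animate);
    allTogether.position.x += speed; // move the group: mesh, light and shadow frustum travel together
    renderer.render(scene, camera);
}
animate();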

Three.js Retrieve data from WebGLRenderTarget (water sim)

I am trying to port this (http://madebyevan.com/webgl-water/) over to THREE. I think I'm getting close (just want the simulation for now, don't care about caustics/refraction yet). I'd like to get it working with shaders for the GPU boost.
Here's my current THREE setup using shaders: http://jsfiddle.net/EqLL9/2/
(the second smaller plane is for debugging what's currently in the WebGLRenderTarget)
What I'm struggling with is reading data back from the WebGLRenderTarget (rtTexture in my example). In the example you'll see the 4 vertices surrounding the center point are displaced upwards. This is correct (after 1 simulation step) as it starts with the center point being the only point of displacement.
If I could read the data back from the rtTexture and update the data texture (buf1) each frame, then the simulation should properly animate. How does one read the data directly from a WebGLRenderTarget? All the examples demonstrate how to send data TO the target (render to it), not read FROM it. Or am I doing it all wrong? Something's telling me I'll have to work with multiple textures and somehow swap back and forth similar to how Evan did it.
TL;DR: How can I copy data from a WebGLRenderTarget to a DataTexture after a call like this:
// render to rtTexture
renderer.render( sceneRTT, cameraRTT, rtTexture, true );
EDIT: May have found the solution at jsfiddle /gero3/UyGD8/9/
Will investigate and report back.
Ok, I figured out how to read the data using native WebGL calls:
// Render first scene into texture
renderer.render( sceneRTT, cameraRTT, rtTexture, true );
// read render texture into buffer
var gl = renderer.getContext();
gl.readPixels( 0, 0, simRes, simRes, gl.RGBA, gl.UNSIGNED_BYTE, buf1.image.data );
buf1.needsUpdate = true;
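As a side note (my addition, not part of the original answer): newer three.js revisions expose this read-back directly on the renderer, which avoids touching the raw GL context. A sketch, assuming a revision that provides readRenderTargetPixels:
// equivalent read-back without using the GL context directly
renderer.readRenderTargetPixels(rtTexture, 0, 0, simRes, simRes, buf1.image.data);
buf1.needsUpdate = true;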
The simulation now animates. However, it doesn't seem to be functioning properly (probably a dumb error I'm overlooking). It seems that the height values are never being damped and I'm not sure why. The data from buf1 is used in the fragment shader, which calculates the new height (red in RGBA), damps the value (multiplies by 0.99), then renders it to a texture. I then read this updated data from the texture back into buf1.
Here's the latest fiddle: http://jsfiddle.net/EqLL9/3/
I'll keep this updated as I progress along.
EDIT: Works great now. Just got normals implemented, and now working on environment reflection and refraction (again, purely through shaders). http://relicweb.com/webgl/rt.html
