Farseer/XNA Assertion Failed: Vector2 position for body modified by camera matrix

I created a camera with a matrix and used it to move the view point in 2D. Basically I started from this template:
http://torshall.se/?p=272
I also had, in one of my classes, some simple code to spawn boxes with the mouse:
public void CreateBodies()
{
mouse = Mouse.GetState();
if (mouse.RightButton == ButtonState.Pressed)
{
Bodies += 1;
if (Bodies >= MaxBodies)
Bodies = 0;
rectBody[Bodies] = BodyFactory.CreateRectangle(world, ConvertUnits.ToSimUnits(rectangle.Width), ConvertUnits.ToSimUnits(rectangle.Height), 1);
rectBody[Bodies].Position = ConvertUnits.ToSimUnits(mouse.X, mouse.Y);
rectBody[Bodies].BodyType = BodyType.Dynamic;
}
}
This worked perfectly fine, but when I moved the "camera" the mouse clicks no longer landed in the right location. So I made this little modification in Game1.cs and in my method, to get the world coordinates of my mouse:
mouse = Mouse.GetState();
Matrix inverse = Matrix.Invert(camera.transform);
Vector2 mousePos = Vector2.Transform(new Vector2(mouse.X, mouse.Y), inverse);
TE.CreateBodies(mousePos);
public void CreateBodies(Vector2 mousePosition)
{
mouse = Mouse.GetState();
MousePosition = mousePosition;
if (mouse.RightButton == ButtonState.Pressed)
{
Bodies += 1;
if (Bodies >= MaxBodies)
{
Bodies = 0;
}
rectBody[Bodies] = BodyFactory.CreateRectangle(world, ConvertUnits.ToSimUnits(rectangle.Width), ConvertUnits.ToSimUnits(rectangle.Height), 1);
rectBody[Bodies].BodyType = BodyType.Dynamic;
rectBody[Bodies].Position = ConvertUnits.ToSimUnits(MousePosition);
}
}
Now this is supposed to give me the world coordinates of my mouse, but I have a problem: when I run the program and click somewhere on the screen to create a box, I get this error:
http://img68.xooimage.com/files/6/a/4/bob-2c526f4.png
What's going on? :/
Edit:
This is line 439 of Body.cs:
Debug.Assert(!float.IsNaN(value.X) && !float.IsNaN(value.Y));
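A quick way to narrow this down (a diagnostic sketch reusing the snippet above, not necessarily the fix): check whether the inverse-transformed mouse position is already NaN before it ever reaches Farseer. If camera.transform is not invertible, for example with a zoom factor of 0, or already holds NaN values, Matrix.Invert can produce NaN components and Body.Position then trips that assert:
mouse = Mouse.GetState();
Matrix inverse = Matrix.Invert(camera.transform);
Vector2 mousePos = Vector2.Transform(new Vector2(mouse.X, mouse.Y), inverse);
if (float.IsNaN(mousePos.X) || float.IsNaN(mousePos.Y))
{
    // The camera matrix is singular or already contains NaN, so Body.Position would fail the assert in Body.cs
    System.Diagnostics.Debug.WriteLine("camera.transform produced NaN: " + camera.transform);
}
else
{
    TE.CreateBodies(mousePos);
}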

Related

How do I scale translate x,y values?

I am working on a 2D grid with pinch-to-scale touch functionality. I've managed to set the translate boundaries so that the screen viewport doesn't go beyond the grid boundaries. I'm now struggling with the algorithm for determining the new translate values when scaling, on both two-finger touch and mouse wheel events.
touchStarted sets the vector angle between the two initial touches. lastTouchAngle is for comparison in touchMoved.
function touchStarted() {
if(touches.length == 2) {
let touchA = createVector(touches[0].x, touches[0].y);
let touchB = createVector(touches[1].x, touches[1].y);
lastTouchAngle = touchA.angleBetween(touchB);
}
return false;
}
touchMoved makes the current touches vectors, compares the angle, and then scales accordingly.
t_MinX and t_MinY set the lowest possible translate value for the constraints, but determining what the new translate value should be is where I'm lost. I know it's going to require the current scale, the center point between the two touches, and the width and height of the Canvas.
function touchMoved() {
if(touches.length == 1) {
panTranslate(translateX, translateY, mouseX, mouseY, pmouseX, pmouseY);
} else if (touches.length == 2) {
let touchA = createVector(touches[0].x, touches[0].y);
let touchB = createVector(touches[1].x, touches[1].y);
scl = (abs(lastTouchAngle) < abs(touchA.angleBetween(touchB)) ? (scl+sclStep < sclMax ? scl+sclStep : sclMax) : (scl-sclStep > sclMin ? scl-sclStep : sclMin));
let t_MinX = (screenH/sclMin) * (sclMin-scl);
let t_MinY = (screenW/sclMin) * (sclMin-scl);
let tX = translateX;
let tY = translateY;
if(abs(lastTouchAngle) > abs(touchA.angleBetween(touchB))) {
console.log("Scale out");
translateX = constrain(tX+mX, t_MinX, 0);
translateY = constrain(tY+mY, t_MinY, 0);
} else {
console.log("Scale in");
if(scl != sclMax) {
translateX = constrain(tX-mX, t_MinX, 0);
translateY = constrain(tY-mY, t_MinY, 0);
}
}
// Set current touch angle to lastTouchAngle
lastTouchAngle = touchA.angleBetween(touchB);
}
return false;
}
Here is the bit that has me confused:
translateX = constrain(tX+mX, t_MinX, 0);
translateY = constrain(tY+mY, t_MinY, 0);
Full code: https://editor.p5js.org/OMTI/sketches/9ux6Rq6n5
https://stackoverflow.com/questions/5713174
I found the answer at the link above and was able to get this working from it.
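For reference, the usual zoom-about-a-point relation (a sketch, not necessarily exactly what the linked answer does): if cx, cy is the screen-space point to zoom around (for example the midpoint between the two touches) and the scale goes from oldScl to newScl, the translate that keeps that point fixed under the zoom is
translateX = cx - (cx - translateX) * (newScl / oldScl)
translateY = cy - (cy - translateY) * (newScl / oldScl)
where the right-hand side uses the translate values from before the scale change; the result can then be clamped to [t_MinX, 0] and [t_MinY, 0] as above.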

Unity3d HTC Vive Radial Menu - Weird Glitching

So I wrote this radial menu controlled by the trackpad on the left-hand wand. It determines which button to magnify from my finger's position on the trackpad.
The weird movement can be seen here.
Here I attached my code related to this problem, the code for the left wand.
SteamVR_TrackedObject obj; //The wand
public GameObject buttonHolder; //All the buttons will be children of this object
public bool buttonEnabled;
void Awake() {
obj = GetComponent<SteamVR_TrackedObject>(); //this will be left hand controller
}
void Update() {
var device = SteamVR_Controller.Input((int)obj.index);
//if touchpad touched
if (device.GetTouch(SteamVR_Controller.ButtonMask.Touchpad))
{
if (buttonEnabled) //if radial menu is open
{
//touchPadAngle: Get the angle between touch coord and X-axis
Vector2 touchedCoord = device.GetAxis(EVRButtonId.k_EButton_Axis0); //touchpad coordinate (x, y) for this frame
float touchPadAngle = VectorAngle(new Vector2(1, 0), touchedCoord); //(1, 0) is X-axis
// ------------------- Find closest button ------------------------
//Description: The process works by calculating the angle between each button's vector and the X-axis (button_V2_to_10),
// and then finding the button with the closest angular difference to touchPadAngle.
float minAngle = float.PositiveInfinity;
Transform minButton = transform; //Temporarily assign the wand transform to it.
float pad_N_button_Angle = 0.0f; //Angle between touchPadAngle and buttonAngle.
Vector2 button_V2_to_10;
float button_Angle;
foreach (Transform bt in buttonHolder.transform)
{
button_V2_to_10 = new Vector2(transform.position.x, transform.position.z) - new Vector2(bt.position.x, bt.position.z);
button_Angle = VectorAngle(new Vector2(1, 0), button_V2_to_10);
pad_N_button_Angle = Mathf.Abs(button_Angle - touchPadAngle);
//Both buttonAngle and touchPadAngle range from -180 to 180, avoid Abs(170 - (-170)) = 340
pad_N_button_Angle = (pad_N_button_Angle > 180) ? Mathf.Abs(pad_N_button_Angle - 360) : pad_N_button_Angle;
if (pad_N_button_Angle < minAngle)
{
minButton = bt;
minAngle = pad_N_button_Angle;
}
}
//Magnify the closest button
foreach (Transform bt in buttonHolder.transform)
{
GameObject btGO = bt.gameObject;
if (!btGO.GetComponentInChildren<ButtomHandler>().onHover && bt == minButton) {
//Magnify
}
else if (bt != minButton && btGO.GetComponentInChildren<ButtomHandler>().onHover)
{
//minify
}
}
}
else {
activateButtonMenu();
}
}
//dis-hover all buttons when the finger leaves the touchpad
if (device.GetTouchUp(SteamVR_Controller.ButtonMask.Touchpad)) {
//minify any button that is still hovered
foreach (Transform bt in buttonHolder.transform)
{
GameObject btGO = bt.gameObject;
if (btGO.GetComponentInChildren<ButtomHandler>().onHover)
{
//minify
}
}
}
I'm quite stuck here, any help would really be appreciated.
"the closest angular difference to touchPadAngle"
shouldn't you consider more than one axis for a radial dial?
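One thing worth checking (a hedged sketch, not the original code): touchPadAngle is measured purely on the trackpad, while button_V2_to_10 above is built from the world positions of the wand and the buttons, so the comparison changes whenever the wand moves or rotates. Computing both angles in the same space, for example the button holder's local space, assuming the buttons are laid out in its local XY plane, would look roughly like this inside Update:
Vector2 touched = device.GetAxis(EVRButtonId.k_EButton_Axis0);
float touchAngle = Mathf.Atan2(touched.y, touched.x) * Mathf.Rad2Deg;   // -180..180, like VectorAngle
Transform closest = null;
float best = float.PositiveInfinity;
foreach (Transform bt in buttonHolder.transform)
{
    // Use the button's local position so the angle no longer depends on where the wand is in the scene
    float btAngle = Mathf.Atan2(bt.localPosition.y, bt.localPosition.x) * Mathf.Rad2Deg;
    float diff = Mathf.Abs(Mathf.DeltaAngle(btAngle, touchAngle));      // handles the -180/180 wrap-around
    if (diff < best) { best = diff; closest = bt; }
}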

Not able to Shoot in the direction of the hand

I have designed a model in Blender, imported it into Unity, applied ThirdPersonController, ThirdPersonCharacter and ThirdPersonUserControl to it, and got the animation by following the guidelines. Now I have created a script for shooting bullets and attached it to the rigged hand/gun. But whenever I click "Fire1", the bullet gets shot in another direction.
I want the hand to move in the direction of the mouse when I move the mouse, the body to rotate toward the mouse (if it is behind), and a bullet to be fired in the direction of the mouse when I left-click (one at a time).
Video for better understanding - http://tinypic.com/r/34yohli/9
I tried a script, but it's not behaving the way I want.
Shoot.js
#pragma strict
var projectile : GameObject;
var fireRate = 0.5;
private var nextFire = 0.0;
var shotDelay = .5;
function Update ()
{
if (Input.GetButton ("Fire1") && Time.time > nextFire)
{
nextFire = Time.time + fireRate;
var clone = Instantiate (projectile, transform.position, transform.rotation);
}
}
MouseMovement.cs
using UnityEngine;
using System.Collections;
public class MouseMovement : MonoBehaviour
{
public float speed = 1.5f;
private Vector3 target;
void Start()
{
target = transform.position;
}
void Update()
{
if (Input.GetMouseButtonDown(0))
{
target = Camera.main.ScreenToWorldPoint(Input.mousePosition);
target.x = transform.position.x;
}
transform.position = Vector3.MoveTowards(transform.position, target, speed * Time.deltaTime);
}
}
To detect the movement of the mouse you should use Input.GetAxis("Mouse X") or Input.GetAxis("Mouse Y"). If you want the camera to move with the character, you can set it as a child of the character. You can check the MouseLook script for more info.
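For the firing part specifically, here is a minimal C# sketch (names and values are assumptions, and it assumes the bullet prefab has a Rigidbody): give the spawned clone a velocity along the spawn point's forward direction, so the bullet travels wherever the hand/gun is pointing at the moment it is instantiated.
using UnityEngine;

public class Shoot : MonoBehaviour
{
    public GameObject projectile;    // bullet prefab with a Rigidbody
    public float fireRate = 0.5f;
    public float bulletSpeed = 20f;  // assumed value
    private float nextFire = 0f;

    void Update()
    {
        if (Input.GetButton("Fire1") && Time.time > nextFire)
        {
            nextFire = Time.time + fireRate;
            GameObject clone = Instantiate(projectile, transform.position, transform.rotation);
            Rigidbody rb = clone.GetComponent<Rigidbody>();
            if (rb != null)
                rb.velocity = transform.forward * bulletSpeed;   // fire along the hand/gun's facing
        }
    }
}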

Scale UI for multiple resolutions/different devices

I have a quite simple Unity GUI that has the following layout:
where Brekt and the others are buttons.
The GUI works just fine on PC and the canvas is set to Screen Space - Overlay, so it is supposed to adapt automatically to fit every screen.
But on a tablet the whole GUI is smaller and squeezed into the center of the screen, with huge margins around the elements (I can't attach a screenshot right now).
What is the way to fix that? Is it something in player settings or in project settings?
Automatically scaling the UI requires a combination of the anchors and pivot point of the RectTransform and the Canvas Scaler component. It is hard to understand without images or videos. It is very important that you thoroughly understand how to do this, and Unity provides a full video tutorial for it. You can watch it here.
Also, when using a Scrollbar, Scroll View and other similar UI controls, the ContentSizeFitter component is used to make sure they fit in the layout.
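For the tablet behaviour described above, the Canvas Scaler being left in "Constant Pixel Size" mode is often the culprit. A small sketch of the usual setup (normally done in the Inspector rather than in code; the reference resolution here is an assumption):
using UnityEngine;
using UnityEngine.UI;

[RequireComponent(typeof(CanvasScaler))]
public class ConfigureScaler : MonoBehaviour
{
    void Awake()
    {
        CanvasScaler scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1920, 1080); // the resolution the UI was designed at (assumed)
        scaler.matchWidthOrHeight = 0.5f;                     // blend between matching width and height
    }
}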
There is a problem with MovementRange: we must scale this value too.
I did it like so:
public int MovementRange = 100;
public AxisOption axesToUse = AxisOption.Both; // The options for the axes that the stick will use
public string horizontalAxisName = "Horizontal"; // The name given to the horizontal axis for the cross platform input
public string verticalAxisName = "Vertical"; // The name given to the vertical axis for the cross platform input
private int _MovementRange = 100;
Vector3 m_StartPos;
bool m_UseX; // Toggle for using the x axis
bool m_UseY; // Toggle for using the Y axis
CrossPlatformInputManager.VirtualAxis m_HorizontalVirtualAxis; // Reference to the joystick in the cross platform input
CrossPlatformInputManager.VirtualAxis m_VerticalVirtualAxis; // Reference to the joystick in the cross platform input
void OnEnable()
{
CreateVirtualAxes();
}
void Start()
{
m_StartPos = transform.position;
Canvas c = GetComponentInParent<Canvas>();
_MovementRange = (int)(MovementRange * c.scaleFactor);
Debug.Log("Range:"+ _MovementRange);
}
void UpdateVirtualAxes(Vector3 value)
{
var delta = m_StartPos - value;
delta.y = -delta.y;
delta /= _MovementRange;
if (m_UseX)
{
m_HorizontalVirtualAxis.Update(-delta.x);
}
if (m_UseY)
{
m_VerticalVirtualAxis.Update(delta.y);
}
}
void CreateVirtualAxes()
{
// set axes to use
m_UseX = (axesToUse == AxisOption.Both || axesToUse == AxisOption.OnlyHorizontal);
m_UseY = (axesToUse == AxisOption.Both || axesToUse == AxisOption.OnlyVertical);
// create new axes based on axes to use
if (m_UseX)
{
m_HorizontalVirtualAxis = new CrossPlatformInputManager.VirtualAxis(horizontalAxisName);
CrossPlatformInputManager.RegisterVirtualAxis(m_HorizontalVirtualAxis);
}
if (m_UseY)
{
m_VerticalVirtualAxis = new CrossPlatformInputManager.VirtualAxis(verticalAxisName);
CrossPlatformInputManager.RegisterVirtualAxis(m_VerticalVirtualAxis);
}
}
public void OnDrag(PointerEventData data)
{
Vector3 newPos = Vector3.zero;
if (m_UseX)
{
int delta = (int)(data.position.x - m_StartPos.x);
delta = Mathf.Clamp(delta, -_MovementRange, _MovementRange);
newPos.x = delta;
}
if (m_UseY)
{
int delta = (int)(data.position.y - m_StartPos.y);
delta = Mathf.Clamp(delta, -_MovementRange, _MovementRange);
newPos.y = delta;
}
transform.position = new Vector3(m_StartPos.x + newPos.x, m_StartPos.y + newPos.y, m_StartPos.z + newPos.z);
UpdateVirtualAxes(transform.position);
}

Play an animation when touch moved is a certain distance from touch began

I am new to UnityScript and Unity, and I am trying to trigger an animation when the touch-moved position is +100 to the right of where the touch began. I have also tried +500 and +1000, and it seems that the animation plays when the touch is past 100, 500, or 1000 on the screen, not past the touch-began position + (the amount). Any help is appreciated; thank you for your time, as I am new to UnityScript.
#pragma strict
var distance : float = 10;
var joystick : GameObject;
private var first : boolean = false;
function Start () {
}
function Update () {
transform.eulerAngles = Vector3(0,Camera.main.transform.eulerAngles.y + 180,0);
var v3Pos : Vector3;
if (Input.touchCount > 0 &&
Input.GetTouch(0).phase == TouchPhase.Began) {
// Get movement of the finger since last frame
var touchDeltaPosition:Vector2 = Input.GetTouch(0).position;
if(!first){
var touchdet : Vector2 = touchDeltaPosition;
first = true;
}
// Move object across XY plane
v3Pos = Vector3(touchDeltaPosition.x, touchDeltaPosition.y, distance);
transform.position = Camera.main.ScreenToWorldPoint(v3Pos);
}
if (Input.touchCount > 0 &&
Input.GetTouch(0).phase == TouchPhase.Moved) {
// Get movement of the finger since last frame
var touchAlphaPosition:Vector2 = Input.GetTouch(0).position;
// Move object across XY plane
v3Pos = Vector3(touchAlphaPosition.x, touchAlphaPosition.y, distance);
transform.position = Camera.main.ScreenToWorldPoint(v3Pos);
}
if (Input.touchCount > 0 &&
(Input.GetTouch(0).phase == TouchPhase.Ended || Input.GetTouch(0).phase == TouchPhase.Canceled )) {
// Get movement of the finger since last frame
var touchBetaPosition:Vector2 = Input.GetTouch(0).position;
first = false;
// Move object across XY plane
v3Pos = Vector3(touchBetaPosition.x, 600, distance);
transform.position = Camera.main.ScreenToWorldPoint(v3Pos);
}
if(first)
{
if(touchAlphaPosition.x > touchdet.x + 100)
{
animation.Play("Right");
}
}
}
The variable touchdet is declared and initialized inside the Update function, so its value is not persisted between calls. In every frame except the one where the TouchPhase.Began event fires, touchdet will be equal to Vector2.zero.
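A small sketch of the fix, written in C# for brevity (the same idea applies in UnityScript; the names here are assumptions, and it assumes the legacy Animation component with a clip named "Right", as in the original): store the Began position in a field on the component so it survives across frames, then compare the current touch position against it while the touch moves.
using UnityEngine;

public class SwipeRight : MonoBehaviour
{
    private Vector2 startPos;     // position captured when the touch began
    private bool tracking = false;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);

        if (touch.phase == TouchPhase.Began)
        {
            startPos = touch.position;   // remembered between frames because it is a field
            tracking = true;
        }
        else if (tracking && touch.phase == TouchPhase.Moved)
        {
            if (touch.position.x > startPos.x + 100f)        // 100 px to the right of where the touch began
                GetComponent<Animation>().Play("Right");
        }
        else if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled)
        {
            tracking = false;
        }
    }
}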

Resources