Unity: rotating object around Z axis with Lerp/Slerp - rotation

I'm trying to rotate my player to face where I last clicked. I've actually managed to do this, but now I want to see the player rotate at a set speed instead of the sprite just changing rotation instantly.
I've tried several methods I've found online, but none of them work for me. Here's what I have:
void Update()
{
    if (Input.GetMouseButtonDown(0))
    {
        Vector3 diff = Camera.main.ScreenToWorldPoint(Input.mousePosition) - transform.position;
        diff.Normalize();
        float rot_z = Mathf.Atan2(diff.y, diff.x) * Mathf.Rad2Deg;
        transform.rotation = Quaternion.Euler(0f, 0f, rot_z - 90);
        Instantiate(ProjectilePrefab, transform.position, transform.rotation);
    }
}
The code above works, but there is no visible movement; the sprite just snaps to the new rotation. I have also tried the following, but the resulting orientation is wrong and the rotation is still instant:
Vector3 diff = Camera.main.ScreenToWorldPoint(Input.mousePosition) - transform.position;
var newRotation = Quaternion.LookRotation(diff);
newRotation.y = 0.0f;
newRotation.x = 0.0f;
transform.rotation = Quaternion.Slerp(transform.rotation, newRotation, Time.deltaTime * 30);
Any ideas?

Two problems here. First, the Slerp call only runs once, on the frame the user presses the mouse button. You have to move it outside the if block or use a coroutine. Second, Slerp expects a float from 0 to 1 indicating progress, not a time or deltaTime value. The official examples covering the lerp functions are misleading here, because passing the raw time value would only work during the first second the game is running.
You need something like this:
// startTime needs to be a field on the class so it survives between frames
float startTime;

if (Input.GetMouseButtonDown(0)) {
    // your other stuff here
    startTime = Time.time;
}
float t = Time.time - startTime;
transform.rotation = Quaternion.Slerp(transform.rotation, newRotation, t);
The rotation finishes after one second. If you want it faster or slower, just multiply t by a factor.
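A minimal sketch of how the pieces could fit together, assuming newRotation, startRotation and startTime live as fields on the same MonoBehaviour (the names are illustrative; the rotation at the moment of the click is remembered so that t really means progress from 0 to 1):

Quaternion newRotation = Quaternion.identity;
Quaternion startRotation = Quaternion.identity;
float startTime;

void Update()
{
    if (Input.GetMouseButtonDown(0))
    {
        Vector3 diff = Camera.main.ScreenToWorldPoint(Input.mousePosition) - transform.position;
        float rot_z = Mathf.Atan2(diff.y, diff.x) * Mathf.Rad2Deg;
        newRotation = Quaternion.Euler(0f, 0f, rot_z - 90f);
        startRotation = transform.rotation; // remember where we started turning from
        startTime = Time.time;
    }
    // t grows from 0 to 1 over one second after the click; Slerp clamps it at 1.
    float t = Time.time - startTime;
    transform.rotation = Quaternion.Slerp(startRotation, newRotation, t);
}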

I found the answer. This is how I made it work:
if (Input.GetMouseButtonDown(0)) {
    target = Camera.main.ScreenToWorldPoint(Input.mousePosition);
    rotateToTarget = true;
    print("NEW TARGET: " + target);
}
if (rotateToTarget == true && target != null) {
    print("Rotating towards target");
    targetRotation = Quaternion.LookRotation(transform.position - target.normalized, Vector3.forward);
    targetRotation.x = 0.0f; // set to zero because we only care about the z axis
    targetRotation.y = 0.0f;
    player.transform.rotation = Quaternion.Slerp(transform.rotation, targetRotation, Time.deltaTime * rotationSpeed);
    if (Mathf.Abs(player.transform.rotation.eulerAngles.z - targetRotation.eulerAngles.z) < 1) {
        rotateToTarget = false;
        travelToTarget = true;
        player.transform.rotation = targetRotation;
        print("ROTATION IS DONE!");
    }
}
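A note on speed: Slerp with Time.deltaTime * rotationSpeed as the t parameter eases out rather than turning at a fixed rate. If a genuinely constant angular speed is wanted, Quaternion.RotateTowards is an alternative (a sketch, assuming rotationSpeed is expressed in degrees per second):

player.transform.rotation = Quaternion.RotateTowards(player.transform.rotation, targetRotation, rotationSpeed * Time.deltaTime);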

Related

Unity2D: How do I move the player to the touch position

I can't figure out how to move the player slowly, scalable with a speed value, to the point where the touch happens.
My Camera is attached to the player.
You should use Vector2.Lerp, Input.GetTouch, and Touch.position. Here is a code example that prints the current touch position:
if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
{
    Debug.Log(Input.GetTouch(0).position);
}
Now, we need to turn this into a position in the world. We can use Camera.ScreenPointToRay for this part:
if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
{
    Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
    RaycastHit2D hit = Physics2D.Raycast(ray.origin, ray.direction);
    Debug.Log(hit.point);
}
Here we actually get the position in the world that was pressed. We can then use this together with Lerp to move an object to that position:
public float scaleFactor = 0.2f; // how fast to move the object.
public GameObject moving; // drag the object you want to move into this slot.
public float time = 0f;
public bool lerping = false;
private Vector2 newPos = new Vector2(); // Vector3 if you're working in 3D.
RaycastHit2D hit; // remove the 2D after RaycastHit if you are in 3D. If you are in 2D, leave it.
float dist;
...
if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
{
    Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
    hit = Physics2D.Raycast(ray.origin, ray.direction); // use Physics.Raycast with an out parameter if you are in 3D.
    Debug.Log(hit.point);
    dist = Vector2.Distance(moving.transform.position, hit.point); // Vector3 if you're in 3D.
    lerping = true;
}
if (lerping)
{
    time += Time.deltaTime * dist * scaleFactor;
    lerp(hit.point.x, hit.point.y); // add hit.point.z if you are in 3D.
}
else
{
    time = 0f;
}
if ((Vector2)moving.transform.position == hit.point) // compare Vector3s if in 3D.
{
    lerping = false;
}
moving.transform.position = newPos;
And then in another part of the script:
public void lerp(float x, float y)
// add a float z parameter if you are in Unity 3D.
{
    Vector2 pos = new Vector2(x, y); // Vector3 with a z component if you're in 3D.
    newPos = Vector2.Lerp(moving.transform.position, pos, time); // Vector3.Lerp if in 3D.
}
This is a pretty large script and it's pretty hard to explain, so I will say this: do what the comments tell you, and tell me if you get any errors. I haven't actually tested the script, so if anyone notices any mistakes, please tell me.
Now, what the script does: basically, it gets the point where you touched and makes the player slowly move towards it (the speed scales with the distance and with the scaleFactor variable, which you can change).
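For a 2D game with an orthographic camera, a more compact sketch of the same idea is also possible, using Camera.ScreenToWorldPoint to get the world point and Vector2.MoveTowards for a constant speed. This is an alternative to the Lerp-based script above, not what it uses; the class and field names are illustrative, and it assumes the player sits at z = 0:

using UnityEngine;

public class MoveToTouch : MonoBehaviour
{
    public float speed = 3f;       // units per second
    private Vector2 target;
    private bool hasTarget = false;

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // Convert the screen touch to a world point (x/y are what matter for an orthographic 2D camera).
            target = Camera.main.ScreenToWorldPoint(Input.GetTouch(0).position);
            hasTarget = true;
        }
        if (hasTarget)
        {
            // Move at a constant speed until the target is reached.
            transform.position = Vector2.MoveTowards(transform.position, target, speed * Time.deltaTime);
            if ((Vector2)transform.position == target)
                hasTarget = false;
        }
    }
}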

Weird values while debugging values in Unity

I found out that while debugging floats and such in Unity, the values are sometimes off. For example, I have a float called currentSpeed which has a value while the player is moving and is 0 when he is not. The problem is that while debugging this value during walking I get something like 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1. I found out this is happening with multiple values and vectors. The weird thing is that when my framerate is lower it doesn't happen (e.g. while having the player selected in the Inspector the framerate drops from 60 to 48). I'm using Unity 2018.3 and have no clue how to fix this. I hope someone can help me.
Edit (added code):
void Move(Vector2 inputDir, bool running)
{
    if (inputDir != Vector2.zero)
    {
        // When the player is not pushing we can create a moving direction
        if (!playerIsPushing)
        {
            float targetRotation = Mathf.Atan2(inputDir.x, inputDir.y) * Mathf.Rad2Deg + cameraT.eulerAngles.y;
            transform.eulerAngles = Vector3.up * Mathf.SmoothDampAngle(transform.eulerAngles.y, targetRotation, ref turnSmoothVelocity, GetModifiedSmoothTime(turnSmoothTime));
        }
    }
    float targetSpeed = ((running) ? runSpeed : movementSpeed) * inputDir.magnitude;
    currentSpeed = Mathf.SmoothDamp(currentSpeed, targetSpeed, ref speedSmoothVelocity, GetModifiedSmoothTime(speedSmoothTime));
    velocityY += Time.deltaTime * gravity;
    Vector3 velocity = transform.forward * currentSpeed + Vector3.up * velocityY;
    // Here we cap the player's velocity to a maximum speed, in this case runSpeed. This means the player can never exceed runSpeed.
    if (velocity.x > runSpeed)
    {
        velocity.x = runSpeed;
    }
    if (velocity.x < -runSpeed)
    {
        velocity.x = -runSpeed;
    }
    if (velocity.z > runSpeed)
    {
        velocity.z = runSpeed;
    }
    if (velocity.z < -runSpeed)
    {
        velocity.z = -runSpeed;
    }
    controller.Move(velocity * Time.deltaTime);
    currentSpeed = new Vector2(controller.velocity.x, controller.velocity.z).magnitude;
    // Debugging the player's velocity
    Debug.Log(currentSpeed);
    float animationSpeedPercent = ((running) ? currentSpeed / runSpeed : currentSpeed / movementSpeed * .5f);
Edit 2: I have tested some more and found out my push force isn't framerate independent. While pushing an object in my game I see the weird values while debugging, e.g. 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1. I'm working with a CharacterController, so adding force to a rigidbody is done via script.
void OnControllerColliderHit(ControllerColliderHit hit)
{
    Rigidbody body = hit.collider.attachedRigidbody;
    if (body == null || body.isKinematic)
        return;
    if (hit.moveDirection.y < -.3f)
        return;
    Vector3 pushDirection = new Vector3(hit.moveDirection.x, 0, hit.moveDirection.z);
    body.velocity = pushForce * pushDirection;
}
Might this be the cause of the weird values?

Path Tracing Shadowing Error

I really don't know what else to do to fix this problem. I have written a path tracer using explicit light sampling in C++ and I keep getting these weird, really black shadows, which I know is wrong. I have done everything I can think of to fix it, but I still get them, even at higher sample counts. What am I doing wrong? Below is an image of the scene.
And here is the main Radiance code:
RGB Radiance(Ray PixRay, std::vector<Primitive*> sceneObjects, int depth, std::vector<AreaLight> AreaLights, unsigned short *XI, int E)
{
    int MaxDepth = 10;
    if (depth > MaxDepth) return RGB();
    double nearest_t = INFINITY;
    Primitive* nearestObject = NULL;
    for (int i = 0; i < sceneObjects.size(); i++)
    {
        double root = sceneObjects[i]->intersect(PixRay);
        if (root > 0)
        {
            if (root < nearest_t)
            {
                nearest_t = root;
                nearestObject = sceneObjects[i];
            }
        }
    }
    RGB EstimatedRadiance;
    if (nearestObject)
    {
        EstimatedRadiance = nearestObject->getEmission() * E;
        Point intersectPoint = nearestObject->intersectPoint(PixRay, nearest_t);
        Vector intersectNormal = nearestObject->surfacePointNormal(intersectPoint).Normalize();
        if (nearestObject->getBRDF().Type == 1)
        {
            for (int x = 0; x < AreaLights.size(); x++)
            {
                Point pointOnTriangle = RandomPointOnTriangle(AreaLights[x].shape, XI);
                Vector pointOnTriangleNormal = AreaLights[x].shape.surfacePointNormal(pointOnTriangle).Normalize();
                Vector LightDistance = (pointOnTriangle - intersectPoint).Normalize();
                // Geometric Term
                RGB Geometric_Term = GeometricTerm(intersectPoint, pointOnTriangle, sceneObjects);
                // Lambertian BRDF
                RGB LambertianBRDF = nearestObject->getColor() * (1. / M_PI);
                // Emitted Light Power
                RGB Emission = AreaLights[x].emission;
                double MagnitudeOfXandY = (pointOnTriangle - intersectPoint).Magnitude() * (pointOnTriangle - intersectPoint).Magnitude();
                RGB DirectLight = Emission * LambertianBRDF * Dot(intersectNormal, -LightDistance) *
                    Dot(pointOnTriangleNormal, LightDistance) * (1. / MagnitudeOfXandY) * AreaLights[x].shape.Area() * Geometric_Term;
                EstimatedRadiance = EstimatedRadiance + DirectLight;
            }
            //
            Vector diffDir = CosWeightedRandHemiDirection(intersectNormal, XI);
            Ray diffRay = Ray(intersectPoint, diffDir);
            EstimatedRadiance = EstimatedRadiance + (Radiance(diffRay, sceneObjects, depth + 1, AreaLights, XI, 0) * nearestObject->getColor() * (1. / M_PI) * M_PI);
        }
        // Mirror
        else if (nearestObject->getBRDF().Type == 2)
        {
            Vector reflDir = PixRay.d - intersectNormal * 2 * Dot(intersectNormal, PixRay.d);
            Ray reflRay = Ray(intersectPoint, reflDir);
            return nearestObject->getColor() * Radiance(reflRay, sceneObjects, depth + 1, AreaLights, XI, 0);
        }
    }
    return EstimatedRadiance;
}
I haven't debugged your code, so there may be any number of bugs of course, but I can give you some tips. First, go look at smallpt and see what it does that you don't. It's tiny but still quite easy to read.
From the look of it, there seem to be issues with the sampling and/or the gamma correction. The easiest one to check is gamma: when converting an RGB intensity in the range 0..1 to an RGB value in the range 0..255, remember to always gamma correct. Use a gamma of 2.2:
R = r^(1.0/gamma)
G = g^(1.0/gamma)
B = b^(1.0/gamma)
Having the wrong gamma will make any path traced image look bad.
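As a concrete illustration, the per-channel write-out step might look like this (a sketch in C#, not taken from the code above):

// Convert a linear channel value in 0..1 to a gamma-corrected byte in 0..255.
static int ToDisplayByte(double linear, double gamma = 2.2)
{
    double clamped = Math.Max(0.0, Math.Min(1.0, linear)); // clamp first so bright samples don't wrap
    return (int)(Math.Pow(clamped, 1.0 / gamma) * 255.0 + 0.5);
}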
Second: sampling. It's not obvious from the code how the sampling is weighted. I'm only familiar with path tracing using Russian roulette sampling. With RR the radiance estimate basically works like so:
if (depth > MaxDepth)
    return RGB();
RGB color = mat.Emission;
// Russian roulette:
float survival = 1.0f;
float pContinue = material.Albedo();
survival = 1.0f / pContinue;
if (Rand.Next() > pContinue)
    return color;
color += DirectIllumination(sceneIntersection);
color += Radiance(sceneIntersection, depth + 1) * survival;
RR is basically a way of terminating rays at random while still maintaining an unbiased estimate of the true radiance. Since it adds a weight to the indirect term, and the shadows and the bottoms of the spheres are only indirectly lit, I'd suspect that has something to do with it (if it isn't just the gamma).

Animate rotation in Unity3d

I'm working on a 2D game in Unity3D (using Orthello 2D).
Since I switched from Cocos2d and CoronaSDK, I'm wondering if there's a way of implementing the following behaviour for a sprite (or any Unity3D object) as it worked in Corona:
object = ...
transition.to(object, { time = 1000, rotation = object.rotation + 100, onComplete = function ()
    -- do something
end })
So a sprite rotates by 100 degrees over 1 second.
In my script attached to a sprite I can do the rotation in my Update() function, but that's a bit of a different approach...
You can do it easily in an Update function.
float timer = 0f;

void Update()
{
    if (timer <= 1)
    {
        // Time.deltaTime * 100 makes sure we rotate at a constant speed of 100 degrees per second
        transform.Rotate(0f, 0f, Time.deltaTime * 100);
        // Increment the timer so we know when to stop
        timer += Time.deltaTime;
    }
}
If you need to do another 100 degrees rotation you will just have to reset the timer.
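For example, exposing a small method that resets it (the method name is just illustrative) lets you trigger another 100-degree turn on demand:

// Call this whenever another 100-degree rotation should start.
public void StartAnotherRotation()
{
    timer = 0f;
}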
You can see the different versions of the Rotate function here, and more information about the lifesaver Time.deltaTime value here.
There are several different ways of doing that, for example using a coroutine:
IEnumerator TweenRotation(Transform trans, Quaternion destRot, float speed, float threshold)
{
    float angleDist = Quaternion.Angle(trans.rotation, destRot);
    while (angleDist > threshold)
    {
        trans.rotation = Quaternion.RotateTowards(trans.rotation, destRot, Time.deltaTime * speed);
        yield return null;
        angleDist = Quaternion.Angle(trans.rotation, destRot);
    }
}
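Starting it from a MonoBehaviour could look like this (the 100-degree target and the speed/threshold values are just illustrative):

// Rotate 100 degrees around Z at roughly 100 degrees per second.
Quaternion destRot = Quaternion.Euler(0f, 0f, transform.eulerAngles.z + 100f);
StartCoroutine(TweenRotation(transform, destRot, 100f, 0.5f));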

Rotation Automation

I've run into a few problems with this expression in Maya; basically, any time the radius is less than 1, the calculation is thrown off way too much.
float $radius = `getAttr prefix66_calculations_shape.rad`;
float $prevZval = `getAttr -time (frame - 1) prefix66_driver.translateZ`;
float $prevXval = `getAttr -time (frame - 1) prefix66_driver.translateX`;
float $Zval = prefix66_driver.translateZ - $prevZval;
float $Xval = prefix66_driver.translateX - $prevXval;
float $distance = ($Zval * $Zval) + ($Xval * $Xval);
float $direction;
$distance = sqrt($distance);
if ($prevZval > prefix66_driver.translateZ) {
    $direction = 360;
}
else {
    $direction = 360;
}
float $rotation = ($distance / (2 * 3.142 * $radius)) * $direction;
print $rotation;
pCube1.rotateX = pCube1.rotateX + $rotation;
Maybe my order of operations is wrong?
The rotation part of your code looks OK. However, you have an if/else block that returns the same thing in both cases, and as mentioned by @joojaa, you can avoid getAttr -time if you cache the translation values. In fact, you should avoid getAttr and setAttr completely in expressions.
Instead, refer to the attributes you want directly and Maya will create connections for you. This is much faster and less prone to errors when you rename nodes and so on.
To cache the translation values, and calculate change in position you can add attributes to the node and use them in the expression.
Let's say you have a cylinder called wheel that rotates around its local X and is parented to a group node called control:
Add a vector attribute: control.lastTranslate
Add a vector attribute: control.deltaTranslate
Add a float attribute: control.distance
Here's an expression that will store the change in translation, then rotate the wheel based on the distance travelled.
// When deltaTranslate is calculated, lastTranslate still has its previous value.
control.deltaTranslateX = control.translateX - control.lastTranslateX;
control.deltaTranslateY = control.translateY - control.lastTranslateY;
control.deltaTranslateZ = control.translateZ - control.lastTranslateZ;
control.lastTranslateX = control.translateX;
control.lastTranslateY = control.translateY;
control.lastTranslateZ = control.translateZ;
control.distance = mag(<<control.deltaTranslateX,control.deltaTranslateY,control.deltaTranslateZ>>);
// Get radius from history node (or somewhere) and move the wheel's hub off the floor.
wheel.translateY = polyCylinder1.radius;
// add rotation to the wheel
float $tau = 6.283185307179586;
wheel.rotateX = wheel.rotateX + (control.distance * -360.0) / (polyCylinder1.radius * $tau);
It's best to test this kind of thing by animating rather than dragging nodes around in the view.
If you wanted to make the wheel aim to the direction of travel, you could add a locator at translate + deltaTranslate and hook up an aim constraint.
e.g.
aimLocator.translateX = (control.deltaTranslateX / control.distance) + control.translateX;
aimLocator.translateY = (control.deltaTranslateY / control.distance) + control.translateY;
aimLocator.translateZ = (control.deltaTranslateZ / control.distance) + control.translateZ;
Dividing by distance will normalize the offset. You should probably check that distance is not zero.
I believe I have figured it out :)
Comparing the old translation average with the new translation average gives me a true or false answer, which is what I needed to change direction.
I also added an if statement so that if the ball is static but rotating, the wheel doesn't turn automatically.
float $oldRotateAverage;
float $oldTransAverage;
float $direction;
nurbsCircle1.DeltaTranslateX = nurbsCircle1.translateX - nurbsCircle1.LastTranslateX;
nurbsCircle1.DeltaTranslateY = nurbsCircle1.translateY - nurbsCircle1.LastTranslateY;
nurbsCircle1.DeltaTranslateZ = nurbsCircle1.translateZ - nurbsCircle1.LastTranslateZ;
nurbsCircle1.LastTranslateX = nurbsCircle1.translateX;
nurbsCircle1.LastTranslateY = nurbsCircle1.translateY;
nurbsCircle1.LastTranslateZ = nurbsCircle1.translateZ;
nurbsCircle1.Distance = mag(<<nurbsCircle1.DeltaTranslateX, nurbsCircle1.DeltaTranslateY, nurbsCircle1.DeltaTranslateZ>>);
if ($oldTransAverage >= (nurbsCircle1.LastTranslateX + nurbsCircle1.LastTranslateY + nurbsCircle1.LastTranslateZ)) {
    $direction = -360.00;
} else {
    $direction = 360.00;
};
if (Sh54_anim.auto == 1)
{
    Sh54_point_grp.rotateZ -= nurbsCircle1.Distance * $direction / 2 / 3.14 / 2;
};
if ((nurbsCircle1.rotateX + nurbsCircle1.rotateY + nurbsCircle1.rotateZ) != $oldRotateAverage && nurbsCircle1.Distance == $oldTransAverage) {
    Sh54_anim.auto = 0;
} else {
    Sh54_anim.auto = 1;
};
Sh54_point_grp.back_up = Sh54_point_grp.translateX;
$oldRotateAverage = nurbsCircle1.rotateX + nurbsCircle1.rotateY + nurbsCircle1.rotateZ;
$oldTransAverage = nurbsCircle1.translateX + nurbsCircle1.translateY + nurbsCircle1.translateZ;
