Get proper absolute rotation after mesh.lookAt

After I call mesh2.lookAt(mesh1.position), mesh2.rotation.y is the same value whether mesh1.position.z is positive or negative, but mesh2.rotation.x and z are either 0 or -PI, so there actually is some information in the quaternion/matrix.
I tried manually calling the update functions for every matrix, and getWorldRotation.
What is going on? How do I get the absolute rotation?
Thanks in advance

I don't understand why yet, but setting the order of the Euler (Y first) fixed it.
I'm only rotating around one axis.
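For what it's worth, here is a plain-JavaScript sketch of what is going on (no three.js; the matrix and Euler extraction conventions below are meant to mirror three.js's, which is an assumption you should check against your version). A pure rotation about Y by theta decomposes, in the default 'XYZ' order, with y folded into [-PI/2, PI/2] and x and z jumping to ±PI once |theta| passes 90 degrees, while extracting in 'YXZ' order recovers the full angle. In three.js itself the fix is just setting `mesh2.rotation.order = 'YXZ'` before reading `rotation.y`.

```javascript
function rotY(t) { // 3x3 rotation matrix for a pure rotation about Y
  const c = Math.cos(t), s = Math.sin(t);
  return [[c, 0, s], [0, 1, 0], [-s, 0, c]];
}

// Euler extraction in 'XYZ' order (the three.js default convention)
function eulerXYZ(m) {
  const y = Math.asin(Math.max(-1, Math.min(1, m[0][2])));
  if (Math.abs(m[0][2]) < 0.9999999) {
    return { x: Math.atan2(-m[1][2], m[2][2]), y, z: Math.atan2(-m[0][1], m[0][0]) };
  }
  return { x: Math.atan2(m[2][1], m[1][1]), y, z: 0 }; // gimbal lock
}

// Euler extraction in 'YXZ' order (Y first)
function eulerYXZ(m) {
  const x = Math.asin(-Math.max(-1, Math.min(1, m[1][2])));
  if (Math.abs(m[1][2]) < 0.9999999) {
    return { x, y: Math.atan2(m[0][2], m[2][2]), z: Math.atan2(m[1][0], m[1][1]) };
  }
  return { x, y: Math.atan2(-m[2][0], m[0][0]), z: 0 }; // gimbal lock
}

const theta = 2.5; // a yaw beyond 90 degrees
const m = rotY(theta);
console.log(eulerXYZ(m)); // y is folded to PI - theta, x and z jump to ±PI
console.log(eulerYXZ(m)); // y recovers theta: the absolute rotation
```

Both decompositions describe the same orientation; only the Y-first order keeps the whole single-axis rotation in the y component.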

Related

p5.js angleBetween calculates absolute value

My study colleague and I are working on creating some doughnut graphs, and we want to use mouseOver to print the values of the doughnut slices. Our calculations are in radians (using angleMode(RADIANS)).
To track the mouse position, we calculate the mouseX and mouseY positions, and also a 'mouse angle' as the angleBetween comparing a vector (1, 0) and a vector of (mouseX, mouseY).
For me, the result is positive radians clockwise from 3 o'clock to 9 o'clock, and then negative radians for the opposite.
However, for my study colleague they are getting absolute values in radians.
We found a GitHub issue detailing a bug in this function from August 2019: https://github.com/processing/p5.js/issues/3973
My colleague implemented a fix that checks the overall position of Y before calling the angleBetween function, but this is obviously not optimal.
Has anyone else encountered this issue, and know why this is happening?
Thanks!
In older versions of the p5.js library, the angleBetween function had different behavior (it only returned the absolute value of the angle, irrespective of direction). However, as of version 0.10.0, angleBetween returns a signed value depending on whether the second vector is clockwise or counter-clockwise of the first vector (see pull request #4048). So the solution is to make sure everyone is using the same, recent, version of p5.js.
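The signed behavior can be sketched in plain JavaScript (a sketch of the underlying math, not p5's actual implementation): atan2 of the 2D cross product and dot product gives a signed angle in (-PI, PI], and taking its absolute value reproduces the old pre-0.10.0 behavior.

```javascript
// Signed angle from v1 to v2: the cross product term supplies the sign,
// the dot product the cosine.
function signedAngle(x1, y1, x2, y2) {
  return Math.atan2(x1 * y2 - y1 * x2, x1 * x2 + y1 * y2);
}

// The "mouse angle" from the question: vector (1, 0) vs (mouseX, mouseY).
// Note the p5 canvas y-axis points down, so a positive angle reads as
// clockwise on screen, matching the 3-to-9-o'clock observation.
console.log(signedAngle(1, 0, 0, 1));           // PI/2
console.log(Math.abs(signedAngle(1, 0, 0, -1))); // old absolute behavior
```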

How to interpolate an NSBezierPath/UIBezierPath and retrieve normal Vector at location?

I have an NSBezierPath. I would like to interpolate along it in a way that
interpolate(0) = starting point
interpolate(1) = end point
interpolate(0.5) = middle point.
I would also like a function that provides the normal vector at that point.
I saw a lot of puzzle pieces during my search, but nobody offered a full solution in Swift. The key problems are:
How can I ensure that interpolate(0.5) is really in the middle of the path? Do I need to consider the overall length of the path? Will it always be the middle point of the middle path segment? I doubt that. However, good approximations are welcome!
How do I retrieve the normal vector at such a point?
Many thanks in advance!
Basically, I used this library: https://github.com/louisdh/bezierpath-length/blob/master/Source/BezierPath%2BLength.swift
After a small change, it gave me a good approximation of the x needed to do the interpolation. With that x and this explanation: https://medium.com/@adrian_cooney/bezier-interpolation-13b68563313a I was able to find the normal vector (the last interpolation is along the tangent, so I only needed the orthogonal vector of that last interpolation).
My code is still a bit messy. I might publish it later. But it works!
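For readers landing here, the underlying math is small enough to sketch. This is plain JavaScript rather than Swift (treat it as pseudocode for the Swift version), and it covers a single cubic segment only: the derivative gives the tangent, and rotating the tangent 90 degrees gives the normal. Uniform t is not arc length, which is exactly why the length library above is needed for interpolate(0.5) to land at the true midpoint of a multi-segment path.

```javascript
// Evaluate one cubic Bezier segment with control points p0..p3 at t in [0,1].
function cubicPoint(p0, p1, p2, p3, t) {
  const u = 1 - t;
  return {
    x: u*u*u*p0.x + 3*u*u*t*p1.x + 3*u*t*t*p2.x + t*t*t*p3.x,
    y: u*u*u*p0.y + 3*u*u*t*p1.y + 3*u*t*t*p2.y + t*t*t*p3.y,
  };
}

// Derivative of the curve: the (unnormalized) tangent direction at t.
function cubicTangent(p0, p1, p2, p3, t) {
  const u = 1 - t;
  return {
    x: 3*u*u*(p1.x-p0.x) + 6*u*t*(p2.x-p1.x) + 3*t*t*(p3.x-p2.x),
    y: 3*u*u*(p1.y-p0.y) + 6*u*t*(p2.y-p1.y) + 3*t*t*(p3.y-p2.y),
  };
}

// Unit normal: the tangent rotated 90 degrees and normalized.
function cubicNormal(p0, p1, p2, p3, t) {
  const d = cubicTangent(p0, p1, p2, p3, t);
  const len = Math.hypot(d.x, d.y) || 1;
  return { x: -d.y / len, y: d.x / len };
}
```

For a whole path you would first use the length table to map the global fraction to a (segment, t) pair, then call these on that segment.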

Game Maker - Touch Event

I'm making my first game in Game Maker.
In the game I need the user to draw a figure, for example a rectangle, and the game has to recognize the figure. How can I do this?
Thanks!
Well, that is a pretty complex task. To simplify it, you could ask the user to place a succession of points, using the mouse coordinates in the click event, and automatically connect them with lines. If you store every point in the same ds_list structure, you will be able to check conditions on angles, distances, etc. This way, you can determine the shape. May I ask why you want to do this?
The way I would solve this problem is pretty simple. I would create a variable for each point; when the player clicks on one of the points, it is set to true, and the game waits for the player to click on the next point. When the player clicks the next point, I would draw a sprite as a line, using image_angle to line both points up, and wait for the player to click the next point.
Next, I would have a step event checking whether all points have been clicked, and when they have, either draw a triangle at those coordinates or place a sprite at the correct coordinates to fill in the triangle.
Another way you could do it would be to decide in advance what those points are and check mouse_x and mouse_y to see whether the player clicked on a point, and if so, proceed as above. There are many ways to solve this problem; just keep trying and you will find one that works for your skill level and what you want to do.
You need to use the draw_rectangle(x1, y1, x2, y2, outline) function. As for recognizing the figure, use point_in_rectangle(px, py, x1, y1, x2, y2).
I'm just playing around with ideas, since I can't code right now, but I think this could work.
Suppose that the user must keep their finger on the touchscreen, or an event is triggered and all data from the touch event is cleared.
I assume that in the future you may need to recognize other simple geometric figures too.
1: Set a fixed amount of pixels of movement, defined depending on the viewport dimension (I'll call this constant MOV from now on); for every MOV of movement, store in a buffer (pointsBuf) the coordinates of the point where the finger is.
2: Every time a point is stored, calculate the average of both the X and Y coordinates over every point. (Hold the previous average and a counter to reduce time complexity.) Comparing them, we can now know the direction and sense of the line. Store them in a 2D buffer (dirVerBuf).
3: If a point is "drastically" different from the running average of the X and Y coordinates, we can assume that the finger changed direction. This is where the tuning of MOV becomes critical: we must now calculate an angle. Since only a very unsteady hand would draw really distorted lines, we can be fairly safe taking as the vertex the 2nd point that didn't change the average of the coordinate, and using, say, the last point and the 2nd point before the vertex to calculate the angle. You now have one angle. Test the user's best error margin to decide whether the angle is close to 90, 60, 45, etc. degrees. Store it in a new buffer (angBuf).
4: Delete the values from pointsBuf and repeat steps 2 and 3 until the user's finger leaves the screen.
5: If four of the angles are about 90 degrees, the 4 senses and two of the directions are different, the last point is somewhat near (depending on MOV) the first angle stored, and the two X lines and the two Y lines are roughly equal to each other but of different lengths between the pairs, then you can connect the 4 angles, using the four best values next to the 4 coordinates, to make a perfect rectangular shape.
It's late and I could have forgotten something, but with this method I think you could even figure out a triangle, a circle, etc., with just some editing and comparison.
EDIT: If you are really lazy, you could instead use a much more space-heavy strategy. Just create a grid of rectangles, or even triangles, of a fixed dimension and check which ones the finger has touched; connect their centers after you have figured out the shape, obviously ignoring the ones touched by mistake. This would make it extremely easy to draw even circles using the native functions.
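A stripped-down sketch of the corner-counting idea above, in plain JavaScript rather than GML (the tolerance values are assumptions to tune for your MOV and screen size): walk the sampled finger points, measure the turn angle at each interior point, count the roughly-90-degree turns, and check that the stroke closes on itself.

```javascript
// Absolute turn angle at point b, between segments a->b and b->c.
function turnAngle(a, b, c) {
  const v1 = { x: b.x - a.x, y: b.y - a.y };
  const v2 = { x: c.x - b.x, y: c.y - b.y };
  return Math.abs(Math.atan2(v1.x * v2.y - v1.y * v2.x,
                             v1.x * v2.x + v1.y * v2.y));
}

// Four ~90-degree corners on a closed stroke suggest a rectangle.
// cornerTol and closeTol are placeholder tolerances, not tuned values.
function looksLikeRectangle(points, cornerTol = Math.PI / 8, closeTol = 20) {
  let corners = 0;
  for (let i = 1; i + 1 < points.length; i++) {
    const t = turnAngle(points[i - 1], points[i], points[i + 1]);
    if (Math.abs(t - Math.PI / 2) < cornerTol) corners++;
  }
  const first = points[0], last = points[points.length - 1];
  const closed = Math.hypot(last.x - first.x, last.y - first.y) < closeTol;
  return closed && corners === 4;
}
```

The same skeleton extends to other shapes by changing the corner count and target angle.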

Raphael animate based on position not time

I have two path elements. For the sake of description, let's just say one is a heart and the other is a square with rounded corners.
I'm trying to get one to interpolate into the other like this:
http://raphaeljs.com/animation.html
This is easy enough to do, but I need to do it based on the element's X position and not time.
So as I drag the path element between an X value of 100 and 200, it will slowly morph into the second path until it finally completes the transformation at an X value of 200.
Does anyone have any idea how I might do this?
I'd use the drag handler (or you could bind to the mousemove event and feed the coordinates in as well) and use that to move between a starting position and a target animation, based on the x or y coordinate, using Element.status. Check out the fiddle (based on yours):
And I'm sorry -- I gave you a copy of your own fiddle back originally. Terribly poor form =( Try this one instead:
http://jsfiddle.net/kevindivdbyzero/MqFkP/2/
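The Element.status approach boils down to mapping the drag's x position onto the animation's completion fraction. A minimal sketch (the Raphael-specific call is shown only as a comment so the snippet stands alone; `morphAnim` and the way you obtain the element's current x are assumptions about your setup):

```javascript
// Map an x position in [x0, x1] to an animation status in [0, 1].
function xToStatus(x, x0, x1) {
  return Math.min(1, Math.max(0, (x - x0) / (x1 - x0))); // clamp to [0, 1]
}

// Inside a Raphael drag move handler you would then do something like:
//   el.status(morphAnim, xToStatus(currentX + dx, 100, 200));
console.log(xToStatus(150, 100, 200)); // 0.5: halfway morphed
```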
So far this is what I have come up with:
http://jsfiddle.net/jocose/CkL5F/139/
This works, but it relies on a delay which I am afraid is going to break on slower machines, and it seems really hacky. If anyone knows of a better solution, please let me know.

Reach a waypoint using GPS/Compass/Accelerometer - Algorithm?

I currently have a robot with some sensors: a GPS, an accelerometer, and a compass. What I would like is for my robot to reach a GPS coordinate that I enter. I wondered if an algorithm to do that already exists. I don't want source code, which wouldn't have any point, just the procedure my robot should follow, so that I can understand what I'm doing... For the moment, let's imagine that I can access the GPS coordinates at any time, so there's no need for a Kalman filter. I know that's unrealistic, but I would like to program it step by step, and Kalman is the next step.
If anyone has an idea...
To get a bearing (positive angle east of north) between two lat-long points use:
bearing=mod(atan2(sin(lon2-lon1)*cos(lat2),(lat1)*sin(lat2)-sin(lat1)*cos(lat2)*cos(lon2-lon1)),2*pi)
Note - angles probably have to be in radians depending on your math package.
But for small distances you can just calculate how many meters are in one degree of latitude and longitude at your position and then treat them as flat X,Y coords.
For typical 45deg latitudes it's around 111.132 km/deg lat, 78.847 km/deg lon.
1) Orient your robot toward its destination.
2) Move forward until the distance between you and your destination is increasing, at which point go back to 1).
3) BUT... if you are close enough (under a threshold), consider that you have arrived at the destination.
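The three steps above can be sketched as a single decision function on flat X,Y coordinates (per the small-distance approximation from the first answer). Everything here is simulated: on the robot, `heading` would come from the compass and `pos` from the GPS, and the 0.1 rad turn tolerance and arrival threshold are assumptions to tune.

```javascript
// One iteration of the navigation loop: decide whether to stop, turn, or
// move forward, given position, heading (radians), and the target.
function step(pos, heading, target, arriveDist = 1) {
  const dx = target.x - pos.x, dy = target.y - pos.y;
  const dist = Math.hypot(dx, dy);
  if (dist < arriveDist) return { action: 'stop' };            // step 3: arrived
  const desired = Math.atan2(dy, dx);
  let err = desired - heading;
  while (err > Math.PI) err -= 2 * Math.PI;                    // wrap to (-PI, PI]
  while (err <= -Math.PI) err += 2 * Math.PI;
  if (Math.abs(err) > 0.1) return { action: 'turn', by: err }; // step 1: orient
  return { action: 'forward' };                                // step 2: advance
}
```

Calling this repeatedly with fresh sensor readings reproduces the orient/advance/arrive loop.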
You can use the Location class. Its bearingTo function computes the bearing you have to follow to reach another location.
There is a very nice page explaining the formulas between GPS-based distance, bearing, etc. calculation, which I have been using:
http://www.movable-type.co.uk/scripts/latlong.html
I am currently trying to do these calculations myself, and just found out that there is an error in Martin Becket's answer. If you compare it with the info on that webpage, you will see that the part in the middle:
(lat1)*sin(lat2)
should actually be:
cos(lat1)*sin(lat2)
Would have left a comment, but don't have the reputation yet...
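Putting the corrected formula into runnable form (a sketch; angles in radians, result normalized to 0..2*PI east of north, following the conventions in Martin Becket's answer):

```javascript
// Initial bearing from (lat1, lon1) to (lat2, lon2), all in radians,
// using the corrected cos(lat1)*sin(lat2) term.
function bearing(lat1, lon1, lat2, lon2) {
  const y = Math.sin(lon2 - lon1) * Math.cos(lat2);
  const x = Math.cos(lat1) * Math.sin(lat2) -
            Math.sin(lat1) * Math.cos(lat2) * Math.cos(lon2 - lon1);
  const twoPi = 2 * Math.PI;
  return ((Math.atan2(y, x) % twoPi) + twoPi) % twoPi; // normalize to [0, 2*PI)
}

console.log(bearing(0, 0, 0, 0.1)); // due east from the equator: PI/2
```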
