Units of velocity in Pymunk - chipmunk

I was wondering what the basic units of velocity are in Pymunk. If I put in a velocity of (50,50) does that correspond to 50 pixels/second in each direction? The API says that the units of angular velocity are rad/s but doesn't say anything about linear velocity.
Thanks.

Chipmunk avoids enforcing any particular units or scales. If you are using pixels for distance, and seconds for time, then velocity is in pixels per second. If you want to use meters and hours that’s fine too. As long as you drive it with the right input and interpret the output correctly you are good.
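For example, here is a minimal runnable sketch using the Python bindings (assuming you interpret one distance unit as a pixel and drive space.step() in seconds):

import pymunk

space = pymunk.Space()              # no gravity set, so velocity stays constant
body = pymunk.Body(mass=1, moment=10)
body.velocity = (50, 50)            # 50 distance-units per time-unit on each axis
space.add(body, pymunk.Circle(body, radius=5))

for _ in range(60):
    space.step(1.0 / 60)            # 60 steps of 1/60 "second" = 1 time-unit
print(body.position)                # ~ Vec2d(50.0, 50.0): 50 "pixels" after 1 "second"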

Related

DJIWaypointMissionHeadingTowardPointOfInterest, but what about the gimbal pitch angle?

Using DJIWaypointMissionHeadingTowardPointOfInterest as the heading mode in a DJIWaypointMission, I can automatically rotate the drone to head toward a POI, but is there a way to also automatically tilt the camera to frame the POI?
(Unfortunately, the pointOfInterest also has no altitude property.)
I also think it would be better to define the heading mode in each DJIWaypoint instead of as a property of the entire DJIWaypointMission. Is that possible?
I'm adding to what Ken said.
When you use isGimbalPitchRotationEnabled you can set a pitch angle in each waypoint. The drone will change the pitch angle in a linear motion between each pair of waypoints.
Of course this will not work when, during the flight between the two points, the drone gets closer to the POI and then backs away.
What I'm doing in my app is dividing the straight line between the 2 points into several straight sections and calculating the correct pitch at each point. As I divide the original line I calculate the error (the difference between the calculated pitch and the linearly interpolated pitch). If the error is greater than some value (5 degrees, for example) I divide the line recursively and re-calculate the pitch, until the error at each point is small enough. It takes some geometric calculation when preparing the mission, but it produces amazing fly-by shots.
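A rough sketch of that subdivision idea in Python (my own illustration, not the code from the app: it assumes a local flat-earth frame in meters, and pitch_to_poi/subdivide are made-up names):

import math

def pitch_to_poi(point, poi):
    # Gimbal pitch in degrees (negative = camera pointing down) needed to
    # aim at the POI from a position. Points are (x, y, altitude) tuples.
    horizontal = math.hypot(poi[0] - point[0], poi[1] - point[1])
    return math.degrees(math.atan2(poi[2] - point[2], horizontal))

def subdivide(p1, p2, poi, max_error_deg=5.0):
    # Split the leg p1->p2 until the linear pitch interpolation the drone
    # performs between waypoints stays within max_error_deg of the exact
    # pitch at the midpoint. Returns (waypoint, pitch) pairs, excluding p2.
    mid = tuple((a + b) / 2 for a, b in zip(p1, p2))
    exact = pitch_to_poi(mid, poi)
    interpolated = (pitch_to_poi(p1, poi) + pitch_to_poi(p2, poi)) / 2
    if abs(exact - interpolated) <= max_error_deg:
        return [(p1, pitch_to_poi(p1, poi))]       # this leg is good as-is
    # error too big: split at the midpoint and recurse on both halves
    return (subdivide(p1, mid, poi, max_error_deg) +
            subdivide(mid, p2, poi, max_error_deg))

start, end, poi = (0, 0, 50), (200, 0, 50), (100, 30, 0)
waypoints = subdivide(start, end, poi) + [(end, pitch_to_poi(end, poi))]

Each resulting waypoint would then get its pitch via gimbalPitch with isGimbalPitchRotationEnabled set.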
Take a look at these settings:
isGimbalPitchRotationEnabled and gimbalPitch
This isn't a 'true' POI lock, though, because the pitch is interpolated evenly over the distance between the waypoints, but it's close.
In my app I manually control the gimbal instead, but that means a connection must be maintained or the gimbal stops moving.
Yes, it would be nice if the POI in waypoints were true POI functionality; I've made the suggestion many times, but to no avail so far. I'd like to see the POI be per waypoint (as a LocationCoordinate3D) rather than the current single POI for the whole mission.
I'd also like the location in a waypoint to be consistent and be of type LocationCoordinate3D.

How to use raw gyroscope data (°/s) to calculate 3D rotation?

My question may seem trivial, but the more I read about it, the more confused I get... I have started a little project where I want to roughly track the movements of a rotating object (a basketball, to be precise).
I have a 3-axis accelerometer (low-pass-filtered) and a 3-axis gyroscope measuring °/s.
I know about the issues of a gyro, but as the measurements will only last several seconds and the angles tend to be huge, I don't care about drift and gimbal lock right now.
My gyro gives me the rotation speed of all 3 axes. As I want to integrate the acceleration twice to get the position at each timestep, I wanted to convert the sensor's coordinate system into an earthbound system.
For the first try, I want to keep things simple, so I decided to go with the big standard rotation matrix.
But as my results are horrible, I wonder if this is the right way to go. If I understood correctly, the matrix is simply 3 matrices multiplied in a certain order. As the rotation of a basketball doesn't have any "natural" order, this may not be a good idea. My sensor measures 3 angular velocities at once. If I throw them into my system "step by step" it will not be correct, since my second matrix calculates the rotation around the "new y-axis", but my sensor actually measured an angular velocity around the "old y-axis". Is that correct so far?
So how can I correctly calculate the 3D rotation?
Do I need to go for quaternions? But how do I get one from 3 different rotations? And don't I have the same issue here again?
I start with an identity matrix ((1, 0, 0), (0, 1, 0), (0, 0, 1)) multiplied with the acceleration vector to give me the first movement.
Then I want to use the rotation matrix to find out where the next acceleration is really heading, so I can simply add the accelerations together.
But right now I am just too confused to find a proper way.
Any suggestions?
By the way, sorry for my poor English, I am tired and (obviously) not a native speaker ;)
Thanks,
Alex
Short answer
Yes, go for quaternions and use a first-order linearization of the rotation to calculate how the orientation changes. This reduces to the following pseudocode:
float pose_initial[4]; // unit quaternion describing the original orientation
float g_x, g_y, g_z;   // gyro rates in rad/s
float dt;              // time step. The smaller the better.
// quaternion with the "pose increment", calculated from the first-order
// linearization of the continuous rotation formula
float delta_quat[4] = {1, 0.5*dt*g_x, 0.5*dt*g_y, 0.5*dt*g_z};
// final orientation at start time + dt; re-normalize it periodically
// so it stays a unit quaternion
float pose_final[4] = quaternion_hamilton_product(pose_initial, delta_quat);
This solution is used in PixHawk's EKF navigation filter (it is open source, check out the formulation here). It is simple, cheap, stable, and accurate enough.
The identity matrix (describing a "null" rotation) is equivalent to the quaternion [1 0 0 0]. You can get the quaternion describing other poses using a suitable conversion formula (for example, if you have Euler angles you can go for this one).
Notes:
Quaternions follow the [w, i, j, k] notation.
These equations assume angular speeds in SI units, that is, radians per second.
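As a concrete illustration, here is a runnable Python/NumPy version of the update above (a sketch under the same [w, i, j, k] convention; the function names are mine, not from any library):

import numpy as np

def hamilton_product(q, r):
    # Hamilton product of two quaternions in [w, x, y, z] notation.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(pose, gyro_rates, dt):
    # One first-order step: pose is a unit quaternion [w, x, y, z],
    # gyro_rates is (g_x, g_y, g_z) in rad/s.
    g_x, g_y, g_z = gyro_rates
    delta_quat = np.array([1.0, 0.5*dt*g_x, 0.5*dt*g_y, 0.5*dt*g_z])
    pose = hamilton_product(pose, delta_quat)
    return pose / np.linalg.norm(pose)  # re-normalize to stay a unit quaternion

# Start from the identity rotation and spin at 90 deg/s around Z for 1 s:
pose = np.array([1.0, 0.0, 0.0, 0.0])
dt = 0.001
for _ in range(1000):
    pose = integrate_gyro(pose, (0.0, 0.0, np.radians(90.0)), dt)
print(pose)  # ~ [0.707, 0, 0, 0.707], i.e. a 90-degree rotation around Z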
Long answer
A gyroscope describes the rotational speed of an object as a decomposition into three rotational speeds around the orthogonal local XYZ axes. However, you could equivalently describe the rotational speed as a single rate around a certain axis -- either in a reference system that is local to the rotated body or in a global one.
The three rotational speeds affect the body simultaneously, continuously changing the rotation axis.
Here we have the problem of switching from the continuous-time real world to a simpler discrete-time formulation that can be easily solved using a computer. When discretizing, we are always going to introduce errors. Some approaches will lead to bigger errors, while others will be notably more accurate.
Your approach of concatenating three simultaneous rotations around orthogonal axes works reasonably well with small integration steps (let's say smaller than 1/1000 s, although it depends on the application), so that you are simulating the continuous change of the rotation axis. However, this is computationally expensive, and the error grows as you make the time steps bigger.
As an alternative to the first-order linearization, you can calculate the pose increment by integrating the quaternion derivative implied by the angular rates (also using the quaternion representation):
// pure quaternion built from the gyro rates
quat_gyro = {0, g_x, g_y, g_z};
// quaternion derivative (rate of change of the orientation)
q_grad = 0.5 * quaternion_hamilton_product(pose_initial, quat_gyro);
// Important to normalize the result to get a unit quaternion!
pose_final = quaternion_normalize(pose_initial + q_grad*dt);
This technique is used in the Madgwick orientation filter (here is an implementation), and it works well for me.
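For comparison, the gradient formulation above in the same Python/NumPy style (again my own sketch; the Hamilton product with the pure quaternion [0, g_x, g_y, g_z] is expanded inline):

import numpy as np

def integrate_gyro_gradient(q, gyro, dt):
    # One gradient-style step: q is a unit quaternion [w, x, y, z],
    # gyro is (g_x, g_y, g_z) in rad/s.
    w, x, y, z = q
    gx, gy, gz = gyro
    # q_grad = 0.5 * q (x) (0, gx, gy, gz), Hamilton product expanded:
    q_grad = 0.5 * np.array([
        -x*gx - y*gy - z*gz,
         w*gx + y*gz - z*gy,
         w*gy - x*gz + z*gx,
         w*gz + x*gy - y*gx,
    ])
    q_new = q + q_grad * dt
    return q_new / np.linalg.norm(q_new)  # normalize to get a unit quaternion

To first order both schemes are equivalent; the difference is mainly in where the normalization happens.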

Unity Rigidbody angular speed

I'm working on a physics-based game and I have a question.
Is it possible to get almost-real physics inside the Unity engine?
When I put a rolling sphere at the top of a ramp and let it roll, it moves very slowly, whereas in real life the ball obviously rolls at a speed that depends on the angle of the ramp: less angle = less speed, more angle = more speed.
I tried:
Removing drag
Removing angular drag
Changing the values in interpolate and collision detection
Changing the mass value
Any help will be appreciated
Thanks in advance
Be sure to check the scale of your objects, it is very easy to set up a scene at the wrong scale because there's no easy frame of reference!
One Unity unit maps to 1 meter, so if your objects are extremely large, they will appear to move slowly because the physics engine is set up to respect this scale by default.
A marble should have a diameter of roughly 0.025 units, and a person should be around 1.7 units tall!
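A back-of-the-envelope illustration of why scale reads as speed (my own sketch, plain physics rather than Unity code): what the eye judges is how long an object takes to move a distance comparable to its own size, and under gravity that time grows with the square root of the size:

import math

g = 9.81  # Unity's default gravity magnitude, in units/s^2

for diameter in (0.025, 1.0, 10.0):      # marble, football-ish, oversized sphere
    t = math.sqrt(2 * diameter / g)      # time to free-fall one diameter
    print(f"{diameter:>6} units: {t*1000:7.1f} ms to cross its own size")

The 10-unit sphere takes about 20 times longer than the marble to cover its own diameter, so framed on camera it looks like it is moving in slow motion.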
I think your problem is the scale of your objects. I found this link that may help you: http://gamedevelopment.tutsplus.com/articles/how-to-fix-common-physics-problems-in-your-game--cms-21418

Three.js matrix precision for real worlds

I'm experiencing some issues when working with real-world coordinates.
The center of my camera is 280000, 45787254 (for example).
The extent of my world is about 500 x 500 (not too big).
I'm using data based in metric units (meters).
I have created a tile map structure built with simple planes.
I see little gaps between the plane borders, and these planes are built to be contiguous (that is, the xmin of the adjacent plane is equal to the xmax of the previous one).
In the past I have had issues related to raycasting.
Matrix projections with units this big have low precision.
Changing the near value to a number greater than 10 can fix it. However, using this value leads to bad visualization (you can't place the camera very near the scene, or it disappears).
I talked with the guy who develops potree and he told me he had to move the lidar worlds to 0,0 to make them work properly.
So... the final solution is to work in worlds centered at 0,0, isn't it?
Or is there any trick we can do in the matrix calculations?
I'd like to hear from the three.js developers.
Floating-point math is most precise at ranges close to zero; you end up compounding errors as you move far away. You can always do as much math as possible near the origin and then translate the result to wherever you need it; that will help with some of it, but if you can, work in local coordinates.
Potree probably gets odd ripple-looking aliasing effects when too far from the origin, no?
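You can see the scale of the problem with a two-line check (Python/NumPy here just to illustrate; GPU matrix math is float32):

import numpy as np

x = np.float32(45787254.0)                   # a coordinate like the one above
print(np.spacing(x))                         # 4.0 -> neighbouring float32 values are 4 m apart
print(np.float32(x + np.float32(0.5)) == x)  # True: a 0.5 m offset vanishes entirely

With roughly 7 significant digits, float32 cannot tell apart positions a few meters from each other at coordinates in the tens of millions, which is exactly the kind of error that shows up as gaps between "contiguous" tile borders and as flaky raycasts.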

iOS: CoreMotion Acceleration Values

We can retrieve the acceleration data from CMAcceleration.
It provides 3 values, namely x, y, and z.
I have been reading up on this and I seem to have gotten different explanations of these values.
Some say they are the acceleration values with respect to gravity.
Others have said they are not; they are the acceleration values along the device's own axes as it rotates around them.
Which is the correct version here? For example, does x represent the acceleration rate for pitch, or for movement from left to right?
In addition, let's say we want to get the acceleration rate (how fast) for yaw. How could we derive that value when the callback is constantly feeding us values? Would we need to set up another timer for the calculation?
Edit (in response to @Kay):
Yes, that was basically it - I just wanted to make sure that x, y, z and, respectively, pitch, roll and yaw are represented differently by the frame.
1.)
How are these related in certain situations? Would there be cases where getting a value, for example for yaw, needs additional information from the use of x, y, z?
2.)
Can you explain a little more on this:
(deviceMotion.rotationRate.z - previousRotationRateZ) / (currentTime - previousTime)
Would we need to use a timer for the time values? And how would making use of the above produce an angular acceleration? I thought angular acceleration entailed more complex maths.
3.)
In a real-world situation we can hardly rely on a single value from pitch, roll and yaw, because it would be impossible for us to make a rotation around only one axis (our hand is not that "stable", especially after 5 cups of coffee...).
Let's say I would like to get the values of yaw (yes, rotation around the z-axis), but as yaw spins I also want to check it against pitch (x-axis).
Yes, 2 motions combine here (imagine the phone is rotating around z with a slight movement going towards and away from the user's face).
So: is there a mathematical model (or one from your own personal experience) to derive a value from the values of different axes? (Sample case: if the user is spinning on the z-axis and at the same time also making a movement on the x-axis - good. If not, it's not a motion we need.) Sample case just off the top of my head.
I hope my sample case above with both yaw and pitch makes sense to you. If not, please feel free to cite a better use case for explanation.
4.)
Lastly, time. How can we use time as a reference to check how fast a movement is compared to the last one? Should we provide a tolerance (example: "less than 1/50 of a second since the last movement - do something; if not, do nothing")? Where and when do we set a timer?
The class reference of CMAccelerometerData says:
X-axis acceleration in G's (gravitational force)
The acceleration is measured in local coordinates as shown in figure 4-1 of the Event Handling Guide. It's always a translation and must not be confused with radial or circular motions, which are measured in angles.
Anyway, every rotation, even one with a constant angular velocity, is related to a change in direction, and thus an acceleration is reported as well; see Circular Motion.
What do you mean by "get the acceleration rate (how fast) for yaw"?
Based on figure 4-2 in Handling Rotation Rate Data, the yaw rotation occurs around the Z axis. That means there is a continuous linear acceleration in the X,Y plane. If you are interested in angular acceleration, you need to take CMDeviceMotion.rotationRate and divide it by the time delta, e.g.:
(deviceMotion.rotationRate.z - previousRotationRateZ) / (currentTime - previousTime)
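The same finite-difference idea as a small sketch (Python-flavored pseudocode of the bookkeeping; CoreMotion itself is Objective-C/Swift, and the timestamps should come from CMDeviceMotion rather than your own timer):

class AngularAccelerationEstimator:
    # Finite-difference estimate of angular acceleration around one axis
    # (rad/s^2), fed with rotationRate.z and the CMDeviceMotion timestamp.
    def __init__(self):
        self.prev_rate = None
        self.prev_time = None

    def update(self, rate, timestamp):
        accel = None
        if self.prev_time is not None and timestamp > self.prev_time:
            accel = (rate - self.prev_rate) / (timestamp - self.prev_time)
        self.prev_rate, self.prev_time = rate, timestamp
        return accel  # None until two samples have arrived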
Update:
It depends on what you want to do and which motions you are interested in tracking. I hope you don't want to get the exact device position in x,y,z when doing a translation, as this is impossible. The orientation, i.e. the rotation relative to g, can of course be determined very well.
I think in >99% of all cases you won't need additional information from accelerations when working with angles.
Don't use your own timer. CMDeviceMotion inherits from CMLogItem and thus provides a perfectly matching timestamp of the sensor data, or respectively the interpolated time for the result of the sensor-fusion algorithm.
I assume that you don't need angular acceleration.
You are totally right, even without coffee ;-) If you look at the motions shown in this video, there is exactly the situation you describe. The maths and algorithms were the result of some heavy R&D, and I am bound by an NDA.
But most use cases are covered by the properties available in CMAttitude. Be cautious with Euler angles when doing calculations, because of gimbal lock.
Again this totally depends on what you are up to.
