DJI SDK - Fly drone in circular orbit with a set radius (in meters/feet), and Hot Point mission for the Air 2S

I'm trying to fly the drone in a circular orbit around a car standing still on the ground.
The drone flies around the car in a clockwise circle correctly if it is initially placed perpendicular to the front-left tyre at a fixed distance. But when the drone is placed on the right or left side of the car, it flies a circular orbit starting from the point where it was placed, which is offset from the car, so it does not orbit around the car, as shown in Pic 2.
Pic 1 -
Pic 2 -
This is my code:
var flightController: DJIFlightController?
var timer: Timer?
var radians: Float = 0.0
let velocity: Float = 0.1
@IBAction func actionOrbit(_ sender: Any) {
setupFlightMode()
// Schedule the timer at 2 Hz, although DJI recommends sending virtual stick commands at between 5 and 25 Hz
timer = Timer.scheduledTimer(timeInterval: 0.5, target: self, selector: #selector(timerLoop), userInfo: nil, repeats: true)
}
private func setupFlightMode() {
// Reset radians
radians = 0.0
// Invalidate timer if necessary
// This allows switching between flight modes
if timer != nil {
print("invalidating")
timer?.invalidate()
}
}
@objc func timerLoop() {
radians += velocity
if radians >= 6.283 { // 2π (360 degrees): stop the drone once one full orbit is complete
self.timer?.invalidate()
radians = 0.0
self.verticalMoveUpward()
//vertical throttle and then move forward and land
return
}
let x = cos(radians)
let y = sin(radians)
let z: Float = 0
sendControlData(x: x, y: y, z: z)
}
private func sendControlData(x: Float, y: Float, z: Float) {
print("Sending x: \(x), y: \(y), z: \(z), yaw: \(yaw)")
// Construct the flight control data object
var controlData = DJIVirtualStickFlightControlData()
controlData.verticalThrottle = z //throttle // in m/s
controlData.roll = x //roll
controlData.pitch = y //pitch
controlData.yaw = self.yaw
// Send the control data to the FC
self.flightController?.send(controlData, withCompletion: { (error: Error?) in
// There's an error so let's stop
if let error = error {
print("Error sending data: \(error.localizedDescription)")
// Disable the timer so we stop sending commands
self.timer?.invalidate()
}
})
}
The drone model I'm using: Air 2S
Question 1 :
How can I make the drone fly in a circular orbit around the car, irrespective of where the drone is initially placed?
Question 2 :
How can I set the radius (in meters or feet) of the circular orbit? Currently, every 0.5 seconds the angle increases by 0.1 radians until it reaches 6.283 (2π). Does the radius have something to do with the frequency, i.e. the 2 Hz mentioned in the comment?
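To make the numbers concrete, here is my own back-of-the-envelope take on the geometry. The tangential speeds below are assumptions, since I don't know exactly how the stick values map to m/s:

```python
# Back-of-the-envelope check of the orbit geometry implied by the code above.
# Assumptions: the timer fires every 0.5 s, each tick advances the angle by
# 0.1 rad, and the commanded stick values translate to some tangential speed
# in m/s (the actual mapping depends on the virtual stick control mode).

TICK_INTERVAL_S = 0.5   # timer period (2 Hz)
ANGLE_STEP_RAD = 0.1    # the "velocity" constant in the Swift code

omega = ANGLE_STEP_RAD / TICK_INTERVAL_S   # angular rate, rad/s
print(omega)  # 0.2 rad/s, i.e. one full orbit in 2*pi/0.2 ≈ 31.4 s

def orbit_radius(tangential_speed_mps: float) -> float:
    """Radius of the circle flown at a given tangential speed.

    For uniform circular motion, v = omega * r, so r = v / omega.
    """
    return tangential_speed_mps / omega

print(orbit_radius(1.0))  # 5.0 m radius at 1 m/s tangential speed
```

So the radius is set indirectly by the ratio of tangential speed to angular rate, which is why both the 0.1 step and the 2 Hz timer matter.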
Question 3 :
As per the documentation there is a Hot Point mission, in which an object can be identified and set, and the drone can then fly in a circular orbit around it.
Is this possible for the Air 2S, and if so, how can I provide the radius (in meters) instead of a latitude and longitude?

If I understand you correctly, you are trying to change the roll and pitch as a function of time?
That will never work. You have to check the drone's position and have a PID regulator adjust for the error relative to your desired path.
The GPS position is fused from GPS and IMU.
If you have no GPS, only the IMU (and possibly vision sensors) is used, and lat/lon = 0 from the takeoff point.
You also risk getting DJI's jerky gimbal movement, since it's a flaw in their implementation of VirtualStick. Not much you can do about that.
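A rough sketch of what that closed loop could look like, in plain Python rather than DJI SDK calls; the helper name, gain, and coordinate conventions are all made up for illustration:

```python
import math

# Sketch of the closed-loop approach: instead of commanding roll/pitch
# open-loop as a function of time, compute where the drone *should* be on
# the circle around the car and steer toward that point. A full PID would
# add integral and derivative terms; this uses only a proportional gain.

def orbit_velocity_command(drone_xy, center_xy, radius, omega, t, kp=0.8):
    """Return an (vx, vy) velocity command that tracks a circle.

    drone_xy : current drone position (from GPS/IMU fusion), metres
    center_xy: the car's position, metres
    radius   : desired orbit radius, metres
    omega    : desired angular rate, rad/s
    t        : elapsed time, s
    kp       : proportional gain on the position error
    """
    # Desired point on the circle at time t
    target = (center_xy[0] + radius * math.cos(omega * t),
              center_xy[1] + radius * math.sin(omega * t))
    # Feed-forward velocity tangent to the circle
    ff = (-radius * omega * math.sin(omega * t),
           radius * omega * math.cos(omega * t))
    # Proportional correction toward the desired point
    ex, ey = target[0] - drone_xy[0], target[1] - drone_xy[1]
    return (ff[0] + kp * ex, ff[1] + kp * ey)

# If the drone sits exactly on the circle at t = 0, the command is purely
# the tangential feed-forward term, roughly (0, 1) here:
print(orbit_velocity_command((5.0, 0.0), (0.0, 0.0), 5.0, 0.2, 0.0))
```

Because the command is computed from the car's position, it works regardless of where the drone starts; the correction term pulls it onto the circle.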

Related

AFrame & Three.JS detecting collision between moving point and box which happens between frames

I'm trying to implement a "bullet and target collision" problem and create an explosion when the collision occurs. I managed to do it using aframe-physics-system, which was working well: the explosion was rendered at the exact point of the collision and at the exact time. Now I've decided to get rid of the physics system, as I don't need such overhead; my only goal is to render an explosion.
I tried to use box.containsPoint as well as Raycaster:
tick(time: number, delta: number): void {
// bullet-component
// ...
// Update speed based on acceleration
this.speed = this.currentAcceleration * .01 * delta;
if (this.speed > this.data.maxSpeed) {
this.speed = this.data.maxSpeed;
}
// there is an initial position and direction set in data property.
const newBulletPosition = this.position.add(this.direction.multiplyScalar(this.speed));
// targets is an array of boxes
const found = this._detectCollision(newBulletPosition, this.targets);
if (found) {
console.log("found!");
this.resetBullet();
this.el.emit("collide", {
coordinates: newBulletPosition//found
});
return;
}
this.el.object3D.position.set(newBulletPosition.x, newBulletPosition.y, newBulletPosition.z);
},
_detectCollision(point: THREE.Vector3, obj: THREE.Object3D[]): THREE.Vector3 | null {
const ray = new THREE.Raycaster(point,
this.temps.direction.clone().multiplyScalar(-1).normalize());
const intersects = ray.intersectObjects(obj, true);
return intersects.length % 2 === 1 ? intersects[0].point : null;
},
_box: new THREE.Box3(),
_inverseWorldMatrix: new THREE.Matrix4(),
_inverseBulletPosition: new THREE.Vector3(),
_detectCollision2(point: THREE.Vector3, obj: THREE.Object3D): boolean {
obj.updateMatrixWorld(true);
this._inverseWorldMatrix.copy(obj.matrixWorld).invert();
this._box.setFromObject(obj);
this._inverseBulletPosition.set(point.x, point.y, point.z);
this._inverseBulletPosition.applyMatrix4(this._inverseWorldMatrix);
return this._box.containsPoint(this._inverseBulletPosition);
}
But both approaches have the following flaw:
On frame X the bullet is just in front of a box, but on frame X+1 it is already behind that box. In this case intersections may still be reported, but the last bullet position differs from the intersection point, which causes the explosion to be rendered in the wrong position. So the second approach works only if the bullet, during one of its "jumps", lands inside a box, which is far from frequent.
The question is how in this case I can repeat the behaviour I had with physics system:
Bullet is moving relatively fast
The intersection is being detected instantly once a bullet crosses any face of a box, so there is no "jump" in bullet's movement.
Thanks in advance.
This is a common problem when trying to recreate the calculations of a physics engine. Since your bullet is too small and sometimes travels beyond the wall in between frames, I see two options:
On frame x+1 you could calculate how much distance has been travelled since frame x, and use that as the size of the bullet. If the plane is crossed within the distance travelled between x and x+1, then you know you've had a collision.
If collision points don't move, you could use a THREE.Raycaster and calculate the point of collision pre-emptively, so you'll know where the bullet will hit before that point is reached:
const raycaster = new THREE.Raycaster();
shoot() {
raycaster.set(origin, direction);
const intersects = raycaster.intersectObjects(arrayOfWalls);
// No intersection took place
if (intersects[0] == undefined) return;
// How far away from origin the collision takes place.
intersects[0].distance;
// The Vector3 where the bullet crosses the wall
intersects[0].point;
}
You can read more about Raycasters in the docs.
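If it helps, option 1 can be sketched outside of three.js entirely. This is a plain-Python segment-vs-box (slab method) test, with made-up names, just to show the idea of testing the whole per-frame travel segment instead of a single point:

```python
# Swept test: instead of asking whether the bullet's point is inside the box
# on a given frame, intersect the segment it travelled during the frame with
# the box. This catches the case where the bullet is in front of the wall on
# frame X and behind it on frame X+1. The box is axis-aligned here.

def segment_hits_aabb(p0, p1, box_min, box_max):
    """Return the entry point if segment p0->p1 crosses the AABB, else None."""
    d = [b - a for a, b in zip(p0, p1)]
    t_enter, t_exit = 0.0, 1.0
    for axis in range(3):
        if abs(d[axis]) < 1e-12:
            # Segment is parallel to this slab; must already be inside it
            if not (box_min[axis] <= p0[axis] <= box_max[axis]):
                return None
            continue
        t0 = (box_min[axis] - p0[axis]) / d[axis]
        t1 = (box_max[axis] - p0[axis]) / d[axis]
        if t0 > t1:
            t0, t1 = t1, t0
        t_enter, t_exit = max(t_enter, t0), min(t_exit, t1)
        if t_enter > t_exit:
            return None  # slabs don't overlap: no intersection
    return tuple(p + t_enter * dd for p, dd in zip(p0, d))

# Bullet jumps from z=-2 to z=+2 straight through a unit box at the origin:
hit = segment_hits_aabb((0, 0, -2), (0, 0, 2),
                        (-0.5, -0.5, -0.5), (0.5, 0.5, 0.5))
print(hit)  # (0.0, 0.0, -0.5): the entry point on the near face
```

The returned entry point is exactly where the explosion should be rendered, no matter how far the bullet "jumped" that frame.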
Thanks to #Marquizzo, I ended up with the following solution:
I'm casting a ray from the bullet position back toward the position of the gun. If there is one intersection, the bullet is inside the box, so I can render an explosion at the intersection position. If there are two intersections, I take the second one, since it is farther from the ray origin and hence closer to the gun. I also had to calculate the distance between the bullet position and the intersection, which, as was advised, should be less than the distance the bullet travelled between frames:
tick(time: number, delta: number): void {
const el = this.el;
if (!el) {
console.warn("AFRAME entity is undefined.");
return;
}
this.el.object3D.lookAt(this.direction.clone().multiplyScalar(1000));
// Update acceleration based on the friction
this.temps.position.copy(this.el.object3D.position);
// Update speed based on acceleration
this.speed = this.currentAcceleration * 0.05 * delta;
if (this.speed > this.data.maxSpeed) {
this.speed = this.data.maxSpeed;
}
// Set new position
this.temps.direction.copy(this.direction);
const newBulletPosition = this.temps.position.add(this.temps.direction.multiplyScalar(this.speed));
if (newBulletPosition.length() >= FADE_DISTANCE) {
this.resetBullet();
return;
}
const found = this._detectCollision(newBulletPosition, this.targetCollisionShapes);
if (found) {
const jumpDistance = newBulletPosition.clone().sub(this.el.object3D.position).length();
const collisionDistance = newBulletPosition.clone().sub(found).length();
if (collisionDistance < jumpDistance) {
console.log("found!");
this.resetBullet();
this.el.emit("collide", {
target: this.target,
coordinates: found
} as CollisionEvent);
return;
}
}
this.el.object3D.position.set(newBulletPosition.x, newBulletPosition.y, newBulletPosition.z);
},
_detectCollision(point: THREE.Vector3, obj: THREE.Object3D[]): THREE.Vector3 | null {
const ray = new THREE.Raycaster(point, this.direction.clone().multiplyScalar(-1).normalize());
const intersects = ray.intersectObjects(obj, true);
return intersects.length % 2 === 1
? intersects[0].point
: intersects.length > 1 ? intersects[1].point : null;
}

threejs rotate the object gradually to where camera is looking using orbit control

I'm planning to use OrbitControls to do a simple third-person camera view, but I can't seem to figure out how to do it.
When I rotate the camera around an object and press, say, the “W” key to move forward, I want the object to gradually rotate and move toward the new direction the camera is facing.
How can I do that?
It's possible to do exactly that by gradually rotating the object to the camera direction.
Made a codepen here which uses a generic replacement to orbit controls for simplicity:
https://codepen.io/cdeep/pen/QWMWyYW
// Get the X-Z plane in which camera is looking to move the player
camera.getWorldDirection(tempCameraVector);
const cameraDirection = tempCameraVector.setY(0).normalize();
// Get the X-Z plane in which player is looking to compare with camera
model.getWorldDirection(tempModelVector);
const playerDirection = tempModelVector.setY(0).normalize();
// Get the angle to x-axis. z component is used to compare if the angle is clockwise or anticlockwise since angleTo returns a positive value
const cameraAngle = cameraDirection.angleTo(xAxis) * (cameraDirection.z > 0 ? 1 : -1);
const playerAngle = playerDirection.angleTo(xAxis) * (playerDirection.z > 0 ? 1 : -1);
// Get the angle to rotate the player to face the camera. Clockwise positive
const angleToRotate = playerAngle - cameraAngle;
// Get the shortest angle from clockwise angle to ensure the player always rotates the shortest angle
let sanitisedAngle = angleToRotate;
if(angleToRotate > Math.PI) {
sanitisedAngle = angleToRotate - 2 * Math.PI
}
if(angleToRotate < -Math.PI) {
sanitisedAngle = angleToRotate + 2 * Math.PI
}
// Rotate the model by a tiny value towards the camera direction
model.rotateY(
Math.max(-0.05, Math.min(sanitisedAngle, 0.05))
);
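The angle-wrapping part of the snippet above is easy to get wrong, so here it is isolated as a tiny helper (plain Python rather than JS; the names are mine):

```python
import math

# Given the raw difference between the player's heading and the camera's
# heading, wrap it into (-pi, pi] so the player never turns the long way
# round, then clamp it to a per-frame step (0.05 rad in the snippet above)
# so the turn is gradual.

def shortest_step(angle_to_rotate: float, max_step: float = 0.05) -> float:
    wrapped = angle_to_rotate
    if wrapped > math.pi:
        wrapped -= 2 * math.pi
    elif wrapped < -math.pi:
        wrapped += 2 * math.pi
    # Clamp to the per-frame rotation budget
    return max(-max_step, min(wrapped, max_step))

print(shortest_step(3.5))   # 3.5 wraps to ~-2.78, then clamps to -0.05
print(shortest_step(0.01))  # already short and small: returned as 0.01
```

Calling this once per frame and feeding the result to rotateY gives the same smooth shortest-path turn as the code above.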

Gimbal rotation doesn't work with Windows SDK

I've got the sample UWP app to work with my Mavic Air, I can see camera feed etc. Now I'm trying to change Gimbal angle like this:
var connected = await DJISDKManager.Instance.ComponentManager.GetGimbalHandler(0, 0).GetConnectionAsync();
// true
var attitude = await DJISDKManager.Instance.ComponentManager.GetGimbalHandler(0, 0).GetGimbalAttitudeAsync();
// pitch: 0, roll: 0, yaw: -124
var range = await DJISDKManager.Instance.ComponentManager.GetGimbalHandler(0, 0).GetGimbalAttitudeRangeAsync();
// pitch: max 17 min -90; yaw: max 0 min 0; roll: max 0 min 0;
var angle = new GimbalAngleRotation() { mode = GimbalAngleRotationMode.ABSOLUTE_ANGLE, pitch = -20, yaw = 10, roll = 0 };
var resp = await DJISDKManager.Instance.ComponentManager.GetGimbalHandler(0, 0).RotateByAngleAsync(angle);
// PARAM_OUT_OF_RANGE
As you can see, the range for yaw & roll is 0 to 0 (none), yet the actual yaw value is -124. In my understanding the Mavic Air has a 3-axis gimbal, so I should have a wider range for each axis.
Also, when I try to change yaw or roll I get PARAM_OUT_OF_RANGE response. Changing pitch only results in NO_ERROR response, but I see no difference in Gimbal angle.
For this issue, you can try the following to resolve it:
You need to set GimbalAngleRotation.duration (measured in seconds) to a non-zero value in the GimbalAngleRotation struct to rotate the gimbal.
Currently, the Windows SDK doesn't support gimbal rotation on the yaw and roll axes.
I also have the same issue. I cannot get the gimbal to do anything with the RotateByAngleAsync method. The only way I got the gimbal to do anything is by using RotateBySpeedAsync. This wouldn't be an issue if it worked reliably, but sometimes it also does nothing; it works about one time out of ten. :-/ If it works, it keeps working as long as I do not restart the application. I haven't figured out yet what to do to "reset" it.
// Defined somewhere else
gimbalHandler = DJISDKManager.Instance.ComponentManager.GetGimbalHandler(0, 0);
// In my control method
var gimbalRotation = new GimbalSpeedRotation();
gimbalRotation.pitch = 4;
gimbalHandler.RotateBySpeedAsync(gimbalRotation);

Metal - I can't draw more than 2048 points using a single buffer

This is my first post here, so please forgive any unintentional breach of protocol or etiquette, thanks!
BASIC PROBLEM: MetalKit doesn't seem to be drawing all the lines I'm trying to display.
DETAILS: I'm into about week 3 of learning the Metal Frameworks (mostly thru MetalKit on OS X). So far I've managed to put together a MetalView displaying an audio wave from a file on disk, with a swiper bar that travels across the screen as the audio is playing back.
The audio wave is just a set of points representing sound levels, each pair of which is connected by a line, which eventually looks like something one would see in GarageBand or Logic etc.
The problem I am having is that Metal doesn't draw all the points I believe I am asking it to. Through trial and error I discovered it stops after drawing 2048 points (a power of two!). I can verify that I'm feeding the data in correctly; that is, I'm gathering enough points to draw the wave fully, with the correct coordinates to draw the entire wave, but somewhere between creating the buffer and asking Metal to draw it, the data gets clipped to 2048 points. The rest of the audio just doesn't show up.
So I'm wondering if there is some buffer data limit of my creation, or in Metal itself, that would cause this. I've gotten around it by using multiple buffers, but this feels like a band-aid fix, and it troubles me that I don't understand the cause.
The setup is fairly barebones, no textures or scaling (that I'm aware of ... like I said I'm just starting)
Here are my classes:
// Shaders.metal
#include <metal_stdlib>
using namespace metal;
struct Vertex {
float4 position [[position]];
float4 color;
};
struct Uniforms {
float4x4 modelMatrix;
};
vertex Vertex vertex_func(constant Vertex *vertices [[buffer(0)]],
constant Uniforms &uniforms [[buffer(1)]],
uint vid [[vertex_id]]) {
float4x4 matrix = uniforms.modelMatrix;
Vertex in = vertices[vid];
Vertex out;
out.position = matrix * float4(in.position);
out.color = in.color;
return out;
}
fragment float4 fragment_func(Vertex vert [[stage_in]]) {
return vert.color;
}
Here's some matrix utils (mostly for future use, currently returning unity) - adapted from online Metal tutorial by Marius Horga:
// MathUtils.swift
// chapter07
//
// Created by Marius on 3/1/16.
// Copyright © 2016 Marius Horga. All rights reserved.
// adapted for personal use
import simd
struct Vertex {
var position : vector_float4
var color : vector_float4
init(pos: vector_float4, col: vector_float4) {
position = pos
color = col
}
}
struct Matrix {
var m: [Float]
init() {
m = [1, 0, 0, 0,
0, 1, 0, 0,
0, 0, 1, 0,
0, 0, 0, 1
]
}
func modelMatrix(matrix: Matrix) -> Matrix {
return matrix // for now, just the identity
}
}
Here's the view:
// MetalView
import MetalKit
public class TesterMetalView: MTKView {
var vert_audio_buffer : MTLBuffer!
var uniform_buffer : MTLBuffer!
var rps : MTLRenderPipelineState! = nil
required public init(coder: NSCoder) {
super.init(coder: coder)
createBuffers()
registerShaders()
}
override public init(frame frameRect: CGRect, device: MTLDevice?) {
super.init(frame: frameRect, device: device)
createBuffers()
registerShaders()
}
override public func drawRect(dirtyRect: NSRect) {
super.drawRect(dirtyRect)
if let rpd = currentRenderPassDescriptor,
drawable = currentDrawable {
rpd.colorAttachments[0].clearColor = MTLClearColorMake(0.5, 0.5, 0.5, 1.0)
let command_buffer = device!.newCommandQueue().commandBuffer()
let command_encoder = command_buffer.renderCommandEncoderWithDescriptor(rpd)
command_encoder.setRenderPipelineState(rps)
command_encoder.setVertexBuffer(vert_audio_buffer, offset: 0, atIndex: 0)
command_encoder.setVertexBuffer(uniform_buffer, offset: 0, atIndex: 1)
let numVerts = (vert_audio_buffer.length / sizeof(Vertex))
command_encoder.drawPrimitives(.Line, vertexStart: 0, vertexCount: numVerts)
command_encoder.endEncoding()
command_buffer.presentDrawable(drawable)
command_buffer.commit()
}
}
func createBuffers() {
if device == nil { self.device = MTLCreateSystemDefaultDevice() }
// rotation + scaling
uniform_buffer = device!.newBufferWithLength(sizeof(Float) * 16, options: [])
let bufferPointer = uniform_buffer.contents()
memcpy(bufferPointer, Matrix().modelMatrix(Matrix()).m, sizeof(Float) * 16)
}
func registerShaders() {
if device == nil { self.device = MTLCreateSystemDefaultDevice() }
if let library = device!.newDefaultLibrary() {
let vertex_func = library.newFunctionWithName("vertex_func")
let frag_func = library.newFunctionWithName("fragment_func")
let rpld = MTLRenderPipelineDescriptor()
rpld.vertexFunction = vertex_func
rpld.fragmentFunction = frag_func
rpld.colorAttachments[0].pixelFormat = .BGRA8Unorm
do {
try rps = device!.newRenderPipelineStateWithDescriptor(rpld)
} catch {
Swift.print("***ERROR: newRenderPipelineStateWithDescriptor failed ...\r\t\(error)")
}
}
}
}
And here's how the metal buffers are created:
// within Metal View Controller
var verts_audio = [Vertex]()
// ... create Vertex's from audio data
// I can confirm verts_audio contains valid data
// however, Metal draws only the first 2048 of them
let bufferLength = sizeof(Vertex) * verts_audio.count
// metalV() gets a typed reference to the view
metalV().vert_audio_buffer = metalV().device!
.newBufferWithBytes(verts_audio,
length : bufferLength,
options : [])
So, if I try to draw a wave with vertices containing 2838 points, and put them all in a single buffer using the above code, I get this:
screen shot - clipped drawing
If I spread the same vertices amongst multiple buffers, each containing 2048 Vertexes (code not shown), I get the full wave (lighter lines show extra buffers):
screen shot - full wave
I'm probably doing something bone-headed or obvious. I'd sure appreciate someone smarter than me shedding some light on this. Thanks!
I think I've been fighting this today as well. While searching for an answer I found "Is there a size limit to newBufferWithBytes()?".
In it, the answer points to the Apple documentation: Metal Feature Set Tables.
There it says that the "Maximum memory allocation for a shader or compute function variable in the constant address space" is unlimited on iOS devices but only 64 KB on OS X.
So in the vertex shader, I believe, "constant Vertex *vertices" should be "device Vertex *vertices" on OS X to use the device address space.
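To connect that limit back to the magic number in the question, the arithmetic works out exactly (assuming the Vertex layout above: two float4s with no padding):

```python
# Why 2048, exactly? The Vertex struct above is two float4s:
# position (4 floats) + color (4 floats) = 8 floats of 4 bytes each,
# i.e. 32 bytes per vertex. The macOS limit for a buffer bound in the
# constant address space is 64 KB, and 64 KB / 32 B is precisely 2048.

BYTES_PER_FLOAT = 4
FLOATS_PER_VERTEX = 8              # float4 position + float4 color
vertex_size = BYTES_PER_FLOAT * FLOATS_PER_VERTEX

constant_space_limit = 64 * 1024   # 64 KB, per the Metal Feature Set Tables
max_vertices = constant_space_limit // vertex_size
print(max_vertices)  # 2048, matching the observed clipping point
```

That the cap lands exactly on the observed 2048 is strong evidence the constant address space limit is the culprit, and switching the shader parameter to the device address space lifts it.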

Animate rotation in Unity3d

I'm working on a 2D game in Unity3D (using Orthello 2D).
Since I switched from Cocos2d and CoronaSDK, I'm wondering if there's a way of implementing the following behaviour for a sprite (or any Unity3D object) as it worked in Corona:
object = ...
transition.to ( object, { time = 1000, rotation = object.rotation + 100, onComplete = function ()
// do something
end })
So a sprite rotates by 100 degrees over 1 second.
In my script attached to the sprite I can do the rotation in my Update() function, but that's a rather different approach...
You can do it easily in an Update function.
float timer = 0f;
void Update()
{
if(timer <= 1)
{
// Time.deltaTime*100 will make sure we are moving at a constant speed of 100 per second
transform.Rotate(0f,0f,Time.deltaTime*100);
// Increment the timer so we know when to stop
timer += Time.deltaTime;
}
}
If you need to do another 100 degrees rotation you will just have to reset the timer.
You can see different version of the Rotate function here and more information about the lifesaver Time.deltaTime value here
There are several different ways of doing that. For example, using a coroutine:
IEnumerator TweenRotation(Transform trans, Quaternion destRot, float speed, float threshold)
{
float angleDist = Quaternion.Angle(trans.rotation, destRot);
while (angleDist > threshold)
{
trans.rotation = Quaternion.RotateTowards(trans.rotation, destRot, Time.deltaTime * speed);
yield return null;
angleDist = Quaternion.Angle(trans.rotation, destRot);
}
}
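For the curious, the RotateTowards stepping behaviour boils down to this (sketched in Python with a made-up helper; Unity's version works on quaternions rather than a single angle):

```python
# Each frame, move the current angle toward the target by at most
# speed * dt, so the rotation takes the same wall-clock time at any
# frame rate. This is the scalar analogue of Quaternion.RotateTowards.

def rotate_towards(current: float, target: float, max_delta: float) -> float:
    diff = target - current
    if abs(diff) <= max_delta:
        return target          # close enough: snap to the target
    return current + max_delta * (1 if diff > 0 else -1)

# Rotate 100 degrees at 100 deg/s with 0.25 s frames (~1 second total):
angle, dt, speed = 0.0, 0.25, 100.0
frames = 0
while angle != 100.0:
    angle = rotate_towards(angle, 100.0, speed * dt)
    frames += 1
print(frames)  # 4 frames of 0.25 s each, i.e. 1 second total
```

The snap-to-target branch is what terminates the loop cleanly, the same role the threshold parameter plays in the coroutine above.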
