What is the reference north for the DJI Onboard SDK (magnetic or true)? - dji-sdk

The Onboard SDK has several parameters that are referenced to north, but the documentation does not indicate which north (true or magnetic). I'm particularly interested in the acceleration relative to ground, since that's measured with accelerometers - is the magnetic variation removed from that number before it is reported? Also, the reported YAW angle...
Thanks!

DJI software uses magnetic north. Any reference to true north in the docs I take to mean "true" magnetic north.
As far as I know, questions about the accelerometer and yaw calculation are not related to the compass.
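If you do need a true-north heading, the usual approach is to add the local magnetic declination to the reported (magnetic) yaw. Here is a minimal sketch, assuming you look up the declination for your location yourself (e.g. from a World Magnetic Model calculator); the value in the example is only illustrative, not something the SDK provides:

```python
# Minimal sketch: converting a magnetic yaw/heading to a true heading.
# The declination value is an assumption; in practice look it up for your
# location (e.g. from a World Magnetic Model table or an online calculator).

def magnetic_to_true(yaw_magnetic_deg: float, declination_deg: float) -> float:
    """Return a true-north-referenced heading in [0, 360).

    yaw_magnetic_deg: heading reported relative to magnetic north, in degrees.
    declination_deg:  local magnetic declination, positive when magnetic north
                      lies east of true north.
    """
    return (yaw_magnetic_deg + declination_deg) % 360.0


if __name__ == "__main__":
    # Example: a reported yaw of 90 deg (magnetic) with a +4.5 deg declination
    # corresponds to a true heading of 94.5 deg.
    print(magnetic_to_true(90.0, 4.5))
```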

Related

3D stereo camera and 2D LiDAR point cloud data fusion

I have two separate point clouds (type: sensor_msgs/PointCloud2) from two different sensors, a 3D stereo camera and a 2D LiDAR. I wanted to know how I can fuse these two point clouds, given that the stereo point cloud is 3D with a fixed length and the 2D LiDAR point cloud has a variable length.
If someone has worked on this, please help me; your help will be highly appreciated.
Thanks
I studied this in my research.
The first step is to calibrate the two sensors to obtain their extrinsics. There are a few open-source packages you can play with, which I have listed below.
The second step is to fuse the data. The simple way is to apply the calibration transform and publish it via tf (see the sketch after the package list). The more involved way is to deploy pipelines such as depth-image-to-LiDAR alignment and depth-map variance estimation and fusion. You can take the easier route, e.g. an EKF estimator that includes landmarks, or follow Ji Zhang's (CMU) visual-LiDAR-inertial fusion work for direct 3D-feature-to-LiDAR alignment. The choice is yours.
(1) http://wiki.ros.org/velo2cam_calibration
Guindel, C., Beltrán, J., Martín, D. and García, F. (2017). Automatic Extrinsic Calibration for Lidar-Stereo Vehicle Sensor Setups. IEEE International Conference on Intelligent Transportation Systems (ITSC), 674–679.
Pros: pretty accurate and an easy-to-use package. Cons: you have to make a rigid calibration board.
(2) https://github.com/ankitdhall/lidar_camera_calibration
LiDAR-Camera Calibration using 3D-3D Point correspondences, arXiv 2017.
Pros: easy to use, easy to make the hardware. Cons: may not be as accurate.
There were a couple of others I listed in my thesis; I'll go back, check, and update here if I remember.
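To make the "simple way" concrete, here is a minimal ROS 1 sketch that looks up the calibrated extrinsic on tf and reprojects the 2D LiDAR cloud into the stereo camera frame. The topic and frame names are assumptions for illustration; substitute your own:

```python
# Minimal sketch of the "simple" fusion path in ROS 1 (Python): once the
# LiDAR->camera extrinsic is calibrated and published on tf, transform the
# 2D LiDAR cloud into the stereo camera frame and republish it.

import rospy
import tf2_ros
from sensor_msgs.msg import PointCloud2
from tf2_sensor_msgs.tf2_sensor_msgs import do_transform_cloud


class LidarToCameraFrame:
    def __init__(self):
        self.tf_buffer = tf2_ros.Buffer()
        self.tf_listener = tf2_ros.TransformListener(self.tf_buffer)
        self.pub = rospy.Publisher("lidar_in_camera_frame", PointCloud2, queue_size=1)
        rospy.Subscriber("scan_cloud", PointCloud2, self.callback, queue_size=1)

    def callback(self, cloud):
        try:
            # Look up the calibrated extrinsic published on tf.
            transform = self.tf_buffer.lookup_transform(
                "stereo_camera_frame",      # target frame (assumed name)
                cloud.header.frame_id,      # source frame (the LiDAR)
                cloud.header.stamp,
                rospy.Duration(0.1))
        except (tf2_ros.LookupException,
                tf2_ros.ConnectivityException,
                tf2_ros.ExtrapolationException) as exc:
            rospy.logwarn("tf lookup failed: %s", exc)
            return
        # Reproject the LiDAR points into the camera frame; downstream you can
        # concatenate or associate them with the stereo cloud.
        self.pub.publish(do_transform_cloud(cloud, transform))


if __name__ == "__main__":
    rospy.init_node("lidar_to_camera_frame")
    LidarToCameraFrame()
    rospy.spin()
```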

DJI Mobile SDK - how is distance between adjacent waypoints calculated?

Using the DJI Mobile SDK to upload Waypoint Missions, if two adjacent waypoints are determined by the DJI to be too close (within 0.5 meters), the upload is rejected.
Does anyone know the algorithm used to determine the distance between adjacent waypoints in a waypoint mission?
Specifically, is the DJI algorithm using a haversine calculation for the distance between lat/lon coordinates and, if so, what earth radius is used? Is it the IUGG mean radius of 6371008.8 meters, or some other radius?
Or does it use the ellipsoidal Vincenty formula (WGS-84)?
This information would be useful for more precise waypoint decimation prior to mission upload.
First off, I would comment that DJI answering an internal implementation question is very unlikely, since it would expose them to having to support that implementation over time and across aircraft. Different aircraft and different technologies may result in varying implementations.
What has always worked for me is to use standard "distance between points" calculations, either from common map formulas or as built into the platform SDK (iOS, Android, etc.). I have found these to be accurate enough to plan even complex flights.
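For reference, here is a minimal haversine sketch you could use for your own pre-upload waypoint decimation. Whether DJI uses this formula or this radius internally is unknown; the IUGG mean radius from the question is used here only as an assumption:

```python
# Minimal sketch of a haversine great-circle distance between two lat/lon
# pairs, i.e. the kind of standard "distance between points" calculation
# suggested above. The radius below is the IUGG mean Earth radius and is an
# assumption for illustration, not DJI's internal value.

import math

EARTH_RADIUS_M = 6371008.8  # IUGG mean radius (assumption)


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))


if __name__ == "__main__":
    # Two points roughly 1.5 m apart along a meridian (~1.35e-5 deg latitude).
    print(round(haversine_m(48.0, 11.0, 48.0000135, 11.0), 2))
```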
Based on several tests, I can now confirm that the current DJI internal distance computation depends on latitude and/or longitude, meaning that you will get different results for the same (!) waypoint pair distance depending on where your two points are anchored.
A waypoint pair 1.5 meters apart got accepted as a mission at a location in central Europe but was rejected with WAYPOINT_DISTANCE_TOO_CLOSE at a location in the central US.
(We verified with https://gps-coordinates.org/distance-between-coordinates.php that both waypoint distance pairs had the same 1.5 meter distance between them.)
So it's safe to assume that DJI has a bug in their distance calculation.

DJI virtual stick body direction

[Sketch image: DJI drone heading north-east]
If my drone is controlled with sendvirtualstickcommanddata() in velocity mode and the body coordinate system,
what will my drone's direction be?
To my knowledge ...
+Pitch is south-east and
+Roll is north-east.
Is this right?
(Sorry, the drone is damaged so I'm not able to simulate this, but I'm eager to know.)
[DJI virtual stick table image]
You're getting off track completely.
Pitch, Roll & Yaw are relative to the body of the aircraft regardless of its compass direction (North, South, East, West).
When using the virtual stick to navigate in the world coordinate system, you need to build a control system (feedback system) and translate objectives into commands.
Basically, monitor the compass value of the aircraft and send commands to get closer to the value that you want. Yaw commands will be enough to realign to the compass direction that you want.
It's not something that will fit here so you'll have to dig into it yourself.
For more info on the theory, check out this article.
There used to be a wonderful article written by DJI that describes the world coordinate systems the aircraft can use, but it was linked from the old developer forums (which no longer exist).
Maybe someone has a copy of the article; I believe it will answer your questions.
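As a starting point, here is a minimal sketch of that kind of feedback loop. `read_compass_heading()` and `send_yaw_rate_command()` are hypothetical placeholders for your SDK's heading telemetry and virtual-stick yaw-rate call; they are not real DJI API methods:

```python
# Minimal sketch of a proportional yaw feedback loop around a compass heading.
# read_compass_heading() and send_yaw_rate_command() are hypothetical
# placeholders for the heading callback and the virtual-stick yaw-rate command
# in whichever SDK binding you use; they are not real DJI API calls.

import time


def shortest_angle_error(target_deg: float, current_deg: float) -> float:
    """Signed heading error in (-180, 180], turning the short way around."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0


def turn_to_heading(target_deg, read_compass_heading, send_yaw_rate_command,
                    kp=0.8, max_rate=30.0, tolerance_deg=2.0):
    """Command a yaw rate proportional to the heading error until within tolerance."""
    while True:
        error = shortest_angle_error(target_deg, read_compass_heading())
        if abs(error) < tolerance_deg:
            send_yaw_rate_command(0.0)  # close enough: stop turning
            return
        rate = max(-max_rate, min(max_rate, kp * error))  # deg/s, clamped
        send_yaw_rate_command(rate)
        time.sleep(0.1)  # virtual stick commands are typically sent at ~10 Hz
```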

Mobile SDK: ActiveTrack Mission for reaching a specified point on a concrete wall

I wonder whether I can use the DJI Mobile SDK ActiveTrack Mission API not for targeting moving objects but for reaching a specified point on a concrete wall.
If the concrete wall is too homogeneous to identify, I am considering lighting up the destination point with a laser pointer to mark it.
Can I use the Mobile SDK ActiveTrack Mission for such a purpose?
Given the way ActiveTrack works, I am certain it will not satisfy your requirements.
ActiveTrack tracks a "moving" object. If you point to the object (i.e., the wall), the camera and aircraft may identify and follow the wall, but until the wall moves the aircraft will hover in place (the same as it does when you start with a human target).

Remove GPS on Matrice 100

My team and I are programming indoor flight for the Matrice 100, and we have no use for the GPS.
Is it possible to remove it?
Also, sometimes at floor level we have electromagnetic problems and the drone refuses to turn on the rotors; is there any way to force it?
We use Guidance, and I have noticed that even without GPS and with electromagnetic interference the drone is stable.
As of Mar 2018, on ALL DJI drones you need to have at least the compass connected in order to start flight.
Can't you just remove the GPS module from the M100? The module rises above the rest of the craft; it's that little white puck with "DJI" written on it in red.
Alternatively, I've heard of people covering it with tin foil to prevent the GPS signal from coming through.
