DJI SDK Android - unwanted yaw motion on GoToAction in timeline mission

I am trying to develop a mobile application using the DJI Mobile Android SDK. The goal of the application is to navigate a Mavic 2 Pro to target GPS coordinates, automatically center the camera on a vehicle, and take a snapshot. After taking off and flying to the target altitude, a new tracking mission in spotlight mode is started to find an object and center the camera on it.
The first run goes normally after the aircraft is turned on and the mobile application runs the missions. The aircraft is then landed manually.
The second trial with the mobile application goes wrong: there is an additional yaw motion that is not part of the timeline mission. I have probably missed some cleanup method that resets the aircraft to an initial clean state.
How can I set up the aircraft to a clean state before the application starts the missions?
I don't understand why there is a 45° yaw motion in a simple timeline mission:
missionControl.scheduleElement(new TakeOffAction());
missionControl.scheduleElement(new GoToAction(2.0f));
missionControl.startTimeline();
Why does the aircraft yaw 45° after takeoff while it is climbing to the target altitude? You can see it here: https://youtu.be/-gCWFXou-WI

You never shared any other code, so below is my checklist of guesses for a possible solution.
First, clear out all other code that could steer the drone, e.g. disable drone yaw following or any other possible routine related to tracking/following, both in code and on the remote controller screen.
The easiest way to check whether this is the cause is to call:
elements.add(new GoToAction(new LocationCoordinate2D(homeLatitude+0.00001, homeLongitude+0.00001), 5));
elements.add(new GoToAction(new LocationCoordinate2D(homeLatitude-0.00001, homeLongitude-0.00001), 5));
and see if the camera still follows you at multiple locations. If it does, then something in the tracking is interfering; it could also be caused by home lock.
Secondly, GoToAction only specifies the 3D position, never the orientation. Theoretically, the aircraft can take any heading it wants, so check the API for every orientation method/setting, e.g.:
Use setFlightOrientationMode to set course lock or home lock to get your desired behavior.
void setFlightOrientationMode(@NonNull FlightOrientationMode type,
                              @Nullable CompletionCallback callback)
Package: dji.sdk.flightcontroller
SDK Key: FlightControllerKey.ORIENTATION_MODE
Description:
Sets the aircraft flight orientation relative to the Aircraft Heading, Course Lock, or Home Lock. See the Flight Controller User Guide for more information about flight orientation.
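As a minimal sketch (assuming the DJI Mobile SDK v4 API; the method wrapper, product lookup, and log tag are illustrative), explicitly resetting the orientation mode to plain aircraft heading before scheduling the timeline could look like this:
import android.util.Log;
import dji.common.flightcontroller.FlightOrientationMode;
import dji.sdk.flightcontroller.FlightController;
import dji.sdk.products.Aircraft;
import dji.sdk.sdkmanager.DJISDKManager;

// Call once after the product connects, before scheduling the timeline,
// so no leftover course/home lock adds unexpected yaw.
private void resetOrientationMode() {
    Aircraft aircraft = (Aircraft) DJISDKManager.getInstance().getProduct();
    FlightController flightController = aircraft.getFlightController();
    flightController.setFlightOrientationMode(FlightOrientationMode.AIRCRAFT_HEADING,
            djiError -> {
                if (djiError != null) {
                    Log.e("OrientationMode", djiError.getDescription());
                }
            });
}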
Lastly, I assume you have removed all other possible following modes and it still does not behave as you wish.
The API given is:
GoToAction(LocationCoordinate2D coordinate)
GoToAction(float altitude)  // altitude: target altitude in meters
GoToAction(LocationCoordinate2D coordinate, float altitude)
If setting the altitude directly has an issue, try the full command to determine whether it is a bug in the SDK or something else:
double homeLatitude = ...;  // your start GPS latitude
double homeLongitude = ...; // your start GPS longitude
elements.add(new GoToAction(new LocationCoordinate2D(homeLatitude, homeLongitude), 2));
If you are sure that you have no other routine that bothers the drone, and both GoToAction(float altitude) and GoToAction(LocationCoordinate2D coordinate, float altitude) have the same yaw problem, open a ticket at dev@dji.com.
Personally, I don't think it is DJI's issue. Because you never posted the full code, I have no idea what you have done, or what you haven't done but should have. Good luck in finding the cause of the unwanted behavior.

Related

How to set a Sprite to a specific Frame in Godot

I have the player move around, and when he enters a new room (via instancing) his sprite shows him facing in the default direction (in my case down). So if you enter a room from any other direction it looks weird, because for a short moment you can see the player facing down even if you came from the right. How can I tell Godot to set the player sprite to a specific frame in code, so I can set it to the proper frame for each direction? I'm new to Godot and I used HeartBeast's Action RPG tutorial for my movement, so it's using an AnimationTree and AnimationPlayer. I tried "set_frame" but Godot just says it doesn't know the method.
If you are following the tutorial series I think you are following (Godot Action RPG)… You are using an AnimationTree with AnimationNodeBlendSpace2D (BlendSpace2D).
The BlendSpace2D picks an animation based on an input vector called "blend_position". This way you can use the BlendSpace2D to pick an animation based on the direction of motion or the direction the player character is looking. For example, you can have "idle_up", "idle_down", "idle_left", and "idle_right" animations, and use the BlendSpace2D to pick one at runtime based on a direction vector.
Thus, you need to set the "blend_position" of the BlendSpace2D like this:
animationTree.set("parameters/NameOfTheBlendSpàce2D/blend_position", vector)
Where:
animationTree is a variable set to the AnimationTree.
"NameOfTheBlendSpàce2D" is the name of the BlendSpace2D you want to set (e.g. "Idle").
vector is a Vector2D with the direction you want (e.g. Vector2.UP).
This is shown in episode 6 of the tutorial series (Animation in all directions with an AnimationTree).
You can find a reference project by HeartBeast at arpg-reference, where you can find a function update_animation_blend_positions that looks like this:
func update_animation_blend_positions():
	animationTree.set("parameters/Idle/blend_position", input_vector)
	animationTree.set("parameters/Run/blend_position", input_vector)
	animationTree.set("parameters/Attack/blend_position", input_vector)
	animationTree.set("parameters/Roll/blend_position", input_vector)
Here "Idle", "Run", "Attack", and "Roll" are BlendSpace2D, each configured with animations for the corresponding actions, and this function updates them in sync so that they are picking the correct animation.
As far as I can tell the code from the repository is further refactored from what is show in the tutorial series. This code from the repository is under MIT licence.

Motion capture with a Kinect v1 in Processing

Hello there, I was wondering if anyone could help me with something.
I have recently been given a task to do by my teachers at college, and I hope to achieve it through motion capture.
The other lecturers teach sound art and film art, so I plan to create a program that will track the participant's movements and display the movement on screen with either set or random colours.
I would also like to use sound in this project through the participant's movements, either by changing the pitch of a noise through movement or by changing the speed of the sound through movement.
I have managed to get an Xbox 360 Kinect (model 1414) to work in Processing and have played around with the motion tracking, but I can't seem to figure out how to attach an ellipse to the hands. I hope someone can help me and that it doesn't seem too much of a hellish task.
If you can help, here is my email address: alicebmcgettigan@gmail.com
(If this is impossible I would understand, as I tend to make life difficult for myself haha.)
You will need a middleware library that can provide skeleton tracking data from depth data.
One option on Windows is the Kinect for Windows Processing library which uses the Kinect SDK.
There is another library called SimpleOpenNI which works on multiple operating systems.
The official version is no longer updated for Processing 3 (it does work with Processing 2.2.1 though). Fortunately, you can find an updated fork of the SimpleOpenNI library on GitHub.
To manually install the library:
Select the version of the library for your version of Processing (e.g. for Processing 3.5.3 go to SimpleOpenni Processing_3.5.3). It should be one of 3.5.3, 3.5.2, 3.4, 3.3.7, 3.3.6, or 2.2.1 (otherwise you may need to install one of these Processing versions).
Click Clone or download > Download ZIP (on the top right side of the repo)
Unzip the contents, and within the unzipped folder select the SimpleOpenNI folder that contains a folder named library.
Move this nested SimpleOpenNI folder (containing the library folder) to Documents/Processing/libraries
Restart Processing (if it was already running)
Go to Processing > Examples > Contributed Libraries > SimpleOpenNI > OpenNI and start playing with the examples
Other notes:
To track a user, start with the User and User3d examples.
Notice that context.getCoM() returns the centre of mass (a single point), while context.getJointPositionSkeleton() can get you the position of a hand in 3D.
You can use context.convertRealWorldToProjective() to convert a 3D position to a projected 2D position on screen (see the sketch after these notes).
Once skeleton tracking is locked onto a person you can get the joint position for each hand, but it's worth noting there is separate hand tracker functionality: check out the Hands / Hands3d examples. Depending on how you want to track participants, what the environment is, and what the motions are, choose the option that works best.
Speaking of the environment, bear in mind the Xbox 360 Kinect is susceptible to infrared light interference (for example bright incandescent lights, direct sunlight, etc.): this will deteriorate the depth map quality, which in turn affects skeleton tracking. You will want as much control over lighting as possible, and ideal lighting conditions.
Test! Test! Test! :) Think about the interaction and the environment (sketching on paper first can be useful), and for each assumption run a basic test to prove whether it works. Use iterations to learn how to change either the environment or the interaction to make it work.
Check out the RecorderPlay example: it records a .oni file which contains both RGB and depth data. This is super useful because it allows you to record on site in areas where you might have limited time access, and it will save you time not having to go back and forth between your computer and in front of the Kinect. (Once you initialize SimpleOpenNI with the path to the .oni file (e.g. context = new SimpleOpenNI(this, recordPath);) you can run the skeleton tracking and everything else using the recording.)
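To directly answer the "ellipse on the hands" part, here is a minimal Processing sketch, assuming the SimpleOpenNI fork above (the joint constant and callback names follow the library's bundled examples; adapt as needed):
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();
  context.enableUser();
}

void draw() {
  context.update();
  image(context.userImage(), 0, 0);

  int[] users = context.getUsers();
  if (users.length > 0 && context.isTrackingSkeleton(users[0])) {
    PVector hand3d = new PVector();
    PVector hand2d = new PVector();
    // 3D joint position of the right hand...
    context.getJointPositionSkeleton(users[0], SimpleOpenNI.SKEL_RIGHT_HAND, hand3d);
    // ...projected to 2D screen coordinates
    context.convertRealWorldToProjective(hand3d, hand2d);
    fill(255, 0, 0);
    ellipse(hand2d.x, hand2d.y, 20, 20);
  }
}

// callback needed so skeleton tracking starts for each detected user
void onNewUser(SimpleOpenNI curContext, int userId) {
  curContext.startTrackingSkeleton(userId);
}
Repeat the joint lookup with SimpleOpenNI.SKEL_LEFT_HAND for the other hand.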
If you want to see more about Kinect and Processing check out Daniel Shiffman's Getting started with Kinect and Processing page
Have fun!

Vertical takeoff for a DJI Matrice 100

DJI Android SDK version: 4.11
Matrice 100 / Matrice 600
I am trying to take off the drone vertically.
I tried GoToAction in a timeline, but that failed due to a bug in the SDK (confirmed by your support team, dev@dji.com, ticket #29496): I get STARTED for the GoToAction, but no PROGRESSED or FINISHED, and no errors logged at all.
Since I need to continue working, I tried a workaround: sending FlightControlData to the virtual sticks by calling the following function with the requested height 20 times a second, with these settings:
VerticalControlMode.POSITION
FlightOrientationMode.AIRCRAFT_HEADING
VirtualStickModeEnabled = true
VirtualStickAdvancedModeEnabled = true
void sendHeightCommand(float requestedAltitude) {
    FlightControlData data = new FlightControlData(0f, 0f, 0f, requestedAltitude);
    flightController.sendVirtualStickFlightControlData(data, djiError -> {
        if (djiError != null) {
            Log.v("VirtualStick", djiError.getDescription());
        }
    });
}
And it works (with the right amount of timeouts), but if there is wind the drone drifts away, which is very dangerous for me as there is more than one drone in the field and I don't want them to collide.
Is there another way to change the altitude of the drone, while maintaining its position?
Or is there a way to measure the wind, and push back against it?
1. Taking off the drone vertically:
I always use the TakeOffAction in the timeline mission before the GoToAction to ascend to the desired height. However, I'm using a Mavic Pro and the SDK may behave differently with a Matrice drone.
When using the FlightControlData with the VirtualSticks, I use the startPrecisionTakeoff() method in the FlightController class; after the takeoff, the drone ascends to the desired position when the flight control data is sent continuously.
2. Stable hovering:
For hovering, the only low-cost solution I see is to enable VisionAssistedPositioning in the FlightAssistant class. I don't know whether the Matrice supports this feature, as the documentation doesn't say anything about supported aircraft.
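A minimal sketch of enabling it, assuming the DJI Mobile SDK v4 FlightAssistant API (the method wrapper, null check, and log tag are illustrative):
import android.util.Log;
import dji.sdk.flightcontroller.FlightAssistant;
import dji.sdk.flightcontroller.FlightController;

// Assumes flightController was obtained from the connected aircraft.
private void enableVisionPositioning(FlightController flightController) {
    FlightAssistant assistant = flightController.getFlightAssistant();
    if (assistant != null) {
        assistant.setVisionAssistedPositioningEnabled(true, djiError -> {
            if (djiError != null) {
                Log.e("FlightAssistant", djiError.getDescription());
            }
        });
    }
}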
OK, so the solution was to use setVirtualStickAdvancedModeEnabled(true).
The reason I didn't see any results was that in the simulator I was testing with a 20.0 north wind, which apparently is too much.
When I lowered it to 5.0 it worked perfectly.

How to implement the rc command on a virtual joystick?

I need to implement a Tello command, rc a b c d, on a virtual joystick. From different forums, I learned that for a virtual joystick we need to use rc commands to move the Tello drone, but I don't know how to implement it. Their SDK documentation describes it as:
a: left/right (-100~100), b: forward/backward (-100~100), c: up/down (-100~100), d: yaw (-100~100)
What do these negative values mean? How can I use the rc command to move the drone?
This is the virtual joystick code which I am using:
JoystickView joystick = (JoystickView) findViewById(R.id.joystickView);
joystick.setOnMoveListener(new JoystickView.OnMoveListener() {
    @Override
    public void onMove(int angle, int strength) {
        // code goes here
    }
});
The values -100~100 are normally the velocity along the respective axis; a negative value simply means motion in the opposite direction along that axis. Depending on the coordinate system and the control modes set for the axes, the aircraft moves along the axis corresponding to the value. Based on the code you provided, I assume the strength value represents the percentage of how far the stick is pushed, and the angle value shows the direction in which the stick is pushed.
For the virtual sticks you need to set the control modes and the coordinate system by accessing the flight controller:
setRollPitchControlMode(RollPitchControlMode.VELOCITY);
setYawControlMode(YawControlMode.ANGULAR_VELOCITY);
setVerticalControlMode(VerticalControlMode.VELOCITY);
setRollPitchCoordinateSystem(FlightCoordinateSystem.BODY);
The modes chosen above ensure that the virtual controller behaves the same as the default physical remote controller.
Additionally, you need to activate the virtual sticks with the setVirtualStickModeEnabled method before you can use them.
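A minimal sketch, assuming the DJI Mobile SDK v4 (the error handling and log tag are illustrative):
// Enable virtual stick control before sending any stick data.
flightController.setVirtualStickModeEnabled(true, djiError -> {
    if (djiError != null) {
        Log.e("VirtualStick", djiError.getDescription());
    }
});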
Now, for continuous control over the aircraft, you need to send the virtual stick data at a rate of at least 5 Hz:
SendVirtualStickDataTask task = new SendVirtualStickDataTask();
this.timer = new Timer();
this.timer.schedule(task, 0, 200);
In this example, SendVirtualStickDataTask extends TimerTask and, inside its run() method, simply sends the current pitch, roll, yaw, and vertical throttle values to the drone via the DJI SDK's sendVirtualStickFlightControlData method.
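A minimal sketch of such a task, assuming flightController plus float fields pitch, roll, yaw, and verticalThrottle that the joystick listener keeps up to date (these field and tag names are illustrative):
import java.util.TimerTask;
import android.util.Log;
import dji.common.flightcontroller.virtualstick.FlightControlData;

class SendVirtualStickDataTask extends TimerTask {
    @Override
    public void run() {
        if (flightController != null) {
            // Push the latest stick values to the flight controller.
            flightController.sendVirtualStickFlightControlData(
                    new FlightControlData(pitch, roll, yaw, verticalThrottle),
                    djiError -> {
                        if (djiError != null) {
                            Log.e("VirtualStick", djiError.getDescription());
                        }
                    });
        }
    }
}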
Finally, the current pitch, roll, yaw, and vertical throttle values are set inside the onMove() method you posted in your question. E.g. you can use the trigonometric functions sin and cos to determine the x and y parts of the strength value, something like this:
pitch = (float) (Math.cos(Math.toRadians(angle)) * strength);
roll = (float) (Math.sin(Math.toRadians(angle)) * strength);
Please note that Math.sin and Math.cos expect the angle in radians (hence the Math.toRadians conversion), and the result needs to be cast to float. Furthermore, depending on how the JoystickView determines the angle value, you may need to adjust it accordingly.
The second joystick can be used to control the vertical throttle and the yaw. You will need a bit of fine-tuning and testing.
For the control modes/coordinate system, the following DJI SDK Documentation is helpful (scroll down to "Virtual Sticks"):
https://developer.dji.com/mobile-sdk/documentation/introduction/component-guide-flightController.html
DJI also has a basic code example for the virtual Sticks usage:
https://developer.dji.com/mobile-sdk/documentation/android-tutorials/SimulatorDemo.html
I highly recommend using the DJI Flight Assistant 2 Software to test your code before you attempt to fly in the real world.

How to get the (x,y,z) values of the HTC Vive gyroscope in real time?

Have you met this problem? I want to get the X, Y, Z values from the HTC Vive HMD in real time, but I don't know how to do it. Does the HTC Vive SDK have a special interface for developers? Could you please give me some help? Thank you!
This suggestion is only valid if Unity is being used.
Not sure about getting the coordinates from the HMD, but you can attach a script that detects the position of the GameObject.
this.transform.localPosition.x
this.transform.localPosition.y
this.transform.localPosition.z
These are read-only values.
Hope this helps.
The SteamVR SDK doesn't seem to expose the raw IMU (inertial measurement unit) values to developers. Possible workarounds:
Use the velocity of the HMD via the SteamVR SDK's GetDeviceToAbsoluteTrackingPose.
Calculate the velocity of the HMD manually, or by adding a Rigidbody component to the main camera (Camera (eye)).
