Velocity control of the UAV in attitude mode - dji-sdk

First of all I am sorry for my English.
I am trying to control a UAV with the DJI A3 using velocity commands, indoors and without GPS.
With no GPS the drone flies in ATTITUDE mode. I am in P_GPS mode in order to obtain control authority, but because there is no GPS the UAV stays in ATTITUDE, and although I have control authority the UAV does not move.
Is there any way to move the UAV with velocity commands while it is in ATTITUDE mode, given that there is no GPS?

Yes, you can. You won't be able to use velocity or position control, but you should be able to set roll, pitch and yaw angles via the control flags (a minimal sketch follows the links below). Take a look at the following links:
https://github.com/dji-sdk/Onboard-SDK-ROS/issues/10
https://github.com/dji-sdk/Onboard-SDK-ROS/issues/294
https://github.com/dji-sdk/Onboard-SDK-ROS/issues/248
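For reference, a minimal sketch of what that looks like with the Onboard SDK C++ API. This assumes OSDK 3.x and an already-initialized DJI::OSDK::Vehicle, so the exact function names are my assumption about that version; the ROS wrapper in the issues above exposes the same control flags through its setpoint topics.

```cpp
#include <dji_vehicle.hpp>

using namespace DJI::OSDK;

// Assumes `vehicle` was set up elsewhere (e.g. by your own init code).
void holdAttitude(Vehicle* vehicle)
{
  const int timeout = 1; // seconds, for the blocking ACK call

  // Control authority is still required, even without GPS.
  vehicle->obtainCtrlAuthority(timeout);

  // Without GPS you cannot use horizontal position/velocity modes,
  // but attitude (roll/pitch angle) plus vertical velocity still works.
  float rollDeg  = 2.0f;  // desired roll angle in degrees
  float pitchDeg = 0.0f;  // desired pitch angle in degrees
  float yawDeg   = 0.0f;  // desired yaw angle in degrees
  float vz       = 0.0f;  // vertical velocity in m/s

  // Send the setpoint at a steady rate (e.g. 50 Hz) from your control loop.
  vehicle->control->attitudeAndVertVelCtrl(rollDeg, pitchDeg, yawDeg, vz);
}
```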

Related

ESP32-CAM image noisy

I am using an ESP32-CAM with the CameraWebServer example from the standard Arduino IDE package.
It works fine, but the image I receive in the browser is noisy: color lines appear randomly over the picture. Any idea what causes it and how to fix it?
There could be a number of reasons for this behaviour; it is possibly down to a combination of issues, each of which affects the picture quality.
Power supply quality. ESP32s draw a lot of current under certain conditions, and this can cause a brownout condition to occur. This could be down to your USB port not being able to supply enough current. Check the serial terminal for messages; if you see brownout error messages on your serial monitor, try a powered USB hub or a better quality USB cable.
Power supply noise. If you have access to an oscilloscope, check the 3.3 V and 5 V rails for garbage. If it is excessive, try adding 2 x 1000 µF capacitors on each rail.
RF interference. The ribbon cable between the camera and the board is not shielded. Try lifting it away from the board, or even wrapping it in a thin layer of foil and some insulating tape, making sure no shorts occur. If the ribbon cable is long, try a camera with a shorter cable.
Lighting. With fluorescent and LED lighting, some forms of illumination seem noisier than others. Try natural daylight where possible.
Interface settings. The defaults in the web server example are not ideal for certain lighting conditions. Try disabling lens correction and manually adjusting the gain control, AE level and exposure. Tweaking these settings will eliminate much of the background noise (see the sketch after this list).
I found that all of these small improvements make a big difference to picture quality. In my scenario, low light and noisy power seemed to be the worst culprits, but YMMV. By implementing these I managed to improve the picture quality from unwatchable to reasonable.
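As a concrete example of that last point, the same sensor settings can be changed in code rather than through the web UI. This is a minimal sketch assuming the standard esp32-camera driver that the CameraWebServer example uses; the specific gain and exposure values are placeholders you will need to tune for your lighting.

```cpp
#include "esp_camera.h"

// Call this once after esp_camera_init() has succeeded.
void tuneSensorForNoise()
{
  sensor_t *s = esp_camera_sensor_get();
  if (s == NULL) {
    return;  // camera not initialised
  }

  s->set_lenc(s, 0);           // disable lens correction
  s->set_gain_ctrl(s, 0);      // disable automatic gain control (AGC)
  s->set_agc_gain(s, 5);       // set a fixed gain instead (0..30, tune for your light)
  s->set_exposure_ctrl(s, 1);  // keep auto exposure on...
  s->set_ae_level(s, -1);      // ...but bias it slightly darker (-2..2)
  // Alternatively, disable auto exposure and fix it manually:
  // s->set_exposure_ctrl(s, 0);
  // s->set_aec_value(s, 400);  // 0..1200
}
```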

Pepper right shoulder pitch won't respond in timeline or inspector

I'm trying to make some reach-and-grasp trajectories in the timeline. They work fine with the virtual robot. On the real Pepper, sometimes they execute fine; then, on the next try, the right arm doesn't move through the complete trajectory. If I then use the inspector to move the right shoulder pitch, it gets stuck around 50°.
I wonder if there is some stiffness or force parameter: it is as if the robot does not have the strength to make the movement.
You can debug the hand movement with a simple command-line Python script that we made for ourselves. See https://nlp.fi.muni.cz/trac/pepper/wiki/SetArmPosition, which includes a link to the source code.
Pepper barely has the strength to lift its arms when they are stretched out.
When it does so, those motors get hot, and the power fed to them is reduced, to the point where the motion cannot be performed.
Prefer a more complex motion that requires less instantaneous strength, for instance by using a bit of the roll motion and by stretching the arms only at the last moment.
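If you want to check this from code rather than through Choregraphe, something along these lines should work with the NAOqi C++ SDK. This is only a rough sketch: the robot address is a placeholder, and I am assuming the classic ALMotionProxy/ALMemoryProxy interface and the standard ALMemory temperature key.

```cpp
#include <alproxies/almotionproxy.h>
#include <alproxies/almemoryproxy.h>
#include <iostream>

int main()
{
  // "pepper.local" is a placeholder for your robot's address.
  AL::ALMotionProxy motion("pepper.local", 9559);
  AL::ALMemoryProxy memory("pepper.local", 9559);

  // A hot joint gets its power reduced, which matches the "stuck" behaviour.
  float temperature = (float) memory.getData(
      "Device/SubDeviceList/RShoulderPitch/Temperature/Sensor/Value");
  std::cout << "RShoulderPitch temperature: " << temperature << " C" << std::endl;

  // Make sure the arm is stiff, then command the joint directly.
  motion.setStiffnesses("RArm", 1.0f);
  motion.setAngles("RShoulderPitch", 0.5f, 0.1f);  // target (rad), fraction of max speed

  return 0;
}
```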

Is there any way to get live telemetry data from the DJI Mavic 2 Zoom to another unit, say a computer?

As part of a course at my university, we've been given the task of taking live wind telemetry from a drone and feeding it to a neural network so that it gives better estimates than just using a sensor.
The research we've done so far tells us that our drone, the DJI Mavic 2 Zoom, is only compatible with the Windows SDK but not the Onboard SDK.
Put simply, our question is: is there any way for us to send the raw wind speed and direction data from the drone's sensors to a computer?
Create an Android application with the DJI Mobile SDK and send data from the MSDK to your computer over Wi-Fi.
The SDK only provides the wind warning level (0, 1 and 2). It does not provide any information about the direction the wind is blowing from or the actual wind speed.
The aircraft tries to hold its current position on its own, even in moderate wind. However, the drone does not tell the user how hard it has to work, or in which direction, to counteract the wind.
I assume you're better off accessing real-time wind information for your position from a weather service on the internet, if that's available in your country.
I've made a wind meter app.
The best method is:
Fly against the wind.
In VirtualStick, use angle mode and set pitch and roll to 0. This will let the drone drift with the wind.
Slowly rotate the yaw.
Measure the speed; when it stops increasing, the GPS speed gives you the wind speed and direction (see the sketch after this answer).
Warning: in strong wind you have to fly against the wind for quite a while.
The yaw rotation is needed because the drone is never exactly level, so it will pick up speed in one direction; rotating cancels that out.
Send the info to a server over the internet/Wi-Fi.
I've done this on an Android phone connected to the controller.
The Windows API doesn't seem to support virtual sticks, which I find strange. In that case it must be done on Android or iOS and transmitted to a server. I might be wrong, since I have never used the Windows API.
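A sketch of the final speed and direction computation in that procedure, assuming you have already read the GPS ground velocity (north and east components, in m/s) from the SDK while the drone is drifting with pitch and roll held at zero.

```cpp
#include <cmath>
#include <cstdio>

// While the drone drifts with pitch = roll = 0, its steady-state ground
// velocity is (approximately) the wind velocity.
void windFromGroundVelocity(double vNorth, double vEast)
{
  double speed = std::hypot(vNorth, vEast);  // wind speed in m/s

  // Bearing the air mass is moving towards, in degrees clockwise from north.
  double headingTo = std::atan2(vEast, vNorth) * 180.0 / M_PI;
  if (headingTo < 0.0) headingTo += 360.0;

  // Meteorological convention reports where the wind is blowing FROM.
  double blowingFrom = std::fmod(headingTo + 180.0, 360.0);

  std::printf("wind: %.1f m/s from %.0f deg\n", speed, blowingFrom);
}
```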

[dji-sdk][onboard-sdk] Losing GPS while in Onboard control flight mode

How does a DJI UAV (A3 or M600) behave if the GPS signal is completely lost during flight and the setpoint is given as a horizontal command in the ground_ENU frame?
According to this appendix:
Only when the GPS signal is good (health_flag >= 3), horizontal position control (HORI_POS) related control modes can be used.
Only when the GPS signal is good (health_flag >= 3), or when the Guidance system is working properly with the autopilot, horizontal velocity control (HORI_VEL) related control modes can be used.
Will the DJI switch to Attitude Flight mode?
Will you still have control authority over the Onboard SDK? And if so, does this mean that you could control it only via the HORI_ATTI_TILT_ANG mode?
Thanks!
I never tested the full case of hooking up the DJI A3 and the OSDK and letting it crash.
What I tested was a ground setup with the A3, with the ESCs and motors turned off: run the mission and plot the GPS. I also mounted many other sensors to get a correct position/velocity command.
When next to a building, the GPS track ran into the building, and the DJI GPS mission control followed it (the GPS position from the DJI SDK is the green line in my plot). I had to use auxiliary visual and lidar based navigation to get to the correct position.
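Relating this back to the appendix quoted in the question, here is a minimal sketch (assuming the Onboard SDK 3.x C++ API, an already-initialized vehicle, and broadcast data enabled) of checking GPS health before choosing between a velocity setpoint and an attitude setpoint.

```cpp
#include <dji_vehicle.hpp>

using namespace DJI::OSDK;

// Call this from your control loop. Falls back to attitude control when
// GPS health drops below the threshold given in the appendix (>= 3).
void sendHorizontalSetpoint(Vehicle* vehicle,
                            float vx, float vy, float vz, float yawRate,
                            float rollDeg, float pitchDeg, float yawDeg)
{
  // Requires the broadcast GPS topic to be enabled on the autopilot.
  Telemetry::GlobalPosition gps = vehicle->broadcast->getGlobalPosition();

  if (gps.health >= 3)
  {
    // GPS is good enough for HORI_VEL-style control.
    vehicle->control->velocityAndYawRateCtrl(vx, vy, vz, yawRate);
  }
  else
  {
    // No usable GPS: only attitude (HORI_ATTI_TILT_ANG style) control is left.
    vehicle->control->attitudeAndVertVelCtrl(rollDeg, pitchDeg, yawDeg, vz);
  }
}
```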

Responding to tilt of iPhone in Sprite Kit

I have been building a Sprite Kit game for quite some time now. Just recently I have been adding gyro/tilt functionality. Using CMMotionManager, I've been able to access the numbers surprisingly easily. However, my problem arises from how the acceleration.x values are stored.
You see, the way my game works, when the game starts the phone quickly calibrates itself to how it's currently being held, and then I respond to changes in the acceleration.x value (holding your phone in landscape orientation, this is equivalent to tilting the screen towards and away from you). However, laying your phone flat gives 1.0 and tilting it straight towards you gives 0.0, and the values loop back through that range if you go beyond it. So, if someone is sitting upright and their phone is calibrated at 0.1, and they tilt their phone 0.2 downwards, the results will not be what is expected.
Is there any easy way to counteract this?
Why are you trying to make your own system for this? You shouldn't really be using the accelerometer values directly.
There is a class called CMAttitude that contains all the information about the orientation of the device.
This orientation is not taken raw from accelerometer data but uses a combination of the accelerometers, gyroscopes and magnetometer to calculate the current attitude of the device.
From this you can then take the roll, pitch and yaw values and use those instead of having to calculate them yourself.
Class documentation for CMAttitude.
