How can I get a shutter signal from the Matrice 600 (PRO) A3? - dji-sdk

I have an M600 PRO (A3) and I need to connect an Arduino to it to receive the drone's shutter signal.
In fact, I would like to use Drone Deploy to take a photo at each waypoint, and each time a photo is taken I would like my Arduino to receive this signal so it can perform a specific task. I have no camera attached to the drone.
Could someone help me with this task? I've already been able to connect the A3 to the Arduino, but I still can't make sense of the data on the bus.
I have already connected an Arduino to the DJI Matrice 600 Pro, but I have not found the hex code related to the shutter signal.
I would like to receive the shutter signal from the Matrice 600 on my Arduino, in order to run a specific task triggered by this signal.

Since you say you want to use Drone Deploy, you won't be able to use the Mobile SDK and will need to build something with the Onboard SDK. It's unclear what you want, since you say you have no camera but want to trigger something every time a picture is taken.
If you just want to detect when Drone Deploy tries to take a picture, you may get an error depending on whether the SDK thinks a camera is detected. I haven't tried the Onboard SDK much, and especially not without a camera attached, so I don't know whether an error will result or whether you can set the trigger in the take-picture callback.
Either way, the best place to start is probably the camera sample app in the Onboard SDK.

Related

OSDK4.x: Can you use DJI Assistant 2 for Matrice with payload actions?

Is it possible with OSDK4.x to command payload and flight actions and use the DJI Assistant 2 for Matrice concurrently?
Previously, we have been using the M210V1 with OSDK3.9. Using the DJI Assistant 2 for Matrice to simulate the drone flight is key to our ability to develop our system.
However, the M210V2 and OSDK4.x require the USB port of the drone to be connected to the Linux device running the OSDK; otherwise any payload (GimbalManager, CameraManager) action, such as GimbalManager::resetSync, throws an error.
This is not ideal for development since we cannot use the simulator (on MacOS) and connect the USB to the Linux device (there is only one USB port on the drone). Has anyone solved this problem?
Yes and no.
For the M210, there is only one USB2 port, and it is used either for connecting to DJI Assistant 2 on the PC or for connecting to the onboard PC to get the OSDK stream. You can think of it as a design flaw.
Yes, you can run part of the OSDK alongside the simulator, minus the payload/camera-related functions. If I recall correctly, I could still "rostopic echo" the gimbal orientation topic from the drone; it is only the image topics that are disabled. You can simulate GPS-based flight in the simulator and try to set the gimbal direction; I remember this was achievable.
There is no way to run the simulator and get payload functions such as images from the OSDK at the same time, so having both the image stream and the drone running in the simulator is not achievable.
If you really want both at the same time, I suggest you move to the M300, which has dual USB-C interfaces, one for the camera stream and one for simulation.

Control commands of the Onboard-SDK are published with different frequencies

I am using the Onboard-SDK for DJI M100 on ROS.
I developed code to control the position of the M100 toward a certain target position.
However, it doesn't reach the specified target.
For that reason I checked the published control signals with ROS, and in some experiments the frequency of the control signal is not constant at all: sometimes I get 50 Hz, other times 5 Hz, 10 Hz, etc.
I would like to know the actual reason behind this.
Assuming your 3.3 V FTDI adapter works and your hardware is in perfect order, I would guess someone changed the DJI Assistant 2 SDK settings for you; otherwise it will not change. I had a similar problem before, but in my case I had burned the API port by using a 5 V FTDI adapter.
Besides, your control commands should be sent to the drone in a fixed-time loop, using a ROS loop rate and the ROS sleep routine, not at each callback. The reason is that you need to control your drone's position with PID or other control methods, which are time-dependent.

Can I test my DJI SDK for Windows 10 application with a DJI Drone simulator without the risk of running a real DJI Drone?

Can I test my DJI SDK for Windows 10 application with a DJI Drone simulator without the risk of running a real DJI Drone?
I am asking since it is cumbersome to have developers run a real drone in their office to test what they are building.
There are 2 ways to test applications but both require a physical drone.
1) Download and install the PC Simulator; this is a game-engine-style application that connects the PC to a drone over USB and displays the flight operations. It is available online under the developer downloads.
2) Using the API, you can enable the simulator. This method is not as good, because you cannot see in real time what the aircraft is doing unless you receive and display flight telemetry within your app.
Both behave the same, but the PC Simulator can make it easier to evaluate and observe the flight operations.
You can set your drone in a simulator mode using the APIs.
This will make your drone respond as if it was flying, but the motors won't turn on.
After that, you can use the DJI Assistant to visualize the aircraft moving in a simulated 3D environment.

Live Stream Directly to Computer

I'm a student currently working with a Matrice 100 for a project. I know that currently you can stream to YouTube/Facebook from the mobile app, but is there any way to get the stream directly to computer? I noticed that there was a mini hdmi port on the controller, could you plug a cable into that and a computer to access the stream?
I'd leave this as a comment instead of an answer but I don't have enough reputation.
Do you have an HDMI cable on hand? You could try plugging one end into the hand controller and the other into a laptop.
Also, DJI has a ground control application for the PC (although I don't think they are continuing support for it) that can be used with a bluetooth radio to communicate with and receive video from the aircraft. The application is free to download from the DJI website.

How to setup dji L2 api demo?

I intend to create an Android app to control the drone (in my case a Phantom 2 Vision) using a virtual joystick, and I already have the Level 2 API. To do that, I wanted to see a working app with my own eyes, to understand how I should use the API. I tried to run the DJI demo application (I followed the steps in the documentation, such as putting the API key in the manifest); the application seems to work OK, but I can only control the gimbal, and the virtual joystick does not work for some reason. Is there any limitation in terms of Android OS version, device, Phantom firmware version, etc.? I asked some questions on DJI's forum but no one gave me a concrete answer; I hope someone here can give me a hint :)
I'm using a Samsung galaxy note 10.1. I'm working on the DJI-SDK-Android-V2.4.0 project.
I could get "D/GsProtocolJoystickDemoActivity: GroundStationResult GS_Result_Failed"
while I was debugging.
Since you can invoke the gimbal and camera APIs correctly, I am assuming that you have already activated your app. Here is my point: the virtual joystick can only be used in ground station mode. My suggestions are as follows:
1) Switch the remote controller to F mode.
2) Invoke DJIDrone.getDJIGroundStation().openGroundStation().
3) Invoke the joystick methods.
Please note: the app key must have Level 2 access so that you can invoke the Ground Station related methods.