I have downloaded the official Windows DJI Thermal SDK, and I can successfully use the exe on the sample image, but it fails to open the thermal R-JPEG output from my DJI XT2 camera.
D:\test\dji_thermal_sdk_v1.2_20211209\utility\bin\windows\release_x64>dji_irp.exe -s D:/Dropbox/dji/selected/999431646940552000.jpg -a measure -o measure.raw
DIRP API version number : 0x12
DIRP API magic version : bb44858
R-JPEG file path : D:/Dropbox/dji/selected/999431646940552000.jpg
ERROR: create R-JPEG dirp handle failed
Test done with return code -7
The sample image in the XTS folder has a name like DJI_0001_R.jpg, but my image is named 999431646940552000.jpg. Is this supported by the DJI Thermal SDK?
(I can verify that my R-JPEG image is valid, since it can be opened in FLIR Thermal Studio Starter.)
The DJI Thermal SDK doesn't support the XT2 camera. Supported products are:
XTS
H20T series
Mavic 2 Enterprise Advanced
I'm trying to see why our DJI-enabled app isn't working correctly with the Mavic Air 2 on iOS or Android. Here I'm debugging with iOS but I've seen the same failures when briefly testing on our Android app.
When calling setMode:completion: on the single camera belonging to the Mavic Air 2, I consistently get "Current product does not support this feature.(code:-1013)"
isMediaDownloadModeSupported returns true for the camera, and yet I can't set the camera mode to media download mode at all (or any other mode).
I've found that setFlatMode:completion: works as a sort of alternative for setting photo and video modes; however, it only covers photo/video modes and won't help me with downloading media from the SD card. (Right?)
Any help out there?
From my DJI Developer Support ticket for the same issue:
For the Mavic Air 2, you should use setFlatMode to switch between photo and video modes, and use enterPlayback / exitPlayback to enter and exit the download mode.
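If it helps, here is a minimal Android sketch of that advice, assuming DJI Mobile SDK 4.13 or later where the Camera class exposes setFlatMode and enterPlayback/exitPlayback (the method names come from the support reply; the iOS equivalents are analogous). Error handling is reduced to logging, and the exact enum and callback names should be checked against your SDK version:

import dji.common.camera.SettingsDefinitions;
import dji.common.error.DJIError;
import dji.common.util.CommonCallbacks;
import dji.sdk.camera.Camera;
import dji.sdk.sdkmanager.DJISDKManager;

public class MavicAir2CameraModes {

    private static Camera camera() {
        // Assumes a product is already connected and exposes a camera.
        return DJISDKManager.getInstance().getProduct().getCamera();
    }

    // Use the flat-mode API instead of setMode() to switch shooting modes.
    public static void switchToSinglePhoto() {
        camera().setFlatMode(SettingsDefinitions.FlatCameraMode.PHOTO_SINGLE,
                new CommonCallbacks.CompletionCallback() {
                    @Override
                    public void onResult(DJIError error) {
                        if (error != null) {
                            android.util.Log.e("CameraModes", "setFlatMode: " + error.getDescription());
                        }
                    }
                });
    }

    // Enter playback before using MediaManager to list/download SD-card files.
    public static void enterDownloadMode() {
        camera().enterPlayback(new CommonCallbacks.CompletionCallback() {
            @Override
            public void onResult(DJIError error) {
                if (error == null) {
                    // MediaManager file-list / download calls go here.
                } else {
                    android.util.Log.e("CameraModes", "enterPlayback: " + error.getDescription());
                }
            }
        });
    }

    // Leave playback when the download is finished.
    public static void exitDownloadMode() {
        camera().exitPlayback(new CommonCallbacks.CompletionCallback() {
            @Override
            public void onResult(DJIError error) {
                if (error != null) {
                    android.util.Log.e("CameraModes", "exitPlayback: " + error.getDescription());
                }
            }
        });
    }
}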
For my M100 I want to control several external devices such as a laser pointer or a thermal camera. I think that for full control I have to integrate them via the OSDK. I looked through the DJI SDK documentation, but I have no idea where to start and what I effectively have to do. Could somebody give me a hint on how to start and what is really necessary to fulfill the requirements described below?
Connect a Flir Boson (3.3V) and display the thermal image on the remote controller's mobile device.
Control the thermal camera (switch it on/off) from the remote controller.
Connect a laser pointer to a standard port of the M100 and switch it on/off with a button on the remote controller.
I developed an application using the DJI Windows SDK to control a Mavic 2 Pro. I managed to get all the data from the drone, but when I try to send commands to the drone through the VirtualRemoteController, nothing happens.
The setup is the following: I first send the auto take-off command from my Windows app, which works without problems. Then I give the app a position to move to. All the calculations are done as I expect, but the drone keeps hovering even though the values I am sending are different from 0 and are in the range [-1, 1].
I am getting an instance of the virtual remote controller as follows:
VirtualRemoteController virtualController = DJISDKManager.Instance.VirtualRemoteController;
Then, I use the following command to send the movement that I want to execute:
virtualController.UpdateJoystickValue(throttle,roll,pitch,yaw);
throttle, roll, pitch and yaw are values in the range [-1, 1].
This API is only supported for DJI aircraft that have a "WiFi-only" capability.
It is available for the Mavic Air in the Windows SDK.
The Mavic Pro supports WiFi, but the Windows SDK does not support the Mavic Pro.
None of the Mavic 2 models support WiFi.
We developed a custom Android build for a board designed by our company. Initially I had issues showing the same content on HDMI, but I solved that. Now we are trying to display different content on HDMI, but
/sys/devices/virtual/disp/disp/hdmi status is showing 0 (HDMI not connected!)
even though the same mirrored display still appears on the HDMI port. We are using Android 4.2. I need guidance on where to look in the Android code, or whether I have to change my HDMI driver (I am using the driver for the Marsboard A20).
Thanks.
First of all, you need to check in the product fex file whether HDMI support is enabled for your device or not.
You will find the product fex file at lichee/tools/pack/chips/sun7i/config/android/<your product>/sys_config.fex.
Check the values of the screen0_output_type and screen1_output_type attributes.
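For reference, the display section of a typical sun7i sys_config.fex looks roughly like the excerpt below; treat the exact section and key names as an assumption for your BSP. On A20 boards an output type of 1 usually means LCD and 3 means HDMI:

[disp_init]
disp_init_enable = 1
screen0_output_type = 1
screen1_output_type = 3

If the HDMI output is not enabled here, the display driver will not register the second screen, which could explain the status 0 you are seeing.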
This code returns false on Wear API 22:
PackageManager pm = getPackageManager();
boolean hasGps = pm.hasSystemFeature(PackageManager.FEATURE_LOCATION_GPS);
By default, in the Android Studio AVD Manager there is no GPS option for Wear AVDs. However, if I use "Clone Device..." (which I guess is the same as "New Hardware Profile") on an existing one and then edit it, there is a GPS option. Still, the AVD does not return true (code above).
There is an option "Import Hardware Profiles".
Does Sony have a "Hardware Profile" which I can import?
How can I test my local Wear GPS (without buying Sony SmartWatch 3 hardware)?
Is there a Sony Smartwatch 3 emulator/AVD/skin?
Not all Android Wear devices provide a GPS unit. Instead, you should use the FusedLocationProviderApi from Google Play Services to request location updates. The nice part about this API is that if your phone and watch are together, it will use the GPS in the phone to save battery - it will only use the GPS on the wearable when it is disconnected from the phone. The FusedLocationProviderApi uses the same API as available on phones, so you can reuse most of your existing code.
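As a rough sketch, here is how requesting fused location updates looked with the GoogleApiClient-based API that was current around API 22 (class and method names are from Google Play services location; permission handling and failure callbacks are omitted for brevity):

import android.location.Location;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.location.LocationListener;
import com.google.android.gms.location.LocationRequest;
import com.google.android.gms.location.LocationServices;

public class WearLocationHelper implements GoogleApiClient.ConnectionCallbacks, LocationListener {

    private final GoogleApiClient googleApiClient;

    public WearLocationHelper(android.content.Context context) {
        // Connect to Google Play services; the fused provider picks the best source
        // (phone GPS while connected, the watch's own GPS when standalone).
        googleApiClient = new GoogleApiClient.Builder(context)
                .addApi(LocationServices.API)
                .addConnectionCallbacks(this)
                .build();
        googleApiClient.connect();
    }

    @Override
    public void onConnected(android.os.Bundle connectionHint) {
        LocationRequest request = LocationRequest.create()
                .setPriority(LocationRequest.PRIORITY_HIGH_ACCURACY)
                .setInterval(10000)        // desired update interval, ms
                .setFastestInterval(5000);
        // Requires ACCESS_FINE_LOCATION in both the handheld and wearable manifests.
        LocationServices.FusedLocationApi.requestLocationUpdates(googleApiClient, request, this);
    }

    @Override
    public void onConnectionSuspended(int cause) {
        // Reconnect or pause updates as needed.
    }

    @Override
    public void onLocationChanged(Location location) {
        // Use location.getLatitude() / location.getLongitude() here.
    }
}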
Here is a blog post I wrote about this:
http://android-developers.blogspot.com/2014/10/gps-on-android-wear-devices.html
Documentation for FusedLocationProviderApi:
https://developers.google.com/android/reference/com/google/android/gms/location/FusedLocationProviderApi
And a sample that implements this:
https://github.com/googlesamples/android-SpeedTracker
There is no emulator that provides the GPS functionality of the Sony SmartWatch 3.