I'm trying to get the HoloLens's accelerometer data in order to estimate movement speed while in a car.
I tried using the Windows 'Devices' namespace to access its 'Sensors.Accelerometer' object, but that's inexplicably incompatible with the HoloLens ... a Windows device.
I was able to cheat by calculating the speed of the camera object in Unity, but that only works relative to the headset wearer (when I test the app in the passenger seat of a car, I just get the speed at which I move my head around).
Some time ago I'd found a GitHub repo from MS about HoloLens sensor streams, but I can no longer find it (Dear Future Me: clone EVERYTHING).
Does anyone know if there's another way I can get the accelerometer data? Either in C# or C++.
PS: I have Research Mode and MS's demo app on my HoloLens, and I've been analyzing that code for answers too. I didn't want anyone to think I showed up here looking for a magic bullet!
If you want to use accelerations to calculate speed, you will run into problems because of error propagation and the lack of a reference point. Even if there is no direction change, integrating acceleration over time will make the values useless after a short period. (See, for example, Getting displacement from accelerometer data with Core Motion.)
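To make that concrete, here is a minimal, self-contained sketch (plain Java; the bias and sample rate are assumptions of mine) that integrates a perfectly stationary accelerometer with a tiny constant bias. The velocity estimate drifts linearly and the position estimate quadratically:

    // Hypothetical illustration: integrating a stationary accelerometer
    // that has only a tiny constant bias. Velocity error grows linearly
    // with time; position error grows quadratically.
    public class DriftDemo {
        public static void main(String[] args) {
            double bias = 0.01; // assumed 0.01 m/s^2 bias (optimistic)
            double dt = 0.02;   // assumed 50 Hz sample rate
            double v = 0.0, s = 0.0;
            for (int i = 1; i <= 50 * 60; i++) { // one minute of samples
                double a = 0.0 + bias; // true acceleration is zero
                v += a * dt;           // naive integration
                s += v * dt;
                if (i % (50 * 10) == 0) {
                    System.out.printf("t=%3ds  v=%.2f m/s  s=%.1f m%n", i / 50, v, s);
                }
            }
        }
    }

After one minute of zero actual motion this already reports a "speed" of 0.6 m/s and a position error of about 18 m, and real sensors add noise and orientation error on top of the bias.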
An alternative would be GPS, but the HoloLens doesn't have a built-in GPS sensor, and pairing to phones over Bluetooth was not possible in the past (maybe it is now). That said, the only thing I can imagine is opening a WiFi hotspot on a phone, connecting the HoloLens to it, and writing an app that transfers GPS data via WiFi to a HoloLens app listening on the other end.
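If you go the hotspot route, here is a minimal sketch of the phone side, assuming a plain Android app with java.net UDP sockets. The HoloLens IP, the port, and the "lat,lon,speed" message format are placeholders I made up; the HoloLens app would need a matching UDP listener (e.g. via Windows.Networking.Sockets) on the other end:

    // Hypothetical Android-side sender: forwards GPS fixes over the
    // hotspot to a HoloLens app listening on UDP. IP, port, and message
    // format are placeholders. Requires the ACCESS_FINE_LOCATION permission.
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import android.location.Location;
    import android.location.LocationListener;
    import android.location.LocationManager;
    import android.os.Bundle;

    public class GpsForwarder implements LocationListener {
        private static final String HOLOLENS_IP = "192.168.43.2"; // placeholder
        private static final int PORT = 5005;                     // placeholder

        public void start(LocationManager lm) {
            // One fix per second at most, no minimum distance.
            lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 0, this);
        }

        @Override
        public void onLocationChanged(Location loc) {
            // Simple text protocol: "lat,lon,speed" (speed in m/s from the fix).
            final String msg = loc.getLatitude() + "," + loc.getLongitude() + "," + loc.getSpeed();
            new Thread(() -> { // keep network I/O off the main thread
                try (DatagramSocket socket = new DatagramSocket()) {
                    byte[] data = msg.getBytes("UTF-8");
                    socket.send(new DatagramPacket(data, data.length,
                            InetAddress.getByName(HOLOLENS_IP), PORT));
                } catch (Exception e) { /* log and drop this fix */ }
            }).start();
        }

        @Override public void onStatusChanged(String p, int s, Bundle b) {}
        @Override public void onProviderEnabled(String p) {}
        @Override public void onProviderDisabled(String p) {}
    }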
Related
I am using the DJI Mobile SDK to create an app in Android Studio. I want to know how to use the GPS signals of the aircraft and the phone to realize position control. Is there any API in the DJI Mobile SDK I can use?
You may follow this sample to implement simple, higher-level GPS position control: https://developer.dji.com/mobile-sdk/documentation/ios-tutorials/GSDemo.html
If you stop at some waypoint, it will automatically hold the position. It is a simple recreation of the waypoint planning in the DJI Pilot app.
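For reference, here is a condensed sketch of what that sample boils down to in the Android MSDK. The class names are from MSDK 4.x as I recall them, so verify them against your SDK version; the coordinates, speeds, and altitude are placeholders:

    // Rough MSDK 4.x waypoint sketch (Android/Java); verify the class
    // names against your SDK version. Coordinates/speeds are placeholders.
    WaypointMission mission = new WaypointMission.Builder()
            .autoFlightSpeed(5f)  // m/s
            .maxFlightSpeed(10f)
            .finishedAction(WaypointMissionFinishedAction.NO_ACTION)
            .flightPathMode(WaypointMissionFlightPathMode.NORMAL)
            .addWaypoint(new Waypoint(22.5362, 113.9454, 20f)) // lat, lon, alt
            .addWaypoint(new Waypoint(22.5370, 113.9460, 20f))
            .build();

    WaypointMissionOperator operator =
            MissionControl.getInstance().getWaypointMissionOperator();
    DJIError loadError = operator.loadMission(mission);
    if (loadError == null) {
        operator.uploadMission(uploadError -> {
            // With NO_ACTION the aircraft holds position at the last waypoint.
            if (uploadError == null) operator.startMission(startError -> { /* check */ });
        });
    }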
Low-level GPS position control requires a deeper understanding of the system. It usually enables interesting applications such as having the drone follow a person, landing precisely on a mark, or circling around a tower. There are not many open-source implementations available on the internet. You have to search the MSDK for the APIs for basic control, and you also need a deep understanding of the field you are trying to work in, e.g. real-time object detection, low-level control frameworks, Visual-Inertial SLAM, etc.
Can we see analytics just by connecting to beacons (beacons that are not actually stuck to the walls) through the Estimote app?
Due to the pandemic in India, it is not possible to go to the office now. I have a development kit of proximity beacons, and I need to develop a simple project out of it.
So should I travel and bring that kit home, develop an app using some already available templates, and later return it to the office, so that once we resume after 5-6 months we can stick the beacons to the walls and play around with them?
Please let me know; I have read every article but can't find anything about this.
You will need to bring the kit home, enable the beacons so they are transmitting (which means the battery will start being used), register them with Estimote, and then use a mobile phone in the vicinity of the beacons with an app running the Estimote SDK. You may install the beacons in their permanent location at a later time.
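For the app side, a rough sketch of observing a zone with the Estimote Proximity SDK for Android (1.x, as I recall its API, so verify against the SDK docs; the credentials and the "desk" tag are placeholders you would set up in Estimote Cloud):

    // Rough Estimote Proximity SDK 1.x sketch (Android/Java); API names
    // as I recall them. App ID/token and the "desk" tag are placeholders.
    EstimoteCloudCredentials credentials =
            new EstimoteCloudCredentials("your-app-id", "your-app-token");

    ProximityObserver observer =
            new ProximityObserverBuilder(getApplicationContext(), credentials)
                    .onError(t -> null) // log in a real app
                    .withBalancedPowerMode()
                    .build();

    ProximityZone zone = new ProximityZoneBuilder()
            .forTag("desk") // tag assigned to the beacons in Estimote Cloud
            .inNearRange()
            .onEnter(ctx -> { /* phone is near a beacon */ return null; })
            .onExit(ctx -> { /* phone left the zone */ return null; })
            .build();

    observer.startObserving(zone);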
I am new to the drone community. I am trying to use a drone to follow a driving vehicle and record a video as part of my graduate school project.
I know DJI provides an "active track" function but it seems like the maximum speed for the tracking is around 20-25 mph, and it cannot track the vehicle at a top-down angle (drone looks straight down at the vehicle).
I have an idea to send the GPS info of the ego vehicle to my mobile device and use a customized app to read the GPS location and set it as a target so the drone can follow it.
Is it doable? Is it in general worth the effort? Or would I be better off just using the active track function and working with the best angle that I can get?
I've done that.
Send GPS updates from a tracker phone to the phone connected to the RC.
You have to use virtual stick mode in the MSDK. The max speed is 15 m/s.
Virtual stick functions in the DJI Mobile SDK simulate the remote controller's joysticks, and therefore an aircraft can be automated to fly in any way a human can manually fly it. Compared to missions, this is a more complicated, but flexible way to automate flight.
Works very well.
An example:
https://youtu.be/i3axYfIOHTY?t=55
It's a little shaky, but that's fixed now :-)
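For reference, a condensed sketch of the virtual stick setup in the Android MSDK (4.x class names as I recall them, so verify against your SDK version; the velocity values are placeholders you would compute from the tracker phone's GPS fixes):

    // Condensed MSDK 4.x virtual stick sketch (Android/Java); verify the
    // names against your SDK version. `aircraft` is the connected product;
    // the velocities are placeholders derived from the received GPS target.
    FlightController fc = aircraft.getFlightController();
    fc.setRollPitchControlMode(RollPitchControlMode.VELOCITY);
    fc.setYawControlMode(YawControlMode.ANGULAR_VELOCITY);
    fc.setVerticalControlMode(VerticalControlMode.VELOCITY);
    fc.setRollPitchCoordinateSystem(FlightCoordinateSystem.BODY);

    fc.setVirtualStickModeEnabled(true, err -> {
        if (err != null) return; // could not enter virtual stick mode
        // DJI recommends re-sending stick data at 5-25 Hz, so in a real
        // app this send belongs in a timer, not in this one-off callback.
        float pitchVel = 10f; // m/s toward the target, capped at 15 m/s
        float rollVel = 0f, yawRate = 0f, verticalVel = 0f;
        fc.sendVirtualStickFlightControlData(
                new FlightControlData(pitchVel, rollVel, yawRate, verticalVel),
                null);
    });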
I would like to know how to get the current feature points used in motion tracking and the ones that are present in the learned area (detected or not).
There is an older, related post without a useful answer:
How is it possible to get tracked features from the Tango APIs used for motion tracking? I'm using the Tango so that I don't have to do SLAM and IMU integration on my own.
What do I need to do to visualize the tracked features like they did in some of the presentation videos? https://www.youtube.com/watch?v=2y7NX-HUlMc (0:35 - 0:55)
What I want in general is some kind of measure or visual guidance on how well the device has learned the current environment. I know there is the Inspector app, but I need this information on the fly.
Thanks for your help ;)
If you want to check which areas are present in your learned area model and which are not, you can use the Tango Debug Overlay App. It has a field 'Tracking Success' that only counts up if the device sees learned feature points (ADF on) or finds new feature points (ADF off) (http://grauonline.de/alexwww/tmp/tango_debug_overlay_app.jpg). Additionally, you can request the same debug information that the Tango Debug Overlay App uses (as simple text) via UDP port 29361 in your app and parse the returned debug text (although this is not recommended at all for a real app, as this interface is not documented).
PS: In Tango Core 01-19-2017, this counter no longer seems to work.
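For completeness, a minimal sketch of polling that interface from plain Java. Since the interface is undocumented, the empty request payload (and the assumption that any datagram triggers a reply) is a guess on my part:

    // Minimal sketch: query the undocumented Tango debug-text interface
    // on UDP port 29361 and dump whatever text comes back. The empty
    // request payload is a guess; the interface may change between releases.
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class TangoDebugPoll {
        public static void main(String[] args) throws Exception {
            try (DatagramSocket socket = new DatagramSocket()) {
                socket.setSoTimeout(1000); // don't hang if nothing answers
                byte[] request = new byte[0];
                // 127.0.0.1 assumes this runs on the Tango device itself.
                socket.send(new DatagramPacket(request, 0,
                        InetAddress.getByName("127.0.0.1"), 29361));
                byte[] buf = new byte[4096];
                DatagramPacket reply = new DatagramPacket(buf, buf.length);
                socket.receive(reply);
                // Parse fields such as 'Tracking Success' out of this text.
                System.out.println(new String(reply.getData(), 0, reply.getLength(), "UTF-8"));
            }
        }
    }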
I'm looking at the DJI specifications for the Phantom 2 Vision and Phantom 2 Vision Plus and also at the API reference, but I'm not sure whether it is possible to pilot the drone without the remote controller. How is the communication via the SDK done - does the smartphone communicate directly with the drone, or does it go via the remote controller?
DJI SDK Level 2 does allow direct flight control of the DJI Phantom 2 Vision+. It communicates the commands to the unit using wifi. That being said, the wifi extender that your phone is connected to is attached to the controller, so you still need the controller to be turned on and in the vicinity in order to operate, but the phone is still what is sending the commands to the Phantom. I would recommend that someone always have the controller at the ready in case the wifi network has interference, the phone dies, etc. I am using the SDK in this way for my side project, www.followmephantom.com.
Hope that helps.
FYI - to use the Level 2 SDK you first have to submit a proposal for your project to DJI and get their approval... and pay a fee in most cases.
Here is a helpful link that shows the difference in features between SDK Level 1 and Level 2.
https://dev.dji.com/en/products/sdk/mobile-sdk/features/level-compare
From 3 weeks of playing with it: the smartphone controls the gimbal and the camera in general, while all the flight controls come strictly from the remote control unit (although waypoints set from the smartphone could be argued to be control by the phone). I haven't flown it without the smartphone, but I am fairly sure I could; without blades, on the sitting-room floor, it seemed possible. I can't imagine why you would want flight controls on your phone; when things go pear-shaped you would be pretty much screwed with just a touch screen.