Kivy Python API to determine location by satellite (no triangulation)

I need an API or library that will give me my position accurate to within 25 m on both Android and iOS devices. My app will be used in the bush, where there are no cell towers or other transmitters available for triangulation.
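For a Kivy app, the usual route is the plyer library, whose gps facade wraps the platform's satellite (GNSS) location provider on both Android and iOS, so no network is needed for the fix itself. A minimal sketch, assuming plyer is installed and the app has the required location permissions (ACCESS_FINE_LOCATION on Android, NSLocationWhenInUseUsageDescription on iOS); the minTime/minDistance keyword arguments follow plyer's current gps facade and may differ by plyer version:

    from kivy.app import App
    from kivy.clock import mainthread
    from kivy.uix.label import Label
    from plyer import gps


    class GpsApp(App):
        def build(self):
            self.label = Label(text='Waiting for a GPS fix...')
            try:
                # Register callbacks, then start receiving satellite fixes.
                gps.configure(on_location=self.on_location,
                              on_status=self.on_status)
                gps.start(minTime=1000, minDistance=1)  # >=1 s apart, >=1 m moved
            except NotImplementedError:
                self.label.text = 'GPS is not implemented on this platform'
            return self.label

        @mainthread  # location callbacks may arrive on a non-UI thread
        def on_location(self, **kwargs):
            # kwargs typically include lat and lon; on Android an accuracy
            # value in metres is usually included as well.
            self.label.text = 'lat: {lat}, lon: {lon}'.format(**kwargs)

        @mainthread
        def on_status(self, stype, status):
            pass  # provider status changes (enabled/disabled, etc.)


    if __name__ == '__main__':
        GpsApp().run()

With a clear view of the sky, a plain GNSS fix is normally well within the 25 m requirement.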

Related

How to access the image data of the obstacle avoidance sensor of DJI Phantom 4 using Mobile SDK

I would like to access the visual data from the stereo vision cameras the Phantom 4 uses for obstacle avoidance. I would like to use it to determine whether the obstacle is a human and to localize the person's position with respect to the DJI camera. Is this possible?
It's not possible on the Phantom 4 with the Mobile SDK. You can achieve this with the M210 and the Onboard SDK, though.

Scale 4K Apple TV app in the simulator to be pixel accurate

I'm trying to test an application for the Apple TV 4K. To test it as well as possible, I want to scale the simulator to be pixel accurate.
For some reason the Apple TV 4K simulator does not resize to the real pixel-accurate size. The iPhone X simulator, however, does scale to the correct size.
I tried scaling via Pixel Accurate (Cmd+2).
Does anyone have the same problem, and how can I scale the Apple TV 4K simulator to the pixel-accurate size?
I'm using Xcode Version 9.1 beta (9B46).

Project Tango, camera position?

I would like to be able to take a photo indoors and be able to determine the position (x,y,z coordinates) of the camera in the room. Is this/will this be possible with Project Tango and the Lenovo Phab 2 Pro phone?
Thanks.
Yes. If you make an app yourself, you can use an ADF (Area Description File) to determine the device's position relative to where you started recording the ADF. You get the position as XYZ and the rotation as a quaternion.
Worth noting: the Project Tango Tablet DK has a really poor camera, so it wouldn't be worth the effort on that device, although the Phab 2 Pro probably has a better one.
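If Euler angles are easier to work with than the raw quaternion, the conversion is standard math. A minimal sketch in plain Python, assuming a unit quaternion in (x, y, z, w) order, which is the ordering Tango's pose data uses:

    import math

    def quaternion_to_euler(x, y, z, w):
        """Convert a unit quaternion (x, y, z, w) to roll, pitch, yaw in radians."""
        # Roll: rotation about the x-axis
        roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
        # Pitch: rotation about the y-axis, clamped to avoid math domain errors
        pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
        # Yaw: rotation about the z-axis
        yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
        return roll, pitch, yaw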

Responding to tilt of iPhone in Sprite Kit

I have been building a Sprite Kit game for quite some time now. Just recently I have been adding gyro/tilt functionality. Using the CMMotionManager, I've been able to access the numbers surprisingly easily. However, my problem arises as a result of how the acceleration.x values are stored.
You see, the way my game works, when the game starts the phone quickly calibrates itself to how it's currently being held, and then I respond to changes in the acceleration.x value (holding the phone in landscape orientation, this is equivalent to tilting the screen towards or away from you). However, laying the phone flat reads 1.0 and tilting it straight towards you reads 0.0, and the value wraps back through that range if you tilt beyond it. So if someone is sitting upright and their phone calibrates at 0.1, and they tilt the phone 0.2 downwards, the result is not what they expect.
Is there any easy way to counteract this?
Why are you trying to make your own system for this? You shouldn't really be using the accelerometer values directly.
There is a class called CMAttitude that contains all the information about the orientation of the device.
This orientation is not taken raw from the accelerometer data; it is computed by fusing readings from the accelerometer, gyroscope, and magnetometer to estimate the device's current attitude.
From this you can then take the roll, pitch and yaw values and use those instead of having to calculate them yourself.
Class documentation for CMAttitude.

Will my apps work on iPhone OS 4?

The screen resolution has increased in iPhone OS 4. Since a lot of my UI has hardcoded coordinates, will my app run properly on OS 4? I still haven't got Snow Leopard, so I can't test-run the OS 4 simulator.
It is publicly known that the pixel-to-point ratio is 2:1, so a 320-point dimension spans 640 pixels on the high-resolution display and 320 pixels on the low-resolution one. Low-resolution images will look somewhat jaggy, but their positioning on the screen will remain the same.
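Put differently: layout coordinates are expressed in points, and the screen scale maps points to pixels, so hardcoded coordinates keep their on-screen positions. A quick illustration in plain Python, using a hypothetical helper:

    def pixel_size(points, scale):
        # UIKit lays out in points; the screen scale converts points to pixels.
        return points * scale

    print(pixel_size(320, 1))  # 320 px on a low-resolution display
    print(pixel_size(320, 2))  # 640 px on a high-resolution display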
