Does project tango dev tablet have pressure sensor? - google-project-tango

https://store.google.com/product/project_tango_tablet_development_kit
Does anyone who has the development kit happen to know if it supports Sensor.TYPE_PRESSURE?
(I do not see it listed as one of the features, but I have an application that needs pressure.)

The Tango device does include a pressure sensor. You can check its readings with any Android sensor test app from the Play Store. My guess is that it behaves like a normal Android sensor type, but I have never seen an app use this feature before, so you should give it a try.
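If you would rather verify it programmatically than install a test app, a minimal sketch using the standard Android SensorManager API (no Tango-specific calls, so this should run on any device) looks like this:

    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;
    import android.util.Log;

    public class PressureCheckActivity extends Activity implements SensorEventListener {
        private SensorManager sensorManager;
        private Sensor pressureSensor;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            // Returns null if the device has no barometer.
            pressureSensor = sensorManager.getDefaultSensor(Sensor.TYPE_PRESSURE);
            Log.i("PressureCheck", "Pressure sensor present: " + (pressureSensor != null));
        }

        @Override
        protected void onResume() {
            super.onResume();
            if (pressureSensor != null) {
                sensorManager.registerListener(this, pressureSensor,
                        SensorManager.SENSOR_DELAY_NORMAL);
            }
        }

        @Override
        protected void onPause() {
            super.onPause();
            sensorManager.unregisterListener(this);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // values[0] is the ambient air pressure in hPa (millibar).
            Log.i("PressureCheck", "Pressure: " + event.values[0] + " hPa");
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }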

Related

Is it possible to use the ML Kit SDK on Wear OS?

I am wondering if anyone knows if it is possible to use the ML Kit SDK on Wear OS devices? I know Wear OS is based on Android, and I've seen references online to Firebase notifications working on Wear OS.
I have googled combinations of the terms "ML Kit", "Firebase ML Kit" and "Wear OS" but not found any definitive answers.
I don't have code on hand; I am wondering whether it would even be possible to import and use the SDK in a Wear OS app in the first place.
Expected results would be being able to instantiate and use some of the machine learning models from the ML Kit API on a Wear OS watch.
Battery usage and efficiency are secondary at the moment, I'm just wondering if it is possible at all.
I have never tried it, but I think it is possible. In the end, Android Wear 2.0 (Wear OS) is just another regular Android, and it provides direct internet access from the device.
I was able to successfully use RenderScript on Wear devices, but on the other hand I ran into an issue where the Google Awareness API was not available on Wear. That does not seem to be the intended state of things.
In the end you will probably need to do a feasibility study of your own.
You can take one of the Android ML Kit samples and simply run it on a Wear device.
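As a starting point for such a feasibility test, here is a minimal sketch of what running an ML Kit model would look like. It uses the Firebase ML Kit on-device text recognizer as an example; whether the dependency resolves and runs in a Wear OS module is exactly the open question, so treat this as untested on a watch:

    import android.graphics.Bitmap;
    import android.util.Log;

    import com.google.firebase.ml.vision.FirebaseVision;
    import com.google.firebase.ml.vision.common.FirebaseVisionImage;
    import com.google.firebase.ml.vision.text.FirebaseVisionTextRecognizer;

    public class WearMlKitProbe {

        // Runs the on-device text recognizer on a bitmap. If this succeeds
        // on a watch, the SDK is at least minimally usable on Wear OS.
        public static void recognizeText(Bitmap bitmap) {
            FirebaseVisionImage image = FirebaseVisionImage.fromBitmap(bitmap);
            FirebaseVisionTextRecognizer recognizer =
                    FirebaseVision.getInstance().getOnDeviceTextRecognizer();

            recognizer.processImage(image)
                    .addOnSuccessListener(result ->
                            Log.i("WearMlKit", "Recognized: " + result.getText()))
                    .addOnFailureListener(e ->
                            Log.w("WearMlKit", "ML Kit failed on this device", e));
        }
    }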

Computer vision google tango

Tango is developed by Google and has an API that is used for motion tracking on mobile devices. I was wondering if it could be applied to a standalone Java application without Android (i.e. Java SE). If not, are there any APIs out there similar to Tango that track motion and depth perception?
I am trying to capture the motion data from a video, not a camera/webcam, if that is possible at all.
Google's Tango API is only compatible with Tango-enabled devices, so it does not work on most mobile devices; if you try to use the API on a device that is not Tango-enabled, it won't work.
I think you should research OpenCV a bit. It's an open-source computer vision library that is compatible with Java and many other languages, and it lets you analyze videos without needing that many sensors (like the raw depth sensors primarily used on Tango-enabled devices).
The Tango API is only available on Tango-enabled devices, which there aren't that many of. That being said, it is possible to create your own motion-tracking and depth-sensitive app with standard Java.
For motion-tracking all you need is an accelerometer and a gyroscope, which most phones now come equipped with as standard. You then basically integrate those readings over time, which should give you an idea of the device's position and orientation; a rough sketch follows below. Note that the accuracy will depend on your hardware and implementation, and be ready for it to be fairly inaccurate thanks to sensor drift and integration errors (see the answer here).
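To make the "integrate those readings over time" step concrete, here is a naive dead-reckoning sketch in plain Java. The DeadReckoner class is hypothetical, and it assumes the acceleration samples are already rotated into the world frame and gravity-compensated, which in practice is where most of the work (and the drift) lives:

    // Naive dead reckoning by double-integrating acceleration.
    // Illustration only: real implementations must rotate readings into the
    // world frame, subtract gravity, and fight drift (e.g. with a filter).
    public class DeadReckoner {
        private final double[] velocity = new double[3];
        private final double[] position = new double[3];
        private long lastTimestampNs = -1;

        // worldAccel: gravity-free linear acceleration in m/s^2, world frame.
        public void onSample(double[] worldAccel, long timestampNs) {
            if (lastTimestampNs >= 0) {
                double dt = (timestampNs - lastTimestampNs) * 1e-9; // ns -> s
                for (int i = 0; i < 3; i++) {
                    velocity[i] += worldAccel[i] * dt; // first integration
                    position[i] += velocity[i] * dt;   // second integration
                }
            }
            lastTimestampNs = timestampNs;
        }

        public double[] getPosition() {
            return position.clone();
        }
    }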
Depth perception is more complex and will depend on your hardware setup. I'd recommend looking into the excellent OpenCV library, which already has Java bindings, and making sure you have a good grasp of the basics of computer vision (calibration, the camera matrix, the pinhole model, etc.). The first two answers in this SO question should get you started on determining depth using a single camera.
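Since you specifically want motion from a pre-recorded video rather than a live camera, here is a minimal OpenCV-for-Java sketch that reads frames from a file and estimates frame-to-frame motion with dense optical flow. It assumes the OpenCV native library is installed and on java.library.path, and input.mp4 is just a placeholder path:

    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.imgproc.Imgproc;
    import org.opencv.video.Video;
    import org.opencv.videoio.VideoCapture;

    public class VideoMotion {
        public static void main(String[] args) {
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME); // native OpenCV must be installed

            VideoCapture cap = new VideoCapture("input.mp4"); // placeholder path
            Mat frame = new Mat(), gray = new Mat(), prevGray = new Mat(), flow = new Mat();

            while (cap.read(frame)) {
                Imgproc.cvtColor(frame, gray, Imgproc.COLOR_BGR2GRAY);
                if (!prevGray.empty()) {
                    // Dense optical flow: 'flow' becomes a 2-channel Mat of
                    // per-pixel (dx, dy) motion between consecutive frames.
                    Video.calcOpticalFlowFarneback(prevGray, gray, flow,
                            0.5, 3, 15, 3, 5, 1.2, 0);
                    // ... analyze 'flow' here to extract the motion you need
                }
                gray.copyTo(prevGray);
            }
            cap.release();
        }
    }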

Does Project Tango utilize the Android SDK?

I'm developing an app that uses device sensors to determine the user's x-axis rotation and y-axis pitch (essentially the user spins in a circle and looks up at the sky or down at the ground). I've developed this app for a phone using the Android SensorManager.getRotationMatrix and SensorManager.getOrientation functions and then using the first two resulting orientation values. I've now moved my app to a Project Tango tablet and these values no longer seem to be valid. I've looked into Tango a bit and it seems that it measures things in quaternions. Does this mean that Project Tango is not meant to implement the Android SDK?
The Project Tango APIs (which are Android-only) and the Android SDK are both required to build Project Tango apps. The Tango APIs offer higher-level interfaces to the device's sensors than the Android SDK's direct access to raw sensor state: they combine several sensor states to deliver a complete "pose" (6 degrees of freedom position and orientation), as well as 3D (X, Y, depth) scene points and even feature recognition in scenes, etc. The crucial benefit of the Tango APIs is that they sync several different sensors very precisely in real time, so the pose state is very accurate; indeed, the latest Tango devices support that sync inside the CPU circuitry itself. An app collecting that data from the sensors via the (non-Tango) Android SDK APIs will not be fast enough to correlate the sensors the way the Tango APIs do. So perhaps you're getting sensor data that's out of sync, which shows up as offsets.
Also, a known bug in the Tango APIs is that the device's compass sensor returns garbage values. I don't know whether that bug affects the quality of data returned by the Android SDK's direct calls to the compass, but the Android SDK's compass readings are going to be at least somewhat out of sync with the state returned by the Tango API calls.
In theory the Android SDK should still work, so your app should run without any change, but it won't take advantage of the improvements brought by Project Tango.
To get the advantages of Tango (the fisheye camera for improved motion tracking, etc.), you need to use the Tango API to activate the Tango service and then, yes, work with the pose as quaternions.
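If you do switch to the Tango pose, the conversion back to the yaw/pitch angles your app already uses is standard quaternion math. The sketch below is only the math helpers; the surrounding Tango callback (an OnTangoUpdateListener delivering a TangoPoseData whose rotation array is x, y, z, w, as far as I recall) and the axis conventions of the Tango frame are assumptions you should verify against the Tango docs:

    // Sketch: extracting yaw/pitch from a pose quaternion {x, y, z, w}.
    // Axis conventions on the actual device may require remapping.
    public class PoseMath {

        // Rotation about the vertical axis, comparable to the first value
        // returned by SensorManager.getOrientation().
        public static double yaw(double x, double y, double z, double w) {
            return Math.atan2(2.0 * (w * z + x * y),
                              1.0 - 2.0 * (y * y + z * z));
        }

        // Pitch: looking up at the sky or down at the ground.
        public static double pitch(double x, double y, double z, double w) {
            double s = 2.0 * (w * y - z * x);
            s = Math.max(-1.0, Math.min(1.0, s)); // clamp for numerical safety
            return Math.asin(s);
        }
    }

You would call these helpers with the four components of pose.rotation inside the pose callback, then compare the results against what SensorManager.getOrientation gave you on the phone.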

Installing Eddybeacon

As far as I understand, Eddystone (just released by Google) is effectively a new "operating system" for Bluetooth 4.0 Low Energy beacon devices. I have been experimenting with iBeacons for some time now and want to try out a few things with Eddystone. Has anyone had a go with it yet? I've read a few sites that say it can be installed on some devices... Can anyone share how to do this?
If you want to start out by playing with Eddystone, you have a couple of options:
You can use a software transmitter. Just download my free Locate App from the Google Play store, which will both act as an Eddystone transmitter and decode other Eddystone-compatible beacons in the vicinity. Google has also posted an Android app that can transmit the Eddystone-UID frame here, but you have to compile it yourself.
You can get a few hardware beacons for testing with a Developer Kit from Radius Networks (my company) here.
Once you have a transmitter, you can try writing some software to work with it. Here's a tutorial I wrote on how to build a basic Eddystone-capable Android app.
One other thing that might be useful is an Eddystone detector tool. You can use the free Android Locate app to detect and decode all of the frames transmitted by Eddystone.
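To complement the tutorial linked above, here is roughly what a minimal Eddystone-UID detector looks like with the open-source Android Beacon Library. The beacon layout string is the commonly published Eddystone-UID pattern for that library; treat the exact string and method names as assumptions to check against the library docs:

    import java.util.Collection;

    import org.altbeacon.beacon.Beacon;
    import org.altbeacon.beacon.BeaconConsumer;
    import org.altbeacon.beacon.BeaconManager;
    import org.altbeacon.beacon.BeaconParser;
    import org.altbeacon.beacon.Region;

    import android.app.Activity;
    import android.os.Bundle;
    import android.os.RemoteException;
    import android.util.Log;

    public class EddystoneScanActivity extends Activity implements BeaconConsumer {
        private BeaconManager beaconManager;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            beaconManager = BeaconManager.getInstanceForApplication(this);
            // Commonly published Eddystone-UID frame layout for this library.
            beaconManager.getBeaconParsers().add(new BeaconParser()
                    .setBeaconLayout("s:0-1=feaa,m:2-2=00,p:3-3:-41,i:4-13,i:14-19"));
            beaconManager.bind(this);
        }

        @Override
        public void onBeaconServiceConnect() {
            beaconManager.addRangeNotifier((Collection<Beacon> beacons, Region region) -> {
                for (Beacon b : beacons) {
                    // For Eddystone-UID, id1 is the 10-byte namespace and
                    // id2 the 6-byte instance.
                    Log.i("Eddystone", "Namespace " + b.getId1() + " instance " + b.getId2());
                }
            });
            try {
                beaconManager.startRangingBeaconsInRegion(
                        new Region("all-eddystone", null, null, null));
            } catch (RemoteException e) {
                Log.w("Eddystone", "Could not start ranging", e);
            }
        }

        @Override
        protected void onDestroy() {
            super.onDestroy();
            beaconManager.unbind(this);
        }
    }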
So:
Eddystone is a specification for Bluetooth Smart (usually just called BLE) devices to behave like beacons — it defines the Bluetooth frames and content they need to broadcast to be seen as beacons.
iBeacon is not a generic term. iBeacon is actually Apple's specification for Bluetooth beacons. Eddystone and iBeacon are both examples of beacon specifications for BLE devices.
There are a few ways to get started with Eddystone beacons.
a. A number of hardware manufacturers sell developer kits that will let you get started with Eddystone beacons right away, and there is plenty of example software out there, either from those vendors or from the Google pages on GitHub: github.com/google/eddystone and github.com/google/beacon-platform.
b. Some people have had good luck with Arduinos and Raspberry Pis. You can see an Arduino example here (Note: I have no idea how well that project works, I've just seen it used a few times.)
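One more software option: on Lollipop and later, hardware that supports BLE peripheral mode can broadcast an Eddystone-UID frame itself using Android's own advertising API. A rough sketch follows; the 0xFEAA service UUID and the frame layout come from the public Eddystone specification, the namespace/instance bytes here are dummy placeholders, and the usual Bluetooth permissions are required:

    import android.bluetooth.BluetoothAdapter;
    import android.bluetooth.le.AdvertiseCallback;
    import android.bluetooth.le.AdvertiseData;
    import android.bluetooth.le.AdvertiseSettings;
    import android.bluetooth.le.BluetoothLeAdvertiser;
    import android.os.ParcelUuid;
    import android.util.Log;

    public class EddystoneTransmitter {
        // 16-bit Eddystone service UUID 0xFEAA expanded to 128 bits.
        private static final ParcelUuid EDDYSTONE_SERVICE =
                ParcelUuid.fromString("0000FEAA-0000-1000-8000-00805F9B34FB");

        public static void startAdvertising() {
            BluetoothLeAdvertiser advertiser =
                    BluetoothAdapter.getDefaultAdapter().getBluetoothLeAdvertiser();
            if (advertiser == null) return; // hardware cannot act as a peripheral

            // Eddystone-UID frame: type 0x00, calibrated TX power, 10-byte
            // namespace, 6-byte instance, 2 reserved bytes (dummy values here).
            byte[] frame = new byte[20];
            frame[0] = 0x00;        // frame type: UID
            frame[1] = (byte) 0xDC; // example TX power at 0 m
            // bytes 2..11 = namespace, 12..17 = instance, 18..19 = reserved

            AdvertiseSettings settings = new AdvertiseSettings.Builder()
                    .setAdvertiseMode(AdvertiseSettings.ADVERTISE_MODE_LOW_LATENCY)
                    .setConnectable(false)
                    .build();
            AdvertiseData data = new AdvertiseData.Builder()
                    .addServiceUuid(EDDYSTONE_SERVICE)
                    .addServiceData(EDDYSTONE_SERVICE, frame)
                    .build();

            advertiser.startAdvertising(settings, data, new AdvertiseCallback() {
                @Override
                public void onStartFailure(int errorCode) {
                    Log.w("Eddystone", "Advertising failed: " + errorCode);
                }
            });
        }
    }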

iBeacon app for Android and iOS doubts

I'm trying to learn something about iBeacon and I have a question:
As far as I understood, Apple has provided an API for developing iBeacon apps since iOS 7, but how does it work for Android? The only thing I found is that it works only from version 4.3 (is that correct?), but are there any SDKs or libraries to use?
Yes, you can use the open-source Android Beacon Library, which gives Android the same basic capabilities to detect and transmit as beacons that CoreLocation provides on iOS devices. This library is designed to be vendor-neutral and works with a wide variety of beacons. There are also a number of proprietary Android SDKs offered by beacon manufacturers, some of which harness special features that only work with those manufacturers' beacons.
The main thing to understand on Android is that while 4.3+ devices can all detect Bluetooth LE transmissions, there is no native beacon framework, and working with beacons typically requires quite a bit of logic beyond reading the Bluetooth LE packets they send out. As a result, Android beacon apps typically bundle a small library like the one mentioned above to provide beacon detection and/or transmission capability.
Full disclosure: I am the lead developer for the Android Beacon Library.
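For a concrete starting point with the library, ranging iBeacons typically looks something like the sketch below. The layout string is the widely circulated iBeacon pattern (the library ships vendor-neutral, so it must be added explicitly), and depending on the library version you may also need to bind a BeaconConsumer before ranging actually starts, as in the Eddystone example earlier on this page:

    import java.util.Collection;

    import org.altbeacon.beacon.Beacon;
    import org.altbeacon.beacon.BeaconManager;
    import org.altbeacon.beacon.BeaconParser;
    import org.altbeacon.beacon.Region;

    import android.content.Context;
    import android.os.RemoteException;
    import android.util.Log;

    public class IBeaconRanging {
        // Widely circulated iBeacon layout for the Android Beacon Library;
        // the library ships vendor-neutral, so this must be added explicitly.
        private static final String IBEACON_LAYOUT =
                "m:2-3=0215,i:4-19,i:20-21,i:22-23,p:24-24";

        public static void start(Context context) throws RemoteException {
            BeaconManager manager = BeaconManager.getInstanceForApplication(context);
            manager.getBeaconParsers().add(
                    new BeaconParser().setBeaconLayout(IBEACON_LAYOUT));
            manager.addRangeNotifier((Collection<Beacon> beacons, Region region) -> {
                for (Beacon b : beacons) {
                    // id1 = proximity UUID, id2 = major, id3 = minor
                    Log.i("iBeacon", b.getId1() + " / " + b.getId2() + " / "
                            + b.getId3() + " ~" + b.getDistance() + " m");
                }
            });
            manager.startRangingBeaconsInRegion(new Region("all-ibeacons", null, null, null));
        }
    }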
You can also use the kontakt.io Android SDK, which handles beacons with the iBeacon profile. The latest release includes optimizations in battery consumption, and it additionally supports filtering and scan modes (Android Lollipop and upwards) that control how the scan is performed (brief explanation here).
To start, visit http://docs.kontakt.io/android-sdk/quickstart/ and follow the instructions.
There is a sample app demonstrating the SDK's functionality here. I suggest watching that project, as it is the first place where new changes are introduced.
As @davidgyoung pointed out, there is no native framework for iBeacons in Android at the moment.
