AndroidProximityLibrary receiving beacon information rate - ibeacon

I am using the `AndroidProximityLibrary` for a project where I'm measuring the distance to a beacon, and when the distance reaches or passes a certain threshold the app does something.
Everything works fine except that the distances I receive from the library vary a lot. Even standing in front of the beacon with a clear line of sight, about 3 meters away, I can get distance values anywhere from 1.5 to 4 meters.
My real question is whether I can somehow get more distance values so I can smooth out those spikes; currently I receive beacon information at around 2 distance values per second. Is it the beacon that only transmits at that frequency, or is it the library that only fires the callbacks at that frequency?
As a beacon I'm using a Raspberry Pi configured following the Radius Networks tutorial. The client application runs on a Nexus 5.

The reason the values vary so much is that there was a bug in that library which used only a single signal-strength measurement to estimate distance. The latest version of the Android Beacon Library (which shares much of its code with the library you mention) uses a running average of signal-strength samples over a 20-second window. This smooths out the noise significantly.
Unfortunately, the AndroidProximityLibrary has been discontinued and no new updates are being provided. If you are not using the library's cloud data features, your best option is to migrate to the Android Beacon Library 2.0, which has all the other features. A migration guide is available here.
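For reference, a minimal ranging sketch against the Android Beacon Library 2.x after migrating; the class and method names below are from the AltBeacon library's public API as I recall it, and the region id and beacon layout string are assumptions to verify against the library's documentation:

```java
// Minimal sketch: ranging with the Android Beacon Library 2.x (AltBeacon).
// Identifiers are from its public API as I recall it; verify against the docs.
import android.app.Activity;
import android.os.Bundle;
import android.os.RemoteException;
import java.util.Collection;
import org.altbeacon.beacon.Beacon;
import org.altbeacon.beacon.BeaconConsumer;
import org.altbeacon.beacon.BeaconManager;
import org.altbeacon.beacon.BeaconParser;
import org.altbeacon.beacon.RangeNotifier;
import org.altbeacon.beacon.Region;

public class RangingActivity extends Activity implements BeaconConsumer {
    private BeaconManager beaconManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        beaconManager = BeaconManager.getInstanceForApplication(this);
        // The default parser is for the AltBeacon format; add a layout matching
        // your beacon's advertisement format (this layout string is an assumption).
        beaconManager.getBeaconParsers().add(
                new BeaconParser().setBeaconLayout("m:2-3=0215,i:4-19,i:20-21,i:22-23,p:24-24"));
        beaconManager.bind(this); // connect to the library's scanning service
    }

    @Override
    public void onBeaconServiceConnect() {
        beaconManager.setRangeNotifier(new RangeNotifier() {
            @Override
            public void didRangeBeaconsInRegion(Collection<Beacon> beacons, Region region) {
                // Called about once per second with a running-average distance per beacon.
                for (Beacon beacon : beacons) {
                    double meters = beacon.getDistance();
                    // act when the beacon crosses your threshold distance
                }
            }
        });
        try {
            beaconManager.startRangingBeaconsInRegion(new Region("my-region", null, null, null));
        } catch (RemoteException e) {
            // ranging could not be started
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        beaconManager.unbind(this);
    }
}
```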

Improving Contact Tracing API efficiency with Bluetooth signal strength

As per the current specs, only exposure duration is tracked, in 5-minute increments. The suggested scan interval seems to be 200-300 ms. In Singapore, signal strength was taken into account, but it varies per device. What if we also track signal strength during that time? You would get a curve from weak to strong that gives an indication of the speed of approach, and couldn't you also derive fairly accurate indications of proximity after just one day of data?
I noticed that beacon libraries already attempt to estimate distance: Understanding ibeacon distancing
But it does not seem that these libraries self-calibrate yet, for instance based on min/max readings against moving targets. I'm thinking that could work, especially as phones are modified to be always on in that respect.
It is very difficult to accurately determine distance from Bluetooth RSSI measured between two phones, because there is huge variation in the way different phone models measure Bluetooth signals. Check out this graph produced by the OpenTrace folks behind the effort in Singapore:
Those variations are consistent with my work in this area on the Android Beacon Library open source project. The fragmentation of Android devices has made it impossible to keep up with all the variations in signal-strength response.
One point the OpenTrace team did not address in their work is that there are a number of different Bluetooth channels, and RSSI on a given phone varies greatly depending on which channel is in use. Mobile phones give you no indication of which channel the radio was on when a measurement was taken. The channel difference probably accounts for much of the "height" of the blue bars in the graph.
Unfortunately, there is no way to know if a device is approaching or stationary by reading RSSI updates. The changes could be because of natural variation, motion, or changes in obstacles. I do not believe self-calibration in a contact tracing app is viable.
This does not mean that RSSI is worthless for distance estimates, but it does mean that the margin of error on what you can measure is very high. If you see a device at all, there is a very good chance it is within 50 meters. And if the RSSI is stronger than -70 dBm, there is a good chance you are within 2 meters. But there will always be false positives and false negatives.
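To make that margin of error concrete, here is a minimal sketch of the log-distance path-loss model often used for rough RSSI-to-distance conversion; the -59 dBm reference power at 1 m and the path-loss exponent of 2.0 are assumptions, and a per-device or per-channel bias of just a few dB shifts the result by meters:

```java
// Minimal sketch: rough distance estimate from RSSI using the log-distance
// path-loss model. txPowerAt1m and pathLossExponent are assumed values and
// vary widely per phone model and environment.
public class RssiDistance {

    static double estimateDistanceMeters(double rssiDbm, double txPowerAt1m, double pathLossExponent) {
        // d = 10 ^ ((txPower - rssi) / (10 * n))
        return Math.pow(10.0, (txPowerAt1m - rssiDbm) / (10.0 * pathLossExponent));
    }

    public static void main(String[] args) {
        double txPower = -59.0;   // assumed RSSI at 1 m
        double n = 2.0;           // assumed free-space-like path-loss exponent
        System.out.println(estimateDistanceMeters(-70.0, txPower, n)); // ~3.5 m
        System.out.println(estimateDistanceMeters(-80.0, txPower, n)); // ~11 m
        // A few dB of per-device or per-channel bias shifts these estimates
        // by meters, which is why only coarse buckets are reliable.
    }
}
```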

The four-legged spider experiment

Let me start by saying that I'm new to neural networks, machine learning and so on, and so far I have only done a few very simple experiments to learn, so please be patient with me if I ask very naive or long questions.
My favorite language is Java and I'm looking at playing with Weka, an API that looks very clear and complete to me. This time, to have something more than a set of ideal data to train on and check the success rate against, I started not from just the software but from a real-world problem that I created for myself.
The problem I created, and that I'd like to solve with neural networks, is a four-legged, spider-shaped robot controlled by a Raspberry Pi with some ADC and servo HATs. This strange robot has 4 legs, each leg made of 3 parts, and each part is moved by a servo motor. In total that makes 4 legs * 3 leg parts = 12 servo motors. On each servo motor I have a 3-axis analog accelerometer attached (12 in total) that I can read from the Raspberry Pi. From each of these accelerometers I read 2 axes to determine the position of each servo, that is, the position of each leg segment of each leg. In addition, the "sole of the foot" of each of the 4 legs has a button to determine whether the leg has reached the floor and is supporting the spider. The spider construction blog is here, for those interested: https://thestrangespider.blogspot.com/
The purpose of this experiment is to make the spider able to balance itself and level itself horizontally starting from any condition.
In a few words, the spider should be able to level its body horizontally regardless of whether I have placed it on a horizontal or an oblique surface. The hardware platform is ready; I'm missing a few details, but let's assume I can read all the signals I need from a Java Spring Boot application using the PI4J API to interface with the hardware (24 values from the ADCs for the 12 servos, and 4 digital inputs (true/false) from the buttons on the soles of the spider's feet). The intent is to solve the problem of moving the leg servo motors using neural networks built with Weka, reading the various input signals until the system reaches the success condition (static balance with the body horizontal). The main problem is how best to use all the data I have to build a data set, and which neural networks to use to put in place the adaptive corrective feedback, until the spider reaches the success condition of having its body horizontal and in static balance.
Going deeper into this topic, let me provide my analysis.
Each leg should perform these steps (a skeleton of this loop is sketched after the list):
Move one servo at a time, independently, starting from a random position (within the angle range allowed by that leg segment)
After each move, re-read all the leg segment positions, waiting for the next static condition
Determine whether the last move brought an advantage or a disadvantage to the system as a whole
If no change happened because of the last move, keep the latest change/move and continue
If any maximum angle has been reached, reverse the direction of that angle's next move
Check whether the system is closer to success now than at the previous step and determine the next action type
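Here is one way the loop above could be skeletonised in Java; every method name below (moveServoBy, bodyTilt, and so on) is a hypothetical placeholder for the PI4J/ADC plumbing, not an existing API, and the step size and thresholds are arbitrary:

```java
// Skeleton of the per-leg trial-and-error loop described above. All helper
// methods are hypothetical stubs standing in for the real PI4J/ADC code.
public class LegBalancingLoop {

    static final double STEP_DEGREES = 2.0;

    public static void main(String[] args) {
        double direction = 1.0;                    // flipped when a limit is reached
        double previousTilt = bodyTilt();

        for (int servo = 0; servo < 12; servo++) { // one servo per iteration
            moveServoBy(servo, direction * STEP_DEGREES);
            waitForStaticCondition();              // let the body settle
            double tilt = bodyTilt();              // re-read all segment positions

            if (atAngleLimit(servo)) {
                direction = -direction;            // reverse the next move's direction
            }
            if (tilt >= previousTilt) {
                moveServoBy(servo, -direction * STEP_DEGREES); // undo a harmful move
            } else {
                previousTilt = tilt;               // keep a move that improved balance
            }
            if (tilt < 1.0 && allFeetOnFloor()) {
                break;                             // success: body level and supported
            }
        }
    }

    // --- hypothetical hardware stubs -------------------------------------
    static void moveServoBy(int servo, double degrees) { /* PI4J servo command */ }
    static void waitForStaticCondition() { /* poll accelerometers until stable */ }
    static double bodyTilt() { return 0.0; /* derive from the 24 accelerometer axes */ }
    static boolean atAngleLimit(int servo) { return false; }
    static boolean allFeetOnFloor() { return true; /* read the 4 foot switches */ }
}
```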
To me the problem is a mix of open questions, one of which is how to represent this system from a data perspective:
Is the data coming from the accelerometers on the servos of the same leg a candidate for a data cluster, or a robot subsystem?
Could each servo and its position measurement instead be a subsystem of the main system?
Could these really be multiple problems instead of one?
How would you approach this problem from a neural network perspective?
After studying Weka classification and regression, I finally came to the conclusion that the processing should be a regression problem. That also ended up being too complex: it would have needed regression with multi-label output, which, beyond Weka, would have required another framework on top of it called Meka.
Finally I reached a conclusion that I will now try: first use Weka to classify the data and detect which leg the inclination is on, then solve the single, specific problem of acting on that leg, possibly with a different Weka model working on only one leg at a time.
Problem resolution will therefore have these steps:
Use a legs-level Weka classification model first, to detect which of the four legs has the highest inclination.
Use a leg-specific Weka classification model to determine, for the leg detected in the first step, the actions to be taken by the three leg segments in order to correct the problem.
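A minimal sketch of the first step with the Weka Java API; the ARFF file name, the attribute layout (24 accelerometer readings, 4 foot switches, and a nominal class naming the most inclined leg), and the choice of J48 as the classifier are all assumptions for illustration:

```java
// Minimal sketch: train a Weka classifier to predict which leg is most inclined.
// The dataset file and attribute layout are hypothetical; the Weka calls
// (DataSource, Instances, J48, buildClassifier, classifyInstance) are from the
// standard Weka 3 API.
import weka.classifiers.Classifier;
import weka.classifiers.trees.J48;
import weka.core.Instance;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class LegClassifierSketch {
    public static void main(String[] args) throws Exception {
        // Load training data: 24 numeric accelerometer attributes, 4 boolean
        // foot-switch attributes, and a nominal class {leg1, leg2, leg3, leg4}.
        Instances data = new DataSource("spider-legs.arff").getDataSet();
        data.setClassIndex(data.numAttributes() - 1); // class is the last attribute

        Classifier legClassifier = new J48();  // could be a MultilayerPerceptron instead
        legClassifier.buildClassifier(data);

        // Classify a new snapshot of sensor readings (here: the first instance).
        Instance snapshot = data.firstInstance();
        double classIndex = legClassifier.classifyInstance(snapshot);
        String mostInclinedLeg = data.classAttribute().value((int) classIndex);
        System.out.println("Most inclined leg: " + mostInclinedLeg);
        // Step 2 would dispatch to a per-leg model trained the same way.
    }
}
```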
Stefano

Is GPS inaccuracy consistent over short time spans?

I'm interested in developing a semi-autonomous RC lawnmower.
That is, the operator would decide when to stop, turn, etc., but could request "slightly overlap the previous cut" and the mower would do so automatically. (Having operated high-end RC mowers at trade shows, I know this is the tedious part. Overcoming that, plus the high cost, which I believe is possible, would make it a commercial success.)
This feature would require accurate horizontal positioning. I have investigated ultrasonic, laser, optical, and GPS. Each has its problems in this application. (I'll resist the temptation to go off on these tangents here.)
So... my question...
I know GPS horizontal accuracy is only 3-4m. Not good enough, but:
I don't need to know where I am on the planet. I only need to know where I am relative to where I was a minute ago.
So, my question is: is the inaccuracy consistent in the short term? If so, I think it would work for me. If it varies wildly by +/- 1.5 m from one second to the next, then it will not work.
I have tried to find this information but have had no success (possibly because of the ubiquity of other GPS-accuracy discussion), so I appreciate any guidance.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Edit ~~~~~~~~~~~~~~~~~~~~~~
It's looking to me like GPS is not just skewed but granular. I'd be interested in hearing from anyone who can give better insight into this, but for now I'm going to explore other options.
I realized that even though my intended application is "outdoor", this question is technically in the field of "indoor positioning systems" so I am adding that tag.
My latest thinking is to have 3 "intelligent" high-dB ultrasonic (US) speaker units. The mower emits RF requests for a tone from each speaker in rapid sequence, measuring the time it takes to "hear" each unit's response, thereby calculating the distance to each of these fixed points and using trilateration to get position. If the fixed-point speakers are 300' away from the mower, the mower may have moved several feet between the 1st and 3rd responses, so this would have to be allowed for in the software. If it is possible to differentiate 3 different US frequencies, they could be requested and received "simultaneously", though you would still run into issues when you are close to one fixed unit and far from another, so some software correction may still be necessary. If we can assume the mower is moving in a straight line, this isn't too complicated.
Another variation is that the mower does not request the tones. The fixed units send RF "here comes the tone from unit A" etc., and the mower unit just monitors both the RF info and the US tones. This may simplify things somewhat, but it seems to really require the ability to determine which speaker a tone is coming from.
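For the trilateration step itself, here is a minimal sketch assuming three speakers at known, non-collinear positions and ignoring the movement-between-pings correction discussed above; the coordinates and distances are made up for illustration:

```java
// Minimal sketch: 2D trilateration from three fixed speaker positions and the
// three distances derived from ultrasonic time-of-flight. Coordinates and
// distances are illustrative; no correction for mower movement between pings.
public class Trilateration2D {

    // Returns {x, y} of the mower given speaker positions (x_i, y_i) and distances r_i.
    static double[] trilaterate(double x1, double y1, double r1,
                                double x2, double y2, double r2,
                                double x3, double y3, double r3) {
        // Subtracting the circle equations pairwise gives a 2x2 linear system.
        double a1 = 2 * (x2 - x1), b1 = 2 * (y2 - y1);
        double c1 = r1 * r1 - r2 * r2 + x2 * x2 - x1 * x1 + y2 * y2 - y1 * y1;
        double a2 = 2 * (x3 - x1), b2 = 2 * (y3 - y1);
        double c2 = r1 * r1 - r3 * r3 + x3 * x3 - x1 * x1 + y3 * y3 - y1 * y1;

        double det = a1 * b2 - a2 * b1;  // zero if the speakers are collinear
        double x = (c1 * b2 - c2 * b1) / det;
        double y = (a1 * c2 - a2 * c1) / det;
        return new double[] { x, y };
    }

    public static void main(String[] args) {
        // Speakers at three corners of a 100 m x 100 m yard (made-up layout).
        double[] pos = trilaterate(0, 0, 70.7, 100, 0, 70.7, 0, 100, 70.7);
        System.out.printf("x=%.1f y=%.1f%n", pos[0], pos[1]); // ~ (50, 50)
    }
}
```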
This seems like the kind of thing you could (and should) measure empirically. Just set a GPS of your liking down in the middle of a field on a clear day and wait an hour. Then come back and see what you find.
Because I'm in a city, I can't run out and do this for you. However, I found a paper entitled iGeoTrans – A novel iOS application for GPS positioning in geosciences.
That includes this figure, which duplicates the test I propose. You'll note that both the iPhone 4 and the Garmin eTrex 10 perform pretty poorly compared with the accuracy you say you need.
But the authors do some Math Magic™ to reduce the uncertainty in the position, presumably by using some kind of averaging. That gets them to a 3.53 m RMSE measure.
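If you want to try something similar, here is a minimal sketch of averaging stationary fixes and reporting each fix's offset from the mean in meters; the sample coordinates are made up and the degree-to-meter conversion uses the usual ~111,320 m per degree of latitude approximation:

```java
// Minimal sketch: average a batch of stationary GPS fixes and report each
// fix's offset from the mean in meters. The fixes are made-up sample data.
public class FixAveraging {
    public static void main(String[] args) {
        double[] lats = {51.000010, 51.000025, 50.999995};
        double[] lons = { 7.000020,  6.999990,  7.000005};

        double meanLat = 0, meanLon = 0;
        for (int i = 0; i < lats.length; i++) {
            meanLat += lats[i] / lats.length;
            meanLon += lons[i] / lons.length;
        }

        // Approximate local scale: ~111,320 m per degree of latitude,
        // shrunk by cos(latitude) for longitude.
        double mPerDegLat = 111_320.0;
        double mPerDegLon = 111_320.0 * Math.cos(Math.toRadians(meanLat));

        for (int i = 0; i < lats.length; i++) {
            double north = (lats[i] - meanLat) * mPerDegLat;
            double east  = (lons[i] - meanLon) * mPerDegLon;
            System.out.printf("fix %d offset: %+.2f m north, %+.2f m east%n", i, north, east);
        }
    }
}
```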
If you have real-time differential GPS, you can do better. But this requires relatively expensive hardware and software.
Even aside from the above, you have the potential issue of GPS reflection and multipath error. What if your mower has to go under a deck, or thick trees, or near the wall of a house? These common yard features will likely break the assumptions needed to make a good averaging algorithm work and even frustrate attempts at DGPS by blocking critical signals.
To my mind, this seems like a computer vision problem. And not just because that'll give you more accurate row overlaps... you definitely don't want to run over a dog!
In my opinion a standard GPS is nowhere near accurate enough for this application. A typical consumer-grade receiver that I have used has a position accuracy defined as a CEP of 2.5 metres. This means that for a stationary receiver in a "perfect" sky-view environment, over time 50% of the position fixes will lie within a circle with a radius of 2.5 metres. If you look at the position the receiver reports, it appears to wander at random around the true position, sometimes moving a number of metres away from its true location. When I have monitored the position data from a number of stationary units, they could appear to be moving at speeds of up to 0.5 metres per second. In your application this would mean that the lawnmower could be out of position by some not insignificant distance (with disastrous consequences for your prized flowerbeds).
There is a way this can be done, as has been proved by the tractor manufacturers who can position seed drills and agricultural sprayers to centimetre accuracy. These systems use Differential GPS, where there is a fixed reference station positioned in the neighbourhood of the tractor being controlled. This reference station transmits error corrections to the mobile unit, allowing it to correct its reported position to a high degree of accuracy. Unfortunately this sort of positioning system is very expensive.

iBeacons: bearing to beacon?

Partly a coding problem, partly a math problem.
Q1. I have an iOS device with the compass active. If it knows I'm moving through the field of an iBeacon - or the beacon is moving through my detection range - would it be possible for the phone to work out (roughly) the relative direction/bearing of that beacon from a series of readings by comparing signal strengths? Has anyone tried this?
Q2. Would it be possible to change the Major and Minor values of a beacon regularly (e.g. every second) to pass small pieces of info, such as a second user's bearing and course?
Q1. It MIGHT be possible, but you would need a controlled environment. Either the beacon or the phone needs to be fixed. You also need to be in an area with no obstructions or sources of radio interference.
Then you'd need to use the signal strength (which is sloppy and varies by a fair amount) as one input and the device's heading info (which is also grossly inaccurate) as another, and do some pretty gnarly math on it.
Assuming you could work out the math, the slop in the input readings might make the results too iffy to be useful. (For example, how would you distinguish moving directly towards the beacon from moving 30 degrees to one side or the other? The signal strength would still increase, just not as quickly.)
And your algorithm would have to deal with edge cases like moving along a circle around the beacon. In that case the signal strength should not change.
My gut is that even with clever algorithms that input data is just too unreliable to make much sense out of it, beyond "getting warmer" and "getting colder."
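To put a number on that 30-degree example, here is a quick sketch; the 10 m starting distance, 1 m step length, and path-loss exponent of 2 are all assumed values:

```java
// Minimal sketch: how much the expected RSSI gain per step differs between
// walking straight at a beacon and walking 30 degrees off-axis. The starting
// distance, step length, and path-loss exponent are assumed for illustration.
public class BearingAmbiguity {

    // New distance after a step of length s at angle thetaDeg off the line to the beacon.
    static double distanceAfterStep(double d0, double s, double thetaDeg) {
        double theta = Math.toRadians(thetaDeg);
        return Math.sqrt(d0 * d0 + s * s - 2 * d0 * s * Math.cos(theta));
    }

    // RSSI change in dB for moving from d0 to d1, with path-loss exponent n.
    static double rssiDeltaDb(double d0, double d1, double n) {
        return -10.0 * n * Math.log10(d1 / d0);
    }

    public static void main(String[] args) {
        double d0 = 10.0, step = 1.0, n = 2.0;
        double headOn = rssiDeltaDb(d0, distanceAfterStep(d0, step, 0), n);
        double offAxis = rssiDeltaDb(d0, distanceAfterStep(d0, step, 30), n);
        System.out.printf("head-on: +%.2f dB, 30 deg off: +%.2f dB%n", headOn, offAxis);
        // Roughly +0.9 dB vs +0.8 dB per step -- far smaller than typical
        // reading-to-reading RSSI noise, hence the ambiguity.
    }
}
```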
As mentioned above, you'd have to track your device's movement within the field, including distance covered and direction; then, with multiple readings of signal strength, you could theoretically calculate the relative direction to the beacon to some degree of accuracy.
As to your second question about changing the minor version number, I have not seen any beacon APIs that allow that, either from the beacon manufacturers or from Apple's implementation.
However, a typical beacon is an ARM or other low-power processor with a BLE transceiver, running a program. In theory it should be possible to create your own iBeacon transmitter that changes one of the parameters in order to transmit changing information. You'd have to set up the iOS device with the beacon region specifying only the UUID, or the UUID and major ID, depending on whether you wanted to change just the minor or both the major and minor IDs in order to transmit changing information.
Note, too, that iBeacons are a special case of BLE, and the BLE standard does support the sending of arbitrary, changing data. You might be better off implementing your own BLE scheme either instead of or in addition to iBeacons.
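As an illustration of the "change a parameter to transmit information" idea, here is a sketch using the Android Beacon Library's BeaconTransmitter on an Android 5+ device acting as the beacon; the layout string, UUID, manufacturer code, and TX power below are assumptions to check against that library's documentation:

```java
// Sketch only: re-advertising a beacon with a changing minor value using the
// Android Beacon Library's BeaconTransmitter (requires BLE peripheral support).
// The layout string and all advertised values are assumptions for illustration.
import android.content.Context;
import org.altbeacon.beacon.Beacon;
import org.altbeacon.beacon.BeaconParser;
import org.altbeacon.beacon.BeaconTransmitter;

public class ChangingMinorTransmitter {
    private final BeaconParser parser = new BeaconParser()
            .setBeaconLayout("m:2-3=0215,i:4-19,i:20-21,i:22-23,p:24-24"); // iBeacon-style layout
    private final BeaconTransmitter transmitter;

    public ChangingMinorTransmitter(Context context) {
        transmitter = new BeaconTransmitter(context, parser);
    }

    // Re-advertise with a new minor value, e.g. once per second from a timer.
    public void advertise(int minor) {
        Beacon beacon = new Beacon.Builder()
                .setId1("2F234454-CF6D-4A0F-ADF2-F4911BA9FFA6") // example UUID
                .setId2("1")                                     // major (fixed)
                .setId3(String.valueOf(minor))                   // changing minor
                .setManufacturer(0x004C)                         // Apple's company code
                .setTxPower(-59)
                .build();
        transmitter.stopAdvertising();
        transmitter.startAdvertising(beacon);
    }
}
```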

how to increase GPS polling rate on Windows Phone 7

Is there any way to increase the GPS polling rate in my WP7 app? Right now it's updated roughly once a second. I need at least three or four times that resolution.
I was considering spawning multiple GeoCoordinateWatcher(s), each on its own thread, but that doesn't sound like it would work.
** a bit of background **
I'm trying to measure 0-60 with my application, but I'm quickly realizing that when testing with cars that do sub-5-second 0-60 runs (GTR, ZR1, 911 Turbo, etc.), sampling once a second isn't going to give me an accurate result.
It sounds like I have to rely on the accelerometer to get this data and use the GPS data to correlate?
I think you can ask the GPS for its position as often as you like... and this will probably result in serial calls to the "GPS chip", so it should get you more results back.
I definitely wouldn't spawn multiple Watchers - it's just too hacky a solution and could well go wrong (it's far too hard to get them to spread their results uniformly).
Also, and more importantly, in an A-GPS system on a typical phone I doubt the GPS would be sufficiently accurate to make it worthwhile calling more than once a second. Even if you are in a car travelling at 120 km/h, that is still only about 33 metres of travel per second, and your phone will often be struggling to get the accuracy anywhere near 30 metres (from my observations so far of the RunSat results while cycling).
No.
Most GPS receivers installed in phones only calculate a solution at 1 Hz. Even if you manage to get the operating system to give you more frequent updates, they will only change value once per second. The position calculation is done on an application-specific IC, and its programming cannot be altered.
GPS, in any case, is not a good solution for what you are trying to do. Figure out how to use the accelerometers. Integrate their values to get speed.
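A minimal sketch of that integration idea, in plain Java rather than the WP7 C# APIs and with a made-up constant acceleration in place of real sensor samples; a real implementation would also need to remove sensor bias and project acceleration onto the car's longitudinal axis:

```java
// Minimal sketch: integrate longitudinal acceleration samples to estimate speed
// and find the time to reach 60 mph. Sample rate and readings are made up; a
// real app must subtract sensor bias and handle device orientation.
public class ZeroToSixty {

    public static void main(String[] args) {
        double sampleRateHz = 50.0;               // assumed accelerometer rate
        double dt = 1.0 / sampleRateHz;
        double targetMps = 60 * 0.44704;          // 60 mph in m/s (~26.8)

        double speedMps = 0.0;
        double elapsed = 0.0;
        while (speedMps < targetMps) {
            double accel = 6.0;                   // fake constant 6 m/s^2 pull; read the sensor here
            speedMps += accel * dt;               // simple rectangular integration
            elapsed += dt;
        }
        System.out.printf("0-60 mph in %.2f s%n", elapsed);  // ~4.5 s for this fake data
    }
}
```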
