Google Address GeoCoding ROOFTOP accuracy degraded? - google-geocoder

Has anyone else noticed that the accuracy of the lat/long returned as ROOFTOP is now quite a bit off on the ground from what it was in the past?
I used to get lat/longs truly on rooftops; now the points are in the street in front of the address if I'm lucky, or one to two addresses away. I was depending on the accuracy being good enough to pick the parcel for an address - now the pin lands in the right of way or one or two parcels away. I guess this is better for driving, but they can no longer call the accuracy ROOFTOP.
I'm using free access (very low volume) - has Google degraded the accuracy on purpose? Can we pay to get the accuracy back?
Is there another free/cheap resource out there for true rooftop points?

Related

Improving contact tracing API efficiency with Bluetooth signal strength

As per the current specs, only contact duration is tracked, in 5-minute increments. The suggested scan interval seems to be 200-300 ms. In Singapore, signal strength was accounted for, but this varies per device. What if we also tracked the signal strength during that time? You would get a curve from weak to strong that gives an indication of the speed of approach, and couldn't you also derive fairly accurate indications of proximity after just one day of data?
I noticed that beacon libraries already attempt to estimate distance: Understanding ibeacon distancing
But it does not seem these self-calibrate yet, based for instance on min/max readings against moving targets. I'm thinking that could work, especially as phones are modified to be always on in that respect.
It is very difficult to accurately determine distance by Bluetooth RSSI measured between two phones, because there is a huge variation in the way different phone models measure Bluetooth signals. Check out this graph produced by the Open Trace folks behind the effort in Singapore:
Those variations are consistent with my work in this area for the Android Beacon Library open source project. The fragmentation of Android devices has made it impossible to keep up with all the variations in signal strength response.
One point that the Open Trace team did not address in their work is that there are a number of different Bluetooth channels, and RSSI varies greatly on a given phone depending on which channel is being used. Mobile phones give you no indication of which channel the radio was on when a measurement was taken. The channel difference probably accounts for much of the "height" of the blue bars in the graph.
Unfortunately, there is no way to know if a device is approaching or stationary by reading RSSI updates. The changes could be because of natural variation, motion, or changes in obstacles. I do not believe self-calibration in a contact tracing app is viable.
This does not mean that RSSI is worthless for distance estimates, but it does mean that the margin of error is very high in what you can measure. If you see a device at all, there is a very good chance it is within 50 meters. And if you see that the RSSI is stronger than -70 dBm, there is a good chance you are within 2 meters. But there will always be false positives and false negatives.
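For what it's worth, here is a minimal sketch (in Python, just to show the arithmetic) of the kind of log-distance estimate the beacon libraries use, plus the coarse buckets above. The tx_power value (assumed RSSI at 1 m) and the path-loss exponent n are assumptions that vary wildly per phone model, which is exactly the problem:

```python
# Rough RSSI-to-proximity sketch (illustrative only).
# Assumptions: tx_power_dbm is the calibrated RSSI at 1 m for this phone pair
# (unknown in practice, varies by model) and n is a near-free-space
# path-loss exponent. Real-world error is very large, as discussed above.

def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, n=2.0):
    """Log-distance path-loss estimate: d = 10^((txPower - RSSI) / (10 * n))."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

def proximity_bucket(rssi_dbm):
    """Coarse buckets along the lines described above."""
    if rssi_dbm > -70:
        return "probably within ~2 m"
    return "seen at all: probably within ~50 m"

if __name__ == "__main__":
    for rssi in (-55, -68, -80, -95):
        print(rssi, "dBm ->", round(estimate_distance_m(rssi), 1), "m,", proximity_bucket(rssi))
```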

Is GPS inaccuracy consistent over short time spans?

I'm interested in developing a semi-autonomous RC lawnmower.
That is, the operator would decide when to stop, turn, etc., but could request "slightly overlap previous cut" and the mower would automatically do so. (Having operated high-end RC mowers at trade shows, this is the tedious part. Overcoming that, plus the high cost -- which I believe is possible -- would make a commercial success.)
This feature would require accurate horizontal positioning. I have investigated ultrasonic, laser, optical, and GPS. Each has its problems in this application. (I'll resist the temptation to go off on these tangents here.)
So... my question...
I know GPS horizontal accuracy is only 3-4m. Not good enough, but:
I don't need to know where I am on the planet. I only need to know where I am relative to where I was a minute ago.
So, my question is: is the inaccuracy consistent in the short term? If so, I think it would work for me. If it varies wildly by ±1.5 m from one second to the next, then it will not work.
I have tried to find this information but have had no success (possibly because of the ubiquity of other GPS-accuracy discussion), so I appreciate any guidance.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Edit ~~~~~~~~~~~~~~~~~~~~~~
It's looking to me like GPS is not just skewed but granular. I'd be interested in hearing from anyone who can give better insight into this, but for now I'm going to explore other options.
I realized that even though my intended application is "outdoor", this question is technically in the field of "indoor positioning systems" so I am adding that tag.
My latest thinking is to have 3 "intelligent" high-dB ultrasonic (US) speaker units. The mower emits RF requests for a tone from each speaker in rapid sequence, measuring the time it takes to "hear" each unit's response, thereby calculating the distance to each of these fixed points and using trilateration to get position. If the fixed-point speakers are 300' away from the mower, the mower may have moved several feet between the 1st and 3rd response, so this would have to be allowed for in the software. If it is possible to differentiate 3 different US frequencies, they could be requested/received "simultaneously". Though you still run into issues when you're close to one fixed unit and far from another, so some software correction may still be necessary. If we can assume the mower is moving in a straight line, this isn't too complicated.
Another variation is the mower does not request the tones. The fixed units send RF "here comes tone from unit A" etc., and the mower unit just monitors both RF info and US tones. This may simplify things somewhat, but it seems it really requires the ability to determine which speaker a tone is coming from.
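A rough sketch of the time-of-flight and trilateration step described two paragraphs up, assuming a momentarily stationary mower, negligible RF travel time, and zero (or pre-calibrated) speaker latency. The speaker coordinates and timings are made-up example values:

```python
# Illustrative time-of-flight trilateration sketch (2D, stationary mower).
# Assumptions: the RF trigger arrives effectively instantly, each speaker
# responds with zero (or pre-calibrated) latency, the mower does not move
# between the three measurements, and the speakers are not in a line.

SPEED_OF_SOUND = 343.0  # m/s at ~20 C; varies with temperature

def tof_to_distance(seconds):
    return SPEED_OF_SOUND * seconds

def trilaterate(anchors, distances):
    """Solve for (x, y) from three anchor points and measured distances.

    Subtracting the circle equation of anchor 0 from anchors 1 and 2
    gives a 2x2 linear system in x and y (Cramer's rule below).
    """
    (x0, y0), (x1, y1), (x2, y2) = anchors
    d0, d1, d2 = distances
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

if __name__ == "__main__":
    speakers = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]   # metres, assumed layout
    times = [0.131, 0.233, 0.204]                          # seconds, example readings
    dists = [tof_to_distance(t) for t in times]
    print("estimated position:", trilaterate(speakers, dists))
```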
This seems like the kind of thing you could (and should) measure empirically. Just set a GPS of your liking down in the middle of a field on a clear day and wait an hour. Then come back and see what you find.
Because I'm in a city, I can't run out and do this for you. However, I found a paper entitled iGeoTrans – A novel iOS application for GPS positioning in geosciences.
That includes this figure, which duplicates the test I propose. You'll note that both the iPhone 4 and Garmin eTrex 10 perform pretty poorly versus the accuracy you say you need.
But the authors do some Math Magic™ to reduce the uncertainty in the position, presumably by using some kind of averaging. That gets them to a 3.53m RMSE measure.
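Here's a minimal sketch of that kind of averaging, assuming you can log raw lat/lon fixes while the receiver is stationary. Note that consecutive GPS errors are correlated, so in practice the improvement is well short of the 1/sqrt(N) you'd expect from independent samples:

```python
# Sketch: average a batch of GPS fixes and report the spread, assuming the
# receiver is stationary. Uses a simple equirectangular conversion to metres,
# which is fine over a lawn-sized area.
import math

EARTH_RADIUS = 6_371_000.0  # metres

def to_local_xy(lat, lon, lat0, lon0):
    """Project (lat, lon) to metres east/north of a reference point."""
    x = math.radians(lon - lon0) * EARTH_RADIUS * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_RADIUS
    return x, y

def averaged_fix(fixes):
    """Return (mean_lat, mean_lon, rmse_m) for a list of (lat, lon) fixes."""
    lat0, lon0 = fixes[0]
    pts = [to_local_xy(lat, lon, lat0, lon0) for lat, lon in fixes]
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    rmse = math.sqrt(sum((x - mx) ** 2 + (y - my) ** 2 for x, y in pts) / len(pts))
    mean_lat = lat0 + math.degrees(my / EARTH_RADIUS)
    mean_lon = lon0 + math.degrees(mx / (EARTH_RADIUS * math.cos(math.radians(lat0))))
    return mean_lat, mean_lon, rmse
```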
If you have real-time differential GPS, you can do better. But this requires relatively expensive hardware and software.
Even aside from the above, you have the potential issue of GPS reflection and multipath error. What if your mower has to go under a deck, or thick trees, or near the wall of a house? These common yard features will likely break the assumptions needed to make a good averaging algorithm work and even frustrate attempts at DGPS by blocking critical signals.
To my mind, this seems like a computer vision problem. And not just because that'll give you more accurate row overlaps... you definitely don't want to run over a dog!
In my opinion a standard GPS is nowhere near accurate enough for this application. A typical consumer-grade receiver that I have used has a position accuracy defined as a CEP of 2.5 metres. This means that for a stationary receiver in a "perfect" sky-view environment, over time 50% of the position fixes will lie within a circle with a radius of 2.5 metres. If you look at the position that the receiver reports, it appears to wander at random around the true position, sometimes moving a number of metres away from its true location. When I have monitored the position data from a number of stationary units, they could appear to be moving at speeds of up to 0.5 metres per second. In your application this would mean that the lawnmower could be out of position by some not insignificant distance (with disastrous consequences for your prized flowerbeds).
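If you want to check those two numbers (CEP and apparent wander speed) against your own receiver, here is a minimal sketch, assuming you've already projected a log of stationary fixes to local metres:

```python
# Sketch: given 1 Hz fixes from a *stationary* receiver, estimate the CEP
# (median horizontal error about the centroid) and the apparent "speed" of
# the position wander. Assumes fixes are already in local metres
# (any flat-earth projection will do over a back yard).
import math
import statistics

def cep_and_wander(points_xy, dt=1.0):
    cx = sum(x for x, _ in points_xy) / len(points_xy)
    cy = sum(y for _, y in points_xy) / len(points_xy)
    radii = [math.hypot(x - cx, y - cy) for x, y in points_xy]
    steps = [math.hypot(x2 - x1, y2 - y1) / dt
             for (x1, y1), (x2, y2) in zip(points_xy, points_xy[1:])]
    return statistics.median(radii), max(steps)  # CEP (m), worst apparent speed (m/s)
```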
There is a way that this can be done, as has been proved by the tractor manufacturers, who can position seed drills and agricultural sprayers to centimetre-level accuracy. These systems use Differential GPS, where there is a fixed reference station positioned in the neighbourhood of the tractor being controlled. This reference station transmits error corrections to the mobile unit, allowing it to correct its reported position to a high degree of accuracy. Unfortunately this sort of positioning system is very expensive.

iBeacons: bearing to beacon?

Partly a coding problem, partly math problem.
Q1. I have an iOS device with compass active. If it knows I'm moving through the field of an iBeacon - or the Beacon is moving through my detection range - would it be possible for a phone to work out (roughly) the relative direction/bearing of that beacon with a series of readings by comparing signal strengths? Has anyone had a try at this?
Q2. Would it be possible to change the Major and Minor values of a beacon regularly (eg: every second) to pass small pieces of info - such as a second user's Bearing and Course?
Q1. It MIGHT be possible but you would need a controlled environment. Either the beacon or the phone needs to be fixed. You also need to be in an area with no obstructions or sources of radio interference.
Then you'd need to use the signal strength (which is sloppy and varies by a fair amount) as one input, and the device's heading info (which is also grossly inaccurate), and do some pretty gnarly math on it.
Assuming you could work out the math, the slop in the input readings might make the results too iffy to be useful. (For example, how would you distinguish moving directly towards the beacon from moving 30 degrees to one side or the other? The signal strength would still increase, just not as quickly.)
And your algorithm would have to deal with edge cases like moving along a circle around the beacon. In that case the signal strength should not change.
My gut is that even with clever algorithms, the input data is just too unreliable to make much sense out of it beyond "getting warmer" and "getting colder."
As mentioned above, you'd have to track your device's movement within the field, including distance covered and direction, then with multiple readings of signal strength you could theoretically calculate relative direction to the beacon to some degree of accuracy.
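As a rough sketch of that idea, assuming you can track your own displacement (dead reckoning, which is itself a big assumption) and convert RSSI to distance with an assumed path-loss model, a coarse grid search can pick the beacon position that best fits the readings, and from that a bearing. Given the RSSI noise discussed above, expect large errors:

```python
# Sketch: estimate a rough bearing to a beacon from several (x, y, rssi)
# samples taken while walking. x/y are metres of your own movement relative
# to the start point (from dead reckoning -- an assumption). RSSI is turned
# into a distance with an assumed path-loss model, then a coarse grid search
# finds the beacon position whose distances best match the measurements.
import math

def rssi_to_distance(rssi, tx_power=-59, n=2.0):
    return 10 ** ((tx_power - rssi) / (10.0 * n))

def bearing_to_beacon(samples, search_radius=30.0, step=0.5):
    """samples: list of (x, y, rssi). Returns (bearing_deg, beacon_xy)."""
    dists = [(x, y, rssi_to_distance(r)) for x, y, r in samples]
    best, best_err = None, float("inf")
    n_steps = int(2 * search_radius / step)
    for i in range(n_steps + 1):
        for j in range(n_steps + 1):
            bx = -search_radius + i * step
            by = -search_radius + j * step
            err = sum((math.hypot(bx - x, by - y) - d) ** 2 for x, y, d in dists)
            if err < best_err:
                best, best_err = (bx, by), err
    x0, y0, _ = samples[-1]                       # bearing taken from latest position
    bearing = math.degrees(math.atan2(best[0] - x0, best[1] - y0)) % 360  # 0 = +y axis
    return bearing, best
```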
As to your second question about changing the minor version number, I have not seen any beacon APIs that allow that, either from the beacon manufacturers or from Apple's implementation.
However, a typical beacon is an ARM or other low power processor with a BLE transceiver, running a program. In theory it should be possible to create your own iBeacon transmitter that changed one of the parameters in order to transmit changing information. You'd have to set up the iOS device with the beacon region only specifying the UUID or UUID and major ID (depending on whether you wanted to change just the minor or change both the major and minor ID in order to transmit changing information.)
Note, too, that iBeacons are a special case of BLE, and the BLE standard does support the sending of arbitrary, changing data. You might be better off implementing your own BLE scheme either instead of or in addition to iBeacons.
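For illustration only, this is roughly how the manufacturer-specific data of an iBeacon advertisement is laid out, with the minor field carrying a changing value (here a coarse heading). Actually getting it on the air needs a BLE peripheral stack or dedicated beacon hardware, which this sketch doesn't cover:

```python
# Sketch: build the manufacturer-specific data payload of an iBeacon
# advertisement with the minor field carrying a changing value.
# Layout: Apple company ID (0x004C, little-endian), iBeacon type 0x02,
# payload length 0x15, 16-byte proximity UUID, 2-byte major, 2-byte minor
# (both big-endian), 1-byte signed measured power.
import struct
import uuid

APPLE_COMPANY_ID = b"\x4c\x00"
IBEACON_PREFIX = b"\x02\x15"

def ibeacon_payload(proximity_uuid, major, minor, measured_power=-59):
    """Return the manufacturer-specific data bytes for one advertisement."""
    return (APPLE_COMPANY_ID + IBEACON_PREFIX
            + proximity_uuid.bytes
            + struct.pack(">HHb", major, minor, measured_power))

def heading_as_minor(heading_degrees):
    """Quantise a 0-360 degree heading so it fits the 16-bit minor field."""
    return int(heading_degrees) % 360

if __name__ == "__main__":
    region_uuid = uuid.uuid4()   # in practice, a fixed UUID both sides agree on
    for heading in (0, 95, 270):
        payload = ibeacon_payload(region_uuid, major=1, minor=heading_as_minor(heading))
        print(heading, payload.hex())
```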

Determining local horizon using USGS topographical data?

Using USGS' topographical data (DEM), you can determine what your
local horizon looks like. Has anyone done this?
Example: if the parcel of land 50 feet away from you has a 10 foot
higher elevation, it will subtend a horizon-blocking angle of 11.31
degrees (the arctangent of 10 feet over 50 feet).
The horizon-blocking topography isn't always adjacent: a large
mountain several miles away may block more of your horizon than nearby
topography. Caveats:
For more distant items, you'd also have to compensate for the Earth's
curvature.
USGS only measures average elevation for a parcel of land, so the
results will be approximate.
The results also won't include man-made structures, trees, or other
non-topographical elements.
Our eyes are ~4-5 feet above ground level, and you'd have to compensate for that.
Nonetheless, this all seems quite do-able, so I'm guessing someone has
done it?
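To make the geometry concrete, here is a minimal sketch of the computation, assuming the DEM is already loaded as a grid of elevations with a known cell size; the eye-height and curvature caveats above are included, trees and buildings are not:

```python
# Sketch of the horizon computation described above. Assumes the DEM is a
# 2D array of elevations (metres) on a square grid of cell_size metres, the
# observer sits at grid cell (row, col), and eyes are eye_height metres above
# the ground. A d^2 / (2R) term approximates the Earth-curvature drop.
import math
import numpy as np

EARTH_RADIUS = 6_371_000.0  # metres

def horizon_profile(dem, row, col, cell_size, eye_height=1.5, n_azimuths=360):
    """Return the blocking elevation angle (degrees) for each azimuth.

    A slightly negative angle means nothing blocks that direction and the
    horizon dips below eye level.
    """
    eye_elev = dem[row, col] + eye_height
    n_rows, n_cols = dem.shape
    max_steps = int(math.hypot(n_rows, n_cols))
    angles = np.full(n_azimuths, -90.0)
    for a in range(n_azimuths):
        az = math.radians(a)                       # 0 = grid north (row 0 side)
        for step in range(1, max_steps):
            ri = int(round(row - step * math.cos(az)))
            ci = int(round(col + step * math.sin(az)))
            if not (0 <= ri < n_rows and 0 <= ci < n_cols):
                break
            dist = step * cell_size
            drop = dist ** 2 / (2 * EARTH_RADIUS)  # curvature correction
            angle = math.degrees(math.atan2(dem[ri, ci] - eye_elev - drop, dist))
            if angle > angles[a]:
                angles[a] = angle
    return angles

if __name__ == "__main__":
    dem = np.zeros((101, 101))
    dem[50, 60:] = 3.05            # a ~10 ft ridge starting 10 cells to the east
    profile = horizon_profile(dem, row=50, col=50, cell_size=1.524)
    print("blocking angle due east:", round(profile[90], 2), "deg")
```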
I noted that no one has responded. I've been casually interested in how one could determine a local horizon, and found your question. A bit of Googling turned up the following quite useful URL, http://www.gitta.info/TerrainAnalyi/en/html/VisibilAppls_learningObject6.html, brought to us by the Geographic Information Technology Training Alliance.

Measuring distance - Windows phone app

Good evening all.
I'm just trying to collate some ideas really and was wondering if I could pick some brains.
I'd like to develop an app that relies upon measuring distance reasonably accurately. So, for example, I have a central point and I want to be able to detect whether the phone is within a radius of a meter of it.
How could I achieve this?
The points would be static but I don't think GPS would be accurate enough to rely on this solely.
I'm definitely not a hardware chap but is there a way of combining GPS and some other sort of transmitter to ensure accuracy?
Any help or suggestions greatly appreciated.
One-meter accuracy? It's probably not going to happen with any phone hardware out there - definitely not with any Windows Phone. GPS isn't accurate enough without a differential beacon, and phones don't have the hardware to receive that (and I doubt you have a differential transmitter either).
The location service on the phone (assuming high accuracy is selected) combines data from GPS, cell towers and WiFi hot spots to provide a location.
There is no way to include the use of other sensors to improve this data.
You also won't be able to get the level of accuracy you're after from the phone. It's just not designed for the purpose you describe.
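For completeness, the distance check itself is simple (sketched below in Python; the same haversine math applies in C# on the phone). The hard part, as said, is that a phone's reported fix is usually several metres off, so a 1 m radius test against GPS alone will misfire:

```python
# Sketch of the geometry only: great-circle (haversine) distance between the
# fixed centre point and the reported phone position, then a radius check.
# With typical phone GPS error of several metres, a 1 m geofence built on
# this alone will produce many false positives and negatives.
import math

EARTH_RADIUS = 6_371_000.0  # metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS * math.asin(math.sqrt(a))

def within_radius(center, fix, radius_m=1.0):
    return haversine_m(*center, *fix) <= radius_m

if __name__ == "__main__":
    center = (51.5007, -0.1246)
    fix = (51.50071, -0.12461)          # ~1.3 m away
    print(round(haversine_m(*center, *fix), 2), within_radius(center, fix))
```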
