I would like to create an indoor positioning system. I want the system to collect every beacon's signal and build a map automatically. However, I know that far-away beacons may not be detectable. Is it therefore possible to discover a far-away beacon through another beacon? In other words, can beacons relay their signals through other beacons?
Sorry, but no, it is not possible to use a beacon to relay signals of more distant beacons. Bluetooth beacons are extremely simple devices that just transmit a unique identifier. They are transmit only, and therefore completely unaware of other beacons around them.
Is it possible to have Eddystone-URL provide functionality similar to NFC, so that only a user in very close proximity can get the URL?
I've been testing with the eddystone-beacon library on an Intel Bluetooth 4 enabled Wi-Fi card and can send the signal successfully. But I find that I can receive the signal from far (20+ m) away, when I'd like to limit it to within one meter.
The library has an option to attenuate the power (txPowerLevel: -22, // override TX Power Level), but I find that changing this only affects the distance calculation, not the ability to receive the signal.
Is this perhaps an issue with the hardware (maybe a dedicated USB adapter would allow control)?
Eddystone-URL is not designed to work this way using Google's standard services. However, it is possible to do what you want if you have a dedicated app on the mobile device that detects the beacon.
If this is an option for you, then you won't want to reduce the transmitter power on your hardware device. Even if you get hardware that allows this, sending a very weak signal will lead to unpredictable minimum detection ranges of 3 feet or more on devices with strong receivers, and no detections at all (even when touching the beacon) on devices with weak receivers.
Instead, leave it at the maximum transmission power and then filter for a strong RSSI on the receiving device, showing the detection only when the RSSI meets a threshold. You'll still have trouble with varying strengths of receivers, but it is much more predictable. I have used this technique combined with a device database that tracks the strongest signal level seen for a device model, so I know what RSSI a specific device model will detect when it is right next to the beacon.
If you are game for this approach, you can use the Android Beacon Library to detect Eddystone-URL for your app on Android devices and the iOS beacon tools on iOS devices.
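For illustration only, here is a minimal sketch of that RSSI-threshold approach on iOS using CoreBluetooth rather than any particular library. The -45 dBm cutoff is an assumption to be calibrated per device model, as described above.

```swift
import CoreBluetooth

/// Scans for Eddystone advertisements and only reports beacons whose RSSI
/// suggests the phone is right next to them.
final class NearbyEddystoneScanner: NSObject, CBCentralManagerDelegate {
    private let eddystoneUUID = CBUUID(string: "FEAA")   // Eddystone service UUID
    private let nearThresholdDBm = -45                   // hypothetical "arm's reach" cutoff; calibrate per device
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Allow duplicate reports so the RSSI keeps updating as the user moves.
        central.scanForPeripherals(withServices: [eddystoneUUID],
                                   options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        // 127 means "RSSI unavailable"; ignore those readings, and drop weak ones.
        guard RSSI.intValue != 127, RSSI.intValue >= nearThresholdDBm else { return }
        if let serviceData = advertisementData[CBAdvertisementDataServiceDataKey] as? [CBUUID: Data],
           let frame = serviceData[eddystoneUUID], frame.first == 0x10 {
            // 0x10 is the Eddystone-URL frame type; decode the URL from `frame` here.
            print("Beacon within touch range, RSSI \(RSSI)")
        }
    }
}
```

In practice you would replace the fixed threshold with a per-model value looked up from the device database mentioned above.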
I'm sure we have all heard of Apple's iBeacon by now. We've been working on a few projects using the technology and have been wondering about one usage I have seen others promoting: using Bluetooth LE radios to create a dwell-time heat map of a space.
The concept sounds simple enough: place an LE beacon in an area and, as people pass by, it "counts" each person; the counts are then overlaid on a store map to show traffic patterns. That's the claim, and I'm trying to figure out how it can be possible.
The concept uses the passerby's mobile device as the "trigger" for the count. There is no way at all to achieve this without the user having a certain app installed on their device, correct? The only feasible way I can see it working is if the user has an app installed that pings a web server every time it sees a beacon, and those pings are then mapped. But that also uses data and battery on the mobile device, which will most likely lead the user to delete the app before long.
It also leaves a large number of passersby unaccounted for, making the results very difficult to quantify.
Am I wrong in this assumption? Is there something that I'm missing?
Your analysis of the possibilities and challenges of the technology is largely correct. My company, Radius Networks, has done similar traffic visualizations for large events.
A few points:
Even if most users do not have an app on their phone, the data are still valuable if there are enough to provide a representative statistical sample.
When using iBeacons for this purpose, you must settle for quite coarse-grained locations, for two reasons:
The range of Bluetooth LE is about 50 meters.
Assuming the users will only be passively running the app in the background, beacon detection can take minutes on iOS.
Combining the two challenges above, you can really only use the technology to do this for very large venues.
The battery drain is not really a problem if the phone only wakes up every few minutes to report a beacon detection to a server.
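As a rough sketch of that report-to-a-server pattern on iOS (the beacon UUID and endpoint are placeholders, and a real deployment would need the usual location-permission Info.plist entries plus batching and authentication of the requests):

```swift
import CoreLocation

/// Background beacon monitoring that reports each region-entry event to a server.
final class DwellReporter: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let endpoint = URL(string: "https://example.com/api/sightings")!  // hypothetical endpoint

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()  // background monitoring needs "Always" permission
        let uuid = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!  // placeholder beacon UUID
        let region = CLBeaconRegion(uuid: uuid, identifier: "store-entrance")
        manager.startMonitoring(for: region)  // iOS wakes the app on entry/exit, even when backgrounded
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // One small POST per entry event keeps battery and data usage negligible.
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try? JSONEncoder().encode([
            "region": region.identifier,
            "timestamp": ISO8601DateFormatter().string(from: Date())
        ])
        URLSession.shared.dataTask(with: request).resume()
    }
}
```

This is only an illustration of the reporting flow; the heat map itself would be built server-side from the accumulated sightings.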
When leveraging the kCLLocationAccuracyHundredMeters constant, what location data is being used and where is that data stored when the OS pulls the data? The "last known location" used to be stored in cache.plist, but since iOS 4.2.8 that is no longer the case.
I am trying to gain a better understanding of how applications determine a device's location. The Core Location Framework allows calls for location-related data but hides the gritty details behind the API.
It's not the apps that determine the device's location but the device itself (hardware and software). The device in turn can inform the app about its location via the Core Location API. Location is not tracked automatically, but you can ask the device to track it (again, through the API).
The rationale behind constants like kCLLocationAccuracyHundredMeters is that increasing localization accuracy costs increasing amounts of computing power. There is no point in finding your position to an accuracy of 10 meters if all you need is 100 meters.
Internally, there are two sources of data used to locate the device:

1. A GPS chip built into the device, which receives timing signals from GPS satellites orbiting the Earth. Because the signals arrive at different times (each satellite's distance to the device is different and the speed of light is finite), the position can be calculated.

2. A database of visible Wi-Fi SSIDs with their known positions, from which the device's proximity to those positions can be inferred.
That said, for you as an app developer it should be largely irrelevant how the respective data is stored internally in the device. All you need to know is the API.
When you assign kCLLocationAccuracyHundredMeters to the desiredAccuracy property of your CLLocationManager, what you are actually doing is suggesting to the manager the level of accuracy you want on the locations it reports to you. The CLLocationManager can be a gentleman and provide you with the desired accuracy, but sometimes it is simply not possible for it to do so. What you need to do in your delegate, when you receive the CLLocation objects, is look at the date the location was captured and at its horizontal/vertical accuracy properties. This will allow you to determine how accurate the data really is.
The CLLocationManager's location property will give you the last recorded location, even before reporting location changes to its delegate.
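As a minimal sketch of the pattern described above (the 30-second staleness cutoff and the 100-meter acceptance threshold are illustrative assumptions, not fixed rules):

```swift
import CoreLocation

final class CoarseLocator: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyHundredMeters  // a hint to the system, not a guarantee
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()

        // The last recorded fix is available immediately, before any delegate callback.
        if let cached = manager.location {
            print("Cached fix from \(cached.timestamp), ±\(cached.horizontalAccuracy) m")
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        // Vet each fix: reject stale or low-accuracy readings before trusting it.
        let age = Date().timeIntervalSince(fix.timestamp)
        guard age < 30, fix.horizontalAccuracy > 0, fix.horizontalAccuracy <= 100 else { return }
        print("Usable fix: \(fix.coordinate), ±\(fix.horizontalAccuracy) m")
    }
}
```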
Any idea what the average user's download speed is? I'm working on a site that streams video and am trying to figure out the average download speed so I can determine what quality to serve.
I know I might be comparing apples with oranges, but I'm just looking for something to use as a starting point.
Speedtest.net has a lot of stats broken down by country, region, city and ISP. Not sure about accuracy, since it's only based on the people using their "bandwidth measurement" service.
It would depend on the geography that you are targeting. For example, in India, you can safely assume it would be a number below 256kbps.
Try attacking it from the other angle. Look at streaming services that cater to the customers you want and have significant volume (maybe YouTube) and see what they're pushing. You'll find there's a pretty direct correlation between Alexa rating (popularity) and quality (minimum bitrate required). Vimeo will always have fewer users than YouTube because the user experience is poor for low-bitrate users.
There are many other factors, and this should only form one small facet of your bandwidth decision, but it's a useful comparison to make.
Keep in mind, however, that you want to degrade gracefully. As more and more sites come online you'll start bumping into ISPs that limit total transfer, so being able to tell your customers how much of their bandwidth your site is consuming is useful, as is being able to advertise that you are a low-bandwidth site.
Further, more and more users are on portable cellular connections (e.g., the iPhone), where limited bandwidth is a big deal. AT&T has oversold many markets, so being able to deliver useful video through a tiny link will let you capture a market that Vimeo and Hulu cannot.
Quite frankly, though, the best thing to do is degrade gracefully on the fly. Measure the bandwidth of the connection continuously and adjust the bitrate as needed for smooth playback with good audio. Then you can serve users across the whole gamut.
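As an illustrative sketch of that selection logic only (the bitrate ladder, the 0.8 safety margin, and the measured figure below are assumptions, not from this answer; production players built on HLS/DASH handle this for you):

```swift
// Pick the highest rendition whose bitrate fits comfortably inside the
// measured throughput, leaving headroom so playback stays smooth.
struct Rendition {
    let name: String
    let bitrateKbps: Double
}

let ladder = [
    Rendition(name: "240p", bitrateKbps: 300),
    Rendition(name: "360p", bitrateKbps: 700),
    Rendition(name: "480p", bitrateKbps: 1_500),
    Rendition(name: "720p", bitrateKbps: 3_000),
]

/// Throughput is typically estimated from how fast the last few media segments downloaded.
func chooseRendition(measuredKbps: Double) -> Rendition {
    let budget = measuredKbps * 0.8   // keep 20% headroom for jitter
    return ladder.last(where: { $0.bitrateKbps <= budget }) ?? ladder[0]
}

print(chooseRendition(measuredKbps: 1_900).name)  // prints "480p" for ~1.9 Mbps measured
```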
-Adam
You could try looking at the lower tier offerings from AT&T and Comcast. Probably 1.5 Mbps for the basic level (which I imagine most people get).
The "test your bandwidth" sites may have some stats on this, too.
There are a lot of factors involved (server bandwidth, local ISP, the network in between, etc.) which make it difficult to give a hard answer. With my current ISP, I typically get 200-300 kB/s, although when the planets align I've gotten as much as 2 MB/s (the "quoted" peak downlink speed). That was with parallel streams, however; the peak bandwidth I've achieved on a single stream is 1.2 MB/s.
The best strategy is always to give your users options. Why don't you start the stream at a low bitrate that will work for everyone and provide a "High Quality" link for those of us with FTTH connections? I believe YouTube has started doing this.
According to CWA, the average US resident has a 1.9Mbps download speed. They have data by state, so if you have money then you can probably get a more specific report for your intended audience. Keep in mind, however, that more and more people are sharing this with multiple computers, using VOIP devices, and running background processes that consume bandwidth.
-Adam
Wow.
This is so dependent on the device, connection method, connection type, ISP throttling, etc. involved in the end-to-end link.
Trying to work out an average speed would be nearly impossible.
Think of a fat pipe at home (8 Gb plus) versus the bad wireless connection provided for free at the airport (9.6 kb), and you can start to get an idea of the range of connections you're trying to average over.
Then we move onto variations in screen sizes and device capabilities.
Maybe trawl the UA strings of incoming connections to get an idea of the capabilities of the user devices out there.
Maybe also use some sort of geolocation solution to see where people are connecting from, to get an idea of their connection capabilities as well.
Are you offering the video in a fixed format, i.e. X x Y pixel size?
HTH.
cheers,
Rob
If I'm using your site, "average" doesn't matter. All I care about is MY experience, so you either need to make the site adaptive, design for a pretty low speed (an iPhone 2G gets you 70-80 kbps if you're lucky, to take one common case), or be very clear about the requirements so I can decide whether or not my connection-of-the-moment will work.
What you don't want to subject your users to is unpredictably choppy, intermittent video and audio.