Could someone please tell me if there is a way to measure battery performance under different advertising intervals on Kontakt beacons (set to either the iBeacon or Eddystone frame format)?
If yes, can you please tell me how I can graphically represent battery performance against advertising interval?
I don't know what kind of battery Kontakt beacons use, but I can help you with this technical report. Aislelabs explains how the scan interval affects battery drain, and also the impact of the advertising interval on battery drain.
Impact of advertising interval on battery drain
Different beacons have different advertising intervals. A short advertising interval (say 100 msec) by Beacon A versus a longer one (say 500 msec) by Beacon B implies that in one second of scanning, Beacon A will be detected 10 times and Beacon B twice by the phone.
Multiple detections of a beacon within the same scan are desirable, as they aid in smoothing the signal readings. These signal readings are critical to assessing the precise distance of the beacon from the phone. At the same time, multiple signals mean more work required to process the signals, and hence more battery consumption.
So as we reduced the advertising interval of the beacon, we noticed modestly higher battery use for our three handsets.
Other research: "An Analysis of the Accuracy of Bluetooth Low Energy for Indoor Positioning Applications", available here.
BLE advertising beacons are particularly attractive to retailers because of the promise of long battery lives of many years, and so low maintenance requirements. Long battery lives are expected to require low radio power output and/or low beaconing rates. While this does not affect their use for proximity detection, it does affect their usefulness for providing fingerprint-based positioning throughout an entire indoor environment.
The beaconing rate affects the battery life of the BLE beacon, and for most iBeacon purposes will typically be set to an advertising rate of a few Hertz.
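To actually plot battery performance against advertising interval, you can measure (or model) the beacon's average current draw at each interval, convert it to expected battery life, and chart the resulting pairs. Below is a minimal Java sketch of such a model; the sleep current, per-advertisement charge, and battery capacity are assumed placeholder values, not Kontakt specifications, so replace them with datasheet figures or your own measurements. It prints CSV you can plot in any spreadsheet tool, with advertising interval on the x-axis and estimated battery life on the y-axis.

```java
// Rough model of beacon battery life as a function of advertising interval.
// ALL electrical constants are assumed placeholders, not Kontakt specs --
// replace them with datasheet values or your own current measurements.
public class BeaconBatteryModel {

    static final double BATTERY_CAPACITY_MAH = 240.0;  // e.g. a CR2477 coin cell (assumed)
    static final double SLEEP_CURRENT_MA     = 0.001;  // ~1 uA idle current (assumed)
    static final double EVENT_CHARGE_UC      = 15.0;   // charge per advertising event, in microcoulombs (assumed)

    /** Estimated battery life in days for a given advertising interval in milliseconds. */
    static double estimatedLifeDays(double intervalMs) {
        double eventsPerSecond = 1000.0 / intervalMs;
        // Microcoulombs per second equal microamps; divide by 1000 for milliamps.
        double advertisingCurrentMa = eventsPerSecond * EVENT_CHARGE_UC / 1000.0;
        double averageCurrentMa = SLEEP_CURRENT_MA + advertisingCurrentMa;
        double lifeHours = BATTERY_CAPACITY_MAH / averageCurrentMa;
        return lifeHours / 24.0;
    }

    public static void main(String[] args) {
        // CSV output: plot interval (x) against estimated life (y).
        System.out.println("interval_ms,estimated_life_days");
        for (int interval = 100; interval <= 1000; interval += 100) {
            System.out.printf("%d,%.1f%n", interval, estimatedLifeDays(interval));
        }
    }
}
```

With a model like this you would expect battery life to grow roughly linearly with the interval, flattening once the sleep current dominates the advertising current.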
We are working on a project right now where we're using coin cell batteries (3 × CR2032). The device is connected to the application via Bluetooth, and we show the battery percentage in the app (we take a reading by turning the ADC on, and turn it off once the reading is taken, which saves battery life).
My question is: how do we display the percentage in the application throughout the life of the battery?
E.g. 3.2 V - 100%
3.0 V - 80%
2.8 V - 60%
These values are exaggerated, just to show what I'm trying to get at here.
Coin cell batteries discharge quickly from 3.2 V to 2.9 V and then discharge very slowly. We want to show readings that account for the nature of a coin cell battery, e.g. from 3.2 V to 2.9 V we show a reduction of only 4-5%, and then do the rest of the calculation according to the slower rate.
Please suggest calculations which we can implement in our code.
We are currently following this, but it doesn't make sense to me:
https://devzone.nordicsemi.com/question/37130/battery-level-discharge-curve-how-can-i-read/
At 2.9 V, if we show less than half of the battery in the app, the user would be confused as to why the battery drained so quickly even though they hardly used the device.
If the device consumes the same current all the time, you could build the discharge diagram yourself (in your link, the discharge diagram is built for a specific load, a 15 kΩ resistor).
After that, you could fit a polynomial to get a function that gives the remaining battery capacity as a function of time, and use the inverse function to predict how much running time is left.
More generally, you could collect information about battery life: if your device talks to a remote server, you could send statistics to it and analyze the received data later.
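As a concrete sketch of that idea, here is a piecewise-linear lookup table in Java: sample your measured discharge curve at a few voltages and interpolate between them. The breakpoints below are invented to match the shape you describe (3.2 V down to 2.9 V mapped to only a few displayed percent, then a slow tail); replace them with points read off your own discharge measurements.

```java
// Piecewise-linear mapping from measured battery voltage to displayed percent.
// Breakpoint values are ASSUMED for illustration -- derive them from your own
// discharge curve (voltage sampled at known remaining-capacity points).
public class BatteryGauge {

    // Descending voltages and the percentage to display at each one.
    private static final double[] VOLTS   = {3.2, 2.9, 2.7, 2.5, 2.2};
    private static final double[] PERCENT = {100,  95,  60,  20,   0};

    public static double voltageToPercent(double v) {
        if (v >= VOLTS[0]) return 100.0;
        if (v <= VOLTS[VOLTS.length - 1]) return 0.0;
        for (int i = 1; i < VOLTS.length; i++) {
            if (v >= VOLTS[i]) {
                // Linear interpolation within the segment [VOLTS[i-1], VOLTS[i]].
                double t = (VOLTS[i - 1] - v) / (VOLTS[i - 1] - VOLTS[i]);
                return PERCENT[i - 1] + t * (PERCENT[i] - PERCENT[i - 1]);
            }
        }
        return 0.0;
    }

    public static void main(String[] args) {
        System.out.println(voltageToPercent(3.0)); // ~96.7 with the assumed table
    }
}
```

With this table, the steep 3.2-2.9 V region costs only 5 displayed percent, and the long flat region is stretched over the rest of the scale; a polynomial fit, as suggested above, is a smoother alternative once you have enough curve data.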
Is it possible (and if so, how) to have Eddystone-URL provide functionality similar to NFC, so that only a user within close proximity can get the URL?
I've been testing with the eddystone-beacon library on an Intel Bluetooth 4 enabled WiFi card and can send the signal successfully. But I find that I can receive the signal from far away (20+ m), when I'd like to limit it to within one meter.
The library has an option to attenuate the power (txPowerLevel: -22, // override TX Power Level), but I find that changing this only affects the distance calculation, not the ability to receive the signal.
Is this perhaps an issue with the hardware (maybe a dedicated USB dongle would allow this control)?
Eddystone-URL is not designed to work this way using Google's standard services. However, it is possible to do what you want if you have a dedicated app on the mobile device that detects the beacon.
If this is an option for you, then you won't want to reduce the transmitter power on your hardware device. Even if you get hardware that allows this, sending a very weak signal will lead to unpredictable minimum detection ranges of 3 feet or more on devices with strong receivers, and no detections at all (even when touching the beacon) on devices with weak receivers.
Instead, leave it at the maximum transmission power and then filter for a strong RSSI on the receiving device, showing the detection only when the RSSI meets a threshold. You'll still have trouble with varying strengths of receivers, but it is much more predictable. I have used this technique combined with a device database that tracks the strongest signal level seen for a device model, so I know what RSSI a specific device model will detect when it is right next to the beacon.
If you are game for this approach, you can use the Android Beacon Library to detect Eddystone-URL for your app on Android devices and the iOS Beacon tools on iOS devices.
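For illustration, here is a minimal Android sketch of that RSSI-threshold technique using recent versions of the Android Beacon Library. The -50 dBm threshold is an assumed starting point that you would tune per device model, ideally from the kind of per-model signal database described above.

```java
import org.altbeacon.beacon.Beacon;
import org.altbeacon.beacon.BeaconManager;
import org.altbeacon.beacon.BeaconParser;
import org.altbeacon.beacon.Region;

// Detect Eddystone-URL beacons, but only surface them when the received
// signal is strong enough to imply the phone is within arm's reach.
public class NearOnlyBeaconScanner {

    private static final int NEAR_RSSI_THRESHOLD_DBM = -50; // ASSUMED; tune per device model

    public void startScanning(android.content.Context context) {
        BeaconManager beaconManager = BeaconManager.getInstanceForApplication(context);
        beaconManager.getBeaconParsers().add(
                new BeaconParser().setBeaconLayout(BeaconParser.EDDYSTONE_URL_LAYOUT));

        beaconManager.addRangeNotifier((beacons, region) -> {
            for (Beacon beacon : beacons) {
                if (beacon.getRssi() >= NEAR_RSSI_THRESHOLD_DBM) {
                    // Close enough -- show the URL to the user here.
                }
            }
        });
        beaconManager.startRangingBeacons(new Region("near-url-region", null, null, null));
    }
}
```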
How can I receive a time signal with an unmodified RTL-SDR USB TV Dongle here in Europe?
RTL-SDR dongles are able to receive the frequency range 52-2200 MHz.
Here in Europe, radio-controlled clocks receive DCF77, a time signal broadcast on 77.5 kHz, but as 77.5 kHz is far below 52 MHz, that's out.
The GPS L1 signal is at 1575.42 MHz, so that's within the dongle's range, but the signal is way too weak to be received with the TV antenna. An active GPS antenna is needed, and to provide the antenna with power, I'd need to make some modifications to the electronics which I don't really want to do.
In the old days of analog TV broadcasting, we had Teletext/Videotext here in Germany, which contained a time signal, but those times are long gone.
ADS-B reception with a dongle works like a charm, but unfortunately they did not put in time or date bits into the data packets.
So: does anybody have any idea where, in the spectrum an unmodified RTL-SDR dongle can receive, there is a time signal that could be easily decoded?
I'm well aware that getting the time over the network via NTP, or from a GPS modem via NMEA 0183, would be much easier, but I'm curious and just want to play around with the dongle a bit. Precision is not important; +/- 2 seconds is fine. And I'd like to do it the SDR way, so using the dongle as originally intended (as a DVB-T receiver with the original software) defeats the purpose (i.e. learning and DIY).
The GPS L1 signal is at 1575.42 MHz, so that's within the dongle's range, but the signal is way too weak to be received with the TV antenna. An active GPS antenna is needed, and to provide the antenna with power, I'd need to make some modifications to the electronics which I don't really want to do.
Well, first of all, GPS is really weak, but it still works below the noise floor; that's something important to realize. I've seen more than once that people worry because they can't see GPS on a PSD display. You won't; you'll need signal processing to recover it from all the noise.
The modifications aren't all that complicated; basically, you need a capacitor to pass the AC component through to the RTL dongle, and a voltage source to feed the active antenna; the required component is usually called a bias-T.
Nevertheless, an active antenna will be necessary – your RTL dongle probably won't have a Noise Figure low enough to receive GPS signals on its own.
In the old days of analog TV broadcasting, we had Teletext/Videotext here in Germany, which contained a time signal, but those times are long gone.
True; I haven't looked at local FM stations, but RDS might be the way to go – it can contain a clock/time signal; the German Wikipedia claims that mainly publicly owned stations transmit that information field.
Have a look at gr-rds; it's a GNU Radio implementation of the Radio Data System. If you don't have a working GNU Radio installation (yet), you might try out the GNU Radio LiveSDR Image, which contains a ready-to-use gr-rds.
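If you end up working with the demodulated bits directly instead of the full gr-rds application, the clock/time information is carried in RDS group type 4A. Here is a Java sketch of the unpacking, assuming you already have the three 16-bit information words (blocks B, C and D) of a 4A group from your decoder; the bit layout follows the published RDS standard (EN 50067), but verify it against your decoder's output.

```java
// Decode RDS group 4A (clock-time, CT) from its raw 16-bit blocks.
// Layout per the RDS standard (EN 50067): 17-bit Modified Julian Date,
// 5-bit UTC hour, 6-bit minute, signed local time offset in half-hours.
public class RdsClockTime {

    public static String decode(int blockB, int blockC, int blockD) {
        int mjd = ((blockB & 0x3) << 15) | ((blockC >> 1) & 0x7FFF);
        int hour = ((blockC & 0x1) << 4) | ((blockD >> 12) & 0xF);
        int minute = (blockD >> 6) & 0x3F;
        int offsetHalfHours = blockD & 0x1F;
        if ((blockD & 0x20) != 0) offsetHalfHours = -offsetHalfHours;

        // Standard MJD-to-calendar conversion, as given in the RDS spec.
        int yp = (int) ((mjd - 15078.2) / 365.25);
        int mp = (int) ((mjd - 14956.1 - (int) (yp * 365.25)) / 30.6001);
        int day = mjd - 14956 - (int) (yp * 365.25) - (int) (mp * 30.6001);
        int k = (mp == 14 || mp == 15) ? 1 : 0;
        int year = 1900 + yp + k;
        int month = mp - 1 - k * 12;

        return String.format("%04d-%02d-%02d %02d:%02d UTC%+.1f",
                year, month, day, hour, minute, offsetHalfHours / 2.0);
    }
}
```

Note that CT groups are only transmitted once per minute, on the minute edge, so expect to wait up to a minute for a fix; that is well within your +/- 2 second goal.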
I'd like to modify the SensorTag software from TI for the CC2650STK kit so that it speeds up the reading and transmission of the sensor values.
Do I need to modify only the sensor software (the CCS BLE sensor stack from TI) or also the Android app?
I principally need only one temperature reading, so a sub-question is: how can the other sensors be deactivated if they are not needed, or if they conflict with the higher speed of the temperature sensor?
What do you mean by "speeding up"?
There are a number of different things you might mean:
- Reduce the latency between opening the mobile app and displaying a reading.
- Refactor the mobile app to make it simpler to get new readings.
- Increase the frequency with which notifications are sent by the device, if you use it in that way.
- Change the firmware interaction with the sensors to obtain a reading.
Each of these meanings entails a different approach.
The period for each sensor is described in the User Guide that you reference and is typically between hundreds of milliseconds and one or two seconds. Do you really need readings more frequently than that? Typically each sensor will need an amount of time in order to obtain a reliable reading. This would be described in the sensor data sheet, along with options for working with the sensor.
More generally, 'speed' will be a function of the Bluetooth handshake, the throughput available over the physical radio link, the processing within the SensorTag, and the processing within the sensors. I would expect the most variable part of this to be the physical link.
It is up to the mobile app to decide which sensor services it wishes to use.
Have you studied the Software Developer's Guide, available at the same page as the BLE Stack?
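To make the firmware-vs-app split concrete: with the stock SensorTag firmware you typically don't need to touch the BLE stack at all for your use case, because each sensor service exposes a configuration characteristic (write 0x01/0x00 to enable or disable the sensor) and a period characteristic (value in 10 ms units). The sketch below shows the corresponding Android GATT writes; the UUIDs follow TI's published SensorTag attribute table, but verify them against your firmware version, and note the firmware enforces a minimum period (300 ms for the IR temperature sensor).

```java
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCharacteristic;
import android.bluetooth.BluetoothGattService;
import java.util.UUID;

// Configure the CC2650 SensorTag over GATT: enable the IR temperature
// sensor, shorten its update period, and disable an unused sensor.
// UUIDs are from TI's SensorTag attribute table -- verify for your firmware.
public class SensorTagConfig {

    static final UUID IR_TEMP_CONFIG  = UUID.fromString("f000aa02-0451-4000-b000-000000000000");
    static final UUID IR_TEMP_PERIOD  = UUID.fromString("f000aa03-0451-4000-b000-000000000000");
    static final UUID HUMIDITY_CONFIG = UUID.fromString("f000aa22-0451-4000-b000-000000000000");

    static void writeByte(BluetoothGatt gatt, UUID charUuid, byte value) {
        for (BluetoothGattService service : gatt.getServices()) {
            BluetoothGattCharacteristic c = service.getCharacteristic(charUuid);
            if (c != null) {
                c.setValue(new byte[] { value });
                gatt.writeCharacteristic(c);
                return;
            }
        }
    }

    // In a real app, issue each write only after the previous write's
    // onCharacteristicWrite callback fires; GATT writes must be serialized.
    public static void configure(BluetoothGatt gatt) {
        writeByte(gatt, IR_TEMP_CONFIG, (byte) 0x01);   // enable IR temperature sensor
        writeByte(gatt, IR_TEMP_PERIOD, (byte) 0x1E);   // 0x1E * 10 ms = 300 ms (firmware minimum)
        writeByte(gatt, HUMIDITY_CONFIG, (byte) 0x00);  // disable humidity sensor to save power
    }
}
```

This also answers the sub-question: writing 0x00 to an unused sensor's configuration characteristic keeps it powered down, so it neither drains the battery nor competes with the faster temperature notifications.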
I was working on my phone-book application and recorded the battery drain for one hour.
It drained 5% in one hour. I wanted to know what the Android standard or benchmark is for battery drain by an application over a given period of time.
There really is no answer for this, because your app had barely any effect at all on the battery during that time.
The device is on and awake, powering the processor at speed, and generating and displaying graphics. Your app is doing very few calculations compared to what is going on behind the scenes.
Battery life also varies by device, battery health, backlight level, WiFi, Bluetooth, NFC, and other factors, only one of which is your app, which sits very low on the list of power consumers.
Unless you start calculating Pi or doing other similarly intense computations, you will not see significant power consumption due to your app alone.
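If you want meaningful numbers for your own app rather than a universal benchmark, log the battery level over time during a run with your app active and compare it against a baseline run with the app closed. A minimal sketch using Android's BatteryManager (API 21+; the 5-minute sample interval is an arbitrary choice):

```java
import android.content.Context;
import android.os.BatteryManager;
import android.os.Handler;
import android.os.Looper;
import android.util.Log;

// Periodically log the device battery percentage so you can compare a run
// with your app active against a baseline run with it closed.
public class BatteryDrainLogger {

    private static final long SAMPLE_INTERVAL_MS = 5 * 60 * 1000; // every 5 minutes

    public static void start(Context context) {
        BatteryManager bm = (BatteryManager) context.getSystemService(Context.BATTERY_SERVICE);
        Handler handler = new Handler(Looper.getMainLooper());
        handler.post(new Runnable() {
            @Override public void run() {
                int percent = bm.getIntProperty(BatteryManager.BATTERY_PROPERTY_CAPACITY);
                Log.i("BatteryDrain", System.currentTimeMillis() + "," + percent);
                handler.postDelayed(this, SAMPLE_INTERVAL_MS);
            }
        });
    }
}
```

For a per-app breakdown, adb shell dumpsys batterystats and Google's Battery Historian tool give far more detail than a raw percentage log; a 5%-per-hour figure only means something relative to a baseline measured on the same device.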