CR2032 Battery Percentage

We are working on a project right now where we're using coin cell batteries (3 × CR2032). The device is connected to the application via Bluetooth, and we show the battery percentage in the app. (We take the reading by turning the ADC on, sampling, and turning the ADC off afterwards, which saves battery life.)
My question is: how do we display the percentage in the application throughout the life of the battery?
E.g. 3.2 V - 100%
3.0 V - 80%
2.8 V - 60%
These values are exaggerated; they are just to illustrate the kind of mapping I'm trying to come up with.
Coin cell batteries discharge quickly from 3.2 V to 2.9 V and then discharge very slowly. We want the displayed reading to take this into account. E.g., from 3.2 V to 2.9 V we would show a drop of only 4-5%, and then map the rest of the range at the slower rate.
Please suggest calculations that we can implement in our code.
We are currently following this, but it doesn't make sense to me:
https://devzone.nordicsemi.com/question/37130/battery-level-discharge-curve-how-can-i-read/
At 2.9 V, if we showed less than half of the battery in the app, the user would be confused as to why the battery drained so quickly even though they hardly used the device.

If the device draws the same current all the time, you could build your own discharge curve (in your link the discharge curve was measured for a specific load, a 15 kΩ resistor).
After that, you could fit a polynomial to get a function giving battery capacity as a function of time, and use the inverse function to predict how much running time is left.
More generally, you could collect information about battery life in the field: if your device talks to a remote server, you could send statistics to it and analyze the collected data later.
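
To turn the measured voltage into a displayed percentage, one workable calculation is a small breakpoint table read off the discharge curve, with linear interpolation between breakpoints. Below is a minimal C sketch; the breakpoints are made-up placeholders (echoing the 4-5% drop the question suggests for the steep region) and must be replaced with points measured under your device's actual load:

    /* Piecewise-linear mapping from measured cell voltage (millivolts) to a
     * display percentage.  The breakpoints below are placeholders: replace
     * them with points read off a discharge curve measured under YOUR load. */

    #include <stdint.h>

    typedef struct {
        uint16_t mv;   /* cell voltage in millivolts */
        uint8_t  pct;  /* remaining capacity to display */
    } vp_point_t;

    /* Hypothetical curve: the steep 3.2 V -> 2.9 V region is compressed to
     * a few percent, then a gentler slope runs down to the cutoff voltage. */
    static const vp_point_t curve[] = {
        { 3200, 100 },
        { 2900,  95 },   /* only ~5 % lost over the fast initial drop */
        { 2700,  60 },
        { 2500,  20 },
        { 2200,   0 },   /* treat anything below this as empty */
    };
    #define N_POINTS (sizeof curve / sizeof curve[0])

    uint8_t battery_percent(uint16_t mv)
    {
        if (mv >= curve[0].mv)
            return curve[0].pct;
        for (unsigned i = 1; i < N_POINTS; i++) {
            if (mv >= curve[i].mv) {
                /* Linear interpolation between breakpoints i-1 and i. */
                uint16_t dv = curve[i - 1].mv  - curve[i].mv;
                uint8_t  dp = curve[i - 1].pct - curve[i].pct;
                return curve[i].pct
                     + (uint8_t)(((uint32_t)(mv - curve[i].mv) * dp) / dv);
            }
        }
        return 0;   /* below the last breakpoint */
    }

Since a coin cell's terminal voltage also dips during radio bursts, it may help to take the ADC reading at a quiet point in the connection interval, and to low-pass filter the result (or simply never let the displayed value increase) before applying the table.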

Related

How often does the GPS return the speed information?

I am working on a mobile app that can detect activity speed through the Google Maps API.
I wonder how often the GPS returns the speed information. I would like to know the specific time interval: is it every 0.25 seconds, or every 0.1 seconds?
Any help will be much appreciated!
The speed is returned with every location fix.
A GPS receiver can be configured with the fix interval to use. In the past the minimum interval was 1 s; now some receivers provide 0.5 s, and some specialized receivers even 0.1 s.
On smartphones the minimum interval is one fix per second.
If your application has direct access to the receiver, you have to read the GPS chip manufacturer's specification to see which intervals you can use.
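
If you do have raw access to the receiver, the speed arrives bundled with every fix; on NMEA receivers it is the seventh comma-separated field of the RMC sentence (speed over ground, in knots). A minimal C sketch under that assumption:

    /* Extract speed over ground (knots) from an NMEA RMC sentence.
     * Assumes the standard field order:
     *   $GPRMC,time,status,lat,N/S,lon,E/W,speed,course,date,...
     * Returns -1.0 if the sentence is not an RMC fix. */

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    double rmc_speed_knots(const char *sentence)
    {
        if (strncmp(sentence, "$GPRMC", 6) != 0 &&
            strncmp(sentence, "$GNRMC", 6) != 0)
            return -1.0;

        int field = 0;
        for (const char *p = sentence; *p; p++) {
            if (*p == ',' && ++field == 7)  /* 7th field: speed over ground */
                return atof(p + 1);         /* atof stops at the next ',' */
        }
        return -1.0;
    }

    int main(void)
    {
        const char *fix =
            "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,,*6A";
        printf("speed: %.1f knots\n", rmc_speed_knots(fix));  /* 22.4 */
        return 0;
    }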

Maximum number of addressable LEDs to be controlled by an Arduino

I'm working on a project that involves building a big 'light wall' (h × w = 3.5 × 7 m). The wall is covered in semi-transparent fabric, and behind it strips of addressable RGB LEDs will be mounted. The strips will run vertically and function as a big display for showing primitive graphics and visuals. I plan to control the LEDs using an Arduino; the reason for this choice is that the controller must be easily programmable, (relatively) cheap, and able to work as part of different interaction-design setups.
Current version of the light wall (without addressable LEDs)
My ultimate concern is now this:
How many individual diodes can I control before the Arduino becomes inadequate?
As I see it this comes down to two points:
When does the update rate of the diodes become too slow? I want to update the picture at a rate of at least 10 Hz.
When will the Arduino run out of memory? The state of all the diodes on the light wall has to be stored for the next frame, so at some point the memory will run out. (The Mega 2560's 256 KB is flash; the frame buffer has to fit in its 8 KB of SRAM.) How many diodes can I control before this happens?
I realize that both questions depend on how the Arduino is programmed. I am very open to any tips regarding speed optimization and memory conservation.
Thanks in advance!
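
A rough feasibility check (my own numbers, not from the thread): WS2812-style strips need 3 bytes of frame buffer per LED, which must fit in SRAM, and their single-wire protocol streams at roughly 30 µs per LED. A small C sketch of the arithmetic:

    /* Back-of-the-envelope limits for WS2812-style addressable LEDs on a
     * Mega 2560.  Assumptions (mine, not from the thread): 3 bytes per LED
     * of frame buffer in SRAM, ~30 us on the wire per LED, one data pin. */

    #include <stdio.h>

    int main(void)
    {
        const unsigned sram_bytes    = 8 * 1024; /* Mega 2560 SRAM; the 256 KB is flash */
        const unsigned bytes_per_led = 3;        /* one byte each for R, G, B */
        const double   us_per_led    = 30.0;     /* 24 bits at 800 kHz */
        const double   target_hz     = 10.0;

        /* Leave ~1.5 KB of SRAM for the stack and other variables. */
        unsigned max_by_memory = (sram_bytes - 1536) / bytes_per_led;

        /* One frame must be shifted out within 1/10 s on a single pin. */
        unsigned max_by_rate = (unsigned)(1e6 / target_hz / us_per_led);

        printf("memory-limited:       ~%u LEDs\n", max_by_memory); /* ~2218 */
        printf("rate-limited (1 pin): ~%u LEDs\n", max_by_rate);   /* ~3333 */
        return 0;
    }

Under those assumptions the Mega is memory-limited to roughly 2,200 LEDs; driving several strips from different pins relaxes the 10 Hz limit, but not the SRAM one.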

iBeacon indoor map 'heat map'

I'm sure we have all heard of Apple's iBeacon by now. We've been working on a few projects using the technology and have been wondering about one usage I have seen others promoting: using the Bluetooth LE radios to create a dwell-time heat map of a space.
The concept sounds simple enough: place a BLE beacon in an area and, as people pass by, it 'counts' each person; the counts are then overlaid on a store map to reveal traffic patterns. That's the claim. I'm trying to figure out how that can be possible.
The concept uses the passerby's mobile device as the 'trigger' for the count. There is no way at all to achieve this without the user having a certain app installed on their device, correct? The only feasible way I can see it working is if the user has an app installed that pings a web server every time it sees a beacon, and those pings are then mapped. But that also uses data and battery on the mobile device, which will most likely lead the user to delete the app before long.
This also leaves a large number of passers-by unaccounted for, making the results very difficult to quantify.
Am I wrong in this assumption? Is there something that I'm missing?
Your analysis of the possibilities and challenges of the technology is largely correct. My company, Radius Networks, has done similar traffic visualizations for large events.
A few points:
Even if most users do not have an app on their phone, the data are still valuable if there are enough to provide a representative statistical sample.
When using iBeacons for this purpose, you have to accept quite coarse-grained locations, for two reasons:
The range of Bluetooth LE is about 50 meters.
Assuming the users will only be passively running the app in the background, beacon detection can take minutes on iOS.
Combining the two challenges above, you can really only use the technology to do this for very large venues.
The battery drain is not really a problem if the phone only wakes up every few minutes to report a beacon detection to a server.
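
For what it's worth, the server-side half of the scheme described in the question is plain event aggregation. A hypothetical C sketch (names and bucket sizes are mine) that bins app pings into per-beacon counts per time slot, which is what gets rendered as the heat map:

    /* Hypothetical server-side aggregation for the scheme in the question:
     * each app ping carries a beacon id and a timestamp; the heat map is
     * just counts per beacon per time slot. */

    #include <stdio.h>

    #define N_BEACONS 16
    #define SLOT_SECS (15 * 60)              /* 15-minute dwell buckets */
    #define N_SLOTS   (24 * 3600 / SLOT_SECS)

    static unsigned heat[N_BEACONS][N_SLOTS]; /* counts to colour the store map */

    void record_ping(unsigned beacon_id, unsigned secs_since_midnight)
    {
        if (beacon_id < N_BEACONS && secs_since_midnight < 24 * 3600)
            heat[beacon_id][secs_since_midnight / SLOT_SECS]++;
    }

    int main(void)
    {
        record_ping(3, 10 * 3600 + 42 * 60);  /* one shopper near beacon 3 at 10:42 */
        printf("beacon 3, 10:30-10:45 slot: %u\n",
               heat[3][(10 * 3600 + 42 * 60) / SLOT_SECS]);
        return 0;
    }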

What is the Android standard or benchmark for battery drain by an application?

I was working on my phone-book application and recorded the battery drain over 1 hour.
It drained 5% in 1 hour. I want to know whether there is an Android standard or benchmark for how much battery an application may drain in a given time.
There really is no answer for this, because your app had barely any effect at all on the battery during that time.
The device is on and awake, powering the processor at speed and generating and displaying graphics. Your app is doing very few calculations compared to what is going on behind the scenes.
Battery life also varies by device, battery health, backlight level, wifi, bluetooth, NFC, and other factors; your app is one of them, and very low on the list of power consumers.
Unless you start calculating Pi or doing other intense calculations, you will not see significant power consumption due to your app alone.

Linux driver real-time constraints

I need to build a platform for logging some sensor data, and possibly later doing some calculations on the logged data.
The Raspberry Pi seem like an interesting (and cheap!) device for this.
I have a gyroscope that can sample at 800 Hz which is equivalent to one sample every 1.25 ms.
The gyroscope has a built-in FIFO that can store 32 samples.
This means that the FIFO has to be emptied at least every 32 * 1.25 = 40 ms, otherwise samples will be dropped.
So my question is: Can I be 100% certain that my kernel driver will be able to extract the data from this FIFO within the specified time?
The gyroscope communicates with the host via I2C, and it can also trigger an interrupt pin on an "almost full" event if that would make things simpler.
But it would be easiest if I could just have a loop in the driver that retrieves the data at regular intervals.
I can live with storing the data in kernel space, and move it to user space more infrequently (no constraint on time).
I can also live with sampling the gyroscope at lower sample rates (400 or 200 Hz is acceptable).
This is with regard to the stock kernel, not the special real-time kernel, as it seems the latter is currently not supported on the Raspberry Pi.
You will need a real-time Linux environment for tight timing.
You could try Xenomai on Raspberry Pi:
http://diy.powet.eu/2012/07/25/raspberry-pi-xenomai/
However, following along this blog:
http://linuxcnc.mah.priv.at/rpi/rpi-rtperf.html (dead, and I could not find it in wayback or google cache)
It seems he was getting repeatable ±20 µs timing out of the stock kernel. As your sample period is 1250 µs (and the FIFO actually gives you a 40 ms drain deadline), you may be fine with the stock kernel if you are willing to lose a sample once in a blue moon. YMMV.
I have not tested this myself yet, but I have been reading up in an attempt to drive a WS2811 LED controller with the Raspberry Pi, and this was looking the most promising to me.
There is also the RT Linux patch: https://rt.wiki.kernel.org/index.php/Main_Page
It has at least one Pi version: https://github.com/licaon-kter/raspi-rt
However, I have run into a bunch of naysayers when looking deeper into this patch.
Your best bet is to read a millisecond timer and log, or light an LED, if you miss an interval, and then try some of the solutions. Happy hacking!
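
That missed-interval test can be prototyped from user space before writing the driver. A minimal C sketch (my own): drain the FIFO every 20 ms, so it is only about half full each time, and flag any wake-up late enough that the 32-sample FIFO would have overflowed:

    /* Periodic "FIFO drain" with deadline-miss detection, as a userspace
     * prototype of the suggested test; a real driver would hang this off a
     * kernel timer or the gyro's almost-full interrupt instead. */

    #define _GNU_SOURCE
    #include <stdio.h>
    #include <time.h>

    #define PERIOD_NS (20 * 1000 * 1000L) /* drain at ~half full: 16 x 1.25 ms */
    #define SLACK_NS  (20 * 1000 * 1000L) /* headroom before 32 samples overflow */

    int main(void)
    {
        struct timespec next, now;
        clock_gettime(CLOCK_MONOTONIC, &next);

        for (;;) {
            next.tv_nsec += PERIOD_NS;
            if (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec++;
            }
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);

            clock_gettime(CLOCK_MONOTONIC, &now);
            long late_ns = (now.tv_sec - next.tv_sec) * 1000000000L
                         + (now.tv_nsec - next.tv_nsec);
            if (late_ns > SLACK_NS)
                fprintf(stderr, "overrun: %ld us late, samples lost\n",
                        late_ns / 1000);

            /* ...read up to 32 samples from the gyro FIFO over I2C here... */
        }
    }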
