What is the Android standard or benchmark for battery drain by an application? - power-management

I was working on my phone-book application and recorded battery drain for 1 hour.
The battery dropped 5% in that hour. I wanted to know whether there is an Android standard or benchmark for how much battery an application should drain in a given amount of time.

There really is no answer for this, because your app had barely any effect on the battery during that time.
The device is on and awake, powering the processor at speed and generating and displaying graphics. Your app is doing very few calculations compared to what is going on behind the scenes.
Battery life also varies by device, battery health, backlight level, Wi-Fi, Bluetooth, NFC, and other factors. Your app is one of those factors, but it sits very low on the list of power consumers.
Until you start calculating Pi or doing other similarly intense computations, you will not see significant power consumption attributable to your app alone.

Related

Powering Onboard Computer from DJI Matrice 100

I am trying to set up my Intel NUC as the onboard computer for using the DJI OSDK on my Matrice 100. I am looking for suggestions for a way to power the NUC from the Matrice.
In the beginning, I connected a DC-DC voltage regulator to one of the XT30 ports to get 19 V and power the NUC. It worked okay for a while and I was able to fly the Matrice outside using the OSDK. But it has suddenly stopped working now. When the NUC tries to boot into Ubuntu, it shuts down abruptly.
In short, use a separate power source for the onboard PC and the sensors.
Sensors and PCs such as IR sensors, LIDARs, or a NUC typically draw power heavily. If the supply current changes, the sensor readings will change (which is bad). The PC also needs a safe voltage margin, e.g. 16 to 21 V. A sudden gust of wind can cause the drone to output maximum thrust, which might pull the bus voltage down for a couple of seconds. In that case the PC might shut down or produce wrong outputs (e.g. false odometry).
So adding a small stand-alone 5-cell battery should give you stable performance. I can't promote a specific commercial item here; for my project I bought a battery from HobbyKing to power the onboard PC and the Kinects. You can search there for a battery that suits your needs.
PS: don't use a DC-DC converter; they typically have low efficiency and are prone to power disturbances. I typically use a raw battery plus a BEC (for 5 V or 12 V sensors such as IR sensors and a Hokuyo LIDAR).

CR2032 Battery Percentage

We are working on a project right now where we're using coin cell batteries (3 x CR2032). The device is connected to the application via Bluetooth and we're showing the battery percentage in the app (we take a reading by turning the ADC on, sampling, and turning the ADC off again, which saves battery life).
My question is: how do we display the percentage in the application throughout the life of the battery?
E.g.:
3.2 V - 100%
3.0 V - 80%
2.8 V - 60%
These values are exaggerated just to illustrate what I'm trying to do.
Coin cells discharge quickly from 3.2 V down to 2.9 V and then discharge very slowly. We want the displayed reading to account for this: e.g. from 3.2 V to 2.9 V we show a reduction of only 4-5%, and then map the rest of the range at the slower rate.
Please suggest calculations which we can implement in our code.
We are currently following this but it doesn't make sense to me.
https://devzone.nordicsemi.com/question/37130/battery-level-discharge-curve-how-can-i-read/
At 2.9 V, if we show less than half of the battery in the app, the user would be confused as to why the battery drained so quickly even though they hardly used the device.
If the device consumes the same current all the time, you could build a discharge diagram for it (in your link the discharge diagram is built for a specific load, a 15 kΩ resistor).
After that, you could fit a polynomial to get a function that gives battery capacity as a function of time, and use the inverse function to predict how much running time is left.
More generally, you could collect information about battery life in the field: if your device talks to a remote server, you could send statistics to it and analyze the received data later.
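For the display itself, a piecewise-linear lookup table over the discharge curve is usually enough. Here is a minimal sketch in C; the breakpoints below are hypothetical placeholders, and you would replace them with points read off the discharge curve measured under your own load profile:

    #include <stdint.h>

    /* One breakpoint on the discharge curve:
     * cell voltage in millivolts -> remaining capacity in percent. */
    typedef struct {
        uint16_t mv;
        uint8_t  percent;
    } curve_point_t;

    /* Example values only: a steep initial drop (3.2 V -> 2.9 V costs
     * just a few percent), then a long slow tail. Replace with points
     * taken from your measured curve. */
    static const curve_point_t curve[] = {
        { 3200, 100 },
        { 2900,  95 },
        { 2750,  60 },
        { 2600,  30 },
        { 2400,  10 },
        { 2200,   0 },
    };
    #define CURVE_LEN (sizeof(curve) / sizeof(curve[0]))

    /* Linearly interpolate between the two breakpoints that bracket
     * the measured voltage. */
    uint8_t battery_percent(uint16_t mv)
    {
        if (mv >= curve[0].mv)
            return 100;
        for (unsigned i = 1; i < CURVE_LEN; i++) {
            if (mv >= curve[i].mv) {
                uint32_t span_mv = curve[i - 1].mv - curve[i].mv;
                uint32_t span_pc = curve[i - 1].percent - curve[i].percent;
                return curve[i].percent
                     + (uint8_t)((uint32_t)(mv - curve[i].mv) * span_pc / span_mv);
            }
        }
        return 0; /* below the last breakpoint: treat as empty */
    }

Because the table is monotonic in voltage, you can also average a few ADC samples before the lookup to smooth out noise.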

Battery performance of Bluetooth Low Energy (Beacons)

Could someone please tell me if there is a way to measure the battery performance of Kontakt beacons at different advertising intervals (with either the iBeacon or the Eddystone frame format)?
If so, can you please tell me how I can represent the battery performance against the advertising interval graphically?
I don't know what kind of battery Kontakt uses, but I can help you with this technical report.
Aislelabs explains how the scan interval can impact battery drain, and also the impact of the advertising interval on battery drain.
Impact of advertising interval on battery drain
Different beacons have different advertising intervals. A short advertising interval (say 100 msec) by Beacon A versus a longer one (say 500 msec) by Beacon B implies that in one second of scanning, Beacon A will be detected 10 times and Beacon B twice by the phone.
Multiple detections of a beacon within the same scan are desirable, as they help smooth the signal readings. These signal readings are critical to assessing the precise distance of the beacon from the phone. At the same time, multiple signals mean more work is required to process them, and hence more battery consumption.
So as we reduced the advertising interval of the beacon, we noticed modestly higher battery use for our three handsets.
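On the beacon side, you can sketch the expected curve with a simple first-order model: average current = sleep current + (charge per advertisement) / interval. The constants below are assumptions for illustration, not measurements of any Kontakt beacon; printing the pairs gives you the data to plot battery life against advertising interval:

    #include <stdio.h>

    /* All values are illustrative assumptions, not vendor data. */
    #define SLEEP_CURRENT_UA   1.0    /* idle current, in microamps      */
    #define ADV_CHARGE_UC     10.0    /* charge per advertisement, in uC */
    #define CELL_CAPACITY_MAH 240.0   /* e.g. a large coin cell          */

    /* Estimated battery life in days for a given advertising interval. */
    static double battery_life_days(double adv_interval_ms)
    {
        double events_per_s   = 1000.0 / adv_interval_ms;
        double avg_current_ua = SLEEP_CURRENT_UA + ADV_CHARGE_UC * events_per_s;
        double hours          = CELL_CAPACITY_MAH * 1000.0 / avg_current_ua;
        return hours / 24.0;
    }

    int main(void)
    {
        const double intervals_ms[] = { 100, 200, 350, 500, 760, 1000 };
        for (unsigned i = 0; i < sizeof intervals_ms / sizeof *intervals_ms; i++)
            printf("%6.0f ms -> %7.1f days\n",
                   intervals_ms[i], battery_life_days(intervals_ms[i]));
        return 0;  /* feed the printed pairs to any plotting tool */
    }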
Other research: "An Analysis of the Accuracy of Bluetooth Low Energy for Indoor Positioning Applications", here:
BLE advertising beacons are particularly attractive to retailers because of the promise of long battery lives of many years, and so low maintenance requirements. Long battery lives are expected to require low radio power output and/or low beaconing rates. While this does not affect their use for proximity detection, it does affect their usefulness for providing fingerprint-based positioning throughout an entire indoor environment.
The beaconing rate affects the battery life of the BLE beacon, and for most iBeacon purposes will typically be set to an advertising rate of a few Hertz.

Very low FPS, except when running something else in the background

So, I give development support to a private server for a game called Metin2.
The game has low requirements, so it runs pretty smoothly, but there is a certain zone in the game where everything will be going well and then, seemingly at random and instantly, the FPS drops from 40 to 0.1 and it looks like a PowerPoint presentation.
The solution the community has come up with (found by pure luck and coincidence) is to run Counter-Strike 1.6 in the background (another game would probably work too), and then the game runs smoothly.
So basically, my question is: how does consuming more CPU and RAM actually improve the FPS in that zone of the game? The two games' processes are independent.
Is the zone a high-load area, and is this occurring for all users?
I ask because it reminds me of some solutions to frame-spike issues I've had before, which were related to Intel SpeedStep/C1E/Turbo or similar BIOS CPU options.
The short of it is that in some 3D applications, at certain times when load was low, it would be too low, and whichever of those features would decide that less performance was needed and underclock the CPU. This was some years ago now, but it may be a worthwhile train of thought.

Win32 game loop that doesn't spike the CPU

There are plenty of examples in Windows of applications triggering code at fairly high and stable framerates without spiking the CPU.
WPF/Silverlight/WinRT applications can do this, for example. So can browsers and media players. How exactly do they do this, and what API calls would I make to achieve the same effect from a Win32 application?
Clock polling doesn't work, of course, because that spikes the CPU. Neither does Sleep(), because you only get around 50ms granularity at best.
They are using multimedia timers. You can find information on MSDN here.
Only the view is invalidated (e.g. with InvalidateRect) on each multimedia timer event; drawing happens in the WM_PAINT / OnPaint handler.
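As a minimal sketch of that pattern (assuming g_hwnd is your already-created window and you link against winmm.lib), a periodic multimedia timer only requests a repaint, and all drawing stays in WM_PAINT:

    #include <windows.h>
    #include <mmsystem.h>   /* multimedia timers; link with winmm.lib */

    static HWND g_hwnd;     /* set this after CreateWindow() */

    /* Runs on a timer thread roughly every 16 ms: keep it tiny and
     * only request a repaint of the view. */
    static void CALLBACK TickProc(UINT id, UINT msg, DWORD_PTR user,
                                  DWORD_PTR r1, DWORD_PTR r2)
    {
        InvalidateRect(g_hwnd, NULL, FALSE);
    }

    static UINT StartFrameTimer(void)
    {
        timeBeginPeriod(1);              /* request 1 ms timer resolution */
        return timeSetEvent(16,          /* ~60 Hz period, in ms          */
                            1,           /* resolution, in ms             */
                            TickProc, 0,
                            TIME_PERIODIC | TIME_CALLBACK_FUNCTION);
    }

    /* On shutdown: timeKillEvent(timerId); timeEndPeriod(1); */

If you'd rather not touch the window from the timer thread, PostMessage a custom message to the UI thread and call InvalidateRect there instead.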
Actually, there's nothing wrong with Sleep().
You can use a combination of QueryPerformanceCounter/QueryPerformanceFrequency to obtain very accurate timings, and you can create a loop which, on average, ticks forward exactly when it's supposed to.
I have never seen Sleep() miss its deadline by as much as 50 ms. However, I've seen plenty of naive timers that drift, i.e. accumulate a small delay and consequently update at noticeably irregular intervals. This is what causes uneven framerates.
If you play a very short beep on every nth frame, this is very audible.
Also, logic and rendering can be run independently of each other. The CPU might not appear to be that busy, but I bet you the GPU is hard at work.
Now, about not hogging the CPU. CPU usage is just a breakdown of the CPU time spent by a process in a given sample (the thread scheduler actually tracks this). If you have a target of 30 Hz for your game, you're limited to 33 ms per frame; if you can't hit that target (too slow a CPU or too slow code), you won't be running at 30 Hz. If you finish in under 33 ms, you can yield the processor for the rest of the frame, effectively freeing up resources.
This might be an interesting read for you as well.
On a side note, instead of yielding that time you could effectively be doing prep work for future computations. Some games, when they are not under the heaviest of loads, actually do things like sorting and memory defragmentation; a little bit here and there adds up in the end.
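Putting those pieces together, here is a minimal sketch of such a loop, assuming a 30 Hz target and hypothetical update()/render() placeholders: QueryPerformanceCounter provides the clock, absolute deadlines prevent drift, and Sleep() only burns the slack in each frame:

    #include <windows.h>

    static void update(void) { /* advance game logic one 33 ms tick */ }
    static void render(void) { /* draw the current state */ }

    int main(void)
    {
        LARGE_INTEGER freq, now;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&now);

        const double dt = 1.0 / 30.0;   /* 30 Hz -> ~33 ms per frame */
        double next = (double)now.QuadPart / freq.QuadPart + dt;

        timeBeginPeriod(1);             /* 1 ms Sleep granularity (winmm.lib) */
        for (;;) {
            update();
            render();

            /* Schedule against absolute deadlines so small Sleep()
             * inaccuracies never accumulate into drift. */
            QueryPerformanceCounter(&now);
            double remaining = next - (double)now.QuadPart / freq.QuadPart;
            if (remaining > 0.002)
                Sleep((DWORD)((remaining - 0.001) * 1000.0)); /* keep 1 ms slack */
            /* optionally spin on QueryPerformanceCounter for the last ms */
            next += dt;
        }
        /* timeEndPeriod(1); -- unreachable in this endless sketch */
    }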
