Phone battery use with camera turned on (AR) - user-interface

I'm hoping this is a relatively simple answer. I've always been interested in AR, and I've been debating tinkering with a possibly AR-driven UI for mobile.
I guess the only real question is: with the camera continuously turned on, how much battery would that use? I.e., would it be too much for something like this to be worth doing?

Battery drain is one of the biggest issues with smartphones nowadays. I'm not a specialist in power consumption or battery life, but anyone who owns and uses a smartphone (not only for calls, of course) would not be wrong in saying this. There are many tips on the internet teaching you how to increase battery life. Fundamentally, processes running on your device need energy, and that energy is provided by the battery.
To answer your question: I've been using smartphone cameras for AR applications for quite a long time now. The camera is a heavy process, and it does drain the battery faster than most other processes. On the other hand, you also have to consider the other processes running on your device while your AR application is in use. For example, your app might use the device's sensors (gyroscope, GPS, etc.); these drain the battery as well. A simple test you can do is to fully charge your device, start the camera, and leave it running until the battery dies. That is exactly how much the camera alone drains the battery (you can even measure the time). Of course, you'll want to turn off everything else running on the device first.
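If you'd rather log a drain curve than just time a single run, something like the sketch below can help. This is only a hedged illustration assuming a Unity-based prototype (Unity being a common AR toolchain, and the one that appears in the related questions below); the script name is made up, and on platforms where Unity cannot read the battery, SystemInfo.batteryLevel returns -1.

```csharp
using UnityEngine;

// Hypothetical drain-test script: keeps the camera feed running and logs
// the battery level once a minute so you can plot consumption over time.
public class BatteryDrainTest : MonoBehaviour
{
    WebCamTexture cam;

    void Start()
    {
        cam = new WebCamTexture();  // default device camera
        cam.Play();                 // leave it on continuously, as in the test above
        InvokeRepeating(nameof(LogBattery), 0f, 60f);
    }

    void LogBattery()
    {
        // SystemInfo.batteryLevel is 0..1, or -1 where unsupported.
        Debug.Log($"t={Time.realtimeSinceStartup:F0}s battery={SystemInfo.batteryLevel:P0}");
    }
}
```

Attach it to any object in the scene, run on the device with the screen kept awake, and the log gives you a drain curve instead of just a total runtime.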
To answer your second question: it depends on how the application is built (many things can be optimized a lot!) and how it's going to be used. If the goal is for the application to be used continuously for hours and hours, then you'll need to wait for some other kind of technology to be discovered (joking... I hope) or attach an extra power supply to your device. I think it's worth building the application, optimizing as you go, and optimizing again at the end when everything is up and running. If the camera is the only issue, then I'm sure it's worth trying!

Related

General tips for running a Unity3D animation on Microsoft HoloLens

I'm going to be tasked with making sure that an animation created in Unity3D can run on a Microsoft HoloLens. I don't have any further information about the animation yet, but I wanted to ask in advance if there are any big things I should keep in mind.
In the animation you play a "character" in first-person mode, controlled with WASD or the arrow keys, and you can look up, down, left, and right with the mouse. There are (as far as I know) no special interactions besides colliders.
And another question: is it easier to test the animation on the actual HoloLens, or to use a HoloLens emulator on my laptop?
I know it's a lot to ask right now without any code or other material, but I still hope that some of you can give me a little advice :)
In my experience it is difficult to say. The HoloLens, while an awesome device with nice specs for its size, has quite limited graphical power. Try to reduce your model's vertices to a reasonably low count (e.g. using Blender's decimate feature), and turn down the quality in Unity's quality settings, as proposed in the Dev-Guide (a minimal example follows below).
For your emulation question: the emulator does not emulate the HoloLens' specs (processor, memory, etc.), but rather its input concepts and so on, while running a Hyper-V virtual machine. So performance in the emulator depends on your computer's hardware and says nothing about the actual performance on a HoloLens.
Also take a look at the performance guidelines from Microsoft.
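To make the quality-settings advice concrete, here is a minimal sketch; the script name is hypothetical, and level 0 assumes the lowest preset defined under Project Settings > Quality:

```csharp
using UnityEngine;

// Hypothetical startup helper: pin Unity to its lowest quality preset so
// the HoloLens' limited GPU isn't asked to do more than it can.
public class LowQualityOnStartup : MonoBehaviour
{
    void Awake()
    {
        // Index 0 = lowest preset; 'true' also applies expensive changes
        // such as the anti-aliasing level.
        QualitySettings.SetQualityLevel(0, applyExpensiveChanges: true);
    }
}
```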
I've worked with the HoloLens on a couple of projects. A few points that may be useful to you:
the first big thing I would keep in mind is whether the character has to move through a VR environment. In that case the HoloLens is almost useless, because its transparent lenses let you see your surroundings [the real ones], distracting you from the virtual world. This is exactly what happens with the pre-installed HoloTour: a nice attempt, but you never totally feel like you're in Rome or Machu Picchu
the second big thing I would consider is that - at least in its first release - the HoloLens has a very limited field of view, which "amounts to the size of a monitor in front of you – equivalent to 15 inches" [source]. In a situation where the character will look in every direction, it is likely that the objects you place in the AR space will end up cut off or invisible
about testing: the emulator is really exceptional; I didn't find great differences between it and the real device. Of course, if you already have a real HoloLens, I would use that. But if not, I would develop and test on the emulator first, to understand whether the project is worth the purchase

Is it worth developing a CPU-intensive mobile app?

I am considering developing a poker robot for mobile phones that the user plays against.
Needless to say, this is a very CPU-intensive application, as confirmed by the prototype.
What is the consensus on CPU-intensive mobile apps? Are they even worth developing? My concern is that people will leave negative feedback along the lines of "CPU hog. Uninstall."
I could host the CPU-intensive brain on a server, but that would require the user to have an internet connection, which is undesirable.
It's not worth it if you're dropping frames, especially on mobile. Mobile hardware is very limited compared to PC or console, so you have to plan accordingly.
I'd try to precompute as many tasks as possible, so their results can be accessed later without further processing; you then only need to look up the precomputed data (see the sketch below). This will definitely save quite a bit of CPU usage.
If you can't do that, then it's not worth it unless you can keep the game within a margin of error small enough that the average person in your audience won't notice.
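To illustrate the precomputation idea, a hedged sketch: memoize an expensive evaluation so repeated states are served from a lookup table. EvaluateHand and its string key are placeholders for whatever your poker engine actually computes.

```csharp
using System.Collections.Generic;

// Sketch of the precompute/look-up idea: the expensive evaluation runs at
// most once per distinct state; every later request is a dictionary hit.
static class HandCache
{
    static readonly Dictionary<string, double> cache = new Dictionary<string, double>();

    public static double Evaluate(string handKey)
    {
        double score;
        if (cache.TryGetValue(handKey, out score))
            return score;              // cache hit: effectively free

        score = EvaluateHand(handKey); // expensive path, runs once per state
        cache[handKey] = score;
        return score;
    }

    // Placeholder for the real, CPU-heavy poker evaluator.
    static double EvaluateHand(string handKey)
    {
        return (handKey.GetHashCode() & 0x3FF) / 1024.0;
    }
}
```

The same table could also be computed offline and shipped with the app, trading download size for runtime CPU.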

Sharing CPU over a network

Recently I've been wondering whether it's possible to create an application that would take some CPU load from one computer and hand it to another PC connected to the same network, for example. I have two laptops, and one is much worse than the other. When I'm playing a game, the first laptop overheats quickly :P and I would like the better laptop to take over some of the work, execute the calculations, and return the results, so that the weaker laptop wouldn't overheat as quickly :P.
Is it possible to code this in C/C++, with use of WinAPI or something? Or maybe there is already an application that would let me achieve this goal?
Not as simply as you're probably hoping, no. Some old games may well be playable over Remote Desktop, but that's probably not what you're after.
Still, there are some options, with a bit of tweaking, that can be used to play a game remotely, for example: http://www.tomshardware.co.uk/forum/id-1638643/tutorial-create-onlive-remote-streaming-setup.html
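Transparent CPU "lending" isn't something the OS offers, but when you control the code on both machines you can offload specific computations explicitly. A hedged C# sketch of the idea - the port, the server address, and the toy Fib workload are all placeholders:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

// Toy offload: run with "server" on the fast laptop, no arguments on the
// slow one. The slow machine sends a request; the fast one computes.
class CpuOffload
{
    static void Main(string[] args)
    {
        if (args.Length > 0 && args[0] == "server") RunServer();
        else RunClient();
    }

    static void RunServer()
    {
        var listener = new TcpListener(IPAddress.Any, 5000);
        listener.Start();
        using (var client = listener.AcceptTcpClient())
        using (var stream = client.GetStream())
        {
            var buf = new byte[64];
            int n = stream.Read(buf, 0, buf.Length);
            long input = long.Parse(Encoding.ASCII.GetString(buf, 0, n));
            long result = Fib(input);   // the heavy work happens on this machine
            var reply = Encoding.ASCII.GetBytes(result.ToString());
            stream.Write(reply, 0, reply.Length);
        }
    }

    static void RunClient()
    {
        using (var client = new TcpClient("192.168.0.10", 5000)) // assumed server IP
        using (var stream = client.GetStream())
        {
            var req = Encoding.ASCII.GetBytes("42");
            stream.Write(req, 0, req.Length);
            var buf = new byte[64];
            int n = stream.Read(buf, 0, buf.Length);
            Console.WriteLine("Remote result: " + Encoding.ASCII.GetString(buf, 0, n));
        }
    }

    // Deliberately slow stand-in for "the calculations".
    static long Fib(long n) { return n < 2 ? n : Fib(n - 1) + Fib(n - 2); }
}
```

For an existing game you don't have the source to, this approach is off the table, which is why streaming setups like the one linked above are the practical answer.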

Programmatically prevent battery charging

Some computers (e.g. Dell, Vaio) come with software that prevents a battery from charging.
This functionality allows me to run my laptop with the battery in (protected from power outages) while keeping the battery from charging until it's down to, at most, 50% charge remaining.
I want to write some code to automate this task.
I searched Dell Support Center for a solution, and I searched Google, too - no luck with either.
I thought about downloading the program and debugging it, but I couldn't find it.
Has anyone ever seen something like this?
Thanks
PS: I want to do this on a Dell Inspiron, and the code can be in C++/C# (or something)
I've never heard of a program that disables battery charging. (Why on earth would you want this?) If such programs exist, I imagine that they interact with the firmware or hardware at a very primitive level.
One thing you can try is a busy loop (burning power like mad) that checks the battery level and sleeps for a bit once it gets down to the target level. This won't do good things to the CPU temperature, however.
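A hedged sketch of that busy-loop idea, using the .NET PowerStatus API (the 50% target is arbitrary; BatteryLifePercent is reported as a fraction between 0 and 1):

```csharp
using System;
using System.Threading;
using System.Windows.Forms;   // reference System.Windows.Forms for PowerStatus

// Burn CPU while the battery is above the target, then idle and re-check.
// Crude, and it will not do good things to your CPU temperature.
class DrainToTarget
{
    static void Main()
    {
        const float target = 0.5f;   // hold the battery near 50%
        while (true)
        {
            if (SystemInformation.PowerStatus.BatteryLifePercent > target)
            {
                // Spin for about a second to draw power, then re-check.
                var until = DateTime.UtcNow.AddSeconds(1);
                while (DateTime.UtcNow < until) { }
            }
            else
            {
                Thread.Sleep(TimeSpan.FromMinutes(1));   // at target: sleep for a bit
            }
        }
    }
}
```

A single spinning thread only loads one core; several threads drain faster, and so does keeping the screen on.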
Some laptops do come with battery-charge-limiting functionality, though it is not implemented in software but in firmware plus dedicated internal hardware, I'd guess. Some Lenovo and Acer models have this capability. The logic cannot live in software, as the charge limiter kicks in even when the laptop is off.
The reason for the feature is that batteries degrade when kept at 100%, as is the case with laptops that are always plugged in. The new Acer Swift limits charging to 80%; some Lenovo models let you input a particular value.
If you're interested, I can provide the software side. It works on Windows and Linux, and could easily be made available on macOS. It works in conjunction with external hardware, i.e. a homeplug (a remotely switchable power socket).
The code works, but it's by no means production-ready. It would need a bit of tweaking for your particular operating system and homeplug. Let me know if you're interested.
Available on GitHub: Charge Limiter

How can I programmatically stop a notebook battery from charging?

There is some easily available information on finding the status of a battery and whether or not it's charging (the GetSystemPowerStatus API, or System.Windows.Forms.SystemInformation.PowerStatus).
I want to be able to stop a battery from charging based on some criteria, e.g. battery power > 20%.
Is there an API to do this?
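For context, the reading half mentioned in the question takes only a few lines; a minimal C# P/Invoke sketch of GetSystemPowerStatus (stopping the charging is the hard part the answers address):

```csharp
using System;
using System.Runtime.InteropServices;

// Read battery status via the Win32 GetSystemPowerStatus API.
class PowerStatusDemo
{
    [StructLayout(LayoutKind.Sequential)]
    struct SYSTEM_POWER_STATUS
    {
        public byte ACLineStatus;        // 0 = offline, 1 = online, 255 = unknown
        public byte BatteryFlag;
        public byte BatteryLifePercent;  // 0-100, 255 = unknown
        public byte SystemStatusFlag;
        public int BatteryLifeTime;      // seconds, -1 = unknown
        public int BatteryFullLifeTime;  // seconds, -1 = unknown
    }

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool GetSystemPowerStatus(out SYSTEM_POWER_STATUS status);

    static void Main()
    {
        if (GetSystemPowerStatus(out var s))
            Console.WriteLine($"AC: {s.ACLineStatus}, battery: {s.BatteryLifePercent}%");
    }
}
```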
I think it's impossible without some API for the battery or battery charger, and such an API could only be provided by the notebook's manufacturer, if the battery or charger supports it at all.
I honestly don't know, but I'd have a look at the APM or ACPI APIs.
Other than that, the only option I can think of right now is a USB controlled robotic arm that ejects the battery when you need to stop charging, but that's probably not what you are looking for, and borders on the complicator's glove in terms of level of over-engineering. :)
I would just get a UPS and programmatically tell it to cut all power... most should have an interface for doing this. Otherwise, as someone already said, a computer-controlled power strip would do it ^^
I've actually played with this idea. A while ago I was testing and writing about way too many new laptop models, and the battery testing was annoying to set up, monitor, and analyze.
I wrote an app that would do everything (setup, listening, measuring, reporting) except unplugging the power, replugging it, and starting the computer again...
One of the options is to get hold of the battery device (Microsoft ACPI-Compliant Control Method Battery).
Listen for power notification events forever. On each notification, check the battery's PowerStatus.
There are APIs for all of the above in .NET and Win32.
Keep the battery device disabled as long as the power status is above your threshold. Enable it as soon as it goes below that, or whenever you are not on AC power (i.e. before AC power is removed, your continuously monitoring software should re-enable the battery device - or you enable it manually). A sketch follows below.
Hmm... this is a very buggy solution, but it can achieve what you want, although you have to be very careful.
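A hedged sketch of that scheme, assuming a recent Windows where pnputil supports /disable-device (needs administrator rights). The device instance ID is a placeholder - find yours with 'pnputil /enum-devices' under "Microsoft ACPI-Compliant Control Method Battery" - and note the caveat above: once the battery device is disabled, Windows may no longer report its level, which is part of what makes this buggy.

```csharp
using System;
using System.Diagnostics;
using System.Windows.Forms;   // SystemInformation.PowerStatus
using Microsoft.Win32;        // SystemEvents

// Sketch: watch power notifications, and disable/enable the ACPI battery
// device around a charge threshold. Hypothetical device ID; run elevated.
class ChargeLimiter
{
    const string BatteryDeviceId = @"ACPI\PNP0C0A\1";  // placeholder: machine-specific
    const float Threshold = 0.5f;                      // hold charge at ~50%

    static void Main()
    {
        SystemEvents.PowerModeChanged += (s, e) => Check();
        Check();
        Application.Run();   // message loop so SystemEvents keep firing
    }

    static void Check()
    {
        var status = SystemInformation.PowerStatus;
        bool onAc = status.PowerLineStatus == PowerLineStatus.Online;

        // Disable (block charging) only while on AC and above threshold;
        // re-enable the moment we drop below it or lose AC power.
        bool block = onAc && status.BatteryLifePercent > Threshold;
        Process.Start("pnputil.exe",
            (block ? "/disable-device " : "/enable-device ") + "\"" + BatteryDeviceId + "\"");
    }
}
```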
I actually use such a charge limiter. The control software is a Python script that monitors the battery level (via the psutil module) and controls external hardware, i.e. a switch that can be software-controlled. I have Energenie and TP-Link homeplugs, plus my own hardware contraption.
As it's for home use, the software isn't polished at all, but with minimal effort it can be adapted to any OS or hardware.
Let me know if you're interested. The software lives here: CCC
