Tracking wrist position using accelerometer in a smart watch - algorithm

I am working on a smartwatch project. I want the display to stay off and only come on when the user brings his hand into the watch-viewing position.
I am running my application on an nRF52 MCU, which means machine learning is out of the question. I am using a 3-axis accelerometer from STM.
How can I detect when a user moves his hand into the typical watch-viewing position? How is this achieved in smartwatches?
I have the following ideas so far:
- Constantly poll the accelerometer and calculate pitch and roll values, then determine what range of pitch and roll corresponds to this gesture. This seems a bit wasteful, though, because the CPU would always have to be active. (A sketch of this approach follows below.)
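For illustration, here is a minimal sketch of that polling approach. The axis conventions, thresholds, sample rate, and hold time are all assumptions that would need tuning against recordings from a real wrist:

```c
/* Minimal sketch of the polling idea: compute pitch/roll from raw
 * accelerometer samples and require the "viewing" pose to be held for
 * a few samples. Axis signs, thresholds, and timings are assumptions. */
#include <math.h>
#include <stdbool.h>

#define RAD_TO_DEG   57.29578f
#define PITCH_MIN   -60.0f    /* assumed pose window: dial tilted    */
#define PITCH_MAX   -20.0f    /* toward the face, roughly level      */
#define ROLL_MAX     25.0f
#define HOLD_SAMPLES 10       /* ~0.5 s at a 20 Hz polling rate      */

static int hold_count;

/* ax, ay, az are accelerations in g. Returns true once the wrist has
 * stayed inside the pose window for HOLD_SAMPLES consecutive samples. */
bool wrist_raised(float ax, float ay, float az)
{
    float pitch = atan2f(-ax, sqrtf(ay * ay + az * az)) * RAD_TO_DEG;
    float roll  = atan2f(ay, az) * RAD_TO_DEG;

    if (pitch > PITCH_MIN && pitch < PITCH_MAX && fabsf(roll) < ROLL_MAX)
        hold_count++;
    else
        hold_count = 0;

    return hold_count >= HOLD_SAMPLES;
}
```

As an aside, many ST accelerometers (the LIS2DH12, for example) include a hardware 6D orientation-detection interrupt, so the sensor itself can wake the MCU when the device enters a given orientation and the CPU does not have to stay awake polling.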
Is there a simple signal processing algorithm or something similar that can achieve this?

Look into a galvanic skin response (GSR) sensor - it measures the electrical conductivity of the skin.
When internal or external forces cause arousal — of any kind — the skin becomes a better conductor of electricity. Essentially, when you start to sweat, whether from exercise or something else, the band will be able to detect it.
Detecting when someone is sweating gives the software more information about what the user is doing, which allows for better health tracking. Being able to correlate the level of activity with a source other than just gravity from the accelerometer allows these programs to take on a more trainer-like role — recommending specific exercises and levels of exertion.
Hope this helps!

Related

Is GPS inaccuracy consistent over short time spans?

I'm interested in developing a semi-autonomous RC lawnmower.
That is, the operator would decide when to stop, turn, etc., but could request "slightly overlap the previous cut" and the mower would automatically do so. (Having operated high-end RC mowers at trade shows, I know this is the tedious part. Overcoming that, plus the high cost -- which I believe is possible -- would make for a commercial success.)
This feature would require accurate horizontal positioning. I have investigated ultrasonic, laser, optical, and GPS. Each has its problems in this application. (I'll resist the temptation to go off on these tangents here.)
So... my question...
I know GPS horizontal accuracy is only 3-4m. Not good enough, but:
I don't need to know where I am on the planet. I only need to know where I am relative to where I was a minute ago.
So, my question is: is the inaccuracy consistent in the short term? If so, I think it would work for me. If it varies wildly by ±1.5m from one second to the next, then it will not work.
I have tried to find this information but have had no success (possibly because of the ubiquity of other GPS-accuracy discussion), so I appreciate any guidance.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Edit ~~~~~~~~~~~~~~~~~~~~~~
It's looking to me like GPS is not just skewed but granular. I'd be interested in hearing from anyone who can give better insight into this, but for now I'm going to explore other options.
I realized that even though my intended application is "outdoor", this question is technically in the field of "indoor positioning systems" so I am adding that tag.
My latest thinking is to have 3 "intelligent" high-dB ultrasonic (US) speaker units. The mower emits RF requests for a tone from each speaker in rapid sequence, measuring the time it takes to "hear" each unit's response, thereby calculating the distance to each of these fixed points and using trilateration to get position. If the fixed-point speakers are 300' away from the mower, the mower may have moved several feet between the 1st and 3rd response, so this would have to be allowed for in the software. If it is possible to differentiate 3 different US frequencies, they could be requested/received "simultaneously", though you still run into issues when you're close to one fixed unit and far from another, so some software correction may still be necessary. If we can assume the mower is moving in a straight line, this isn't too complicated.
Another variation is the mower does not request the tones. The fixed units send RF "here comes tone from unit A" etc., and the mower unit just monitors both RF info and US tones. This may simplify things somewhat, but it seems it really requires the ability to determine which speaker a tone is coming from.
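Whichever variant you pick, the position step itself is straightforward. Here is a minimal 2-D trilateration sketch; the anchor coordinates and measured ranges in main() are hypothetical:

```c
/* Minimal 2-D trilateration sketch: solve for position from measured
 * distances to three fixed speakers. Anchor coordinates and ranges in
 * main() are hypothetical; ranges would come from the round-trip time
 * of each ultrasonic tone. */
#include <stdio.h>
#include <stdbool.h>

typedef struct { double x, y; } point_t;

/* Subtracting the circle equations pairwise gives two linear equations
 * in (x, y); returns false if the anchors are (nearly) collinear. */
bool trilaterate(point_t p1, double r1, point_t p2, double r2,
                 point_t p3, double r3, point_t *out)
{
    double a = 2.0 * (p2.x - p1.x), b = 2.0 * (p2.y - p1.y);
    double c = r1 * r1 - r2 * r2
             - p1.x * p1.x + p2.x * p2.x - p1.y * p1.y + p2.y * p2.y;
    double d = 2.0 * (p3.x - p2.x), e = 2.0 * (p3.y - p2.y);
    double f = r2 * r2 - r3 * r3
             - p2.x * p2.x + p3.x * p3.x - p2.y * p2.y + p3.y * p3.y;

    double det = a * e - b * d;
    if (det < 1e-9 && det > -1e-9)
        return false;

    out->x = (c * e - b * f) / det;
    out->y = (a * f - c * d) / det;
    return true;
}

int main(void)
{
    /* Hypothetical anchors (metres) and ranges to a mower at ~(30, 40). */
    point_t a = { 0, 0 }, b = { 90, 0 }, c = { 0, 90 }, pos;
    if (trilaterate(a, 50.0, b, 72.1, c, 58.3, &pos))
        printf("mower at (%.1f, %.1f)\n", pos.x, pos.y);
    return 0;
}
```

In practice you would feed it ranges already corrected for the mower's motion between tones, as described above.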
This seems like the kind of thing you could (and should) measure empirically. Just set a GPS of your liking down in the middle of a field on a clear day and wait an hour. Then come back and see what you find.
Because I'm in a city, I can't run out and do this for you. However, I found a paper entitled iGeoTrans – A novel iOS application for GPS positioning in geosciences.
That paper includes a figure which duplicates the test I propose. You'll note that both the iPhone 4 and the Garmin eTrex 10 perform pretty poorly versus the accuracy you say you need.
But the authors do some Math Magic™ to reduce the uncertainty in the position, presumably by using some kind of averaging. That gets them to a 3.53m RMSE measure.
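For concreteness, that kind of averaging test might be reduced like this; the east/north offsets below are made-up stand-ins for projected fixes from a stationary log:

```c
/* Sketch of the averaging test above: log fixes from a stationary
 * receiver, average them, and report the scatter of individual fixes
 * about that average. The east/north offsets (metres) are made up. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double e[] = { 1.2, -0.8, 2.5, 0.3, -1.9, 3.1, -0.4, 1.8 };
    double n[] = { -2.1, 0.6, 1.4, -3.0, 2.2, -0.9, 1.1, 0.5 };
    int cnt = sizeof e / sizeof e[0];

    double me = 0.0, mn = 0.0;
    for (int i = 0; i < cnt; i++) { me += e[i]; mn += n[i]; }
    me /= cnt; mn /= cnt;                    /* the averaged position */

    double sq = 0.0;
    for (int i = 0; i < cnt; i++) {
        double de = e[i] - me, dn = n[i] - mn;
        sq += de * de + dn * dn;             /* horizontal error^2 */
    }
    printf("averaged position (%.2f, %.2f), single-fix RMSE %.2f m\n",
           me, mn, sqrt(sq / cnt));
    return 0;
}
```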
If you have real-time differential GPS, you can do better. But this requires relatively expensive hardware and software.
Even aside from the above, you have the potential issue of GPS reflection and multipath error. What if your mower has to go under a deck, or thick trees, or near the wall of a house? These common yard features will likely break the assumptions needed to make a good averaging algorithm work and even frustrate attempts at DGPS by blocking critical signals.
To my mind, this seems like a computer vision problem. And not just because that'll give you more accurate row overlaps... you definitely don't want to run over a dog!
In my opinion a standard GPS is nowhere near accurate enough for this application. A typical consumer-grade receiver that I have used has a position accuracy defined as a CEP of 2.5 metres. This means that for a stationary receiver in a "perfect" sky-view environment, over time 50% of the position fixes will lie within a circle with a radius of 2.5 metres.

If you look at the position that the receiver reports, it appears to wander at random around the true position, sometimes moving a number of metres away from its true location. When I have monitored the position data from a number of stationary units, they could appear to be moving at speeds of up to 0.5 metres per second. In your application this would mean that the lawnmower could be out of position by some not insignificant distance (with disastrous consequences for your prized flowerbeds).
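To make that CEP definition concrete, here is a small sketch that estimates it from a stationary log; the offsets from the true position are hypothetical:

```c
/* Sketch: estimating CEP per the definition above - the radius of the
 * circle containing 50% of the fixes - from a stationary log. The
 * east/north offsets from the true position (metres) are hypothetical. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

static int cmp(const void *a, const void *b)
{
    double d = *(const double *)a - *(const double *)b;
    return (d > 0) - (d < 0);
}

int main(void)
{
    double e[] = { 1.1, -2.3, 0.4, 3.0, -1.2, 2.2, -0.6, 1.7 };
    double n[] = { 0.9, 1.5, -2.8, 0.2, -1.7, 2.4, 3.1, -0.3 };
    enum { CNT = 8 };
    double r[CNT];

    for (int i = 0; i < CNT; i++)
        r[i] = hypot(e[i], n[i]);      /* radial error of each fix */

    qsort(r, CNT, sizeof r[0], cmp);
    printf("CEP (median radial error): %.2f m\n", r[CNT / 2]);
    return 0;
}
```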
There is a way that this can be done, as has been proved by the tractor manufacturers who can position seed drills and agricultural sprayers to centimetre accuracy. These systems use Differential GPS, where a fixed reference station is positioned in the neighbourhood of the tractor being controlled. This reference station transmits error corrections to the mobile unit, allowing it to correct its reported position to a high degree of accuracy. Unfortunately this sort of positioning system is very expensive.

iBeacons: bearing to beacon?

Partly a coding problem, partly a math problem.
Q1. I have an iOS device with compass active. If it knows I'm moving through the field of an iBeacon - or the Beacon is moving through my detection range - would it be possible for a phone to work out (roughly) the relative direction/bearing of that beacon with a series of readings by comparing signal strengths? Has anyone had a try at this?
Q2. Would it be possible to change the Major and Minor values of a beacon regularly (eg: every second) to pass small pieces of info - such as a second user's Bearing and Course?
Q1. It MIGHT be possible but you would need a controlled environment. Either the beacon or the phone needs to be fixed. You also need to be in an area with no obstructions or sources of radio interference.
Then you'd need to use the signal strength (which is sloppy and varies by a fair amount) as one input, and the device's heading info (which is also grossly inaccurate), and do some pretty gnarly math on it.
Assuming you could work out the math, the slop in the input readings might make the results too iffy to be useful. (For example, how would you distinguish moving directly towards the beacon from moving 30 degrees to one side or the other? The signal strength would still increase, just not as quickly.)
And your algorithm would have to deal with edge cases like moving along a circle around the beacon. In that case the signal strength should not change.
My gut is that even with clever algorithms that input data is just too unreliable to make much sense out of it, beyond "getting warmer" and "getting colder."
As mentioned above, you'd have to track your device's movement within the field, including distance covered and direction, then with multiple readings of signal strength you could theoretically calculate relative direction to the beacon to some degree of accuracy.
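To sketch those two ingredients, the usual starting point is a log-distance path-loss model to turn RSSI into a rough range, plus some geometry over successive readings. The txPower and path-loss exponent below are assumptions that have to be calibrated per beacon and environment:

```c
/* Sketch: log-distance path-loss model for RSSI-to-range, then a
 * bearing estimate from two ranges taken a known distance apart along
 * a straight walk. TX_POWER and PATH_LOSS are assumed calibrations. */
#include <stdio.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define TX_POWER  -59.0   /* assumed RSSI at 1 m, dBm               */
#define PATH_LOSS   2.0   /* free-space exponent; indoors is higher */

double rssi_to_metres(double rssi)
{
    return pow(10.0, (TX_POWER - rssi) / (10.0 * PATH_LOSS));
}

int main(void)
{
    /* Two range readings taken d metres apart along a straight walk. */
    double r1 = rssi_to_metres(-70.0);   /* at the first point  */
    double r2 = rssi_to_metres(-66.0);   /* d metres later      */
    double d  = 2.0;

    /* Law of cosines in the walk/beacon triangle. acos gives the angle
     * between the beacon and the back-track direction; subtract from
     * 180 for the angle off the heading. Left vs. right stays
     * ambiguous from ranges alone. */
    double c = (r2 * r2 + d * d - r1 * r1) / (2.0 * r2 * d);
    if (c > 1.0) c = 1.0;
    if (c < -1.0) c = -1.0;
    double bearing = 180.0 - acos(c) * 180.0 / M_PI;
    printf("ranges %.1f m -> %.1f m, bearing +/- %.0f deg off heading\n",
           r1, r2, bearing);
    return 0;
}
```

Even in this idealized form the left/right ambiguity remains, which is one more reason the result rarely gets past "warmer/colder".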
As to your second question about changing the minor version number, I have not seen any beacon APIs that allow that, either from the beacon manufacturers or from Apple's implementation.
However, a typical beacon is an ARM or other low-power processor with a BLE transceiver, running a program. In theory it should be possible to create your own iBeacon transmitter that changes one of those parameters in order to transmit changing information. On the iOS side you would then define the beacon region by UUID alone (or UUID plus major ID), depending on whether you want to vary both the major and minor IDs or just the minor.
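For reference, the published iBeacon advertising frame is small enough to rebuild on the fly. A sketch of what a DIY transmitter would repack before each re-advertisement (how the frame is handed to the radio is vendor-specific and not shown):

```c
/* Sketch of the published iBeacon advertising frame. A DIY transmitter
 * would rebuild this payload each time it wants new major/minor values;
 * handing the frame to the radio is vendor-specific and not shown. */
#include <stdint.h>

typedef struct {
    uint8_t flags[3];        /* 0x02 0x01 0x06: BLE flags AD element  */
    uint8_t len;             /* 0x1A: 26 bytes of data follow         */
    uint8_t type;            /* 0xFF: manufacturer-specific data      */
    uint8_t company[2];      /* 0x4C 0x00: Apple's company ID         */
    uint8_t beacon_type[2];  /* 0x02 0x15: iBeacon, 21-byte payload   */
    uint8_t uuid[16];        /* proximity UUID                        */
    uint8_t major[2];        /* big-endian                            */
    uint8_t minor[2];        /* big-endian                            */
    int8_t  measured_power;  /* calibrated RSSI at 1 m, dBm           */
} ibeacon_adv_t;

/* Repack major/minor (e.g. with a sequence number or encoded bearing)
 * and re-advertise. */
void ibeacon_set_ids(ibeacon_adv_t *adv, uint16_t major, uint16_t minor)
{
    adv->major[0] = (uint8_t)(major >> 8); adv->major[1] = (uint8_t)major;
    adv->minor[0] = (uint8_t)(minor >> 8); adv->minor[1] = (uint8_t)minor;
}
```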
Note, too, that iBeacons are a special case of BLE, and the BLE standard does support the sending of arbitrary, changing data. You might be better off implementing your own BLE scheme either instead of or in addition to iBeacons.

What AI is best for learning an area

I have a robot that I need to write an autonomous program for. The program is to play on this field: http://www.vexforum.com/wiki/index.php/Gateway and pick up the balls and barrels and put them in the cylinders (goals). I have sensors like light detection (best for following a white line on the ground, or for keeping track of location by noticing when you cross a white line), ultrasonic sonar, bump sensors, and encoders (which count wheel rotations). I want to make a program where the robot learns the field and learns how best to navigate with the tasks at hand. I am thinking a neural net is my best choice, but I can't think of what inputs I would use. The main thing is I don't want scripted paths. I know this is pretty vague, but too much detail and no one would read this. Anyone have any ideas?
Check out Udacity course CS373 by Prof. Thrun at http://www.udacity.com/overview/Course/cs373.
He has successfully applied particle filters to program the Google driverless car.
You need to use Simultaneous localization and mapping (SLAM)
It is a pretty standard and successful technique for robot localization.
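To give a flavour of the particle-filter approach from that course, here is a minimal 1-D Monte Carlo localization sketch: the robot drives around a looped track of known length and its light sensor fires near white lines at known positions. All positions and noise levels are hypothetical:

```c
/* Minimal 1-D Monte Carlo localization (particle filter) sketch: the
 * robot drives around a looped track of known length, and its light
 * sensor fires near white lines at known positions. All numbers are
 * hypothetical. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define N_PART 500
#define TRACK  10.0                          /* loop length, metres */

static const double lines[] = { 2.0, 5.0, 7.5 };
#define N_LINES 3

static double frand(void) { return rand() / (RAND_MAX + 1.0); }

/* Likelihood of the sensor reading given a hypothetical position x. */
static double weight(double x, int saw_line)
{
    double d = TRACK;
    for (int i = 0; i < N_LINES; i++) {
        double diff = fabs(x - lines[i]);
        if (diff > TRACK / 2) diff = TRACK - diff;   /* wrap-around */
        if (diff < d) d = diff;
    }
    double p_line = exp(-d * d / (2 * 0.1 * 0.1));   /* sensor model */
    return (saw_line ? p_line : 1.0 - p_line) + 1e-6;
}

int main(void)
{
    double p[N_PART], w[N_PART], tmp[N_PART];

    for (int i = 0; i < N_PART; i++)
        p[i] = frand() * TRACK;              /* uniform initial belief */

    /* One predict/update/resample cycle: odometry says we moved 1 m,
     * and the light sensor reported a line. */
    double moved = 1.0, wsum = 0.0;
    int saw_line = 1;
    for (int i = 0; i < N_PART; i++) {
        p[i] = fmod(p[i] + moved + (frand() - 0.5) * 0.2 + TRACK, TRACK);
        w[i] = weight(p[i], saw_line);
        wsum += w[i];
    }

    /* Low-variance (systematic) resampling proportional to weight. */
    double step = wsum / N_PART, pos = frand() * step, acc = w[0];
    for (int i = 0, j = 0; i < N_PART; i++, pos += step) {
        while (acc < pos && j < N_PART - 1) acc += w[++j];
        tmp[i] = p[j];
    }
    for (int i = 0; i < N_PART; i++) p[i] = tmp[i];

    /* Crude point estimate (ignores the circular topology). */
    double mean = 0.0;
    for (int i = 0; i < N_PART; i++) mean += p[i];
    printf("belief after one line crossing: %.2f m\n", mean / N_PART);
    return 0;
}
```

Repeating that cycle as the robot moves concentrates the particles around the true position, which is exactly what SLAM-style localization needs as its front end.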

Measuring distance - Windows phone app

Good evening all.
I'm just trying to collate some ideas really and was wondering if I could pick some brains.
I'd like to develop an app that relies upon measuring distance reasonably accurately. So, for example, given a central point, I want to be able to detect whether the phone is within a radius of one meter.
How could I achieve this?
The points would be static but I don't think GPS would be accurate enough to rely on this solely.
I'm definitely not a hardware chap but is there a way of combining GPS and some other sort of transmitter to ensure accuracy?
Any help or suggestions greatly appreciated.
One meter accuracy? It's probably not going to happen with any phone hardware out there - definitely not with any Windows Phone. GPS isn't accurate enough without a differential beacon, and phones don't have the hardware to receive that (and I doubt you have a differential transmitter either).
The location service on the phone (assuming high accuracy is selected) combines data from GPS, cell towers and WiFi hot spots to provide a location.
There is no way to include the use of other sensors to improve this data.
You also won't be able to get the level of accuracy you're after from the phone. It's just not designed for the purpose you describe.

device to measure vibration - retrieving data?

Hey guys,
I'm working on a concept for university. I wonder what's the easiest and best way to measure certain vibrations in a room. Imagine a room full of people dancing. Is there an affordable device I can put on the floor that sends data to my computer, so I can read out vibration values or use the vibration as data?
Thank you for your help.
I would guess that a microphone, as Pointy suggested, would work, but if you're on a near-zero budget, find an old speaker and bolt it face-down to the floor. Connect the wires to a 1/8" phono plug and plug it into the microphone-in jack on your sound card. Record the vibration data using Audacity. The floor's vibration will flex the speaker cone and generate small amounts of electricity, which the sound card input will see. If you put a foam-lined box over the top (actually back) of the speaker you'll minimize the effect of sound waves from the air on the speaker cone.
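Once you have a recording in Audacity, you can export it as headerless 16-bit mono PCM ("raw" export) and reduce it to vibration levels with a few lines of C; the filename and block size here are arbitrary:

```c
/* Sketch: reduce a raw 16-bit mono PCM export from Audacity to one
 * RMS vibration level per block of samples. Filename, sample rate,
 * and block size are arbitrary choices. */
#include <stdio.h>
#include <math.h>

#define BLOCK 4410   /* 0.1 s per block at a 44.1 kHz sample rate */

int main(void)
{
    FILE *f = fopen("floor.raw", "rb");
    if (!f) { perror("floor.raw"); return 1; }

    short buf[BLOCK];
    size_t n;
    while ((n = fread(buf, sizeof(short), BLOCK, f)) > 0) {
        double sq = 0.0;
        for (size_t i = 0; i < n; i++)
            sq += (double)buf[i] * buf[i];
        /* RMS per block, normalized to full scale (0..1). */
        printf("%.4f\n", sqrt(sq / n) / 32768.0);
    }
    fclose(f);
    return 0;
}
```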
There is specific noise monitoring equipment which could serve that purpose, depending on how accurate the information you are monitoring needs to be.
I used to operate sound monitoring equipment as part of a rotating equipment inspection program when I was in the Navy. Basically it was a set of transducers you mounted to the equipment you wanted to monitor, and a proprietary box for recording and analyzing the results. I'm sure you could easily replicate that functionality with a PC.
Do a search for "Vibration Monitoring Equipment" or "Condition Monitoring" and see what turns up. If you are at a University with an engineering department I would imagine the ME's would have something like what you're looking for.
