I want to be able to display and close an image within a few milliseconds.
I generally understand how to do this for a few seconds, but how can I achieve accurate millisecond-level rendering?
What is the approach to ensuring that an image has been open for only 10 milliseconds, or even 5 milliseconds, and that my monitor has actually displayed it (assuming, for example, a 240 Hz monitor)?
From my limited understanding of a monitor's refresh rate, a 240 Hz monitor refreshes the display 240 times a second, which is roughly every 4.2 milliseconds. If I want to display something for 5 milliseconds, it's possible that for more than half that time my monitor hasn't actually refreshed (and therefore hasn't displayed the image). Under that logic, is it correct that I would need to display the image for around 4 or 8 milliseconds? Is there a way to synchronize my display's refresh rate with the output of my program?
I'm planning to do this in C++, although the language isn't too critical provided the problem can be solved. I haven't managed to find any sources on how to approach this sort of problem. Is there any direction toward a solution?
This syncing is known as V-sync (vertical sync). Displays are typically drawn in horizontal lines, top to bottom, so an entire frame has been shown after the last line is drawn, just before drawing wraps back to the first line at the top of the screen. I.e. the V-sync event is the vertical wrap-around.
You need a game-oriented API like DirectX or OpenGL, which lets you enable V-sync so that each buffer swap completes at a refresh boundary. Showing an image for exactly N refreshes then amounts to swapping it in, waiting N swaps, and swapping it out.
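As an illustration, here is a minimal C++ sketch using OpenGL via GLFW; the library choice and window setup are assumptions on my part, not something the answer above prescribes. With the swap interval set to 1, each glfwSwapBuffers call returns at a refresh boundary, so on a 240 Hz monitor two swaps hold the image for roughly 2 × 4.17 ms ≈ 8.3 ms.

// Show an "image" for a fixed number of V-sync'd refreshes, then blank it.
// Assumes GLFW is installed; error handling is minimal for brevity.
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow *win = glfwCreateWindow(800, 600, "flash", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(1);            // V-sync on: swaps block until the next refresh

    const int kFramesVisible = 2;   // 2 refreshes at 240 Hz is ~8.3 ms

    for (int i = 0; i < kFramesVisible; ++i) {
        glClearColor(1.0f, 1.0f, 1.0f, 1.0f);  // stand-in for drawing the real image
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(win);       // with V-sync, returns in step with the refresh
        glfwPollEvents();
    }

    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);      // blank frame replaces the image
    glClear(GL_COLOR_BUFFER_BIT);
    glfwSwapBuffers(win);

    glfwTerminate();
    return 0;
}

Note that a compositing window manager or driver-side frame queueing can still add latency, so for hard guarantees use exclusive fullscreen and verify externally (e.g. with a photodiode).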
I'm writing a toy app to experiment with some Core Animation features, including animating along a path (that's where the Sun's movement comes in) and manipulating time.
https://github.com/boredzo/WatchCompass
(Never mind the button, which isn't implemented yet.)
The sun and watch face are CALayers, each containing a static image. The hour hand is a CAShapeLayer within the watch face layer, with its anchor point set to one end ((NSPoint){ 0.5, 1.0 }).
The sun is animated using a CAKeyframeAnimation along a path. The ellipse shows the path; you can see that they're not lined up for some reason, but that's a different question.
The hour hand's transform.rotation.z is animated using a CABasicAnimation, as described in this answer.
The problem—at least the one I'm asking about in this question—is the difference in duration.
Both animations are set to exactly the same duration, but the sun arrives back at its starting position a full two clock-hours before the hour hand does.
Of course, eventually the clock's duration will be exactly half the sun's duration (or its speed set to 2), since a clock only has 12 hours. If I do that, then the hour hand falls 4 clock-hours behind the sun, rather than 2.
So, given that both animations have the same duration, or the duration of the clock's animation is an even multiple of the sun's animation, why does the clock take longer?
For that matter, although I'm not complaining, why does the sun wait for the clock to catch up?
This appears to be because you aren't specifying a value for the keyTimes property of the keyframe animation. Per the documentation:
For the best results, the number of elements in the array should match the number of elements in the values property or the number of control points in the path property. If they do not, the timing of your animation might not be what you expect.
Indeed, setting keyTimes to @[ @0, @0.25, @0.5, @0.75, @1 ] appears to correct this.
I'd like to track the position of the device with respect to an initial position with high accuracy (ideally) for motions at a small scale (say < 1 meter). The best bet seems to be using motionReading.SensorReading.DeviceAcceleration. I tried this, but ran into a few problems.

Apart from the noisy readings (which I was expecting and can tolerate), I see some behavior that is conceptually wrong. For example, if I start from rest, move the phone around, and bring it back to rest, periodically updating the velocity vector along all three dimensions in the process, I would expect the final magnitude of the velocity to be very small (ideally 0). But I don't see that.

I have extensively reviewed the available help, including the official MSDN pages, but I don't see any examples where the position/velocity of the device is updated using the acceleration vector. Is the acceleration vector that the API returns (at least in theory) supposed to be the rate of change of velocity, or something else? (FYI: my device does not have a gyroscope, so the API is going to be the low-accuracy version.)
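For reference, the update I'm doing amounts to a straightforward double integration. Here is a minimal sketch of it (the Vec3 type and the callback name are mine for illustration, not part of the sensor API):

// Naive dead reckoning: integrate acceleration into velocity, and velocity
// into position, once per sensor sample.
struct Vec3 { double x = 0, y = 0, z = 0; };

Vec3 velocity, position;

// accel: gravity-free acceleration in m/s^2; dt: seconds since last sample.
void onAccelerationSample(const Vec3 &accel, double dt) {
    // v(t+dt) = v(t) + a*dt: any constant bias in accel grows linearly in v.
    velocity.x += accel.x * dt;
    velocity.y += accel.y * dt;
    velocity.z += accel.z * dt;

    // p(t+dt) = p(t) + v*dt: the same bias grows quadratically in position.
    position.x += velocity.x * dt;
    position.y += velocity.y * dt;
    position.z += velocity.z * dt;
}

I understand that even with this math right, sensor bias and noise will accumulate through the two integrations; still, I'd like to confirm that the acceleration vector is the right quantity to integrate in the first place.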
How can I query the mouse DPI (pointer resolution) on Windows?
I read the article Pointer Ballistics for Windows XP. It says "the typical pointer resolution is 400 mickey/inch". But how can I query the exact value used by various kinds of mice?
It would be great if you could also point me to documents related to this topic.
Thanks!
It's impossible to tell. The mouse DPI is simply the number of times the mouse reports a change in location when it's moved by one inch. On the other side of the mouse cord all you know is that you periodically get a change in location, and you simply move the pointer on the screen every time.
One thing you can do, if this is critically important for some special application, is to have the user move his/her mouse exactly one inch and count the changes in location. If you're doing this in some professional environment, then it's probably worth your while to issue special equipment: give your users the same model of high-end mouse with a particular known DPI. I say high-end because for most mice the approximate DPI number will be ludicrously inaccurate.
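To make that concrete, here is a sketch of the measurement using the Win32 Raw Input API, which delivers device counts ("mickeys") before pointer ballistics are applied. The class name and window title are placeholders:

// Count raw horizontal mouse counts while the user moves the mouse exactly
// one inch to the right, then report the total when the window is closed.
#include <windows.h>
#include <cstdio>

static LONG g_mickeysX = 0;  // accumulated horizontal device counts

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp) {
    if (msg == WM_INPUT) {
        RAWINPUT raw;
        UINT size = sizeof(raw);
        GetRawInputData((HRAWINPUT)lp, RID_INPUT, &raw, &size,
                        sizeof(RAWINPUTHEADER));
        if (raw.header.dwType == RIM_TYPEMOUSE &&
            !(raw.data.mouse.usFlags & MOUSE_MOVE_ABSOLUTE)) {
            g_mickeysX += raw.data.mouse.lLastX;  // relative counts, no ballistics
        }
        return 0;
    }
    if (msg == WM_DESTROY) { PostQuitMessage(0); return 0; }
    return DefWindowProcA(hwnd, msg, wp, lp);
}

int main() {
    WNDCLASSA wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = GetModuleHandleA(NULL);
    wc.lpszClassName = "MickeyCounter";
    RegisterClassA(&wc);
    HWND hwnd = CreateWindowA("MickeyCounter",
                              "Move the mouse 1 inch right, then close me",
                              WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                              CW_USEDEFAULT, CW_USEDEFAULT, 480, 120,
                              NULL, NULL, wc.hInstance, NULL);

    // Usage page 0x01 (generic desktop), usage 0x02 (mouse).
    RAWINPUTDEVICE rid = { 0x01, 0x02, 0, hwnd };
    RegisterRawInputDevices(&rid, 1, sizeof(rid));

    MSG m;
    while (GetMessageA(&m, NULL, 0, 0) > 0) DispatchMessageA(&m);
    printf("Counted %ld mickeys over one inch\n", g_mickeysX);
    return 0;
}

Divide the accumulated count by the distance actually moved (in inches) to estimate the DPI; averaging several passes improves the estimate.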
I'd like to get mouse movements in high resolution and high framerate on OSX.
"High framerate" = 60 fps or higher (preferably > 120)
"High resolution" = Subpixel values
Problem
I've got an OpenGL view running at about the monitor's refresh rate, so ~60 fps. I use the mouse to look around, so I've hidden the mouse cursor and I'm relying on mouse delta values.
The problem is that the mouse events come in at much too low a framerate, and the values are snapped to integers (whole pixels). This causes a "choppy" viewing experience. Here's a visualization of mouse delta values over time:
mouse delta X
^ xx
2 | x x x x xx
| x x x x xx x x x
0 |x-x-x--xx-x-x-xx--x-x----x-xx-x-----> frame
|
-2 |
v
This is a typical (shortened) curve created from the user moving the mouse a little bit to the right. Each x represents the deltaX value for a frame, and since deltaX values are rounded to whole numbers, this graph is actually quite accurate. As we can see, the deltaX value will be 0.000 one frame, then 1.000 the next, then 0.000 again, then 2.000, then 0.000 again, then 3.000, 0.000, and so on.
This means that the view will rotate 2.000 units one frame, then 0.000 units the next, and then 3.000 units. This happens while the mouse is being dragged at more or less constant speed. Needless to say, this looks like crap.
So, how can I 1) increase the event framerate of the mouse, and 2) get subpixel values?
So far
I've tried the following:
- (void)mouseMoved:(NSEvent *)theEvent {
    CGFloat dx = [theEvent deltaX];
    CGFloat dy = [theEvent deltaY];
    // ...
    actOnMouse(dx, dy);
}
Well, this one was obvious. dx here is a CGFloat, but the values are always whole numbers (0.000, 1.000, etc.). This creates the graph above.
So the next step, I thought, was to tap the mouse events before they enter the WindowServer. So I created a CGEventTap:
CGEventMask eventMask = CGEventMaskBit(kCGEventMouseMoved);
CFMachPortRef eventTap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap,
                                          kCGEventTapOptionDefault, eventMask,
                                          myCGEventCallback, NULL);
// ...
CGEventRef myCGEventCallback(CGEventTapProxy proxy, CGEventType type,
                             CGEventRef event, void *refcon) {
    double dx = CGEventGetDoubleValueField(event, kCGMouseEventDeltaX);
    double dy = CGEventGetDoubleValueField(event, kCGMouseEventDeltaY);
    return event;  // pass the event along unmodified
}
Still, the values are all n.000, although I believe the rate of event firing is a little higher. But it's still not at 60 fps, and I still get the chart above.
I've also tried setting the mouse sensitivity really high and then scaling the values down on my side. But it seems OSX adds some sort of acceleration or something: the values get really "unstable" and consequently unusable, and the rate of fire is still too low.
Having no luck so far, I started following the mouse events down the rabbit hole and arrived at IOKit. This is scary for me. It's the mad hatter. The Apple documentation gets weird and seems to say "if you're this deep down, all you really need is header files".
So I have been reading header files. And I've found some interesting tidbits.
In <IOKit/hidsystem/IOLLEvent.h> on line 377 there's this struct:
struct { /* For mouse-down and mouse-up events */
    UInt8 subx; /* sub-pixel position for x */
    UInt8 suby; /* sub-pixel position for y */
    // ...
} mouse;
See, it says sub-pixel position! Ok. Then on line 73 in <IOKit/hidsystem/IOLLParameter.h>
#define kIOHIDPointerResolutionKey "HIDPointerResolution"
Hmm.
All in all, I get the feeling OSX knows about sub-pixel mouse coordinates deep down, and there just has to be a way to read raw mouse movements every frame, but I've just no idea how to get those values.
Questions
Erh, so, what am I asking for?
Is there a way of getting high framerate mouse events in OSX? (Example code?)
Is there a way of getting sub-pixel mouse coordinates in OSX? (Example code?)
Is there a way of reading "raw" mouse deltas every frame? (I.e. not relying on an event.)
Or, how do I get NXEvents or set HIDParameters? Example code? (So I can dig deeper into this on my own...)
(Sorry for the long post.)
(This is a very late answer, but one that I think is still useful for others that stumble across this.)
Have you tried filtering the mouse input? This can be tricky, because filtering tends to be a trade-off between lag and precision. However, years ago I wrote an article for a game development site explaining how I filtered my mouse movements. The link is http://www.flipcode.com/archives/Smooth_Mouse_Filtering.shtml.
Since that site is no longer under active development (and may go away) here is the relevant excerpt:
In almost every case, filtering means averaging. However, if we simply average the mouse movement over time, we'll introduce lag. How, then, do we filter without introducing any side-effects? Well, we'll still use averaging, but we'll do it with some intelligence. And at the same time, we'll give the user fine-control over the filtering so they can adjust it themselves.
We'll use a non-linear filter of averaged mouse input over time, where the older values have less influence over the filtered result.
How it works
Every frame, whether you move the mouse or not, we put the current mouse movement into a history buffer and remove the oldest history value. So our history always contains X samples, where X is the "history buffer size", representing the most recent sampled mouse movements over time.
If we used a history buffer size of 10, and a standard average of the entire buffer, the filter would introduce a lot of lag. Fast mouse movements would lag behind 1/6th of a second on a 60FPS machine. In a fast action game, this would be very smooth, but virtually unusable. In the same scenario, a history buffer size of 2 would give us very little lag, but very poor filtering (rough and jerky player reactions.)
The non-linear filter is intended to combat this mutually-exclusive scenario. The idea is very simple. Rather than just blindly averaging all values in the history buffer equally, we average them with a weight. We start with a weight of 1.0, so the first value in the history buffer (the current frame's mouse input) has full weight. We then multiply this weight by a "weight modifier" (say... 0.2) and move on to the next value in the history buffer. The further back in time (through our history buffer) we go, the less weight (influence) the values have on the final result.
To elaborate, with a weight modifier of 0.5, the current frame's sample would have 100% weight, the previous sample would have 50% weight, the next oldest sample would have 25% weight, the next would have 12.5% weight and so on. If you graph this, it looks like a curve. So the idea behind the weight modifier is to control how sharply the curve drops as the samples in the history get older.
Reducing the lag means decreasing the weight modifier. Reducing the weight modifier to 0 will provide the user with raw, unfiltered feedback. Increasing it to 1.0 will cause the result to be a simple average of all values in the history buffer.
We'll offer the user two variables for fine control: the history buffer size and the weight modifier. I tend to use a history buffer size of 10, and just play with the weight modifier until I'm happy.
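In code, the filter described above might look like the following sketch; the class and member names are mine, not from the original article:

#include <cstddef>
#include <vector>

// Non-linear mouse filter: a weighted average over a small history buffer,
// with geometrically decaying weights. historySize must be >= 1.
class MouseFilter {
public:
    MouseFilter(std::size_t historySize = 10, double weightModifier = 0.5)
        : history_(historySize, 0.0), weightModifier_(weightModifier) {}

    // Push this frame's raw delta (call once per frame, even when it is 0)
    // and return the filtered delta.
    double filter(double rawDelta) {
        history_.pop_back();                          // drop the oldest sample
        history_.insert(history_.begin(), rawDelta);  // newest at the front

        double weight = 1.0, sum = 0.0, totalWeight = 0.0;
        for (double sample : history_) {
            sum += sample * weight;       // older samples count for less
            totalWeight += weight;
            weight *= weightModifier_;    // geometric decay per step back in time
        }
        return sum / totalWeight;         // normalized weighted average
    }

private:
    std::vector<double> history_;   // most recent sample first
    double weightModifier_;         // toward 0 = raw input, 1.0 = plain average
};

One instance per axis (dx and dy) keeps the axes independent, and the two constructor arguments map directly onto the tuning knobs discussed above.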
If you are using the IOHIDDevice callbacks for the mouse, you can use this to get a double value:
double doubleValue = IOHIDValueGetScaledValue(inIOHIDValueRef, kIOHIDValueScaleTypePhysical);
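For context, here is a hedged sketch of the surrounding setup: an IOHIDManager scheduled on the current run loop with an input-value callback. Matching is left as NULL (all HID devices) to keep it short; a real program would supply a matching dictionary for mice, and recent versions of macOS also require input-monitoring permission.

#include <IOKit/hid/IOHIDLib.h>
#include <IOKit/hid/IOHIDUsageTables.h>
#include <cstdio>

// Called for every input value; filter down to X-axis motion.
static void handleInputValue(void *context, IOReturn result,
                             void *sender, IOHIDValueRef value) {
    IOHIDElementRef element = IOHIDValueGetElement(value);
    if (IOHIDElementGetUsagePage(element) == kHIDPage_GenericDesktop &&
        IOHIDElementGetUsage(element) == kHIDUsage_GD_X) {
        double dx = IOHIDValueGetScaledValue(value, kIOHIDValueScaleTypePhysical);
        printf("dx = %f\n", dx);
    }
}

int main() {
    IOHIDManagerRef manager =
        IOHIDManagerCreate(kCFAllocatorDefault, kIOHIDOptionsTypeNone);
    IOHIDManagerSetDeviceMatching(manager, NULL);  // NULL matches every device
    IOHIDManagerRegisterInputValueCallback(manager, handleInputValue, NULL);
    IOHIDManagerScheduleWithRunLoop(manager, CFRunLoopGetCurrent(),
                                    kCFRunLoopDefaultMode);
    IOHIDManagerOpen(manager, kIOHIDOptionsTypeNone);
    CFRunLoopRun();  // run forever, delivering HID values to the callback
    return 0;
}

Compile with -framework IOKit -framework CoreFoundation.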
The possibility of subpixel coordinates exists because Mac OS X is designed to be resolution independent. A square of 2x2 hardware pixels on a screen could represent a single virtual pixel in software, allowing the cursor to be placed at (x + 0.5, y + 0.5).
On any actual Mac using normal 1x scaling, you will never see subpixel coordinates, because the mouse cursor cannot be moved to a fractional pixel position on the screen: the quantum of mouse movement is precisely 1 pixel.
If you need to get access to pointer device delta information at a lower level than the event dispatching system provides then you'll probably need to use the user-space USB APIs.