Correct way to drive Main Loop in Cocoa - macos

I'm writing a game that currently runs in both Windows and Mac OS X. My main game loop looks like this:
while (running)
{
    ProcessOSMessages(); // Using Peek/Translate message in Win32
                         // and nextEventMatchingMask in Cocoa
    GameUpdate();
    GameRender();
}
That's obviously simplified a bit, but that's the gist of it. In Windows, where I have full control over the application, it works great. Unfortunately, Apple has their own way of doing things in Cocoa apps.
When I first tried to implement my main loop in Cocoa, I couldn't figure out where to put it, so I created my own NSApplication per this post. I threw my GameFrame() right in my run function and everything worked correctly.
However, I don't feel like it's the "right" way to do it. I would like to play nicely within Apple's ecosystem rather than hacking together a solution that happens to work.
This article from Apple describes the old way to do it, with an NSTimer, and the "new" way using CVDisplayLink. I've hooked up the CVDisplayLink version, but it just feels... odd. I don't like the idea of my game being driven by the display rather than the other way around.
Are my only two options to use a CVDisplayLink or to write my own NSApplication? Neither one of those solutions feels quite right.

I am curious to see if anyone who has actually done this cares to weigh in, but here is my understanding:
Apple pushes the CVDisplayLink solution over doing a loop on the main thread that uses -nextEventMatchingMask:untilDate:inMode:dequeue: because, I think, it provides better responsiveness for UI controls. This may not be relevant for full-screen games. (Note: You don't need to replace NSApplication to use that form of game loop.) I think the main potential issue with using CVDisplayLink is that it will only run one frame in advance and it does this determination early, which is even stronger than vertical sync. On the plus side, it might improve latency.
Other solutions include decoupling rendering from game logic and running game logic periodically on the main thread and rendering on the CVDisplayLink thread. I would probably only recommend this, however, if you run into issues with the game-driven-by-display paradigm.
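For reference, a bare-bones CVDisplayLink setup looks roughly like this; GameTick and gameContext are placeholders for whatever your engine exposes, not anything from the question:
static CVReturn DisplayLinkCallback(CVDisplayLinkRef displayLink,
                                    const CVTimeStamp *now,
                                    const CVTimeStamp *outputTime,
                                    CVOptionFlags flagsIn,
                                    CVOptionFlags *flagsOut,
                                    void *context)
{
    // Runs on a high-priority background thread, once per display refresh.
    GameTick(context);   // placeholder for your own update + render call
    return kCVReturnSuccess;
}

// Somewhere in your setup code, once the rendering context is ready:
CVDisplayLinkRef displayLink;
CVDisplayLinkCreateWithActiveCGDisplays(&displayLink);
CVDisplayLinkSetOutputCallback(displayLink, &DisplayLinkCallback, gameContext);
CVDisplayLinkStart(displayLink);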

You don't necessarily have to make your own NSApplication based class or use CVDisplayLink to get around the fact that an app's runloop is hidden from you in Cocoa.
You could just create a thread and have your run loop in there instead.
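A rough sketch of that approach, with running, GameUpdate and GameRender standing in for your own engine code:
// At launch (e.g. in applicationDidFinishLaunching:), spin up the loop:
[NSThread detachNewThreadSelector:@selector(gameThreadMain)
                         toTarget:self
                       withObject:nil];

// The secondary thread owns the game loop; the main thread stays free for AppKit.
- (void)gameThreadMain
{
    @autoreleasepool {
        while (running) {
            GameUpdate();
            GameRender();   // marshal any AppKit/UI work back to the main thread
        }
    }
}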
For what it's worth though, I just use CVDisplayLink.

I'm adding this here to revive the question, mainly for portability reasons. From studying the OLC Pixel Game Engine, I found that it works with a do{}while loop and std::chrono to time the frame and calculate fElapsedTime. Below is some code I wrote to do the same thing. It also adds a makeup portion to keep the framerate from shooting above a certain value, in this case 60 FPS.
C++ code
#include <chrono>
#include <thread>

int maxSpeedMicros = 16700;               // ~60 FPS frame budget in microseconds
float fTimingBelt;                        // used to calculate fElapsedTime for internal calls
std::chrono::steady_clock::time_point timingBelt[2];
bool engineRunning = true;                // stays true until the engine stops
bool isPaused = false;

do {
    timingBelt[1] = std::chrono::steady_clock::now();
    fTimingBelt = std::chrono::duration_cast<std::chrono::microseconds>(timingBelt[1] - timingBelt[0]).count() * 0.000001f;
    if (isPaused) {
        do {
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
            timingBelt[1] = std::chrono::steady_clock::now();
        } while (isPaused);
    }
    timingBelt[0] = std::chrono::steady_clock::now();
    // do updating stuff here.
    timingBelt[1] = std::chrono::steady_clock::now();
    int frameMakeup = std::chrono::duration_cast<std::chrono::microseconds>(timingBelt[1] - timingBelt[0]).count();
    if (frameMakeup < maxSpeedMicros) {
        int micros = maxSpeedMicros - frameMakeup;
        std::this_thread::sleep_for(std::chrono::microseconds(micros));
    }
} while (engineRunning);
However, that code was in direct conflict with Cocoa's event-driven model.
Custom main application loop in cocoa
So as a bandaid, I commented out the whole loop, and created a new method that runs one iteration of the loop. I then implemented this in my AppDelegate:
Objective-C code
- (void)applicationDidFinishLaunching:(NSNotification *)notification {
    engine->resetTimer();
    [NSTimer scheduledTimerWithTimeInterval:0.016666666667
                                     target:self
                                   selector:@selector(engineLoop:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)engineLoop:(NSTimer *)timer { // Let the engine object handle this. That's too complicated!
    engine->updateState();
    [glView update];                      // the engine does all of its drawing into a GLView
    [[glView openGLContext] flushBuffer];
}
Still to do is adjusting the tolerance of the timer object. The Apple developer documentation states that if a timer misses its fire window, it waits for the next scheduled fire time. Setting a tolerance, however, allows the system to shift the timing of future fires, which makes framerate transitions smoother and makes better use of CPU power.
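For reference, adjusting the tolerance just means keeping a reference to the timer; the 2 ms value below is only an illustration, not a recommendation:
NSTimer *timer = [NSTimer scheduledTimerWithTimeInterval:1.0 / 60.0
                                                  target:self
                                                selector:@selector(engineLoop:)
                                                userInfo:nil
                                                 repeats:YES];
timer.tolerance = 0.002;   // lets the system coalesce fires rather than drop them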
So at this point I am open to suggestions and input about what others have done to make more portable code. I am planning on a boolean argument in the engine's constructor named "eventDriven": if it is false, the engine will start its own game-loop thread, and the top of the event loop will be split out into an "engineUpdate" method that handles all of the code that can be event driven. Then, when building on an event-driven system, the delegate can simply construct the engine with eventDriven = TRUE and have its events drive the game update.
Has anyone done this? And if so, how does it perform cross-platform?

Related

What is the recommended frequency for UI changes?

I have a Cocoa application window (NSWindow) whose position on the screen should be updated frequently (depending on some calculation). As noted in the documentation, UI changes should be made on the main thread:
void calculationThread()
{
    while (true)
    {
        calculatePosition();
        if (positionChanged)
        {
            dispatch_async(dispatch_get_main_queue(), ^{ setWindowPos(); });
        }
    }
}

void setWindowPos()
{
    [window setFrame:_newFrame display:YES];
}
Now the problem I have is that the window movement is very slow and delayed. After doing some profiling I see that the calculation process takes about 40 ms, meaning that I'm dispatching UI updates about 25 times a second.
I've read here that this might be faster than the updates can be processed and that a timer should be used to fire the changes every tenth of a second or so. But wouldn't that be too slow for the human eye? (I mean, in that case the movement wouldn't be delayed, but it would lag, causing pretty much the same effect.)
I will appreciate some knowledge sharing on this. Actually my main 2 questions are:
Are 25-30 UI updates per second really too much?
If yes, what is the recommended UI changes frequency?
The frequency at which a window can be moved around onscreen without problems will of course depend upon the speed of the user's machine, the video card they have, the size of the window, and probably a bunch of other factors. There is no single good answer to this. However, if you just drag a window around on your screen, you will notice that it can probably be moved very smoothly (unless your machine is very busy or very low on memory or something); I would not expect 25 times per second to produce a problem on a modern Mac. Not even close, in fact.
@RobNapier's points about Core Animation etc. are fine, but overstated I think; there is nothing inherently wrong with changing your UI using a timer or other periodic update if that is what you actually want to do. Core Animation is a toolkit for making some types of animation easier; using it is not required, and it is not suited to every problem. Similarly, if you want to make changes that are actually synced to the screen refresh then CVDisplayLink is useful, but it doesn't really sound like that's what you want to do.
For your purposes, your basic approach seems fine, although I would suggest adding an NSDate check in order to skip updates if the previous update was less than, say, 1/60th of a second ago. After all, the calculation appears to take about 40 ms on your machine, but it might be much faster on some other machine; you want to throttle your drawing to a reasonable rate just to be a good citizen.
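A minimal sketch of that throttling idea (the 1/60 s threshold and the lastUpdate variable are illustrative only):
static NSTimeInterval lastUpdate = 0;

void setWindowPos()
{
    NSTimeInterval now = [NSDate timeIntervalSinceReferenceDate];
    if (now - lastUpdate < 1.0 / 60.0) {
        return;               // previous update was too recent; skip this one
    }
    lastUpdate = now;
    [window setFrame:_newFrame display:YES];
}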
So what is the problem, then? I suspect the issue might actually be your call [window setFrame:_newFrame display:YES]. If you look at Apple's docs for that method, they state "When YES the window sends a displayIfNeeded message down its view hierarchy, thus redrawing all views." Each time you call that method, then, you are not only moving your window (which I gather is your intention); you are redrawing all of the contents of the window, too, and that is slow. If you don't need to do that, then that is the overhead you need to eliminate. Call setFrameOrigin: or setFrameTopLeftPoint: instead (which make the semantics clear, that you are moving the window without resizing it or redrawing it), or perhaps just setFrame:display: passing NO instead of YES, and I'm guessing your performance problem will vanish.
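In code, those cheaper variants would look something like this (a sketch; pick whichever matches your intent):
[window setFrameOrigin:_newFrame.origin];    // move only: no resize, no forced redraw
// or
[window setFrame:_newFrame display:NO];      // move/resize without redrawing the views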
If you do in fact need to redraw the window contents every time, then please edit the problem description to reflect that. In that case, the solution will have to involve profiling why your window drawing is slow, and figuring out ways to optimize that, which is an entirely different problem.
As you've discovered, you should never try to drive the UI from a tight loop. You should let the UI drive you. There are three primary tools for that.
For simple problems, AppKit is capable of moving windows around the screen. Just call [NSWindow setFrame:display:animate:]. You can override animationResizeTime: to modify the timing.
In many cases AppKit doesn't give enough control. In those cases, the best tool is almost always Core Animation. You should tell the system, using Core Animation, where you want UI elements to wind up, and over what period and path, and let it do the work of getting them there. See the Core Animation Programming Guide for extensive documentation on how to use it. It focuses on animating CALayer, but the techniques are similar for NSWindow. You'll use [NSWindow setAnimations:] to add your animation. Look at the NSAnimatablePropertyContainer protocol (which NSWindow conforms to) for more information. For a simple sample project animating NSWindow, see Just Say No from CIMGF.
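As a rough sketch of what the animatable-property approach looks like in practice (the 0.25 s duration is just an example value):
[NSAnimationContext runAnimationGroup:^(NSAnimationContext *context) {
    context.duration = 0.25;
    [[window animator] setFrame:_newFrame display:YES];   // animated via the animator proxy
} completionHandler:nil];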
In a few cases, you really do need to update the screen manually at the screen update frequency. I must stress how rare this situation is. In almost all cases, Core Animation is the correct tool. But in those rare case (some kinds of video for instance), you can use a CVDisplayLink to handle this. That will call you each time the screen would like to refresh, giving you an opportunity to update your content to match.

Unity global mouse events

Most Unity tutorials suggest using Mouse events within the Update function, like this:
function Update () {
    if (UnityEngine.Input.GetMouseButton(1)) {
    }
}
This strikes me as really inefficient though, similar to using onEnterFrame in AS or setInterval in JS to power the whole application - I'd really prefer to use an events based system.
The OnMouseDown() method is useful, but it only fires when the mouse-down happens on the object, not anywhere in the scene.
So here's the question: Is there a MouseEvent in Unity for detecting if the mouse button is down globally, or is the Update solution the recommended option?
This strikes me as really inefficient though, similar to using onEnterFrame in AS or setInterval in JS to power the whole application - I'd really prefer to use an events based system.
As already pointed out in the comments, this isn't necessarily less efficient. Every event-based system is probably using a polling routine like that behind the scenes, updated at a given frequency.
In many game engines/frameworks you are going to find a polling based approach for input handling. I think this is related to the fact that input update frequency is directly correlated to the frame rate/update loop frequency. In fact it doesn't make much sense to listen for input at higher or lower frequency than your game loop.
So here's the question: Is there a MouseEvent in Unity for detecting if the mouse button is down globally, or is the Update solution the recommended option?
No, there isn't. However, if you want, you can wrap mouse input detection inside a single class and expose events from it that other classes can register for.
Something like:
using System;
using UnityEngine;

public class MouseInputHandler : MonoBehaviour
{
    public event Action<Vector2> MousePressed;
    public event Action<Vector2> MouseMoved;
    ...

    void Update()
    {
        if (Input.GetMouseButton(0))
        {
            MousePressed?.Invoke(Input.mousePosition);
            ...
        }
    }
}
As stated, you can use it without major concerns; Unity handles the polling internally and keeps that kind of input check cheap. That's the beauty of a modern game engine, after all. You normally shouldn't have to hack your way around a feature as common as mouse click detection.
However, if you don't want to use the main Update(), you can use a coroutine if you feel more comfortable with that; just bear in mind that Unity coroutines are not multi-threaded either, so in the end everything still has to wait on the main loop.

Refresh a NSOpenGLView within a loop without letting go of the main runloop in Cocoa

I am building a Cocoa/OpenGL app. For periods of about 2 seconds at a time, I need to control every video frame as well as write to a digital IO device.
If I let go of the main thread after making the OpenGL calls (for example, if I make the OpenGL calls inside a timer fire method with an interval of about 0.01 s), the OpenGL view is refreshed on every call to glFinish().
But if I instead keep the main thread busy, say in a 2-second-long while loop, the OpenGL calls don't work (surprisingly, the first call to glFinish() works, but the rest don't).
The documentation says that glFinish should block the thread until the GL commands are executed.
Can anybody please help me understand what is going on here, or provide a solution to this problem?
To make it clear: I want to present 200 frames one after another without missing a frame, marking each frame refresh by writing to a digital IO port (I don't have a problem with that part), all on Snow Leopard.
This is not quite my department - pretty vanilla NSOpenGLView user myself - but from the Mac OpenGL docs it looks like you might want to use a CVDisplayLink (Q&A1385) for this. Even if that won't do it, the other stuff there should probably help.
EDIT
I've only done some basic testing on this, but it looks like you can do what you want as long as you first set the correct OpenGL context and then swap buffers after each frame (assuming you're using a double buffered context):
// inside an NSOpenGLView subclass, somewhere outside the usual drawing loop
- (void) drawMultipleFrames
{
    // it might be advisable to also do a [self lockFocus] here,
    // although it seems to work without that in my simple tests
    [[self openGLContext] makeCurrentContext];

    // ... set up common OpenGL state ...

    for ( int i = 0; i < LOTS_OF_FRAMES; ++i )
    {
        // ... draw your frame ...
        glFinish();
        glSwapAPPLE();
    }

    // unlockFocus here if locked earlier
}
I previously tried using [[self openGLContext] flushBuffer] at the end of each frame instead -- that doesn't need glSwapAPPLE but doesn't block like glFinish so you might get frames trampling over one another. This seems to work OK with other apps, runs in the background etc, but of course YMMV.

How to make a function be called every frame in Cocoa programming?

I know a Cocoa application has a main run loop. How can I make a function be called every frame? I mean this function should be called on every pass of the main loop. Is that done through the -(void)run method of [NSRunLoop currentRunLoop]?
You can call getCFRunLoop to get the Core Foundation RunLoop. Then it's just a matter of adding an observer to the CFRunLoop. See the docs for this function:
CFRunLoopAddObserver()
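A minimal sketch of what that looks like; the callback name and the choice of the kCFRunLoopBeforeWaiting activity are illustrative:
static void RunLoopObserverCallback(CFRunLoopObserverRef observer,
                                    CFRunLoopActivity activity,
                                    void *info)
{
    // Called on every pass through the run loop, just before it goes to sleep.
    // Do your per-iteration work here.
}

CFRunLoopObserverRef observer =
    CFRunLoopObserverCreate(kCFAllocatorDefault,
                            kCFRunLoopBeforeWaiting,   // activity to observe
                            true,                      // repeats
                            0,                         // order
                            RunLoopObserverCallback,
                            NULL);
CFRunLoopAddObserver([[NSRunLoop currentRunLoop] getCFRunLoop],
                     observer,
                     kCFRunLoopDefaultMode);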
I'm assuming you are targeting OS X since the tag is cocoa and not cocoa-touch, so if you want to synchronize with every screen frame update, you should check out CVDisplayLink.
I know it's been quite some time. Anyway this might come useful for someone who stumbled upon here like me.

Anyone know why nextEventMatchingMask:untilDate:inMode:dequeue: takes many ms to return an event?

In an OS X game, calling this was recommended as the way to get keyboard and mouse events:
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
for (;;)
{
    NSEvent *event = [NSApp nextEventMatchingMask:NSAnyEventMask untilDate:nil inMode:NSDefaultRunLoopMode dequeue:YES];
    if (!event) break;
    processevent(event);
    ...
}
[pool release];
which is called in the game's main loop (it's cross-platform).
Since the most recent versions of OS X 10.5.x, this call has suddenly been taking many milliseconds per event when there is an event available, and the game's frame rate is affected any time an event appears. If there are multiple events, it can take as long as 10 ms per frame on a slower Mac.
Anyone have a clue as to why this is? Or what I can do alternatively to get events without impacting the game so much?
I tried managing the mouse events myself by getting the mouse position manually and when it gets close to the edge of the screen warping it to the center, but that causes a hitch in the motion (only when the cursor is hidden of course).
Other alternatives might be getting stuff from the HID manager,which we already do for joysticks, but HID is not terribly clear.
The faster the mac the more these hitches from getting events are noticeable.
I think you need to release and re-allocate the autorelease pool inside your loop: as you have it now, all the autoreleased items just build up and are never flushed.
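Something along these lines, keeping the rest of the loop as posted:
for (;;)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSEvent *event = [NSApp nextEventMatchingMask:NSAnyEventMask
                                        untilDate:nil
                                           inMode:NSDefaultRunLoopMode
                                          dequeue:YES];
    if (!event) { [pool release]; break; }
    processevent(event);
    [pool release];    // drain each iteration so autoreleased objects don't pile up
}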
Offhand, I don't know why the method is taking so long to return. That's worth investigating on the cocoa-dev list or another Apple forum resource. My guess is that managing the events yourself is a bad idea — AppKit is optimized for that, and you can safely bet it will be a lot faster than thrown-together custom code.
However, there is something you can do to keep it from affecting your game: put it in a separate thread. This is a suggested approach to keep your UI from freezing up during a long method call. Apple has published an Introduction to Threading programming guide that can help you get up to speed with the critical concepts you'd need.
I think you have to use an actual value in the untilDate argument, like [NSDate distantFuture] or [NSDate distantPast]. The function will block until an event is available in the former case, while it will return immediately with a nil event in the latter case.
I learned this from the GLFW source code.
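In other words, the non-blocking form of the call would look like this:
NSEvent *event = [NSApp nextEventMatchingMask:NSAnyEventMask
                                    untilDate:[NSDate distantPast]
                                       inMode:NSDefaultRunLoopMode
                                      dequeue:YES];
// event is nil when the queue is empty, so the game loop can continue immediately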
