My app got rejected because the dialogs to handle in-app purchases are behind my transparent full screen window. You can still click them, but it's not user-friendly.
How would I handle this? Is there a way to alter the way these dialogs are presented, or should I change the properties of my own window?
I'm talking about these dialogs (the grid is what's drawn on my main window).
You can lower the window level so the dialogs appear on top when you start the store request, and restore the previous level after the request completes. Or you could exit full-screen mode before making the store request. Reviewers may be bothered more by the transparent window, which can be confusing, than by the window ordering.
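A rough sketch of the first approach, assuming the transparent overlay is reachable through an illustrative fullWindow property and that this class is registered as an SKPaymentTransactionObserver (names are not from the original post):

#import <StoreKit/StoreKit.h>

// Drop the overlay below the system dialogs while the purchase is in flight.
- (void)startPurchase:(SKProduct *)product
{
    [self.fullWindow setLevel:NSNormalWindowLevel]; // let the StoreKit dialogs come to the front
    [[SKPaymentQueue defaultQueue] addPayment:[SKPayment paymentWithProduct:product]];
}

// SKPaymentTransactionObserver
- (void)paymentQueue:(SKPaymentQueue *)queue updatedTransactions:(NSArray *)transactions
{
    for (SKPaymentTransaction *transaction in transactions) {
        if (transaction.transactionState != SKPaymentTransactionStatePurchasing) {
            [self.fullWindow setLevel:NSMainMenuWindowLevel + 1]; // restore the original level
        }
    }
}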
This works for me:

@interface FullscreenWindow : NSWindow
@end

@implementation FullscreenWindow

- (id)init
{
    self = [super init];
    if (self) {
        // some init code here...
        // Keep this window above the menu bar.
        [self setLevel:NSMainMenuWindowLevel + 1];
    }
    return self;
}

@end
Is there a way to have an NSMenu-like object displayed as the content of an NSPopover?
Essentially I'd like to reproduce what the macOS Dock does when you right-click an app icon (I don't mind the dark background here; I'm only interested in having the menu displayed in a popover-like window with the arrow pointing to its target).
I have been looking into what NSPopUpButton does, but I couldn't find a way to configure that component this way; it has an arrowPosition property, but that actually refers to the orientation of the arrow on the button itself.
Also, NSMenu is an NSObject rather than a view, so I can't see a clean way to grab its view and add it to a popover. I guess it's not possible, but maybe you have a better idea?
Thanks for any suggestion!
You could check Apple's MenuItemView sample code:
https://developer.apple.com/library/archive/samplecode/MenuItemView/Introduction/Intro.html#//apple_ref/doc/uid/DTS10004136
In its AppDelegate you can see:
// -------------------------------------------------------------------------------
// applicationDockMenu:sender
// -------------------------------------------------------------------------------
// This NSApplication delegate method is called when the user clicks and holds on
// the application icon in the dock.
//
- (NSMenu *)applicationDockMenu:(NSApplication *)sender
{
return self.appDockMenu;
}
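The core technique that sample demonstrates is attaching a custom NSView to an NSMenuItem via -setView:, which is the closest supported way to get arbitrary views into a menu. A minimal sketch (MyMenuItemView is a hypothetical NSView subclass that draws the custom content):

// Give a menu item a custom view, as the MenuItemView sample does.
NSMenu *menu = [[NSMenu alloc] initWithTitle:@"Dock Menu"];
NSMenuItem *item = [[NSMenuItem alloc] initWithTitle:@"" action:NULL keyEquivalent:@""];
MyMenuItemView *view = [[MyMenuItemView alloc] initWithFrame:NSMakeRect(0, 0, 200, 44)];
[item setView:view];      // the item now renders this view instead of plain text
[menu addItem:item];
self.appDockMenu = menu;  // returned from -applicationDockMenu: above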
I'm trying to set my NSWindow to be in the center of the screen, but I'm noticing that when I quit and reopen the app, it takes the position that the window was in when the app closed. Is this expected behavior?
If you selected the "Restorable" window behaviour, then this is the expected behaviour.
You can disable it by unchecking Restorable and leaving the autosave name empty.
Your application saves its state into the "~/Library/Saved Application State/com.identifier.appName.savedState" folder and loads it on launch.
Also one hidden hack to help:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    [[NSUserDefaults standardUserDefaults] setObject:@NO forKey:@"NSQuitAlwaysKeepsWindows"];
}
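Alternatively, you can disable restoration and center the window in code. A minimal sketch, assuming a window outlet on the app delegate:

// Inside -applicationDidFinishLaunching:, assuming a `window` outlet on the delegate.
[self.window setRestorable:NO];  // opt this window out of state restoration
[self.window center];            // place it in the middle of the screen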
I'm using Window Services' CGWindowListCreate and CGWindowListCreateDescriptionFromArray to get window information. When getting kCGWindowBounds in a regular Space everything works fine (I'm drawing borders around the frontmost window on the 0th level). However, when I use the same method while on a fullscreen application's Space, I get nonsense bounds: (0, 855, 480, 1).
I wouldn't care much about this if there were an easy way to tell whether I'm currently in a fullscreen app's Space, because then I'd just draw a border around the screen (well... it would depend on whether the menu bar is showing...).
Is this a bug, or is there a reason for this behavior?
EDIT:
Figured out my problem. It's a bigger issue than I would have liked. The thing is, the API goes through ALL windows, even the ones that aren't, well, normal windows. Chrome's loading bar at the bottom is a window by itself, for example, and Mail also has a window at the top of the app. This is a problem because I have no way to differentiate the window that appears to be frontmost.
For my app, I would like to capture a specific window to intercept mouse events in it. I would have liked the user to press a hotkey and then click on the desired window to select it, but there is no API to get the window under the cursor. I have no clue how to proceed.
Edit 2:
To better help people find a useful answer, changed title from: "Quartz Window Services returning wrong window bounds for fullscreen apps"
Have you got these methods defined for the window delegate?
// Use the full size of the main screen for the full-screen content.
- (NSSize)window:(NSWindow *)window willUseFullScreenContentSize:(NSSize)proposedSize
{
    NSRect mainDisplayRect = [[NSScreen mainScreen] frame];
    return mainDisplayRect.size;
}

- (void)windowWillEnterFullScreen:(NSNotification *)notification
{
}

- (void)windowDidEnterFullScreen:(NSNotification *)notification
{
}

- (void)windowWillExitFullScreen:(NSNotification *)notification
{
}
I proceeded by going through the description dictionaries and checking if the current cursor position was inside the bounds of the windows. The first window to satisfy this would be the window right under the cursor, which is exactly what I needed.
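A sketch of that hit test, using the same two calls mentioned in the question. Note that CGWindow bounds use a top-left origin while NSEvent's mouse location uses a bottom-left origin, so the y coordinate has to be flipped; the layer-0 check keeps only normal-level windows:

#import <Cocoa/Cocoa.h>
#import <ApplicationServices/ApplicationServices.h>

// Returns the description dictionary of the topmost on-screen window under the cursor, or nil.
NSDictionary *WindowUnderCursor(void)
{
    NSPoint mouse = [NSEvent mouseLocation];
    CGFloat primaryHeight = NSHeight([[[NSScreen screens] objectAtIndex:0] frame]);
    CGPoint cursor = CGPointMake(mouse.x, primaryHeight - mouse.y); // flip into CG coordinates

    CFArrayRef windowIDs = CGWindowListCreate(
        kCGWindowListOptionOnScreenOnly | kCGWindowListExcludeDesktopElements, kCGNullWindowID);
    NSArray *descriptions = CFBridgingRelease(CGWindowListCreateDescriptionFromArray(windowIDs));
    CFRelease(windowIDs);

    for (NSDictionary *info in descriptions) {  // ordered front to back
        CGRect bounds;
        NSDictionary *boundsDict = [info objectForKey:(id)kCGWindowBounds];
        CGRectMakeWithDictionaryRepresentation((__bridge CFDictionaryRef)boundsDict, &bounds);
        if ([[info objectForKey:(id)kCGWindowLayer] integerValue] == 0 &&
            CGRectContainsPoint(bounds, cursor)) {
            return info;  // first match is the window directly under the cursor
        }
    }
    return nil;
}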
Separately, to find the current top-most window, I used the iChat Apple example of the Accessibility API to register ApplicationActivatedNotification and MainWindowDidChangeNotifications. Both notifications combined would let me keep track of the main window of the active app (top-most). To get the bounds in this case, I just got the main window's position and size using the Accessibility API.
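A rough sketch of the tracking part, assuming the notifications referred to above are NSWorkspaceDidActivateApplicationNotification and the Accessibility notification kAXMainWindowChangedNotification, and that the app is trusted for Accessibility:

#import <Cocoa/Cocoa.h>
#import <ApplicationServices/ApplicationServices.h>

static AXObserverRef gObserver;

// Called whenever the observed app's main window changes.
static void MainWindowChanged(AXObserverRef observer, AXUIElementRef element,
                              CFStringRef notification, void *refcon)
{
    // Query the new main window's frame here via kAXPositionAttribute / kAXSizeAttribute.
}

// Start watching the frontmost application for main-window changes.
void ObserveFrontmostApp(void)
{
    [[[NSWorkspace sharedWorkspace] notificationCenter]
        addObserverForName:NSWorkspaceDidActivateApplicationNotification
                    object:nil
                     queue:[NSOperationQueue mainQueue]
                usingBlock:^(NSNotification *note) {
        NSRunningApplication *app = [note.userInfo objectForKey:NSWorkspaceApplicationKey];
        AXUIElementRef appElement = AXUIElementCreateApplication(app.processIdentifier);

        if (gObserver) { CFRelease(gObserver); }
        AXObserverCreate(app.processIdentifier, MainWindowChanged, &gObserver);
        AXObserverAddNotification(gObserver, appElement, kAXMainWindowChangedNotification, NULL);
        CFRunLoopAddSource(CFRunLoopGetCurrent(),
                           AXObserverGetRunLoopSource(gObserver), kCFRunLoopDefaultMode);
        CFRelease(appElement);
    }];
}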
I have an application that will load a couple of windows depending on which button is pressed. All except one of these open on the mainScreen (the screen the main window is open in). One of them (the preference window) opens on the first screen (the screen with the menu bar). I cannot understand why it is doing this. Is there a way to change the screen that an NSWindow opens on?
I could not get toohtik's answer to work. What I ended up doing was subclassing NSWindow and overriding constrainFrameRect:toScreen:. This will automatically open the new window on the "main screen" of the application.
- (NSRect)constrainFrameRect:(NSRect)frameRect toScreen:(NSScreen *)screen
{
    // Ignore the proposed screen and constrain to the screen of the app's main window.
    AppDelegate *delegate = (AppDelegate *)[[NSApplication sharedApplication] delegate];
    return [super constrainFrameRect:frameRect toScreen:delegate.window.screen];
}
I don't know why you get that behaviour, but you can change it through NSWindow's initializer that takes an NSScreen argument, initWithContentRect:styleMask:backing:defer:screen:.
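A small sketch of that initializer; targetScreen and the frame values are illustrative, and the content rect is interpreted relative to the chosen screen:

NSScreen *targetScreen = [[NSScreen screens] lastObject]; // whichever screen you want
NSWindow *prefsWindow = [[NSWindow alloc]
    initWithContentRect:NSMakeRect(0, 0, 400, 300)
              styleMask:(NSTitledWindowMask | NSClosableWindowMask)
                backing:NSBackingStoreBuffered
                  defer:NO
                 screen:targetScreen];
[prefsWindow center];
[prefsWindow makeKeyAndOrderFront:nil];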
I have noticed that when an app's window contains an outline view (such as Xcode's), it changes color when that window is in focus. With Xcode, for example, if the window is current then the outline view has a blueish background; if it loses focus it goes grey.
Can anyone help me replicate this? I presume it's something to do with drawRect:, but I can only manage to get the color to change when the window loads.
Maybe it's a built-in function and I'm just missing something?
All you have to do in your -drawRect: is check whether the window has main status and draw accordingly:
- (void)drawRect:(NSRect)rect
{
if ([[self window] isMainWindow]) {
// draw active appearance
} else {
// draw inactive appearance
}
}
A window's delegate gets messages whenever a window gets or resigns main or key window status. You can implement the appropriate methods (like -windowDidBecomeMain: and -windowDidResignMain:) in your window delegate to update the window and its subviews as necessary.
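A minimal sketch of those delegate methods, assuming the view drawn above is reachable through a hypothetical outlineView outlet on the window delegate:

- (void)windowDidBecomeMain:(NSNotification *)notification
{
    [self.outlineView setNeedsDisplay:YES];  // redraw with the active appearance
}

- (void)windowDidResignMain:(NSNotification *)notification
{
    [self.outlineView setNeedsDisplay:YES];  // redraw with the inactive appearance
}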