I currently have a nearly landscape-only iPad application on the App Store and have been having some issues with the new iOS 6 way of handling rotation locking.
It is a UINavigationController-based application, and since iOS 6 hands most of the rotation responsibility to the rootViewController of the UIWindow, I have to manually ask each UIViewController what orientations it supports.
As I have a very large number of UIViewControllers, manually adding code to each controller would have taken ages, so I made a category on both UINavigationController and UIViewController to override these calls; there I can specify which controllers should allow portrait and which should be locked to landscape.
UINavigationController-Extension.m:
//
// UINavigationController-Extension.m
// DrivingInstructor
//
// Created by Liam Nichols on 06/12/2012.
// Copyright (c) 2012 Liam Nichols. All rights reserved.
//
#import "UINavigationController-Extension.h"
@implementation UINavigationController (Extension)

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    return [self.topViewController shouldAutorotateToInterfaceOrientation:interfaceOrientation];
}

- (NSUInteger)supportedInterfaceOrientations
{
    return self.topViewController.supportedInterfaceOrientations;
}

- (BOOL)shouldAutorotate
{
    return YES;
}

@end

@implementation UIViewController (Extension)

- (BOOL)shouldAutorotate
{
    return NO;
}

- (NSUInteger)supportedInterfaceOrientations
{
    if ([[self portraitClasses] containsObject:NSStringFromClass([self class])])
    {
        return UIInterfaceOrientationMaskAll;
    }

    return UIInterfaceOrientationMaskLandscape;
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
{
    if ([[self portraitClasses] containsObject:NSStringFromClass([self class])])
    {
        return YES;
    }

    return (toInterfaceOrientation == UIInterfaceOrientationLandscapeLeft || toInterfaceOrientation == UIInterfaceOrientationLandscapeRight);
}

- (NSArray *)portraitClasses
{
    static NSArray *classes;

    if (classes == nil)
    {
        classes = @[ @"MockTestController", @"PLUILibraryViewController", @"PhotoViewController" ];
    }

    return classes;
}

@end
At first I thought this had fixed the issue. (I also kept the app locked to landscape in the Info.plist so that it would launch in landscape, and then in the app delegate I implemented application:supportedInterfaceOrientationsForWindow: and returned all orientations so that my select views could access portrait if needed.)
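For reference, a minimal sketch of that app delegate override (the UIKit signature is standard; the body is just my assumption of the setup described above):
- (NSUInteger)application:(UIApplication *)application supportedInterfaceOrientationsForWindow:(UIWindow *)window
{
    // Allow every orientation at the window level; the category
    // overrides above then restrict each individual controller.
    return UIInterfaceOrientationMaskAll;
}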
This seemed to have worked, and all view controllers were locked to landscape except the three I specified in my category. I was monitoring the category, and whenever I pushed a new controller it checked the orientations and locked the app to the specified orientation.
The one issue I found, however, and can't seem to fix, is that when I am in portrait on one of my allowed view controllers and try to pop back to a previous view controller that is locked to landscape, supportedInterfaceOrientations is never called again, and the view that should be locked to landscape isn't (this causes issues).
As per Apple's documentation, this is how it should work: the responsibility for handling rotation is passed to the rootViewController, and since the user isn't rotating the device and the rootViewController isn't changing, there is no need to request supportedInterfaceOrientations again.
My question is: is there a way to force the application to call supportedInterfaceOrientations again, or should I be doing this differently?
Thanks for reading, and if I can find a solution to this last bug, this code might be a good reference for people in the same situation.
-----Edit-----
I was doing some further investigation and found that just before viewWillAppear: runs, supportedInterfaceOrientations is in fact called on the controller I am popping back to, and it does return the correct mask (UIInterfaceOrientationMaskLandscape) so the app could rotate back from portrait automatically. However, the system doesn't seem to act on this response and still leaves the UIViewController in portrait...
So this means I do not need to get supportedInterfaceOrientations called again, but instead need to make the interface rotate back round to landscape!
According to the documentation you can call:
+ (void)attemptRotationToDeviceOrientation
which, if I understand the documentation correctly, should make the system query the view controllers for their supported orientations again and rotate if needed.
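If that works, the call would presumably live somewhere like this (placing it in viewWillAppear: is my guess, not something the documentation prescribes): when a landscape-locked controller is about to reappear, ask UIKit to re-evaluate the rotation.
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    // Ask UIKit to re-query supportedInterfaceOrientations and
    // rotate back to an allowed orientation if necessary.
    [UIViewController attemptRotationToDeviceOrientation];
}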
Related
Storyboards for Cocoa apps seem like a great solution, as I prefer the methodology you find in iOS. However, while breaking things up into separate view controllers makes a lot of logical sense, I'm not clear on how to pass window control (toolbar buttons) or menu interaction down to the view controllers that care. My app delegate is the first responder and it receives the menu or toolbar actions, but how can I access the view controller that I need to get that message to? Can you just drill down into the view controller hierarchy? If so, how do you get there from the app delegate, since it's the first responder? Can you make the window controller the first responder instead? If so, how? In the storyboard? Where?
Since this is a high-level question it may not matter, but I am using Swift for this project, if you're wondering.
I'm not sure if there is a "proper" way to solve this; however, I have come up with a solution that I'll use for now. First, a couple of details:
My app is a document-based application, so each window has an instance of the document.
The document can act as the first responder and forward any actions I've connected.
The document is able to get hold of the top-level window controller, and from there I can drill down through the view controller hierarchy to the view controller I need.
So, in my windowDidLoad on the window controller, I do this:
override func windowDidLoad() {
    super.windowDidLoad()
    if let vc = self.contentViewController as? NSSplitViewController {
        let innerSplitViewItem = vc.splitViewItems[0]
        let innerSplitViewController = innerSplitViewItem.viewController as! NSSplitViewController
        let layerCanvasSplitViewItem = innerSplitViewController.splitViewItems[1]
        self.layerCanvasViewController = layerCanvasSplitViewItem.viewController as! LayerCanvasViewController
    }
}
That gets me the view controller I need (the one controlling the layer canvas view) and sets a local property on the window controller.
So now I can forward the toolbar button or menu item actions directly in the document class, which is in the responder chain and therefore receives the actions I set up on the menu and toolbar items. Like this:
class LayerDocument: NSDocument {
    @IBAction func addLayer(_ sender: AnyObject) {
        let windowController = self.windowControllers[0] as! MainWindowController
        windowController.layerCanvasViewController.addLayer()
    }
    // ... etc.
}
Since the LayerCanvasViewController was set as a property of the main window controller when it got loaded, I can just access it and call the methods I need.
For the action to find your view controllers, you need to implement -supplementalTargetForAction:sender: in your window and view controllers.
You could list all child controllers potentially interested in the action, or use a generic implementation:
- (id)supplementalTargetForAction:(SEL)action sender:(id)sender
{
    id target = [super supplementalTargetForAction:action sender:sender];

    if (target != nil) {
        return target;
    }

    for (NSViewController *childViewController in self.childViewControllers) {
        target = [NSApp targetForAction:action to:childViewController from:sender];

        if (![target respondsToSelector:action]) {
            target = [target supplementalTargetForAction:action sender:sender];
        }

        if ([target respondsToSelector:action]) {
            return target;
        }
    }

    return nil;
}
I had the same storyboard problem, but with a single-window app with no documents. It's a port of an iOS app, and my first OS X app. Here's my solution.
First, add an IBAction as you did above in your LayerDocument. Now go to Interface Builder. You'll see that in the connections panel for First Responder in your window controller, IB has now added a Sent Action of addLayer. Connect your toolbar item to this. (If you look at the First Responder connections for any other controller, it will have a Received Action of addLayer. I couldn't do anything with this. Whatever.)
Back to windowDidLoad. Add the following two lines.
// This is the top view that is shown by the window.
NSView *contentView = self.window.contentView;

// This forces the responder chain to start in the content view
// instead of simply going up the chain to the AppDelegate.
[self.window makeFirstResponder:contentView];
That should do it. Now when you click on the toolbar item, the click will go directly to your action.
I've been struggling with this question myself.
I think the 'correct' answer is to lean on the responder chain. For example, to connect a toolbar item action, you can select the root window controller's First Responder and then show the attributes inspector. In the attributes inspector, add your custom action.
Then connect your toolbar item to that action. (Control-drag from your toolbar item to the First Responder and select the action you just added.)
Finally, you can go to the view controller (on 10.10+, where NSViewController participates in the responder chain) or any other object in the responder chain where you want to receive this event, and add the handler.
Alternatively, instead of defining the action in the attributes inspector, you can simply write your IBAction in your view controller. Then go to the toolbar item, Control-drag to the window controller's First Responder, and select the IBAction you just added. The event will then travel through the responder chain until it is received by the view controller; a sketch of such a handler follows below.
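A minimal sketch of what that handler might look like in the receiving view controller (the addLayer action name and log message are illustrative, not from the original post):
- (IBAction)addLayer:(id)sender
{
    // Reached through the responder chain once the toolbar item's
    // action is wired to First Responder; NSViewController joins
    // the chain on OS X 10.10 and later.
    NSLog(@"addLayer arrived via the responder chain");
}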
I think this is the correct way to do this without introducing any additional coupling between your controllers and/or manually forwarding the call.
The only challenge I've run into -- being new to Mac dev myself -- is that sometimes the toolbar item disables itself after receiving the first event. So, while I think this is the correct approach, there are still some issues I've run into myself.
But I am able to receive the event in another location without any additional coupling or gymnastics.
As I'm a very lazy person, I came up with the following solution, based on Pierre Bernard's version:
#include <objc/runtime.h>
//-----------------------------------------------------------------------------------------------------------
IMP classSwizzleMethod(Class cls, Method method, IMP newImp)
{
    // class_replaceMethod returns nil when the method was only
    // inherited (the class itself had no implementation to replace);
    // in that case, set the implementation on the method directly.
    IMP originalImpl = class_replaceMethod(cls, method_getName(method), newImp, method_getTypeEncoding(method));

    if (originalImpl == nil)
        originalImpl = method_setImplementation(method, newImp);

    return originalImpl;
}
// ----------------------------------------------------------------------------
@interface NSResponder (Utils)
@end

//------------------------------------------------------------------------------
@implementation NSResponder (Utils)
//------------------------------------------------------------------------------

static IMP originalSupplementalTargetForActionSender;

//------------------------------------------------------------------------------
static id newSupplementalTargetForActionSenderImp(id self, SEL _cmd, SEL action, id sender)
{
    assert([NSStringFromSelector(_cmd) isEqualToString:@"supplementalTargetForAction:sender:"]);

    if ([self isKindOfClass:[NSWindowController class]] || [self isKindOfClass:[NSViewController class]]) {
        id target = ((id (*)(id, SEL, SEL, id))originalSupplementalTargetForActionSender)(self, _cmd, action, sender);

        if (target != nil)
            return target;

        id childViewControllers = nil;

        if ([self isKindOfClass:[NSWindowController class]])
            childViewControllers = [[(NSWindowController *)self contentViewController] childViewControllers];

        if ([self isKindOfClass:[NSViewController class]])
            childViewControllers = [(NSViewController *)self childViewControllers];

        for (NSViewController *childViewController in childViewControllers) {
            target = [NSApp targetForAction:action to:childViewController from:sender];

            if (NO == [target respondsToSelector:action])
                target = [target supplementalTargetForAction:action sender:sender];

            if ([target respondsToSelector:action])
                return target;
        }
    }

    return nil;
}

// ----------------------------------------------------------------------------
+ (void)load
{
    Method m = class_getInstanceMethod([NSResponder class], NSSelectorFromString(@"supplementalTargetForAction:sender:"));
    originalSupplementalTargetForActionSender = classSwizzleMethod([self class], m, (IMP)newSupplementalTargetForActionSenderImp);
}

// ----------------------------------------------------------------------------
@end
//------------------------------------------------------------------------------
This way you do not have to add the forwarding code to the window controller and all the view controllers (although subclassing would make that a bit easier); the magic happens automatically as long as the window's content view has a view controller.
Swizzling is always a bit dangerous, so this is far from a perfect solution, but I've tried it with a very complex view/view controller hierarchy that uses container views, and it worked fine.
I have been playing around with the new UINavigationController hidesBarsOnSwipe property and it is really awesome! However, the big downside seems to be that it affects ALL UIView objects in the visible view. To be more precise: when I have a label overlapping my scroll view/table view, it recognizes the swipe gesture and moves the entire view up. Would anybody have an idea how to make this gesture affect only the underlying table view? Thanks!
The way to fix this in an easy manner is simply to disable the swipe gesture for certain objects by checking which view the gesture recognizer belongs to, using the following:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // self.yourView is a placeholder for whatever view object
    // should not trigger the bar-hiding swipe.
    if ([gestureRecognizer view] == self.yourView) {
        return NO;
    }
    return YES;
}
Additionally, you can enable or disable the gesture depending on which popups are open, as sketched below.
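A sketch of that toggle (assuming iOS 8+, where UINavigationController exposes barHideOnSwipeGestureRecognizer; the popupDidOpen/popupDidClose method names are hypothetical):
- (void)popupDidOpen
{
    self.navigationController.barHideOnSwipeGestureRecognizer.enabled = NO;
}

- (void)popupDidClose
{
    self.navigationController.barHideOnSwipeGestureRecognizer.enabled = YES;
}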
I am writing an application targeting OS X Lion and Snow Leopard. I have a view that I want to have respond to swipe events. My understanding is that three-finger swipes will call -[NSResponder swipeWithEvent:] if that method is implemented in my custom view. I have already looked at this question and corresponding answers, and tried the following modified stub implementation of Oscar Del Ben's code:
@implementation TestView

- (id)initWithFrame:(NSRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code here.
    }
    return self;
}

- (void)drawRect:(NSRect)dirtyRect
{
    [[NSColor redColor] set];
    NSRectFillUsingOperation(dirtyRect, NSCompositeSourceOver);
}

- (void)swipeWithEvent:(NSEvent *)event {
    NSLog(@"Swipe event detected!");
}

- (void)beginGestureWithEvent:(NSEvent *)event {
    NSLog(@"Gesture detected!");
}

- (void)endGestureWithEvent:(NSEvent *)event {
    NSLog(@"Gesture end detected!");
}

- (void)mouseDown:(NSEvent *)theEvent {
    NSLog(@"mouseDown event detected!");
}

@end
This compiles and runs fine, and the view renders as expected. The mouseDown: event is properly registered. However, none of the other events are triggered: neither the begin/endGestureWithEvent: methods nor the swipeWithEvent: method. Which makes me wonder: do I need to set a project/application setting somewhere to properly receive and/or interpret gestures? Thanks in advance for the help.
To receive swipeWithEvent: messages, you have to ensure that the three-finger swipe gesture is not mapped to anything that might cause a conflict. Go to System Preferences -> Trackpad -> More Gestures, and set these preferences to one of the following:
Swipe between pages:
Swipe with two or three fingers, or
Swipe with three fingers
Swipe between full-screen apps:
Swipe left or right with four fingers
Specifically, the swipe between full-screen apps should not be set to three fingers, otherwise you will not get swipeWithEvent: messages.
Together, these two preference settings cause swipeWithEvent: messages to be sent to the first responder.
Of course, you still have to implement the actual swipe logic. And if you want to perform a fluid scroll-swipe à la iOS, then you will need to do a little more work. There is an example of how to do this in the Lion App Kit release notes under the section "Fluid Swipe Tracking."
See http://developer.apple.com/library/mac/#releasenotes/Cocoa/AppKit.html
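Once the messages arrive, a minimal sketch of acting on them might look like this (my addition; the exact sign conventions for the deltas are in the NSEvent documentation):
- (void)swipeWithEvent:(NSEvent *)event
{
    // For a swipe gesture, deltaX/deltaY are nonzero and their signs
    // encode the swipe direction (see the NSEvent documentation).
    if ([event deltaX] != 0) {
        // Horizontal swipe: page forward or backward here.
    } else if ([event deltaY] != 0) {
        // Vertical swipe.
    }
}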
Try it with [self setAcceptsTouchEvents:YES]; where it says // Initialization code here.
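For clarity, here is that suggestion dropped into the question's initializer (only the one line is added; everything else is unchanged from TestView above):
- (id)initWithFrame:(NSRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Opt the view in to receiving touch events, per the
        // suggestion above.
        [self setAcceptsTouchEvents:YES];
    }
    return self;
}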
Not sure if it's the problem, but only the key window receives gestures. Is your window key?
Is your view accepting first responders?
- (BOOL)acceptsFirstResponder
{
    return YES;
}
I am converting an existing iPhone app to an iPad app. The iPhone app was built using a container view controller (UINavigationController), which first presented the user with a custom view controller (UITableViewController) that pushed a custom view controller (UIViewController) based on row selection.
In the iPad app, I am presenting the user directly with the custom UIViewController (with NO container controller) and then allowing selection of different options via a UIPopoverController. In myAppDelegate.m I am simply adding the custom UIViewController to the window using:
[window addSubview:[myCustomViewController view]];
In myCustomViewController.m I am modifying the view heavily based upon device rotation by registering for orientation change notifications in viewWillAppear:
-(void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(didRotate:)
                                                 name:UIDeviceOrientationDidChangeNotification
                                               object:nil];
}
I am then testing the orientation in the didRotate: method and getting very strange results: it is called three times just from loading the view, and it also seems to report the orientation corresponding to the PREVIOUS drawing of the view.
- (void)didRotate:(NSNotification *)notification
{
    if (self.interfaceOrientation == UIInterfaceOrientationPortrait) {
        NSLog(@"Portrait");
    } else if (self.interfaceOrientation == UIInterfaceOrientationLandscapeLeft || self.interfaceOrientation == UIInterfaceOrientationLandscapeRight) {
        NSLog(@"Landscape");
    }
}
I was reading the docs, and it appears that adding the subview to the window (without a container class) will not cause the viewWillAppear: method to be called, but in my case it seems it is being called, just unreliably.
Is there some other pattern I should be using for this app? I simply want to load a single custom view and use two popover controllers (no other navigation).
-Derrick
btw - it works exactly as it should if I push the custom view controller onto a UINavigationController in my app delegate. I just don't need a nav controller for this app.
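A side note worth sketching here: UIWindow has had a rootViewController property since iOS 4, and assigning the controller there instead of adding its view as a subview is what lets UIKit deliver appearance and rotation callbacks reliably. A minimal sketch, assuming the standard app delegate launch path:
// In application:didFinishLaunchingWithOptions:, instead of
// [window addSubview:[myCustomViewController view]]:
window.rootViewController = myCustomViewController;
[window makeKeyAndVisible];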
In my app I'm working on, I first have a property to find out if the device is an iPad:
- (BOOL)iPad {
return UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad ? YES : NO;
}
And then you can use the following method override in your view controller:
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration {
    if (self.iPad) {
        if (toInterfaceOrientation == UIInterfaceOrientationLandscapeLeft ||
            toInterfaceOrientation == UIInterfaceOrientationLandscapeRight) {
            //do some stuff
        }
    }
}
Hope this helps.
I have an NSMenu popping out of an NSStatusItem using popUpStatusItemMenu. These NSMenuItems show a bunch of different links, and each one is connected with setAction: to the openLink: method of a target. This arrangement has been working fine for a long time. The user chooses a link from the menu and the openLink: method then deals with it.
Unfortunately, I recently decided to experiment with using NSMenuItem's setView: method to provide a nicer/slicker interface. Basically, I just stopped setting the title, created the NSMenuItem, and then used setView: to display a custom view. This works perfectly, the menu items look great and my custom view is displayed.
However, when the user chooses a menu item and releases the mouse, the action no longer works (i.e., openLink: isn't called). If I just simply comment out the setView: call, then the actions work again (of course, the menu items are blank, but the action is executed properly). My first question, then, is why setting a view breaks the NSMenuItem's action.
No problem, I thought, I'll fix it by detecting the mouseUp event in my custom view and calling my action method from there. I added this method to my custom view:
- (void)mouseUp:(NSEvent *)theEvent {
    NSLog(@"in mouseUp");
}
No dice! This method is never called.
I can set tracking rects and receive mouseEntered: events, though. I put a few tests in my mouseEntered routine, as follows:
if ([[self window] ignoresMouseEvents]) { NSLog(@"ignoring mouse events"); }
else { NSLog(@"not ignoring mouse events"); }

if ([[self window] canBecomeKeyWindow]) { NSLog(@"canBecomeKeyWindow"); }
else { NSLog(@"not canBecomeKeyWindow"); }

if ([[self window] isKeyWindow]) { NSLog(@"isKeyWindow"); }
else { NSLog(@"not isKeyWindow"); }
And got the following responses:
not ignoring mouse events
canBecomeKeyWindow
not isKeyWindow
Is this the problem? "not isKeyWindow"? Presumably this isn't good, because Apple's docs say "If the user clicks a view that isn't in the key window, by default the window is brought forward and made key, but the mouse event is not dispatched." But there must be a way to detect these events. HOW?
Adding:
[[self window] makeKeyWindow];
has no effect, despite the fact that canBecomeKeyWindow is YES.
Add this method to your custom NSView and it will work fine with mouse events:
- (void)mouseUp:(NSEvent *)event {
    NSMenuItem *mitem = [self enclosingMenuItem];
    NSMenu *m = [mitem menu];
    [m cancelTracking];
    [m performActionForItemAtIndex:[m indexOfItem:mitem]];
}
But I'm having problems with key handling; if you solved this problem, maybe you can go to my question and help me a little bit.
Add this to your custom view and you should be fine:
- (BOOL)acceptsFirstMouse:(NSEvent *)theEvent
{
    return YES;
}
I added this method to my custom view, and now everything works beautifully:
- (void)viewDidMoveToWindow {
    [[self window] becomeKeyWindow];
}
Hope this helps!
I've updated this version for SwiftUI (Swift 5.3):
import SwiftUI

final class HostingView<Content: View>: NSHostingView<Content> {
    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        window?.becomeKey()
    }
}
And then use it like so:
let item = NSMenuItem()
let contentView = ContentView()
item.view = HostingView(rootView: contentView)
let menu = NSMenu()
menu.items = [item]
So far, the only way to achieve the goal is to register a tracking area manually in updateTrackingAreas (which thankfully does get called), like this:
override func updateTrackingAreas() {
    super.updateTrackingAreas()
    // Remove stale areas first so they don't pile up on repeated calls.
    trackingAreas.forEach(removeTrackingArea)
    let trackingArea = NSTrackingArea(rect: bounds, options: [.enabledDuringMouseDrag, .mouseEnteredAndExited, .activeInActiveApp], owner: self, userInfo: nil)
    addTrackingArea(trackingArea)
}
Recently I needed to show a custom view for an NSStatusItem, show a regular NSMenu when clicking on it, and support drag-and-drop operations on the status icon.
I solved my problem using, mainly, three different sources that can be found in this question.
Hope it helps other people.
See the sample code from Apple named CustomMenus.
In there you'll find a good example in the ImagePickerMenuItemView class.
It's not simple or trivial to make a view in a menu act like a normal NSMenuItem.
There are some real decisions and coding to do.