I have a block of code in my app that will be used many times for different values and on different CoreData entities. Basically it consists of a small header label and a TextField that becomes editable when the user taps on the edit button (which manages saving to the context).
I would like to be able to pass a label, an entity attribute (that stays connected to CoreData) and a state variable to it.
The block looks like this:
VStack(alignment: .leading) {
    Text("Notes").font(.caption2).foregroundColor(.secondary)
    TextField("Notes", text: $todo.notes)
        .textFieldStyle(PlainTextFieldStyle())
        .disabled(!editMode)
}
The parent view declares @Environment(\.managedObjectContext) private var context and @ObservedObject var todo: Todo. I would also like to be able to modify the block from within the parent view, to change the appearance of the TextField, something like (pseudo-code):
// pseudo-code
MyEditableTextBlock("Notes", $todo.notes, editMode)
.font(.title3)
and apply the .font modifier to the TextField (not the .caption header).
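For concreteness, here is roughly the shape such a wrapper could take — a minimal sketch only, assuming the attribute comes in as a Binding<String> and the edit state as a plain Bool (the name MyEditableTextBlock and its parameters mirror the pseudo-code above):
// A sketch only. The header keeps its explicit .caption2 font, while any
// modifier applied to MyEditableTextBlock from the parent (e.g. .font(.title3))
// flows down to the TextField through the environment.
import SwiftUI

struct MyEditableTextBlock: View {
    let label: String
    @Binding var text: String   // stays bound to the managed object's attribute
    var editMode: Bool          // passed in from the parent's @State

    init(_ label: String, _ text: Binding<String>, _ editMode: Bool) {
        self.label = label
        self._text = text
        self.editMode = editMode
    }

    var body: some View {
        VStack(alignment: .leading) {
            Text(label).font(.caption2).foregroundColor(.secondary)
            TextField(label, text: $text)
                .textFieldStyle(PlainTextFieldStyle())
                .disabled(!editMode)
        }
    }
}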
What's the "correct" way to do this, and insure that the item and editMode stay bound to the CoreData context and #State variable in the parent view respectively?
Thanks.
In iOS, a toolbar can be added to any view. In macOS however, it seems only possible to add a toolbar to a window.
I'm working on an app that has a split view controller and a toolbar, but the toolbar's items are only meaningful in the context of the right view controller.
E.g. let's say I have a text editor of some sort, where the left pane shows all documents (like in the Notes app) and the right pane shows the actual text which can be edited. The formatting buttons only affect the text in the right pane. Thus, it seems very intuitive to place the toolbar within that right pane instead of stretching it over the full width of the window.
Is there some way to achieve this?
(Or is there a good UX reason why this would be a bad practice?)
I've noticed how Apple solved this problem in terms of UX in their Notes app: They still use a full-width toolbar but align the button items that are only related to the right pane with the leading edge of that pane.
So in case there is no way to place a toolbar in a view controller, how can I align the toolbar items with the leading edge of the right view controller, as Apple does in the Notes app?
Edit:
According to TimTwoToes' answer and the posts linked by Willeke in the comments, it seems to be possible to use Auto Layout for constraining a toolbar item with the split view's child view. This solution would work if there was a fixed toolbar layout. However, Apple encourages (for a good reason) to let users customize your app's toolbar.
Thus, I cannot add constraints to a fixed item in the toolbar. Instead, a viable solution seems to be to use a leading flexible space and adjust its size accordingly.
Initial Notes
It turns out this is tricky because there are many things that need to be considered:
Auto Layout doesn't seem to work properly with toolbar items. (I've read a few posts mentioning that Apple has classified this as a bug.)
Normally, the user can customize your app's toolbar (add and remove items). We should not deprive the user of that option.
Thus, simply constraining a particular toolbar item with the split view or a layout guide is not an option (because the item might be at a different position than expected or not there at all).
After hours of "hacking", I've finally found a reliable way to achieve the desired behavior that doesn't use any internal / undocumented methods. Here's how it looks:
How To
Instead of a standard flexible space toolbar item, create an NSToolbarItem with a custom view. This will serve as your flexible, resizing space. You can do that in code or in Interface Builder.
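The in-code variant might look roughly like this — a sketch only, where "TabSpace" and the placeholder sizes are illustrative and the item would be vended from your NSToolbarDelegate's toolbar(_:itemForItemIdentifier:willBeInsertedIntoToolbar:) method:
// A sketch: a plain NSToolbarItem whose custom view is an empty NSView,
// acting as a resizable spacer. The identifier and sizes are placeholders.
let tabSpace = NSToolbarItem(itemIdentifier: NSToolbarItem.Identifier("TabSpace"))
tabSpace.view = NSView(frame: NSRect(x: 0, y: 0, width: 1, height: 1))
tabSpace.minSize = NSSize(width: 1, height: 1)      // overwritten later by adjustTabSpaceWidth()
tabSpace.maxSize = NSSize(width: 1000, height: 1)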
Create outlets/properties for your toolbar and your flexible space (inside the respective NSWindowController):
@IBOutlet weak var toolbar: NSToolbar!
@IBOutlet weak var tabSpace: NSToolbarItem!
Create a method inside the same window controller that adjusts the space width:
private func adjustTabSpaceWidth() {
    for item in toolbar.items {
        if item == tabSpace {
            guard
                let origin = item.view?.frame.origin,
                let originInWindowCoordinates = item.view?.convert(origin, to: nil),
                let leftPane = splitViewController?.splitViewItems.first?.viewController.view
            else {
                return
            }
            // Stretch the space so the items after it start at the left pane's
            // trailing edge, but never shrink it below the minimum width.
            let leftPaneWidth = leftPane.frame.size.width
            let tabWidth = max(leftPaneWidth - originInWindowCoordinates.x, MainWindowController.minTabSpaceWidth)
            item.set(width: tabWidth)
        }
    }
}
Define the set(width:) method in an extension on NSToolbarItem as follows:
private extension NSToolbarItem {
    func set(width: CGFloat) {
        minSize = .init(width: width, height: minSize.height)
        maxSize = .init(width: width, height: maxSize.height)
    }
}
Make your window controller conform to NSSplitViewDelegate and assign it to your split view's delegate property (see the note below). Implement the following NSSplitViewDelegate protocol method in your window controller:
// No `override` here: a plain NSWindowController has no superclass
// implementation of this delegate method.
func splitViewDidResizeSubviews(_ notification: Notification) {
    adjustTabSpaceWidth()
}
This will yield the desired resizing behavior. (The user will still be able to remove the space completely or reposition it, but he can always add it back to the front.)
Note:
If you're using an NSSplitViewController, the system automatically assigns that controller to its split view's delegate property and you cannot change that. As a consequence, you need to subclass NSSplitViewController, override its splitViewDidResizeSubviews() method and notify the window controller from there. You can achieve that with the following code:
protocol SplitViewControllerDelegate: class {
    func splitViewControllerDidResize(_ splitViewController: SplitViewController)
}

class SplitViewController: NSSplitViewController {

    weak var delegate: SplitViewControllerDelegate?

    override func splitViewDidResizeSubviews(_ notification: Notification) {
        // Keep NSSplitViewController's own handling of this notification.
        super.splitViewDidResizeSubviews(notification)
        delegate?.splitViewControllerDidResize(self)
    }
}
Don't forget to assign your window controller as the split view controller's delegate:
override func windowDidLoad() {
    super.windowDidLoad()
    splitViewController?.delegate = self
}
and to implement the respective delegate method:
extension MainWindowController: SplitViewControllerDelegate {
    func splitViewControllerDidResize(_ splitViewController: SplitViewController) {
        adjustTabSpaceWidth()
    }
}
There is no native way to achieve a "local" toolbar. You would have to create the control yourself, but I believe it would be simple to make.
Aligning the toolbar items using Auto Layout is described here. Aligning with a custom toolbar item is described by Mischa.
The macOS way is to use the toolbar solution and make the items context sensitive. In this instance, the text attribute buttons would be enabled when the right pane has focus and disabled when it loses focus.
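A rough sketch of what that could look like, assuming the formatting items' target is the window controller (MainWindowController from above) and that the right pane's editor is an NSTextView; the focus check is illustrative:
// Enable formatting toolbar items only while the right pane's text view
// has keyboard focus; disable them otherwise.
extension MainWindowController: NSToolbarItemValidation {
    func validateToolbarItem(_ item: NSToolbarItem) -> Bool {
        return window?.firstResponder is NSTextView
    }
}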
I have a view with a custom class based on NSVisualEffectView, which contains an image view, a label (NSTextField) and a popup button. isFlipped of the custom view is always false.
The custom view also has an NSClickGestureRecognizer which uses a delegate. The delegate implements just one method:
func gestureRecognizerShouldBegin(_ gestureRecognizer: NSGestureRecognizer) -> Bool {
    let thePoint = gestureRecognizer.location(in: view)
    if let theView = view.hitTest(thePoint) {
        return !theView.handlesMouseEvents
    } else {
        return true
    }
}
If I click in the middle of the popup button, location(in:) returns a value like (182, 16). This seems correct to me for a non-flipped view. But hitTest() returns the background view for that point, not the popup button.
What am I doing wrong?
If I use the manually flipped point (y := height - y) for hit-testing, I get the popup button as the result. But I don't want to use that approach because it seems ugly to me.
It also seems to work if I use the window's content view for hit-testing. But I would still like to know why the approach shown does not work.
The parameter point of hitTest(_:) is
A point that is in the coordinate system of the view’s superview, not of the view itself.
Solution: pass a point in superview coordinates.
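Applied to the delegate method above, that could be as simple as asking the gesture recognizer for the location in the superview's coordinate space (a sketch based on the question's code):
func gestureRecognizerShouldBegin(_ gestureRecognizer: NSGestureRecognizer) -> Bool {
    // hitTest(_:) expects a point in the superview's coordinate system.
    let thePoint = gestureRecognizer.location(in: view.superview)
    if let theView = view.hitTest(thePoint) {
        return !theView.handlesMouseEvents
    } else {
        return true
    }
}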
I have a split view controller whose master view is something like a menu that lets users pick the scene for the detail view (I have multiple detail views). On one of the detail view scenes, I have a button that presents a view controller modally and "over current context", since it has a translucent background and I wanted to create a fog effect. This particular detail view (let's call it TodayViewController) is also the initial detail view controller when the app loads, and it only changes when the user selects a new view controller from the master view (menu).
This is what I meant in code:
When the app just starts:
splitViewController.viewControllers[1] // returns TodayViewController
When the user selects from the menu:
splitViewController.viewControllers[1] // returns a different view controller
So the issue I am having is that when the app has just started (the first case above) and I present a child view controller of TodayViewController modally and "over current context", the child VC presents itself over both the master view (menu) and the detail view (TodayViewController), causing the entire screen to have a foggy effect. This is the effect that I want.
However, when I select another view controller from the menu, then select TodayViewController again and try to present the child VC, it only presents itself over the detail view. The foggy effect is only present on the detail view, and the master view (the menu) remains clear. How do I fix this?
I hope I'm clear enough with my explanation. Here is some of my code:
My GlobalSplitViewController.swift:
import UIKit

class GlobalSplitViewController: UISplitViewController, UISplitViewControllerDelegate {

    func primaryViewControllerForCollapsingSplitViewController(splitViewController: UISplitViewController) -> UIViewController? {
        let detailViewController = self.viewControllers[1] as! TodayViewController
        return detailViewController
    }

    func splitViewController(splitViewController: UISplitViewController, collapseSecondaryViewController secondaryViewController: UIViewController, ontoPrimaryViewController primaryViewController: UIViewController) -> Bool {
        return true
    }

    func splitViewController(svc: UISplitViewController, shouldHideViewController vc: UIViewController, inOrientation orientation: UIInterfaceOrientation) -> Bool {
        return false
    }
}
GlobalSplitViewController is structured so that TodayViewController is presented first on iPhones, but on iPad it shows both master and detail view, uncollapsed.
'Over current context' is supposed to present over only the master or the detail, whichever it is called from. I'm not sure why it doesn't behave that way at first (even though that happens to be the effect you want) but then does once you select another option. Anyhow, to achieve what you want, stop using 'over current context': presenting over the full screen will put the fog VC over the whole screen.
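For example, a sketch of that presentation (FogViewController is a placeholder name for the child view controller):
// Presenting over the full screen keeps the presenting hierarchy visible
// underneath, so the translucent "fog" covers both master and detail.
let childVC = FogViewController()
childVC.modalPresentationStyle = .overFullScreen
childVC.modalTransitionStyle = .crossDissolve
present(childVC, animated: true, completion: nil)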
I am having trouble detecting a user's double-click in Swift: I want to detect when they double-click on an NSTextField.
func someFunc() {
    y.target = self
    y.action = "editLabel:"
}

@IBAction func editLabel(obj: AnyObject?) {
    NSLog("here")
}
The above code doesn't work, and I can't seem to find the basic documentation that shows how to add event handlers. Is there a simpler way to do this?
I guess your text field is a label, not an editable text field in its normal state. Starting with OS X 10.10 (Yosemite), you can use NSClickGestureRecognizer:
func applicationDidFinishLaunching(aNotification: NSNotification) {
    let gesture = NSClickGestureRecognizer()
    gesture.buttonMask = 0x1 // left mouse button
    gesture.numberOfClicksRequired = 2
    gesture.target = self
    gesture.action = "editLabel:"
    myLabel.addGestureRecognizer(gesture)
}

func editLabel(sender: NSGestureRecognizer) {
    if let label = sender.view as? NSTextField {
        print("Hello world")
    }
}
A text field does not handle editing as such. When a text field has focus, a text view is added to the window, overlapping the area of the text field. This is called the "field editor" and it is responsible for handling editing.
It seems the most likely place for you to change the behavior of a double-click is in the text storage object used by that text view. NSTextStorage inherits from NSMutableAttributedString which inherits from NSAttributedString which has a -doubleClickAtIndex: method. That method returns the range of the text that should be selected by a double-click at a particular index.
So, you'll want to implement a subclass of NSTextStorage that overrides that method and returns a different result in some circumstances. NSTextStorage is a semi-abstract base class of a class cluster. Subclassing it requires a bit more than usual. You have to implement the primitive methods of NSAttributedString and NSMutableAttributedString. See the docs about it.
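A sketch of what such a subclass could look like, backed by a plain NSMutableAttributedString; the class name DoubleClickTextStorage and the (here trivial) double-click override are illustrative:
import Cocoa

class DoubleClickTextStorage: NSTextStorage {
    private let backing = NSMutableAttributedString()

    // MARK: Required NSAttributedString / NSMutableAttributedString primitives

    override var string: String {
        return backing.string
    }

    override func attributes(at location: Int, effectiveRange range: NSRangePointer?) -> [NSAttributedString.Key: Any] {
        return backing.attributes(at: location, effectiveRange: range)
    }

    override func replaceCharacters(in range: NSRange, with str: String) {
        beginEditing()
        backing.replaceCharacters(in: range, with: str)
        edited(.editedCharacters, range: range, changeInLength: (str as NSString).length - range.length)
        endEditing()
    }

    override func setAttributes(_ attrs: [NSAttributedString.Key: Any]?, range: NSRange) {
        beginEditing()
        backing.setAttributes(attrs, range: range)
        edited(.editedAttributes, range: range, changeInLength: 0)
        endEditing()
    }

    // MARK: Custom double-click behavior

    override func doubleClick(at location: Int) -> NSRange {
        // Return whatever range should be selected by a double-click;
        // this sketch just falls back to the default behavior.
        return super.doubleClick(at: location)
    }
}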
There are a few places to customize the field editor by replacing its text storage object with an instance of your class:
You could implement a custom subclass of NSTextFieldCell and set your text field to use it as its cell. In your subclass, override -fieldEditorForView:. In your override, instantiate an NSTextView, obtain its layoutManager and call -replaceTextStorage: on that, passing it an instance of your custom text storage class. (This is easier than putting together the hierarchy of objects that is involved with text editing, although you could do that yourself.) Set the fieldEditor property of the text view to true and return it. (See the sketch after this list.)
In your window delegate, implement -windowWillReturnFieldEditor:toObject:. Create, configure, and return an NSTextView using your custom text storage, as above.
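A sketch of the first option, reusing the hypothetical DoubleClickTextStorage from above (the cell class name is illustrative):
class CustomFieldEditorCell: NSTextFieldCell {
    // Lazily build one field editor and reuse it for this cell.
    private lazy var customFieldEditor: NSTextView = {
        let editor = NSTextView(frame: .zero)
        // Swap the default text storage for the custom one via the layout manager.
        editor.layoutManager?.replaceTextStorage(DoubleClickTextStorage())
        editor.isFieldEditor = true
        return editor
    }()

    override func fieldEditor(for controlView: NSView) -> NSTextView? {
        return customFieldEditor
    }
}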
Is there a way to search for a name in the table (ListView) in Xamarin.Forms without using any MonoTouch components, so it stays cross-platform?
I have a number of names populated in a ListView, and what I want is that once the user types a name, the ListView is filtered and it brings up the matched name. I have added a SearchBar at the top of the ListView.
I do not know how to implement the OnSearchBarButtonPressed method. I believe I should not reinvent the wheel here.
SearchBar searchBar = new SearchBar
{
    Placeholder = "Search Employee Name",
};
searchBar.SearchButtonPressed += OnSearchBarButtonPressed;

Padding = new Thickness(10, 20, 10, 10);
Content = new StackLayout()
{
    Children = { searchBar, listView }
};
In relation to your other posting here, your ListView's ItemsSource is a collection of objects.
The ListView control has no way of knowing about your model and wouldn't be able to do any filtering by itself.
There appears to be nothing built into the ListView that would allow you to specify a predicate to help with this task either.
You should therefore do the filtering in SearchBar.SearchButtonPressed and then re-assign the ListView's ItemsSource to the newly filtered collection you want to show to the end user. Your filtering logic can then be anything you want, since you're able to customize it to filter over many fields, etc.
SearchBar.SearchButtonPressed is just an event to which you assign an EventHandler, as you are already doing. You just need to fill in the filtering logic and update the ListView accordingly, for example:
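A rough sketch of that handler, assuming the ItemsSource is backed by a List<Employee> with a Name property (allEmployees, Employee and Name are illustrative):
// Requires: using System.Linq;
searchBar.SearchButtonPressed += (sender, e) =>
{
    var query = searchBar.Text ?? string.Empty;

    // Filter the backing collection and re-assign the ItemsSource
    // so the ListView shows only matching names.
    listView.ItemsSource = allEmployees
        .Where(emp => emp.Name.ToLower().Contains(query.ToLower()))
        .ToList();
};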