I'm developing software with Xcode 6 using Swift. When I press a button, my code fetches some information from the web and writes it to my NSWindow.
So imagine something like this:
@IBAction func buttonPressed(sender: AnyObject) {
    for page in listOfPages { // Number of pages is 40.
        var label: NSTextField = NSTextField()
        label.stringValue = getInformationFromPage(page)
        /*
        some code to add the label to the window here
        */
    }
}
The problem is that once I click the button, it takes a while before I see the list of results, and in the meantime the app is frozen. That's because I'm not using any threads to handle this work. How could I use threads so that each label updates on every step of the loop? I'm new to threads and Swift, so I could use some help!
Thank you guys.
Swift 3.0+ version:
DispatchQueue.main.async {
    // your UI update code
}
Posted this because Xcode cannot suggest the correct syntax from the Swift 2.0 version.
There is GCD (Grand Central Dispatch). Here is basic usage:
for page in listOfPages {
    var label = NSTextField()
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
        let result = getInformationFromPage(page)
        dispatch_async(dispatch_get_main_queue()) {
            label.stringValue = result
        }
    }
}
The dispatch_async function asynchronously runs the block of code on the given queue. In the first dispatch_async call we dispatch the code to run on a background queue; after we get the result, we update the label on the main queue.
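Since this answer predates Swift 3, here is a sketch of the same pattern with the modern DispatchQueue API (getInformationFromPage is the asker's hypothetical helper, assumed to be synchronous):
for page in listOfPages {
    let label = NSTextField()
    // ... add the label to the window here ...
    DispatchQueue.global(qos: .default).async {
        // Do the slow network work off the main thread
        let result = getInformationFromPage(page)
        DispatchQueue.main.async {
            // UI updates must happen on the main queue
            label.stringValue = result
        }
    }
}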
In iOS 13 the behavior has changed so that, by default, the search bar is visible when the navigation controller appears (when a UISearchController is assigned to navigationItem.searchController). Some system apps appear with the search bar hidden (you need to swipe down for it to appear), but I don't see any specific property that would allow this. How can this be achieved - maybe there is some property or method to do that?
Via experimentation, I have discovered that if you delay assigning the search controller to the navigation item until viewWillLayoutSubviews or viewDidLayoutSubviews, the search controller starts out hidden, as desired. However, if you do this on iOS 12 or earlier, the search controller will not be revealed when scrolling down.
I ended up doing the following with a messy version check, which is working for me:
override func viewDidLoad() {
    super.viewDidLoad()
    searchController = /* make search controller... */
    if #available(iOS 13, *) {
        // Attaching the search controller at this time on iOS 13 results in the
        // search bar being initially visible, so assign it later
    } else {
        navigationItem.searchController = searchController
    }
}

override func viewWillLayoutSubviews() {
    super.viewWillLayoutSubviews()
    navigationItem.searchController = searchController
}
To start with a hidden searchBar, simply set the navigationItem.searchController property after your table view (or collection view) has been populated with data.
Inspired by bunnyhero's answer, I put the code that assigns the UISearchController to the navigationItem inside the viewDidAppear method. It seems to work every time for me on iOS 14/15:
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    if navigationItem.searchController == nil {
        navigationItem.searchController = searchController
    }
}
Edit: I was overly optimistic. On iOS 15.2 this method stopped working for me. What fixed it was moving the code to after my table/collection view is reloaded.
I find this works:
self.searchController.searchBar.hidden = YES;
You will need to unhide it at the appropriate time.
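In Swift, the equivalent (assuming the same searchController property) would be:
searchController.searchBar.isHidden = true
and then, once you want the bar to be visible again:
searchController.searchBar.isHidden = false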
I managed to make this work by setting isTranslucent to false on the navigationBar and having initial data in the UITableView or UICollectionView. If you have 0 cells initially and trigger reloadData after some time (maybe after a network call), the search bar is visible initially. So have a dummy cell or something similar initially and load the real data later, if that's the case for you.
navigationController?.navigationBar.isTranslucent = false
One should set the searchController after the table view gets its frame:
override func scrollViewDidScroll(_ scrollView: UIScrollView) {
    super.scrollViewDidScroll(scrollView)
    if !scrollView.frame.isEmpty, navigationItem.searchController == nil {
        navigationItem.searchController = searchController
    }
}
This is what works for me. I have a UISegmentedControl that reloads the tableView when the filter changes. With an FRC, after calling tableView.reloadData():
guard let count = try? fetchedResultsController.managedObjectContext.count(for: request) else { return }
navigationItem.searchController = count > 20 ? searchController : nil
Swift 5.2 & iOS 13.3.1:
Try it like this. It works fine:
navigationItem.hidesSearchBarWhenScrolling = false
I have some simple code:
var openPanel = NSOpenPanel();
openPanel.beginSheet(this.View.Window, (obj) => {
    //do stuff
    openPanel.endSheet(this.View.Window);
});
Sometimes the sheet window is not shown, and it makes a sound as if the window were busy. Is there anything wrong with my code?
I call this code from one item of a split view controller.
I just made this same error, and had trouble with it for quite some time. I was following along with the Apple guide:
Using the Open and Save Panels
The main issue is that the Apple docs show us using the Objective-C method:
[panel beginSheetModalForWindow:window completionHandler:^(NSInteger result){ }];
I did an ad hoc translation into Swift, with help from Xcode autocomplete:
let openPanel = NSOpenPanel()
openPanel.beginSheet(window) { (modalResponse: NSApplication.ModalResponse) in
}
This does not work. When the code is run, the window's title bar disappears and no panel is shown.
Use the correct Swift method instead, beginSheetModal(for:completionHandler:):
openPanel.beginSheetModal(for: window) { (modalResponse: NSApplication.ModalResponse) in
}
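For completeness, here is a minimal self-contained sketch of the working approach (assuming window is the non-nil window hosting the sheet; the property settings are illustrative):
let openPanel = NSOpenPanel()
openPanel.canChooseFiles = true
openPanel.beginSheetModal(for: window) { response in
    if response == .OK {
        // Handle the selected file URLs here
        print(openPanel.urls)
    }
}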
Try this:
let panel = NSOpenPanel()
self.window?.beginSheet(panel, completionHandler: { (modalResponse: NSModalResponse) in
    if modalResponse == NSModalResponseOK {
        // do your stuff
    }
})
I'm trying to find out how to bring up a second view/window after pushing a button on my primary window. I have read about segues, and I can get the first window to display the second, but the second is not connected to a view controller, so I can't add any code to the controls on the second view. Try as I might, I cannot create a SecondViewController.swift file and connect it to a window controller or a view controller. The tutorials I have found all deal with iOS, and I want OS X, which means there are just enough differences to keep me from figuring this out.
Can anyone show me how to do this?
Ta,
A.
First, create a new NSWindowController subclass file (together with its XIB) named SecondWindowController, so the nib name matches the code below.
After that, put this code in your classes and that should do it. (The window controller is stored in a property so it isn't deallocated as soon as the action returns.)
class SecondWindowController: NSWindowController {
    convenience init() {
        self.init(windowNibName: "SecondWindowController")
    }
}

class ViewController: NSViewController {
    private var secondWindowController: SecondWindowController?

    @IBAction func showSecondWindow(sender: AnyObject) {
        if secondWindowController == nil {
            secondWindowController = SecondWindowController()
        }
        secondWindowController?.showWindow(self)
    }
}
I am trying to dismiss the search field by tapping the 'Cancel' button in the search bar.
The test case is failing to find the cancel button. It was working fine in Xcode 7.0.1.
I have added a predicate to wait for the button to appear, but the test case still fails when we tap the "Cancel" button:
let button = app.buttons["Cancel"]
let existsPredicate = NSPredicate(format: "exists == 1")
expectationForPredicate(existsPredicate, evaluatedWithObject: button, handler: nil)
waitForExpectationsWithTimeout(5, handler: nil)
button.tap() // Failing here
logs:
t = 7.21s Tap SearchField
t = 7.21s Wait for app to idle
t = 7.29s Find the SearchField
t = 7.29s Snapshot accessibility hierarchy for com.test.mail
t = 7.49s Find: Descendants matching type SearchField
t = 7.49s Find: Element at index 0
t = 7.49s Wait for app to idle
t = 7.55s Synthesize event
t = 7.84s Wait for app to idle
t = 8.97s Type 'vinayak#xmd.net' into
t = 8.97s Wait for app to idle
t = 9.03s Find the "Search" SearchField
t = 9.03s Snapshot accessibility hierarchy for com.test.mail
t = 9.35s Find: Descendants matching type SearchField
t = 9.35s Find: Element at index 0
t = 9.36s Wait for app to idle
t = 9.42s Synthesize event
t = 10.37s Wait for app to idle
t = 10.44s Check predicate `exists == 1` against object `"Cancel" Button`
t = 10.44s Snapshot accessibility hierarchy for com.test.mail
t = 10.58s Find: Descendants matching type Button
t = 10.58s Find: Elements matching predicate '"Cancel" IN identifiers'
t = 10.58s Tap "Cancel" Button
t = 10.58s Wait for app to idle
t = 10.64s Find the "Cancel" Button
t = 10.64s Snapshot accessibility hierarchy for com.test.mail
t = 10.78s Find: Descendants matching type Button
t = 10.78s Find: Elements matching predicate '"Cancel" IN identifiers'
t = 10.79s Wait for app to idle
t = 11.08s Synthesize event
t = 11.13s Scroll element to visible
t = 11.14s Assertion Failure: UI Testing Failure - Failed to scroll to visible (by AX action) Button 0x7f7fcaebde40: traits: 8589934593, {{353.0, 26.0}, {53.0, 30.0}}, label: 'Cancel', error: Error -25204 performing AXAction 2003
I guess the "Cancel" button returns false for the hittable property here, which is preventing it from being tapped.
If you look at tap() in the documentation, it says:
/*!
* Sends a tap event to a hittable point computed for the element.
*/
- (void)tap;
It seems things are broken with Xcode 7.1. To keep myself (and you too ;)) unblocked from these issues, I wrote an extension on XCUIElement that allows tapping an element even if it is not hittable. The following can help you.
/* Sends a tap event to a hittable/unhittable element. */
extension XCUIElement {
    func forceTapElement() {
        if self.hittable {
            self.tap()
        }
        else {
            let coordinate: XCUICoordinate = self.coordinateWithNormalizedOffset(CGVectorMake(0.0, 0.0))
            coordinate.tap()
        }
    }
}
Now you can call it as:
button.forceTapElement()
Update - For Swift 3 use the following:
extension XCUIElement {
    func forceTapElement() {
        if self.isHittable {
            self.tap()
        }
        else {
            let coordinate: XCUICoordinate = self.coordinate(withNormalizedOffset: CGVector(dx: 0.0, dy: 0.0))
            coordinate.tap()
        }
    }
}
For me, the root cause was that the objects I wanted to tap:
- had been set to hidden (and back)
- had been removed and re-attached
In both cases the isAccessibilityElement property was false afterwards. Setting it back to true fixed it.
This question ranks well for Google queries around the term "Failed to scroll to visible (by AX action) Button". Given the age of the question, I was inclined to think this was no longer an issue with the XCUITest framework, as the accepted answer suggests.
I found this issue was due to the XCUIElement existing, but being hidden behind the software keyboard. The error is emitted by the framework since it is unable to scroll an element that exists into view to make it tappable. In my case the button in question was sometimes behind the software keyboard.
I found the iOS Simulator's software keyboard may be toggled off in some cases (eg: on your machine) and toggled on in others (eg: on your CI). In my case I had toggled the software keyboard off on one machine, and by default it was toggled on on others.
Solution: Dismiss the keyboard before attempting to tap buttons that may be behind it.
I found tapping somewhere that explicitly dismissed the keyboard before tapping on the button solved my problem in all environments.
I added some actions to get the current responder to resignFirstResponder. The views behind my text views will force the first responder to resign, so I tap somewhere just underneath the last text area.
/// The keyboard may be up, dismiss it by tapping just below the password field
let pointBelowPassword = passwordSecureTextField.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 1))
pointBelowPassword.press(forDuration: 0.1)
Sandy's workaround seemed to help for a while, but then it stopped working, so I changed it like this:
func waitAndForceTap(timeout: UInt32 = 5000) {
    // waitForElement(timeout:) is a custom helper that waits for the element to exist
    XCTAssert(waitForElement(timeout: timeout))
    coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.5)).tap()
}
The main point is that, since the problem is the isHittable check throwing an exception, I don't do that check at all and go straight for the coordinates once the element is found.
Please check the trait of the element. I was facing the same issue with a TableViewSectionHeader: I was trying to tap it, but it was failing every time.
Try this:
if !button.isHittable {
    let coordinate: XCUICoordinate = button.coordinate(withNormalizedOffset: CGVector(dx: 0.0, dy: 0.0))
    coordinate.tap()
}
In my case, a programmatically added UI element was covering the button.
If you're using the AppCenter simulator to run the tests, make sure that you're running them on the same device version as your local simulator. I lost 3 days of work because of this.
In the spirit of things that can cover your element: I had the RN (React Native) debugger partially overlaid on top of my icon.
I had this issue because I set a toolbar (and ToolbarItem) on a GeometryReader:
GeometryReader { proxy in
    ZStack { ... }
}
.toolbar {
    ToolbarItem(placement: .navigationBarTrailing) {
        Button(action: { ... }) {
            Text(Localizable.Global.login.localized)
        }
        .accessibilityIdentifier("welcomeViewLoginButton")
    }
}
After setting the toolbar on the ZStack instead, the toolbar item button was hittable again.
GeometryReader { proxy in
    ZStack {
        ...
    }
    .toolbar {
        ToolbarItem(placement: .navigationBarTrailing) {
            Button(action: { ... }) {
                Text(Localizable.Global.login.localized)
            }
            .accessibilityIdentifier("welcomeViewLoginButton")
        }
    }
}
I'm making an app, and I need to make some changes when the screen resolution changes, but I couldn't find out how to intercept this event.
Do you have any idea how I can catch that event?
The NSApplicationDidChangeScreenParametersNotification is posted when the configuration of the displays attached to the computer is changed, so
you can register for that notification, e.g. with
NSNotificationCenter.defaultCenter().addObserverForName(NSApplicationDidChangeScreenParametersNotification,
object: NSApplication.sharedApplication(),
queue: NSOperationQueue.mainQueue()) {
notification -> Void in
println("screen parameters changed")
}
Note that there can be various reasons why this notification is fired, e.g. a change in the Dock size (as observed in Cocoa Dock fires NSApplicationDidChangeScreenParametersNotification), so you have to "remember" the old resolution and compare it with the new one.
Swift 4:
The didChangeScreenParametersNotification is posted when the configuration of the displays attached to the computer is changed.
Inside func applicationDidFinishLaunching() in the AppDelegate class, or func viewDidLoad() in a ViewController class, insert the following code:
NotificationCenter.default.addObserver(forName: NSApplication.didChangeScreenParametersNotification,
                                       object: NSApplication.shared,
                                       queue: OperationQueue.main) { notification in
    print("screen parameters changed")
}
I personally used it to re-center my application's window when switching between the Mac's built-in display and an external screen.
Here is the updated Swift 3 code:
NotificationCenter.default.addObserver(forName: NSNotification.Name.NSApplicationDidChangeScreenParameters,
object: NSApplication.shared(),
queue: OperationQueue.main) {
notification -> Void in
print("screen parameters changed")
}
Code for Swift 5+
NotificationCenter.default.addObserver(
    forName: NSApplication.didChangeScreenParametersNotification,
    object: NSApplication.shared,
    queue: .main) { notification in
        self.adjustUIIfNeeded()
}