How to set the width of an NSTableColumn in Swift - Xcode

I'd like to set the width of some NSTableColumns programmatically (so that I can restore the widths on startup), but I don't really know how to apply what's written in the docs:
for column in table.tableColumns {
    var w: CGFloat = 125
    column.setWidth(w)
    println("\(column.identifier!)") // this prints my identifiers, so I know these are my columns and not something else I'm not interested in
}
The error I get is as follows: '(#lvalue CGFloat) -> $T3' is not identical to 'CGFloat'
With just 125 as the argument to setWidth, the error says '(IntegerLiteralConvertible) -> etc...'
Code completion in Xcode shows four versions of setWidth(), each of which takes at least two arguments, and none with just the width, which is all I care about. My guess is that the docs don't match Xcode 6.1.1. They suggest there's just a setWidth() method, but in real life I have to choose between four equally confusing versions.

Quincey_Morris gave me this answer in the Apple developer forums (I hope this isn't a breach of Apple's terms and conditions): I had to cast the columns as [NSTableColumn] before the opening brace of my for loop, after which I could simply assign column.width = 125.
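Put together, a sketch of the resulting loop (Swift 1.x era, hence println; table and the 125-point width are the ones from the question):
// Cast tableColumns to [NSTableColumn], then assign the width directly.
for column in table.tableColumns as [NSTableColumn] {
    column.width = 125
    println("\(column.identifier!)") // still prints my identifiers
}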

Where is dragThatCannotResizeWindow in Interface Builder/Storyboard?

I found NSLayoutConstraint.Priority.dragThatCannotResizeWindow to be very useful. Though I usually write constraints in code, the code sometimes gets too long and I want to move them to IB.
How can I set the priority of a constraint to dragThatCannotResizeWindow in Interface Builder? There are only 3 options, corresponding to required, defaultHigh, and defaultLow in code.
I tried looking up dragThatCannotResizeWindow storyboard on Google, and nothing useful came up (< 50 results!)
There are also a bunch of other priorities, like dragThatCanResizeWindow and fittingSizeCompression, which I'm also interested in.
One guess I had was that unlike required, defaultHigh, and defaultLow, these "semantic" priorities don't have a fixed value, and hence cannot be selected in the storyboard.* Is this true? If so, what do their values depend on?
*This is kind of a weak reason, since semantic colors like "label" and "background" don't have a fixed value either, and yet they are in IB...
NSLayoutConstraint.Priority is a struct with these predefined "semantic" values:
.required: 1000.0
.defaultHigh: 750.0
.dragThatCanResizeWindow: 510.0
.windowSizeStayPut: 500.0
.dragThatCannotResizeWindow: 490.0
.defaultLow: 250.0
.fittingSizeCompression: 50.0
As we know, Storyboard / IB lists only Required, High and Low ... but we can type any value between 0 and 1000 into the Priority field.
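So in practice you either type the raw value (490) into IB's Priority field, or keep that one constraint in code. A minimal sketch of the code route, with a stand-in view and an arbitrary 200-point width:
import AppKit

let someView = NSView() // stand-in for a real view from your hierarchy
let widthConstraint = someView.widthAnchor.constraint(equalToConstant: 200)
widthConstraint.priority = .dragThatCannotResizeWindow // rawValue 490; typing 490 into IB's Priority field has the same effect
widthConstraint.isActive = true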

Configure larger limit of columns on Format module

The Format module
The Format module is used to build and combine pretty-printers, with a syntactic extension that allows typed formats; it helps a lot when you are writing something like a code generator or a data-structure printer.
The problem
However, the formatter's margin is initialized to a limit of 78 columns, and anything that takes more than this limit is pushed back to the left.
I'm printing a lighter version of a Yojson.Basic.json program using the Format module, but when the input is too large the output collapses, which is not really "pretty".
Preview
Here is how it is formatted when it is short:
Here is how it is formatted when the indentation becomes too large:
I've been trying to raise this limit to 120 columns, but haven't had any success.
What have I tried?
Using Format.pp_set_margin ppf 120 to reconfigure the margin
Setting Format.pp_set_max_indent to a larger value
But neither seems to have any effect, and there is no easily available documentation about this limit; I discovered it only by reading the source code.
What am I doing?
let string_of_cst program =
  let ppf = Format.str_formatter in
  (* I've enabled colors. *)
  Format.pp_set_tags ppf colors;
  Format.pp_set_formatter_tag_functions ppf with_colors;
  (* [print_json] is my printer. *)
  print_json ppf program;
  (* Get the string out of the printer. *)
  Format.flush_str_formatter ()
How can I configure a larger limit?
The issue is that the values for margin and max_indent are implicitly constrained to the cone 1 < max_indent < margin, and the function set_max_indent silently does nothing if this constraint is not respected.
To avoid this issue, in OCaml ≥ 4.08 you can use the new set_geometry function, which requires setting both values simultaneously and fails with an exception if the requested max_indent is greater than the margin.
Otherwise, you should always set both values at the same time, and always in this order: margin first, max_indent second. If you don't know which value to choose for max_indent, margin - 10 is generally a reasonable choice.
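Applied to the string_of_cst function from the question, that could look something like this (120/110 are just example values satisfying max_indent < margin; the color setup is omitted for brevity, and print_json is the question's own printer):
let string_of_cst program =
  let ppf = Format.str_formatter in
  (* OCaml >= 4.08: set both values atomically; raises if they are inconsistent. *)
  Format.pp_set_geometry ppf ~max_indent:110 ~margin:120;
  (* Pre-4.08 alternative: margin first, then max_indent.
     Format.pp_set_margin ppf 120;
     Format.pp_set_max_indent ppf 110; *)
  print_json ppf program;
  Format.flush_str_formatter ()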

XCTest self measureBlock modification?

It appears that XCTest's "self measureBlock" is limited to milliseconds and 10 runs of the test code. Is there any way to modify the behavior of measureBlock for more runs and/or nano- or microsecond accuracy?
TL;DR
Apple provides a way to modify the behavior of measureBlock: by providing extra string constants but they don't support any string constants other than the default.
Long explanation
measureBlock: calls the following method:
- (void)measureMetrics:(NSArray *)metrics automaticallyStartMeasuring:(BOOL)automaticallyStartMeasuring withBlock:(void (^)(void))block;
// The implementation looks something like this (I can't say 100%, but I'm pretty sure):
- (void)measureBlock:(void (^)(void))block {
    NSArray<NSString *> *metrics = [[self class] defaultPerformanceMetrics];
    [self measureMetrics:metrics automaticallyStartMeasuring:YES withBlock:block];
}
defaultPerformanceMetrics is a class function that returns an array of strings.
From the Xcode source: "Subclasses can override this to change the behavior of -measureBlock:"
Lovely, that sounds promising; we can customize the behavior, right? Well, they don't give you any strings to return. The default is XCTPerformanceMetric_WallClockTime ("com.apple.XCTPerformanceMetric_WallClockTime").
It turns out there aren't any string constants to return besides that one.
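For illustration, an override would look something like the sketch below, but since the only publicly supported constant is the default one, it doesn't actually buy you anything:
#import <XCTest/XCTest.h>

@interface MyPerformanceTests : XCTestCase
@end

@implementation MyPerformanceTests

// The subclass hook advertised in the header; the only supported metric is the default one.
+ (NSArray<NSString *> *)defaultPerformanceMetrics {
    return @[XCTPerformanceMetric_WallClockTime];
}

@end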
See the slides for WWDC 2014 Session 414 Testing in Xcode 6 (link).
I quote from page 158:
Currently supports one metric: XCTPerformanceMetric_WallClockTime
Nothing else has been added in Xcode 7, so it seems you're out of luck trying to modify measureBlock:, sorry.
I've never found measureBlock: very useful. Check out Tidbits.xcodeproj/TidbitsTestBase/TBTestHelpers/comparePerformance at https://github.com/tipbit/tidbits if you'd like to look at an alternative.

Using termios in Swift

Now that we've reached Swift 2.0, I've decided to convert my as-yet-unfinished OS X app to Swift. I'm making progress, but I've run into some issues using termios and could use some clarification and advice.
The termios struct is treated as a struct in Swift, no surprise there, but what is surprising is that the array of control characters in the struct is now a tuple. I was expecting it to just be an array. As you might imagine, it took me a while to figure this out. Working in a playground, if I do:
var settings: termios = termios()
print(settings)
then I get the correct details printed for the struct.
In Obj-C to set the control characters you would use, say,
cfmakeraw(&settings);
settings.c_cc[VMIN] = 1;
where VMIN is a #define equal to 16 in termios.h. In Swift I have to do
cfmakeraw(&settings)
settings.c_cc.16 = 1
which works, but is a bit more opaque. I would prefer to use something along the lines of
settings.c_cc.vmin = 1
instead, but can't seem to find any documentation describing the Swift "version" of termios. Does anyone know if the tuple has pre-assigned names for its elements, or if not, is there a way to assign names after the fact? Should I just create my own tuple with named elements and then assign it to settings.c_cc?
Interestingly, despite the fact that pre-processor directives are not supposed to work in Swift, if I do
print(VMIN)
print(VTIME)
then the correct values are printed and no compiler errors are produced. I'd be interested in any clarification or comments on that. Is it a bug?
The remaining issues have to do with further configuration of the termios.
The definition of cfsetspeed is given as
func cfsetspeed(_: UnsafeMutablePointer<termios>, _: speed_t) -> Int32
and speed_t is typedef'ed as an unsigned long. In Obj-C we'd do
cfsetspeed(&settings, B38400);
but since B38400 is a #define in termios.h we can no longer do that. Has Apple set up replacement global constants for things like this in Swift, and if so, can anyone tell me where they are documented? The alternative seems to be to just plug in the raw values and lose readability, or to create my own versions of the constants previously defined in termios.h. I'm happy to go that route if there isn't a better choice.
Let's start with your second problem, which is easier to solve.
B38400 is available in Swift; it just has the wrong type.
So you have to convert it explicitly:
var settings = termios()
cfsetspeed(&settings, speed_t(B38400))
Your first problem has no "nice" solution that I know of.
Fixed-size arrays are imported into Swift as tuples, and, as far as I know, you cannot address a tuple element with a variable.
However, Swift preserves the memory layout of structures imported from C, as confirmed by Apple engineer Joe Groff. Therefore you can take the address of the tuple and "rebind" it to a pointer to the element type:
var settings = termios()
withUnsafeMutablePointer(to: &settings.c_cc) { (tuplePtr) -> Void in
    tuplePtr.withMemoryRebound(to: cc_t.self, capacity: MemoryLayout.size(ofValue: settings.c_cc)) {
        $0[Int(VMIN)] = 1
    }
}
(Code updated for Swift 4+.)
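For context, here is a sketch combining both pieces to configure an already-open serial file descriptor for raw 38400-baud I/O. The fd parameter and the VMIN/VTIME choices are assumptions for illustration, and error handling is omitted:
import Darwin

// Hypothetical helper; fd is assumed to be an open serial-port descriptor.
func configureRawPort(fd: Int32) {
    var settings = termios()
    tcgetattr(fd, &settings)                // read the current attributes
    cfmakeraw(&settings)                    // switch to raw mode
    cfsetspeed(&settings, speed_t(B38400))  // explicit cast, as above

    // Number of elements in the fixed-size c_cc array (imported as a tuple).
    let ccCount = MemoryLayout.size(ofValue: settings.c_cc) / MemoryLayout<cc_t>.size
    withUnsafeMutablePointer(to: &settings.c_cc) { (tuplePtr) -> Void in
        tuplePtr.withMemoryRebound(to: cc_t.self, capacity: ccCount) { (cc) -> Void in
            cc[Int(VMIN)] = 1               // read returns after at least 1 byte
            cc[Int(VTIME)] = 0              // no inter-byte timeout
        }
    }

    tcsetattr(fd, TCSANOW, &settings)       // apply immediately
}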

Why doesn’t Xcode argument completion filter suggestions by type?

Let’s say I have the following code:
- (void) handlePanGesture: (UIPanGestureRecognizer*) sender
{
    CGPoint foo = [sender translationInView:XXX];
}
Now when I place the cursor at the point marked XXX and press Esc to get a list of completion suggestions, Xcode happily suggests every in-scope symbol regardless of its type (including NSArray*, CGSize, or even SEL). Since Xcode presumably has enough information to know the argument type, why doesn't it filter the suggestions down to only those compatible with UIView*?
