How do you invoke HIDictionaryWindowShow in Swift? I tried this:
import Carbon
if let text = _dictionaryText, let range = _dictionaryRange
{
let font = CTFontCreateWithName("Baskerville", 16, nil);
let point = CGPoint(x: 0.0, y: 0.0);
var trns = CGAffineTransform();
HIDictionaryWindowShow(nil, text, range, font, point, false, &trns);
}
But I get this error:
Cannot invoke 'HIDictionaryWindowShow' with an argument list of type
'(nil, String, CFRange, CTFont!, CGPoint, Bool, CGAffineTransform)'
I don't see which argument is wrong. The first and last arguments should be allowed to be nil; the docs say NULL is OK as the first argument, which would be nil in Swift, or is it? Since there is no NULL in Swift, what do I need to pass instead?
Sorry, I don't have the Swift code, but you can probably work from the following untested Objective-C code. Note that showDefinitionForAttributedString was added to replace HIDictionaryWindowShow, which is, as you know, a Carbon function and won't be supported forever.
[self.view showDefinitionForAttributedString:[[NSAttributedString alloc] initWithString:text] atPoint:NSMakePoint(0.0, 0.0)];
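For reference, here is a rough, untested Swift translation of that Objective-C call. It assumes you are inside a view controller on OS X and that text is the String you want defined; adjust the anchor view and point to taste:
import Cocoa
// Untested sketch: shows the Dictionary panel for `text`,
// anchored at the view's origin.
let attributed = NSAttributedString(string: text)
self.view.showDefinitionForAttributedString(attributed, atPoint: NSPoint(x: 0.0, y: 0.0))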
EDIT:
Looking further, the second argument in your example is also not correct. The Carbon library does not understand String; it wants a CFTypeRef.
I am trying to grab data from a text field named 'temperatureTextField' and assign it to 't', which is a Double. Ideally the user is meant to enter a numeric value in the temperatureTextField.
Here is my method:
@IBOutlet weak var temperatureTextField: UITextField!

@IBAction func convert(sender: AnyObject) {
    let t = Double(temperatureTextField.text!)
    let tempM = TemperatureModel(temp: t!)
    temperatureTextField.text = String(tempM.toCelsius())
}
The red exclamation is coming from the line "let t = Double(temperatureTex...)"
You're probably using Xcode 6, so Swift 1.2, but the String initializer for Double is only available in Swift 2 (Xcode 7).
You can always use NSString's doubleValue property:
let t = (temperatureTextField.text! as NSString).doubleValue
but I'd recommend using Xcode 7 and Swift 2 as soon as possible.
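For completeness, in Swift 2 (Xcode 7) the failable Double(String) initializer lets you write the action roughly like this. This is just a sketch that safely unwraps both the text and the conversion instead of force-unwrapping:
@IBAction func convert(sender: AnyObject) {
    // In Swift 2, text is a String?, so unwrap it first, then unwrap the conversion.
    if let text = temperatureTextField.text, t = Double(text) {
        let tempM = TemperatureModel(temp: t)
        temperatureTextField.text = String(tempM.toCelsius())
    }
}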
As Eric suggested, I ran into this issue because I was running an outdated version of Xcode.
Here is what my code looked like afterwards, in case anyone runs into this and is unable to update:
let t = (inputText.text! as NSString).doubleValue
let tempModel = TemperatureModel(temp: t)
inputText.text = "\(tempModel.toCelsius())"
Since I am fairly new to Swift programming on OS X, this question may contain several points that need clarification.
I have a method which iterates over all subviews of a given NSView instance. For this, I get the array of subviews which is of type [AnyObject] and process one element at a time.
At some point I would like to access the identifier property of each instance. This property comes from a protocol adopted by NSView, NSUserInterfaceItemIdentification, and its type is given in the documentation as an optional String?. In order to get that identifier I wrote
var view : NSView = subview as NSView;
var viewIdent : String = view.identifier!;
The second line is marked by the compiler with an error stating that identifier is not of an optional type, but instead of type String, and hence the post-fix operator ! cannot be applied.
Removing this operator compiles fine, but leads to a runtime error EXC_BAD_ACCESS (code=1, address=0x0) because identifier seems to be nil for some NSButton instance.
I cannot even test for this property, because the compiler gives me a 'String is not convertible to UInt8' error when I try
if (view.identifier != nil) {viewIdent = view.identifier;}
My questions are
Is the documentation wrong, i.e. is the property identifier not optional?
How can I work around this problem and get code that runs robustly?
If the documentation states that view.identifier is an Optional, it means it can be nil. So it's not a surprise that for some button instances it is indeed nil for you.
Force unwrapping an element that can be nil will make your app crash; use safe unwrapping instead:
if let viewIdent = view.identifier {
// do something with viewIdent
} else {
// view.identifier was nil
}
You can easily check the type of an element in Xcode: click on the element while holding the ALT key. A popup with information, including the type, will appear; you can verify there whether your element is an Optional or not.
Tip: you can safely unwrap several items on one line, which is rather convenient:
if let view = subview as? NSView, viewIdent = view.identifier {
// you can do something here with `viewIdent` because `view` and `view.identifier` were both not nil
} else {
// `view` or `view.identifier` was nil, handle the error here
}
EDIT:
You have to remove this line of yours before using my example:
var viewIdent : String = view.identifier!
If you keep that line before my examples, they won't work, because the exclamation mark turns what was an Optional into a non-Optional.
It also forces the cast to a String, but maybe your identifier is an Int instead, so you shouldn't use this kind of declaration; prefer if let ... to safely unwrap and cast the value.
EDIT2:
You say my example doesn't compile... I test every answer I post on SO, and I tested this one in a Playground before answering.
Also, after checking, I confirm that identifier is an Optional String; that is the type Xcode shows when ALT-clicking the property. The documentation is right.
So if it's different for you, it means you have a different problem unrelated to this one, but my answer to this precise question remains the same.
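Putting this together for the original subview loop, here is a sketch in Swift 1.2-era syntax. It assumes, as in the question, that subviews is [AnyObject] and that identifier really is a String?:
func dumpIdentifiers(container: NSView) {
    for subview in container.subviews {
        // Cast and unwrap in one step; views without an identifier are reported, not crashed on.
        if let view = subview as? NSView, viewIdent = view.identifier {
            println("identifier: \(viewIdent)")
        } else {
            println("subview has no identifier")
        }
    }
}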
I'm using Xcode 6.3.1 and Swift.
When a function with multiple parameters fails to compile because of an argument type error, it is hard to know which argument is wrong.
For example, with CGBitmapContextCreate(), this code:
let colorSpace:CGColorSpace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo = CGBitmapInfo(CGImageAlphaInfo.PremultipliedLast.rawValue)
let context = CGBitmapContextCreate(nil, UInt(size.width), UInt(size.height), 8, 0, colorSpace, bitmapInfo)
will produce an error like this:
MyFile.swift:23:19: Cannot invoke 'CGBitmapContextCreate' with an argument list of type '(nil, UInt, UInt, Int, Int, CGColorSpace, CGBitmapInfo)'
By carefully comparing the documentation against my argument list, I can work out that it is the 2nd and 3rd arguments that are wrong; they should be Int.
Is there any way to make the compiler smarter about this?
The problem is probably that the online documentation you are looking at for CGBitmapContextCreate does not match the definition you are actually compiling against.
The last parameter needs to be of type UInt32, but CGBitmapInfo(...) returns a CGBitmapInfo value. That is why the compiler is complaining: you are passing the wrong type of parameter. You can even right-click the function and choose "See Definition" to verify this.
Instead, pass CGImageAlphaInfo.PremultipliedLast.rawValue directly, as it is the UInt32 that is expected.
Example Solution:
let colorSpace:CGColorSpace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo = CGImageAlphaInfo.PremultipliedLast.rawValue
let context = CGBitmapContextCreate(nil, UInt(size.width), UInt(size.height), 8, 0, colorSpace, bitmapInfo)
You'll find that the source now compiles and gives the expected result. Note that you can still apply any bitwise operations you want to this value.
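For instance, a sketch of OR-ing a byte-order flag into the alpha info (the flag names follow the Swift 1.2/2-era API; pick whichever flags your context actually needs):
// Both raw values are UInt32, so a plain bitwise OR works.
let bitmapInfo = CGImageAlphaInfo.PremultipliedLast.rawValue | CGBitmapInfo.ByteOrder32Little.rawValue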
PS: I had the same issue you were having, and was amply frustrated when I couldn't find a solution.
It works!
// Dimensions of the source image
let width = CGImageGetWidth(image)
let height = CGImageGetHeight(image)
let colorSpace = CGColorSpaceCreateDeviceRGB()
// 4 bytes per pixel (RGBA), 8 bits per component
let bytesPerRow = 4 * width
let bitsPerComponent: Int = 8
// Raw pixel buffer backing the bitmap context
let pixels = UnsafeMutablePointer<UInt8>(malloc(width * height * 4))
var context = CGBitmapContextCreate(pixels, width, height, bitsPerComponent, bytesPerRow, colorSpace, CGBitmapInfo())
let timeFont = [NSFontAttributeName:UIFont(name: "Voyage", size: 20.0)]
var attrString3 = NSAttributedString("(Time)", attributes : timeFont); // <--- compiler error "Extra argument in call"
This code worked in Xcode 6.0, but now that I've upgraded to Xcode 6.1 it doesn't work anymore and I can't figure out what I need to get it working again. It says that there is an extra argument, but that's not correct. I believe it has something to do with the new failable initializers, but everything I've tried doesn't work.
There are two reasons your code is failing to compile:
1. The initializer for NSAttributedString that you want to use now requires explicit labeling of the string parameter.
2. The UIFont initializer you are using now returns an optional (i.e., UIFont?), which needs to be unwrapped before you pass it in the attributes dictionary.
Try this instead:
let font = UIFont(name: "Voyage", size: 20.0) ?? UIFont.systemFontOfSize(20.0)
let attrs = [NSFontAttributeName : font]
var attrString3 = NSAttributedString(string: "(Time)", attributes: attrs)
Note the use of the new nil coalescing operator ??. It unwraps the optional Voyage font but falls back to the system font if Voyage is unavailable (which seems to be the case in the Playground). This way, you get your attributed string regardless, even if your preferred font can't be loaded.
Xcode 6.1 comes with Swift 1.1, which supports failable initializers. UIFont initialization can fail and return nil. Also, use the string: label when creating the NSAttributedString:
if let font = UIFont(name: "Voyage", size: 20.0) {
let timeFont = [NSFontAttributeName:font]
var attrString3 = NSAttributedString(string: "(Time)", attributes : timeFont)
}
Coding in Swift, I get the error "binary operator '+' cannot be applied to two CGFloat operands"...
Is the message masking something else OR can you really not add two CGFloat operands? If not, why (on earth) not?
EDIT
There is nothing special in the code I am trying to do; what is interesting is that the above error message, VERBATIM, is what the Swift assistant compiler tells me (underlining my code with red squiggly lines).
Running Xcode 6.3 (Swift 1.2)
Adding two CGFloat variables with the binary operator '+' is absolutely possible. What you need to know is that the resulting variable is also a CGFloat (by type inference).
let value1 : CGFloat = 12.0
let value2 : CGFloat = 13.0
let value3 = value1 + value2
println("value3 \(value3)")
//The result is value3 25.0, and the value3 is of type CGFloat.
EDIT:
In Swift 3:
let value = CGFloat(12.0) + CGFloat(13.0)
println("value \(value)")
//The result is value 25.0, and the value is of type CGFloat.
I ran into this using this innocent-looking piece of code:
func foo(line: CTLine) {
var ascent: CGFloat = 0
var descent: CGFloat = 0
var leading: CGFloat = 0
let fWidth = Float(CTLineGetTypographicBounds(line, &ascent, &descent, &leading))
let height = ceilf(ascent + descent)
// ~~~~~~ ^ ~~~~~~~
}
I found the solution by expanding the error in the Issue Navigator.
Looking into it, I think Swift found the +(lhs: Float, rhs: Float) -> Float function first based on its return type (ceilf takes a Float). Obviously, that operator takes Floats, not CGFloats, which sheds some light on the meaning of the error message. So, to force it to use the right operator, you have to use a wrapper function that takes either a Double or a Float (or just a CGFloat). I tried this and the error was solved:
// ...
let height = ceilf(Float(ascent + descent))
// ...
Another approach would be to use a function that takes a Double or CGFloat:
// ...
let height = ceil(ascent + descent)
// ...
So the problem is that the compiler prioritizes return types over parameter types. Glad to see this is still happening in Swift 3 ;P
There are mainly two possible reasons for this kind of error.
First: whenever you try to compare optional CGFloat variables, like
if a? >= b? {
    videoSize = size
}
This is responsible for the error, so just write it as if a! >= b! {} instead.
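If you'd rather avoid the force-unwrap (which crashes when either value is nil), a safe-unwrapping variant along the lines of the earlier answer might look like this, assuming a and b are optional CGFloats:
// Sketch: only compare once both optionals are known to hold values.
if let a = a, b = b where a >= b {
    videoSize = size
}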
Second:
Second: whenever you use literal values directly to compute a result, this kind of error can occur, like
var result = 1.5 / 1.2
Don't write it as above; use variables that are declared as CGFloat, like
var a : CGFloat = 1.5
var b : CGFloat = 1.2
var result : CGFloat!
result = a / b