Unable to use optional Int "possibleNumber" in optional binding - Xcode

I'm new to Swift and am trying to learn the concept of optional binding. I came up with the following code:
let possibleNumber = Int("123")
possibleNumber.dynamicType
if let actualNumber = Int(possibleNumber) {
    print("\(possibleNumber) has an integer value of \(actualNumber)")
} else {
    print("\(possibleNumber) could not be converted to an int")
}
The Xcode playground outputs this error message:
value of optional type 'Int?' not unwrapped; did you mean to use '!' or '?'
However, when I added the "!" to make it if let actualNumber = Int(possibleNumber!):
let possibleNumber = Int("123")
possibleNumber.dynamicType
if let actualNumber = Int(possibleNumber!) {
    print("\(possibleNumber) has an integer value of \(actualNumber)")
} else {
    print("\(possibleNumber) could not be converted to an int")
}
Xcode displays another error message:
initializer for conditional binding must have Optional type, not 'Int'
Why is this happening?

The result of
let possibleNumber = Int("123")
is an optional Int - Int?
Then you're trying to create another Int with
Int(possibleNumber)
which does not work because the initializer expects a non-optional type.
The error message is related to the initializer rather than to the optional binding.
Try this to get the same error message.
let possibleNumber = Int("123")
let x = Int(possibleNumber)
In your second example, when you initialize an Int with the force-unwrapped value (which is a plain Int rather than an Int?), you get a non-optional Int, and the compiler complains that the conditional binding is missing an Optional.
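A minimal sketch of that second case (possibleNumber is the name from the question; copy and n are placeholder names for illustration):
let possibleNumber = Int("123")        // Int? - the failable initializer returns an optional
let copy = Int(possibleNumber!)        // Int - force-unwrapping first yields a non-optional value
// if let n = Int(possibleNumber!) {}  // error: initializer for conditional binding must have Optional type, not 'Int'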

In the if let construct
if let actualNumber = Int(possibleNumber!) {
    print("\(possibleNumber) has an integer value of \(actualNumber)")
}
you don't need to use the Int initializer. You simply need to write
if let actualNumber = possibleNumber {
    print("\(possibleNumber) has an integer value of \(actualNumber)")
}
Now Swift will try to unwrap possibleNumber. If that succeeds, the unwrapped value is assigned to actualNumber and the then-branch is executed.
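Put together, a minimal playground sketch of the corrected code (same names as in the question) would be:
let possibleNumber = Int("123")   // Int?

if let actualNumber = possibleNumber {
    // interpolating the optional prints it as Optional(123)
    print("\(possibleNumber) has an integer value of \(actualNumber)")
} else {
    print("\(possibleNumber) could not be converted to an Int")
}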

Related

F# record: ref vs mutable field

While refactoring my F# code, I found a record with a field of type bool ref:
type MyType =
    {
        Enabled : bool ref
        // other, irrelevant fields here
    }
I decided to try changing it to a mutable field instead
// Refactored version
type MyType =
    {
        mutable Enabled : bool
        // other fields unchanged
    }
I also applied all the changes required to make the code compile (e.g. changing := to <-, removing the incr and decr functions, etc.).
I noticed that after the changes some of the unit tests started to fail.
As the code is pretty large, I can't really see what exactly changed.
Is there a significant difference in implementation of the two that could change the behavior of my program?
Yes, there is a difference. Refs are first-class values, while mutable variables are a language construct.
Or, from a different perspective, you might say that ref cells are passed by reference, while mutable variables are passed by value.
Consider this:
type T = { mutable x : int }
type U = { y : int ref }
let t = { x = 5 }
let u = { y = ref 5 }
let mutable xx = t.x
xx <- 10
printfn "%d" t.x // Prints 5
let mutable yy = u.y
yy := 10
printfn "%d" !u.y // Prints 10
This happens because xx is a completely new mutable variable, unrelated to t.x, so mutating xx has no effect on t.x.
But yy is a reference to the exact same ref cell as u.y, so that pushing a new value into that cell while referring to it via yy has the same effect as if referring to it via u.y.
If you "copy" a ref, the copy ends up pointing to the same ref, but if you copy a mutable variable, only its value gets copied.
The difference is not really about being first-class or being passed by reference versus by value; it's that a ref is simply a container (a class) in its own right.
The difference is more obvious when you implement a ref by yourself. You could do it like this:
type Reference<'a> = {
    mutable Value: 'a
}
Now look at both definitions.
type MyTypeA = {
    mutable Enabled: bool
}
type MyTypeB = {
    Enabled: Reference<bool>
}
MyTypeA has an Enabled field that can be changed directly; in other words, it is mutable.
MyTypeB, on the other hand, is itself immutable, but its Enabled field references a mutable object.
The Enabled of MyTypeB just refers to an object that happens to be mutable, like millions of other classes in .NET. From the above type definitions, you can create objects like these:
let t = { MyTypeA.Enabled = true }
let u = { MyTypeB.Enabled = { Value = true }}
Creating the values makes it more obvious that the first has a mutable field, while the second contains an object with a mutable field.
You can find the implementation of ref in FSharp.Core/prim-types.fs; it looks like this:
[<DebuggerDisplay("{contents}")>]
[<StructuralEquality; StructuralComparison>]
[<CompiledName("FSharpRef`1")>]
type Ref<'T> =
    {
        [<DebuggerBrowsable(DebuggerBrowsableState.Never)>]
        mutable contents: 'T
    }
    member x.Value
        with get() = x.contents
        and set v = x.contents <- v

and 'T ref = Ref<'T>
The ref keyword in F# is just the built-in way to create such a pre-defined mutable Reference object, instead of defining your own type for it. It also has the benefit of working well whenever you need to pass byref, in or out values in .NET, so you should use ref there. But you can also use a mutable for this. For example, both of the following code samples do the same thing.
With a reference
let parsed =
    let result = ref 0
    match System.Int32.TryParse("1234", result) with
    | true -> result.Value
    | false -> result.Value
With a mutable
let parsed =
    let mutable result = 0
    match System.Int32.TryParse("1234", &result) with
    | true -> result
    | false -> result
In both examples you get 1234 parsed as an int. The first example creates an FSharpRef and passes it to Int32.TryParse, while the second creates a local mutable variable and passes it by reference (as an out argument) to Int32.TryParse.

Unable to type alias a specialised generic enum

Given the following:
use std::fmt::Debug;

#[derive(Debug)]
enum A<T: Debug> {
    X,
    Y(T),
}

#[derive(Debug)]
struct B;

type C = A<B>;
// use A<B> as C; // Does not compile
I can use it as:
fn main() {
    let val0 = A::X::<B>;
    let val1 = A::Y::<B>(B);
    println!("{:?}\t{:?}", val0, val1);
}
But with more than one generic parameter (or if A, B, etc. were much longer names), I wanted an alias, so I tried the following, which doesn't compile:
fn main() {
    let val0 = C::X;
    let val1 = C::Y(B);
    println!("{:?}\t{:?}", val0, val1);
}
with errors:
src/main.rs:656:16: 656:20 error: no associated item named `X` found for type `A<B>` in the current scope
src/main.rs:656 let val0 = C::X;
^~~~
src/main.rs:657:16: 657:20 error: no associated item named `Y` found for type `A<B>` in the current scope
src/main.rs:657 let val1 = C::Y(B);
As noted above, I am also unable to use use to solve the problem. Is there a way around it (because typing the whole thing out seems cumbersome)?
rustc --version
rustc 1.9.0 (e4e8b6668 2016-05-18)
Is there a way around it (because typing the whole thing seems to be cumbersome)?
You can specify C as the type of the variable so you can use A::X or A::Y without explicitly specifying the type parameter:
let val0: C = A::X;
let val1: C = A::Y(B);

Xcode: need to convert strings to double and back to string

This is my line of code:
budgetLabel.text = String((budgetLabel.text)!.toInt()! - (budgetItemTextBox.text)!.toInt()!)
The code works, but when I try to input a floating-point value into the text box the program crashes. I am assuming the strings need to be converted to a float/double data type. I keep getting errors when I try to do that.
In Swift 2 there are new failable initializers that allow you to do this in a safer way: Double("...") returns an optional, and for input like the string "abc" the failable initializer returns nil, so you can use optional binding to handle it, like this:
let s1 = "4.55"
let s2 = "3.15"
if let n1 = Double(s1), let n2 = Double(s2) {
    let newString = String(n1 - n2)
    print(newString)
} else {
    print("Some string is not a double value")
}
If you're using a version of Swift < 2, the old way was:
var n1 = ("9.99" as NSString).doubleValue // invalid returns 0, not an optional. (not recommended)
// invalid returns an optional value (recommended)
var pi = NSNumberFormatter().numberFromString("3.14")?.doubleValue
Fixed: Added Proper Handling for Optionals
import UIKit

let budgetLabel: UILabel = UILabel()
let budgetItemTextBox: UITextField = UITextField()

// Compute the new label text inside an immediately-applied closure
budgetLabel.text = ({
    var value = ""
    if let budgetString = budgetLabel.text, let budgetItemString = budgetItemTextBox.text {
        if let budgetValue = Float(budgetString), let budgetItemValue = Float(budgetItemString) {
            value = String(budgetValue - budgetItemValue)
        }
    }
    return value
})()
You need to be using if let. In Swift 2.0 it would look something like this:
if let
    budgetString: String = budgetLabel.text,
    budgetItemString: String = budgetItemTextBox.text,
    budget: Double = Double(budgetString),
    budgetItem: Double = Double(budgetItemString) {
    budgetLabel.text = String(budget - budgetItem)
} else {
    // If a number was not found, what should it do here?
}

Concatenate in Swift: Binary operator '+' cannot be applied to operands of type 'String' and 'AnyObject'

I get the error in Swift and don't understand it when I do this:
if (currentUser["employer"] as! Bool == false) {
    print("employer is false: " + currentUser["employer"] as! Bool)
}
But I can do this (it's not actually printing anything though, maybe that's another problem):
if (currentUser["employer"] as! Bool == false) {
    print(currentUser["employer"])
}
The first snippet results in the error:
Binary operator '+' cannot be applied to operands of type 'String' and 'AnyObject'
Similarly, this works:
let currentUser = PFUser.currentUser()!
let isEmployer = currentUser["employer"]
print("isEmployer: \(isEmployer)")
print(currentUser["employer"])
But these two don't work:
print("employer: "+currentUser["employer"])
print("employer: \(currentUser["employer"])")
I also happen to be using Parse to get data, and not sure if that is the right way either.
The error message might be misleading in the first example
if currentUser["employer"] as! Bool == false {
    print("employer is false: " + currentUser["employer"] as! Bool)
}
In this case, the error message should really be
binary operator '+' cannot be applied to operands of type 'String' and 'Bool'
because currentUser["employer"] as! Bool is a non-optional Bool and cannot be implicitly converted to String.
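A minimal repro of that diagnosis with a plain Bool (isEmployer and message are hypothetical names, not from the question):
let isEmployer: Bool = false
// let message = "employer is false: " + isEmployer        // error: '+' cannot be applied to 'String' and 'Bool'
let message = "employer is false: " + String(isEmployer)   // works once the Bool is converted explicitly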
Those examples
print("employer: "+currentUser["employer"])
print("employer: \(currentUser["employer"])")
don't work because
In the first line, currentUser["employer"] without any typecast is an optional AnyObject (i.e. its type is unspecified), which has no matching + operator.
In the second line, the string literal "employer" within the String interpolated expression causes a syntax error (which is fixed in Xcode 7.1 beta 2).
Edit:
This syntax is the usual way.
let isEmployer = currentUser["employer"]
print("isEmployer: \(isEmployer)")
Or alternatively, you can write
print("employer is " + String(currentUser["employer"] as! Bool))

Casting struct ProcessSerialNumber to UnsafePointer<Void>

I have a ProcessSerialNumber and want to create an NSAppleEventDescriptor from it, the same way as shown in issue 14 of objc.io. However, the initializer expects an UnsafePointer<Void>.
let psn = ProcessSerialNumber(highLongOfPSN: UInt32(0), lowLongOfPSN: UInt32(kCurrentProcess))
let target = NSAppleEventDescriptor(
    descriptorType: typeProcessSerialNumber,
    bytes: &psn, // <-- this fails
    length: sizeof(ProcessSerialNumber)
)
What am I missing to convert it correctly?
Yet another glorious Swift error-message failure: the real problem is that typeProcessSerialNumber is an Int and the initializer expects a DescType. Use:
let target = NSAppleEventDescriptor(descriptorType: DescType(typeProcessSerialNumber), bytes:&psn, length:sizeof(ProcessSerialNumber))
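A minimal sketch of the corrected snippet in full; note that psn must be declared with var (rather than let, as in the question) so it can be passed with &, and this assumes an OS X target with Cocoa imported:
import Cocoa

var psn = ProcessSerialNumber(highLongOfPSN: UInt32(0), lowLongOfPSN: UInt32(kCurrentProcess))
let target = NSAppleEventDescriptor(
    descriptorType: DescType(typeProcessSerialNumber),
    bytes: &psn,
    length: sizeof(ProcessSerialNumber)
)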
