How to check for Not a Number (NaN) in Swift 2

The following method calculates the percentage using two variables.
func casePercentage() {
    let percentage = Int(Double(cases) / Double(calls) * 100)
    percentageLabel.stringValue = String(percentage) + "%"
}
The above method works well except when cases = 1 and calls = 0.
That gives a fatal error: "floating point value can not be converted to Int because it is either infinite or NaN".
So I created this workaround:
func casePercentage() {
    if calls != 0 {
        let percentage = Int(Double(cases) / Double(calls) * 100)
        percentageLabel.stringValue = String(percentage) + "%"
    } else {
        percentageLabel.stringValue = "0%"
    }
}
This gives no errors, but other languages let you check a variable with an .isNaN() method. How does this work in Swift 2?

You can "force unwrap" the optional type using the ! operator:
calls! // asserts that calls is NOT nil and gives a non-optional type
However, this will result in a runtime error if it is nil.
One option to prevent using nil or 0 is to do what you have done and check if it's 0.
The second option is to nil-check:
if calls != nil
The third (and most Swift-y) option is to use the if let structure:
if let nonNilCalls = calls {
    // ...
}
The inside of the if block won't run if calls is nil.
Note that nil-checking and if let will NOT protect you from dividing by 0. You will have to check for that separately.
Combining the second option with your method:
// calls can be neither nil nor <= 0
if calls != nil && calls > 0
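To answer the isNaN question directly: Double in Swift 2 exposes isNaN and isInfinite properties, so you can test the result of the division before converting it. A minimal sketch, reusing the question's cases, calls, and percentageLabel:
let ratio = Double(cases) / Double(calls) // 0/0 yields NaN, n/0 yields infinity
if ratio.isNaN || ratio.isInfinite {
    percentageLabel.stringValue = "0%"
} else {
    percentageLabel.stringValue = String(Int(ratio * 100)) + "%"
}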

Related

Swift. Want an if statement that will give an error if value is not an integer

I am new to this and am trying to learn as much as I can. I have a variable that holds a numerical value. I want an if statement that will look at this value and give an error if it is not an integer. Can someone help?
Thanks
You can check it like this and then work with number inside the if clause:
if let number = numericalValue as? Int {
    // numericalValue is an Int
} else {
    // numericalValue is not an Int
}
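Note that the as? Int pattern applies when the variable's static type is something loose such as Any or AnyObject. A quick sketch under that assumption (the sample values are hypothetical):
let numericalValue: Any = 17.5
if let number = numericalValue as? Int {
    print("\(number) is an Int")
} else {
    print("not an Int") // 17.5 lands here, and so would the String "abc"
}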
I use Int() coupled with an if statement to achieve this (note that number must be a String here, since Int's failable initializer takes a String):
// var number = "17" - will print "17 is an integer"
// var number = "abc" - will print "Error"
if let numberTest = Int(number) {
    print("\(number) is an integer")
} else {
    print("Error")
}
I managed to solve it in the end. I found the remainder operator in the Swift manual. I thought that if I used it to divide by 1, and there was a remainder, then the original value couldn't be an integer. So my code was:
else if ((Double(guessEnteredNumber.text!)!) % 1) > 0 {
    resultText.text = "You need to guess a whole number between 1 and 5"
}
In Swift 2 you can now use the guard keyword, like this:
guard let number = myNumber as? Int else {
    // myNumber is not an Int
    return
}
// myNumber is an Int and you can use number, as it is not nil
You can replace the let keyword with var if you need to modify number afterwards.

Designing Calculator in Swift - How to add single "." when require to enter floating point number?

I am developing a calculator in Swift for iOS 8. When a [0-9] button is pressed, it displays correctly, and the operator buttons [+ - / *] also work.
I now need to add a "." button so the user can enter floating-point numbers such as 19.2, 21.67, or 40.6725. How do I write the code so that "." can appear only once, rather than allowing something like 19.3.4 (not a legal floating-point number in a calculator)?
I found out that rangeOfString(subString: String) might be of great use. It returns an Optional: if the passed String argument cannot be found in the receiver, it returns nil.
@IBOutlet weak var display: UILabel!
@IBAction func appendDigit(sender: UIButton) {
    let digit = sender.currentTitle!
    if (digit == ".") {
        if (display.text?.rangeOfString(".") != nil) {
            // DO Something Here
        } else {
        }
    }
}
You're 99% of the way there. You only want to add the "." to your display if you don't already have one. I'd suggest looking for == nil instead of looking for != nil.
if (digit == ".") {
    if (display.text?.rangeOfString(".") == nil) {
        // append "." to the display text because we
        // haven't seen one yet
        display.text = display.text! + "."
    } else {
        // do nothing, we already have a "." in the
        // display
    }
}
You can leave off the else clause. I left it here for explanation purposes.
Vacawana's answer didn't work out for me, as the syntax is out of date. I came up with a simpler solution:
let dupDecimal = decimalDupCheck(inputDigit: digit, stringToCheck: display.text)
if userIsInTheMiddleOfTyping {
    if !dupDecimal {
        display.text = display.text! + digit
    }
} else {
    display.text = digit == "." ? "0" + digit : digit
    userIsInTheMiddleOfTyping = true
}
} // closes the enclosing button action (the snippet above is from inside it)
func decimalDupCheck(inputDigit digit: String, stringToCheck text: String?) -> Bool {
    if userIsInTheMiddleOfTyping {
        return digit == "." && text?.contains(".") == true
    } else {
        return false
    }
}
Basically, decimalDupCheck is a Bool-returning helper that checks whether "." would be a duplicate, and the caller branches on its result.

Swift distance() method throws fatal error: can not increment endIndex

I was trying to find a substring match in a string, and get the matched position.
I can't figure out what's wrong with the following code:
let str1 = "hello#゚Д゚"
let cmp = "゚Д゚"
let searchRange = Range(start: str1.startIndex, end: str1.endIndex)
let range = str1.rangeOfString(cmp, options: .allZeros, range: searchRange)
println("\(searchRange), \(range!)") // output: 0..<9, 6..<9
let dis = distance(searchRange.startIndex, range!.startIndex) // fatal error: can not increment endIndex! reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0).
// let dis = distance(searchRange.startIndex, range!.endIndex) // This will go and output: distance=7
println("distance=\(dis)")
As the comments suggested, although the range had valid values, the distance() method threw a fatal error.
If I'm wrong about the use of distance(), what method should I use to achieve this?
Any advice would be helpful. Thanks.
range!.startIndex points here:
"hello#゚Д゚"
^
But, in this case, #゚ is a single character in Swift.
Therefore, this code:
for var idx = searchRange.startIndex; idx != range!.startIndex; idx = idx.successor() {
    println("\(idx): \(str1[idx])")
}
prints:
0: h
1: e
2: l
3: l
4: o
5: #゚
7: Д゚
fatal error: Can't form a Character from an empty String
// and emits BAD_INSTRUCTION exception
As you can see, range!.startIndex never falls on a character boundary, so the for loop runs off the end of the string. That's why you see the exception.
In theory, since String is treated as a collection of Characters in Swift, "゚Д゚" should not be a substring of "hello#゚Д゚".
I think .rangeOfString() uses the NSString implementation, which treats the string as a sequence of unichar values. I don't know whether this should be considered a bug or not.
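To make the mismatch visible, compare the two views of the same string (Swift 2 syntax; the counts assume the ゚ marks are the combining scalar U+309A, as described above):
let s = "hello#゚Д゚"
print(s.characters.count) // 7 -- "#゚" and "Д゚" each form a single Character
print(s.utf16.count) // 9 -- the NSString-style view counts the combining marks separately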
Try this:
func search<C: CollectionType where C.Generator.Element: Equatable>(col1: C, col2: C) -> C.Index? {
    if col2.startIndex == col2.endIndex {
        return col1.startIndex
    }
    var col1Ind = col1.startIndex
    while col1Ind != col1.endIndex {
        var ind1 = col1Ind
        var ind2 = col2.startIndex
        while col1[ind1] == col2[ind2] {
            ++ind1; ++ind2
            if ind2 == col2.endIndex { return col1Ind }
            if ind1 == col1.endIndex { return nil }
        }
        ++col1Ind
    }
    return nil
}
Searches for the first instance of the col2 sequence in col1. If found, returns the index of the start of the sub-sequence. If not found, returns nil. If col2 is empty, returns the startIndex of col1.
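Usage might look like this (a sketch assuming Swift 2's .characters view; per the explanation above, this particular needle may simply never match at the Character level):
let str1 = "hello#゚Д゚"
let cmp = "゚Д゚"
if let idx = search(str1.characters, col2: cmp.characters) {
    let dis = str1.characters.startIndex.distanceTo(idx)
    print("found at character offset \(dis)")
} else {
    print("not found as a sequence of Characters") // expected here: the needle crosses grapheme boundaries
}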

For loop won't end. Don't know why

I'm writing a for loop for a project that prompts the user to input a number and keeps prompting, continually adding the numbers up. When a string is entered, the loop should stop. I've done it with a while loop, but the project states that we must also do it with a for loop. The problem is that the prompt keeps running even after a is set to false. Could someone explain JavaScript's thinking process? I want to understand why it keeps running back through the loop even though the condition isn't met. Thank you
var addSequence2 = function() {
    var total = 0;
    var a;
    for (; a = true; ) {
        var input = prompt("Your current score is " + total + "\n" + "Next number...");
        if (!isNaN(input)) {
            a = true;
            total = +total + +input;
        }
        else if (isNaN(input)) {
            a = false;
            document.write("Your total is " + total);
        }
    }
};
There is a difference between a = true and a == true.
Your for-loop is basically asking "can I set 'a' to true?", to which the answer is yes, and the loop continues.
Change the condition to a == true (thus asking "Is the value of 'a' true?")
To elaborate, in most programming languages, we distinguish between assignment ("Make 'x' be 4") and testing for equality ("Is 'x' 4?"). By convention (at least in languages that derive their syntax from C), we use '=' to assign/set a value, and '==' to test.
If I'm understanding the specification correctly (no guarantee), what happens here is that the condition condenses as follows:
Is (a = true) true?
Complete the bracket: set a to true
Is (a) true? (we just set it to true, so it must be!)
Try using the equal to operator, i.e. change
for (; a = true; ) {
to
for (; a == true; ) {
You should use a == true instead of a = true; = is an assignment operator.
With for (; a = true; ) you are assigning the value to the variable a, so it will always remain true and you will end up in an infinite loop. In JavaScript it should be a === true.
I suspect you want your for to look like this:
for (; a == true; )
as a = true is an assignment, not a comparison.
Use a == true. The double equals sign compares the two; a single equals sign assigns the value true to a, so the condition always evaluates to true.
for (; a = true; ) <-- this is an assignment
for (; a == true; ) <-- this is better
Here's your fixed code:
var addSequence2 = function() {
    var total = 0;
    var a = true;
    for (; Boolean(a); ) {
        var input = prompt("Your current score is " + total + "\n" + "Next number...");
        if (!isNaN(input)) {
            total = +total + +input; // unary + keeps this numeric; prompt() returns a string
        }
        else {
            a = false;
            document.write("Your total is " + total);
        }
    }
};

Why does the debugger have to jump "back and forth" before it sets my tuple value?

I've actually fixed this problem already (while documenting it for this post), but I still want to know why it is happening, so that I can understand what I did and hopefully avoid wasting time on it in the future.
In a Swift project, I have a function that parses out a string that I know will be presented in a specific format and uses it to fill in some instance variables.
There is a helper function that is passed the string, a starting index, and a divider character and spits out a tuple made up of the next string and the index from which to continue. Just in case a poorly formatted string gets passed in, I define a return type of (String, Int)? and return nil if the divider character isn't found.
The helper function looks, in relevant part, like this:
func nextChunk(stringArray: Array<Character>, startIndex: Int, divider: Character) -> (String, Int)?
{
    [...]
    var returnValue: (String, Int)? = (returnString, i)
    return returnValue
}
So far, so good. I set a breakpoint, and just before the function returns, I see that all is as it should be:
(lldb) po returnValue
(0 = "21三體綜合症", 1 = 7)
{
0 = "21三體綜合症"
1 = 7
}
That's what I expected to see: the correct string value, and the correct index.
However, when I go back to the init() function that called the helper in the first place, and put a breakpoint immediately after the call:
var returnedValue = self.nextChunk(stringArray, startIndex: stringArrayIndex, divider: " ")
I get a completely different value for returnedValue than I had for returnValue:
(lldb) po returnedValue
(0 = "I", 1 = 48)
{
0 = "I"
1 = 48
}
Now here's the really weird part. After I get the return value, I want to test to see if it's nil, and if it's not, I want to use the values I fetched to set a couple of instance variables:
if (returnedValue == nil)
{
    return
}
else
{
    self.traditionalCharacter = returnedValue!.0
    stringArrayIndex = returnedValue!.1
}
If I comment out both of the lines in the else brackets:
else
{
    // self.traditionalCharacter = returnedValue!.0
    // stringArrayIndex = returnedValue!.1
}
then my original breakpoint gives the expected value for the returned tuple:
(lldb) po returnedValue
(0 = "21三體綜合症", 1 = 7)
{
0 = "21三體綜合症"
1 = 7
}
Again: the breakpoint is set before this if/else statement, so I'm taking the value before any of this code has had the chance to execute.
After banging my head against this for a few hours, I realize that...there isn't actually a problem. If I press the "step over" button in the debugger, the execution pointer jumps back from the if() line to the call to nextChunk. Pressing it again sends it forward to "if" again, and sets the values properly.
This extra double-jump only happens when the assignment code is active, consistently and reproducibly. As I know well, since I reproduced it for hours trying to figure out what was wrong before even trying to step forward, at which point I noticed that it "fixed itself."
So my question is: why? Is this a bug in the debugger, or am I using breakpoints wrong? It happens just the same whether I put the breakpoint between the function call and the if() or on the if() line. Can someone explain why the debugger is jumping back and forth and when the value I need is actually getting set?
