Binary operator '+' cannot be applied to two CGFloat operands? - cocoa

I'm coding in Swift and getting the above error...
Is the message masking something else OR can you really not add two CGFloat operands? If not, why (on earth) not?
EDIT
There is nothing special in the code I am trying to do; what is interesting is that the above error message, VERBATIM, is what the Swift compiler tells me in Xcode (underlining my code with red squiggly lines).
Running Xcode 6.3 (Swift 1.2)

It's absolutely possible to add two CGFloat variables using the binary operator '+'. What you need to know is that the resulting variable is also a CGFloat (by type inference).
let value1 : CGFloat = 12.0
let value2 : CGFloat = 13.0
let value3 = value1 + value2
println("value3 \(value3)")
//The result is value3 25.0, and the value3 is of type CGFloat.
EDIT:
In Swift 3:
let value = CGFloat(12.0) + CGFloat(13.0)
print("value \(value)")
//The result is value 25.0, and the value is of type CGFloat.

I ran into this using this innocent-looking piece of code:
func foo(line: CTLine) {
    var ascent: CGFloat = 0
    var descent: CGFloat = 0
    var leading: CGFloat = 0
    let fWidth = Float(CTLineGetTypographicBounds(line, &ascent, &descent, &leading))
    let height = ceilf(ascent + descent)
    //                 ~~~~~~ ^ ~~~~~~~
}
And found the solution by expanding the error in the Issue Navigator.
Looking into it, I think Swift found the +(lhs: Float, rhs: Float) -> Float overload first, based on its return type (ceilf takes a Float). Obviously that overload takes Floats, not CGFloats, which sheds some light on the meaning of the error message. So, to force the compiler to pick the right operator, you have to call a function that takes a Double or a Float (or just a CGFloat, obviously). So I tried this and the error was solved:
// ...
let height = ceilf(Float(ascent + descent))
// ...
Another approach would be to use a function that takes a Double or CGFloat:
// ...
let height = ceil(ascent + descent)
// ...
So the problem is that the compiler prioritizes return types over parameter types. Glad to see this is still happening in Swift 3 ;P
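To illustrate the overload-resolution point in isolation, here is a minimal sketch (my own example, not from the original post): the same CGFloat addition compiles or fails depending only on what the surrounding call expects.
import Foundation
import CoreGraphics

let a: CGFloat = 1.5
let b: CGFloat = 2.5
let ok = ceil(a + b)              // fine: ceil has a Double/CGFloat overload, so '+' resolves to CGFloat
// let bad = ceilf(a + b)         // error: ceilf wants a Float, so '+' is resolved to (Float, Float) -> Float
let alsoOK = ceilf(Float(a + b))  // fine: convert to Float explicitly before calling ceilf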

There are mainly two possible causes of this kind of error.
First:
Comparing optional CGFloat variables, for example
if a? >= b?
{
    videoSize = size;
}
This triggers the error, so unwrap the optionals first, e.g. if a! >= b! { } (see the safer sketch after this answer).
Second:
Using bare literal values directly to compute a result, for example
var result = 1.5 / 1.2
Don't write it as above.
Use variables declared as CGFloat instead, like
var a : CGFloat = 1.5
var b : CGFloat = 1.2
var result : CGFloat!
result = a / b
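For the first case, a safer variant binds the optionals instead of force-unwrapping them (a sketch in newer Swift syntax; a, b, size, and videoSize are assumed from the example above):
if let a = a, let b = b, a >= b {
    videoSize = size
}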

Related

Heartrate with no decimal places

I'm doing a watch app for the Apple Watch in Xcode, and the sample code from the Apple Developer site (SpeedySloth: Creating a Workout) has the heart rate rounded to one decimal place, e.g. 61.0
How can I fix this?
case HKQuantityType.quantityType(forIdentifier: .heartRate):
/// - Tag: SetLabel
let heartRateUnit = HKUnit.count().unitDivided(by: HKUnit.minute())
let value = statistics.mostRecentQuantity()?.doubleValue(for: heartRateUnit)
let roundedValue = Double( round( 1 * value! ) / 1 )
label.setText("\(roundedValue) BPM")
I tried changing both the 1's in there to 0 but that gave me a 6.1 BPM or a 0.0 BPM
Thanks
A simple solution is to round to an integer and show the integer.
let value = // ... some Double ...
let s = String(Int(value.rounded(.toNearestOrAwayFromZero)))
label.setText(s + " BPM")
Properly, however, you should put formatting a string based on a number into the hands of a NumberFormatter. That's its job: to format a number.
let i = value.rounded(.toNearestOrAwayFromZero)
let nf = NumberFormatter()
nf.numberStyle = .none
let s = nf.string(from: i as NSNumber)!
// ... and now show the string
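Applied to the heart-rate label from the question, that could look something like this (a sketch; statistics, heartRateUnit, and label are assumed from the question's code):
let bpm = statistics.mostRecentQuantity()?.doubleValue(for: heartRateUnit) ?? 0
let nf = NumberFormatter()
nf.maximumFractionDigits = 0           // whole BPM only, no decimal places
label.setText((nf.string(from: bpm as NSNumber) ?? "0") + " BPM")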

HIDictionaryWindowShow usage in Swift

How do you invoke HIDictionaryWindowShow in Swift? I try this:
import Carbon
if let text = _dictionaryText, let range = _dictionaryRange
{
let font = CTFontCreateWithName("Baskerville", 16, nil);
let point = CGPoint(x: 0.0, y: 0.0);
var trns = CGAffineTransform();
HIDictionaryWindowShow(nil, text, range, font, point, false, &trns);
}
But I'm getting this error:
Cannot invoke 'HIDictionaryWindowShow' with an argument list of type
'(nil, String, CFRange, CTFont!, CGPoint, Bool, CGAffineTransform)'
I don't see the wrong argument here. The first and last arguments should be allowed to be nil, and the docs say that NULL is OK as the first argument, which would be nil in Swift, or is it? As there is no NULL in Swift, what do I need to specify instead of it?
Sorry I don't have the Swift code, but you can probably work from the following untested Objective-C code. Note that showDefinitionForAttributedString: was added to replace HIDictionaryWindowShow, which is, as you know, a Carbon function and won't be supported forever.
[self.view showDefinitionForAttributedString:[[NSAttributedString alloc] initWithString:text] atPoint:NSMakePoint(0.0, 0.0)];
EDIT:
Looking into it further, the second argument is not correct in your example. The Carbon library does not understand String; it wants a CFTypeRef.
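For reference, a rough Swift counterpart of the Objective-C call above might look like this (an untested sketch; it assumes text is a String, self.view is an NSView, and uses the Swift 3+ method name showDefinition(for:at:)):
import AppKit

let attributed = NSAttributedString(string: text)
self.view.showDefinition(for: attributed, at: NSPoint(x: 0.0, y: 0.0))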

Swift: get more detailed info on function parameter type error

I'm using Xcode 6.3.1 and Swift.
When a function with multiple parameters gets a parameter type error, it's hard to know which argument is wrong.
For example, CGBitmapContextCreate(), this code:
let colorSpace:CGColorSpace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo = CGBitmapInfo(CGImageAlphaInfo.PremultipliedLast.rawValue)
let context = CGBitmapContextCreate(nil, UInt(size.width), UInt(size.height), 8, 0, colorSpace, bitmapInfo)
will produce an error like this:
MyFile.swift:23:19: Cannot invoke 'CGBitmapContextCreate' with an argument list of type '(nil, UInt, UInt, Int, Int, CGColorSpace, CGBitmapInfo)'
By comparing the documentation and my argument list carefully, I can work out that it is the 2nd and 3rd arguments, which should be Int.
Is there any way to make the compiler more smarter on this?
The problem is probably that the online documentation you are looking at for CGBitmapContextCreate does not match the definition you are actually compiling against.
The last parameter needs to be of type UInt32, but CGBitmapInfo(...) gives you a CGBitmapInfo object. That is why the compiler is complaining: you are passing in the wrong type of parameter. You can even right-click the function and choose "See definition"; this will verify what I am saying.
Try instead passing in CGImageAlphaInfo.PremultipliedLast.rawValue directly, as it is the UInt32 that is being looked for.
Example Solution:
let colorSpace:CGColorSpace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo = CGImageAlphaInfo.PremultipliedLast.rawValue
let context = CGBitmapContextCreate(nil, UInt(size.width), UInt(size.height), 8, 0, colorSpace, bitmapInfo)
You'll find that you will be able to compile the source and get the expected result. Note that you can still apply any bitwise operations you want on this value.
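For example, combining the alpha info with a byte-order flag by bitwise OR could look something like this (a sketch using the Swift 1.2-era names from the question; both raw values are UInt32):
let bitmapInfo = CGImageAlphaInfo.PremultipliedLast.rawValue | CGBitmapInfo.ByteOrder32Big.rawValue
let context = CGBitmapContextCreate(nil, UInt(size.width), UInt(size.height), 8, 0, colorSpace, bitmapInfo)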
PS: I had the same issue you were having, and was amply frustrated when I couldn't find a solution.
It works!
let width = CGImageGetWidth(image)
let height = CGImageGetHeight(image)
let colorSpace = CGColorSpaceCreateDeviceRGB()
let bytesPerRow = 4 * width
let bitsPerComponent: Int = 8
let pixels = UnsafeMutablePointer<UInt8>(malloc(width*height*4))
var context = CGBitmapContextCreate(pixels, width, height, bitsPerComponent, bytesPerRow, colorSpace, CGBitmapInfo())

‘CGFloat’ is not convertible to ‘UInt8’ and other CGFloat issues with Swift and Xcode 6 beta 4

In case this illuminates the problem, here's the original Objective-C code.
int x = (arc4random()%(int)(self.gameView.bounds.size.width*5)) - (int)self.gameView.bounds.size.width*2;
int y = self.gameView.bounds.size.height;
drop.center = CGPointMake(x, -y);
I started out with this code. Lines 2 and 3 are fine, I'm presenting them for clarity later.
let x = CGFloat(arc4random_uniform(UInt32(self.gameView.bounds.size.width * 5))) - self.gameView.bounds.size.width * 2
let y = self.gameView.bounds.size.height
dropView.center = CGPointMake(x, -y)
In Xcode 6 beta 3, it was necessary to cast the arc4random_uniform UInt32 result to CGFloat in order for the minus and multiplication to work. This doesn't work anymore and the compiler shows an error:
‘CGFloat’ is not convertible to ‘UInt8’
The release notes state:
"CGFloat is now a distinct floating-point type that wraps either a Float on 32-bit architectures or a Double on 64-bit architectures. It provide all of the same comparison and arithmetic operations of Float and Double and may be created using numeric literals. Using CGFloat insulates your code from situations where your code would be
!fine for 32-bit but fail when building for 64-bit or vice versa. (17224725)"
Am I just doing something wrong with types? I don't even know how to describe this problem better to submit a bug report to Apple for beta 4. Pretty much every single Swift project I have that does any kind of point or rect manipulation got hit by this issue, so I'm looking for some sanity.
Since Swift doesn't have implicit type conversions, you must specify all the type conversions that take place. What makes this case particularly tedious is that currently Swift seems to lack direct conversions between CGFloat and types such as UInt32, so you must go through an intermediate type as you've discovered.
In the end, two double conversions are needed for the arc4random_uniform:
let bounds = CGRectMake(0.0, 0.0, 500.0, 500.0)
var x = CGFloat(UInt(arc4random_uniform(UInt32(UInt(bounds.size.width) * 5))))
x -= bounds.size.width * 2
let center = CGPointMake(x, -bounds.size.height)
Had the same problem ... try wrapping the arc4random_uniform call with Int(), like
Int(arc4random_uniform(...))
This worked for me ... I don't know why Swift/Xcode has problems converting unsigned Ints.
TL;DR simple shortcuts cause HCF: Halt and Catch Fire bugs
Note that there are some obvious, legal workarounds, like implementing conversions to and from CGFloat:
Totally legal, but don't do this:
extension Float {
func __conversion() -> CGFloat { return CGFloat(self) }
}
extension CGFloat {
func __conversion() -> Float { return Float(self) }
func __conversion() -> Double { return Double(self) }
}
extension Double {
func __conversion() -> CGFloat { return CGFloat(self) }
}
I did not notice it when typing, but later my machine kept overheating and hanging: SourceKit went to 300-500%, and the swift process + kernel_task took up 10+ gigs of RAM, consuming all that was left of my 16 gigs. It took a long time to trace it back to this - it wasn't swift.
@Arkku provided the correct solution, so the one-liner for x is...
let x = CGFloat(UInt(arc4random_uniform(UInt32(UInt(self.gameView.bounds.size.width) * 5)))) - self.gameView.bounds.size.width * 2
As of Xcode 6 beta 5, you can still use an intermediate conversion if you want and your code will continue to work. However, it is no longer necessary, so the following now works as expected.
let x = CGFloat(arc4random_uniform(UInt32(self.gameView.bounds.size.width * 5))) - self.gameView.bounds.size.width * 2
Since the original question is only relevant to Xcode 6 beta 4, what is the proper way to handle the question? Is there a historical mark? Should it be deleted?

Retain a random number across different functions in Cocoa?

I know how to do a global variable, but whenever I try to define a global variable with a random number function, Xcode says "initializer element is not constant." The compiler doesn't want to make a variable from a random number because the random number function is not constant.
How do I generate a random number and then use that same value for more than one action? (For example, to define a color and then write that value to a label?)
Code:
#import "Slider_with_IBAppDelegate.h"
float * const hue = ((arc4random() % ((unsigned)100 + 1))/100.0);
//^this is where I get the error: "initializer element is not constant"
@synthesize label;
//write value to label
- (IBAction) doButton {
label.text = [NSString stringWithFormat:@"%f", hue];
}
//set background color
- (void)applicationDidBecomeActive:(UIApplication*)application
{
self.label5.backgroundColor = [UIColor colorWithHue:hue
saturation:1.0
brightness:1.0
alpha:1.0];
}
EDIT
Thanks for the suggestions. It still doesn't work for me, though; what am I doing wrong?
New code:
#import "Slider_with_IBAppDelegate.h"
float const hue = ((arc4random() % ((unsigned)100 + 1))/100.0);
//^I still get the error: "initializer element is not constant."
@synthesize label;
//write value to label
- (IBAction) doButton {
label.text = [NSString stringWithFormat:@"%f", hue];
}
//^this is where I get the error "'hue' undeclared (first use of this function)"
//set background color
- (void)applicationDidBecomeActive:(UIApplication*)application
{
hue = ((arc4random() % ((unsigned)1000 + 1))/1000.0);
/*here I get the error "assignment of read-only variable 'hue.'"
If I insert "float" just before hue, I do not get this error,
but it still won't compile because of the error above.*/
self.label5.backgroundColor = [UIColor colorWithHue:hue
saturation:1.0
brightness:1.0
alpha:1.0];
}
Make it non-const and initialize it in applicationDidBecomeActive. Is there a reason it must be constant?
I know how to do a global variable, but whenever I try to define a global variable with a random number function, xcode says "incompatible types in initialization."
float * const hue = ((arc4random() % ((unsigned)100 + 1))/100.0);
That's not a function; it's an expression. I'd be surprised if you're not also getting an error here, because you can't initialize a global variable with an expression that isn't constant. As alltom.com says, you need to assign to it from applicationDidBecomeActive:.
The warning is because you've given the variable a pointer type (float *), but you're not assigning a pointer to it. Cut out the asterisk, because you're not going to put a pointer in this variable.
Xcode doesn't want to make a variable from a random number because the random number function is not constant.
Xcode doesn't care one way or the other. It's just reporting the findings of the compiler. By default, the compiler for Objective-C is GCC, but Xcode supports other compilers (and Xcode does come with one other C/Objective-C compiler: LLVM-GCC).
… I couldn't call the same value for the label.
You're not showing a label here, and you can't call a value. You can only call a function, and you don't have one in the code shown.
It gave me the error "function undefined: first use of this function" in doButton even though it was defined in applicationDidBecomeActive.
No, it wasn't. Assigning to a variable does not create a function.
In case anyone is wondering, I finally found a way to do this effectively. (I am sure this is what alltom was saying, I was just too dumb to understand.)
I declared a generator method (returning a float) and a seed method in my .h file:
- (float)generate:(id)sender;
- (void)seed;
And in the implementation file, I defined the generator method to return a random number, and used srandom() as a random seed generator.
- (float)generate:(id)sender
{
//Generate a number between 1 and 100 inclusive
int generated;
generated = (random() % 100) + 1;
return(generated);
}
- (void)seed {
srandom(time(NULL));
}
Then anywhere I wanted to retain a random number, I used
srandom(time(NULL));
generated1 = ((random() % 100) + 1)/100.0;
to initialize the number, and from there I was able to use generated1, generated2, hue, etc. as variables in any function I wanted (and I made sure to declare these variables as floats at the top of the file).
