Swift 2 - kCTForegroundColorAttributeName

let nameAttributes = [NSFontAttributeName:nameFont, kCTForegroundColorAttributeName:UIColor.whiteColor().CGColor] as [String:AnyObject]
var nameAttributedString = NSMutableAttributedString(string:name, attributes:nameAttributes)
I have these values, which were working in Swift 1.2. But in Swift 2, they don't work.
I receive an error on the first line:
'_' is not convertible to 'CFString'
The problem is kCTForegroundColorAttributeName. Without kCTForegroundColorAttributeName, it works. But I need it to change the color.
In addition:
kCTForegroundColorAttributeName:UIColor.whiteColor().colorWithAlphaComponent(0.7).CGColor
gives me an error:
'_' is not convertible to 'CGFloat'
In Swift 2, the attributes on NSMutableAttributedString have to be [String:AnyObject], while in Swift 1.2 they were [NSObject:AnyObject].
Any ideas?

Why not just use NSForegroundColorAttributeName?

Use this code:
let attrs2 = [NSFontAttributeName: FontWithBook(10), NSForegroundColorAttributeName: UIColor.grayColor()]
var gString = NSMutableAttributedString(string:"(Mandatory)", attributes:attrs2)
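If you do need the Core Text key itself (for example, when drawing with CTFrame), a minimal sketch, assuming nameFont and name are defined as in the question, is to bridge the CFString constant to String so the dictionary stays [String:AnyObject]:
// Sketch: bridge kCTForegroundColorAttributeName (a CFString) to String for the key
let nameAttributes: [String: AnyObject] = [
    NSFontAttributeName: nameFont,
    kCTForegroundColorAttributeName as String: UIColor.whiteColor().CGColor
]
let nameAttributedString = NSMutableAttributedString(string: name, attributes: nameAttributes)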

Xcode - Swift 4 Encoding error

I have this code:
let data = "NgAzADYANQA1ADEANwA0ADgANQA1ADQANgA4ADgAMAA0ADcALwAvAGIAYQAwAGQAZABlAGQANAAtAGYANAAzAGUALQA0ADAANABkAC0AYQAzAGYAYgAtADQAZQA2ADIAZQBhADkAMgBiADMAYgBiAA=="
let dataDecoded = Data(base64Encoded: data, options: NSData.Base64DecodingOptions.ignoreUnknownCharacters)!
let decodedString = String(decoding: dataDecoded, as: UTF8.self)
print(decodedString)
In my output window, I get this result:
636551748554688047//ba0dded4-f43e-404d-a3fb-4e62ea92b3bb
But in my variable, I have:
6\03\06\05\05\01\07\04\08\05\05\04\06\08\08\00\04\07\0/\0/\0b\0a\00\0d\0d\0e\0d\04\0-\0f\04\03\0e\0-\04\00\04\0d\0-\0a\03\0f\0b\0-\04\0e\06\02\0e\0a\09\02\0b\03\0b\0b\0
Please help me :)
It seems your input data is not in UTF-8; try this:
let decodedString = String(data: dataDecoded, encoding: .utf16LittleEndian)
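For completeness, a minimal sketch of the full round trip, assuming data holds the base64 string from the question. The decoded bytes are UTF-16 little-endian (every other byte is 0x00), which is why forcing them through UTF-8 shows the interleaved \0 padding:
import Foundation

// data: the base64 string from the question
if let dataDecoded = Data(base64Encoded: data, options: .ignoreUnknownCharacters),
   let decodedString = String(data: dataDecoded, encoding: .utf16LittleEndian) {
    print(decodedString) // 636551748554688047//ba0dded4-f43e-404d-a3fb-4e62ea92b3bb
}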

Numeral.JS zeroFormat includes $ and % symbol at end result

Why does zero formatting include $ and % symbols in the formatted result?
The numeral.js version is 1.5.3.
var number = numeral(0);
numeral.zeroFormat('N/A');
var zero = number.format('0.0%')
// 'N/A%'
var zero = number.format('$0.0')
// '$N/A'
// What I expect is 'N/A'
Is it a bug or am I missing something?
Problem reproduction: https://jsfiddle.net/wbuu53qr/
I quickly found the solution: this problem happens in the older version. Just move to the latest version:
var number = numeral(0);
numeral.zeroFormat('N/A');
var zero = number.format('0.0%')
// 'N/A'
https://jsfiddle.net/4jz4vp5h/

Swift 2: expression is ambiguous?

The following code used to work before I migrated to Swift 2; now I can't seem to work around it:
let cal = NSCalendar(calendarIdentifier: GregorianCalendar)
let components = cal!.components(.CalendarUnitDay | .CalendarUnitMonth | .CalendarUnitYear, fromDate: date) // error at this line
let newDate = cal!.dateFromComponents(components)
I'm getting the following error message:
Type of expression is ambiguous without more context
In Swift 2 there is a change in the declaration of NSCalendarUnit as well as in the handling of OptionSetType:
let cal = NSCalendar(calendarIdentifier: NSCalendarIdentifierGregorian)
let components = cal!.components([.Day, .Month, .Year], fromDate: date)
Please, folks, read the Swift Blog and the current Swift Language Guide.
I was able to fix the error with a forced cast:
let components = calendar.components([.Hour, .Minute, .Second], fromDate: date as! NSDate)
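Putting the pieces together, a minimal sketch of the migrated code in Swift 2, avoiding the force unwraps (assumption: date is an existing NSDate):
// Sketch: NSCalendarUnit is an OptionSetType in Swift 2, so pass an array of members
if let cal = NSCalendar(calendarIdentifier: NSCalendarIdentifierGregorian) {
    let components = cal.components([.Day, .Month, .Year], fromDate: date)
    let newDate = cal.dateFromComponents(components)
    print(newDate)
}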

Initializing NSOpenGLPixelFormat in Swift

Apparently I'm the only one to attempt this, as none of my Google searches turned up anything helpful. Assume I'm initializing an attribute array like this:
let glPFAttributes = [
    NSOpenGLPFAAccelerated,
    NSOpenGLPFADoubleBuffer,
    NSOpenGLPFAColorSize, 48,
    NSOpenGLPFAAlphaSize, 16,
    NSOpenGLPFAMultisample,
    NSOpenGLPFASampleBuffers, 1,
    NSOpenGLPFASamples, 4,
    NSOpenGLPFAMinimumPolicy,
    0
]
These things are all regular Ints, I've checked. Now if I do
let glPixelFormat = NSOpenGLPixelFormat(attributes: glPFAttributes)
the compiler gives me this error message:
'Int' is not identical to 'NSOpenGLPixelFormatAttribute'
If I made a mistake somewhere, I'm not seeing it.
NSOpenGLPixelFormatAttribute is a type alias for UInt32. The NSOpenGLPixelFormat initializer takes an array of NSOpenGLPixelFormatAttribute, so you need to build the array with every Int converted to UInt32. The code below will work:
let glPFAttributes: [NSOpenGLPixelFormatAttribute] = [
    UInt32(NSOpenGLPFAAccelerated),
    UInt32(NSOpenGLPFADoubleBuffer),
    UInt32(NSOpenGLPFAColorSize), UInt32(48),
    UInt32(NSOpenGLPFAAlphaSize), UInt32(16),
    UInt32(NSOpenGLPFAMultisample),
    UInt32(NSOpenGLPFASampleBuffers), UInt32(1),
    UInt32(NSOpenGLPFASamples), UInt32(4),
    UInt32(NSOpenGLPFAMinimumPolicy),
    UInt32(0)
]
let glPixelFormat = NSOpenGLPixelFormat(attributes: glPFAttributes)
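A slightly more compact sketch of the same idea maps the Int constants to NSOpenGLPixelFormatAttribute in one pass instead of wrapping each element:
// Sketch: convert the whole Int array to UInt32 (NSOpenGLPixelFormatAttribute) with map
let intAttributes = [
    NSOpenGLPFAAccelerated,
    NSOpenGLPFADoubleBuffer,
    NSOpenGLPFAColorSize, 48,
    NSOpenGLPFAAlphaSize, 16,
    NSOpenGLPFAMultisample,
    NSOpenGLPFASampleBuffers, 1,
    NSOpenGLPFASamples, 4,
    NSOpenGLPFAMinimumPolicy,
    0
]
let glPFAttributes = intAttributes.map { NSOpenGLPixelFormatAttribute($0) }
let glPixelFormat = NSOpenGLPixelFormat(attributes: glPFAttributes)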

Compiler errors in Xcode 6 final release

I was working on a game in Swift SpriteKit using the beta Xcode. But now, with the final release, I have a lot of errors, all of which I have been able to fix except this one.
This was my original code, with no errors, using the beta Xcode:
bird.zRotation = self.clamp(-1, max: 0.5, value: bird.physicsBody.velocity.dy * (bird.physicsBody?.velocity.dy < 0 ?0.003 : 0.001 ))
But the Xcode final release indicates a compiler error on physicsBody stating: 'SKPhysicsBody?' does not have a member named 'velocity'.
I fixed this by adding '?', the optional chaining operator, to physicsBody:
bird.zRotation = self.clamp(-1, max: 0.5, value: bird.physicsBody?.velocity.dy * (bird.physicsBody?.velocity.dy < 0 ?0.003 : 0.001 ))
But still a new error appears, this time on dy, stating that the CGFloat is not unwrapped. I tried using '!' or '?' after dy, but then the compiler suggests deleting it, stating: Postfix '?' should have optional type; type is 'CGFloat'.
I have tried to look for information on what exactly is going on, but I can't fix this error. Please help.
It's possible that Xcode is getting confused trying to perform arithmetic on potentially nil values. I'd try moving the optionals out of the self.clamp call. Also, watch out for the spacing around the ternary check: it might be trying to unwrap the value adjacent to the ? operator.
Try
if let dy = bird.physicsBody?.velocity.dy {
    bird.zRotation = self.clamp(-1, max: 0.5, value: dy * ((dy < 0) ? 0.003 : 0.001))
}
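An alternative sketch keeps the original one-liner by coalescing the optional first (assumption: treating a missing physicsBody as zero velocity is acceptable):
// Sketch: unwrap once with nil-coalescing, then reuse the plain CGFloat
let dy = bird.physicsBody?.velocity.dy ?? 0
bird.zRotation = self.clamp(-1, max: 0.5, value: dy * (dy < 0 ? 0.003 : 0.001))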
