I'm having a strange issue with my code where it seems as if the compiler is implicitly converting my argument to another type. However, when I tagged the constructor as explicit, that didn't fix the issue.
I have this in my unit test
JsonValue stringItem("test");
CHECK(stringItem.type() == JsonValue::Type::String);
This fails with the result
4 == 3
These are what the constructors look like...
JsonValue::JsonValue()
    : mType(Type::Null) {
}

JsonValue::JsonValue(bool ab)
    : mType(Type::Boolean) {
    mData.b = ab;
}

JsonValue::JsonValue(int ai)
    : mType(Type::Int) {
    mData.i = ai;
}

JsonValue::JsonValue(std::uint32_t aui)
    : mType(Type::UnsignedInt) {
    mData.ui = aui;
}

// It should be using this constructor,
// but mType is not getting set to Type::String
JsonValue::JsonValue(const std::string &astr)
    : mType(Type::String) {
    mData.str = new std::string(astr);
}
As I mentioned before, tagging JsonValue(bool) as explicit didn't fix the issue. I also compiled with -Wconversion, which produced no warnings.
The enum looks like this:
enum Type {
    Null = 0,
    Object,
    Array,
    String,
    Boolean,
    Int,
    UnsignedInt
};
You need to be explicit about the argument to the constructor:
JsonValue stringItem(std::string("test"));
What is happening is that you are getting an implicit conversion from const char* to bool. That is a standard conversion between built-in types, so overload resolution prefers it to the conversion from const char* to std::string, which is a user-defined conversion. Marking JsonValue(bool) as explicit doesn't help, because JsonValue stringItem("test") is direct-initialization, and explicit constructors still participate in direct-initialization.
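Here is a minimal, standalone sketch of the same pitfall; the overloads of f are hypothetical, not part of the question's code:
#include <iostream>
#include <string>

// const char* -> bool is a standard conversion, while const char* -> std::string
// is a user-defined conversion, so the bool overload wins for a string literal.
void f(bool) { std::cout << "bool\n"; }
void f(const std::string &) { std::cout << "string\n"; }

int main() {
    f("test");              // prints "bool"
    f(std::string("test")); // prints "string"
}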
Alternatively, you can add a constructor that takes const char* and stores a string internally. This is a better option because it avoids the easy-to-make error you encountered:
JsonValue::JsonValue(const char *astr)
    : mType(Type::String) {
    mData.str = new std::string(astr);
}
Note that on the surface there seems to be no reason to store a dynamically allocated string. This probably adds unnecessary complication.
Given the following enum defined in an external API:
public enum Status {
    COMPLETE,
    RUNNING,
    WAITING
}
I would like a way to add an int flag to each enum value. I know that I can extend the enum:
fun Status.flag(): Int {
    return when (this) {
        Status.RUNNING -> 1
        Status.WAITING -> 2
        else -> 0
    }
}
However I would like to define those int flag values as constants. Maybe a companion object, but I don't think I can extend an existing enum and add a companion object.
Any ideas?
Unless you are using a field that already exists in the original enum (like ordinal), you won't be able to do what you're asking without wrapping the external enum in your own enum.
Sure, you could use ordinal, but a newer version of the external API may change the order of the items in the enum, so I wouldn't recommend it. But if you REALLY want to, you could do something like this (again, this is NOT recommended):
val Status.flag: Int
    get() = this.ordinal
But I'd definitely recommend wrapping it. That way you guarantee that the flag integers you define won't change.
enum class MyStatus(val status: Status, val flag: Int) {
    COMPLETE(Status.COMPLETE, 0),
    RUNNING(Status.RUNNING, 1),
    WAITING(Status.WAITING, 2);

    companion object {
        private val STATUS_TO_MYSTATUS = values().associateBy { it.status }

        fun fromStatus(status: Status): MyStatus {
            return STATUS_TO_MYSTATUS[status] ?: throw Exception("No MyStatus found for status ${status.name}")
        }
    }
}
You can then convert Status to MyStatus by using MyStatus.fromStatus(...). Or you can add an extension function to Status to easily convert to MyStatus.
fun Status.toMyStatus() = MyStatus.fromStatus(this)
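For example (a quick sketch, assuming the declarations above):
val flag = Status.RUNNING.toMyStatus().flag // 1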
You can add extension properties/methods to the companion object of an enum/class/etc. if one exists (note that extension properties cannot have initializers, so they need explicit getters):
val Status.Companion.COMPLETE_INT: Int get() = 0
val Status.Companion.RUNNING_INT: Int get() = 1
but indeed you can't currently "create" the companion object if it doesn't exist. So just put the constants into your own non-companion object:
object StatusFlags {
    const val COMPLETE_INT = 0
    const val RUNNING_INT = 1
    const val WAITING_INT = 2
}

fun Status.flag(): Int {
    return when (this) {
        Status.RUNNING -> StatusFlags.RUNNING_INT
        ...
    }
}
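Usage is then (a sketch, assuming the when above is completed for all cases):
val f = Status.RUNNING.flag() // StatusFlags.RUNNING_INT, i.e. 1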
I've got multiple enums with raw values, but I don't like having to say rawValue: every time I initialize one from a raw value, so I've supplied an alternative delegating initializer with no external label:
enum E1: Int {
    case One, Two
    init?(_ what: Int) {
        self.init(rawValue: what)
    }
}

enum E2: Int {
    case One, Two
    init?(_ what: Int) {
        self.init(rawValue: what)
    }
}
Very nice. I can say let e = E1(0) and the right thing happens.
Now I'd like to consolidate the repeated code. I was hoping that Swift 2.0 protocol extensions would allow me to do this - writing the init?(_ what:Int) initializer in one place and injecting / inheriting it in both enums. However, I haven't found a way that works. The problem is that the protocol extension doesn't know that the adopter will have an init(rawValue:) initializer, and I have not found a way to reassure it.
I suspect that this is because of the automagic way that the rawValue initializer comes into existence, and so probably nothing can be done. But perhaps someone has a suggestion.
Sounds like you're looking to extend the RawRepresentable protocol:
extension RawRepresentable {
    init?(_ what: RawValue) {
        self.init(rawValue: what)
    }
}
Any enum with a raw type automatically conforms to RawRepresentable, so you don't have to make E1 or E2 conform to any extra protocols:
enum E1: Int {
    case One = 1, Two
}

enum E2: String {
    case One = "1", Two = "2"
}
let e1 = E1(1) // .One
let e2 = E2("2") // .Two
I was able to get it to work like this:
protocol P {
    var rawValue: Int { get }
    init?(rawValue: Int)
}

extension P {
    init?(_ what: Int) {
        self.init(rawValue: what)
    }
}

enum E1: Int, P {
    case One, Two
}

enum E2: Int, P {
    case One, Two
}
We use the regular protocol declaration to reassure the protocol extension that our adopter will have an init?(rawValue:).
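As a quick check, usage looks the same as before (a sketch; the raw values here default to 0 and 1):
let e = E1(0) // Optional(E1.One)
let f = E2(5) // nil, since no case has rawValue 5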
[However, there is still something very odd going on. This code compiles in one of my projects, but not in another. I think it has something to do with how other code uses the P protocol; if no other code uses the P protocol, all is well, but if other code uses the P protocol, in ways that I have not precisely determined, the code fails to compile, with a mysterious error about the enum not conforming to the protocol.]
Not sure what could be wrong; it would be nice to have an isolated case. Maybe a more generic version helps?
protocol BaseRaw {
    typealias T
    var rawValue: T { get }
    init?(rawValue: T)
}

extension BaseRaw {
    init?(_ value: T) {
        self.init(rawValue: value)
    }
}
enum E1: Int, BaseRaw {
    case One = 1
    case Two = 2
}

enum E2: String, BaseRaw {
    case One = "1"
    case Two = "2"
}
let e = E1(1)
let e2 = E2("2")
print(e) // Optional(E1.One)
print(e!.rawValue) // 1
print(e2) // Optional(E2.Two)
print(e2!.rawValue) // "2"
In Swift you can define an enum and give it a property via an associated value, e.g.:
protocol SizeEnum {
    var length: Double? { get } // Length should be >= 0 - has to be an Optional for errors
}
enum SizesEnum: SizeEnum {
    case Short(length: Double) // 0 <= length <= maxShort
    case Long(length: Double)  // length > maxShort

    private static let maxShort = 1.0

    var length: Double? {
        get {
            switch self {
            case let .Short(length):
                if length >= 0 && length <= SizesEnum.maxShort { // Need to error check every access
                    return length
                }
            case let .Long(length):
                if length > SizesEnum.maxShort { // Need to error check every access
                    return length
                }
            }
            return nil // There was an error
        }
    }
}
SizesEnum.Short(length: 0.5).length // [Some 0.5]
SizesEnum.Short(length: 2).length // nil
SizesEnum.Long(length: 2).length // [Some 2.0]
SizesEnum.Long(length: -1).length // nil
However, this is not ideal because:
- The error checking for the length parameter can only be done on access; you cannot intercept the init
- The length parameter is surprisingly long-winded
An alternative, which seems better to me, is to use a static factory, e.g.:
protocol SizeStruct {
    var length: Double { get } // Length should be >= 0 - is *not* an Optional
}
struct SizesStruct: SizeStruct {
    static func Short(length: Double) -> SizeStruct? {
        if length >= 0 && length <= maxShort { // Check at creation only
            return SizesStruct(length)
        }
        return nil
    }

    static func Long(length: Double) -> SizeStruct? {
        if length > maxShort { // Check at creation only
            return SizesStruct(length)
        }
        return nil
    }

    let length: Double

    private static let maxShort = 1.0

    private init(_ length: Double) {
        self.length = length
    }
}
SizesStruct.Short(0.5)?.length // [Some 0.5]
SizesStruct.Short(2)?.length // nil
SizesStruct.Long(2)?.length // [Some 2.0]
SizesStruct.Long(-1)?.length // nil
Given that the static factory solution is neater, when would I actually use an enum with values? Am I missing something? Is there a killer use case?
In response to drewag
For Optional, other languages, e.g. Java and Scala, use factories; the Java version is described here: http://docs.oracle.com/javase/8/docs/api/java/util/Optional.html (the factory is the of method).
In Swift you would do something like:
class Opt { // Note only Some stores the value, not None
    //class let None = Opt() - class variables not supported in beta 4!

    class Some<T>: Opt {
        let value: T
        init(_ value: T) {
            self.value = value
        }
    }

    private init() {} // Stop any other ways of making an Opt
}
Opt.Some(1).value // 1
This is probably the optimal example for an enum, since no error checking is required, but even so the factory version is competitive. The Optional example is so straightforward that you don't even need a factory; you just create the Somes directly. Note how None doesn't use any storage.
The Barcode example shows how much better the factory technique is: in practice not all collections of 4 Ints are a valid UPCA and not all Strings are a valid QR code, so you need error checking, which is painful with enums. Here is the factory version:
class Barcode { // Note separate storage for each case
    class UPCABarcode: Barcode {
        let type: Int, l: Int, r: Int, check: Int
        private init(type: Int, l: Int, r: Int, check: Int) {
            (self.type, self.l, self.r, self.check) = (type, l, r, check)
        }
    }

    class func UPCA(#type: Int, l: Int, r: Int, check: Int) -> UPCABarcode? {
        if ok(type: type, l: l, r: r, check: check) {
            return UPCABarcode(type: type, l: l, r: r, check: check)
        }
        return nil
    }

    class func QRCode(#s: String) -> Barcode? { // Have not expanded this case; use same pattern as UPCA
        return Barcode()
    }

    private init() {} // Prevent any other types of Barcode

    class func ok(#type: Int, l: Int, r: Int, check: Int) -> Bool {
        return true // In practice has to check supported type, range of L and R, and if check digit is correct
    }
}
Barcode.UPCA(type: 0, l: 1, r: 2, check: 3)
If you use the enum version of Barcode then every time you use a Barcode you have to check its validity because there is nothing to stop invalid barcodes. Whereas the factory version does the checking at creation. Note how Barcode has no storage and UPCA has custom storage. I didn't code QRCode because it uses the same design pattern as UPCA.
My impression is that the enum version looks great in tutorials but soon becomes painful in practice because of error handling.
I believe the biggest killer use case is Optional. It is built into the language, but an optional is simply an enum:
enum Optional<T> {
    case None
    case Some(T)
}
In this case, a member variable like value would not make sense because in the None case, there is literally no value.
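To make that concrete, here is a small sketch that matches on the enum as defined above (Swift 1-era .Some/.None spelling):
let x: Optional<Int> = .Some(42)
switch x {
case .None:
    println("no value")    // nothing is stored in this case
case .Some(let v):
    println("value: \(v)") // the associated value exists only when present
}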
Also, right out of the Swift tour:
enum Barcode {
    case UPCA(Int, Int, Int, Int)
    case QRCode(String)
}
With a struct, there would be a lot of wasted member variables and a confusing interface to model this kind of data.
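For contrast, consuming the enum version is direct: pattern matching binds exactly the data each case carries (the values below are made up):
let code = Barcode.UPCA(8, 85909, 51226, 3)
switch code {
case let .UPCA(numberSystem, manufacturer, product, check):
    println("UPC-A: \(numberSystem) \(manufacturer) \(product) \(check)")
case let .QRCode(payload):
    println("QR code: \(payload)")
}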
I don't believe it makes sense to use an associated value for an enum if the same type is used for every case. In that case a member variable is cleaner. The focus of associated values is to more accurately model certain types of data. The most useful cases are ones where different instances of a type can have different data associated with them. This could potentially be done with subclasses but then one would need to downcast to access more specific variables and the declarations would be much more verbose. Enums are a concise way to represent this type of data.
Another example could be web requests:
struct NetRequest {
    enum Method {
        case GET
        case POST(String)
    }

    var URL: String
    var method: Method
}
var getRequest = NetRequest(URL: "http://drewag.me", method: .GET)
var postRequest = NetRequest(URL: "http://drewag.me", method: .POST("{\"username\": \"drewag\"}"))
When I think of "enum" I don't think of "factory" at all. Normally factories are for larger more complex class structures. Enums are supposed to be very small pieces of data with little logic.
I recently updated to Xcode Beta 4. I'm working with a C API, and there is a typedef enum like this:
typedef enum {
    XML_ELEMENT_NODE = 1,
    XML_ATTRIBUTE_NODE = 2,
    ...
} xmlElementType;
Now I have an XML node whose type I want to check. The right way would therefore be:
if currentNode.memory.type != XML_ELEMENT_NODE {
In Beta 3 I had to replace XML_ELEMENT_NODE with 1. Now this does not work anymore. In both cases I get the error: xmlElementType is not convertible to UInt8.
The simplest workaround is to find the header and replace the typedef enum with a typedef NS_ENUM(...). The problem with this solution is that everyone on your team has to make the same change.
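The edit would look roughly like this (a sketch; UInt32 as the underlying type is my assumption, chosen to match what Swift imports, and the remaining cases are elided as in the original):
typedef NS_ENUM(UInt32, xmlElementType) {
    XML_ELEMENT_NODE = 1,
    XML_ATTRIBUTE_NODE = 2,
    ...
};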
The problem is caused by the fact that the C enum is imported into Swift as an opaque type (a struct?), C.xmlElementType. This type has a single property, value, of type UInt32. Unfortunately, this property is not public: you can access it from the debugger, but using it in compiled code results in an error.
I managed to do a workaround using reflect but it's a big hack:
extension xmlElementType: Equatable {}

public func ==(lhs: xmlElementType, rhs: xmlElementType) -> Bool {
    let intValue1 = reflect(lhs)[0].1.value as UInt32
    let intValue2 = reflect(rhs)[0].1.value as UInt32
    return intValue1 == intValue2
}

var elementType = currentNode.memory.type
if elementType == xmlElementType(1) {
    println("Test")
}
I think this is a bug. Either the equality should be defined, or there should be some way to cast the struct to an integer.
EDIT:
Another option is to add an inline conversion function to your bridging header:
static inline UInt32 xmlElementTypeToInt(xmlElementType type) {
    return (UInt32) type;
}
And then define equality as
public func ==(lhs: xmlElementType, rhs: xmlElementType) -> Bool {
    return xmlElementTypeToInt(lhs) == xmlElementTypeToInt(rhs)
}
However, the simplest option I have found is to brutally cast the struct to a UInt32:
public func ==(lhs: xmlElementType, rhs: xmlElementType) -> Bool {
    let leftValue: UInt32 = reinterpretCast(lhs)
    let rightValue: UInt32 = reinterpretCast(rhs)
    return leftValue == rightValue
}
Note this is less reliable, because you have to make sure that the struct is actually 32 bits wide and not a UInt8, for example. The C conversion function is more stable.
Consider I have the following classes:
/// File classes.d
class C {
    string type() { return "C"; }
}

class C1 : C {
    override string type() { return "C1"; }
}

class C2 : C {
    override string type() { return "C2"; }
}
Now I want to implement a factory somewhere else, like:
/// File factory.d
module factory;

import std.functional;
import std.stdio;

void main() {
    mixin(import("classes.d"));
    auto c = cast(typeof(mixin("C1"))) Object.factory("C1");
    writeln(c.type());
}
The compiler (dmd 2.058) tells me:
factory.d(7): Error argument C1 to typeof is not an expression
I know the following line compiles well:
auto c = cast(C1) Object.factory("classes.C1");
but this requires me to know the type (C1) at compile time. I want to determine the type at runtime (e.g. from a string).
mixin(`auto c = cast(typeof(C1)) Object.factory("C1");`)
I don't understand your question, I think: you want to cast dynamically at runtime to a given type? This is not possible in a statically typed language! What you can do is pass your classes (or whatever) around as void* and use a switch … case to cast to the needed type (you can also use std.variant, which is maybe the better way, but I've never used it), and then call different (templated) functions:
switch (which_type) { // plain switch: final switch requires an enum type
    case "C1": foo(cast(C1) bar); break;
    case "C2": foo(cast(C2) bar); break;
    default: assert(0);
}

void foo(T)(T a) {
}
You can also generate the case statements at compile time:
switch (which_type) { // needs: import std.typetuple;
    foreach (i; TypeTuple!("C1", "C2")) {
        case i: foo(mixin(`cast(` ~ i ~ `)bar`)); break;
    }
    default: assert(0);
}
If you just want the type of bar as a string, typeof(bar).stringof will give it to you (but this is already known at compile time).
Mixins are a compile time artifact and can only be used at compile time. typeof can only be used at compile time. D is a statically typed language, and all types must be known at compile time.
You can have a reference of a base class type which refers to a derived class instance, so something like this is possible:
import std.string : format;

C makeC(string type)
{
    switch (type)
    {
        case "C": return new C;
        case "C1": return new C1;
        case "C2": return new C2;
        default: throw new Exception(format("%s is not a C", type));
    }
}

C c = makeC("C1");
There's also std.variant which can be used to hold different types (it uses a union internally).
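A small sketch of that route (assuming the C1 class from the question; peek returns a non-null pointer only if the Variant currently holds that exact type):
import std.variant;
import std.stdio;

void useVariant()
{
    Variant v = new C1();       // store a class reference behind a runtime-typed Variant
    if (auto p = v.peek!C1()) { // C1* pointing at the stored reference, or null
        writeln(p.type());      // prints "C1"
    }
}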
But you can't decide what the type of a variable is going to be at runtime, because D is statically typed.