IntelliSense with Union Types - visual-studio

I find that IntelliSense is missing when assigning to a var whose type is a union type. This makes sense: the compiler doesn't know which of the union's members you are assigning (although at some point it could deduce this once it has enough information, but it does not do that either...).
Fine, so I can be explicit and cast the assignment to the type I intend, and IntelliSense returns. But this leads to a second problem: for some reason, TypeScript will allow the cast of an empty object literal to any interface, but as soon as a single property is added, the object literal must satisfy the entire interface.
I have two direct questions about this behavior; they are in the comments in the following code example. I realize I could declare the test vars with more specific types, but that is not the point of this topic. Thanks for your help.
interface ITestOne {
    a: string;
    b?: string;
}
interface ITestTwo {
    c: string;
}
type EitherType = ITestOne | ITestTwo;
var test1: EitherType = {}; // ERROR, no intellisense to help fill out the required properties in the object literal
var test2: EitherType = {} as ITestOne; // ALLOWED - Why is this allowed?
var test3: EitherType = { b: 'blah' } as ITestOne; // ERROR: property a is missing. Why ISN'T this allowed if the line above is allowed?
UPDATE 2017-01-31
A reply from a bug report I opened on the TypeScript project about this topic:
What a type assertion does is tell the compiler to "shut up" and trust you. The operator behaves as both an upcast and a downcast operator. The only check is that one of the types is assignable to the other.
In the example above: for test1, {a: string, b?: string} is assignable to {} (which requires no members); for test2, {a: string, b?: string} is assignable to {b: string}, since the type of the only required member in the target, b, matches; for test3, neither is {a: string, b?: string} assignable to {b: string, x: string} (it is missing x), nor is {b: string, x: string} assignable to {a: string, b?: string} (it is missing a).
So when casting, the source and the target are only verified not to be two completely unrelated types (e.g. number and string); otherwise the assignment is allowed. My test3 case produced the described result in TypeScript 1.7, but it is now allowed in TypeScript 2.1.
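To make that "completely unrelated types" check concrete, here is a small illustration of my own (not from the bug report; the exact error wording varies by compiler version):
// Neither 'number' nor 'string' is assignable to the other,
// so the compiler rejects the assertion outright:
var bad = 42 as string; // Error

// ITestOne is assignable to {}, and the check only needs to pass
// in one direction, so this assertion compiles:
var ok = {} as ITestOne;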
My question about how to get meaningful IntelliSense in this scenario still stands. However, I suspect the answer is that it is simply not supported without the use of a type guard.
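For completeness, a user-defined type guard is the usual way to get the narrowing (and therefore the completions) back. A minimal sketch, where the helper name isTestOne is my own invention:
function isTestOne(value: EitherType): value is ITestOne {
    // Assumes only ITestOne values carry an 'a' property
    return (value as ITestOne).a !== undefined;
}

declare var test: EitherType;
if (isTestOne(test)) {
    test.a; // narrowed to ITestOne: completions for a and b
} else {
    test.c; // narrowed to ITestTwo
}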

Related

F# Equivalent of Enum.TryParse

In C#, the following code is valid:
MyEnum myEnum = MyEnum.DEFAULT;
if (Enum.TryParse<MyEnum>(string, out myEnum))
{
    Console.WriteLine("Success!");
}
So I thought I would use this in F#. Here is my attempt:
let mutable myEnum = MyEnum.DEFAULT
if Enum.TryParse<MyEnum>(string, &myEnum) then
    printfn "Success!"
But it complains
a generic construct requires that the type 'MyEnum' have a default constructor
What in the world does that mean?
This is a rather unhelpful (if technically correct) message that the compiler gives you if you try to parse a discriminated union value using Enum.TryParse.
More precisely, if you look at that function, you'll see that it is parameterized with a type that's constrained to be a value type with a default constructor. DUs meet neither of these criteria, which is what the compiler complains about.
When defining an enum in F#, unlike C#, you need to explicitly give each label a value:
type MyEnum =
    | Default = 0
    | Custom = 1
    | Fancy = 2
Skipping the values would make the compiler interpret the type as a discriminated union, which is a very different beast.

Coping with misleading error messages of the Swift compiler (context dependence, type inference)

While the Swift compiler (Xcode 7.2) seems perfectly correct in diagnosing an error for some source text equivalent to the following, it took a long time to find the actual mistake. The reason: the programmer needs to look not at the marked text but elsewhere, and is thus misled into wondering why an optional string and a non-optional string cannot be operands of ??...
struct Outer {
    var text : String
}
var opt : String?
var context : Outer
context = opt ?? "abc"
Obviously, the last line should have had context.text as the variable to be assigned. This is diagnosed:
confusion2.swift:9:19: error: binary operator '??' cannot be applied to operands of type 'String?' and 'String'
context = opt ?? "abc"
~~~ ^ ~~~~~
The message is formally correct. (I am assuming that type checking the left hand side establishes an expected type (Outer) for the right hand side, and this, then, renders the expression as not working, type-wise.) Taken literally, though, the diagnosis is wrong, as is seen when fixing the left hand side: ?? can be applied to operands of type String? and String.
Now, if this is as good as it gets, currently, in terms of compiler messages, what are good coping strategies? Is remembering
Type inference!
Context!
…
a start? Is there a more systematic approach? A checklist?
Update (I'm adding to the list as answers come in. Thanks!)
break statements apart, so as to have several lines checked separately (@vacawama)
Beware of optionals (such as values obtained from dictionaries); see testSwitchOpt below
Another one
enum T {
    case Str(String)
    case Integer(Int)
}
func testSwitchOpt(x : T?) -> Int {
    switch x {
    case .Integer(let r): return r
    default: return 0
    }
}
The compiler says
optandswitch.swift:8:15: error: enum case 'Integer' not found in type 'T?'
case .Integer(let r): return r
A fix is to write switch x! (or, more cautiously, an if let), so as to make type checking address the proper type, I guess.
I could, perhaps should, file some report at Apple, but the issue seems to represent a recurring subject—I have seen this with other compilers—and I was hoping for some general and re-usable hints, if you don't mind sharing them.
Swift's type inference system is great in general, but it can lead to error messages that range from very confusing to outright wrong.
When you get one of these Swift error messages that makes no sense, a good strategy is to break the line into parts. This will allow Swift to return a better error message before it goes too far down the wrong path.
For example, in your case, if you introduce a temporary variable, the real problem becomes clear:
// context = opt ?? "abc"
let temp = opt ?? "abc"
context = temp
Now the error message reads:
Cannot assign value of type 'String' to type 'Outer'

Swift optionals - warning on conditional cast from 'x' to 'x' always succeeds

I was wondering if there is a way to turn off/avoid 'yellow' warnings in Xcode on if let...NSUserDefaults constructs where the value stored under the key is of a known type.
For example:
if let x = NSUserDefaults.standardUserDefaults().integerForKey("myKey") as? Int {...}
Because of the if let I have to use as?. However, as I am using a known value type (in this case Int), the as? Int is effectively redundant, which is why I am getting the 'yellow' warning.
Thoughts? Is there a better way to code these types of constructs?
My suggestion would be to address the issue instead of silencing the warnings. :)
NSUserDefaults.standardUserDefaults().integerForKey("myKey") does not return an Optional, and the type is known, so you need neither optional binding with if let nor type casting with as?.
Just this:
let x = NSUserDefaults.standardUserDefaults().integerForKey("myKey")
will suffice, since .integerForKey just returns 0 if it can't get the actual value.
If you don't like this behavior of getting a default value (I don't), then don't use .integerForKey and use objectForKey with optional binding and type casting instead. Like you were doing first but with .objectForKey replacing .integerForKey. That way you'll get an actual nil if the value for the key is unreachable, not a default value.
if let x = NSUserDefaults.standardUserDefaults().objectForKey("myKey") as? Int {...}
First of all, always check the signature:
⌥-click on the symbol integerForKey: or look at Quick Help.
You will see:
func integerForKey(_ defaultName: String) -> Int
It reveals that the return value is a non-optional.
Non-optionals can be retrieved directly, as described in Eric's answer, without any type casting; using optional binding on them causes an error.
That's one of the essential semantics in Swift.

How do the different enum variants work in TypeScript?

TypeScript has a bunch of different ways to define an enum:
enum Alpha { X, Y, Z }
const enum Beta { X, Y, Z }
declare enum Gamma { X, Y, Z }
declare const enum Delta { X, Y, Z }
If I try to use a value from Gamma at runtime, I get an error because Gamma is not defined, but that's not the case for Delta or Alpha. What do const and declare mean on these declarations?
There's also a preserveConstEnums compiler flag -- how does this interact with these?
There are four different aspects to enums in TypeScript you need to be aware of. First, some definitions:
"lookup object"
If you write this enum:
enum Foo { X, Y }
TypeScript will emit the following object:
var Foo;
(function (Foo) {
    Foo[Foo["X"] = 0] = "X";
    Foo[Foo["Y"] = 1] = "Y";
})(Foo || (Foo = {}));
I'll refer to this as the lookup object. Its purpose is twofold: to serve as a mapping from strings to numbers, e.g. when writing Foo.X or Foo['X'], and to serve as a mapping from numbers to strings. That reverse mapping is useful for debugging or logging purposes -- you will often have the value 0 or 1 and want to get the corresponding string "X" or "Y".
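A quick illustration of both directions (my own snippet):
enum Foo { X, Y }
var n = Foo.X;  // 0 -- the name-to-number direction
var s = Foo[0]; // "X" -- the reverse, number-to-name mapping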
"declare" or "ambient"
In TypeScript, you can "declare" things that the compiler should know about, but not actually emit code for. This is useful when you have libraries like jQuery that define some object (e.g. $) that you want type information about, but don't need any code created by the compiler. The spec and other documentation refer to declarations made this way as being in an "ambient" context; it is important to note that all declarations in a .d.ts file are "ambient" (either requiring an explicit declare modifier or having it implicitly, depending on the declaration type).
"inlining"
For performance and code size reasons, it's often preferable to have a reference to an enum member replaced by its numeric equivalent when compiled:
enum Foo { X = 4 }
var y = Foo.X; // emits "var y = 4";
The spec calls this substitution; I will call it inlining because it sounds cooler. Sometimes you will not want enum members to be inlined, for example because the enum value might change in a future version of the API.
Enums, how do they work?
Let's break this down by each aspect of an enum. Unfortunately, each of these four sections is going to reference terms from all of the others, so you'll probably need to read this whole thing more than once.
computed vs non-computed (constant)
Enum members can either be computed or not. The spec calls non-computed members constant, but I'll call them non-computed to avoid confusion with const.
A computed enum member is one whose value is not known at compile-time. References to computed members cannot be inlined, of course. Conversely, a non-computed enum member is one whose value is known at compile-time. References to non-computed members are always inlined.
Which enum members are computed and which are non-computed? First, all members of a const enum are constant (i.e. non-computed), as the name implies. For a non-const enum, it depends on whether you're looking at an ambient (declare) enum or a non-ambient enum.
A member of a declare enum (i.e. ambient enum) is constant if and only if it has an initializer. Otherwise, it is computed. Note that in a declare enum, only numeric initializers are allowed. Example:
declare enum Foo {
    X,         // Computed
    Y = 2,     // Non-computed
    Z,         // Computed! Not 3! Careful!
    Q = 1 + 1  // Error
}
Finally, members of non-declare, non-const enums are always considered to be computed. However, their initializing expressions are reduced to constants if they're computable at compile-time. This means non-const enum members are never inlined (this behavior changed in TypeScript 1.5; see "Changes in TypeScript" at the bottom).
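So, under these rules, the declaration from the "inlining" example above no longer inlines. A sketch of the emit (my own example):
enum Foo { X = 4 }
var y = Foo.X; // emits "var y = Foo.X;", not "var y = 4;" -- the lookup object is used at runtime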
const vs non-const
const
An enum declaration can have the const modifier. If an enum is const, all references to its members are inlined.
const enum Foo { A = 4 }
var x = Foo.A; // emitted as "var x = 4;", always
const enums do not produce a lookup object when compiled. For this reason, it is an error to reference Foo in the above code except as part of a member reference. No Foo object will be present at runtime.
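For example (my own snippet; the exact error wording varies by compiler version):
const enum Foo { A = 4 }
var x = Foo.A; // OK: emitted as "var x = 4;"
var o = Foo;   // Error: a const enum can only be used in a member access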
non-const
If an enum declaration does not have the const modifier, references to its members are inlined only if the member is non-computed. A non-const, non-declare enum will produce a lookup object.
declare (ambient) vs non-declare
An important preface is that declare in TypeScript has a very specific meaning: This object exists somewhere else. It's for describing existing objects. Using declare to define objects that don't actually exist can have bad consequences; we'll explore those later.
declare
A declare enum will not emit a lookup object. References to its members are inlined if those members are non-computed (see above on computed vs non-computed).
It's important to note that other forms of reference to a declare enum are allowed, e.g. this code is not a compile error but will fail at runtime:
// Note: Assume no other file has actually created a Foo var at runtime
declare enum Foo { Bar }
var s = 'Bar';
var b = Foo[s]; // Fails
This error falls under the category of "Don't lie to the compiler". If you don't have an object named Foo at runtime, don't write declare enum Foo!
A declare const enum is not different from a const enum, except in the case of --preserveConstEnums (see below).
non-declare
A non-declare enum produces a lookup object if it is not const. Inlining is described above.
--preserveConstEnums flag
This flag has exactly one effect: non-declare const enums will emit a lookup object. Inlining is not affected. This is useful for debugging.
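A sketch of the emit under this flag (my own example; the exact output text may differ between compiler versions):
// Input, compiled with --preserveConstEnums:
const enum Foo { X = 4 }
var y = Foo.X;

// Output: the lookup object is emitted, yet the reference is still inlined:
var Foo;
(function (Foo) {
    Foo[Foo["X"] = 4] = "X";
})(Foo || (Foo = {}));
var y = 4;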
Common Errors
The most common mistake is to use a declare enum when a regular enum or const enum would be more appropriate. A common form is this:
module MyModule {
    // Claiming this enum exists with 'declare', but it doesn't...
    export declare enum Lies {
        Foo = 0,
        Bar = 1
    }
    var x = Lies.Foo; // Depends on inlining
}
module SomeOtherCode {
    // x ends up as 'undefined' at runtime
    import x = MyModule.Lies;
    // Try to use the lookup object, which ought to exist
    // Runtime error: cannot read property 0 of undefined
    console.log(x[x.Foo]);
}
Remember the golden rule: Never declare things that don't actually exist. Use const enum if you always want inlining, or enum if you want the lookup object.
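Concretely, the broken module above could be rewritten either way, depending on what you need (my own sketch):
module MyModule {
    // Want inlining and no runtime object? Use const enum:
    export const enum Inlined { Foo = 0, Bar = 1 }
    // Want a real lookup object at runtime? Use a regular enum:
    export enum Looked { Foo = 0, Bar = 1 }
}
var a = MyModule.Inlined.Foo;                      // emitted as 0
console.log(MyModule.Looked[MyModule.Looked.Foo]); // "Foo", via the lookup object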
Changes in TypeScript
Between TypeScript 1.4 and 1.5, there was a change in the behavior (see https://github.com/Microsoft/TypeScript/issues/2183) to make all members of non-declare non-const enums be treated as computed, even if they're explicitly initialized with a literal. This "unsplit the baby", so to speak, making the inlining behavior more predictable and more cleanly separating the concept of const enum from regular enum. Prior to this change, non-computed members of non-const enums were inlined more aggressively.
There are a few things going on here. Let's go case by case.
enum
enum Cheese { Brie, Cheddar }
First, a plain old enum. When compiled to JavaScript, this will emit a lookup table.
The lookup table looks like this:
var Cheese;
(function (Cheese) {
    Cheese[Cheese["Brie"] = 0] = "Brie";
    Cheese[Cheese["Cheddar"] = 1] = "Cheddar";
})(Cheese || (Cheese = {}));
Then when you have Cheese.Brie in TypeScript, it emits Cheese.Brie in JavaScript which evaluates to 0. Cheese[0] emits Cheese[0] and actually evaluates to "Brie".
const enum
const enum Bread { Rye, Wheat }
No code is actually emitted for this! Its values are inlined. Both of the following emit the value 0 itself in JavaScript:
Bread.Rye
Bread['Rye']
const enums' inlining might be useful for performance reasons.
But what about Bread[0]? There's no lookup table, and the compiler doesn't inline reverse lookups, so this is an error the compiler will catch.
Note that in the above case, the --preserveConstEnums flag will cause Bread to emit a lookup table. Its values will still be inlined though.
declare enum
As with other uses of declare, declare emits no code and expects you to have defined the actual code elsewhere. This emits no lookup table:
declare enum Wine { Red, White }
Wine.Red emits Wine.Red in JavaScript, but there won't be any Wine lookup table to reference so it's an error unless you've defined it elsewhere.
declare const enum
This emits no lookup table:
declare const enum Fruit { Apple, Pear }
But it does inline! Fruit.Apple emits 0. But again, Fruit[0] is an error, because there's no lookup table and a reverse lookup can't be inlined.
I've written this up in this playground. I recommend playing there to understand which TypeScript emits which JavaScript.

Is Swift type-inference contradicting itself here?

Here's my test code:
var myDict: [String: AnyObject] = ["k":"v"]
var a = myDict["k"]
var b = a as String
var c = myDict["k"] as String
In my Swift playground in Xcode6-beta6, the compiler flags an error on the line defining c, but not on the line defining b.
According to the rules of type inference, doesn't complaining about c logically contradict not-complaining about b?
I believe that this is a bug. Part of what is going on here is that String is not an object. If you change the first line to:
var myDict: [String: Any] = ["k":"v"]
then everything is fine. So, given that String is not an object, casting a variable of type AnyObject? to a String should definitely yield an error. And since the compiler has already decided that a is of type AnyObject?, it should complain about casting a to a String.
Note that if you change the last line to:
var c = myDict["k"] as NSString
the error goes away, supporting the notion that the issue is that String is not an object. You get the same complaint if you put an Int as the value in the dictionary and try to cast that to an Int.
Update:
So the plot thickens. If you don't import Foundation, or something that imports Foundation, then you get additional errors (screenshot of the playground omitted).
So clearly some of this has to do with the dual nature of Strings as non-objects and NSStrings as objects and the ability to use Strings as NSStrings when Foundation is imported.
This has to do with the fact that Dictionary has two subscript overloads:
subscript (key: Key) -> Value?
subscript (i: DictionaryIndex<Key, Value>) -> (Key, Value) { get }
The first is the familiar one, where you pass a key and it gives you an optional of the value; you can also use it to set the value for a key.
The second one is less common. I believe DictionaryIndex is a kind of iterator into the dictionary, and you can use it as a subscript to directly get the key-value pair at that iterator.
When the compiler can't find an overload that matches (in this case, the first one doesn't match because it returns an optional, which cannot be cast to a non-optional String), it just picks one, seemingly arbitrarily, to show in the error. In this case, it picks the second one, which you don't recognize. That's why the error seems weird to you.
This works.
var c = myDict["k"] as AnyObject! as String // "v"
To answer your question: the reason Swift complains could be that you are trying to do these two conversions in one go. Remember, the statement var a = myDict["k"] already contains an implicit conversion. The implied type is AnyObject?, so the above would also work like this:
var c = myDict["k"] as AnyObject? as String // "v"
Note that the above would lead to a runtime error if the key "k" were not defined. You could allow this to return nil by casting to String? instead.
