Cast integer to enum type with Flags attribute

I have an enum type with the flags attribute:
[<Flags>]
type StatusWord =
| DATAPROCESS = 0b0000000001
| ERRORPRESENT = 0b0000000010
| CHECKSUMERROR = 0b0000000100
| PACKETSIZEERROR = 0b0000001000
| TIMEOUT = 0b0000010000
| DONOTRETRY = 0b1000000000
and during some data initialization, I have a uint16 value I want to convert to the enum's type, StatusWord, so I can compare its properties:
let value: uint16 = 0b1000001001us
let flags: StatusWord = StatusWord value
but, as you might be able to guess, this code doesn't compile; the conversion isn't available. Likewise, I also can't do explicit casts, e.g. value :> StatusWord or value :?> StatusWord. This is a simple task in C#, so I'm having trouble figuring out why I can't do it in F#.

So, there are two things you have to worry about. One (which I think you already realize) is that your underlying enum type is int32 while your value is uint16, so a conversion will need to happen somewhere. Two, you have to construct the enum type.
StatusWord looks like a constructor (similar to a union case member), but it's not. So here are two ways to do it with your uint16 value, and a third way to do it which is much better for readability, if you can do it that way.
let value = 0b1000001001us
// use F# enum operator
let flags1 = enum<StatusWord> (int value)
// use static Enum class
let flags2 = Enum.Parse(typeof<StatusWord>, string value) :?> StatusWord
// do bitwise stuff, of course now the compiler knows what you're doing
let flags3 = StatusWord.DATAPROCESS ||| StatusWord.PACKETSIZEERROR ||| StatusWord.DONOTRETRY
Because there are multiple ways, I had to refresh my memory, which I did at
https://fsharpforfunandprofit.com/posts/enum-types/
which is highly recommended reading (that article and the rest of that blog - it's how many people learn F#).

To convert a numerical value to an enum in F#, you can use the built-in LanguagePrimitives.EnumOfValue function.
let flags: StatusWord = LanguagePrimitives.EnumOfValue value
In your example, this does not actually work, because the type of value is uint16, but the underlying type of the enum is int (because the values do not have the us suffix). To get it to work, you'll need to either convert the uint16 value to int, or change the definition of your enum. The following works perfectly:
[<Flags>]
type StatusWord =
| DATAPROCESS = 0b0000000001us
| ERRORPRESENT = 0b0000000010us
| CHECKSUMERROR = 0b0000000100us
| PACKETSIZEERROR = 0b0000001000us
| TIMEOUT = 0b0000010000us
| DONOTRETRY = 0b1000000000us
let value: uint16 = 0b1000001001us
let flags: StatusWord = LanguagePrimitives.EnumOfValue value
EDIT: Jim's answer mentions the enum function. For some reason (I'm not sure why!) this only works with int32 arguments, so using EnumOfValue is probably better if you want to keep the base type of your enum as uint16. If you want to keep it as int32, then enum is a much nicer option!

Related

How to write a Go type constraint for something you can take len() of?

I am trying to write a type constraint for a Go program that accepts "anything you can take len() of", but I can't really figure it out.
I want something like:
func LenOf[M Measurable](m M) int {
return len(m)
}
I tried a few things, e.g. this naive attempt, which does compile, but doesn't work for all types (such as []User):
type Measurable interface {
~string | []any | ~map[any]any
}
Then I went on to something like the below, which not only makes the function signature for LenOf() extremely clunky, but is also clumsy to write at call sites (and I still can't get it to compile):
type Measurable[K comparable, V any] interface {
~string | []V | ~map[K]V
}
Why? The builtin len is already "generic".
With that said, let's see why defining such a constraint is a bad idea. The Go spec has a paragraph — Length and capacity, that can help:
If the argument type is a type parameter P, the call len(e) (or cap(e) respectively) must be valid for each type in P's type set. The result is the length (or capacity, respectively) of the argument whose type corresponds to the type argument with which P was instantiated.
The issues with writing an all-encompassing constraint for "measurable" types are:
it includes arrays [N]T, where the array length is part of the type, so your constraint would have to specify all possible arrays you want to capture
it includes array pointers *[N]T, which you can't easily abstract in a type constraint
it includes maps, which forces you to capture keys K and values V, which may or may not be the same as T. Plus, K must implement comparable.
So you'd have to write something like:
type Measurable[T any, K comparable, V any] interface {
~string | ~[]T | ~map[K]V | ~chan T
}
which notably doesn't include arrays, and doesn't distinctly capture pointer literals, e.g. to match []*int, you would have to instantiate T with *int.
You might simplify V away:
type Measurable[T any, K comparable] interface {
~string | ~[]T | ~map[K]T | ~chan T
}
The function LenOf then becomes:
func LenOf[T any, K comparable, M Measurable[T, K]](m M) int {
return len(m)
}
but you still have to supply K, so you have to instantiate LenOf at call site with bogus types for the map key:
LenOf[string, int]("foo")
// ^ actually useless
and you can't take advantage of type inference with the argument either.
In conclusion: just use len. Design your generic functions to work with type literals that support length, or add to the constraint only those types that your functions are reasonably expected to handle.
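For instance, the conclusion above might look like this in practice: write the generic function against the concrete type literal it actually handles and let len do its job. (Chunk is a hypothetical helper, not something from the question.)

```go
package main

import "fmt"

// Chunk splits a slice into pieces of at most n elements.
// It is generic over the element type, but takes a concrete
// slice type []T, so len works without any custom constraint.
func Chunk[T any](s []T, n int) [][]T {
	var out [][]T
	for len(s) > n {
		out = append(out, s[:n])
		s = s[n:]
	}
	if len(s) > 0 {
		out = append(out, s)
	}
	return out
}

func main() {
	fmt.Println(Chunk([]int{1, 2, 3, 4, 5}, 2)) // [[1 2] [3 4] [5]]
}
```

Type inference works at the call site, and the function still handles every slice element type, which is usually all the genericity you need.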

How to create constant uint32 tags with four ASCII codes at compile time?

I need to create many uint32 tags which are mappings of ASCII letters. For instance, the tag "abcd" is encoded as 0x61626364, where each byte corresponds to the ASCII code of the letter.
A straightforward solution is to define the tag values like this
type Tag uint32
const Tag_abcd = Tag(0x61626364)
But this is error prone.
A less error-prone solution would be to define the tag values with a function receiving the letters as arguments:
const Tag_abcd = foo("abcd")
or like this, as can easily be done with a macro in C:
const Tag_abcd = bar('a','b','c','d')
But this would require support for functions evaluated at compile time. As far as I know, that is not possible with Go. Am I correct? Could there be another way?
You may assemble the constant using rune literals and bit shifts. It won't be too compact, but it will be "safe" (meaning you can see the characters in the constant expression):
const TagABCD Tag = 'a'<<24 + 'b'<<16 + 'c'<<8 + 'd'
Alternatively you may write it in multiple lines, so the letters are aligned in a column:
const TagABCD2 Tag = 0 +
'a'<<24 +
'b'<<16 +
'c'<<8 +
'd'
To expand on icza's answer, and to improve readability of the tag declaration, you can:
declare helper constants in the form <letter><number>, where <letter> is the related ASCII character and <number> is the byte position of that char counted from the right (least significant byte first)
bit-shift the rune by iota * 8
compose the tag constant by OR-ing (|) the helper constants together
package main

import "fmt"
const (
a1 uint32 = 'a'<<(iota*8)
a2
a3
a4
)
// other similar const declarations for b1,b2,b3,b4 and so on
// must repeat the keyword const to reset iota
const Tag_abcd = a4 | b3 | c2 | d1
const Tag_ddba = d4 | d3 | b2 | a1
func main() {
fmt.Printf("%x\n", Tag_abcd) // 61626364
fmt.Printf("%x\n", Tag_ddba) // 64646261
}
The advantage is that:
the tag declaration is probably easier to read and more straightforward for human maintainers
the helper identifiers can be easily refactored with IDE support
The disadvantage is that:
the source might become more verbose, but you can mitigate this by isolating the helper consts into a separate file
for uppercase ASCII, helper consts such as A1 would become exported, so you might have to prefix the identifiers with _ or use similar tricks
YMMV
You could use go generate to generate the constant declarations from the tag strings.
It is quite complex to maintain this approach, though, if you have a team that needs to see at a glance what the code does and when it is safe to refactor it.
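As a sketch of that go generate route (all names here are hypothetical; this is not a drop-in generator): a small program can compute the values and print the const declarations, which you then redirect into a .go file:

```go
package main

import "fmt"

// tagConst renders a const declaration for a 4-character ASCII tag,
// packing the bytes big-endian, e.g. "abcd" -> 0x61626364.
func tagConst(name string) string {
	if len(name) != 4 {
		panic("tag must be exactly 4 ASCII characters")
	}
	var v uint32
	for i := 0; i < 4; i++ {
		v = v<<8 | uint32(name[i])
	}
	return fmt.Sprintf("const Tag_%s Tag = 0x%08x", name, v)
}

func main() {
	// Redirect this output into a generated .go file.
	for _, t := range []string{"abcd", "ddba"} {
		fmt.Println(tagConst(t))
	}
}
```

The generated constants are ordinary untyped-free Tag constants, so the rest of the code base never sees the generator.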

What is the difference between literals and non-literals, other than the fact that non-literals go into the heap?

I am confused by the difference between literals and non-literals (the ones that go on the heap; I do not know what they are called). Take the String type as an example:
We’ve already seen string literals, where a string value is hardcoded
into our program. String literals are convenient, but they aren’t
always suitable for every situation in which you want to use text. One
reason is that they’re immutable. ...
I do not understand the above, as we have already seen an example like this:
let mut a = "a"; // this is String literal here, so sitting on the stack
a = "b";
println!("a is being changed to...{}", a); // this is the same String literal sitting on the stack?
Clearly literals can be mutable in Rust. What is the difference between the two, other than the fact that literals go into the stack, while non-literals go into the heap?
I am trying to understand why I shouldn't just use mutable literals in my code, considering that the stack is faster than the heap.
// a is mutable literal
let mut a = "a";
a = "b";
// b is mutable 'non-literal'
let mut b = String::from("a");
b = String::from("b");
Clearly literals can be mutable in Rust
First, you need to understand what a literal is. Literals are never mutable because they are literally written in the source code and compiled into the final binary. Your program does not change your source code!
An example showing that you cannot modify a literal:
fn main() {
1 += 2;
}
error[E0067]: invalid left-hand side expression
--> src/main.rs:2:5
|
2 | 1 += 2;
| ^ invalid expression for left-hand side
On the other hand, a literal can be copied into a variable and then the variable can be changed, but we still are not mutating the literal 1:
fn main() {
let mut a = 1;
a += 2;
}
To be honest, I don't know what I would call a "non-literal". A literal is a specific type of expression, but there are other types of things in a program besides expressions. It's kind of like saying "cats" and "non-cats" — does that second group include dogs, mushrooms, sand, and/or emotions?
the fact that literals go into the stack, while non-literals go into the heap
Those two qualities aren't really directly related. It's pretty easy to have non-literals on the stack:
fn main() {
let a = 1;
let b = 2;
let c = a + b;
}
All three variables are on the stack, but there is no literal 3 anywhere in the source code.
Right now, Rust doesn't allow for a literal value to have a heap-allocation, but that's a language-specific thing that might change over time. Other languages probably allow it.
In fact, you have to go out of your way in Rust to put something on the heap. Types like Box, Vec, and String all call functions to allocate space on the heap. The only way for your code to use heap memory is if you use these types, other types that use them, or types which allocate heap memory in some other way.
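For example (a minimal sketch, not code from the question): plain values live on the stack, while Box, Vec, and String explicitly allocate their contents on the heap:

```rust
fn main() {
    // Stack: a plain integer; no allocation happens here.
    let x: i32 = 41;

    // Heap: each of these types calls the allocator for its contents.
    let boxed: Box<i32> = Box::new(x + 1);
    let v: Vec<i32> = vec![1, 2, 3];
    let s: String = String::from("hello");

    assert_eq!(*boxed, 42);
    assert_eq!(v.len(), 3);
    assert_eq!(s.len(), 5);
}
```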
What is the reason we cannot use String literal data-type
There is no String literal — none. The source code "foo" creates a literal of type &'static str. These are drastically different types. Specifically, the Rust language can work in environments where there is no heap; no literal could assume that it's possible to allocate memory.
have to specifically use String::from()
String::from converts from &str to a String; they are two different types and a conversion must be performed.
Clearly, as per the example, in my code, both can be mutable
No, they cannot. It is impossible to start with let mut foo = "a" and modify that "a" to become anything else. You can change what that foo points to:
let mut foo = "a";
foo
+-----------+
|
|
+---v---+
| |
| "a" |
| |
+-------+
foo = "b";
foo
+----------+
|
|
+-------+ +---v---+
| | | |
| "a" | | "b" |
| | | |
+-------+ +-------+
Neither "a" nor "b" ever change, but what foo points to does.
This isn't specific to Rust. Java and C# strings are also immutable, for example, but you can reassign a variable to point to a different immutable string.
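The two situations side by side (a small sketch): rebinding a &str variable points it at a different immutable literal, whereas a String owns a heap buffer that really can be changed in place:

```rust
fn main() {
    // Rebinding: `a` now refers to a different literal.
    // Neither "a" nor "b" is ever modified.
    let mut a = "a";
    a = "b";
    assert_eq!(a, "b");

    // In-place mutation: the String's heap buffer grows.
    let mut s = String::from("a");
    s.push('b');
    assert_eq!(s, "ab");
}
```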
See also:
What does the word "literal" mean?
What's the difference in `mut` before a variable name and after the `:`?
What are the differences between Rust's `String` and `str`?

Enums in F# for strings

I'm working on a legacy database that I'm not able to modify. Because of type providers, I decided to use F#. It's not so simple now that I'm into it. How do I model a field in my DB that can hold only the following strings? It's sort of like an ENUM. I've tried the following but haven't quite gotten it.
Attempt 1
type SuspendedDriver = "SI"
type ActiveDriver = "IMPRESO"
type Applicant = "NO"
type TemporaryDriver = "TEMP"
type Uncertain = "NULL"
type DriverStatus =
| SuspendedDriver
| ActiveDriver
| Applicant
| TemporaryDriver
| Uncertain
type Driver = {
status : DriverStatus
}
Error
Error FS0618: Invalid literal in type (FS0618) (ScriptTest)
Attempt 2
type DriverStatus =
| SuspendedDriver = "SI"
| ActiveDriver = "IMPRESO"
| Applicant = "NO"
| TemporaryDriver = "TEMP"
| Uncertain = "NULL"
Error
Error FS0951: Literal enumerations must have type int, uint, int16,
uint16, int64, uint64, byte, sbyte or char (FS0951) (ScriptTest)
As mentioned in the comment, you cannot define an enum that has values erased to strings. Enums can only be integers (of various types).
Using enums in an ugly way
You could use the actual names of the enum fields and then parse them using Enum.Parse, but that's probably not a good idea, because the enum cases would be fairly obscure codes. But just for the record, the following works:
type DriverStatus =
| SI = 0
System.Enum.Parse(typeof<DriverStatus>, "SI") :?> DriverStatus
Cleaner parsing into discriminated union
In practice, I would do what Sam recommends and write a function to parse the string from the database. This has the advantage that you'll need to figure out what to do with errors in the database (the value could always be something invalid):
type DriverStatus =
| SuspendedDriver
| ActiveDriver
let parseDriverStatus = function
| "SI" -> SuspendedDriver
| "IMPRESO" -> ActiveDriver
| code -> failwith (sprintf "Wrong driver code! %s" code)
Using the cool Enum SQL provider
Finally, if you happen to be using an MS SQL database, then the SQL Command Provider project has a neat feature that lets you automatically import enum-like types from the SQL database itself, which might be exactly what you need in this case.

Map from discriminated union to enum

Currently, I'm trying to teach myself some F# by making an application that consists of a C# GUI layer and an F# business layer. In the GUI layer, the user will at some point have to make a choice by selecting a value that is part of a simple enum, e.g. selecting either of the following:
enum {One, Two, Three}
I have written a function to translate the enum value to an F# discriminated union
type MyValues =
| One
| Two
| Three
Now I have to translate back, and I'm already tired of the boilerplate code. Is there a generic way to translate my discriminated union to the corresponding enum, and vice versa?
You can also define the enum in F# and avoid doing conversions altogether:
type MyValues =
| One = 0
| Two = 1
| Three = 2
The = <num> bit tells the F# compiler that it should compile the type as an enum. When using the type from C#, this will appear as a completely normal enum. The only danger is that someone from C# can call your code with (MyValues)4, which will compile, but it will cause an incomplete-pattern-match exception if you are using match in F#.
Here are generic DU/enum converters.
open Microsoft.FSharp.Reflection
type Union<'U>() =
static member val Cases =
FSharpType.GetUnionCases(typeof<'U>)
|> Array.sortBy (fun case -> case.Tag)
|> Array.map (fun case -> FSharpValue.MakeUnion(case, [||]) :?> 'U)
let ofEnum e =
let i = LanguagePrimitives.EnumToValue e
Union.Cases.[i - 1]
let toEnum u =
let i = Union.Cases |> Array.findIndex ((=) u)
LanguagePrimitives.EnumOfValue (i + 1)
let du : MyValues = ofEnum ConsoleColor.DarkGreen
let enum : ConsoleColor = toEnum Three
It maps the DU tag to the enum underlying value.
