Rework defaulting #730
Comments
Having thought about this some more, I want to make the following concrete proposal:
The numeric literals will have preferred types according to the following rules:
If the type of two literals needs to be unified and their preferred types do not match, the result is that the unified type has no preferred type, and we get an ambiguous type error unless the type is eventually unified with something else. This squashes ALL the usual defaulting warnings (they either disappear or become errors), and I think it makes the whole system both more flexible and more predictable.
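For concreteness, a hypothetical sketch (my own, not from the thread) of how that rule could play out, assuming decimal literals prefer Integer and that hex literals would also become flexible with a bitvector preferred type, as the follow-up comments suggest:

```cryptol
p = 10 + 1        // both literals prefer Integer, so p would quietly become Integer

r : [8]
r = 10 + 0x0A     // the preferences disagree, but the signature fixes the type anyway

// s = 10 == 0x0A // the literals' type never reaches the result type Bit and has no
//                // single preferred type, so this would be an ambiguous-type error
//                // rather than a defaulting warning
```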
The proposal overall seems good. However, I prefer the stricter types we have right now on hex, octal, and binary literals. In the years since the change was made, I’ve never written a spec where that caused an issue. It’s nice to run examples in the interpreter without typing all the leading zeroes, but the backtick works just fine there. We did write specs that called for 6-bit and 7-bit ASCII, so relaxing the character types would sometimes be beneficial.
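For reference, a small sketch (my own, illustrative only) of the current monomorphic treatment being defended here; the `h3` line shows one reading of the backtick remark, and the commented line shows the 7-bit ASCII use case:

```cryptol
h1 = 0xAB           // two hex digits: h1 : [8]; the width follows the digits typed
h2 = 0x00AB         // leading zeroes widen the literal: h2 : [16]
h3 = `171 : [16]    // the backtick (type-to-value) form gives the same value as h2
                    // without spelling out the leading zeroes (171 == 0xAB)

c = 'A'             // character literals are currently fixed at [8]
// c7 = 'A' : [7]   // a 7-bit ASCII character would need the relaxed character
//                  // types discussed above
```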
The idea for relaxing the rules for hex literals, etc., was to deal with the issue mentioned in this comment: #744 (comment). Basically, Cryptol will print things like

Regarding character types, are you also suggesting that string literals have more flexible types, so you could directly write, e.g.
I like the general plan that Rob is suggesting, but I also think that we should keep the literals monomorphic, and fix the printer to be smarter about how it shows things. The reason I think we should keep things monomorphic is that defaulting only happens when a type would be otherwise ambiguous (which is itself a bit of a hard thing to specify). If a type is not ambiguous, then we just infer a polymorphic type. If literals are polymorphic, then a lot more declarations end up being polymorphic and you end up with polymorphic code by accident.

We should also keep in mind that we have two defaulting algorithms which serve different purposes: (1) defaulting done by the type checker to resolve otherwise-ambiguous types, and (2) defaulting of polymorphic values at the REPL so they can be evaluated.
(1) is quite important for the semantics of the language; (2) is a quality-of-life thing. I think folks new to Cryptol often lump these two together, as they are both warnings, so we should consider how to phrase things better. Perhaps being more aggressive about turning warnings into errors in (1) when we can't default would help. I am all for that!
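A rough illustration of the two mechanisms (my own examples; the exact types and messages vary between Cryptol versions):

```cryptol
// (1) Type-checker defaulting: the type of the literals below never reaches the
//     type of f, so it is ambiguous and the checker has to pick something for it.
f : [8] -> [8]
f x = if 1 == 2 then x else x

// (2) REPL defaulting: an expression such as
//
//       Cryptol> 1 + 1
//
//     has a polymorphic type (roughly {a} (Literal 1 a, Ring a) => a); to evaluate
//     and print it, the REPL picks a concrete instantiation for 'a'.
```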
As I said, I don't have a strong opinion about the hex, octal, and binary literals. However, I don't think you'd end up with overly polymorphic code very often. The whole idea of the "preferred" type is that if a type variable has a preferred type, you simply instantiate it at the preferred type instead of generalizing.
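A sketch of the contrast being described, using a decimal literal whose preferred type is assumed to be Integer; the comments show the two alternative outcomes rather than what any particular Cryptol version does:

```cryptol
x = 10
// Under plain generalization:            x : {a} (Literal 10 a) => a
// Instantiating at the preferred type:   x : Integer

y : [8]
y = 10 + 1   // unaffected either way: the signature already fixes the literals' type
```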
Hmm, wouldn't that be very confusing? If I understand correctly, just naming a literal would change its behavior... Consider this example:
Also, at least at the moment, the

Overall, sticking with the simple monomorphic story is a lot simpler.

EDIT: Fixed the example, as it was bogus (thanks @robdockins).
By the way, I think that having
Thoughts?
Maybe you meant the types of

In any case, I don't think this is terribly confusing, but obviously opinions can differ.
Yes, that's true, but in that example there are no character literals :-) The whole characters-in-the-type thing is a terrible hack.
I've implemented essentially the suggestion from #730 (comment) in PR #774. This leaves the fixed-width literals alone (they still correspond to precise bitwidths). Decimal literals default to unlimited-precision types (`Integer` or `Rational`) only; the associated warnings are suppressed by default. Other situations that used to default are now errors instead (except that we will default polymorphic values at the REPL as before). Overall, I'm fairly pleased with the result, and I think we should seriously consider adopting it.
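A rough sketch (my own examples, based on the description above) of what the new behavior looks like in a source file:

```cryptol
y = 0xAB             // fixed-width literal, unchanged: y : [8]

shiftOne : [16] -> [16]
shiftOne xs = xs >> 1   // the shift amount is only Integral-constrained, so it now
                        // quietly defaults to Integer instead of warning about a width

// A decimal literal forced to be a bitvector, whose width the context never fixes,
// is presumably now an ambiguous-type error rather than a warned default:
// bad = take`{2} (12 : [_])
```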
I'm also quite happy with what I've seen.
Fixed via #774 |
The defaulting rules, especially for bitvector widths, have long been a source of irritation and confusion in Cryptol. With PR #724, we have a good opportunity/excuse to revisit this topic, especially as we are integrating the long-planned feature of generalizing indexing operations over `Integral` types.

In my experience, the vast majority of defaulting and the resulting warnings occur because of source-file literal values (including finite enumerations). I think our goal should be to eliminate the vast majority of these warnings via smart defaulting so that the remaining cases can be upgraded to errors instead.

I suggested (cf. #724 (comment)) the following strategies:

- For `Literal n a` constraints where `a` is an unconstrained variable or constrained only by `Zero`, `Ring`, or `Integral`, we should default without warning to `Integer`.
- For `Literal n a` where `a` is a variable constrained by `Field` or `Round`, we should default to `Rational`.

In addition, I think we should consider the following rule: all other cases should be an ambiguous type error (including the `Literal n [_]` cases we currently default with warnings). This includes removing the special-case defaulting rules for indexing primitives, etc. If we adopt this rule, decimal literals at bitvector type will always have to have their widths fixed by the surrounding context (type signatures or an explicit type ascription).