
TFA notes that rustc is not a particularly _efficient_ SAT solver, taking something like 10^4 times as long as a commonly used algorithm [1] would on a small example.
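For reference, the algorithm in [1] (DPLL) is short enough to sketch. Here's a toy version (my own illustration, not the article's benchmark solver), doing just unit propagation plus case splitting on DIMACS-style signed-integer literals:

```rust
// Minimal DPLL sketch (a toy, not a production solver).
// A clause is a Vec of literals: i > 0 means "var i true", i < 0 means "var i false".
type Clause = Vec<i32>;

fn dpll(mut clauses: Vec<Clause>, mut assignment: Vec<i32>) -> Option<Vec<i32>> {
    // Unit propagation: a one-literal clause forces that literal.
    loop {
        let unit = clauses.iter().find(|c| c.len() == 1).map(|c| c[0]);
        match unit {
            Some(lit) => {
                assignment.push(lit);
                clauses = simplify(clauses, lit);
            }
            None => break,
        }
    }
    if clauses.is_empty() {
        return Some(assignment); // every clause satisfied
    }
    if clauses.iter().any(|c| c.is_empty()) {
        return None; // some clause falsified, backtrack
    }
    // Split on the first literal of the first remaining clause.
    let lit = clauses[0][0];
    let mut a = assignment.clone();
    a.push(lit);
    if let Some(model) = dpll(simplify(clauses.clone(), lit), a) {
        return Some(model);
    }
    let mut b = assignment;
    b.push(-lit);
    dpll(simplify(clauses, -lit), b)
}

// Drop satisfied clauses; delete falsified literals from the rest.
fn simplify(clauses: Vec<Clause>, lit: i32) -> Vec<Clause> {
    clauses
        .into_iter()
        .filter(|c| !c.contains(&lit))
        .map(|c| c.into_iter().filter(|&l| l != -lit).collect())
        .collect()
}

fn main() {
    // (a OR b) AND (NOT a OR c): satisfiable.
    println!("{}", dpll(vec![vec![1, 2], vec![-1, 3]], vec![]).is_some()); // true
    // (a) AND (NOT a): unsatisfiable.
    println!("{}", dpll(vec![vec![1], vec![-1]], vec![]).is_some()); // false
}
```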

So, I'm curious: what's the _strongest_ case one could make for including a more robust SAT solver as part of the exhaustiveness checking? Is there any reasonable place where this could matter? (Extremely autogenerated code?)

[1] https://en.wikipedia.org/wiki/DPLL_algorithm
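For anyone who hasn't read TFA: the rough idea (my own minimal reconstruction; the variables and clauses here are made up) is that each match arm covers exactly the assignments that falsify one CNF clause, so the match is exhaustive iff the formula is unsatisfiable:

```rust
// Toy reconstruction of the encoding: one match arm per CNF clause,
// matching exactly the assignments that falsify that clause. The match
// is exhaustive iff every assignment falsifies some clause, i.e. iff
// the formula is unsatisfiable.
//
// Formula: (a) AND (NOT a), which is unsatisfiable, so this compiles.
fn check(a: bool) -> &'static str {
    match a {
        false => "clause `a` falsified", // a = false falsifies clause (a)
        true => "clause `!a` falsified", // a = true falsifies clause (!a)
    }
    // No wildcard arm: rustc proving exhaustiveness here *is* the UNSAT proof.
}

// For a satisfiable formula such as (a OR b) AND (!a OR c), the analogous
//
//     match (a, b, c) {
//         (false, false, _) => (), // falsifies (a OR b)
//         (true, _, false) => (),  // falsifies (!a OR c)
//     }
//
// is rejected as non-exhaustive, and the missing patterns rustc reports
// are exactly the satisfying assignments.

fn main() {
    println!("{}", check(true)); // prints "clause `!a` falsified"
}
```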



I was under the impression that cases where it would matter are typically code that wouldn't usually be written. E.g. C# and Java can also solve SAT as part of overload resolution with generic types, and it's an interesting peculiarity, but not a pattern that is interesting enough to optimize. Perhaps that's the case here as well.

I could also imagine that keeping the compiler code easy to read, understand, and modify is often a more valuable trait than fast compilation of highly atypical patterns in a language. But that may well depend on how atypical those are.


The fact that most dependently typed languages (which have a much higher propensity for complicated exhaustiveness checking when pattern matching) don’t use a serious SAT solver for this makes me doubt Rust would ever need to.


Doesn't Liquid Haskell[1] use a SAT solver?

[1] https://ucsd-progsys.github.io/liquidhaskell-blog/


Like most languages with refinement types it uses an SMT solver (which solves a more general problem), though I’m not sure it uses it for pattern matching. I was thinking of Coq, Agda, Lean, etc., which are full-spectrum dependently typed languages. LiquidHaskell is more specialized, though its automated proof capabilities are much stronger.


Dependent typing goes beyond SMT and SAT, so those languages wouldn't be able to leverage those solvers except in specific circumstances. See the language F* [1], for instance.

[1] https://www.fstar-lang.org/


It’s easy to embed the exact problem Rust has to solve for exhaustiveness checking into e.g. Lean’s pattern matching, so they have the same opportunity to use SAT for such checks, at least in some pattern matching situations (though, as I said, I doubt it would be worthwhile). Using SMT for refinement types like F* does is mostly orthogonal to the problem this article is talking about, which arises just from structural pattern matching.


If you were using rustc as some sort of JIT compiler, compile times could become extremely important, and error messages less important, at which point things like this might be worth it.

If you wanted the SAT solver for some other reason (optimization? proving code correct?) it might make sense to use it for this too.


One perhaps-interesting use case might be to use the fastest possible solver in the case of compiling generated/foreign code, so basically code that's expected to always compile (but which still needs to be checked for safety).

One could even imagine using it as a first pass and falling back to a slower-but-with-better-error-messages type checker only when errors are found.

(Of course you'd end up with two independent implementations of the type checker, but that might not even be a bad thing, since you could easily check for inconsistencies between them, i.e. type checker bugs.)


Rust's proc-macro system makes it pretty easy to create hard-to-compile code, but there are already many other ways to generate very slow-compiling code with proc macros.



