There is a particular kind of intellectual cowardice endemic to programming-language design, which consists of writing a feature that almost works, convincing oneself that it works enough, and then moving on to the next item on the roadmap before the cracks start to show. I have indulged in this pastime for years. Cure, for the first dozen or so releases, indulged in it with me. The last four tags—v0.16.0, v0.17.0 with its two patch siblings, v0.18.0, and v0.19.0 followed by v0.19.1—are my belated attempt to stop. Each of them picks one area of the language that had previously been given a thin coat of varnish and strips it down to bare wood.
What follows is the tour. I will try to explain both what changed and why it had to. If the tone reads as exasperated in places, that is because I am mostly exasperated at the person who wrote the earlier versions, who was, as luck would have it, me.
v0.16.0: the turnstile that could not contain itself
For fifteen releases Cure had finite state machines as a language primitive, and for fifteen releases the canonical FSM example—the turnstile—looked like this: four lines of beautifully declarative transition graph in turnstile.cure, followed by a hundred and twenty lines of Elixir GenServer plumbing in a wrapper module that had nothing to do with turnstiles and everything to do with bridging the gap between a gen_statem process and the rest of the application. The FSM definition said what you meant. The wrapper said what the runtime demanded. You read one of them to understand the domain and the other to make the program run. I cannot in good conscience call that a first-class primitive. It was more of a first-class primitive’s slightly embarrassed cousin, the one who turns up at family dinners and talks about his crypto portfolio.
Finitomata, my Elixir library for finite automata that has been in production for years, got this right on the first attempt by insisting that the graph and the transition handler belong in the same module and behind the same abstraction. v0.16.0 borrows that insight wholesale, and then some. The turnstile now reads:
fsm Turnstile with Integer
  Locked --coin--> Unlocked
  Unlocked --push--> Locked
  Unlocked --coin--> Unlocked
  Locked --push--> Locked

  on_transition
    (:locked, :coin, _payload, data) -> %[:ok, :unlocked, data + 1]
    (:unlocked, :push, _payload, data) -> %[:ok, :locked, data]
    (:unlocked, :coin, _payload, data) -> %[:ok, :unlocked, data + 1]
    (_, _, _, data) -> %[:ok, :__same__, data]
The four transition lines on top are the graph you always wrote. The on_transition block underneath takes pattern-matching clauses of shape (current_state, event, event_payload, state_payload) and returns either %[:ok, next_state, new_payload] or %[:error, reason]. When the compiler sees an on_transition block it silently changes mode: instead of generating raw gen_statem Erlang abstract forms, it produces a GenServer-based Elixir module with an embedded transition table, pre-dispatch validation, compiled do_on_transition/4 clauses, and the optional lifecycle hooks on_enter, on_exit, on_failure, and on_timer. If you do not write on_transition, the FSM compiles the way it always did, through Erlang abstract forms to gen_statem. No flags, no configuration—the compiler decides based on whether you asked for the new thing.
Finitomata also contributed two conventions that I refused to live without once I tried them. A hard event, written with a trailing exclamation mark, must be the sole outgoing event of its source state, and fires automatically on arrival, so initialisation chains and guaranteed progressions stop requiring baby-sitting from the caller. A soft event, written with a trailing question mark, silently leaves the state alone if it cannot transition, instead of logging a warning and invoking on_failure. Health checks, optimistic polling, any kind of “try it, and if it does not apply this tick, never mind”—all become one-liners. The lexer grew a small counter that tracks whether it is currently inside an arrow --...-->, and only inside an arrow does it absorb the trailing ! or ? into the identifier. Everywhere else, ! stays reserved for effect annotations and ? for predicates and holes. The verifier enforces the hard-event rule; the compiler uses {:continue, ...} from the GenServer return tuple to fire hard events without yielding.
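The arrow-tracking rule is small enough to sketch. Here is a toy Python tokenizer—my own illustration, not the actual Cure lexer—that absorbs a trailing ! or ? into an event identifier only between the -- and --> of a transition arrow, and leaves the characters free everywhere else:

```python
import re

# Sketch (not the actual Cure lexer): trailing '!'/'?' joins an identifier
# only between the '--' and '-->' of a transition arrow.
TOKEN = re.compile(r"-->|--|[A-Za-z_][A-Za-z0-9_]*|[!?]|\S")

def tokenize(line):
    tokens, inside_arrow = [], False
    parts = TOKEN.findall(line)
    i = 0
    while i < len(parts):
        tok = parts[i]
        if tok == "--":
            inside_arrow = True
        elif tok == "-->":
            inside_arrow = False
        elif inside_arrow and i + 1 < len(parts) and parts[i + 1] in "!?":
            # Absorb the suffix into the event identifier: coin! / poll?
            tok, i = tok + parts[i + 1], i + 1
        tokens.append(tok)
        i += 1
    return tokens
```

Inside an arrow, `tokenize("Idle --init!--> Ready")` yields the event `init!` as one token; outside an arrow the `!` and `?` stay separate, reserved for effects and predicates.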
The turnstile’s wrapper is now fifty lines rather than a hundred and twenty, and the fifty it retains are real application logic: counting how many people went through today. The twelve turnstile tests pass without modification, which is the only part of this release I will allow myself to take unironic pride in.
v0.17.0: toward Idris, at last
For seven versions Cure had been marketing itself as a dependently-typed language while behaving, in every practical respect, like a vaguely refinement-typed one. You could write Vector(T, n) in a signature. The checker would nod politely. It would not, however, verify that the length you promised was the length you produced, because the machinery to do so did not exist. v0.17.0 stops nodding.
Three type shapes that should have been there from the start arrive simultaneously. Sigma types pair a value with a type that depends on it; Sigma(n: Nat, Vector(T, n)) is “a natural number together with a vector of exactly that length”. Pi types let a function’s return type depend on its arguments, so the canonical example
fn append(xs: Vector(T, m), ys: Vector(T, n)) -> Vector(T, m + n)
now actually means what it reads as: at each call site the checker substitutes the concrete arguments into the return type, normalises with a tiny terminating reducer called Cure.Types.Reduce, and resolves the result. Closed type-level arithmetic never troubles the SMT solver—Vector(T, 2 + 3) is syntactically the same type as Vector(T, 5) before Z3 is even woken up. Equality types arrive with the single constructor refl(x) : Eq(T, x, x) and the single eliminator rewrite eq in expr, both erased at codegen to the atom :cure_refl. The standard library gains a Std.Equal module exposing refl, sym, trans, and cong, which is to say the usual suspects.
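The normalisation step is tiny in spirit. A toy Python version—shapes assumed, nothing here is Cure.Types.Reduce—folds closed type-level arithmetic inside a Vector type so the two spellings compare equal syntactically, while open expressions involving variables are merely simplified and kept:

```python
# Sketch (assumed shapes, not Cure.Types.Reduce): types as tuples,
# ("Vector", elem, length); lengths are ints, variables (strings), or
# ("+", a, b) nodes. Normalisation folds closed arithmetic so that
# Vector(T, 2 + 3) and Vector(T, 5) compare equal syntactically.
def reduce_len(n):
    if isinstance(n, tuple) and n[0] == "+":
        a, b = reduce_len(n[1]), reduce_len(n[2])
        if isinstance(a, int) and isinstance(b, int):
            return a + b          # closed: fold to a literal
        return ("+", a, b)        # open: keep the normalised node
    return n

def normalise(ty):
    if isinstance(ty, tuple) and ty[0] == "Vector":
        return ("Vector", ty[1], reduce_len(ty[2]))
    return ty
```

Because the reducer only folds literals and recurses structurally, it terminates trivially—which is the whole point of keeping it out of the SMT solver's lap.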
The second big arrival is implicit arguments with proper first-order unification. Write fn id({T}, x: T) -> T = x, call id(42), and at the call site an occurs-check-equipped unifier resolves T from the explicit argument. When resolution fails—and it does, more often than one would like—the pipeline emits a :unification_trace event carrying the argument, the position, and the substitution that killed it. The LSP renders the trace in hover; the CLI prints it in error output. No more staring at “cannot infer T” the way one stares at a Rorschach blot. Implicit parameters are erased at codegen, so they cost exactly nothing at runtime.
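For readers who want the shape of the thing, here is a toy first-order unifier with an occurs check in Python—an illustration of the technique, not Cure's resolver. Type variables are lowercase strings; constructors are tuples like ("List", t):

```python
# Sketch of first-order unification with an occurs check, in the spirit of
# the implicit-argument resolver (names and shapes assumed, not Cure's).
def walk(t, subst):
    # Chase a variable through the substitution to its current binding.
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    # Reject bindings like t := List(t), which denote no finite type.
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, arg, subst) for arg in t[1:])
    return False

def unify(a, b, subst):
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a[0].islower():
        if occurs(a, b, subst):
            return None                    # occurs check failed
        return {**subst, a: b}
    if isinstance(b, str) and b[0].islower():
        return unify(b, a, subst)
    if isinstance(a, tuple) and isinstance(b, tuple) \
            and a[0] == b[0] and len(a) == len(b):
        for x, y in zip(a[1:], b[1:]):
            subst = unify(x, y, subst)
            if subst is None:
                return None                # a trace would record x, y, subst here
        return subst
    return None
```

The `None` returns mark exactly the points where the real pipeline would emit its :unification_trace event: the argument, the position, and the substitution that killed it.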
What ties the dependent-type stack together, in practical terms, is hole-driven development. Write
fn safe_head(xs: List(T)) -> T = ?body
and compile, and the compiler does not merely tell you “?body is missing”: it tells you ?body : T in scope: xs : List(T). Anonymous holes (??) get numbered ?_1, ?_2, in source order. Every encountered hole emits a :hole_goal event with the goal type and the local context, and the REPL’s new :holes meta-command lists everything recorded during the last evaluation. This is how Idris programmers write programs. It is now, finally, how Cure programmers can.
Totality joins the party, gently. Cure.Types.Totality classifies every function as :total, :partial, or :unknown, combining pattern-coverage (via Cure.Types.PatternChecker) with a structural-recursion check. By default totality is report-only; decorating a function with #[total] promotes the classification to a hard error if it fails:
#[total]
fn factorial(n: Int) -> Int
  | 0 -> 1
  | n -> n * factorial(n - 1)
Only direct structural recursion is caught in v0.17.0. Mutual recursion has to wait for v0.19.0.
Refinement types grow a backbone. Path-sensitive refinement flows along if and match guards, so inside
if x != 0 then 100 / x else 0
the then branch sees x : {x: Int | x != 0}, and the division is safe without an explicit refinement annotation. The new Std.Refine module ships drop-in refinements one does not wish to keep rewriting: NonZero, Positive, Negative, NonNegative, Percentage, Probability, plus predicate helpers. These are the kinds of tiny conveniences that, had they been present in v0.10.0, would have saved everyone a small amount of grief each day for a year.
The tooling catches up in the same release. Cure.REPL is a complete rewrite: multi-line input (terminated by a blank line or ;;), the meta-commands :t, :doc, :effects, :load, :reload, :use, :holes, :env, :reset, :fmt, :help, :quit, and command history persisted to ~/.cure_history. A watch mode, invoked as cure watch lib/ --action check, recompiles, type-checks, or tests on every save with a 200 ms debounce, and works without :file_system thanks to a small polling fallback. The LSP server, Cure.LSP.Server, acquires inlay hints, signature help, formatting via a round-trip-tested source-preserving printer, prepare-rename and rename, code lenses, semantic tokens, and workspace symbols—seven capabilities in one commit, which is the sort of thing that happens when one has been putting off the LSP for three releases in a row. Doctests arrive, too: any ## or ### docstring immediately above a function whose body contains cure> / => pairs is executed by cure test --doctests.
The patch releases tidied up. v0.17.1 stopped the stdlib preloader from polluting the code path, fixed two LSP crashes around inlay hints and semantic tokens, and retargeted the vicure grammar at v0.17.0 syntax. v0.17.2 re-enabled LSP formatting once the new source-preserving formatter proved it would not eat users’ comments for breakfast—a promise the old AST-based formatter had never quite been willing to make.
v0.18.0: pattern matching grows up
Here is a question I did not want to answer for a long time. If one writes
match value
  %{list: [h | t]} -> handle(h, t)
  _ -> default
and the subject value is a map whose list field is not present, what does the old compiler do? The correct answer is “fails the match and falls through to the wildcard”. The answer the old compiler produced is “succeeds, because the map pattern is miscompiled into Erlang’s construction form (K => V) rather than the match form (K := V), so it accepts any subject, and h and t are bound to whatever happens to be lying around, possibly nothing sensible”. There were tests pointing straight at this. What there was not, was a pattern compiler willing to recurse.
v0.18.0 replaces the pattern layer wholesale. The headline is that this now actually works:
match value
  %[_, %{list: [head | tail]}, _] -> handle(head, tail)
  Person{name, address: Address{city}} -> greet(name, city)
  [Ok(v) | _] -> v
  _ -> default
Every sub-pattern compiles as a real pattern. Nested map patterns use exact matching and actually require the key. Record patterns check the struct tag and each field. The cons pattern binds v through the Ok(...) constructor. The wildcard mops up the rest. There is no magic here; merely a module (Cure.Compiler.PatternCompiler) that does the one thing its name promises, and which every other codegen path—compile_multi_clause_function, compile_pattern_match, compile_assignment, compile_comprehension, compile_catch_and_finally—now routes through rather than reinventing pattern handling locally. The original sin, I discovered after far too long, was treating patterns and expressions uniformly because their AST nodes have the same shape. Patterns are not evaluated; they are matched against a subject with bindings as side effects. Nothing good comes of pretending otherwise.
Alongside the compiler rewrite, the type checker’s bind_pattern_vars/3 was rewritten to thread the scrutinee type through every pattern shape, so nested pattern variables now carry the tightest type the structure allows rather than the old :any. Tuple patterns zip element-wise. List and cons patterns bind head and tail at the correct element and list types respectively. Map patterns look up each key through the scrutinee’s schema when it is a known record, or through the map value type otherwise. Record patterns resolve every field against the schema registered at rec time, and unknown fields emit a warning under the new error code E021. The practical consequence is that match p { Person{age: a} when a > 17 -> ... } finally knows a : Int, so subsequent refinement fires against the right type.
Field punning arrives in both directions: Point{x, y} desugars to Point{x: x, y: y} in pattern position and in construction position. It is purely a parser change, and if you have ever found yourself writing Person{name: name, age: age, email: email, ...} seven fields deep you will be glad of it. Repeated occurrences of the same variable in one pattern now do what a reasonable person would expect: the first occurrence binds, later ones lower to equality guards. %[x, x] matches exactly the pairs whose slots are equal, which I should have supported five releases ago.
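The lowering for repeated variables is a one-pass rename; sketched in Python with invented names, nothing here is the Cure compiler:

```python
import itertools

# Sketch of the linearisation pass (names assumed): a pattern with repeated
# variables keeps the first occurrence and lowers later ones to fresh
# variables plus equality guards, so %[x, x] becomes %[x, x_1] when x == x_1.
def linearize(pattern_vars):
    seen, out, guards = set(), [], []
    fresh = itertools.count(1)
    for v in pattern_vars:
        if v in seen:
            fv = f"{v}_{next(fresh)}"
            out.append(fv)
            guards.append((v, "==", fv))   # later occurrence -> equality guard
        else:
            seen.add(v)
            out.append(v)
    return out, guards
```

The first occurrence binds; everything after it becomes a guard the match must satisfy, which is exactly what a reasonable person expected all along.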
The pin operator (^name) lands behind --experimental-pin as the official escape hatch for “compare against this already-bound value”:
let target = get_tag()

match event.tag
  ^target -> :hit
  _ -> :miss
Internally it lowers to a fresh variable plus a V_fresh =:= V_target guard—the same transformation one used to write by hand. Zero runtime cost, considerably fewer characters on screen. It is promoted to default in v0.19.0.
Exhaustiveness checking grows a second pass. The old flat classifier (:wildcard | :empty_list | :cons | {:literal, ...} | {:constructor, ...} | {:tuple, n}) is still there, because it is fast and covers the common case; a new Maranget-style column walker, Cure.Types.PatternChecker.check_nested/2, now descends into tuple scrutinees whose element types are enumerable and emits concrete, source-shaped witnesses for missing patterns:
Warning: match expression has nested non-exhaustive cases (E025)
  missing: %[Error(_), _]
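For columns over enumerable constructors, the idea can be shown with a brute-force toy. This enumerates every constructor combination rather than running Maranget's usefulness algorithm, and the names are mine, but it produces the same kind of concrete witness:

```python
from itertools import product

# Simplified sketch of nested exhaustiveness checking (not Maranget's full
# usefulness algorithm): for a tuple scrutinee whose columns range over
# known constructor sets, enumerate every combination and report the first
# one no pattern row covers. "_" is the wildcard.
def covers(pat, value):
    return all(p == "_" or p == v for p, v in zip(pat, value))

def missing(rows, column_ctors):
    for combo in product(*column_ctors):
        if not any(covers(row, combo) for row in rows):
            return combo          # a concrete, source-shaped witness
    return None

rows = [("Ok", "Ok"), ("Ok", "Error"), ("_", "Ok")]
witness = missing(rows, [["Ok", "Error"], ["Ok", "Error"]])
```

Here the witness is the uncovered pair of Error constructors, which the real checker would render in source shape as a missing %[Error(_), Error(_)] case.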
Five new error codes—E021 through E025—land in Cure.Compiler.Errors for unknown record fields in patterns, record field type mismatches, non-literal map-pattern keys, unbound pin variables, and non-exhaustive nested matches. cure explain E0xx works for every one.
There is a caveat worth naming explicitly. Map patterns that used to silently succeed against arbitrary subjects no longer do so. If your code relied on the old broken behaviour—for example, a map pattern whose key was not in the subject but which the compiler accepted anyway—you will now see either a runtime badmatch or a compile-time non-exhaustive warning. Fix the pattern. The compiler is right and you are wrong; I say this with the humility of the person who wrote the earlier compiler.
v0.19.0: the furniture arrives
After the pattern engine settled, a queue of previously-slated features was ready to land. v0.19.0 is called “Bring the Furniture” because every item on it is the kind of thing one should not have to talk about in a release announcement; one should merely discover, pleasantly, that it is there.
Propositions acquire a syntactic home. A proof container is a module-shaped block whose bindings must return Eq(...) or a refinement witness:
proof Std.Proof
  fn plus_zero(n: Int) -> Eq(Int, n, n) = :cure_refl
  fn append_nil(xs: List(T)) -> Eq(List(T), xs, xs) = :cure_refl
The compiler enforces the shape under E026. Runtime values are plain :cure_refl atoms; the checker does the interesting work; the resulting BEAM module is ordinary-looking code you can load alongside any other. assert_type expr : T arrives as a zero-cost compile-time assertion: the checker verifies the type, the codegen strips the wrapper, and mismatches surface as E027. fn doubled(n: Int) -> Int = assert_type n * 2 : Int does what it looks like.
Records gain field defaults, which is the sort of feature whose absence one does not forgive oneself:
rec Person
  name: String = "Anonymous"
  age: Int = 0
  active: Bool = true
Omitted fields fall back to the declared defaults at construction time; any caller-supplied value always wins. Type mismatches between the default and the declared field type are caught as E028. In the same spirit, @derive(Show, Eq, Ord) is finally wired end-to-end. The Cure.Types.Derive module had been sitting on disk since v0.12.0 without a codegen path; v0.19.0 plumbs it through so that decorating
@derive(Show, Eq, Ord)
rec Point
  x: Int
  y: Int
synthesises plain show/1, eq/2, and compare/2 exports. The accompanying example, examples/derived_show.cure, constructs two Point values, asks eq(p, q), and returns one if they compare equal—which, naturally, they do.
Property-based testing shows up via two cooperating stdlib modules. Std.Gen ships tiny stateless generators (int_in, bool, list_int, list_of_int, one_of, constant) backed by :rand. Std.Test.forall(gen, property, runs) runs the property against samples and returns :ok or raises :property_failed:
mod Laws
  use Std.Gen
  use Std.Test

  fn test_plus_zero() -> Atom =
    forall(fn(_) -> int_in(-100, 100), fn(n) -> n + 0 == n, 100)
Shrinking, histograms, and stateful generators are future work; this is deliberately the minimum viable property tester, and it is enough to catch the kind of bug one always catches with a property tester in the first week.
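The semantics of forall are small enough to restate in a few lines of Python—assumed behaviour, not the Cure stdlib:

```python
import random

# Minimal sketch of Std.Test.forall / Std.Gen.int_in (assumed behaviour,
# not the Cure stdlib): run the property against `runs` samples and raise
# on the first counterexample.
def int_in(lo, hi):
    return lambda: random.randint(lo, hi)

def forall(gen, prop, runs=100):
    for _ in range(runs):
        sample = gen()
        if not prop(sample):
            raise AssertionError(f"property_failed: {sample!r}")
    return "ok"
```

Usage mirrors the Cure example: `forall(int_in(-100, 100), lambda n: n + 0 == n, 100)` returns "ok", and a false property surfaces its first counterexample in the failure.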
Std.Iter is the matching minimal lazy iterator protocol. Constructors empty/0, from_list/1, and range/2; consumers fold/3, take/2, to_list/1. take/2 stops before materialising the tail, so unbounded ranges are safe as long as you only peek at a prefix:
use Std.Iter

fn sum_range(n: Int) -> Int =
  let it = range(1, n)
  let add = fn(x) -> fn(acc) -> acc + x
  fold(it, 0, add)
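Python generators give the same laziness for free, which makes the protocol easy to sketch—assumed semantics, with irange standing in for range/2 to avoid shadowing the Python builtin:

```python
from itertools import count, islice
from functools import reduce

# Sketch of Std.Iter's shape with Python generators (assumed semantics):
# take stops before materialising the tail, so an unbounded range is safe
# as long as only a prefix is demanded.
def irange(lo, hi=None):
    return count(lo) if hi is None else iter(range(lo, hi + 1))

def take(it, n):
    return islice(it, n)

def fold(it, acc, f):
    # f is curried, matching the fn(x) -> fn(acc) -> ... shape above.
    return reduce(lambda a, x: f(x)(a), it, acc)

def sum_range(n):
    add = lambda x: lambda acc: acc + x
    return fold(irange(1, n), 0, add)
```

`sum_range(10)` folds the finite prefix to 55, and `take(irange(1), 5)` peeks at five elements of an unbounded range without ever building its tail.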
The package-registry groundwork lands as a version parser and a dependency resolver. Cure.Project.Version handles SemVer plus compound constraints (~>, >=, <=, <, >, ==, combined with and), and accepts MAJOR.MINOR as shorthand for MAJOR.MINOR.0. Cure.Project.Resolver.resolve/2 is a deterministic backtracking resolver over a local registry that picks the newest compatible version and surfaces conflicts as E030. A remote index service, signing, and Hex.pm cross-publishing are what v0.20.0 is for.
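The constraint semantics and the backtracking loop fit in a short sketch—simplified shapes, no compound and constraints, and certainly not the real Cure.Project.Resolver:

```python
# Sketch of the constraint semantics and a tiny backtracking resolver
# (assumed shapes): versions are (major, minor, patch) tuples;
# "~> 1.2.3" allows >= 1.2.3 and < 1.3.0; "~> 1.2" allows < 2.0.0.
def parse(v):
    parts = [int(x) for x in v.split(".")]
    return tuple(parts + [0] * (3 - len(parts)))   # "1.2" -> (1, 2, 0)

def satisfies(version, constraint):
    op, rhs = constraint.split()
    v, r = parse(version), parse(rhs)
    if op == "~>":
        upper = (r[0], r[1] + 1, 0) if rhs.count(".") == 2 else (r[0] + 1, 0, 0)
        return r <= v < upper
    return {"==": v == r, ">=": v >= r, "<=": v <= r, ">": v > r, "<": v < r}[op]

def resolve(deps, registry, chosen=None):
    # deps: {name: constraint}; registry: {name: {version: {dep: constraint}}}
    chosen = dict(chosen or {})
    if not deps:
        return chosen
    name, constraint = next(iter(deps.items()))
    rest = {k: v for k, v in deps.items() if k != name}
    if name in chosen:
        return resolve(rest, registry, chosen) if satisfies(chosen[name], constraint) else None
    for version in sorted(registry[name], key=parse, reverse=True):  # newest first
        if satisfies(version, constraint):
            sub = resolve({**rest, **registry[name][version]}, registry,
                          {**chosen, name: version})
            if sub is not None:
                return sub
    return None   # conflict: surfaced as E030 in the real resolver
```

The newest-first ordering and the None-on-conflict return are the two properties the prose above promises; everything else here is scaffolding.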
Totality catches up with multi-function cycles. Cure.Types.Totality.check_mutual/1 runs Tarjan’s SCC algorithm on the module call graph, then classifies each non-trivial strongly-connected component as either :ok (structural decrease proven on at least one path) or :suspect (E029). It is not a full termination checker—nothing running at compile time on the BEAM is going to be—but it is strictly more than nothing, and it catches the obvious cases.
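The SCC step is textbook Tarjan. A compact Python version over a toy call graph—my illustration, not check_mutual/1:

```python
# Sketch of the mutual-recursion pass (assumed shapes, not
# Cure.Types.Totality.check_mutual/1): Tarjan's algorithm finds the
# strongly-connected components of the call graph; any component with
# more than one function is a recursion cycle to classify.
def sccs(graph):
    index, low, stack, on, out = {}, {}, [], set(), []
    counter = [0]
    def visit(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on.add(v)
        for w in graph.get(v, []):
            if w not in index:
                visit(w)
                low[v] = min(low[v], low[w])
            elif w in on:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:           # v is the root of a component
            comp = []
            while True:
                w = stack.pop(); on.discard(w); comp.append(w)
                if w == v:
                    break
            out.append(sorted(comp))
    for v in graph:
        if v not in index:
            visit(v)
    return out

calls = {"is_even": ["is_odd"], "is_odd": ["is_even"], "main": ["is_even"]}
cycles = [c for c in sccs(calls) if len(c) > 1]
```

The non-trivial component containing is_even and is_odd is exactly the kind the totality pass then inspects for a structural decrease on at least one path.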
One last syntactic nicety: multi-head cons patterns,
match xs
  [a, b, c | rest] -> a + b + c
  _ -> 0
which the parser desugars into right-associated cons cells. It works in pattern and construction position. It is the sort of feature that every functional language grows eventually; we might as well have grown ours here.
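The desugaring is a right fold over the heads; as a Python sketch with invented shapes:

```python
# Sketch of the multi-head desugaring (parser internals assumed): rewrite
# [a, b, c | rest] into right-associated cons cells,
# cons(a, cons(b, cons(c, rest))).
def desugar(heads, tail):
    node = tail
    for h in reversed(heads):
        node = ("cons", h, node)
    return node
```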
Five new error codes (E026 through E030) round out the error catalog for the release, all of them available via cure explain.
v0.19.1: Dialyzer and the small sins
The point release should never be the interesting one, and v0.19.1 honours that convention. Its job was to add Dialyzer to the CI matrix (.github/workflows/ci.yml), resolve the backlog of specs it turned up across Cure.Compiler, Cure.Compiler.Codegen, Cure.FSM.Compiler, Cure.Types.Env, and friends, and tidy a handful of things that were unclean but not broken: the mix cure.escript task got proper CLI integration, mix.exs lost a dead dependency, and the stdlib preloader started force-compiling in tests instead of hoping the beams from the last run would still be fresh. None of this is exciting. All of it needed to happen, because a language without Dialyzer in CI is a language whose maintainer is lying to themselves about how robust its internals are.
The small print, for anyone upgrading. The lexer rule for trailing ? on identifiers introduced in v0.17.0 means x?y now tokenises as x? followed by y; add whitespace or parentheses if you need the old behaviour. Map patterns that relied on the pre-v0.18.0 construction-form bug will now fail matches honestly. None versus None() inside a record pattern now emits a warning when you probably meant the nullary constructor. And Cure.lock lockfiles produced by v0.19.0 and v0.19.1 remain source-compatible with v0.17.0 and later, but a mix cure.compile_stdlib && mix cure.check after pulling is cheap insurance.
What all this adds up to
If you have read this far you may have noticed a pattern. Each of these releases took a subsystem that the earlier Cure treated as decorative and made it load-bearing. FSMs stopped being a transition graph with a separately-authored runtime and became a language primitive one can actually define in one place. The dependent-type core stopped nodding politely at Vector(T, m + n) and started checking it. The pattern engine stopped accepting whatever came past and started matching. The furniture release filled in the items one would notice the absence of but would not necessarily think to name: defaults, derive, property tests, a lazy iterator, a version parser, mutual-recursion totality. And the point release added the tool (Dialyzer) that would have prevented some of these embarrassments from landing in the first place.
There is still a long queue. Full bitstring pattern specifiers. Refinement narrowing through nested record and map patterns. The remote package-registry index service with its signing and Hex.pm cross-publishing. True dependent types in their full glory—fn len(l: Vector(_, n)) -> NonNegInt = n remains, for the moment, a thing I write in the v0.2x plan rather than in actual Cure. None of this will be done quickly. All of it will be done, if the last four releases are any indication, by ripping out the almost-works version and replacing it with one that does.
Clone it, build it, break it:
git clone https://github.com/am-kantox/cure-lang.git
cd cure-lang
mix deps.get && mix test
mix escript.build
./cure version
./cure run examples/destructuring.cure
./cure run examples/derived_show.cure
./cure run examples/lazy_iter.cure
The repository is at github.com/am-kantox/cure-lang, the site is at cure-lang.org, and the furniture, as of v0.19.1, is indoors.