Rust’s safety story is usually told through memory safety. But when spacecraft fail, the cause is almost never a buffer overflow. It’s ambiguity: mismatched assumptions between components, contracts that lived in documentation rather than in code, state machines that allowed transitions nobody intended.
Safety-critical engineering has spent decades learning how to prevent exactly these kinds of failures. Some of those lessons map well onto Rust. Others don’t: they require discipline and design that no language enforces by default.
This talk explores that gap through real spacecraft failures and the engineering practices that emerged from them, connecting contract-based thinking, state-transition invariants, and system boundary correctness to practical Rust patterns. Sometimes Rust helps. Sometimes it can’t. Knowing the difference matters.
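As a taste of the kind of pattern the talk covers, state-transition invariants can be pushed into Rust’s type system via the typestate pattern, so that an unintended transition fails to compile rather than failing in flight. The following is a minimal sketch; the thruster states and transitions are hypothetical illustrations, not taken from any real flight software.

```rust
use std::marker::PhantomData;

// Marker types: each legal state gets its own type.
struct Safed;   // spacecraft in safe mode
struct Armed;   // thruster armed
struct Burning; // burn in progress

// The state lives in the type parameter, not in a runtime field.
struct Thruster<State> {
    fuel_kg: f64,
    _state: PhantomData<State>,
}

impl Thruster<Safed> {
    fn new(fuel_kg: f64) -> Self {
        Thruster { fuel_kg, _state: PhantomData }
    }
    // Arming is only possible from safe mode; the compiler enforces it.
    fn arm(self) -> Thruster<Armed> {
        Thruster { fuel_kg: self.fuel_kg, _state: PhantomData }
    }
}

impl Thruster<Armed> {
    // Ignition is only possible once armed.
    fn ignite(self) -> Thruster<Burning> {
        Thruster { fuel_kg: self.fuel_kg, _state: PhantomData }
    }
}

impl Thruster<Burning> {
    fn shutdown(self) -> Thruster<Safed> {
        Thruster { fuel_kg: self.fuel_kg, _state: PhantomData }
    }
}

fn main() {
    let t = Thruster::<Safed>::new(10.0);
    // Safed -> Armed -> Burning -> Safed is the only path the types allow.
    let t = t.arm().ignite().shutdown();
    assert!(t.fuel_kg > 0.0);
    // t.ignite(); // would not compile: ignite() exists only on Thruster<Armed>
}
```

Because each transition consumes `self` and returns a differently typed value, a stale handle to the old state cannot be reused; the “transitions nobody intended” simply have no method to call.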