First-Order Reasoning about Laziness
Speaker: Anton Lorenzen
Anton Lorenzen is pursuing a PhD in programming languages at the University of Edinburgh, where he is supervised by Sam Lindley and Daan Leijen. He has interned at the RiSE group of Microsoft Research and on the OCaml compiler team at Jane Street, where he worked on Koka and on modal uniqueness types, respectively. He holds a BSc in Mathematics and an MSc in Computer Science from the University of Bonn, Germany.
Abstract
Lazy data structures have important properties in theory, but their adoption in practice has been limited. Part of the issue seems to be the difficulty of reasoning about them: laziness is opaque, which makes it hard to understand when thunks are evaluated. This is true even in a strict language, where laziness is tightly controlled. Furthermore, the amortized complexity analysis of lazy data structures typically uses debits instead of the more familiar credits. In this talk, I will present an alternative, first-order implementation of laziness that avoids these issues. This implementation allows us to defunctionalise lazy thunks and thus inspect what a thunk computes and when it is evaluated. Furthermore, I'll present credit passing style, a simplified, credit-based version of Okasaki's debit passing style. Together, these techniques yield a reasoning style that mirrors the concrete, credit-based analysis usually done for strict data structures. This talk is based on the papers [https://dl.acm.org/doi/10.1145/3747530](https://dl.acm.org/doi/10.1145/3747530) and [https://dl.acm.org/doi/10.1145/3759164.3759351](https://dl.acm.org/doi/10.1145/3759164.3759351).
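To give a flavour of the idea, here is a minimal, hypothetical sketch (not the code from the papers) of what a defunctionalised thunk can look like: instead of an opaque closure, a suspended computation is an ordinary first-order data constructor, so we can pattern-match on it to see what it will compute, and observe exactly when it gets forced.

```rust
// Hypothetical sketch of a defunctionalised thunk: each pending
// computation is a named constructor rather than an opaque closure.
#[derive(Debug)]
enum Thunk {
    Done(i64),                   // already forced, memoized result
    Add(Box<Thunk>, Box<Thunk>), // pending addition of two sub-thunks
}

// Forcing pattern-matches on the constructor, evaluates it, and
// overwrites the thunk in place with its memoized result.
fn force(t: &mut Thunk) -> i64 {
    match t {
        Thunk::Done(n) => *n,
        Thunk::Add(a, b) => {
            let n = force(a) + force(b);
            *t = Thunk::Done(n); // memoize: later forces are free
            n
        }
    }
}

fn main() {
    let mut t = Thunk::Add(
        Box::new(Thunk::Done(1)),
        Box::new(Thunk::Add(Box::new(Thunk::Done(2)), Box::new(Thunk::Done(3)))),
    );
    // Before forcing, the suspension is transparent data we can inspect.
    println!("{:?}", t); // Add(Done(1), Add(Done(2), Done(3)))
    println!("{}", force(&mut t)); // 6
    println!("{:?}", t); // Done(6)
}
```

Because the thunk is plain data, the cost of forcing each constructor is visible in the program text, which is what makes a concrete, credit-based accounting possible.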