Most functional compiler
One-letter variable names abound in IOCCC entries, and for good reason: these tiny pieces of confetti are hard to read and leave room for more code. So why not go further and use zero-letter variable names? That is, tacit programming, or point-free style.
I had been playing with an algorithm devised by Oleg Kiselyov that effortlessly and efficiently eliminates those pesky variables, leaving behind terms composed from a small set of combinators. No need for lambda lifting or supercombinators.
By adding a handful of lines of mostly parsing code, we get a Haskell compiler, or rather, a compiler that accepts a subset of Haskell sufficiently large to self-host. You might say I wrote a tool for this contest, then ran it on itself to make an entry for it.
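Kiselyov's algorithm is more refined than this, but the classic naive bracket abstraction it improves on shows the idea of eliminating a variable in favor of the S, K, and I combinators. A minimal sketch (in Python rather than Haskell, purely for illustration):

```python
# Naive bracket abstraction: eliminate variable v from term t, producing an
# equivalent term built only from S, K, I, free variables, and application.
# Terms: a str is a variable or combinator; a 2-tuple (f, x) is application.
def absvar(v, t):
    if t == v:
        return "I"                                   # [v]v      = I
    if isinstance(t, tuple):
        f, x = t
        return (("S", absvar(v, f)), absvar(v, x))   # [v](f x)  = S [v]f [v]x
    return ("K", t)                                  # v not free: [v]t = K t

# Tiny interpreter, to check the translation is meaning-preserving.
def interp(t, env):
    if t == "S":
        return lambda f: lambda g: lambda x: f(x)(g(x))
    if t == "K":
        return lambda x: lambda _: x
    if t == "I":
        return lambda x: x
    if isinstance(t, tuple):
        return interp(t[0], env)(interp(t[1], env))
    return env[t]

# lambda x. f (g x)  becomes  S (K f) (S (K g) I)
term = absvar("x", ("f", ("g", "x")))
env = {"f": lambda n: n + 1, "g": lambda n: n * 2}
print(interp(term, env)(3))  # f(g(3)) = 7
```

The naive rules blow up term size quadratically; avoiding that blowup without lambda lifting is exactly what makes Kiselyov's algorithm interesting.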
And more: https://www.ioccc.org/years.html#2019
Open Season on Hylomorphisms
This piece of code is probably unreadable to a regular C++ programmer, but makes perfect sense to a Haskell programmer.
Here’s the description of the problem: You are given a list of equal-length strings. Every string is different, but two of these strings differ only by one character. Find these two strings and return their matching part. For instance, if the two strings were “abcd” and “abxd”, you would return “abd”.
What makes this problem particularly interesting is its potential application to a much more practical task: matching strands of DNA while looking for mutations. I decided to explore the problem a little beyond the brute-force approach. And, of course, I had a hunch that I might encounter my favorite wild beast: the hylomorphism.
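The brute-force baseline the post goes beyond is simple enough to state in a few lines; here it is sketched in Python (the post itself works in Haskell):

```python
def find_match(strings):
    """Brute force, O(n^2 * m): compare every pair of equal-length strings;
    return the shared characters of the pair differing in exactly one
    position, or None if no such pair exists."""
    for i, a in enumerate(strings):
        for b in strings[i + 1:]:
            diffs = [k for k in range(len(a)) if a[k] != b[k]]
            if len(diffs) == 1:
                k = diffs[0]
                return a[:k] + a[k + 1:]
    return None

print(find_match(["abcd", "efgh", "abxd"]))  # "abd"
```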
Running from the past
Functional programming encourages us to program without mutable state. Instead we compose functions that can be viewed as state transformers. It's a change of perspective that can have a big impact on how we reason about our code. But it's also a change of perspective that can be useful in mathematics, and I'd like to give an example: a really beautiful technique that allows you to sample from the infinite limit of a probability distribution without needing an infinite number of operations.
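The technique in question is coupling from the past (CFTP). A minimal sketch in Python; the particular chain (a random walk with reflecting boundaries) and all parameters here are my own illustration, not the post's:

```python
import random

def step(state, u, n=5):
    """One move of a random walk on {0, ..., n-1} with reflecting
    boundaries, driven by a uniform random number u."""
    return max(state - 1, 0) if u < 0.5 else min(state + 1, n - 1)

def cftp(n=5, seed=42):
    """Coupling from the past: run a copy of the chain from every state,
    starting ever further in the past but reusing the SAME random number
    for each time step. Once all copies coalesce by time 0, the common
    value is an exact sample from the stationary distribution."""
    rng = random.Random(seed)
    noise = []          # noise[t] drives the step at time -(t+1)
    T = 1
    while True:
        while len(noise) < T:
            noise.append(rng.random())
        states = list(range(n))
        for t in reversed(range(T)):   # from time -T forward to time 0
            states = [step(s, noise[t], n) for s in states]
        if len(set(states)) == 1:
            return states[0]           # all copies agree: exact sample
        T *= 2                         # look further into the past

print(cftp())
```

The key functional-programming move is that the random update at each time step is a fixed state transformer; going further into the past just means composing more of the same transformers in front, never re-randomizing the ones already drawn.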
Free Monoid from Free Algebra
In my previous blog post I used, without proof, the fact that the initial algebra of the functor I + h \otimes - is a free monoid. The proof of this statement is not at all trivial and, frankly, I would never have been able to work it out by myself.
I worked my way through this proof, filling some of the steps that might have been obvious to a mathematician, but not to an outsider. I even learned how to draw diagrams using the TikZ package for LaTeX.
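Concretely, the initial algebra of I + h ⊗ - has lists of h as its carrier, and the unique monoid homomorphism out of the free monoid is a fold. A small sketch in Python (names and examples are mine, not the post's):

```python
# Lists are the carrier of the initial algebra of F(x) = 1 + h * x.
def cata(nil, cons, xs):
    """The catamorphism (fold): the unique F-algebra morphism from the
    initial algebra (lists) to the algebra given by (nil, cons)."""
    acc = nil
    for x in reversed(xs):
        acc = cons(x, acc)
    return acc

def free_monoid_hom(e, op, f):
    """The unique monoid homomorphism from the free monoid on h to a
    monoid (M, e, op), induced by a function f : h -> M."""
    return lambda xs: cata(e, lambda x, m: op(f(x), m), xs)

# Example: length is the homomorphism to (int, 0, +) induced by f = const 1.
length = free_monoid_hom(0, lambda a, b: a + b, lambda _: 1)
print(length([7, 8, 9]))  # 3
```

The nontrivial part of the proof is exactly the uniqueness of this homomorphism, starting only from initiality of the algebra.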
Part two: https://bartoszmilewski.com/2018/07/30/free-monoid-from-free-algebra-part-2/
Linear Haskell: Practical linearity in a higher-order polymorphic language
The paper is focused on two main use cases for linear types. The one that really catches my imagination is encoding the usage protocol of a resource (e.g., you can't read from a file before it is opened) in types. "Here, linearity does not affect efficiency, but rather eliminates many bugs." We'll return to some examples of this later in this post. It seems like a very nice and natural fit for protocols in distributed systems (a bit like we looked at with Disel yesterday).
The second use case is safe in-place update, and the reason we're interested in that is efficiency: if we know it's safe to update a value in place, we don't need to worry about making copies or about mutual-exclusion mechanisms.
Stalking a Hylomorphism in the Wild
Trying to improve my Haskell coding skills, I decided to test myself at solving the 2017 Advent of Code problems. It’s been a lot of fun and a great learning experience. One problem in particular stood out for me because, for the first time, it let me apply, in anger, the ideas I learned from category theory. But I’m not going to talk about category theory this time, just about programming.
Although saying “just programming” may be a bit modest.
Writing a Formally-Verified Porn Browser in Coq and Haskell
Hopefully this example shows that there’s nothing really stopping anyone from using Coq in their Haskell programs today.
Writing a SAT Solver
In this post, we’ll look at how to teach computers to solve puzzles. Specifically, we’ll look at a simple puzzle that can be expressed as a boolean constraint satisfaction problem, and we’ll write a simple constraint solver (a SAT solver) and mention how our algorithm, when augmented with a few optimizations, is used in modern SAT solvers.
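The core of such a solver is the classic DPLL procedure: unit propagation plus backtracking search. A self-contained sketch in Python (the post's own solver and puzzle encoding will differ):

```python
def assign(clauses, lit):
    """Make literal lit true: drop satisfied clauses, strip -lit from the
    rest. Returns None if an empty clause (a conflict) appears."""
    out = []
    for c in clauses:
        if lit in c:
            continue
        if -lit in c:
            c = c - {-lit}
            if not c:
                return None
        out.append(c)
    return out

def dpll(clauses, model=()):
    """DPLL: clauses are frozensets of nonzero ints, negative = negated.
    Returns a tuple of literals satisfying every clause, or None."""
    while True:                                   # unit propagation
        unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
        if unit is None:
            break
        clauses = assign(clauses, unit)
        if clauses is None:
            return None
        model += (unit,)
    if not clauses:
        return model                              # everything satisfied
    lit = next(iter(clauses[0]))                  # branch on some literal
    for choice in (lit, -lit):
        sub = assign(clauses, choice)
        if sub is not None:
            result = dpll(sub, model + (choice,))
            if result is not None:
                return result
    return None

# (x1 or x2) and (not x1 or x3) and (not x3)
cnf = [frozenset({1, 2}), frozenset({-1, 3}), frozenset({-3})]
print(dpll(cnf))  # (-3, -1, 2)
```

Modern solvers add conflict-driven clause learning, watched literals, and restart heuristics on top of this skeleton, but the basic shape is the same.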
Escaping Hell with Monads
There may be other solutions it doesn't mention, but it's a decent comparative demonstration.
Linear types make performance more predictable
Linear logic has two of everything: two ways of building conjunctions and disjunctions, two notions of truth and falsehood, and two ways to negate. It's a strange system, but perhaps no more so than the zoo of cute names and symbols Girard conferred on every construct. For the purposes of this post, we'll only need one new symbol from this zoo: ⊸, which reads "lollipop".
Unlike monads, which came into programming straight from category theory, applicative functors have their origins in programming.
But are otherwise equally nonsensical...
Don’t use the bad functions.
Concurrency and Node
GHC optimization and fusion
Rewrite critical bits in C.
^ Not actually a fair summary.
Monads: Programmer's Definition
Programmers have developed a whole mythology around monads.
What they do vs what they are.
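The programmer's definition boils down to a type constructor with unit (return) and bind satisfying the monad laws. A minimal illustration of what they *do*, using Maybe encoded with plain tuples (Python here for brevity; the post itself is about Haskell and C++):

```python
# Maybe monad: a computation that may fail, with failure propagating
# automatically through bind.
NOTHING = ("Nothing",)

def unit(x):
    """Wrap a plain value (Haskell's return / pure)."""
    return ("Just", x)

def bind(m, f):
    """Apply f to the wrapped value, short-circuiting on Nothing."""
    return f(m[1]) if m[0] == "Just" else NOTHING

def safe_div(x, y):
    return unit(x / y) if y != 0 else NOTHING

# Chain fallible steps without writing a single explicit error check.
print(bind(unit(10), lambda a: safe_div(a, 4)))        # ('Just', 2.5)
print(bind(safe_div(1, 0), lambda a: safe_div(a, 4)))  # ('Nothing',)
```

What they *are* (monoids in a category of endofunctors) is the part the mythology grows around; the above is all a programmer needs to use them.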
Haskell's Missing Concurrency Basics
The midnight Monad
Functor, Applicative, Monad, Enlightenment. It’s all about the fame.