I found this paper especially illuminating if you have seen too many debates about reference counting versus so-called "garbage collection". It provides a unified view in which reference counting and tracing are duals of each other, and efforts to improve either one drive it to approximate the other. Here are some interesting points:
For example, the cyclic reference graph [a->b, b->a] (with no external references) has two fixed points for the reference counts: [rc(a)=0, rc(b)=0] and [rc(a)=1, rc(b)=1]. Tracing gradually approaches the former, while RC approaches the latter. Unfortunately, in this case the former, the least fixed point, is what we actually want to reach (i.e. both objects are dead and the cycle should be collected).
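The two fixed points can be seen with a toy iteration. This is only a sketch of the idea, not anything from the paper: the two-object encoding and the `step`/`fixpoint` names are mine.

```python
# Reference counts for the cycle a -> b, b -> a, with no external
# references. rc(x) = number of *live* objects pointing at x, and an
# object is live iff its count is positive, so a consistent assignment
# must be a fixed point of the recomputation below.

def step(rc):
    live = {k: 1 if v > 0 else 0 for k, v in rc.items()}
    return {"a": live["b"], "b": live["a"]}  # a is pointed at by b, and vice versa

def fixpoint(rc):
    while (nxt := step(rc)) != rc:
        rc = nxt
    return rc

# Tracing starts from "nothing proven live" and grows the live set:
print(fixpoint({"a": 0, "b": 0}))  # least fixed point: both dead
# Naive RC starts from the current counts and only decrements:
print(fixpoint({"a": 1, "b": 1}))  # greatest fixed point: cycle leaks
```

Both assignments are genuine fixed points of `step`; which one a collector converges to depends on whether it starts from below (tracing) or from above (RC), which is exactly the duality the paper formalizes.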
I'm happy to see another unification, this time in the intermediate representations (IRs) used by compilers. The imperative-language camp has long used IRs in SSA form, while in the functional-language camp, CPS-based IRs remain the folklore choice. Though proven equivalent to SSA form, CPS IRs are still notoriously hard for humans to read. In this paper, I'm pleased to see GHC, the de facto standard Haskell compiler, use a direct-style IR that extends a lambda calculus in A-normal form with join points, which gives a closer correspondence between the two kinds of IR. Basically, join points can be regarded as the counterparts of basic blocks (with parameters) in SSA form, and functions are just functions. Of course, due to the two camps' different attitudes towards higher-order functions, data types, etc., the concrete compilation processes and their focuses still differ considerably.
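The basic-block analogy can be sketched even in plain Python. This is an illustration of the idea, not GHC output; the function `classify` and the join point `j` are hypothetical names of mine.

```python
# A join point is a local, non-escaping function that every branch
# tail-calls; in SSA terms it is a basic block with parameters, and
# the tail calls are the branch instructions carrying arguments.

def classify(x: int) -> str:
    def j(n: int) -> str:    # the "join point": shared continuation of both branches
        return f"result: {n}"
    if x > 0:
        return j(x * 2)      # analogous to "br j(x*2)" in SSA
    else:
        return j(x - 1)      # analogous to "br j(x-1)"

print(classify(3))  # both branches jump into j with their own argument
```

Because `j` never escapes and is only ever tail-called, a compiler can turn each call into a plain jump instead of allocating a closure, which is exactly what makes join points the direct-style counterpart of basic blocks.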