A Whirlwind Tour of Computer Science

I am very lucky to do my research somewhere between a compiler group and a machine learning group.

Since most of the machine learning researchers I come across are physicists, mathematicians, or engineers, many of them are perplexed by the idea of a whole degree about programming, to which I eagerly quote Michael Fellows:


Computer science is no more about computers than astronomy is about telescopes, biology is about microscopes, or chemistry is about beakers and test tubes.

Despite how fun this quote is, it’s often quite hard to pinpoint what really separates computer science from anything else. One could argue we are closer to discrete mathematicians, but doesn’t that just make us bad mathematicians?

The answer that I’ve settled on is that there is something inexplicably fun about studying computer science. And isn’t that what it’s all about anyway?

So, I present my list of resources: a whirlwind tour of the wonders of computer science.

Full Courses


  • Computation Structures
    If I had to recommend one tour of computer science, it would be this one. It covers everything from basic circuit design to virtual memory and parallel programming, and yet somehow still manages to straddle the line between computer engineering and theoretical computer science.

  • Structure and Interpretation of Computer Programs
    An indisputable classic. It comes with lecture videos, a book, exercises, and more! You might bemoan the use of Scheme, to which I respond with a pointer to the above quote.

  • Programming Paradigms (CS107)
    This heavily overlaps with Computation Structures (see above) but is perhaps a little more applied, though equal in beauty.

Books


  • Gödel, Escher, Bach: An Eternal Golden Braid
    To me, GEB is in some ways the bedtime story version of Structure and Interpretation of Computer Programs. Every chapter of this book will delight and surprise you.

  • Write You A Haskell
    Have you ever wondered how to make your own programming language? It turns out that the process of doing so requires a deep understanding of some of the hardest and most intricate problems in computer science. Compiler technology lends itself well to the use of neatly composed trees, graphs, search algorithms, and more.
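To give a flavour of those neatly composed trees, here is a toy sketch of the core of any interpreter: an abstract syntax tree plus a recursive evaluator. (The names and the language are my own invention for illustration; the book itself works in Haskell and goes much further.)

```python
from dataclasses import dataclass
from typing import Union

# A tiny expression language: numbers, addition, and multiplication.
Expr = Union["Num", "Add", "Mul"]

@dataclass
class Num:
    value: int

@dataclass
class Add:
    left: Expr
    right: Expr

@dataclass
class Mul:
    left: Expr
    right: Expr

def evaluate(e: Expr) -> int:
    # Interpret the tree by recursing over its structure.
    if isinstance(e, Num):
        return e.value
    if isinstance(e, Add):
        return evaluate(e.left) + evaluate(e.right)
    if isinstance(e, Mul):
        return evaluate(e.left) * evaluate(e.right)
    raise TypeError(f"unknown node: {e!r}")

# (1 + 2) * 4
tree = Mul(Add(Num(1), Num(2)), Num(4))
print(evaluate(tree))  # 12
```

Parsing text into such a tree, type-checking it, and compiling it to machine code are exactly the "hard and intricate" parts the book walks you through.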

Projects to be aware of


I have a grand plan to write a blog post about all of these, but for now I will settle for a list:

  • TVM
    TVM is a high-performance code generator designed specifically for machine learning workloads. Basically, if you need something slightly unconventional to run fast, TVM is your friend. It plugs neatly into other frameworks (via ONNX or TorchScript IR), so you can mostly use it directly from your deep learning framework of choice.

  • JAX
    JAX is NumPy + autograd + XLA, which means you can write your code in almost raw Python, and it will be both differentiable and fast on most hardware.
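A minimal sketch of what that combination buys you: an ordinary NumPy-style function becomes differentiable via jax.grad and XLA-compiled via jax.jit (the function and data here are just a made-up example):

```python
import jax
import jax.numpy as jnp

# A plain function written against the NumPy-style API.
def loss(w):
    return jnp.sum(w ** 2)

grad_loss = jax.grad(loss)      # automatic differentiation
fast_grad = jax.jit(grad_loss)  # XLA compilation of the gradient

w = jnp.array([1.0, 2.0, 3.0])
print(fast_grad(w))  # [2. 4. 6.]
```

Because grad and jit are just function transformations, they compose: you can jit a grad of a vmap, and so on.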

  • TorchScript
    TorchScript is orthogonal to JAX; it acts as an IR that can lower PyTorch programs to whatever low-level code you need generated (e.g. CUDA or C++). This makes it really easy to optimise and deploy your models across various devices.
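A small sketch of the entry point: torch.jit.script compiles a type-annotated Python function into TorchScript, and the resulting IR is inspectable (the function here is my own toy example):

```python
import torch

# torch.jit.script compiles this annotated Python function to TorchScript.
@torch.jit.script
def relu_sum(x: torch.Tensor) -> torch.Tensor:
    return torch.relu(x).sum()

# The compiled function still runs like a normal one...
print(relu_sum(torch.tensor([-1.0, 2.0, -3.0])))

# ...but now carries an IR that optimisation passes and backends can consume.
print(relu_sum.graph)
```

That graph is the handoff point: downstream tools (including TVM, via the TorchScript IR route mentioned above) can take it and generate device-specific code.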

  • MLIR
    Really, the problem that these tools are trying to address is the complexity of the lowering step from high-level Python descriptions to optimised, low-level machine code. Each step of the lowering process can carry domain-specific information that might aid optimisation. MLIR is compiler infrastructure built with this in mind; it's quite young, but very exciting.