Most of my early work revolved around numerical methods for scientific computing, where libraries and packages are very useful. For my first research projects I used Julia, which gave me a taste for functional programming, and I was an avid user of QuantumOptics.jl for some time.
Then I moved into machine learning, for which I first used Flux.jl, which is great but lacks support for many modern tools.
I therefore moved over to Python and used NetKet, a great JAX-based library spearheaded by F. Vicentini, a former labmate.
After that, at Normal, I worked on a few nice libraries for thermodynamic computing, and more recently I worked on Thunder, Lightning's PyTorch compiler.
Open-source contributions
2025
- Thunder
Thomas Viehmann et al.
Thunder is a PyTorch compiler that aims to simplify the model optimization workflow. If you've ever tried to optimize a model in PyTorch, you generally start with `torch.compile`, and that slowly turns into tweaking knobs for things you don't necessarily understand unless you're an expert. For Thunder, the great Thomas Viehmann wrote an interpreter in Python, meaning that all traces are Python code and very easy to inspect. You also get to see how the traces are modified step by step as the optimizations are applied.
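To give a flavor of the idea (this is a toy sketch, not Thunder's actual implementation), here is how recording every operation on traced values yields a trace that is itself readable Python source:

```python
# Toy sketch of trace-based interpretation: every operation applied to a
# traced value is recorded as a line of Python, so the captured program
# can be inspected (and transformed) as plain source code.
class TraceRecorder:
    """Accumulates the recorded trace, one Python statement per line."""

    def __init__(self):
        self.lines = []
        self.counter = 0

    def fresh(self):
        self.counter += 1
        return f"t{self.counter}"


class Traced:
    """A value that records the operations applied to it."""

    def __init__(self, rec, name, value):
        self.rec, self.name, self.value = rec, name, value

    def _binop(self, op, sym, other):
        other_v = other.value if isinstance(other, Traced) else other
        other_n = other.name if isinstance(other, Traced) else repr(other)
        out = Traced(self.rec, self.rec.fresh(), op(self.value, other_v))
        self.rec.lines.append(f"{out.name} = {self.name} {sym} {other_n}")
        return out

    def __add__(self, other):
        return self._binop(lambda a, b: a + b, "+", other)

    def __mul__(self, other):
        return self._binop(lambda a, b: a * b, "*", other)


def trace(fn, *args):
    """Run `fn` on traced inputs, returning its value and the recorded source."""
    rec = TraceRecorder()
    targs = [Traced(rec, f"x{i}", a) for i, a in enumerate(args)]
    out = fn(*targs)
    rec.lines.append(f"return {out.name}")
    return out.value, "\n".join(rec.lines)


value, source = trace(lambda a, b: a * b + a, 3.0, 4.0)
# `source` is plain Python: "t1 = x0 * x1", "t2 = t1 + x0", "return t2".
```

Because the trace is ordinary Python text, each optimization pass can simply rewrite it and hand you the before/after source to compare.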
2024
- posteriors
S. Duffield, Me, J. Chiu, P. Klett, and D. Simpson
posteriors is the *go-to library* for uncertainty quantification of LLMs. It is functional, swappable, and transformers-compatible, and it contains the important basic UQ methods, such as Laplace approximations, Monte Carlo methods, and variational inference. I also contributed an efficient conjugate gradient solver using Fisher-vector products in PyTorch, which doesn't seem to exist anywhere else.
Check out the blog post
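The point of pairing conjugate gradients with Fisher-vector products is that the solver only ever needs `A @ v`, never the matrix `A` itself. A minimal matrix-free sketch (in NumPy for brevity; function names here are illustrative, not posteriors' API):

```python
# Matrix-free conjugate gradients: solve A x = b for a symmetric
# positive-definite A accessed only through the product mvp(v) = A @ v,
# exactly the access pattern a Fisher-vector product provides.
import numpy as np


def conjugate_gradient(mvp, b, tol=1e-8, max_iters=100):
    x = np.zeros_like(b)
    r = b - mvp(x)   # residual
    p = r.copy()     # search direction
    rs = r @ r
    for _ in range(max_iters):
        Ap = mvp(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x


# Usage: the SPD matrix is only touched through products.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(lambda v: A @ v, b)
```

In the Fisher case, `mvp` would be a Fisher-vector product computed via autodiff, so the (parameters × parameters) Fisher matrix is never materialized.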
- thermox
S. Duffield & Me
thermox is the *best OU process simulator*: it is exact, GPU-compatible, and uses associative scans. We use it internally to simulate thermodynamic hardware, but it can be useful for many other things (contributions from finance people welcome!).
Check out the blog post
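"Exact" here means free of discretization error: the OU transition density is Gaussian with known mean and variance, so each step can be sampled exactly for any step size. A hedged one-dimensional sketch in NumPy (thermox itself is JAX-based and parallelizes this recursion with associative scans; the sequential loop below is just the illustrative equivalent):

```python
# Exact simulation of the OU process dx = -theta (x - mu) dt + sigma dW.
# No Euler error: each step samples the exact Gaussian transition.
import numpy as np


def simulate_ou(x0, theta, mu, sigma, dt, n_steps, rng):
    decay = np.exp(-theta * dt)
    # Standard deviation of the exact transition over a step of length dt.
    noise_scale = sigma * np.sqrt((1.0 - decay**2) / (2.0 * theta))
    xs = np.empty(n_steps + 1)
    xs[0] = x0
    for i in range(n_steps):
        xs[i + 1] = mu + (xs[i] - mu) * decay + noise_scale * rng.standard_normal()
    return xs


rng = np.random.default_rng(0)
path = simulate_ou(x0=5.0, theta=1.0, mu=0.0, sigma=0.0, dt=0.5, n_steps=4, rng=rng)
# With sigma = 0 the path decays deterministically: x_i = 5 * exp(-0.5 * i).
```

Because the update is an affine recursion `x_{i+1} = a x_i + b_i`, it is associative, which is what lets thermox replace this loop with a parallel scan on GPU.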