Saifuddin Syed

Department of Statistics

University of Oxford

Biography

I am a Florence Nightingale Bicentennial Fellow in computational statistics and machine learning at the University of Oxford’s Department of Statistics, where I am also a member of the Algorithms and Inference Working Group for the Next Generation Event Horizon Telescope (ngEHT). Prior to this role, I completed a postdoc supervised by Arnaud Doucet and a PhD supervised by Alexandre Bouchard-Côté.

My research involves developing robust and scalable algorithms for statistical inference and generative modelling, with scientific applications in mind. It has received numerous national and international awards, including the Pierre Robillard Award, the Cecil Graham Doctoral Dissertation Award, and an honourable mention for the Savage Award for Bayesian theory and methods.

If you have a cool, computationally challenging problem, reach out!

Interests
  • Parallel tempering
  • Annealing algorithms
  • Scalable Bayesian inference
  • Generative modelling
  • AI for Science
Education
  • PhD in Statistics, 2022

    University of British Columbia

  • MSc in Mathematics, 2016

    University of British Columbia

  • BMath in Pure & Applied Mathematics, 2014

    University of Waterloo

Research

Many state-of-the-art algorithms in statistics and machine learning rely on a technique called annealing, which tackles an intractable target problem by incrementally deforming solutions from a tractable reference problem. I am interested in using annealing as a tool to understand the interplay between MCMC, SMC, variational inference, diffusion models, normalizing flows, and optimal transport.
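
As a concrete example (the notation here is mine, not taken from any particular paper): given a tractable reference density π_0 and an intractable target π_1, one common annealing path interpolates between them geometrically,

    \pi_\beta(x) \propto \pi_0(x)^{1-\beta}\, \pi_1(x)^{\beta}, \qquad \beta \in [0, 1],

so that β = 0 recovers the reference and β = 1 the target. Parallel tempering, SMC samplers, and simulated annealing can all be viewed as different ways of traversing such a path.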

(2023). autoMALA: Locally adaptive Metropolis-adjusted Langevin algorithm. arXiv preprint.

(2023). Pigeons.jl: Distributed sampling from intractable distributions. arXiv preprint.

(2023). Local Exchangeability. Bernoulli.

(2023). A Unified Framework for U-Net Design and Analysis. arXiv preprint.

(2022). Parallel tempering with a variational reference. Conference on Neural Information Processing Systems.

(2021). Non-reversible parallel tempering: a scalable highly parallel MCMC scheme. Journal of the Royal Statistical Society (Series B).

(2021). Parallel tempering on optimized paths. International Conference on Machine Learning.

Notable Applications

Here are some recent examples of large-scale projects that used non-reversible parallel tempering (NRPT) as the backbone of their inference engine. Please reach out if you would like to use NRPT for your projects and have any questions. If you want to experiment with NRPT, check out our Julia package Pigeons.jl.
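
For a flavour of what a run looks like, here is a minimal sketch based on the Pigeons.jl quickstart; treat toy_mvn_target and the keyword names as assumptions that may differ from the current API, and substitute your own log-density target in practice.

    using Pigeons  # assumes Pigeons.jl is installed: ] add Pigeons

    # Non-reversible parallel tempering on a toy 100-dimensional Gaussian target.
    # Swap the toy target for your own model's log-density in real applications.
    pt = pigeons(
        target   = toy_mvn_target(100),  # assumed helper from the quickstart
        n_chains = 10,                   # number of tempered chains (assumed keyword)
        n_rounds = 10                    # adaptation rounds (assumed keyword)
    )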

Contact

  • saifuddin.syed@stats.ox.ac.uk
  • +44 7467 304999
  • 24-29 St Giles, Oxford OX1 3LB, UK
  • Office 1.18