July, 2024. An updated version of Gemma, an open-weights LLM with fantastic performance, was released; see here for a blog post describing this work and here for a more in-depth paper. I couldn’t be more pleased to have been involved in this work.
September, 2022. We have just released a second version of our Acme paper, which is a significant rewrite that includes many more algorithms and an additional focus on batch/offline algorithms. We also give a more in-depth description of the distributed backbone of Acme. And of course we have open-sourced all of this work here.
April, 2021. We have just released Launchpad, a system for defining and launching distributed programs, particularly tuned towards machine learning applications. This forms part of the backbone we use for the distributed variants of the RL algorithms in Acme.
June, 2020. Along with some great colleagues at DeepMind, we’re releasing Acme, an RL framework that we’ve been working on and using for our own research for quite some time. You can check it out here or take a look at our whitepaper!
January, 2019. I have finally gotten around to moving and updating my website. At the moment much of the content here is incredibly out-of-date, but it’s only a matter of time before the rest gets updated! (Thanks to Yannis for forcing me to do this!)
August, 2016. Our paper on Learning to learn by gradient descent by gradient descent was accepted at NIPS 2016. See you in Barcelona!
January, 2016. I have accepted a position as a research scientist at Google DeepMind and am excited to join this coming March!
December, 2015. I spoke at the NIPS workshop on Bayesian optimization and presented recent work on output-space PES.
September, 2015. I spoke at the Gaussian process summer school’s workshop on global optimization; at the site you can find videos for each talk presented. We also released an updated version of pybo, our code for modular Bayesian optimization.
July, 2015. Our paper on PES with constraints was presented at ICML 2015. I also spoke at the ICML workshop on Automatic Machine Learning.
December, 2014. Along with several colleagues, I presented papers at the BayesOpt workshop on modular Bayesian optimization, a shortened version of our PES paper, PES with unknown constraints, and entropy-based approaches to portfolio construction.
December, 2014. I released, along with Bobak Shahriari, a Python package for modular Bayesian optimization. Take a look at it and let us know what you think!