Elliot J. Crowley

I am a Lecturer (Assistant Professor) in Machine Learning and Computer Vision at the School of Engineering at the University of Edinburgh. I co-lead the Bayesian and Neural Systems research group.

My research interests include:

  • simplifying machine learning
  • neural architecture search (and AutoML more generally)
  • efficient network training
  • low-resource deep learning
  • engineering applications of machine learning

I have an MEng in Engineering Science and a DPhil (PhD), both from the University of Oxford. My DPhil was on "Visual recognition in Art using Machine Learning" with Andrew Zisserman in the VGG group. After my DPhil, I was a postdoc at the School of Informatics in Edinburgh with Amos Storkey.

I hold an EPSRC New Investigator Award and I am an investigator on the dAIEdge Horizon Network.

Team

I am principal advisor for:
  • Dr Linus Ericsson (Postdoc) who is exploring the fundamentals of neural architectures
  • Chenhongyi Yang (PhD student) who works on 2D and 3D visual recognition
  • Miguel Espinosa (PhD student) who works on engineering big models for earth observation
  • Shiwen Qin (PhD student) who works on efficient training of LLMs
If you are interested in starting a PhD with me, then please send me a targeted, non-verbose email (that doesn't read like it came out of an LLM) with your CV and a 2-page research proposal. At the moment, I do not have funding for international students, and I am not taking on self-funded students.

Selected Publications

GPViT: A High Resolution Non-Hierarchical Vision Transformer with Group Propagation

ICLR 2023 (Accepted as a notable paper)

Chenhongyi Yang*, Jiarui Xu*, Shalini De Mello, Elliot J. Crowley, Xiaolong Wang

A new vision transformer architecture that serves as an excellent backbone across different fine-grained vision tasks.

Prediction-Guided Distillation for Dense Object Detection

ECCV 2022

Chenhongyi Yang, Mateusz Ochal, Amos Storkey, Elliot J. Crowley

A knowledge distillation framework for single-stage detectors that uses a few key predictive regions to obtain high performance.

Neural Architecture Search without Training

ICML 2021 (Long talk)

Joseph Mellor, Jack Turner, Amos Storkey, Elliot J. Crowley

A low-cost measure for scoring networks at initialisation that can be used to perform neural architecture search in seconds.

Neural Architecture Search as Program Transformation Exploration

ASPLOS 2021 (Distinguished Paper)

Jack Turner, Elliot J. Crowley, Michael O'Boyle

A compiler-oriented approach to neural architecture search which can generate new convolution operations.

Bayesian Meta-Learning for the Few-Shot Setting via Deep Kernels

NeurIPS 2020 (Spotlight)

Massimiliano Patacchiola, Jack Turner, Elliot J. Crowley, Michael O'Boyle, Amos Storkey

A simple Bayesian alternative to standard meta-learning.

BlockSwap: Fisher-guided Block Substitution for Network Compression on a Budget

ICLR 2020

Jack Turner*, Elliot J. Crowley*, Michael O'Boyle, Amos Storkey, Gavia Gray

A fast algorithm for obtaining a compressed network architecture using Fisher information.

* equal contribution. A full list of publications is on Scholar.

Thanks to Jack Turner and Chenhongyi Yang for the website template.