Bringing distributed training natively to Transformers library

  • Speaker: Ferdinand Mom (Hugging Face)
  • Date: Monday 16 February 2026, 14:30-15:30
  • Venue: Computer Lab, LT1

If you have a question about this talk, please contact Sally Matthews.

Ferdinand Mom is a Research Engineer at Hugging Face with a background in large-scale pretraining and efficient deep learning systems. He is a contributor to the Hugging Face Transformers library (https://huggingface.co/docs/transformers/en/index) and a co-author of the Ultra-Scale Playbook (https://nanotron-ultrascale-playbook.static.hf.space/index.html). Ferdinand is a leading voice and experimentalist in distributed and decentralized training, pushing the limits of scalable open-source AI.

This talk is part of the Cambridge ML Systems Seminar Series.

© 2006-2025 Talks.cam, University of Cambridge.