
Bringing distributed training natively to Transformers library


  • Speaker: Ferdinand Mom (Hugging Face)
  • Time: Monday 16 February 2026, 14:30-15:30
  • Venue: Computer Lab, LT1

If you have a question about this talk, please contact Sally Matthews.

Ferdinand Mom is a Research Engineer at Hugging Face with a background in large-scale pretraining and efficient deep learning systems. He is a contributor to the Hugging Face Transformers library (https://huggingface.co/docs/transformers/en/index) and co-author of the Ultra-Scale Playbook (https://nanotron-ultrascale-playbook.static.hf.space/index.html). Ferdinand is a leading voice and experimentalist in distributed and decentralized training, pushing the limits of scalable open-source AI.

This talk is part of the Cambridge ML Systems Seminar Series.

