
Collaborative Pretraining on Evolving Pretraining and Small Manageable Tasks


If you have a question about this talk, please contact Richard Diehl Martinez.

Pretraining is monolithic. In this talk, I will discuss a collaborative approach to pretraining through iterative model merging (originally called fusing). We will then discuss making evaluation reliable and efficient, so that anyone can evaluate. We may also touch on the BabyLM challenge, pretraining models on a human-feasible amount of data (if you are interested in more, contact me; BabyLM will also be CoNLL's shared task next year). A minimal illustration of model merging follows below.
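For readers unfamiliar with model merging, the sketch below shows the basic operation many merging/fusing approaches build on: averaging the parameters of identically shaped models trained by different collaborators. It is an illustrative assumption for this listing, not the speaker's method; all names and the toy architecture are hypothetical.

```python
# Minimal sketch: merge independently trained models by parameter averaging.
# Toy architecture and names are illustrative only.
import torch
import torch.nn as nn


def make_model() -> nn.Module:
    # Tiny stand-in for a pretrained model; collaborators share this shape.
    return nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 16))


def merge_state_dicts(models, weights=None):
    """Average the parameters of models with identical architectures."""
    if weights is None:
        weights = [1.0 / len(models)] * len(models)
    merged = {}
    for key in models[0].state_dict():
        merged[key] = sum(w * m.state_dict()[key] for w, m in zip(weights, models))
    return merged


# Two "collaborators" each hold their own copy; the merged model combines them.
model_a, model_b = make_model(), make_model()
merged_model = make_model()
merged_model.load_state_dict(merge_state_dicts([model_a, model_b]))
print(sum(p.numel() for p in merged_model.parameters()), "parameters merged")
```

In an iterative setting, this averaging step would be repeated: the merged model is redistributed, trained further by each party, and merged again.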

Leshem Choshen is a postdoctoral researcher at MIT-IBM, aiming to pretrain collaboratively through model recycling, efficient evaluation, and efficient pretraining research (e.g., BabyLM). He received the Rothschild and Fulbright postdoctoral fellowships as well as the IAAI and Blavatnik best Ph.D. awards. With broad NLP and ML interests, he has also worked on reinforcement learning, evaluation, and understanding how neural networks learn. In parallel, he participated in Project Debater, creating a machine that could hold a formal debate, which culminated in a Nature cover and a live debate.

He is also a dancer and runs tei.ma, a food and science blog (NisuiVeTeima on Instagram, Facebook, and TikTok).

This talk is part of the NLIP Seminar Series.
