
Cross-lingual transfer learning with multilingual masked language models


If you have a question about this talk, please contact Ben Karniely.

This talk explores Multilingual Masked Language Models (MMLMs) as an emerging tool for cross-lingual transfer learning. The focus will be on the mechanisms and applications that place MMLMs at the forefront of advancing multilingual capabilities in NLP.

We’ll dissect the transformer architecture that underpins MMLMs, examine the masking mechanism, and discuss the transfer-learning training that enables these models to understand and generate multilingual text. The synergy between these components is critical to the models’ linguistic versatility.
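
As a concrete illustration of the masking mechanism, the minimal sketch below uses the Hugging Face transformers library with the publicly available xlm-roberta-base checkpoint. This is an illustrative choice only; the talk does not commit to a specific model. A single pre-trained multilingual model fills in masked tokens across languages:

    # Minimal sketch of multilingual masked-token prediction.
    # Assumes `pip install transformers torch`; xlm-roberta-base is an
    # illustrative checkpoint, not necessarily the one used in the talk.
    from transformers import pipeline

    fill_mask = pipeline("fill-mask", model="xlm-roberta-base")

    # One multilingual model handles masked positions in several languages.
    for sentence in [
        "The capital of France is <mask>.",       # English
        "La capitale de la France est <mask>.",   # French
    ]:
        top = fill_mask(sentence, top_k=1)[0]
        print(sentence, "->", top["token_str"])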

Further, the discussion will pivot to optimizing few-shot learning within the MMLM framework. By strategically annotating challenging instances, we can improve model performance with a small labelling budget. I’ll present findings on using zero-shot learning techniques to identify such instances for cross-lingual transfer, which could inform annotation strategies.
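
The talk will present the actual selection method; one standard proxy for "challenging", sketched below, is to rank unlabelled target-language instances by the entropy of a zero-shot classifier's label distribution and send the most uncertain ones for annotation. The model checkpoint, labels, and texts here are illustrative assumptions, not taken from the talk:

    # Hedged sketch: uncertainty sampling with a zero-shot classifier.
    # joeddav/xlm-roberta-large-xnli is an illustrative multilingual NLI
    # checkpoint; the labels and texts are made-up examples.
    import math
    from transformers import pipeline

    classifier = pipeline("zero-shot-classification",
                          model="joeddav/xlm-roberta-large-xnli")

    labels = ["positive", "negative"]
    unlabelled = [
        "Das Essen war fantastisch!",               # German: clearly positive
        "Der Service war okay, aber nicht mehr.",   # German: ambiguous
    ]

    def entropy(scores):
        # Shannon entropy of the predicted label distribution.
        return -sum(p * math.log(p) for p in scores if p > 0)

    # High-entropy instances are the ones the zero-shot model finds
    # hardest, making them candidates for manual annotation.
    scored = sorted(((entropy(classifier(t, labels)["scores"]), t)
                     for t in unlabelled), reverse=True)
    for h, text in scored:
        print(f"{h:.3f}  {text}")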

Attendees will gain a clear understanding of MMLMs, informed by practical applications such as grammatical error correction and sentiment analysis, potentially stimulating further research in the domain.

Link to join virtually: https://cam-ac-uk.zoom.us/j/81322468305

A recording of this talk is available at the following link: https://www.cl.cam.ac.uk/seminars/wednesday/video/

This talk is part of the Wednesday Seminars - Department of Computer Science and Technology series.
