
Cross-lingual transfer learning with multilingual masked language models


If you have a question about this talk, please contact Ben Karniely.

This talk explores Multilingual Masked Language Models (MMLMs) as an emerging asset for cross-lingual transfer learning. The focus will be on introducing the mechanisms and applications that position MMLMs at the forefront of advancing multilingual capabilities in NLP.

We’ll dissect the transformer architecture that underpins MMLMs, delve into the masking mechanism, and discuss the transfer-learning training that enables these models to understand and generate multilingual text. The synergy between these components is critical to the models’ linguistic versatility.
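The masking mechanism referred to above can be sketched in a few lines. This is a minimal, illustrative implementation of the BERT-style masking rule (roughly 15% of positions are selected; of those, 80% become `[MASK]`, 10% are swapped for a random vocabulary token, and 10% are left unchanged) — the token lists and toy vocabulary are invented for the example, not taken from any particular model.

```python
import random

MASK = "[MASK]"
# Toy mixed English/French vocabulary, purely for illustration.
VOCAB = ["the", "cat", "sleeps", "le", "chat", "dort"]

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style corruption: select ~mask_prob of positions, then apply
    the 80/10/10 split. Returns the corrupted sequence plus per-position
    prediction targets (None = position not scored in the MLM loss)."""
    rng = random.Random(seed)
    corrupted, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            targets.append(tok)  # the model must recover this token
            roll = rng.random()
            if roll < 0.8:
                corrupted.append(MASK)            # 80%: replace with [MASK]
            elif roll < 0.9:
                corrupted.append(rng.choice(VOCAB))  # 10%: random token
            else:
                corrupted.append(tok)             # 10%: keep original
        else:
            targets.append(None)  # untouched position, no loss computed
            corrupted.append(tok)
    return corrupted, targets

corrupted, targets = mask_tokens(["the", "cat", "sleeps", "le", "chat", "dort"])
```

Because one model sees corrupted text in many languages during pre-training, the same objective teaches it multilingual representations without any parallel data.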

Further, the discussion will pivot to optimizing few-shot learning within the MMLM framework. By strategically annotating challenging instances, we can amplify model performance. I’ll present findings on employing zero-shot learning techniques to identify such instances for cross-lingual transfer, which could inform annotation strategies.
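One common way to identify such challenging instances is uncertainty sampling: score each unlabelled target-language example by the entropy of the model's zero-shot label distribution and send the most uncertain ones for annotation. The sketch below assumes hypothetical class probabilities; it is one plausible selection strategy, not necessarily the specific method the talk will present.

```python
import math

def entropy(probs):
    """Shannon entropy (natural log) of a predicted label distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_for_annotation(predictions, k=2):
    """Rank unlabelled instances by the entropy of their zero-shot label
    distributions and return the indices of the k most uncertain ones."""
    ranked = sorted(range(len(predictions)),
                    key=lambda i: entropy(predictions[i]), reverse=True)
    return ranked[:k]

# Hypothetical zero-shot class probabilities for four target-language sentences.
preds = [
    [0.98, 0.01, 0.01],  # confident: cheap to leave unlabelled
    [0.34, 0.33, 0.33],  # near-uniform: most informative to annotate
    [0.70, 0.20, 0.10],
    [0.50, 0.45, 0.05],
]
chosen = select_for_annotation(preds)
```

Annotating only the high-entropy instances concentrates the few-shot labelling budget where the model is least certain, which is the intuition behind amplifying performance with few labels.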

Attendees will gain a clear understanding of MMLMs, informed by practical applications such as grammatical error correction and sentiment analysis, potentially stimulating further research in the domain.
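For applications like sentiment analysis, cross-lingual transfer is typically evaluated by fine-tuning on a source language and measuring the accuracy drop when the model is applied zero-shot to a target language. A minimal sketch of that evaluation, using invented gold labels and predictions for illustration:

```python
def accuracy(gold, pred):
    """Fraction of predictions that match the gold labels."""
    return sum(g == p for g, p in zip(gold, pred)) / len(gold)

def transfer_gap(source_acc, target_acc):
    """Cross-lingual transfer gap: accuracy lost when a model fine-tuned
    on the source language is applied zero-shot to the target language."""
    return source_acc - target_acc

# Hypothetical binary sentiment labels (1 = positive) vs. model predictions.
en_gold, en_pred = [1, 0, 1, 1], [1, 0, 1, 0]  # English (source language)
de_gold, de_pred = [1, 0, 0, 1], [1, 1, 0, 0]  # German (target language)

gap = transfer_gap(accuracy(en_gold, en_pred), accuracy(de_gold, de_pred))
```

A small gap indicates that the multilingual representations learned during pre-training carry the task across languages well.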


A recording of this talk is available at the following link:

This talk is part of the Wednesday Seminars - Department of Computer Science and Technology series.



© 2006-2023, University of Cambridge.