Adapters in Transformers. A New Paradigm for Transfer Learning…?
If you have a question about this talk, please contact Marinela Parovic.

Adapters have recently been introduced as an alternative transfer learning strategy. Instead of fine-tuning all weights of a pre-trained transformer-based model, small neural network components are introduced at every layer. The pre-trained parameters are frozen and only the newly introduced adapter weights are fine-tuned, encapsulating the downstream-task information in designated parts of the model. In this talk we will provide an introduction to adapter training in natural language processing. We will go into detail on how the encapsulated knowledge can be leveraged for compositional transfer learning, as well as cross-lingual transfer. We will briefly touch on the efficiency of adapters in terms of trainable parameters as well as (wall-clock) training time. Finally, we will provide an outlook on recent alternative adapter approaches and training strategies.

This talk is part of the Language Technology Lab Seminars series.
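To make the setup in the abstract concrete, here is a minimal sketch of a bottleneck adapter in PyTorch: a small down-projection/up-projection module with a residual connection, trained while the pre-trained weights stay frozen. The `Adapter` class, the `bottleneck_dim` value, and the `mark_trainable` helper are illustrative assumptions, not the speaker's actual implementation.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The residual connection keeps the frozen pre-trained
        # representation intact; the adapter learns a small correction.
        return hidden_states + self.up(self.act(self.down(hidden_states)))

def mark_trainable(model: nn.Module) -> None:
    # Freeze every pre-trained parameter; fine-tune only adapter weights.
    # Assumes adapter modules were registered under names containing "adapter".
    for name, param in model.named_parameters():
        param.requires_grad = "adapter" in name
```

Because only the adapter parameters receive gradients, the number of trainable parameters per task is a small fraction of the full model, which is the efficiency property the talk discusses.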