BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:NLIP Seminar Series
SUMMARY:Modular and Compositional Transfer Learning - Jona
 s Pfeiffer (Google Research)
DTSTART;TZID=Europe/London:20230224T120000
DTEND;TZID=Europe/London:20230224T130000
UID:TALK196519AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/196519
DESCRIPTION:Abstract:\n\nWith pre-trained transformer-based mo
 dels continuously increasing in size\, there is a 
 dire need for parameter-efficient and modular tran
 sfer learning strategies. In this talk\, we will t
 ouch on adapter-based fine-tuning\, where ins
 tead of fine-tuning all weights of a model\, small
  neural network components are introduced at every
  layer. While the pre-trained parameters are froze
 n\, only the newly introduced adapter weights are 
 fine-tuned\, achieving an encapsulation of the dow
 nstream task information in designated parts of t
 he model. We will demonstrate that adapters are mo
 dular components which can be composed for improve
 ments on a target task and how they can be used fo
 r out-of-distribution generalization on the exampl
 e of zero-shot cross-lingual transfer. Finally\, w
 e will discuss how adding modularity during pre-tr
 aining can mitigate catastrophic interference and 
 consequently lift the curse of multilinguality.\n\
 nBio:\n\nJonas Pfeiffer is a Research Scientist at
  Google Research. He is interested in modular repr
 esentation learning in multi-task\, multilingual\,
  and multi-modal contexts\, and in low-resource sc
 enarios. He worked on his PhD at the Technical Uni
 versity of Darmstadt\, was a visiting researcher 
 at New York University\, and a Research Scientis
 t Intern at Meta Research. Jonas has received the 
 IBM PhD Research Fellowship award for 2021/2022. H
 e has given numerous invited talks in academia\, i
 ndustry\, and ML summer schools\, and has co-organiz
 ed multiple workshops on multilinguality and multi
 modality.\n\nTopic: NLIP Seminar\nTime: Feb 24\, 2
 023 12:00 PM London\n\nJoin Zoom Meeting\nhttps://
 cl-cam-ac-uk.zoom.us/j/94330375053?pwd=TjRtbTg5aUd
 zWVdLRU15RjR0V2g0Zz09\n\nMeeting ID: 943 3037 5053
 \nPasscode: 768471
LOCATION:Virtual (Zoom)
CONTACT:Rami Aly
END:VEVENT
END:VCALENDAR
