Semiparametric Language Models
If you have a question about this talk, please contact Qianchu Liu.

Machine learning models work well on a dataset given enough training examples, but they often fail to isolate and reuse previously acquired knowledge when the data distribution shifts (e.g., when presented with a new dataset or a very long context). In contrast, humans are able to learn incrementally and accumulate persistent knowledge to facilitate faster learning of new skills without forgetting old ones. In this talk, I will argue that obtaining such an ability for a language model requires significant advances in how to represent, store, and reuse knowledge acquired from textual data.

I will present a semiparametric language model framework that separates computation (information processing), performed in a large parametric neural network, from memory storage, held in a non-parametric component. I will show two instantiations of such a model. First, I will discuss how to use it to allow a language model to continually learn new tasks without forgetting old ones. Second, I will present a language model architecture that adaptively combines local context and global context to make more accurate predictions.
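The abstract does not give implementation details, so the following is only a minimal sketch of the general idea, loosely patterned on kNN-style retrieval: a parametric network supplies a next-token distribution, a non-parametric memory stores (context embedding, next token) pairs written without any weight updates, and the two distributions are mixed by a gate. The class name SemiparametricLM, the fixed gate lam, and all other identifiers are illustrative assumptions rather than the speaker's actual method; in the adaptive variant mentioned in the talk, the gate would presumably be predicted per step rather than fixed.

import numpy as np

class SemiparametricLM:
    """Sketch of a semiparametric LM: a parametric next-token distribution
    interpolated with a non-parametric key-value memory. Illustrative only."""

    def __init__(self, vocab_size, k=4, lam=0.25):
        self.vocab_size = vocab_size
        self.k = k            # number of nearest neighbours to retrieve
        self.lam = lam        # fixed mixing weight; an adaptive model would learn this
        self.keys = []        # context embeddings written to memory
        self.values = []      # next-token id observed after each stored context

    def write(self, context_vec, next_token):
        # Store a (context representation, continuation) pair; note that this
        # grows the memory without touching any network parameters.
        self.keys.append(np.asarray(context_vec, dtype=np.float64))
        self.values.append(next_token)

    def _knn_probs(self, query):
        # Softmax over negative L2 distances to the k nearest stored contexts,
        # accumulating each neighbour's weight onto its stored next token.
        keys = np.stack(self.keys)
        dists = np.linalg.norm(keys - np.asarray(query, dtype=np.float64), axis=1)
        nearest = np.argsort(dists)[: self.k]
        weights = np.exp(-dists[nearest])
        weights /= weights.sum()
        probs = np.zeros(self.vocab_size)
        for w, idx in zip(weights, nearest):
            probs[self.values[idx]] += w
        return probs

    def predict(self, parametric_probs, query):
        # Final distribution: gate between the parametric LM and the memory.
        if not self.keys:
            return parametric_probs
        return self.lam * self._knn_probs(query) + (1.0 - self.lam) * parametric_probs

# Toy usage: one memory entry, uniform parametric distribution over 5 tokens.
lm = SemiparametricLM(vocab_size=5)
lm.write([0.1, 0.9], next_token=3)
base = np.full(5, 0.2)
print(lm.predict(base, [0.1, 0.8]))  # probability mass shifts toward token 3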
This talk is part of the Language Technology Lab Seminars series.