Understanding the Interplay between LLMs' Utilisation of Parametric and Contextual Knowledge
If you have a question about this talk, please contact Suchir Salhan.

Language Models (LMs) acquire parametric knowledge during training, embedding it within their weights. The increasing scale of LMs, however, poses significant challenges for understanding a model's inner workings, and for updating or correcting this embedded knowledge without the substantial cost of retraining. Moreover, when used for knowledge-intensive language understanding tasks, LMs must integrate relevant context to mitigate their inherent weaknesses, such as incomplete or outdated knowledge. Nevertheless, studies indicate that LMs often ignore the provided context when it conflicts with the memory learned during pre-training. Conflicting knowledge can also already be present within the LM's parameters, termed intra-memory conflict. This underscores the importance of understanding the interplay between how a language model uses its parametric knowledge and the retrieved contextual knowledge. In this talk, I will aim to shed light on this important issue by presenting our research on evaluating the knowledge present in LMs, on diagnostic tests that can reveal knowledge conflicts, and on understanding the characteristics of successfully used contextual knowledge.

Bio: Isabelle Augenstein is a Professor at the University of Copenhagen, Department of Computer Science, where she heads the Natural Language Processing section. Her main research interests are fair and accountable NLP, including challenges such as explainability, factuality and bias detection. Prior to starting a faculty position, she was a postdoctoral researcher at University College London, and before that a PhD student at the University of Sheffield. In October 2022, Isabelle Augenstein became Denmark's youngest ever female full professor.
She currently holds a prestigious ERC Starting Grant on 'Explainable and Robust Automatic Fact Checking', and her research has been recognised by a Karen Spärck Jones Award, as well as a Hartmann Diploma Prize. She is a member of the Royal Danish Academy of Sciences and Letters, and co-leads the Danish Pioneer Centre for AI.

This talk is part of the NLIP Seminar Series.