Training LLMs Anywhere: Enabling Large-Scale Decentralized Learning on Your Mobile Devices
If you have a question about this talk, please contact Sally Matthews.

Training large language models (LLMs) is often seen as a resource-intensive task, requiring massive computational power centralized in data centers. But what if you could train powerful models directly on your everyday devices? In this talk, we introduce cutting-edge techniques that bring efficient LLM training to mobile and edge devices, overcoming constraints such as limited memory, processing power, and network bandwidth. We present novel methods, including adaptive federated learning and backpropagation-free optimization for cross-device collaboration. These innovations enable large-scale decentralized learning, reducing system costs while maintaining high performance and privacy. Join this talk to explore how this research is reshaping on-device AI, making LLM fine-tuning practical, efficient, and closer than ever to your fingertips.

Bio: Mr. Dongqi Cai is a fourth-year PhD student at Beijing University of Posts and Telecommunications and is currently a visiting PhD student in Prof. Nicholas D. Lane's group at the University of Cambridge. His research focuses on efficient on-device machine learning systems. He has authored 12 papers as first or corresponding author, including 7 in top-tier venues such as ACM MobiCom, USENIX ATC, NeurIPS, ACM Computing Surveys, and IEEE Transactions on Big Data. He has received multiple National PhD Scholarships, serves as a PC member on the artifact evaluation (AE) committees of leading conferences such as ACM MobiCom and ACM MobiSys, and reviews for journals including IEEE TMC, IEEE TSC, and IEEE TKDE.

This talk is part of the Cambridge ML Systems Seminar Series.