
Can kernel machines be a viable alternative to deep neural networks?


  • Dr Parthe Pandit, Indian Institute of Technology, Bombay
  • Tuesday 25 February 2025, 14:00-15:00
  • JDB Seminar Room, CUED

If you have a question about this talk, please contact Prof. Ramji Venkataramanan.

Deep learning remains an art, with several heuristics that do not always translate across application domains. Kernel machines, a classical model in machine learning, have received renewed attention following the discovery of the Neural Tangent Kernel and its equivalence to wide neural networks. I will present two results that show the promise of kernel machines for modern large-scale applications:

1. Data-dependent supervised kernels: https://www.science.org/stoken/author-tokens/ST-1738/full
2. Fast, scalable training algorithms for kernel machines: https://arxiv.org/abs/2411.16658
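For context, the "classical model" the abstract refers to can be illustrated with kernel ridge regression, the textbook kernel machine. The sketch below is a generic illustration under that assumption, not code from the speaker's papers; the kernel choice, bandwidth, and toy data are all hypothetical.

```python
# Minimal kernel ridge regression sketch (illustrative only, not the method from the talk).
import numpy as np

def gaussian_kernel(X, Z, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 * bandwidth^2))."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Z**2, axis=1)[None, :]
                - 2.0 * X @ Z.T)
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

# Toy 1-D regression problem.
rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(200, 1))
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.standard_normal(200)

# Closed-form solve: alpha = (K + lambda * I)^{-1} y, then f(x) = sum_i alpha_i k(x, x_i).
lam = 1e-3
K = gaussian_kernel(X_train, X_train)
alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)

X_test = np.linspace(-3, 3, 50)[:, None]
y_pred = gaussian_kernel(X_test, X_train) @ alpha
print(y_pred[:5])
```

The closed-form solve scales cubically in the number of training points, which is exactly the bottleneck that fast, scalable training algorithms for kernel machines (the second result above) aim to address.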

Bio: Parthe Pandit is the Thakur Family Chair Assistant Professor at the Center for Machine Intelligence and Data Science at IIT Bombay. He was a Simons Postdoctoral Fellow at UC San Diego. He obtained his PhD from UCLA and his undergraduate education from IIT Bombay. In 2024, he was awarded the AI2050 Early Career Fellowship by Schmidt Sciences. He is also a recipient of the 2019 Jack K. Wolf Student Paper Award from the IEEE Information Theory Society.

This talk is part of the Signal Processing and Communications Lab Seminars series.

