
Backdoors in Machine Learning Models


If you have a question about this talk, please contact Ben Karniely.

Machine Learning (ML) is now omnipresent in our computer systems. Downloading and running ML models is practically trivial and requires little knowledge of their internals. Yet, should we even be using these models to make important decisions? Is there any hidden functionality inside them? Is it hard to inject deterministic behaviours into ML? Are there methods to uncover this hidden functionality? In this talk I survey the backdoor/poisoning literature in the field of machine learning; I then connect it to the field of computer security and draw parallels to cryptography; finally, I identify directions for future research and hypothesise what form practical backdoors will take in the future.
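To make the abstract's "deterministic behaviours" concrete, here is a minimal, purely illustrative sketch (not from the talk) of a data-poisoning backdoor on a toy nearest-neighbour classifier: the attacker inserts relabelled training points carrying a fixed trigger pattern, so any input stamped with the trigger is deterministically misclassified while clean inputs behave normally. All names and values are hypothetical.

```python
# Illustrative data-poisoning backdoor on a toy 1-NN classifier.
# The trigger and data below are invented for demonstration only.

def add_trigger(features):
    """Stamp a fixed 'trigger' pattern onto a feature vector."""
    patched = list(features)
    patched[-1] = 9.0  # hypothetical trigger: force the last feature to 9.0
    return patched

def nearest_neighbour_predict(train, x):
    """1-NN: return the label of the closest training point."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(train, key=lambda pair: dist(pair[0], x))[1]

# Clean two-class toy data: class 0 near the origin, class 1 far away.
clean = [([0.0, 0.1, 0.0], 0), ([0.1, 0.0, 0.2], 0),
         ([5.0, 5.1, 5.0], 1), ([5.2, 4.9, 5.1], 1)]

# Poisoning: triggered copies of class-1 points relabelled as class 0.
poisoned_train = clean + [(add_trigger(f), 0) for f, y in clean if y == 1]

x = [5.0, 5.0, 5.0]  # clearly a class-1 input
print(nearest_neighbour_predict(poisoned_train, x))               # 1 (normal)
print(nearest_neighbour_predict(poisoned_train, add_trigger(x)))  # 0 (backdoor fires)
```

On clean inputs the poisoned model is indistinguishable from an honest one; only inputs carrying the trigger activate the hidden behaviour, which is what makes such backdoors hard to uncover by testing on ordinary data.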


Ilia Shumailov holds a PhD in Computer Science from the University of Cambridge, specialising in machine learning and computer security. During the PhD, under the supervision of Prof Ross Anderson, Ilia worked on a number of projects spanning the fields of machine learning security, cybercrime analysis and signal processing. Following the PhD, Ilia joined the Vector Institute in Canada as a Postdoctoral Fellow, where he worked under the supervision of Prof Nicolas Papernot and Prof Kassem Fawaz. Ilia is currently a Junior Research Fellow at Christ Church, University of Oxford and a member of the Oxford Applied and Theoretical Machine Learning Group with Prof Yarin Gal.

Link to join virtually:

A recording of this talk is available at the following link:

This talk is part of the Wednesday Seminars - Department of Computer Science and Technology series.


