
Bias in AI


If you have a question about this talk, please contact Srinivasan Keshav.

“Blame it on the algorithm” is a common refrain these days when systems that include algorithmic components deliver clearly unacceptable results. Examples range from Amazon’s gender-biased hiring (machine learning) algorithm that learned from biased examples, through racial bias in AI methods used for predictive policing, to the UK A-level grade “normalization” of 2020, which was biased by design. These scenarios are often succinctly referred to as “algorithmic bias”, and there are many more examples, some of which are not so obvious. Moreover, this phrasing often leads people to think that this is purely a technology problem, when in fact one must look at the complete system in which the algorithm is created and operated, including the human elements, to understand how these systems make mistakes. The talk will highlight the ways in which “algorithmic bias” can arise, and how we can identify and mitigate the damage.


Derek McAuley is a British academic who is Professor of Digital Economy in the School of Computer Science at the University of Nottingham and director of Horizon Digital Economy Research, an interdisciplinary research institute funded through the RCUK Digital Economy programme. He acted as a Specialist Advisor to the House of Lords European Union Committee inquiry into online platforms, and as Chief Innovation Officer during the founding of the Digital Catapult. He is a Fellow of the British Computer Society and a member of the UKCRC, a computing research expert panel of the Institution of Engineering and Technology and the BCS.

This talk is part of the Computer Laboratory Systems Research Group Seminar series.
