
Social Signals in the Wild: Multimodal Machine Learning for Human-Robot Interaction


If you have a question about this talk, please contact Hatice Gunes.

ABSTRACT: Science fiction has long promised us interfaces and robots that interact with us as smoothly as humans do – Rosie the Robot from The Jetsons, C-3PO from Star Wars, and Samantha from Her. Today, interactive robots such as Pepper and voice user interfaces such as Amazon Alexa are moving us closer to effortless, human-like interactions in the real world. In this talk, I will discuss the challenges in creating technologies that can analyze, detect, and generate non-verbal communication, including gestures, gaze, auditory signals, and facial expressions. I will present my lab’s major directions in understanding human social signals (including emotions, mental states, and attitudes) across cultures, as well as in recognizing and generating expressions with diversity in mind.

SPEAKER BIO: Angelica Lim is the Director of the Rosie Lab and an Assistant Professor of Professional Practice in the School of Computing Science at Simon Fraser University. Previously, she led the Emotion and Expressivity teams for the Pepper humanoid robot at SoftBank Robotics. She received her B.Sc. in Computing Science (Artificial Intelligence Specialization) from SFU and a Ph.D. and Master's in Computer Science (Intelligence Science) from Kyoto University, Japan. She has been featured on the BBC and at TEDx, hosted a TV documentary on robotics, and was recently featured in Forbes' 20 Leading Women in AI.

This talk is part of the Rainbow Group Seminars series.



© 2006-2024, University of Cambridge.