BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Social Signals in the Wild: Multimodal Machine Learning for Human-
 Robot Interaction - Angelica Lim (Simon Fraser University)
DTSTART:20220628T130000Z
DTEND:20220628T140000Z
UID:TALK174599@talks.cam.ac.uk
CONTACT:Hatice Gunes
DESCRIPTION:ABSTRACT: Science fiction has long promised us interfaces and 
 robots that interact with us as smoothly as humans do – Rosie the Robot 
 from The Jetsons\, C-3PO from Star Wars\, and Samantha from Her. Today\, i
 nteractive robots such as Pepper and voice user interfaces such as Amazon 
 Alexa are moving us closer to effortless\, human-like interactions in the 
 real world. In this talk\, I will discuss the challenges in creating te
 chnologies that can analyze\, detect\, and generate non-verbal communica
 tion\, including gestures\, gaze\, auditory signals\, and facial express
 ions. I w
 ill present my lab's major directions in understanding human social signal
 s (including emotions\, mental states\, and attitudes) across cultures as 
 well as in recognizing and generating expressions with diversity in mind.
 \n\nSPEAKER BIO: Angelica Lim is the Director of the Rosie Lab (www.rosie
 lab.ca) and an Assistant Professor of Professional Practice in the Schoo
 l of Computing Science at Simon Fraser University. Previously\, she le
 d the Emotion and Expressivity teams for the Pepper humanoid robot at So
 ftBank Robotics. She received her B.Sc. in Computing Science (Artificia
 l Intelligence Specialization) from SFU and a Ph.D. and a Master's in C
 omputer Science (Intelligence Science) from Kyoto University\, Japan. S
 he has appeared on the BBC and at TEDx\, hosted a TV documentary on rob
 otics\, and was recently featured in Forbes' 20 Leading Women in AI.
LOCATION:William Gates Building\, Level 2 (Rainbow Corridor)\, Seminar Roo
 m: SS03
END:VEVENT
END:VCALENDAR
