Nonparametric Bayesian Word Discovery by Robots: Introduction to Symbol Emergence in Robotics

  • Speaker: Dr. Tadahiro Taniguchi, College of Information Science and Engineering, Ritsumeikan University
  • Date and time: Friday 09 September 2016, 15:30-16:00
  • Venue: CUED, LR5

If you have a question about this talk, please contact Josie Hughes.

Word discovery from speech signals is a crucial task for a human infant learning a language. Unlike conventional approaches to automatic speech recognition, infants cannot use labelled data, i.e., transcribed text. They have to discover words from speech signals and learn the meanings of those words in an unsupervised manner. We have been developing machine learning methods that enable a robot to learn words automatically. In this talk, I introduce two unsupervised machine learning methods. The first performs simultaneous learning of lexicons and object categories using multimodal latent Dirichlet allocation (MLDA) and the nested Pitman-Yor language model (NPYLM). The second is the nonparametric Bayesian double articulation analyser (NPB-DAA), which learns phonemes and words directly from speech signals using the hierarchical Dirichlet process hidden language model (HDP-HLM). Both methods are based on Bayesian nonparametrics. I also introduce our research field, called symbol emergence in robotics.
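To make the flavour of nonparametric Bayesian word discovery concrete, here is a minimal, hypothetical sketch of unsupervised word segmentation on an unsegmented character string, using a Dirichlet process unigram word model with Gibbs sampling over boundary variables. It is not the speaker's MLDA+NPYLM or NPB-DAA/HDP-HLM (those operate on multimodal and speech data with richer language models); all parameter values and names below are illustrative assumptions.

# Illustrative toy only: unsupervised word segmentation of an unsegmented
# character string with a Dirichlet process (DP) unigram word model and
# Gibbs sampling over boundary variables. NOT the speaker's method.

import random
from collections import Counter

random.seed(0)

ALPHA = 1.0        # DP concentration parameter (assumed value)
P_CHAR = 1.0 / 26  # uniform base distribution over the characters a-z
P_STOP = 0.5       # geometric word-length penalty in the base distribution


def base_prob(word):
    # Base measure G0: each character uniform, geometric stopping probability.
    return (P_CHAR * (1.0 - P_STOP)) ** len(word) * P_STOP


def crp_prob(word, counts, total):
    # Chinese-restaurant-process predictive probability of generating `word`
    # given the counts of all other word tokens in the current segmentation.
    return (counts[word] + ALPHA * base_prob(word)) / (total + ALPHA)


def words_from(text, boundaries):
    # Convert boundary indicators (boundary after character j) into words.
    out, start = [], 0
    for j, b in enumerate(boundaries):
        if b:
            out.append(text[start:j + 1])
            start = j + 1
    out.append(text[start:])
    return out


def segment(text, iters=500):
    n = len(text)
    boundaries = [random.random() < 0.3 for _ in range(n - 1)]

    for _ in range(iters):
        for i in range(n - 1):
            # Span affected by the boundary at position i:
            # start of the word containing char i ...
            left = i
            while left > 0 and not boundaries[left - 1]:
                left -= 1
            # ... to the end of the word containing char i+1.
            right = i + 1
            while right < n - 1 and not boundaries[right]:
                right += 1

            w_full = text[left:right + 1]                     # no-boundary hypothesis
            w1, w2 = text[left:i + 1], text[i + 1:right + 1]  # boundary hypothesis

            # Counts of all word tokens except those covering this span.
            counts = Counter(words_from(text, boundaries))
            if boundaries[i]:
                counts[w1] -= 1
                counts[w2] -= 1
            else:
                counts[w_full] -= 1
            counts = +counts          # keep only positive counts
            total = sum(counts.values())

            # Gibbs step: compare joined vs. split hypotheses and sample.
            p_join = crp_prob(w_full, counts, total)
            p_split = crp_prob(w1, counts, total) * (
                (counts[w2] + (w1 == w2) + ALPHA * base_prob(w2))
                / (total + 1 + ALPHA)
            )
            boundaries[i] = random.random() < p_split / (p_join + p_split)

    return words_from(text, boundaries)


if __name__ == "__main__":
    # A toy "utterance" with no word boundaries given; recurring substrings
    # such as "thedog" tend to be discovered as reusable units.
    print(segment("thedogsawthedogandthedogran"))

In the methods discussed in the talk, the role of this simple unigram DP is played by richer models: the NPYLM uses a nested Pitman-Yor language model for segmentation, and the NPB-DAA with HDP-HLM additionally infers the phoneme-like units themselves from speech signals rather than assuming given characters.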

This talk is part of the Bio-Inspired Robotics Lab (BIRL) Seminar Series.
