Low-power embedded event-based vision processing for low-latency robotics

If you have a question about this talk, please contact Fulvio Forni.

Brain-inspired information processing in hardware and software – or “neuromorphic computing” – opens a possible path to real-time, low-energy computation. Today, various neuromorphic computing systems are available as customizable hardware for small and large applications, from single chips to entire server rooms. To use such novel hardware efficiently, we need to rethink computation in terms of event-based or spiking neuromorphic algorithms rather than traditional sequential (CPU) or parallel (GPU) computing. In this presentation, I will first briefly introduce the concepts of neuromorphic computing, including a quick overview of present and upcoming neuromorphic hardware for sensing and computation. I will then show and discuss several application examples of event-based sensing and perception, leading towards closed-loop actuated robotic systems.
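To make the contrast with frame-based processing concrete, below is a minimal, hypothetical sketch (not taken from the talk) of an event-driven update in the style of a decaying time surface. The event format (timestamp, x, y, polarity), the sensor resolution, and the decay constant are illustrative assumptions, not details from the speaker's work.

```python
# Minimal sketch: event-based vs frame-based processing (assumed parameters).
import numpy as np

WIDTH, HEIGHT = 128, 128   # assumed sensor resolution
TAU = 0.05                 # assumed decay time constant (seconds)

def process_frame(frame):
    """Frame-based: every pixel is processed at a fixed rate, whether or not it changed."""
    return frame.mean()

def update_surface(surface, last_t, event):
    """Event-based: only the pixel that fired is updated, and only when it fires."""
    t, x, y, polarity = event
    # Decay the stored activity at this pixel since its last event, then add the new one.
    dt = t - last_t[y, x]
    surface[y, x] = surface[y, x] * np.exp(-dt / TAU) + (1.0 if polarity else -1.0)
    last_t[y, x] = t
    return surface, last_t

# Usage: feed a sparse stream of events instead of dense frames.
surface = np.zeros((HEIGHT, WIDTH))
last_t = np.zeros((HEIGHT, WIDTH))
events = [(0.001, 10, 20, 1), (0.004, 10, 21, 0), (0.009, 64, 64, 1)]
for ev in events:
    surface, last_t = update_surface(surface, last_t, ev)
```

The computational cost of the event-driven loop scales with the number of events (i.e. with scene activity) rather than with the frame rate, which is the property that neuromorphic hardware exploits for low-power, low-latency operation.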

The seminar will be held in the JDB Seminar Room, Department of Engineering, and online (Zoom): https://newnham.zoom.us/j/92544958528?pwd=YS9PcGRnbXBOcStBdStNb3E0SHN1UT09

This talk is part of the CUED Control Group Seminars series.
