
Hand Tracking on HoloLens 2


If you have a question about this talk, please contact jo de bono.

Hands are the primary and most direct way we interact with the world. We use tools all the time, but our hands are always with us. With HoloLens 2, we introduced a new way for users to interact with virtual objects in mixed reality – simply reaching out and touching them. Direct Manipulation lets users grab, move, and adjust digital content in the same ways as they do with real, physical objects. To enable this kind of interaction, we must first track the user’s hands.

In this talk, I will describe how we at Microsoft turned research into product to bring fully articulated hand tracking to HoloLens. The story starts nine years ago with an internship project on fitting smooth 3D models to pictures of dolphins, and ends today with a hand tracker that accurately and robustly tracks the surface and joints of a user's hands at 45 Hz, while remaining efficient enough to fit within the power and compute constraints of a head-mounted device.

I will present the technical highlights of our journey from research to product, focusing on several key engineering efforts and algorithmic innovations. These include making our code over 500 times faster through careful low-level optimisation, collecting a massive dataset of 3D hand captures to understand how human hands look and move, and developing a photo-realistic synthetic data pipeline that generates vast amounts of noise-free training data for machine learning.

A video recording of this talk is available (access restricted to the University of Cambridge).

This talk is part of the Wednesday Seminars - Department of Computer Science and Technology series.



© 2006-2024, University of Cambridge.