
MR360: Mixed Reality Rendering for 360° Panoramic Videos


If you have a question about this talk, please contact Peter Robinson.

We describe an immersive system, MR360, that provides interactive mixed reality (MR) experiences using a conventional low dynamic range (LDR) 360° panoramic video (360-video) viewed in a head-mounted display (HMD). MR360 seamlessly composites 3D virtual objects into live 360-video in real time, using the input panoramic video as the lighting source to illuminate the virtual objects. Image-based lighting (IBL) is perceptually optimized to provide fast and believable results from the LDR 360-video. The most salient light regions in the input panoramic video are detected to optimize the number of lights used to cast believable shadows, and the areas of the detected lights are used to adjust the shadow penumbra, producing realistic soft shadows. Our real-time differential rendering synthesizes virtual 3D objects into the 360-video with perceptually plausible lighting and shadows. MR360 thus provides the illusion of interacting with objects in a video that are actually 3D virtual objects seamlessly composited into the background of the 360-video. MR360 is implemented in a commercial game engine (Unreal Engine 4) and evaluated on various 360-videos. Our pipeline requires no pre-computation: it can synthesize an interactive MR scene from live 360-video input while delivering high-performance rendering (90 fps in stereo) suitable for HMDs.
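The light-detection step described above could be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the luminance threshold, 4-connectivity, the `max_lights` cap, and the function name `detect_salient_lights` are all assumptions made for the example, and a real system would work on the equirectangular sphere rather than the flat image grid.

```python
import numpy as np

def detect_salient_lights(panorama, luminance_threshold=0.9, max_lights=4):
    """Detect the brightest regions of an LDR panorama (H x W x 3, values in [0, 1]).

    Returns a list of (centroid_row, centroid_col, area_in_pixels) triples,
    largest region first. The area can then be mapped to a shadow penumbra
    radius: a larger detected light region implies a softer shadow.
    """
    # Rec. 709 luma as a cheap luminance proxy.
    luma = panorama @ np.array([0.2126, 0.7152, 0.0722])
    mask = luma >= luminance_threshold

    # 4-connected component labelling via iterative flood fill.
    h, w = mask.shape
    labels = np.zeros((h, w), dtype=int)
    regions = []
    next_label = 0
    for sr in range(h):
        for sc in range(w):
            if mask[sr, sc] and labels[sr, sc] == 0:
                next_label += 1
                labels[sr, sc] = next_label
                stack = [(sr, sc)]
                pixels = []
                while stack:
                    r, c = stack.pop()
                    pixels.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if 0 <= nr < h and 0 <= nc < w and mask[nr, nc] and labels[nr, nc] == 0:
                            labels[nr, nc] = next_label
                            stack.append((nr, nc))
                rows = [p[0] for p in pixels]
                cols = [p[1] for p in pixels]
                regions.append((sum(rows) / len(pixels), sum(cols) / len(pixels), len(pixels)))

    # Keep only the largest regions, bounding the number of shadow-casting lights.
    regions.sort(key=lambda reg: -reg[2])
    return regions[:max_lights]
```

Capping the light count is what keeps shadow rendering cheap enough for the 90 fps stereo budget the abstract mentions; each retained region would drive one shadow-casting light whose softness grows with the region's area.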

This is an IEEE VR 2017 paper, to appear in IEEE TVCG. Authors: Taehyun Rhee, Lohit Petikam, Benjamin Allen, and Andrew Chalmers. Presented by Neil Dodgson.

This talk is part of the Rainbow Group Seminars series.



© 2006-2018, University of Cambridge.