BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:MR360: Mixed Reality Rendering for 360° Panoramic Videos - Neil
  Dodgson
DTSTART:20170504T131500Z
DTEND:20170504T141500Z
UID:TALK71604@talks.cam.ac.uk
CONTACT:Peter Robinson
DESCRIPTION:We describe an immersive system\, MR360\, that provides i
 nteractive mixed reality (MR) experiences using a conventional low dyn
 amic range (LDR) 360-degree panoramic video (360-video) viewed in a he
 ad-mounted display (HMD). MR360 seamlessly composites 3D virtual objec
 ts in real time into live 360-video\, using the input panoramic video a
 s the lighting source to illuminate the virtual objects. Image-based l
 ighting (IBL) is perceptually optimized to provide fast and believable
  results using the LDR 360-video as the lighting source. The most sali
 ent light regions in the input panoramic video are detected to optimiz
 e the number of lights used to cast believable shadows. The areas of t
 he detected lights are used to adjust the penumbra of the shadow\, pro
 viding realistic soft shadows. Our real-time differential rendering sy
 nthesizes virtual 3D objects into the 360-video with perceptually plau
 sible lighting and shadows. MR360 provides the illusion of interacting
  with objects in a video\, which are actually 3D virtual objects seaml
 essly composited into the background of the 360-video. MR360 is implem
 ented in a commercial game engine (Unreal Engine 4) and evaluated usin
 g various 360-videos. Our MR360 pipeline requires no pre-computation. I
 t can synthesize an interactive MR scene from live 360-video input whi
 le providing realistic\, high-performance rendering (90 fps in stereo)
  suitable for HMDs.\n\nThis is an IEEE VR 2017 paper\, to appear in IE
 EE TVCG.\nAuthors: Taehyun Rhee\, Lohit Petikam\, Benjamin Allen\, and
  Andrew Chalmers.\nPresented by: Neil Dodgson.\n
LOCATION:SS03 Meeting Room\, Computer Laboratory
END:VEVENT
END:VCALENDAR
