Perspective-Correct VR Passthrough
Please register to receive details on how to access this free online presentation.
Description
Virtual reality (VR) passthrough uses external cameras on the front of a headset to show images of the real world to a user in VR. However, these cameras capture a different perspective of the world than the user would see without the headset, preventing users from seamlessly interacting with their environment. Although computational methods can synthesize a novel view at the eye's position, these approaches can introduce visual distortions in the passthrough image. Instead, this talk proposes a novel camera architecture that uses an array of lenses and co-designed apertures to directly capture the exact rays of light that would have entered the eye, enabling accurate, low-latency passthrough with good image quality.
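As a rough illustration of the ray-selection idea in the abstract (not the specific optical design presented in the talk), the sketch below tests whether a ray recorded at the front of a headset is one that would also have passed through the eye's pupil; the geometry, names (ray_passes_through_pupil, pupil_center, pupil_radius), and dimensions are assumptions for illustration only.

```python
import numpy as np

def ray_passes_through_pupil(origin, direction, pupil_center, pupil_radius,
                             pupil_normal=np.array([0.0, 0.0, 1.0])):
    """Return True if the ray (origin + t * direction, t > 0) crosses the
    pupil disk, i.e. it is a ray the eye itself would have received."""
    denom = direction @ pupil_normal
    if abs(denom) < 1e-9:
        return False                      # ray parallel to the pupil plane
    t = ((pupil_center - origin) @ pupil_normal) / denom
    if t <= 0:
        return False                      # pupil lies behind the ray origin
    hit = origin + t * direction
    return np.linalg.norm(hit - pupil_center) <= pupil_radius

# Example: a ray recorded on an assumed camera plane 4 cm in front of the
# eye, aimed back toward an assumed 2 mm-radius pupil at the origin.
pupil_center = np.array([0.0, 0.0, 0.0])
origin = np.array([0.01, 0.0, 0.04])      # ray origin on the camera plane (m)
direction = pupil_center - origin
direction = direction / np.linalg.norm(direction)
print(ray_passes_through_pupil(origin, direction, pupil_center, 0.002))  # True
```

In this toy framing, a perspective-correct capture system keeps only rays for which such a test is true; the talk's contribution is an optical architecture that selects these rays directly in hardware rather than reconstructing them computationally.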
About Grace Kuo
Grace Kuo, Research Scientist, Meta Reality Labs
I am a research scientist at Reality Labs Research at Meta, where I work on novel display and imaging systems for virtual and augmented reality. I received my PhD from the Department of Electrical Engineering and Computer Sciences at UC Berkeley, advised by Professor Laura Waller and Professor Ren Ng. My research is in computational imaging, the joint design of hardware and algorithms for imaging and display systems. I work at the intersection of optics, signal processing, computer graphics, and optimization.
___________________________
Date: Tuesday, 22 October, 2024
Time: 9am AWST, 12pm AEDT
Location: Online via Teams Meeting
Livestream: Please register by purchasing a free ticket. Details on how to access the livestream will be sent via email.