Junglab (Dr. Jae-Hyun Jung, Assistant Professor of Ophthalmology) at Schepens Eye Research Institute/Massachusetts Eye and Ear, Department of Ophthalmology, Harvard Medical School focuses on the interface between electrical/optical systems and human vision. We aim to uncover the principles underlying human visual perception and apply them, through engineering approaches, to improve displays and visual aids.
We have worked to characterize vision multiplexing (visual confusion), such as perception in augmented reality (AR) and see-through visual aids, and to apply these findings to low vision rehabilitation devices using optical engineering and display technologies. Using vision multiplexing, we have developed visual field expansion devices for patients with homonymous or bitemporal hemianopia and for patients with acquired monocular vision. We are applying this key field-expansion mechanism to see-through head-mounted displays (HMDs) and see-through smart glasses for patients with field loss, as practical mobility aids. In addition, we are improving our VR walking simulator as a portable clinical mobility testing tool.
We have proposed several light-field 3D imaging and display systems that use a micro lens array to capture and represent the full set of 3D rays within a single shot of elemental images. We have applied this imaging approach to improve visual prostheses for blind people, such as retinal/cortical implants and sensory substitution devices (SSDs). The proposed active confocal imaging system removes background clutter from images and improves the detection of possible objects of interest. We are focusing on a more elaborate spectacle-mounted design of the light-field confocal imaging system, along with studies of object recognition, visual search, and mobility, to find better representations for visual prostheses that aid people with visual impairment. We have also developed real-time light-field microscopy that captures and simultaneously represents the entire 3D volume of live specimens.
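The depth-selective principle behind light-field confocal imaging can be illustrated with a toy shift-and-sum refocusing sketch: each elemental image sees a scene point displaced by a disparity proportional to its lens index, so shifting the elemental images by the disparity of a chosen depth and averaging makes objects at that depth add coherently while objects at other depths (background clutter) spread out. This is a minimal one-dimensional illustration of the general technique only, not the lab's actual system; the function name, lens counts, and disparity values are hypothetical.

```python
import numpy as np

def refocus(elemental, shift_per_lens):
    """Shift-and-sum synthetic refocusing over a 1D row of elemental images.

    elemental: array of shape (n_lenses, width), one row of elemental images
    shift_per_lens: pixel disparity per lens index at the target depth
    """
    n_lenses, _ = elemental.shape
    out = np.zeros(elemental.shape[1])
    for k in range(n_lenses):
        # Undo the per-lens disparity so the target depth aligns across lenses
        out += np.roll(elemental[k], -k * shift_per_lens)
    return out / n_lenses

# Toy light field: a target point with disparity 2 px/lens and a
# background clutter point with disparity 5 px/lens (different depth).
n_lenses, width = 5, 64
lf = np.zeros((n_lenses, width))
for k in range(n_lenses):
    lf[k, 30 + 2 * k] += 1.0   # target point of interest
    lf[k, 10 + 5 * k] += 1.0   # background clutter

focused = refocus(lf, 2)
# Target sums coherently: focused[30] == 1.0, while each clutter
# contribution lands on a different pixel at amplitude 0.2.
```

Thresholding such a refocused image is one simple way to suppress out-of-depth clutter before presenting a scene on a low-resolution prosthesis.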