I work on computational visual sensing. I invent visual sensors that combine metamaterials and other emerging optical technologies with computer vision algorithms, extracting scene information with high accuracy and speed in small, power-efficient packages. Much of my work draws inspiration from biology.

Raycast Calibration of Augmented Reality
A compact, physics-based model that characterizes the geometry of the optics in an AR headset, and an automatic approach to AR geometric calibration that requires only a small amount of data. (ICCP 2020)
Metalens Depth Sensing Inspired by Jumping Spiders
A compact, single-shot, computationally efficient depth sensor that works under passive lighting, powered by a metalens. The design took inspiration from the eyes of jumping spiders. (PNAS 2019)
Flexible, Large, Augmentable ToF (FLAT) Dataset
A large synthetic ToF dataset that incorporates realistic 3D artifacts, namely scene motion, multiple reflections, and sensor noise. (ECCV 2018)
Focal Track (Best Demo Award, ICCP 2018)
A monocular, passive-lighting, computationally efficient depth camera built with a deformable lens. It predicts depth on challenging objects such as reflective glass and plastic, and the system runs at 100 frames per second. (ICCV 2017)
Focal Flow (Best Student Paper Award, ECCV 2016)
An algorithm that recovers local depth and 3D velocity by solving a single linear system whose entries contain only image derivatives. (ECCV 2016, IJCV 2017)
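The general pattern behind such derivative-based methods can be sketched as follows. This is a hypothetical illustration, not the Focal Flow model itself: the feature matrix and coefficients here are synthetic stand-ins for the paper's derivative terms, and only the generic step of stacking per-pixel linear equations and solving by least squares is shown.

```python
import numpy as np

# Hypothetical sketch: recover unknown motion/depth coefficients theta
# from a linear system A @ theta = b, where A's columns stand in for
# derivative-based features (e.g., spatial derivatives) and b stands in
# for temporal derivatives. All values below are synthetic.
rng = np.random.default_rng(0)

n_pixels = 500
A = rng.standard_normal((n_pixels, 4))          # per-pixel feature rows
theta_true = np.array([0.5, -1.2, 0.3, 2.0])     # ground-truth coefficients
b = A @ theta_true + 0.01 * rng.standard_normal(n_pixels)  # noisy observations

# Least-squares solve over all pixels in the patch.
theta_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(theta_hat, 2))
```

Because every equation is linear in the unknowns, the solve is a single closed-form least-squares step per patch, which is what makes this family of methods so cheap at runtime.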