Rapid perception and localization

Estimating a robot's pose and understanding its environment are vital for achieving highly dynamic maneuverability. Without accurate state estimation and perception, a robot cannot operate safely in real-world settings. To this end, this project is dedicated to developing advanced state estimators that remain accurate and robust even when robots undertake extremely agile activities such as sprinting, jumping, parkour, and backflips. One of the primary challenges posed by such rapid maneuvers is motion blur, which hinders the precise extraction and tracking of the features needed for pose estimation. To counter this, we leverage event cameras, known for their exceptional temporal resolution and dynamic range. These cameras are adept at reducing the motion blur that degrades conventional RGB images during swift motion. However, the data produced by event cameras, an asynchronous stream of per-pixel brightness changes rather than dense frames, differs fundamentally from traditional RGB images, necessitating a rethinking of methods for feature extraction, association, and tracking. This project will not only address these challenges but also explore the untapped potential of event cameras.
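To illustrate how an asynchronous event stream can be bridged to frame-based feature pipelines, the sketch below renders events into a time surface, a common intermediate representation related to (but not identical to) the adaptive time surface in the publication below. It is a minimal Python sketch under assumed conventions: events arrive as (t, x, y, polarity) tuples with timestamps in seconds, and the function and parameter names are illustrative rather than any published API.

```python
import numpy as np

def time_surface(events, t_now, resolution=(480, 640), tau=0.03):
    """Render an exponentially decaying time surface from an event stream.

    events:     iterable of (t, x, y, polarity) tuples (illustrative format).
    t_now:      query time at which the surface is evaluated.
    resolution: sensor height and width in pixels.
    tau:        decay constant in seconds; smaller values emphasize only
                the most recent motion.
    """
    # Most recent event timestamp observed at each pixel; -inf means
    # "no event yet", which decays to an intensity of exactly zero.
    last_ts = np.full(resolution, -np.inf)
    for t, x, y, _ in events:
        if t <= t_now:
            last_ts[y, x] = max(last_ts[y, x], t)
    # Pixel intensity decays with the time elapsed since its last event,
    # so recently active edges appear bright while stale pixels fade out.
    return np.exp(-(t_now - last_ts) / tau)
```

Because the resulting array is a dense, image-like map, conventional corner detectors and trackers can be applied to it, which is one way frame-based visual odometry machinery can be reused on event data.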

PUBLICATIONS

Zhu, S., Tang, Z., Yang, M., Learned-Miller, E., & Kim, D. (2023). Event Camera-based Visual Odometry for Dynamic Motion Tracking of a Legged Robot Using Adaptive Time Surface. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

Contributing Researchers

Shifan Zhu
Ph.D. Candidate, CICS
Donghyun Kim
Assistant Professor, CICS
Erik Learned-Miller
Professor, CICS