Author: Yi LIN
Visual simultaneous localization and mapping (SLAM) has matured considerably in recent years, and both feature-based and direct methods show impressive performance. However, in extreme conditions such as low-light environments, long exposure times combined with aggressive motion often cause severe motion blur. In almost all visual odometry systems, blurry images drastically impede feature detection and matching; they also violate the frame-to-frame pixel intensity invariance assumption underlying photometric (direct) methods. A popular remedy is to deblur the raw camera images with a blur kernel as a pre-processing step. Instead of using a blur kernel, directly modeling the blurry images is another, novel approach. Noting that motion blur is caused by camera motion during the exposure period, we model how each pixel intensity is generated. We propose a blur-aware motion estimation method that estimates the trajectory from an initial unblurred frame to a blurred frame, given the depth map obtained from stereo cameras.
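To make the blur formation model concrete: a blurred image can be viewed as the temporal average of the sharp scene warped by the camera's motion over the exposure interval. The sketch below is a minimal, hypothetical illustration (not the paper's implementation): it approximates the per-pixel warp with simple in-plane integer shifts sampled along an assumed exposure-time trajectory.

```python
import numpy as np

def synthesize_blur(sharp, shifts):
    """Synthesize a motion-blurred image as the average of the sharp
    image warped along a hypothetical in-plane camera trajectory.
    `shifts` lists integer (dy, dx) offsets sampled during the exposure;
    a full model would instead reproject each pixel using the depth map
    and the 6-DoF pose at each sampled time."""
    acc = np.zeros_like(sharp, dtype=np.float64)
    for dy, dx in shifts:
        # np.roll stands in for a proper depth-aware warp.
        acc += np.roll(sharp, shift=(dy, dx), axis=(0, 1))
    return acc / len(shifts)

# A single bright pixel smeared along a 3-sample horizontal trajectory:
sharp = np.zeros((5, 5))
sharp[2, 2] = 1.0
blurred = synthesize_blur(sharp, [(0, 0), (0, 1), (0, 2)])
# The point spreads into a streak of intensity 1/3 at columns 2, 3, 4.
```

A blur-aware estimator would then search for the trajectory (the shift sequence here; a pose spline in general) that best reproduces the observed blurred frame from the sharp one.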