Code for Autonomous Drone Race is now available on GitHub

We release Teach-Repeat-Replan, a complete and robust system that enables autonomous drone racing.

Teach-Repeat-Replan can be applied to situations where the user has a preferred rough route but cannot pilot the drone ideally, such as drone racing. With our system, the human pilot can virtually control the drone with naive operations; the system then automatically generates a highly efficient repeating trajectory and executes it autonomously. During flight, unexpected collisions are avoided by onboard sensing and replanning. Teach-Repeat-Replan can also be used for normal autonomous navigation: the drone flies autonomously in complex environments using only onboard sensing and planning.

Major components are:

  • Planning: flight corridor generation, global spatial-temporal planning, local online re-planning
  • Perception: global deformable surfel mapping, local online ESDF mapping
  • Localization: global pose graph optimization, local visual-inertial fusion
  • Controlling: geometric controller on SE(3)
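As a concrete illustration of the last component above, here is a minimal sketch of a geometric position controller on SE(3) in the style of Lee et al. (2010); all names, types, and gains are illustrative assumptions and do not reflect the repository's actual code.

```cpp
// Minimal sketch of a geometric controller on SE(3) (after Lee et al., 2010).
// Illustrative only: names, gains, and structure are assumptions, not the
// Teach-Repeat-Replan codebase.
#include <Eigen/Dense>
#include <cmath>
#include <utility>

struct State {
  Eigen::Vector3d p, v;  // position, velocity (world frame)
  Eigen::Matrix3d R;     // body-to-world rotation
};

struct Command {
  Eigen::Vector3d p_des, v_des, a_des;  // desired position/velocity/accel
  double yaw_des;
};

// Returns collective thrust (N) and the desired rotation for the attitude loop.
std::pair<double, Eigen::Matrix3d>
se3Control(const State& x, const Command& c, double mass, double g,
           const Eigen::Vector3d& Kp, const Eigen::Vector3d& Kv) {
  const Eigen::Vector3d e_p = x.p - c.p_des;
  const Eigen::Vector3d e_v = x.v - c.v_des;

  // Desired force in world frame: PD feedback + gravity + feed-forward accel.
  const Eigen::Vector3d F = -Kp.asDiagonal() * e_p - Kv.asDiagonal() * e_v
                          + mass * g * Eigen::Vector3d::UnitZ()
                          + mass * c.a_des;

  // Thrust is the projection of F onto the current body z-axis.
  const double thrust = F.dot(x.R.col(2));

  // Desired attitude: body z aligns with F; yaw fixes the remaining DOF.
  const Eigen::Vector3d b3 = F.normalized();
  const Eigen::Vector3d b1_yaw(std::cos(c.yaw_des), std::sin(c.yaw_des), 0.0);
  const Eigen::Vector3d b2 = b3.cross(b1_yaw).normalized();
  Eigen::Matrix3d R_des;
  R_des.col(0) = b2.cross(b3);
  R_des.col(1) = b2;
  R_des.col(2) = b3;
  return {thrust, R_des};
}
```

The key idea is that the desired force vector fixes the body z-axis, and the desired yaw resolves the one remaining rotational degree of freedom.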

Authors: Fei Gao, Boyu Zhou, and Shaojie Shen

Videos: Video 1, Video 2
Code: https://github.com/HKUST-Aerial-Robotics/Teach-Repeat-Replan

Code for VINS-Mono is now available on GitHub

A Robust and Versatile Monocular Visual-Inertial State Estimator

VINS-Mono is a real-time SLAM framework for monocular visual-inertial systems. It uses an optimization-based sliding-window formulation to provide high-accuracy visual-inertial odometry, and features efficient IMU pre-integration with bias correction, automatic estimator initialization, online extrinsic calibration, failure detection and recovery, loop detection, and global pose graph optimization. VINS-Mono is primarily designed for state estimation and feedback control of autonomous drones, but it can also provide accurate localization for AR applications. The code runs on Linux and is fully integrated with ROS. For the iOS mobile implementation, please go to VINS-Mobile.
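To give a flavor of the IMU pre-integration mentioned above, the following is a minimal forward-Euler sketch of integrating raw IMU samples into a relative motion term; the struct names and function signature are hypothetical, not VINS-Mono's actual API.

```cpp
// Minimal sketch of IMU pre-integration between two keyframes (forward Euler).
// Illustrative only: names and structure are assumptions, not VINS-Mono's API.
#include <Eigen/Dense>
#include <vector>

struct ImuSample {
  double dt;                 // time since previous sample (s)
  Eigen::Vector3d acc, gyr;  // raw accelerometer / gyroscope readings
};

struct PreIntegration {
  Eigen::Quaterniond dq{1, 0, 0, 0};             // relative rotation
  Eigen::Vector3d dp = Eigen::Vector3d::Zero();  // relative position
  Eigen::Vector3d dv = Eigen::Vector3d::Zero();  // relative velocity
};

// Integrates raw IMU samples into a relative motion term expressed in the
// first frame, independent of the absolute pose. This lets the sliding-window
// optimizer reuse the term across iterations, re-integrating only when the
// bias estimate drifts too far from the linearization point.
PreIntegration preintegrate(const std::vector<ImuSample>& samples,
                            const Eigen::Vector3d& ba,   // accel bias estimate
                            const Eigen::Vector3d& bg) { // gyro bias estimate
  PreIntegration pre;
  for (const ImuSample& s : samples) {
    const Eigen::Vector3d w = s.gyr - bg;             // bias-corrected rate
    const Eigen::Vector3d a = pre.dq * (s.acc - ba);  // rotate into frame i
    // Rotation update via a small-angle quaternion.
    const Eigen::Quaterniond dq_step(1.0, 0.5 * w.x() * s.dt,
                                          0.5 * w.y() * s.dt,
                                          0.5 * w.z() * s.dt);
    pre.dp += pre.dv * s.dt + 0.5 * a * s.dt * s.dt;
    pre.dv += a * s.dt;
    pre.dq = (pre.dq * dq_step).normalized();
  }
  return pre;
}
```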

Authors: Tong Qin, Peiliang Li, Zhenfei Yang, and Shaojie Shen from the HKUST Aerial Robotics Group

Code: https://github.com/HKUST-Aerial-Robotics/VINS-Mono

Videos:

  • EuRoC dataset
  • Indoor and outdoor performance
  • AR application
  • MAV application
  • Mobile implementation

Code for VINS-Mobile is now available on GitHub

Monocular Visual-Inertial State Estimator on Mobile Phones

VINS-Mobile is a real-time monocular visual-inertial state estimator developed by members of the HKUST Aerial Robotics Group. It runs on compatible iOS devices and provides localization services for augmented reality (AR) applications. It has also been tested for state estimation and feedback control of autonomous drones. VINS-Mobile uses an optimization-based sliding-window formulation to provide high-accuracy visual-inertial odometry with automatic initialization and failure recovery. Accumulated odometry errors are corrected in real time using global pose graph SLAM. An AR demonstration is provided to showcase its capability.
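As a rough sketch of how pose-graph corrections can be folded back into live odometry, the snippet below computes a drift transform from a loop-optimized keyframe pose and applies it to subsequent odometry outputs; all names here are illustrative assumptions, not VINS-Mobile's actual interface.

```cpp
// Minimal sketch of applying a pose-graph correction to live odometry: once
// loop closure optimizes a keyframe's global pose, the difference between its
// drifted odometry pose and its optimized pose is applied to all subsequent
// odometry outputs. Illustrative only; not VINS-Mobile's actual interface.
#include <Eigen/Dense>

struct Pose {
  Eigen::Quaterniond q;
  Eigen::Vector3d t;
};

// Rigid-body composition: a * b.
Pose compose(const Pose& a, const Pose& b) {
  return {(a.q * b.q).normalized(), a.q * b.t + a.t};
}

// Rigid-body inverse: (q, t)^{-1} = (q^{-1}, -q^{-1} t).
Pose inverse(const Pose& a) {
  const Eigen::Quaterniond qi = a.q.conjugate();
  return {qi, -(qi * a.t)};
}

// drift = optimized_kf * drifted_kf^{-1}; applying it shifts the whole
// odometry trajectory into the loop-corrected global frame.
Pose driftCorrection(const Pose& optimized_kf, const Pose& drifted_kf) {
  return compose(optimized_kf, inverse(drifted_kf));
}

Pose correctOdometry(const Pose& drift, const Pose& live_odom) {
  return compose(drift, live_odom);
}
```

Because only one drift transform is updated per loop closure, the live odometry stream itself never has to be re-optimized, which keeps the correction cheap enough for a phone.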

Authors: Peiliang Li, Tong Qin, Zhenfei Yang, Kejie Qiu, and Shaojie Shen

Code: https://github.com/HKUST-Aerial-Robotics/VINS-Mobile

 

Fei Gao received the 2016 IEEE-SSRR Best Conference Paper Award

The paper “Online quadrotor trajectory generation and autonomous navigation on point clouds” by Ph.D. student Fei Gao won the Best Conference Paper Award at the 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR) in Lausanne, Switzerland. This is the second year in a row that our group has received the best paper award at this conference.

Invited Talk: RACV 2016

On September 20th, 2016, Professor Shaojie Shen was invited by the 2016 Symposium on Research and Application in Computer Vision (RACV 2016) to give a talk, “Robust Autonomous Flight in Cluttered Environment,” on the Computer Vision for Robotics panel at ShanghaiTech University.