The focus is on assessing how people navigate around the robot by extracting their trajectories and motions. I am working on a novel detection, tracking, and motion-profile extraction pipeline operating on lidar and camera data.
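As a rough illustration only, the sketch below shows a greedy nearest-neighbor tracker that links per-frame person detections (assumed here to be 2D centroids already extracted from the lidar/camera data) into trajectories; the class and parameter names are hypothetical and not taken from the actual pipeline.

```python
# Minimal sketch (not the actual pipeline): greedy nearest-neighbor association of
# person centroids across frames to build trajectories. Detections are assumed to
# be 2D (x, y) centroids in the robot frame.
import numpy as np

class TrajectoryTracker:
    def __init__(self, match_radius=0.5):
        self.match_radius = match_radius  # max association distance [m], assumed value
        self.tracks = {}                  # track id -> list of (t, x, y)
        self._next_id = 0

    def update(self, t, detections):
        """Associate each detection with the nearest active track, or start a new one."""
        unmatched = list(detections)
        for tid, traj in self.tracks.items():
            if not unmatched:
                break
            last = np.array(traj[-1][1:])
            dists = [np.linalg.norm(np.array(d) - last) for d in unmatched]
            i = int(np.argmin(dists))
            if dists[i] < self.match_radius:
                traj.append((t, *unmatched.pop(i)))
        for d in unmatched:                       # leftover detections start new tracks
            self.tracks[self._next_id] = [(t, *d)]
            self._next_id += 1
        return self.tracks

# Velocities (a simple motion profile) can then be obtained by finite differences
# along each stored trajectory.
```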
Implementation of a simple, working monocular visual odometry (VO) pipeline in MATLAB
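For illustration, here is a minimal Python/OpenCV sketch of the core loop of such a pipeline (the project itself was written in MATLAB); the intrinsic matrix `K`, the function name, and all parameter values are assumptions.

```python
# Sketch of one monocular-VO step: track features between consecutive frames,
# estimate the essential matrix, and recover the relative pose (translation is
# only known up to scale in the monocular case).
import cv2
import numpy as np

def relative_pose(prev_gray, cur_gray, K):
    # Detect corners in the previous frame and track them into the current frame (KLT).
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=1000, qualityLevel=0.01, minDistance=7)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, p0, None)
    good0 = p0[status.ravel() == 1]
    good1 = p1[status.ravel() == 1]
    # Essential matrix with RANSAC, then recover the relative rotation R and
    # unit-scale translation t from the inlier correspondences.
    E, mask = cv2.findEssentialMat(good0, good1, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good0, good1, K, mask=mask)
    return R, t
```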
Proposed a simplified pipeline for last-centimeter drone delivery to a window or balcony, using vision-based fiducial-marker detection and collision prevention, and validated it under rigorous testing
We programmed the [Crazyflie 2.1](https://www.bitcraze.io/products/crazyflie-2-1/) to find and precisely land on a 10 cm tall platform using the z-range reading from the [flow deck](https://www.bitcraze.io/products/flow-deck-v2/). We also used readings from the [multi-ranger deck](https://www.bitcraze.io/products/multi-ranger-deck/) to avoid obstacles in the environment.
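A hedged sketch of this search-and-land behaviour, written against the Bitcraze Python library (cflib), is shown below; the radio URI, speeds, and platform-detection threshold are illustrative assumptions, not the actual competition code.

```python
# Sketch: fly forward while scanning, sidestep when the multi-ranger sees an
# obstacle, and land when the downward z-range shortens by ~10 cm (the platform).
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger
from cflib.positioning.motion_commander import MotionCommander
from cflib.utils.multiranger import Multiranger

URI = 'radio://0/80/2M/E7E7E7E7E7'   # assumed radio address
CRUISE_HEIGHT = 0.4                  # flight height above the floor [m], assumed
PLATFORM_STEP = 0.08                 # z-range drop [m] that signals the 10 cm platform

def search_and_land():
    cflib.crtp.init_drivers()
    log_z = LogConfig(name='zrange', period_in_ms=50)
    log_z.add_variable('range.zrange', 'uint16_t')   # downward ToF range from the flow deck [mm]

    with SyncCrazyflie(URI, cf=Crazyflie()) as scf:
        with MotionCommander(scf, default_height=CRUISE_HEIGHT) as mc, \
             Multiranger(scf) as ranger, \
             SyncLogger(scf, log_z) as logger:
            for _, data, _ in logger:
                # Obstacle avoidance: sidestep if the multi-ranger sees something close ahead.
                if ranger.front is not None and ranger.front < 0.3:
                    mc.start_left(0.2)
                else:
                    mc.start_forward(0.2)
                # Platform detection: the downward range shortens over the raised platform.
                z = data['range.zrange'] / 1000.0
                if z < CRUISE_HEIGHT - PLATFORM_STEP:
                    mc.stop()
                    break
            # Leaving the MotionCommander context lands the drone (on the platform).

if __name__ == '__main__':
    search_and_land()
```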
Implemented autonomous navigation with obstacle avoidance for the Thymio-II robot, from simulation in Gazebo to real-world tests
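Below is a minimal, hypothetical ROS (rospy) sketch of the kind of reactive obstacle-avoidance node used in such a setup; the topic names `/thymio/cmd_vel` and `/thymio/proximity/center` are assumptions and may differ from the real Thymio interface.

```python
# Sketch: subscribe to the front proximity range and publish velocity commands,
# turning in place when an obstacle is close and driving forward otherwise.
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import Range

class ObstacleAvoider:
    def __init__(self):
        self.cmd_pub = rospy.Publisher('/thymio/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/thymio/proximity/center', Range, self.on_range)
        self.front = float('inf')

    def on_range(self, msg):
        self.front = msg.range        # distance to the nearest obstacle ahead [m]

    def run(self):
        rate = rospy.Rate(10)
        while not rospy.is_shutdown():
            cmd = Twist()
            if self.front < 0.05:     # obstacle close: rotate away
                cmd.angular.z = 0.5
            else:                     # free space: drive forward
                cmd.linear.x = 0.1
            self.cmd_pub.publish(cmd)
            rate.sleep()

if __name__ == '__main__':
    rospy.init_node('thymio_obstacle_avoidance')
    ObstacleAvoider().run()
```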
Improved the overall performance of existing trackers in challenging UAV scenarios while maintaining high operational efficiency
Implemented outdoor SLAM on the Tongji Jiading Campus and indoor autonomous navigation.
Designed combat robots for the RoboMaster competition.