Multisensor Online Transfer Learning for 3D LiDAR-based Human Detection with a Mobile Robot


Zhi Yan, Li Sun, Tom Duckett, and Nicola Bellotto
Multisensor Online Transfer Learning for 3D LiDAR-based Human Detection with a Mobile Robot
Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018


Abstract

Human detection and tracking is an essential task for service robots, where the combined use of multiple sensors has potential advantages that are yet to be fully exploited. In this paper, we introduce a framework allowing a robot to learn a new 3D LiDAR-based human classifier from other sensors over time, taking advantage of a multi-sensor tracking system. The main innovation is the use of different detectors for existing sensors (i.e. RGB-D camera, 2D LiDAR) to train, online, a new 3D LiDAR-based human classifier based on a new “trajectory probability”. Our framework uses this probability to check whether new detections belong to a human trajectory, estimated by different sensors and/or detectors, and to learn a human classifier in a semi-supervised fashion. The framework has been implemented and tested on a real-world dataset collected by a mobile robot. We present experiments illustrating that our system is able to effectively learn from different sensors and from the environment, and that the performance of the 3D LiDAR-based human classification improves with the number of sensors/detectors used.
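To make the idea concrete, the sketch below illustrates one possible way such a semi-supervised update could look; it is not the paper's implementation. The trajectory probability, feature extraction and thresholds (`trajectory_probability`, `lidar_features`, `P_POS`, `P_NEG`) are placeholders assumed for illustration, and an incremental scikit-learn classifier stands in for whatever learner the authors actually use.

```python
# Illustrative sketch (not the authors' implementation): online semi-supervised
# training of a 3D LiDAR human classifier, gated by a trajectory probability
# estimated from the other sensors' detectors.

import numpy as np
from sklearn.linear_model import SGDClassifier

# Online classifier, updated incrementally as self-labelled clusters arrive.
classifier = SGDClassifier(loss="log_loss")
CLASSES = np.array([0, 1])   # 0 = non-human, 1 = human

P_POS = 0.7   # label a cluster as human if the trajectory probability exceeds this
P_NEG = 0.3   # label it as non-human if the probability falls below this


def trajectory_probability(track):
    """Placeholder: fuse the confidences of the RGB-D and 2D LiDAR detectors
    associated with this track into a single probability that the whole
    trajectory belongs to a human."""
    return float(np.clip(np.mean(track["detector_confidences"]), 0.0, 1.0))


def lidar_features(cluster):
    """Placeholder: geometric/statistical features of a 3D LiDAR cluster."""
    return np.asarray(cluster["features"], dtype=float).reshape(1, -1)


def online_update(track, cluster):
    """Self-label the 3D LiDAR cluster from the multi-sensor trajectory
    probability and update the classifier in a semi-supervised fashion."""
    p = trajectory_probability(track)
    if p > P_POS:
        y = np.array([1])
    elif p < P_NEG:
        y = np.array([0])
    else:
        return  # ambiguous trajectory: skip this sample
    classifier.partial_fit(lidar_features(cluster), y, classes=CLASSES)
```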

@inproceedings{YanSunIROS2018,
  title={Multisensor Online Transfer Learning for 3D LiDAR-based Human Detection with a Mobile Robot},
  author={Zhi Yan and Li Sun and Tom Duckett and Nicola Bellotto},
  booktitle={Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2018},
}