Published Papers

2005.12

Real-Time Range Image Alignment for Environment Recognition by a Mobile Robot

Nanaho OHSAWA, Ryusuke SAGAWA, Tomio ECHIGO, Yasushi YAGI

Abstract

This paper proposes real-time 3D modeling of an environment by a mobile robot through the alignment of range images. 3D modeling by a mobile robot requires Simultaneous Localization And Mapping (SLAM). In disaster rescue and relief, to reduce the risk of secondary disasters, it is desirable that a remote-controlled mobile robot gather information about afflicted areas in advance. In this paper, we tackle 3D modeling for environment recognition by a mobile robot. We obtain range images with a real-time laser range finder mounted on the robot and align them in real time for SLAM. Our method sequentially aligns range images with a variant of the Iterative Closest Point (ICP) method, and thereby estimates the self-position of the robot without any external devices. We also propose a new weighting scheme for the ICP method. In the experiments, we analyze the precision of the real-time laser range finder and evaluate the accuracy of the estimated robot position. Finally, we show the modeling and localization results obtained by a robot equipped with the real-time range finder.
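The sequential alignment described above can be illustrated with a minimal weighted point-to-point ICP sketch. This is not the paper's actual variant or weighting scheme: the brute-force nearest-neighbour search and the distance-based weight `w = 1 / (1 + d)` are illustrative assumptions, chosen only to show where a weighting scheme enters the ICP update.

```python
import numpy as np

def weighted_icp(src, dst, n_iter=20):
    """Align src (Nx3) onto dst (Mx3) with weighted point-to-point ICP.

    Returns the accumulated rotation R, translation t, and the aligned
    points, such that aligned ~= src @ R.T + t.
    """
    R, t = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(n_iter):
        # Nearest-neighbour correspondences (brute force; real systems
        # would use a k-d tree or projection-based search for speed).
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        idx = d2.argmin(axis=1)
        matched = dst[idx]
        dist = np.sqrt(d2[np.arange(len(cur)), idx])

        # Illustrative weighting (assumption, not the paper's scheme):
        # down-weight distant pairs, which are likely mismatches.
        w = 1.0 / (1.0 + dist)
        w /= w.sum()

        # Weighted Procrustes: centroids, cross-covariance, SVD.
        mu_s = (w[:, None] * cur).sum(axis=0)
        mu_d = (w[:, None] * matched).sum(axis=0)
        H = (w[:, None] * (cur - mu_s)).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        Ri = Vt.T @ U.T
        if np.linalg.det(Ri) < 0:   # guard against reflections
            Vt[-1] *= -1
            Ri = Vt.T @ U.T
        ti = mu_d - Ri @ mu_s

        # Apply the incremental transform and accumulate it.
        cur = cur @ Ri.T + ti
        R, t = Ri @ R, Ri @ t + ti
    return R, t, cur
```

Applied to a stream of range images, each new frame would be aligned against the previous one (or an accumulated model) in this way, and the accumulated `R, t` gives the robot's pose without external positioning devices.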