Vol.36 No.3 (2001.9)
Research Report

Sensor Fusion for Road Environment Recognition
Yoshiki Ninomiya, Takeo Kato, Yoshiko Kojima

Detecting forward objects and judging the danger of collision with them are among the essential functions of driver assistance systems for improving safety. The judgment is based on the object's position relative to the lane and on how much of the lane it occupies. The accuracy of the sensors currently in use is insufficient to judge a collision with an object located 50 m ahead.
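As a loose illustration of this judgment criterion, the sketch below computes the fraction of the lane width that a detected object covers, given lateral offsets in a common vehicle frame. The function name lane_occupancy and all values (including the 3.5 m lane width) are hypothetical, chosen only to make the idea concrete; they are not from the paper.

```python
def lane_occupancy(obj_center_m: float, obj_width_m: float,
                   lane_left_m: float, lane_right_m: float) -> float:
    """Fraction of the lane width covered by the object.

    All arguments are lateral offsets in meters in a common vehicle
    frame, with lane_left_m < lane_right_m.
    """
    obj_left = obj_center_m - obj_width_m / 2.0
    obj_right = obj_center_m + obj_width_m / 2.0
    # Overlap between the object's lateral extent and the lane's extent.
    overlap = max(0.0, min(obj_right, lane_right_m) - max(obj_left, lane_left_m))
    return overlap / (lane_right_m - lane_left_m)

# A 1.7 m wide vehicle centered 0.5 m left of our lane center (3.5 m lane):
print(lane_occupancy(-0.5, 1.7, -1.75, 1.75))  # ~0.49, about half the lane
```

Note that with a 3.5 m lane, a lateral position error of a few tenths of a meter already shifts this occupancy estimate noticeably, which is why the 50 m accuracy problem above matters.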

In this paper, we introduce a sensor fusion technique to solve this problem. We present an obstacle detection method that combines a millimeter-wave radar, which measures the distance to an object accurately, with machine vision, which measures the object's lateral position accurately. A motion stereo technique is used to extract the object's boundary, and its computational cost is reduced by exploiting the distance measured by the radar. We also present a lane detection method that uses machine vision, a 2-D digital road map, and DGPS (Differential Global Positioning System); integrating these sensor data makes it possible to estimate the 3-D lane shape. The proposed sensor fusion method was evaluated under real road conditions, and the results confirmed that the position error for both objects and lanes is less than 0.3 m.
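As a minimal sketch of the obstacle-side fusion, the code below combines the radar's accurate range with the camera's accurate bearing, derived from the object's horizontal pixel position, to locate the object in vehicle coordinates. The function name fuse_radar_vision, the pinhole camera model, and all parameter values are illustrative assumptions, not details from the paper.

```python
import math

def fuse_radar_vision(radar_range_m: float,
                      image_x_px: float,
                      image_center_x_px: float,
                      focal_length_px: float) -> tuple[float, float]:
    """Estimate (longitudinal, lateral) object position in vehicle coordinates.

    radar_range_m: distance to the object from the millimeter-wave radar.
    image_x_px: horizontal pixel position of the object in the camera image.
    Assumes a pinhole camera aligned with the vehicle's longitudinal axis.
    """
    # Bearing angle to the object from its horizontal pixel offset.
    bearing_rad = math.atan2(image_x_px - image_center_x_px, focal_length_px)
    # Radar supplies an accurate range; vision supplies an accurate bearing.
    longitudinal_m = radar_range_m * math.cos(bearing_rad)
    lateral_m = radar_range_m * math.sin(bearing_rad)
    return longitudinal_m, lateral_m

# Example: object 50 m away, 40 px right of image center, 800 px focal length.
print(fuse_radar_vision(50.0, 440.0, 400.0, 800.0))  # ~ (49.94, 2.50)
```

The complementary error characteristics are the point of the fusion: at 50 m, a small angular error in a radar-only measurement translates into a large lateral error, whereas the camera pins down the bearing directly while the radar pins down the range.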