Disparity estimation based on fusion of vision and LiDAR

https://doi.org/10.1142/S021969132250014X
Cited by: 3 (Source: Crossref)

Advanced driver assistance systems (ADAS) improve driving safety and comfort by using onboard sensors to collect environmental data, analyze it, and support decision making. Such systems therefore place high demands on distance perception of the environment. Perceptual sensors commonly used in traditional solutions include stereo vision sensors and Light Detection and Ranging (LiDAR) sensors. This paper proposes a multi-sensor fusion method for disparity estimation that combines the high data density of stereo vision sensors with the measurement accuracy of LiDAR sensors. The method improves sensing accuracy while preserving dense coverage, making it suitable for distance perception tasks in complex environments. Experimental results on real data demonstrate that the proposed disparity estimation method performs well and is robust across different scenarios.
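The abstract does not specify the fusion rule, but the general idea of combining sparse, accurate LiDAR range measurements with a dense stereo disparity map can be sketched. A LiDAR depth Z maps to a disparity d via the standard relation d = f * B / Z, where f is the focal length in pixels and B the stereo baseline in meters. The minimal Python sketch below assumes the LiDAR returns have already been projected into the left camera image; the function names and the fixed-weight blend are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def lidar_depth_to_disparity(depth_m, focal_px, baseline_m):
    """Convert metric LiDAR depth to stereo disparity via d = f * B / Z.

    Pixels without a LiDAR return (depth 0 or invalid) map to disparity 0.
    """
    with np.errstate(divide="ignore", invalid="ignore"):
        disp = focal_px * baseline_m / depth_m
    disp[~np.isfinite(disp)] = 0.0
    return disp

def fuse_disparity(stereo_disp, lidar_disp, lidar_mask, weight=0.8):
    """Blend sparse, accurate LiDAR disparities into the dense stereo map.

    At pixels with a LiDAR return, take a weighted average that favors
    the LiDAR value; elsewhere keep the dense stereo estimate. The 0.8
    weight is a hypothetical choice for illustration.
    """
    fused = stereo_disp.copy()
    fused[lidar_mask] = (weight * lidar_disp[lidar_mask]
                         + (1.0 - weight) * stereo_disp[lidar_mask])
    return fused
```

Weighting toward the LiDAR value at pixels with returns reflects its higher per-point ranging accuracy, while the stereo map supplies the dense coverage between the sparse LiDAR samples.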

    AMSC: 62H35, 68U10, 94A08