Advanced driver assistance systems (ADAS) improve driving safety and comfort by using onboard sensors to collect and analyze environmental data and to support decision making. Such systems therefore place high demands on distance perception of the environment. Perceptual sensors commonly used in traditional solutions include stereo vision sensors and Light Detection and Ranging (LiDAR) sensors. This paper proposes a multi-sensor fusion method for disparity estimation that combines the high data density of stereo vision sensors with the measurement accuracy of LiDAR sensors. The method improves sensing accuracy while preserving high-density perception, making it suitable for distance-sensing tasks in complex environments. Experimental results on real data demonstrate that the proposed disparity estimation method performs well and is robust across different scenarios.
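The abstract does not spell out the fusion rule. As a rough illustration of the general idea of combining dense stereo disparity with sparse but accurate LiDAR measurements, the following Python sketch converts projected LiDAR depths to disparities via d = f·B/Z and blends them with a stereo disparity map using a per-pixel confidence weight. The focal length, baseline, confidence map, and function names (`lidar_depth_to_disparity`, `fuse_disparity`) are illustrative assumptions, not the paper's method.

```python
import numpy as np

def lidar_depth_to_disparity(depth_m, focal_px, baseline_m):
    """Convert LiDAR depth Z (meters) to stereo disparity (pixels): d = f * B / Z."""
    with np.errstate(divide="ignore"):
        disp = focal_px * baseline_m / depth_m
    disp[~np.isfinite(disp)] = 0.0  # pixels with no LiDAR return
    return disp

def fuse_disparity(stereo_disp, lidar_disp, lidar_mask, stereo_conf):
    """Confidence-weighted fusion: keep the dense stereo estimate everywhere,
    but at LiDAR-covered pixels blend toward the (sparser, more accurate)
    LiDAR disparity when stereo confidence is low."""
    fused = stereo_disp.copy()
    m = lidar_mask
    fused[m] = (stereo_conf[m] * stereo_disp[m]
                + (1.0 - stereo_conf[m]) * lidar_disp[m])
    return fused
```

With uniform confidence of 1.0 this reduces to pure stereo output; with confidence 0.0 it replaces stereo disparities with LiDAR measurements wherever they exist, which is the trade-off the abstract describes between density and accuracy.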
This chapter introduces a novel approach to tree detection by fusing LiDAR (Light Detection and Ranging) and RGB imagery, leveraging Ordered Weighted Averaging (OWA) aggregation operators to improve image fusion. It focuses on enhancing tree detection and classification by combining LiDAR's structural data with the spectral detail of RGB images. The fusion methodology aims to optimize information retrieval, employing image segmentation and advanced classification techniques. The effectiveness of the method is demonstrated on the PNOA dataset, highlighting its potential for supporting forest management.
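The chapter's exact fusion pipeline is not reproduced here, but the OWA operator itself is standard: the inputs are sorted in descending order and then combined with a fixed weight vector that sums to one. The Python sketch below shows this definition and a hypothetical per-pixel application to co-registered image bands; the function names and weight choices are illustrative assumptions, not the chapter's implementation.

```python
import numpy as np

def owa(values, weights):
    """Ordered Weighted Averaging: sort values in descending order, then
    take the weighted sum with the given weights (which must sum to 1)."""
    b = np.sort(np.asarray(values, dtype=float))[::-1]  # descending order
    w = np.asarray(weights, dtype=float)
    assert b.shape == w.shape and np.isclose(w.sum(), 1.0)
    return float(b @ w)

def owa_fuse_pixelwise(stack, weights):
    """Per-pixel OWA fusion of a stack of co-registered sources.
    `stack` has shape (n_sources, H, W); `weights` has length n_sources."""
    ordered = np.sort(stack, axis=0)[::-1]  # descending along the source axis
    w = np.asarray(weights, dtype=float).reshape(-1, 1, 1)
    return (ordered * w).sum(axis=0)

# Hypothetical usage: fuse a LiDAR-derived height band with two RGB-derived
# bands. Front-loaded weights such as [0.5, 0.3, 0.2] emphasize the strongest
# responses ("or-like" behavior); uniform weights recover the plain mean.
```

Because the weights attach to rank positions rather than to specific sources, OWA lets the fusion favor whichever sensor responds most strongly at each pixel, which is the property exploited for combining structural and spectral evidence.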