A method for interpreting elastic-lidar return signals in heavily polluted atmospheres is presented. It is based on an equation derived directly from the classic lidar equation that highlights gradients in the atmospheric backscattering properties along the laser's optical path. The method is evaluated by comparing its results with those obtained with the differential absorption technique. The results come from locating and ranging measurements in pollutant plumes and contaminated environments around central México.
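For context, the classic single-scattering elastic lidar equation from which such gradient methods are derived is commonly written as follows (the notation here follows the usual textbook convention and is not taken from the abstract itself):

```latex
P(R) \;=\; P_0 \,\frac{c\,\tau}{2}\,\frac{A\,\eta}{R^{2}}\,\beta(R)\,
\exp\!\left(-2\int_0^{R} \alpha(r)\,\mathrm{d}r\right)
```

where \(P(R)\) is the power received from range \(R\), \(P_0\) the transmitted power, \(\tau\) the pulse duration, \(A\) the receiver aperture area, \(\eta\) the system efficiency, \(\beta\) the volume backscatter coefficient, and \(\alpha\) the extinction coefficient. Gradient-based interpretation methods typically examine range derivatives of the range-corrected signal \(\ln[R^{2}P(R)]\) to localize changes in \(\beta\) and \(\alpha\) along the optical path.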
The recognition and detection of 3D point cloud data are important research topics in computer vision, with significant applications in fields such as autonomous driving, high-precision mapping, and robot vision. With the development of deep learning, research that combines 3D point cloud recognition and detection with deep learning techniques is receiving increasing attention. One of the main problems facing current self-driving cars is that the detection ability of the LiDAR echo is degraded by bad weather such as heavy rain, snow, thick smoke, or thick fog: the real signal is attenuated and often very weak or submerged in a large amount of noise, which impairs the vehicle's judgment of its surroundings and can leave it unable to move. Solving this problem is therefore urgent for improving the accuracy of the resulting stereoscopic images. This study uses LiDAR to collect point cloud data and then applies PointNet for deep-learning training. Random noise added to the original point cloud data is removed with a filter, and the recognition accuracy of the original signal is compared with that of the filtered signal; the method detailed in this study yields an improvement of 60.8%. This method can be widely developed and applied to improve LiDAR technology in the future.
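To illustrate the kind of point cloud denoising the abstract describes (the paper's exact filter is not specified here), the sketch below implements a standard statistical outlier-removal filter: each point's mean distance to its k nearest neighbors is computed, and points whose mean distance exceeds the global mean by more than `std_ratio` standard deviations are discarded. The function name and parameter defaults are illustrative assumptions, not taken from the study.

```python
import numpy as np

def sor_filter(points, k=8, std_ratio=1.0):
    """Statistical outlier removal: drop points whose mean k-NN distance
    is more than std_ratio standard deviations above the global mean.

    points : (N, 3) array of XYZ coordinates.
    Returns the (M, 3) subset of points that pass the test (M <= N).
    """
    # Full pairwise distance matrix; fine for small clouds, O(N^2) memory.
    diff = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    # Sort each row; column 0 is the point's distance to itself (0),
    # so take columns 1..k as the k nearest neighbors.
    knn = np.sort(dists, axis=1)[:, 1:k + 1]
    mean_d = knn.mean(axis=1)
    threshold = mean_d.mean() + std_ratio * mean_d.std()
    return points[mean_d <= threshold]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = rng.random((100, 3))                       # dense inlier cluster
    noise = rng.random((5, 3)) * 0.2 + 10.0            # far-away noise points
    noisy = np.vstack([cloud, noise])
    clean = sor_filter(noisy, k=8, std_ratio=1.0)
    print(f"kept {clean.shape[0]} of {noisy.shape[0]} points")
```

In a pipeline like the one described, such a filter would run on the raw cloud before it is fed to PointNet; production systems usually use a k-d tree for the neighbor search instead of the brute-force distance matrix shown here.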