A camera can sense the lane environment by extracting lane lines, but such detection is limited to short range and is degraded by illumination and other factors; radar can detect objects at long range but cannot perceive lane markings. This paper combines machine vision with millimeter-wave radar: nearby, distinct lane lines are extracted from camera images, while the radar captures the motion trajectories of distant vehicles, and the least-squares method fits curves to those trajectories to reconstruct the distant lane-line information. In the stage of fusing the two segments of lane lines, goodness of fit is applied to match corresponding lane lines. For the region between the two segments, which neither camera nor radar can observe, we establish a lane model, use a probabilistic neural network to select the matching lane model, and then apply the approximate mathematical expression of the selected model, thus obtaining the final description of the road ahead of the ego vehicle.
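The trajectory-based reconstruction step can be illustrated with a short sketch: the function below fits a polynomial lane model x = f(y) to radar trajectory points by least squares and reports a goodness-of-fit score of the kind used for segment matching. The quadratic model, NumPy's `polyfit`, and the R² score are illustrative assumptions; the abstract does not specify the paper's exact lane model or solver.

```python
import numpy as np

def fit_lane_from_trajectory(ys, xs, degree=2):
    """Least-squares fit of a polynomial lane model x = f(y) to radar
    trajectory points. A quadratic model is assumed for illustration."""
    coeffs = np.polyfit(ys, xs, degree)  # highest-order coefficient first
    residuals = xs - np.polyval(coeffs, ys)
    # Coefficient of determination R^2, usable as a goodness-of-fit score
    ss_res = float(np.sum(residuals ** 2))
    ss_tot = float(np.sum((xs - np.mean(xs)) ** 2))
    r2 = 1.0 - ss_res / ss_tot if ss_tot > 0 else 1.0
    return coeffs, r2

# Example: points sampled from a gently curving lane
ys = np.linspace(20.0, 80.0, 30)        # longitudinal distance (m)
xs = 0.001 * ys**2 + 0.05 * ys + 1.5    # lateral offset (m)
coeffs, r2 = fit_lane_from_trajectory(ys, xs)
```

In practice each candidate radar trajectory would be fitted this way, and the resulting curve matched against the camera-derived lane line whose extrapolation gives the best goodness of fit.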
The mass of the Higgs boson is measured in the H→ZZ*→4ℓ and in the H→γγ decay channels with 36.1 fb⁻¹ of proton-proton collision data from the Large Hadron Collider at a center-of-mass energy of √s = 13 TeV recorded by the ATLAS detector in 2015 and 2016. The measured value in the H→ZZ*→4ℓ channel is m_H^(ZZ*) = 124.88 ± 0.37 GeV, while the measured value in the H→γγ channel is m_H^(γγ) = 125.11 ± 0.42 GeV. The two results have a compatibility of 0.4σ. The combined measurement from a simultaneous fit to the invariant mass distributions in the two channels is m_H = 124.98 ± 0.28 GeV.
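As a rough cross-check of the quoted numbers, a naive inverse-variance combination of the two per-channel results (which ignores the correlated systematics that the paper's actual simultaneous fit accounts for) approximately reproduces both the combined value and the quoted compatibility:

```python
import math

# Per-channel Higgs mass measurements (GeV), as quoted in the abstract
m_4l, s_4l = 124.88, 0.37  # H -> ZZ* -> 4l
m_gg, s_gg = 125.11, 0.42  # H -> gamma gamma

# Naive inverse-variance weighting (assumes uncorrelated uncertainties)
w_4l, w_gg = 1.0 / s_4l**2, 1.0 / s_gg**2
m_comb = (w_4l * m_4l + w_gg * m_gg) / (w_4l + w_gg)  # ~124.98 GeV
s_comb = math.sqrt(1.0 / (w_4l + w_gg))               # ~0.28 GeV

# Compatibility: difference in units of its combined uncertainty (~0.4 sigma)
z = abs(m_gg - m_4l) / math.sqrt(s_4l**2 + s_gg**2)
```

The agreement with the published 124.98 ± 0.28 GeV indicates the channel correlations have only a small effect on the combination.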