Micro unmanned aerial vehicles (UAVs) promise to play increasingly important roles in both civilian and military activities. Currently, UAV navigation depends critically on the localization service provided by the Global Positioning System (GPS), which suffers from multipath effects and line-of-sight blockage, and fails to work in indoor, forest, or urban environments. In this paper, we establish a localization system for quadcopters based on ultra-wideband (UWB) range measurements. To achieve localization, a UWB module is installed on the quadcopter to actively send ranging requests to fixed UWB modules at known positions (anchors). Once a distance measurement is obtained, it is first calibrated and then passed through outlier detection before being fed to a localization algorithm. The localization algorithm is initialized by trilateration and sustained by an extended Kalman filter (EKF). The position and velocity estimates produced by the algorithm are further fed to the control loop to aid the navigation of the quadcopter. Various flight tests in different environments have been conducted to validate the performance of the UWB ranging and the localization algorithm.
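The abstract names trilateration as the initialization step but gives no formulation, so the following Python sketch is only illustrative of how a position can be initialized from anchor ranges; the function `trilaterate`, the linearized least-squares approach, and the example anchor layout are assumptions, not the authors' implementation.

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Least-squares trilateration (illustrative sketch, not the paper's code):
    estimate a 3-D position from distances to anchors at known positions.

    anchors : (n, 3) array of anchor coordinates, n >= 4
    ranges  : (n,) array of measured distances to each anchor
    """
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    # Subtracting the first anchor's range equation from the others
    # cancels the quadratic term ||p||^2, leaving a linear system A p = b.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example: four non-coplanar anchors, noiseless ranges to the point (1, 2, 0.5).
anchors = [[0, 0, 0], [5, 0, 0], [0, 5, 0], [5, 5, 2]]
truth = np.array([1.0, 2.0, 0.5])
ranges = [np.linalg.norm(truth - np.array(a)) for a in anchors]
print(trilaterate(anchors, ranges))  # ~ [1.  2.  0.5]
```

In a pipeline like the one described, an estimate of this kind would seed the EKF state, after which the filter tracks position and velocity from subsequent calibrated, outlier-checked ranges.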
An effective reactive collision avoidance algorithm is presented in this paper for unmanned aerial vehicles (UAVs) using two simple, inexpensive pinhole cameras. The vision-sensed data, consisting of the azimuth and elevation angles at the two camera positions, is first processed through a Kalman filter formulation to estimate the position and velocity of the obstacle. Once the obstacle position is estimated, the collision cone philosophy is used to predict a collision over a short period of time. If a collision is predicted, steering guidance commands based on nonlinear differential geometric guidance are issued to steer the vehicle's velocity vector away. A new cubic-spline-based post-avoidance merging algorithm is also presented so that the vehicle rejoins the intended global path quickly and smoothly after avoiding the obstacle. The overall algorithm has been validated using a point mass model of a prototype UAV with first-order autopilot delay. Both extended Kalman filtering (EKF) and unscented Kalman filtering (UKF) have been tested, and both are found to be quite effective. However, the UKF performed better than the EKF with only a minor compromise in computational efficiency, and hence can be the better choice. Note that with two cameras, the stereovision signature is associated with the optical flow signature, making the overall signature quite strong for obstacle position estimation. This leads to considerably better results than the use of a single pinhole camera, results of which have been published earlier.
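The collision cone test itself is not spelled out in the abstract; a common form of it, sketched below under the assumptions of a point obstacle, constant relative velocity, and a fixed safety radius, predicts a collision when the closest point of approach falls inside the safety sphere within a look-ahead horizon. The function name and parameters here are hypothetical.

```python
import numpy as np

def collision_predicted(rel_pos, rel_vel, safety_radius, horizon):
    """Collision-cone-style check (illustrative sketch, not the paper's code).

    rel_pos : obstacle position minus vehicle position (3-vector, m)
    rel_vel : obstacle velocity minus vehicle velocity (3-vector, m/s)
    Returns True if, under constant relative velocity, the predicted miss
    distance within `horizon` seconds drops below `safety_radius`.
    """
    r = np.asarray(rel_pos, dtype=float)
    v = np.asarray(rel_vel, dtype=float)
    speed_sq = v @ v
    if speed_sq < 1e-12:                  # no relative motion
        return np.linalg.norm(r) < safety_radius
    t_cpa = -(r @ v) / speed_sq           # time of closest point of approach
    if t_cpa < 0.0 or t_cpa > horizon:    # receding, or beyond the look-ahead
        return False
    miss = np.linalg.norm(r + v * t_cpa)  # predicted miss distance at t_cpa
    return miss < safety_radius

# Head-on geometry: obstacle 50 m ahead, closing at 10 m/s -> collision in 5 s.
print(collision_predicted([50, 0, 0], [-10, 0, 0],
                          safety_radius=5.0, horizon=10.0))  # True
```

In the scheme the abstract describes, the relative position and velocity fed to such a test would come from the EKF/UKF estimates, and a True result would trigger the differential geometric steering command.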
In this paper, we propose a novel method for mobile robot localization and navigation based on multispectral visual odometry (MVO). The proposed approach combines visible and infrared images to localize the mobile robot under different conditions (day, night, indoor, and outdoor). The depth image acquired by the Kinect sensor is very sensitive to IR luminosity, which makes it of little use for outdoor localization. We therefore propose an efficient solution to this Kinect limitation based on three navigation modes: indoor localization based on RGB/depth images, night localization based on depth/IR images, and outdoor localization using multispectral RGB/IR stereovision. For automatic selection of the appropriate navigation mode, we propose a fuzzy logic controller based on image energies. To overcome the limitations of multimodal visual navigation (MMVN), especially during navigation mode switching, a smooth variable structure filter (SVSF) is implemented to fuse the MVO pose with the wheel odometry (WO) pose based on variable structure theory. The proposed approaches are successfully validated in trajectory-tracking experiments on a Pioneer P3-AT mobile robot.
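The abstract does not define the image-energy measure or the fuzzy controller's rule base, so the sketch below is purely illustrative of energy-driven mode selection: `image_energy`, the triangular memberships, and every breakpoint value are hypothetical choices, not the authors' design.

```python
import numpy as np

def image_energy(img):
    """Mean squared intensity, used here as a simple image-energy measure
    (assumed; the paper's exact energy definition is not given)."""
    img = np.asarray(img, dtype=float)
    return float(np.mean(img ** 2))

def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def select_mode(rgb_energy, depth_energy):
    """Pick a navigation mode from fuzzified, normalized image energies.
    Breakpoints below are hypothetical and would need tuning in practice."""
    low_rgb    = tri(rgb_energy,  -0.1, 0.0, 0.5)   # dark scene -> night mode
    high_rgb   = tri(rgb_energy,   0.3, 1.0, 1.1)   # well-lit scene
    good_depth = tri(depth_energy, 0.2, 1.0, 1.1)   # usable Kinect depth
    scores = {
        "night (depth/IR)":        low_rgb,
        "indoor (RGB/depth)":      min(high_rgb, good_depth),
        "outdoor (RGB/IR stereo)": min(high_rgb, 1.0 - good_depth),
    }
    return max(scores, key=scores.get)

# Bright scene but degraded depth (e.g. outdoor sunlight washing out the IR
# pattern) -> the outdoor RGB/IR stereo mode wins.
print(select_mode(rgb_energy=0.9, depth_energy=0.1))
```

A min/max rule composition like this is a standard Mamdani-style pattern; the pose produced by whichever odometry mode is active would then be fused with wheel odometry by the SVSF to smooth over mode switches.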