Navigation and Mapping for Aerial Vehicles Based on Inertial and Imaging Sensors

University dissertation from Linköping: Linköping University Electronic Press

Abstract: Small and medium-sized Unmanned Aerial Vehicles (UAVs) are today used in military missions and will in the future find many new application areas, such as surveillance for exploration and security. To enable all these foreseen applications, the UAVs have to be cheap and of low weight, which restricts the sensors that can be used for navigation and surveillance. This thesis investigates several aspects of how fusion of navigation and imaging sensors can improve both tasks to a level that, with the traditional approach of separating the navigation system from the applications, would require much more expensive sensors. The core idea is that vision sensors can support the navigation system by providing odometric information about the motion, while the navigation system can support the vision algorithms, used to map the surrounding environment, in being more efficient. The unified framework for this kind of approach is called Simultaneous Localisation and Mapping (SLAM), and it is applied here to inertial sensors, radar and an optical camera.

Synthetic Aperture Radar (SAR) uses a radar and the motion of the UAV to provide an image of the microwave reflectivity of the ground. SAR images are a good complement to optical images, giving an all-weather surveillance capability, but they require an accurate navigation system to be focused, which typical UAV sensors cannot provide. However, by using the inertial sensors, which measure the UAV's motion, and information from the SAR images, which measures how image quality depends on that motion, both higher navigation accuracy and, consequently, more focused images can be obtained. The fusion of these sensors can be performed in both batch and sequential form. For the first approach, we propose an optimisation formulation of the joint navigation and focusing problem, while the second one results in a filtering approach. In the optimisation method, the focus of the processed SAR images is measured with the image entropy and with an image-matching approach, where SAR images are matched to a map of the area. In the proposed filtering method, the motion information is estimated from the raw radar data and corresponds to the time derivative of the range between the UAV and the imaged scene, which can be related to the motion of the UAV.

Another imaging sensor that has been exploited in this framework is an ordinary optical camera. Similarly to the SAR case, camera images and inertial sensors can be used to support the navigation estimate and simultaneously build a three-dimensional map of the observed environment, so-called inertial/visual SLAM. Also here, the problem is posed in an optimisation framework, leading to a batch Maximum Likelihood (ML) estimate of the navigation parameters and the map. The ML problem is solved both in the straightforward way, resulting in a nonlinear least-squares problem where the map and the navigation parameters are all treated as parameters, and with the Expectation-Maximisation (EM) approach. In the EM approach, the unknown variables are split into two sets, hidden variables and actual parameters; here the map is considered as parameters and the navigation states are seen as hidden variables. This split makes the total problem computationally cheaper to solve than the original ML formulation. Both optimisation problems mentioned above are nonlinear and non-convex, requiring a good initial solution in order to obtain good parameter estimates.
For this purpose, a method for initialisation of inertial/visual SLAM is devised, where the conditionally linear structure of the problem is used to obtain the initial estimate of the parameters. The benefits and performance improvements of the methods are illustrated on both simulated and real data.
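As an illustration of the entropy-based focus measure mentioned above, the sketch below computes the entropy of a SAR intensity image: a well-focused image concentrates energy in few pixels and therefore has lower entropy. This is a minimal, hypothetical example (the function name and the use of NumPy are assumptions), not the implementation used in the dissertation.

```python
import numpy as np

def image_entropy(sar_image: np.ndarray) -> float:
    """Entropy of a SAR image; lower entropy indicates a sharper, better-focused image."""
    intensity = np.abs(sar_image) ** 2      # pixel intensities |z|^2 of the (complex) image
    p = intensity / intensity.sum()         # normalise into a probability distribution
    p = p[p > 0]                            # drop zeros to avoid log(0)
    return float(-(p * np.log(p)).sum())

# Autofocus idea: re-form the image under perturbed trajectory hypotheses and
# keep the hypothesis that minimises the entropy of the resulting image.
```

The range-rate quantity used in the filtering approach can likewise be written down directly: for a stationary scene point, the time derivative of the range is the UAV velocity projected onto the line of sight. Again a hedged sketch with hypothetical names, not the dissertation's code:

```python
import numpy as np

def range_rate(p_uav: np.ndarray, v_uav: np.ndarray, p_scene: np.ndarray) -> float:
    """dr/dt for r = ||p_uav - p_scene|| with a stationary scene point."""
    los = p_uav - p_scene                   # line-of-sight vector from scene point to UAV
    return float(np.dot(v_uav, los) / np.linalg.norm(los))
```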
