A VISION-BASED UNMANNED AERIAL VEHICLE NAVIGATION METHOD

1st International Conference on Innovative Research and Maritime Applications of Space Technology, IRMAST 2015

Paweł Burdziakowski, Marek Przyborski, Jakub Szulwic
Faculty of Civil and Environmental Engineering, Gdansk University of Technology, Narutowicza Str. 11/12, 80-233 Gdansk, Poland, e-mail: pawelburdziakowski@gmail.com, marek.przyborski@pg.gda.pl, szulwic@pg.gda.pl

Artur Janowski
Faculty of Geodesy, Geospatial and Civil Engineering, University of Warmia and Mazury in Olsztyn, M. Oczapowskiego Str. 2, 10-719 Olsztyn, Poland, e-mail: artur.janowski@geodezja.pl

Summary
Satellite navigation systems are the main position sources for unmanned aerial vehicles (UAVs). This limits UAV operation to places where radio signals are visible to a satellite navigation receiver mounted on the vehicle, i.e. to outdoor navigation. Closed spaces are unavailable to vehicles whose navigation is based on global navigation satellite systems (GNSS). A miniature UAV (MiniUAV) can operate successfully in an indoor environment, but only when supported by a human-operated teleoperation system; it cannot plot its trajectory or determine its position relative to the indoor environment. The research presented in this paper addresses the problem of indoor position tracking. The developed and tested method is based on image data acquired by a single non-metric digital camera mounted on a MiniUAV. The paper presents the method of solving the UAV's own position from image data alone, the UAV navigation process supported by the developed method, and the method's main advantages and limitations. The method builds on photogrammetry and modern computer vision algorithms.

1. Introduction

1.1. Problem statement
Satellite navigation systems are the main position sources for unmanned aerial vehicles (UAVs).
This fact limits the area of UAV operation to places where radio signals are visible to a satellite navigation receiver mounted on the vehicle [1]. Satellite navigation therefore determines the UAV operating environment: the UAV can operate only outdoors. Closed spaces are unavailable to vehicles whose navigation is based on global navigation satellite systems (GNSS). A miniature UAV (MiniUAV) can operate successfully in an indoor environment, but it has to be supported by a human operating it remotely. Even then, the MiniUAV cannot plot its trajectory or determine its position relative to the indoor environment. The fact that a MiniUAV cannot plot its own trajectory inside a closed environment without access to an INS (inertial navigation system) or GNSS inspired this research. It was noticed that every MiniUAV is equipped with a small single non-metric camera. This type of sensor is a good source of data that can be used for navigation purposes without requiring any additional installation on board the MiniUAV [2]. Usually, low-resolution video from the vehicle is transmitted to an operator to support the navigation decision process. High-resolution video is usually gathered for collecting crucial environmental data and for enhancing and updating modern Geographical Information System (GIS) databases, for geographical or counterterrorism purposes [3].

2. Proposed method

2.1. Navigation process phases
The MiniUAV navigation process (Fig. 1), supported by the developed method, has been divided into four phases. First, the GPS phase is conducted in an environment where satellite navigation is available; the trajectory [4,5,6] is plotted using the main navigation system. The VISUAL phase is conducted in an environment where the satellite signal is unavailable.
The vehicle can proceed through that environment and plot its own trajectory using the presented method, based on vision data acquired by an on-board single non-metric camera. The COMB phase is conducted in an environment where the main navigation system signal is periodically lost or its accuracy is drastically diminished; in this phase the vision data support the main navigation system, providing a trajectory that is resistant to position jumps caused by satellite signal loss. The SYNCH phase is used for a reference system transformation.

Figure 1. MiniUAV navigation process.

2.2. Motion estimation
The VISUAL phase is based on ego-motion estimation algorithms [7,8]. The relation between two neighbouring images is represented by the matrix P_{CAM}(t,t+1) (Fig. 2). This matrix describes the rotation R and translation T between images taken at times t and t+1, and can be written as:

P_{CAM}(t,t+1) = [RT]_{(t,t+1)} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3 \\ 0 & 0 & 0 & 1 \end{bmatrix}  (1)

The full trajectory is represented by a cumulative matrix multiplication [9]:

P_{camABS} = \prod_{i=0}^{t} [RT]_i  (2)

Figure 2. Motion estimation.

2.3. Synchronization phase
The trajectory described by the matrix in equation (2) is referred to the first image taken, at time t = 0, in camera coordinates. The axes of the camera coordinate system are aligned with the corresponding axes of the first image. This frame is called CCf (Camera Coordinate frame); during the synchronization phase CCf is referred to the world coordinate system and is then called WCf (World Coordinate frame). ICf (Image Coordinate frame) denotes the coordinate system located at the image level (Fig. 2) [10].

Figure 3. Synchronization phase.

The relationship between the two coordinate systems WCf and CCf is found using pairs of measurements of the coordinates of a number of points in both systems. This is described by equation (3).
The matrix P_{camABSnew} describes the trajectory referred to the world reference system:

P_{camABSnew}(t) = S_{camABS}^{GPS} R_{camABS}^{GPS} P_{camABS}(t) + T_{camABS}^{GPS}  (3)

Detailed knowledge of the world reference system is not required, because the scale S, the rotation matrix R, and the translation vector T are found during the synchronization phase (Fig. 3) with reference to a known trajectory determined by the MiniUAV main navigation system (i.e. GPS), for N matching points. The optimal scale, rotation, and translation parameters of equation (3) are found by minimizing the following function:

\min \sum_{t=0}^{N} \left\| S_{camABS}^{GPS} R_{camABS}^{GPS} P_{camABS}(t) + T_{camABS}^{GPS} - P_{GPS}(t) \right\|^2  (4)

where P_{GPS} describes the known trajectory obtained from the main navigation system. Equation (4) has been solved using the quaternion algebra described in [11].

2.4. Algorithms
Motion estimation, performed during the VISUAL phase, is the process of estimating the spatial geometric relation between images. This relation is estimated from image feature matches, which requires the intrinsic parameters of the camera to be known. Because the method works with images obtained by a non-metric camera, the intrinsic parameters (5) have to be estimated in a calibration process:

M_i = \begin{bmatrix} f_x & \gamma & o_x \\ 0 & f_y & o_y \\ 0 & 0 & 1 \end{bmatrix}  (5)

All elements of the intrinsic matrix M_i (the focal lengths f_x and f_y in effective horizontal and vertical pixel-size units, the principal point coordinates o_x and o_y, and the skew coefficient γ) have been estimated using Zhang's method [12], together with the lens distortion coefficients (radial and tangential). Having compensated the images for distortion, the next step is to estimate the spatial relation between a pair of images. For this, the normalized 8-point algorithm [7,8,13,14] has been used, with RANSAC [15,16] for robust outlier rejection. A SIFT detector [17] has been used to select image features, and SIFT descriptors are used for the matching process.
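As a concrete illustration, the trajectory accumulation of equations (1)-(2) and the similarity alignment of equations (3)-(4) can be sketched in a few lines of NumPy. This is a minimal sketch with illustrative function names, not the paper's Matlab implementation; in particular, where the paper solves eq. (4) with the quaternion algebra of [11], the sketch uses the equivalent closed-form SVD (Umeyama-style) solution.

```python
import numpy as np

def make_pcam(R, t):
    """Build the 4x4 homogeneous matrix [RT] of eq. (1) from a 3x3 rotation R
    and a translation vector t."""
    P = np.eye(4)
    P[:3, :3] = R
    P[:3, 3] = t
    return P

def accumulate_trajectory(steps):
    """Eq. (2): chain the frame-to-frame transforms P_CAM(i, i+1) by cumulative
    matrix multiplication, starting from the identity (camera frame at t = 0).
    Returns the list of absolute 4x4 poses."""
    P_abs = np.eye(4)
    poses = [P_abs.copy()]
    for R, t in steps:
        P_abs = P_abs @ make_pcam(R, t)
        poses.append(P_abs.copy())
    return poses

def align_similarity(P_cam, P_gps):
    """Eqs. (3)-(4): find scale S, rotation R and translation T minimizing
    sum_t ||S R P_cam(t) + T - P_gps(t)||^2 over N matched points (N x 3 arrays).
    Closed-form SVD solution, equivalent to the quaternion method of [11]."""
    mu_c, mu_g = P_cam.mean(axis=0), P_gps.mean(axis=0)
    Xc, Xg = P_cam - mu_c, P_gps - mu_g
    U, D, Vt = np.linalg.svd(Xg.T @ Xc)
    # Sign correction guards against a reflection instead of a rotation.
    E = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ E @ Vt
    S = np.trace(np.diag(D) @ E) / (Xc ** 2).sum()
    T = mu_g - S * (R @ mu_c)
    return S, R, T
```

With the recovered S, R and T, the camera-frame trajectory is mapped into the world frame exactly as written in eq. (3).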
3. Results

3.1. System setup
The experiment was conducted using a non-metric GoPro Hero 3 Black Edition camera mounted on a Leica 1200 RTK GPS pole, 30 cm below the GPS antenna. The method was implemented in Matlab, using the FIT3D Toolbox for Matlab [10] and the Camera Calibration Toolbox for Matlab [18]. Four planned test trajectories (Fig. 4) were measured by RTK while pictures were taken simultaneously with the camera. The first and second trajectories were measured by RTK in manual mode, with the camera in single-shot mode (photos taken at the same time as the position fixes); these measurements offered a maximal standard deviation of position (RTK in manual mode) of σ_N = 0.19 m, σ_E = 0.10 m, σ_H = 0.20 m. The third and fourth trajectories were measured by RTK in automatic mode (position obtained every 0.2 s), with the camera in time-lapse mode (photos taken every 0.5 s), and offered a maximal standard deviation of position of σ_N = 12.98 m, σ_E = 12.30 m, σ_H = 21.84 m. The standard deviation is shown in Figures 7 and 9 as a red dotted plot.

Figure 4. Planned and measured trajectories.

3.2. Calibration
The calibration was done using the Camera Calibration Toolbox for Matlab [18]. Three planar checkerboard calibration targets of different sizes were used to obtain the best estimates of the intrinsic matrix and distortion parameters: the first with 11x7 squares of 100 mm, the second with 15x11 squares of 50 mm, and the third with 20x13 squares of 20 mm. All images were rectified using the obtained distortion coefficients. As the calibration process showed (Fig. 5), the intrinsic parameters of the non-metric camera (GoPro) obtained with the Matlab tool are sufficient for navigation purposes. Because the GoPro has a fisheye-type lens with significant radial distortion, some difficulties were observed in calculating the distortion coefficients. Figure 5.
Complete distortion model of the GoPro camera.

3.3. Algorithm parameters
To find the most suitable intrinsic matrix coefficients and SIFT and RANSAC parameters, the minimal mean squared error

MSE_{trans} = \frac{1}{N} \sum_{t=1}^{N} \left( S_{vis}(t) - S_{GPS}(t) \right)^2  (6)

was calculated, where S(t) = |ΔP(t)| and ΔP(t) is the difference between points of the trajectories P_{camABSnew}(t,t+1) and P_{GPS}(t,t+1). The calibration results obtained with the third planar checkerboard gave the smallest MSE_trans, and those intrinsic matrix coefficients were used for the subsequent calculations. The SIFT and RANSAC parameters that gave the smallest MSE_trans were SIFT threshold = 0.4, RANSAC threshold = 0.001, and RANSAC iterations = 10000, in the notation of [10]. These parameters were used for the final trajectory calculation.

3.4. Trajectories
The final results are shown in Figures 6 and 8: P_GPS are positions taken by RTK GPS, P_VIS are positions obtained by the presented method. The scale S, rotation matrix R, and translation vector T (eq. 4) were found for the recorded trajectories using N = 30% of all trajectory points: N = 17 for the 2nd trajectory and N = 90 for the 4th trajectory. As the analysis showed, in order to solve equation (4) during the synchronization phase, it is strongly recommended that the unmanned platform execute at least one turn.

Figure 6. 2nd trajectory: RTK and VISUAL.

Functions of the absolute sum of the X, Y, and Z GPS and VISUAL coordinates, together with the standard deviation of the GPS position, are shown in Figures 7 and 9. The difference between the functions f_GPS and f_vis (f_dif) grows exponentially; in practice, navigation during the VISUAL phase has to be limited in time.

Figure 7. Result functions: 2nd trajectory.

Manual measurement offered a minimal standard deviation, which is rather unusual for navigation systems mounted on a MiniUAV.
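The parameter-selection criterion of eq. (6) is straightforward to compute from the two trajectories. A minimal NumPy sketch (function names are illustrative, not from the paper's Matlab code), deriving the step lengths S(t) and scoring a parameter set:

```python
import numpy as np

def step_lengths(P):
    """S(t) = |dP(t)|: the length of each step between consecutive points
    of a trajectory given as an N x 3 array."""
    return np.linalg.norm(np.diff(np.asarray(P, dtype=float), axis=0), axis=1)

def mse_trans(traj_vis, traj_gps):
    """Eq. (6): mean squared error between the incremental step lengths of the
    visual and the GPS trajectory; the parameter set with the smallest value
    is kept when ranking calibration and SIFT/RANSAC settings."""
    d = step_lengths(traj_vis) - step_lengths(traj_gps)
    return float(np.mean(d ** 2))
```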
The situation in which the standard deviation is similar to that of a navigation system mounted on a MiniUAV was simulated during the automatic measurement and is shown for the 4th trajectory. It is clearly visible in Figure 9 (50 s < t < 100 s) that the position obtained by the presented method is more stable and not prone to drops in GPS position accuracy, caused in particular by multipath signals or natural and artificial interference. This stability can be exploited in the COMB phase, in environments where GPS accuracy suffers. The fact that the position is calculated relative to the environment acquired from images makes the method independent of the satellite positioning system.

Figure 8. 4th trajectory: RTK and VISUAL.
Figure 9. Result functions: 4th trajectory.

3.5. Achieved accuracy
To assess the achieved accuracy and compare the recovered trajectory to the GPS one, incremental metrics have been introduced: ΔS for translation and Δφ for rotation. As Fig. 10 (left) shows for a trajectory measured manually with a low GPS standard deviation, both incremental-translation plots correspond well to each other. Where the GPS receiver suffered from various interference, indicated as a peak in Fig. 10 (right), the visual method clearly does not react to it. The peaks of incremental rotation are even more visible in Fig. 11 (right).

Figure 10. Comparison of incremental translations obtained with GPS and the visual method: 2nd trajectory (left), 4th trajectory (right).
Figure 11. Comparison of incremental rotations obtained with GPS and the visual method: 2nd trajectory (left), 4th trajectory (right).
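The incremental metrics ΔS and Δφ can be computed directly from the trajectory points. A minimal NumPy sketch (illustrative names; Δφ is taken here as the angle between consecutive step vectors, which is one natural reading of the paper's definition):

```python
import numpy as np

def incremental_translation(P):
    """dS(t): length of each step between consecutive trajectory points (N x 3)."""
    return np.linalg.norm(np.diff(np.asarray(P, dtype=float), axis=0), axis=1)

def incremental_rotation(P):
    """dphi(t) in degrees: angle between consecutive step vectors of the trajectory."""
    p = np.diff(np.asarray(P, dtype=float), axis=0)
    a, b = p[:-1], p[1:]
    cosang = np.sum(a * b, axis=1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))
    # Clip guards against arccos domain errors from floating-point round-off.
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
```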
The errors have been calculated in accordance with the following equations:

AE_{trans}(t) = \Delta S_{vis}(t) - \Delta S_{GPS}(t),  (7)

MAE_{trans} = \frac{1}{N} \sum_{t=1}^{N} \left| \Delta S_{vis}(t) - \Delta S_{GPS}(t) \right|,  (8)

MSE_{trans} = \frac{1}{N} \sum_{t=1}^{N} \left( \Delta S_{vis}(t) - \Delta S_{GPS}(t) \right)^2,  (9)

AE_{rot}(t) = \Delta\varphi_{vis}(t) - \Delta\varphi_{GPS}(t),  (10)

MAE_{rot} = \frac{1}{N} \sum_{t=1}^{N} \left| \Delta\varphi_{vis}(t) - \Delta\varphi_{GPS}(t) \right|,  (11)

MSE_{rot} = \frac{1}{N} \sum_{t=1}^{N} \left( \Delta\varphi_{vis}(t) - \Delta\varphi_{GPS}(t) \right)^2,  (12)

where

\Delta\varphi(t) = \arccos \frac{p(t) \cdot p(t+1)}{|p(t)|\,|p(t+1)|}.  (13)

The calculated absolute error for the 2nd trajectory (Fig. 12) shows a mean value of 0.23 m in translation and 4.15 deg in rotation. The mean absolute errors calculated for the 4th trajectory are 1.06 m and 21.75 deg, respectively.

Figure 12. Translation (left) and rotation (right) absolute error: 2nd trajectory.
Figure 13. Translation (left) and rotation (right) absolute error: 4th trajectory.

4. Conclusions
The paper presented a method of solving a UAV's own position based on image data, as well as the UAV navigation process supported by the developed method. As the results showed, a MiniUAV equipped with a small non-metric camera, using the proposed method, is able to continuously plot its own trajectory based only on image data. This opens new areas for navigating platforms: areas where traditional satellite navigation systems suffer from either a loss of signal or strong