Visual-Inertial Navigation System Based on a New Approach to Tracking and Evaluating Image Features

Article Type: Research Article

Authors

1 Tarbiat Modares University, Faculty of Mechanical Engineering

2 Faculty of Mechanical Engineering, Tarbiat Modares University

Abstract

Over the past few decades, the use of a visual navigation system as an aid to the inertial navigation system in unmanned aerial vehicles has been the subject of extensive research. In this study, a new approach to the feature-tracking algorithm is presented in order to reduce the error of the inertial navigation system. In this approach, the information from the inertial navigation system, the feature points of the previous image, and the dynamic equations are used to predict the feature points. Moreover, to improve the accuracy of computing the terrain landmark points, undesirable predicted feature points are discarded by comparison with the output of the SIFT algorithm. To reduce the altitude error, a barometer is used alongside the vision system. The simulation results indicate the desirable accuracy of the vision-system and barometer observations in the update step of the extended Kalman filter, and the good performance of this integrated navigation system in determining the navigation parameters of an unmanned aerial vehicle.
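The prediction step described above — projecting the previous image's feature points into the current frame using the INS-derived relative pose — can be sketched as follows. This is a minimal illustration under simplifying assumptions (pinhole camera with known intrinsics `K` and known per-feature depths); the function and parameter names are hypothetical and not taken from the paper.

```python
import numpy as np

def predict_features(prev_pts, depths, K, R, t):
    """Project the previous frame's feature points into the current
    frame using the INS-derived relative pose (R, t).  Assumes a
    pinhole camera with intrinsics K and known feature depths
    (illustrative sketch, not the paper's exact formulation)."""
    pts_h = np.hstack([prev_pts, np.ones((len(prev_pts), 1))])  # homogeneous pixels
    rays = np.linalg.inv(K) @ pts_h.T * depths   # back-project to 3-D camera points
    cam_now = R @ rays + t.reshape(3, 1)         # express in the current camera frame
    proj = K @ cam_now
    return (proj[:2] / proj[2]).T                # perspective divide -> pixels

# sanity check: with no camera motion the predicted locations
# coincide with the previous feature locations
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
pts = np.array([[100.0, 120.0], [300.0, 200.0]])
pred = predict_features(pts, np.array([50.0, 60.0]), K, np.eye(3), np.zeros(3))
```

In the paper's pipeline, predictions of this kind would then be compared with SIFT detections so that poorly predicted points can be discarded.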

Keywords


Article Title [English]

Visual-Inertial Navigation System Based on a New Approach to Tracking and Investigating Image Feature Points

Authors [English]

  • Mohammadvali Arbabmir 1
  • Masoud Ebrahimi 2
1 Tarbiat Modares University, Faculty of Mechanical Engineering
2 Tarbiat Modares University
Abstract [English]

In recent decades, the visual navigation system has been investigated by many researchers as an aid to the Inertial Navigation System (INS) in Unmanned Aerial Vehicles (UAVs). In this research, a new approach based on a feature-tracking algorithm is used to reduce the INS errors. In this approach, the INS states, the feature points of the previous image, and the dynamic equations are used to estimate the feature points in the current image. Also, to improve the estimation of terrain points, the outlier estimated feature points are deleted by comparison with the output of the SIFT algorithm. Furthermore, to reduce the altitude error, a barometer is used alongside the mentioned visual navigation system. The simulation results illustrate the desirable accuracy of the vision-system and barometer observations in the update step of the Extended Kalman Filter (EKF) and the remarkable performance of the integrated navigation system in computing the UAV navigation parameters.
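The EKF update step mentioned in the abstract — fusing the camera and barometer observations with the INS prediction — follows the standard measurement-update equations. The sketch below shows that update in generic form; the toy state (altitude and altitude rate, with a barometer measuring altitude) and all names are illustrative assumptions, not the paper's actual state vector.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R_meas):
    """Standard EKF measurement update.  In the integrated system, z
    would stack the camera feature observations with the barometric
    altitude, and H would be the measurement Jacobian evaluated at
    the current state estimate x (illustrative sketch)."""
    y = z - h(x)                          # innovation
    S = H @ P @ H.T + R_meas              # innovation covariance
    K_gain = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K_gain @ y
    P_new = (np.eye(len(x)) - K_gain @ H) @ P
    return x_new, P_new

# toy example: state = [altitude, altitude rate], barometer observes altitude
H = np.array([[1.0, 0.0]])
h = lambda s: H @ s
x_new, P_new = ekf_update(np.zeros(2), np.eye(2), np.array([2.0]),
                          h, H, np.array([[1.0]]))
```

With unit prior and measurement covariances, the update splits the 2 m innovation evenly, moving the altitude estimate to 1 m and halving its variance, while the unobserved altitude-rate state is unchanged.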

Keywords [English]

  • INS
  • Visual Navigation
  • Feature Tracking
  • Barometer
  • EKF