Article Type: Research Paper

Authors

1 Associate Professor, Faculty of New Sciences and Technologies, University of Tehran, Tehran, Iran

2 PhD Student, Faculty of New Sciences and Technologies, University of Tehran, Tehran, Iran

Abstract

This article presents an approach for localizing an aerial robot in a confined space using cameras with a wide field of view. First, a six-degree-of-freedom dynamic simulation of the vehicle is carried out in Simulink, and its motion is rendered in a virtual environment built in 3DsMAX. Several fisheye cameras are placed at different points in this environment, and the images they capture of the flying robot are analyzed offline using the OpenCV libraries. For the extrinsic calibration of the cameras, a known pattern of bright points with known coordinates is imaged, and a method based on Perspective-n-Point recovers the orientation and position of the cameras themselves. The results indicate that the vehicle's position can be estimated with an accuracy of about 4 cm. Notably, the proposed method could also be applied to the navigation of space robots and Mars rovers.
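As an illustration of the extrinsic-calibration idea, the sketch below recovers a camera's orientation and position from known beacon coordinates and their ideal (undistorted, normalized) image projections using the classic Direct Linear Transform. This is a generic stand-in for a PnP solver, not the authors' implementation; all names are ours, and a fisheye camera would first need its projections undistorted to this pinhole model:

```python
import numpy as np

def dlt_pose(world_pts, image_pts):
    """Recover camera rotation R and center C from >= 6 known 3-D
    beacon points and their normalized pinhole projections (u, v).

    Direct Linear Transform: each correspondence contributes two
    linear equations in the 12 entries of the projection matrix P;
    the solution is the right singular vector of the smallest
    singular value.
    """
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    P = Vt[-1].reshape(3, 4)          # projection matrix, up to scale/sign
    M, p4 = P[:, :3], P[:, 3]
    C = -np.linalg.solve(M, p4)       # camera center: P @ [C, 1] = 0
    U, _, Vt2 = np.linalg.svd(M)      # orthogonalize M into a rotation
    R = U @ Vt2
    if np.linalg.det(R) < 0:          # resolve the overall sign ambiguity
        R = -R
    return R, C
```

With noiseless synthetic data the pose is recovered essentially exactly; with real (noisy) blob centroids this linear estimate is usually refined by a nonlinear reprojection-error minimization.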

Keywords

Subjects

Article Title [English]

Localization of Aerial Robot Based on Fisheye Cameras in a Virtual Lab

Authors [English]

  • MohammadAli Amiri Atashgah 1
  • Seyyed Mohammad-Jafar Tabib 2

1 Associate Professor, Faculty of New Sciences and Technologies, University of Tehran, Tehran, Iran

2 PhD Student, Aerospace Engineering Department, Faculty of New Sciences and Technologies, University of Tehran, Tehran, Iran

Abstract [English]

This research presents the localization of an aerial robot using wall-mounted fisheye cameras in a simulation environment. The virtual testbed is a quadrotor simulated in MATLAB Simulink. The simulation outputs, recorded as flight logs, drive a virtual lab developed in 3DsMAX. Virtual fisheye cameras (here, two) are installed at different points on the walls, and their images are gathered offline and processed with OpenCV in a C++ environment. For extrinsic calibration, each fisheye camera images a known pattern consisting of lights placed in the virtual lab; a Perspective-n-Point method applied to these images yields the precise orientation and position of each camera. The aerial robot is then localized by computing the point nearest to the two lines of sight. In brief, the outcomes exhibit an accuracy of 4 cm at the center of the virtual room.
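The final triangulation step — the point nearest to the two lines of sight — reduces to a small linear least-squares problem: minimizing the summed squared distances to the lines gives the normal equations (Σᵢ Pᵢ) x = Σᵢ Pᵢ aᵢ, where Pᵢ = I − dᵢdᵢᵀ projects orthogonally to line i. A minimal generic sketch (variable names are ours, not the authors'):

```python
import numpy as np

def closest_point_to_lines(points, dirs):
    """Least-squares point nearest to a set of 3-D lines.

    Line i passes through points[i] with direction dirs[i]. The
    squared distance from x to line i is ||P_i (x - a_i)||^2 with
    P_i = I - d_i d_i^T; summing over lines and setting the gradient
    to zero yields the 3x3 linear system solved below.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for a, d in zip(points, dirs):
        d = np.asarray(d, float) / np.linalg.norm(d)  # unit direction
        P = np.eye(3) - np.outer(d, d)                # projector off the ray
        A += P
        b += P @ np.asarray(a, float)
    return np.linalg.solve(A, b)
```

For exactly two intersecting rays this returns the intersection; for skew rays (the usual case with measurement noise) it returns the midpoint-like point minimizing the distance to both, and it generalizes unchanged to more than two cameras.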

Keywords [English]

  • Virtual Environment
  • Visual Navigation
  • Fisheye Calibration
  • Fisheye Camera
  • Perspective-n-Point
  • Aerial Robot