"Artificial Intelligence Based Autonomous Flight and Control System for Unmanned Aerial Vehicles"


Zeren U., Tan Y., YILDIZ A.

Project Supported by Higher Education Institutions, Other University-Supported Projects, 2020 - 2021

  • Project Type: Project Supported by Higher Education Institutions
  • Support Program: Other University-Supported Projects
  • Start Date: November 2020
  • End Date: September 2021

Project Summary

"The main aim of this study, planned within the scope of the TED University Electrical and Electronics Engineering Department Senior Project, is to create a control system that enables both manual and artificial intelligence based autonomous operation of unmanned aerial vehicles. The hardware work consists of two parts: a vehicle brain unit and a sensor fusion unit that will be integrated with it. The software work covers artificial intelligence, machine learning, computer vision, and image processing algorithms. The purpose of this study is to utilize these concepts, which play extremely important roles in today's technology, effectively. The initial objectives of the project are highly promising for further development: the planned hardware architecture will provide a ready integration environment for different artificial intelligence and image processing activities in the future.

In the first stage, the main tasks of the project were determined as real-time object detection and real-time object tracking. These objectives were chosen so that the performance of the system could be evaluated, and they will also offer a broader application area. Another main objective is that this system, which can be used in the defense industry and aviation, can work in harmony with different sensors and other subsystems in the future.

The main processing unit, which can be hardware-integrated on the vehicle, is the brain component of the system. This component, where the artificial intelligence and computer vision activities take place, processes the images taken from the dual cameras. Computer vision models running in real time on the same images will carry out the operations required for the autonomous functions. Meanwhile, the unit is in active communication with the operator and, in manual usage mode, will guide the navigation of the vehicle according to the commands coming from the operator.

The ground control unit is the component through which the system is made available to the operator. It processes the commands given by the operator through the controls on it and transmits them to the main processing unit. At the same time, it is capable of presenting the video it receives from the main processing unit, together with data about the active artificial intelligence applications (such as detected assets and suitable landing areas), to the operator through its screen. With the harmonious and synchronized operation of these two units, the aim is to create a user-friendly system that allows the operator to take an active role in the artificial intelligence based activities."
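The first-stage tasks of real-time object detection and real-time object tracking can be illustrated with a minimal tracking-by-detection sketch. The greedy IoU association, the 0.3 threshold, and the `Tracker` class below are illustrative assumptions, not the project's actual implementation; in the real system, the per-frame detections would come from the onboard computer-vision model.

```python
# Minimal tracking-by-detection sketch: each new detection either extends
# an existing track (highest intersection-over-union overlap) or starts a
# new one. Boxes are (x1, y1, x2, y2). The threshold and the greedy
# matching are illustrative choices only.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

class Tracker:
    """Greedy IoU tracker keeping a persistent id per tracked object."""

    def __init__(self, iou_threshold=0.3):
        self.iou_threshold = iou_threshold
        self.tracks = {}   # track_id -> last known box
        self.next_id = 0

    def update(self, detections):
        """Associate this frame's detections with tracks; return id -> box."""
        assigned = {}
        free = set(self.tracks)          # tracks not yet matched this frame
        for box in detections:
            # Pick the unmatched track with the highest IoU overlap.
            best_id, best_iou = None, self.iou_threshold
            for tid in free:
                overlap = iou(self.tracks[tid], box)
                if overlap > best_iou:
                    best_id, best_iou = tid, overlap
            if best_id is None:          # no sufficient overlap: new track
                best_id = self.next_id
                self.next_id += 1
            else:
                free.discard(best_id)
            self.tracks[best_id] = box
            assigned[best_id] = box
        return assigned
```

Feeding the tracker one frame's detections at a time preserves object identities across frames as long as the boxes overlap sufficiently, which is the property the project's performance evaluation would exercise.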
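The command link between the ground control unit and the main processing unit can be sketched as a small fixed-format message. The 10-byte frame layout, the opcode, and the additive checksum below are hypothetical, chosen only to show one way operator commands could be framed; the project's actual protocol is not described in the summary.

```python
import struct

# Hypothetical 10-byte command frame: opcode, then roll/pitch/yaw/throttle
# as signed 16-bit little-endian values, then a one-byte additive checksum.
# Layout and opcode values are illustrative only.
FMT = "<BhhhhB"      # opcode, 4 x int16 axes, checksum
OP_MANUAL = 0x01      # manual flight command from the operator

def encode_command(roll, pitch, yaw, throttle, opcode=OP_MANUAL):
    """Pack one operator command into a checksummed byte frame."""
    body = struct.pack("<Bhhhh", opcode, roll, pitch, yaw, throttle)
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

def decode_command(frame):
    """Unpack a frame, verifying the checksum before trusting the fields."""
    opcode, roll, pitch, yaw, throttle, checksum = struct.unpack(FMT, frame)
    if sum(frame[:-1]) & 0xFF != checksum:
        raise ValueError("checksum mismatch")
    return {"opcode": opcode, "roll": roll, "pitch": pitch,
            "yaw": yaw, "throttle": throttle}
```

A fixed binary layout with a checksum keeps the per-command overhead small and lets the brain unit reject corrupted frames, which matters on a lossy air-to-ground link; established UAV protocols such as MAVLink follow the same framed-message idea with stronger integrity checks.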