Self-Controlled Drone.



EOI: 10.11242/viva-tech.01.04.094




Citation

Ms. Saurabh Guchait, Ms. Mayuri Dudam, Ms. Mohammed Aquib Khan, Prof. Pratik Parsewar, "Self-Controlled Drone.", VIVA-IJRI Volume 1, Issue 4, Article 94, pp. 1-5, 2021. Published by Computer Engineering Department, VIVA Institute of Technology, Virar, India.

Abstract

We have all seen a drone at some point in our lives. Drones are unmanned vehicles; to a layperson, a drone is usually a 'quad-copter', which is essentially a small helicopter with four rotors attached to it. In this project we use a similar quad-copter. Surprisingly, the first manually controlled quad-copter was built by Etienne Oehmichen in the early 1920s, whereas today's quad-copters are typically flown by remote control. In recent years, with the rise of artificial intelligence and machine learning, drones have emerged that fly largely on their own, with little human involvement; these are known as autonomous drones. The problem with traditional autonomous drones is that they rely on ultrasonic sensors, which lengthens the time the drone needs to react to an obstacle and thereby increases its latency. This makes such drones a poor choice for public places. To tackle this problem we propose our project, 'Self-Controlled Drone', in which we build an autonomous drone that uses a camera sensor instead of an ultrasonic sensor. Machine learning and artificial intelligence algorithms, together with image-processing programs, allow the drone to work autonomously by taking input from the camera sensor, processing that input, and reacting accordingly. Using the camera sensor reduces the drone's latency and shortens its reaction time. In addition, to navigate the drone from one place to another we use a GPS module, which guides the drone. Building this drone will bring a change in drone culture and will help make drones safer and more efficient to use in public spaces.
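As a rough illustration of the perceive-process-react loop described in the abstract, the Python sketch below reads frames from a camera, applies a simple image-processing check for a nearby obstacle, and decides whether to continue toward the next GPS waypoint. This is only a minimal sketch under assumed details: the paper does not publish its algorithms, and the camera index, thresholds, and the obstacle_detected() heuristic here are illustrative placeholders, not the authors' implementation.

# Minimal sketch of the camera-based perceive-process-react loop.
# Camera index, thresholds, and the obstacle heuristic are assumptions
# made for illustration; they are not taken from the paper.
import cv2

def obstacle_detected(frame, area_threshold=0.2):
    # Crude heuristic: treat any large contour in the frame as a
    # possible near-field obstacle.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    frame_area = frame.shape[0] * frame.shape[1]
    return any(cv2.contourArea(c) / frame_area > area_threshold
               for c in contours)

def main():
    cap = cv2.VideoCapture(0)  # on-board camera sensor (index assumed)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if obstacle_detected(frame):
                # A real drone would command an avoidance manoeuvre
                # through its flight controller; here we only log it.
                print("obstacle ahead - adjust course")
            else:
                print("path clear - continue toward GPS waypoint")
    finally:
        cap.release()

if __name__ == "__main__":
    main()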

Keywords

Algorithms, Artificial Intelligence, Autonomous, Drone, Machine Learning, Microcontroller, Obstacle Detection, Programming, Quad-Copter, Sensors.
