This project is one component of the Dronehook NASA grant to build an autonomous plane. A successful program in this case is an all-in-one software package that takes camera data as input and outputs a target direction so the plane can correct its trajectory.
Originally, I wrote the code to find a large red weather balloon: filled with helium, the balloon would serve as the plane's target. Using OpenCV in Python, I segmented hues within a narrow red range and applied contour detection to find the largest circular object. At altitude, another large red object is unlikely to be in view, so I prioritized code simplicity. In the final version of the code, I transitioned to ArUco markers for their smaller size and lower false-positive rate, using the X/Y pixel position reported by OpenCV's ArUco marker detection module to provide guidance to the plane.
The system runs entirely on an NVIDIA Jetson Nano with a Raspberry Pi camera and a telephoto lens. I measured 0.22 s of latency between the marker's appearance and the system's recognition of it, with a maximum reliable detection distance of 80 ft. Beyond the original project goals, I also integrated flight surface control over USB from the Jetson to the plane's flight controller, so the plane both identified and physically reacted to the target in the lab. In the final stage, with the camera mounted on the plane, I measured 0.22 s of latency between a new camera frame arriving and movement of the plane's ailerons.
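The guidance step between detection and the flight controller amounts to converting the marker's pixel offset from image center into a correction command. The proportional law below is a hypothetical sketch of that mapping, not the project's actual control code: the gain, deflection limit, and frame size are placeholder values, and a real system would send the result to the flight controller over USB rather than return it.

```python
def steer_toward_marker(marker_xy, frame_size=(1280, 720),
                        gain=0.1, max_deflect=30.0):
    """Map a marker's pixel offset from image center to (roll, pitch)
    commands in degrees. Purely proportional; gain and limits are
    illustrative placeholders, not flight-tested values."""
    cx, cy = frame_size[0] / 2, frame_size[1] / 2
    x, y = marker_xy
    # Marker right of center -> positive roll (steer right);
    # marker above center -> positive pitch (nose up). Pixel y grows
    # downward, hence (cy - y).
    roll = max(-max_deflect, min(max_deflect, gain * (x - cx)))
    pitch = max(-max_deflect, min(max_deflect, gain * (cy - y)))
    return roll, pitch
```

Clamping the output keeps a far-off-center marker from commanding an extreme surface deflection; a fielded version would likely add smoothing or a full PID loop rather than raw proportional control.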