Design a Small UAV with Autonomous Navigation Challenge

Overview and Background

Autonomy in unmanned aerial vehicles (UAVs), commonly known as drones, is both a desirable goal and a necessary step in advancing UAV technology. UAVs are used by hobbyists, the military, and commercial businesses for a variety of applications. In the military, for example, UAVs are deployed for Intelligence, Surveillance and Reconnaissance (ISR) missions. In commercial sectors, UAVs serve a wide range of functions, including transport monitoring, pipeline and border patrol surveillance, monitoring traffic congestion, assessing damage to buildings after a natural disaster, and delivering packages, to name a few. Autonomy can add a new paradigm for intelligent unmanned systems and opens doors for new applications and tactics. Rapid progress in autonomous flight has been demonstrated in large military UAVs, including advances in semi-autonomous and multi-functional capabilities; however, very limited progress has been made in small UAVs, primarily due to their lack of on-board power and payload capacity. Given the Army’s vision of augmenting small military units, individual soldiers will need small UAVs (weighing from a few tens of grams up to 5 kg); advances in small-UAV autonomous flight technology are therefore essential.

One critical function that a UAV must possess is a “sense and avoid” capability. Large systems utilize a suite of sophisticated sensors such as radar, LIDAR, and Electro-Optical/Infrared (EO/IR) cameras, and also carry significant on-board computational power to analyze objects around them (i.e., terrain, trees, and other environmental or urban elements) in real time while in flight. Such sensor suites are currently prohibitive on small UAVs due to weight and space constraints.

Problem Statement

The purpose of this challenge is to find a solution that provides a “sense and avoid” capability for a small, lightweight UAV system. Navigating through uncharted natural and man-made environments without colliding with objects is challenging because the system must build a three-dimensional representation of the terrain and environment relative to the UAV’s instantaneous position while in flight. In humans, the eyes (our optical sensors) are offset by a few inches, which creates so-called parallax: a small displacement of an object when viewed along the two different lines of sight (one through each eye). The amount of displacement (larger for close objects, smaller for distant ones) can be used to gauge the distance to an object and to build a three-dimensional representation of the landscape or object in front of the sensors (eyes). This is often referred to as depth perception.
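The parallax idea above is commonly modeled with the pinhole stereo relation Z = f·B/d, where the baseline B plays the role of the spacing between our eyes. A minimal sketch in Python (all parameter names and values here are illustrative assumptions, not requirements of the challenge):

```python
# Minimal sketch of depth-from-disparity ("parallax") for a stereo camera pair.
# Assumed, hypothetical parameters:
#   focal_px     - camera focal length, in pixels
#   baseline_m   - distance between the two camera centers, in meters
#   disparity_px - horizontal pixel offset of the same object in the two images

def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Pinhole stereo model: depth Z = f * B / d.
    A larger disparity (offset) means a closer object, mirroring human depth perception."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (object visible in both views)")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 10 cm baseline, 35 px disparity -> object ~2.0 m away
print(depth_from_disparity(35.0, 700.0, 0.10))
```

Note the trade-off this exposes for a small UAV: a wider baseline improves depth resolution at long range but directly conflicts with the airframe's width limit.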

Ideation Challenge:
Develop a Small Light-Weight UAV/System with Autonomous Navigation

The objective of this challenge is to develop a small light-weight UAV/system that can autonomously navigate an obstacle course using only optical camera(s).

Please submit your high-level thoughts on the following:

Design a new small UAV platform and flight/remote controller for autonomous flight with sense and avoid capabilities. This can be done by redesigning any aspect of the equipment including, but not limited to, the platform, the robotics, the cameras, the controllers, the processors, etc. Be creative! The only restrictions are that the design must be under 5 kg in weight and under 18 inches in width.

As you think about your response please consider the following:

  1. Describe the system’s hardware and/or unique software components (platform, camera(s), controllers, computers, unique software modules, batteries, etc.).
  2. Describe how the system will see (sense) and understand objects (terrain, trees, solid obstructions, etc.). That is, how will you achieve things like parallax, size/depth perception, distance, etc., while in flight, or have you figured out other ways to “sense”?
  3. Describe how and/or where the system will process the camera/optical/sensing data and object information. Also, describe how much time this will take.
  4. Describe how the system will take the optical data (described in 3 above) and feed it back to the flight controller(s) as necessary to avoid objects in flight.
  5. Describe how the system will achieve autonomous flight through an obstacle course from one point to another (start and end points 10 feet off the ground and 100 feet apart), including how long it might take to complete the course.
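To make the sense → process → feed-back chain in the questions above concrete, here is a deliberately simplified control-loop sketch. It assumes the vision pipeline has already reduced each frame to per-sector minimum clearances (left, center, right, in meters); the threshold value and command names are hypothetical, and a real system would use a proper planner rather than these coarse rules:

```python
# Hypothetical sketch of the feedback path from optical sensing to flight control:
# per-sector minimum depths (meters) in, a coarse steering command out.

SAFE_DISTANCE_M = 3.0  # assumed minimum clearance threshold (illustrative)

def avoid_command(depth_left: float, depth_center: float, depth_right: float) -> str:
    """Return a coarse steering command for the flight controller."""
    if depth_center >= SAFE_DISTANCE_M:
        return "FORWARD"            # path ahead is clear; continue toward the goal
    # Obstacle ahead: if a side sector has clearance, yaw toward the more open one
    if max(depth_left, depth_right) < SAFE_DISTANCE_M:
        return "CLIMB"              # boxed in laterally; go over the obstruction
    return "YAW_LEFT" if depth_left > depth_right else "YAW_RIGHT"

print(avoid_command(5.0, 6.0, 4.0))   # clear ahead
print(avoid_command(4.0, 1.5, 2.0))   # obstacle ahead, more room on the left
print(avoid_command(2.0, 1.0, 1.5))   # obstacle ahead, no lateral clearance
```

The end-to-end latency of this loop (sensing, disparity processing, and command output) bounds the UAV's safe forward speed, which in turn drives the time estimate requested in question 5 for the 100-foot course.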