Headed by Prof. Davide Scaramuzza, the Robotics and Perception Group works at the intersection of robotics, computer vision, machine learning, and neuroscience. The lab develops artificial intelligence algorithms that enable autonomous drones to fly better and faster than human pilots using only onboard cameras and computation. Indeed, current commercial drones are effectively blind: they navigate using GPS or a human pilot, which prevents their use in complex missions such as search and rescue, cargo delivery, flying cars, and the inspection of bridges or power lines. Equipped with cameras, drones can navigate even in the absence of GPS, for example indoors, under bridges, or beneath tree canopies.

Another aspect the lab investigates is fast navigation. Because a drone's battery typically lasts only about 30 minutes, drones must fly faster to accomplish more within that limited flight time. Doing so, however, requires faster sensors and algorithms. One of the key sensors the lab's drones use is an "event camera", a novel high-speed sensor with much lower latency (the delay between a change in the scene and the moment the sensor reports it) and a higher dynamic range than standard cameras. However, these cameras function very differently from conventional cameras, so new algorithms must be developed for them.
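Unlike a conventional camera, which outputs full frames at a fixed rate, an event camera outputs an asynchronous stream of per-pixel brightness changes. A minimal sketch of that data model, assuming a simple tuple-like event representation (the `Event` type and `accumulate_events` function here are illustrative, not the lab's actual code):

```python
from dataclasses import dataclass

import numpy as np

@dataclass
class Event:
    """A single event: pixel location, timestamp, and sign of brightness change."""
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp in seconds (event cameras resolve microseconds)
    polarity: int   # +1 for brightness increase, -1 for decrease

def accumulate_events(events, width, height):
    """Naively accumulate event polarities into a 2D image-like array.

    Illustrative only: real event-based algorithms process the stream
    asynchronously instead of collapsing it into frames.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for e in events:
        frame[e.y, e.x] += e.polarity
    return frame

# Example: three events at two pixels, microseconds apart
events = [
    Event(2, 1, 0.000010, +1),
    Event(2, 1, 0.000025, +1),
    Event(0, 0, 0.000030, -1),
]
frame = accumulate_events(events, width=4, height=3)
print(frame[1, 2])  # 2
print(frame[0, 0])  # -1
```

Because events arrive with microsecond timestamps only where the scene changes, algorithms built on this representation can react with far lower latency than frame-based pipelines, which is why new methods are needed.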
Watch the following video and listen to Davide Scaramuzza explaining his lab's research work.
Watch more videos here:
Learning High-Speed Flight in the Wild (Science Robotics, 2021)
AI Drone faster than Humans? Time-Optimal Planning for Quadrotor Waypoint Flight
Deep Drone Acrobatics (RSS 2020)
Dynamic Obstacle Avoidance for Quadrotors with Event Cameras (Science Robotics, 2020)
The Foldable Drone: A Morphing Quadrotor that can Squeeze and Fly
TimeLens: Event-based Video Frame Interpolation (CVPR 2021)