My research pushes the boundaries of autonomous operation of agile micro aerial vehicles through the design of robust and lightweight perception algorithms. My work combines rigorous theory with practical implementation: I bring new theoretical tools to the robotics community (e.g., convex relaxations, spectral graph theory, distributed computing, compressive sensing) and demonstrate their impact in real-world applications. My previous research provided fundamental insights and performance guarantees for robot localization and mapping. My current and future research will redesign the landscape of sensing and perception for resource-constrained robots, with a special focus on swarms of micro and nano aerial vehicles.

Autonomous robots are becoming a pervasive technology and will be a critical asset in addressing some of the major societal challenges of the next decades. Unmanned aerial vehicles (UAVs) for crop monitoring and spraying in precision agriculture will enable early disease detection and timely intervention, reducing yield loss and helping to cope with increasing food demand. Fast and agile UAVs will deliver medical supplies (e.g., vaccines) to rural areas and underdeveloped countries, fighting the spread of disease and creating new opportunities for global welfare. Autonomous robots capable of moving dexterously among obstacles will be an invaluable support for search and rescue and disaster response. Inexpensive and lightweight platforms will collect large amounts of environmental data, providing new perspectives on, and actionable understanding of, phenomena such as hurricanes and floods. A virtually endless list of applications, including infrastructure inspection, transportation, monitoring, construction, and entertainment, makes robotics a multi-billion-dollar market in rapid expansion.

A central challenge towards this vision is the design of robust and lightweight perception algorithms, which translate raw sensor data into a coherent world representation, enabling on-board situational awareness and high-level decision-making. Perception remains a bottleneck in the deployment of robotic systems: most existing applications either rely on low-level supervision by a human (e.g., robots used for bomb disposal or disaster response) or operate in structured environments (e.g., fenced areas on factory floors equipped with external markers or cameras for localization and guidance).

My research aims to bring autonomous robots into the real world. During my Ph.D. I addressed robust robot perception. Traditional algorithms for localization and mapping are fragile, rely on careful parameter tuning, and are prone to failure in off-nominal conditions (e.g., large sensor noise, outliers). I demonstrated that, using tools from nonlinear optimization (e.g., convex relaxation, Lagrangian duality), graph theory (e.g., cycle spaces, spectral graph theory), Riemannian geometry, and probabilistic inference, one can design faster and more robust algorithms that are less sensitive to parameter tuning and adverse environmental conditions. These algorithms have been implemented in popular robotics software libraries and are used by universities and companies. Robust perception algorithms relax the requirement of human supervision, making robot deployment cheaper and making it possible to scale to large teams of cooperative robots in real scenarios.
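As a toy illustration of this robustness principle (a minimal sketch only, not one of the actual estimators from my work, and using an off-the-shelf solver rather than my own algorithms), replacing the quadratic cost of ordinary least squares with a robust Huber loss makes a simple estimation problem insensitive to gross measurement outliers:

```python
import numpy as np
from scipy.optimize import least_squares

# Estimate a scalar state from noisy measurements, a few of which are
# gross outliers (e.g., spurious loop closures or sensor glitches).
rng = np.random.default_rng(0)
true_x = 5.0
z = true_x + 0.1 * rng.standard_normal(50)  # inlier measurements
z[:5] = 100.0                               # inject 5 gross outliers

def residuals(x):
    # Residual of each measurement with respect to the current estimate.
    return x - z

# Plain least squares: the quadratic cost lets outliers dominate,
# so the estimate is dragged far from the true value.
plain = least_squares(residuals, x0=[0.0]).x[0]

# Robust least squares: the Huber loss grows only linearly for large
# residuals, so the outliers have bounded influence on the estimate.
robust = least_squares(residuals, x0=[0.0], loss="huber", f_scale=0.3).x[0]

print(f"plain LS:  {plain:.2f}")   # pulled toward the outliers
print(f"robust LS: {robust:.2f}")  # close to the true value
```

The same mechanism, applied to the much higher-dimensional cost functions of pose-graph optimization, is one simple way to obtain estimators that degrade gracefully in the presence of outliers.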

My current and future research will enable autonomous navigation of resource-constrained platforms, with a special focus on swarms of agile micro (MAV) and nano (NAV) aerial vehicles. Faster operational speed means more efficient task completion, which is crucial in time-critical applications (e.g., search and rescue). Agility, in particular, is a key requirement for indoor operation, pushing towards the adoption of smaller platforms. While the use of multiple small MAVs appears to be a desirable alternative to more expensive monolithic solutions, deploying these platforms in the real world poses formidable challenges. The limited payload and power budget constrain onboard computation and sensing, preventing the use of information-rich sensors such as lidars and depth cameras. Moreover, the adoption of a large number of vehicles severely limits the communication bandwidth available to each vehicle. Finally, platforms with fast dynamics force perception algorithms to operate in a very challenging regime (motion blur, sub-sampled data, high-rate and low-latency estimation for fast closed-loop control). These challenges require a paradigm shift and open a number of research directions: dealing with very sparse sensor data (sparse sensing), designing algorithms that selectively process only the sensor data relevant to the task at hand (perceptual attention), and designing algorithms that are aware of the on-board resources of the platform (algorithm-and-hardware co-design).

My research will have a broader impact on the robotics ecosystem, beyond the micro-aerial-vehicle domain. The study of resource-constrained perception systems will empower bio-inspired robots (e.g., robotic insects) with advanced navigation capabilities. Moreover, it will impact all domains in which sensing is limited (e.g., endoscopic surgery). Finally, it will promote the use of more affordable sensors in safety-critical applications (e.g., self-driving cars) by leveraging a tighter integration of sensing, perception, and control.