A variety of open-source software tools are currently available to help build autonomous mobile robots. These tools have proven their effectiveness in developing different types of robotic systems, but there are still needs related to safety and efficiency that are not sufficiently covered. This article describes recent advances in the Aerostack software framework to address part of these needs, which may become critical in the case of aerial robots. The article describes a software tool that helps developers build the executive system, an important component of the control architecture whose characteristics significantly affect the quality of the final autonomous robotic system. The presented tool uses an original solution for execution control that aims at simplifying mission specification and protecting against errors, while also considering the efficiency needs of aerial robots. The effectiveness of the tool was evaluated by building an experimental autonomous robot. The results of the evaluation show that it provides significant benefits in usability and reliability with acceptable development effort and computational cost. The tool is based on the Robot Operating System (ROS) and is publicly available as part of the latest release of the Aerostack software framework (version 3.0).
Martin Molina; Abraham Carrera; Alberto Camporredondo; Hriday Bavle; Alejandro Rodriguez-Ramos; Pascual Campoy. Building the executive system of autonomous aerial robots using the Aerostack open-source framework. International Journal of Advanced Robotic Systems 2020, 17(3), 1.
Indoor environments contain abundant high-level semantic information, which can give robots a better understanding of their surroundings and reduce the uncertainty in their pose estimates. Although semantic information has proved to be useful, the research community still faces several challenges in accurately perceiving, extracting, and utilizing such semantic information from the environment. To address these challenges, in this paper we present a lightweight, real-time visual semantic SLAM framework running on board aerial robotic platforms. This novel method combines low-level visual/visual-inertial odometry (VO/VIO) with geometrical information corresponding to planar surfaces extracted from detected semantic objects. Extracting planar surfaces from selected semantic objects provides enhanced robustness and makes it possible to rapidly and precisely improve the metric estimates, while simultaneously generalizing to several object instances irrespective of their shape and size. Our graph-based approach can integrate several state-of-the-art VO/VIO algorithms along with state-of-the-art object detectors in order to estimate the complete 6DoF pose of the robot while simultaneously creating a sparse semantic map of the environment. No prior knowledge of the objects is required, which is a significant advantage over other works. We test our approach on a standard RGB-D dataset, comparing its performance with state-of-the-art SLAM algorithms. We also perform several challenging indoor experiments validating our approach under distinct environmental conditions, and furthermore test it on board an aerial robot.
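A minimal sketch of the core idea behind using mapped planar landmarks to bound odometry drift: the offset between a semantic plane's stored height and its freshly observed height (in the VO frame) is fed back into the pose estimate. All names are illustrative; the paper itself performs a full graph optimization over planar landmark factors rather than this scalar correction.

```python
def correct_vertical_drift(vo_z, observed_plane_z, mapped_plane_z, gain=1.0):
    """Correct the vertical VO pose estimate using a mapped semantic plane.

    The mapped plane is treated as drift-free: any vertical offset between
    its stored height and the height observed in the current VO frame is
    attributed to odometry drift and subtracted from the pose estimate.
    """
    drift = observed_plane_z - mapped_plane_z
    return vo_z - gain * drift
```

With `gain=1.0` the correction fully trusts the landmark; a smaller gain blends the map and the odometry, which is closer in spirit to the weighted factors used in graph-based SLAM.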
Hriday Bavle; Paloma De La Puente; Jonathan P. How; Pascual Campoy. VPS-SLAM: Visual Planar Semantic SLAM for Aerial Robotic Systems. IEEE Access 2020, 8, 60704-60718.
Deep- and reinforcement-learning techniques increasingly require large sets of real data to achieve stable convergence and generalization in the context of image recognition, object detection, or motion-control strategies. On this subject, the research community lacks robust approaches to compensate for the unavailability of extensive real-world data by means of realistic synthetic information and domain-adaptation techniques. In this work, synthetic-learning strategies have been used for the vision-based autonomous following of a noncooperative multirotor. The complete maneuver was learned with synthetic images and high-dimensional low-level continuous robot states, using deep- and reinforcement-learning techniques for object detection and motion control, respectively. A novel motion-control strategy for object following is introduced in which the camera gimbal movement is coupled with the multirotor motion during the following maneuver. The results confirm that the presented framework can be used to deploy a vision-based task in real flight using synthetic data. It was extensively validated in both simulated and real-flight scenarios, providing proper results (following a multirotor at up to 1.3 m/s in simulation and 0.3 m/s in real flights).
Alejandro Rodriguez-Ramos; Adrian Alvarez-Fernandez; Hriday Bavle; Pascual Campoy; Jonathan P. How. Vision-Based Multirotor Following Using Synthetic Learning Techniques. Sensors 2019, 19(21), 4794.
Execution control is a critical task of robot architectures which has a deep impact on the quality of the final system. In this study, we describe a general method for execution control, which is part of the Aerostack software framework for aerial robotics, and present the technical challenges for execution control and the design decisions behind the method. The proposed method has an original design combining a distributed approach for the execution control of behaviors (such as situation checking and performance monitoring) with centralized coordination to ensure the consistency of concurrent execution. We conducted experiments to evaluate the method. The experimental results show that the method is general and usable, with acceptable development effort, to work efficiently on different types of aerial missions. The method is supported by standards based on the Robot Operating System (ROS), contributing to its general use, and is integrated as an open-source project in the Aerostack framework. Therefore, its technical details are fully accessible to developers and freely available for use in the development of new aerial robotic systems.
Martin Molina; Alberto Camporredondo; Hriday Bavle; Alejandro Rodriguez-Ramos; Pascual Campoy. An execution control method for the Aerostack aerial robotics framework. Frontiers of Information Technology & Electronic Engineering 2019, 20(1), 60-75.
The lack of redundant attitude sensors represents a considerable yet common vulnerability in many low-cost unmanned aerial vehicles. In addition to the use of attitude sensors, exploiting the horizon as a visual reference for attitude control is part of human pilots' training. For this reason, and given the desirable properties of image sensors, a great deal of research has proposed the use of vision sensors for horizon detection in order to obtain redundant attitude estimation on board unmanned aerial vehicles. However, atmospheric and illumination conditions may hinder the operability of visible-light image sensors, or even make their use impractical, such as at night. Thermal infrared image sensors have a much wider range of operating conditions and their price has greatly decreased in recent years, making them an alternative to visible-spectrum sensors in certain operation scenarios. In this paper, two attitude estimation methods are proposed. The first method consists of a novel approach to estimate the line that best fits the horizon in a thermal image. The resulting line is then used to estimate the pitch and roll angles using an infinite horizon line model. The second method uses deep learning to predict attitude angles from the raw pixel intensities of a thermal image. For this, a novel Convolutional Neural Network architecture has been trained using measurements from an inertial navigation system. Both methods are proven to be valid for redundant attitude estimation, providing RMS errors below 1.7° and running at up to 48 Hz, depending on the chosen method, the input image resolution, and the available computational capabilities.
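The geometric step of the first method can be sketched as follows, assuming a pinhole camera and the infinite horizon line model: the line's tilt maps to roll, and its vertical offset from the principal point maps to pitch. Parameter names and sign conventions here are illustrative, not taken from the paper.

```python
import math

def attitude_from_horizon(slope, intercept, cx, cy, fy):
    """Estimate roll and pitch from a fitted horizon line y = slope*x + intercept.

    Assumes a pinhole camera with principal point (cx, cy) and vertical
    focal length fy in pixels (illustrative convention; signs depend on
    the camera mounting).
    """
    roll = math.atan(slope)                     # tilt of the line vs. the image x-axis
    y_at_center = slope * cx + intercept        # line height at the principal column
    pitch = math.atan((y_at_center - cy) / fy)  # vertical offset -> pitch angle
    return roll, pitch
```

A level horizon passing through the principal point yields zero roll and pitch; a 45° line yields a roll of π/4.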
Adrian Carrio; Hriday Bavle; Pascual Campoy. Attitude estimation using horizon detection in thermal images. International Journal of Micro Air Vehicles 2018, 10(4), 352-361.
Deep learning techniques for motion control have recently improved qualitatively, since the successful application of Deep Q-Learning to Atari-like games and its subsequent extension to continuous domains. Building on these ideas, the Deep Deterministic Policy Gradients (DDPG) algorithm was able to provide impressive results in continuous state and action domains, which are closely linked to most robotics-related tasks. In this paper, a vision-based autonomous multirotor landing maneuver on top of a moving platform is presented. The behaviour has been completely learned in simulation, without prior human knowledge, by means of deep reinforcement learning techniques. Since the multirotor is controlled in attitude, no high-level state estimation is required. The complete behaviour has been trained with continuous action and state spaces and has provided proper results (landing at a maximum velocity of 2 m/s). Furthermore, it has been validated in a wide variety of conditions, in both simulated and real-flight scenarios, using a low-cost, lightweight, out-of-the-box consumer multirotor.
Alejandro Rodriguez-Ramos; Carlos Sampedro; Hriday Bavle; Ignacio Gil Moreno; Pascual Campoy. A Deep Reinforcement Learning Technique for Vision-Based Autonomous Multirotor Landing on a Moving Platform. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018, 1010-1017.
Navigation in unknown indoor environments with fast collision-avoidance capabilities is an ongoing research topic. Traditional motion planning algorithms rely on precise maps of the environment, where re-adapting a generated path can be highly demanding in terms of computational cost. In this paper, we present a fast reactive navigation algorithm for multirotor aerial robots using Deep Reinforcement Learning. Taking as input the 2D laser range measurements and the relative position of the aerial robot with respect to the desired goal, the proposed algorithm is successfully trained in a Gazebo-based simulation scenario by adopting an artificial potential field formulation. A thorough evaluation of the trained agent has been carried out in both simulated and real indoor scenarios, showing the appropriate reactive navigation behavior of the agent in the presence of static and dynamic obstacles.
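The artificial potential field formulation mentioned above can be sketched as a reward signal: an attractive quadratic term pulls the agent toward the goal, and a repulsive term activates when the closest laser return falls inside an influence radius. The gains and distances below are illustrative assumptions, not the paper's actual training parameters.

```python
def potential_field_reward(goal_dist, min_obstacle_dist,
                           k_att=1.0, k_rep=0.5, d0=1.0):
    """Reward shaped as the negative of an artificial potential field.

    goal_dist: distance from the robot to the goal.
    min_obstacle_dist: closest range reading from the 2D laser scan.
    d0: influence distance of the repulsive term (illustrative value).
    """
    u_att = 0.5 * k_att * goal_dist ** 2                 # pulls toward the goal
    if min_obstacle_dist < d0:                            # repulsion only near obstacles
        u_rep = 0.5 * k_rep * (1.0 / min_obstacle_dist - 1.0 / d0) ** 2
    else:
        u_rep = 0.0
    return -(u_att + u_rep)                               # higher reward = lower potential
```

The reward is maximal (zero) at the goal with no nearby obstacle, and drops sharply as the robot approaches either an obstacle or a distant goal, which is the gradient the RL agent learns to follow.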
Carlos Sampedro; Hriday Bavle; Alejandro Rodriguez-Ramos; Paloma de la Puente; Pascual Campoy. Laser-Based Reactive Navigation for Multirotor Aerial Robots using Deep Reinforcement Learning. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018, 1024-1031.
In this paper we propose a particle filter localization approach for mini aerial robots, based on stereo visual odometry (VO) and semantic information from indoor environments. The prediction stage of the particle filter is performed using the 3D pose of the aerial robot estimated by the stereo VO algorithm. This predicted 3D pose is updated using inertial as well as semantic measurements. The algorithm processes semantic measurements in two phases: first, a pre-trained deep learning (DL) based object detector is used for real-time object detection in the RGB spectrum; second, from the corresponding 3D point clouds of the detected objects, we segment their dominant horizontal plane and estimate their relative position, also augmenting a prior map with new detections. The augmented map is then used to obtain a drift-free pose estimate of the aerial robot. We validate our approach in several real flight experiments, comparing it against ground truth and a state-of-the-art visual SLAM approach.
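The predict/update cycle described above can be sketched in 2D: particles are propagated with the VO-estimated motion plus noise, then reweighted by how well the predicted range to a mapped semantic landmark matches the measurement. This is a generic particle filter skeleton under assumed Gaussian noise models, not the paper's full 3D implementation.

```python
import math
import random

def predict(particles, delta, sigma):
    """Propagate each (x, y) particle with the VO motion increment plus noise."""
    return [(x + delta[0] + random.gauss(0, sigma),
             y + delta[1] + random.gauss(0, sigma)) for x, y in particles]

def update(particles, measured_range, landmark, sigma):
    """Weight particles by the likelihood of the range to a mapped landmark."""
    weights = []
    for x, y in particles:
        expected = math.hypot(landmark[0] - x, landmark[1] - y)
        err = measured_range - expected
        weights.append(math.exp(-0.5 * (err / sigma) ** 2))
    total = sum(weights) or 1.0          # guard against all-zero weights
    return [w / total for w in weights]

def resample(particles, weights):
    """Draw a new particle set proportionally to the normalized weights."""
    return random.choices(particles, weights=weights, k=len(particles))
```

One full cycle moves the particle cloud with the odometry and concentrates it where the semantic range measurement is consistent, which is what bounds the VO drift.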
Hriday Bavle; Stephan Manthe; Paloma de la Puente; Alejandro Rodriguez-Ramos; Carlos Sampedro; Pascual Campoy. Stereo Visual Odometry and Semantics based Localization of Aerial Robots in Indoor Environments. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018, 1018-1023.
This paper presents a fast and robust approach for estimating the flight altitude of multirotor Unmanned Aerial Vehicles (UAVs) using 3D point cloud sensors in cluttered, unstructured, and dynamic indoor environments. The objective is to present a flight altitude estimation algorithm that replaces conventional sensors such as laser altimeters, barometers, or accelerometers, which have several limitations when used individually. The proposed algorithm includes two stages: in the first stage, a fast clustering of the measured 3D point cloud data is performed, along with the segmentation of the clustered data into horizontal planes. In the second stage, these segmented horizontal planes are mapped based on their vertical distance with respect to the point cloud sensor frame of reference, in order to provide a robust flight altitude estimate even in the presence of several static as well as dynamic ground obstacles. We validate our approach using the IROS 2011 Kinect dataset available in the literature, estimating the altitude of the RGB-D camera from the provided 3D point clouds. We further validate our approach using a point cloud sensor on board a UAV, by means of several autonomous real flights, closing its altitude control loop with the flight altitude estimated by our proposed method, in the presence of several different static as well as dynamic ground obstacles. In addition, the implementation of our approach has been integrated in our open-source software framework for aerial robotics called Aerostack.
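The two-stage idea can be sketched with a coarse height histogram: per-point vertical distances are binned into horizontal-plane candidates, and the farthest well-supported plane is taken as the true ground, treating nearer planes as the tops of ground obstacles. Bin size, point threshold, and the "farthest plane is ground" rule are simplifying assumptions for illustration.

```python
from collections import Counter

def horizontal_planes(point_depths, bin_size=0.1, min_points=20):
    """Cluster per-point vertical distances (sensor frame, positive down)
    into horizontal-plane candidates by coarse histogram binning."""
    bins = Counter(round(d / bin_size) for d in point_depths)
    return sorted(b * bin_size for b, n in bins.items() if n >= min_points)

def flight_altitude(point_depths):
    """Pick the farthest detected plane as the ground; nearer planes are
    assumed to be obstacles standing on it (illustrative heuristic)."""
    planes = horizontal_planes(point_depths)
    return max(planes) if planes else None
```

For a cloud containing a dense ground plane at 2.0 m and a box top at 1.2 m, the estimator reports 2.0 m rather than snapping to the obstacle, which is the failure mode of a raw range measurement.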
Hriday Bavle; Jose Luis Sanchez-Lopez; Paloma De La Puente; Alejandro Rodriguez-Ramos; Carlos Sampedro; Pascual Campoy. Fast and Robust Flight Altitude Estimation of Multirotor UAVs in Dynamic Unstructured Environments Using 3D Point Cloud Sensors. Aerospace 2018, 5(3), 94.
Search and Rescue (SAR) missions represent an important challenge in the robotics research field, as they usually involve highly variable scenarios that require a high level of autonomy and versatile decision-making capabilities. This challenge becomes even more relevant in the case of aerial robotic platforms owing to their limited payload and computational capabilities. In this paper, we present a fully autonomous aerial robotic solution for executing complex SAR missions in unstructured indoor environments. The proposed system is based on the combination of a complete hardware configuration and a flexible system architecture which allows the execution of high-level missions in a fully unsupervised manner (i.e., without human intervention). In order to obtain flexible and versatile behaviors from the proposed aerial robot, several learning-based capabilities have been integrated for target recognition and interaction. The target recognition capability includes a supervised learning classifier based on a computationally efficient Convolutional Neural Network (CNN) model trained for target/background classification, while the capability to interact with the target for rescue operations introduces a novel Image-Based Visual Servoing (IBVS) algorithm which integrates a recent deep reinforcement learning method named Deep Deterministic Policy Gradients (DDPG). In order to train the aerial robot for performing IBVS tasks, a reinforcement learning framework has been developed, which integrates a deep reinforcement learning agent (e.g., DDPG) with a Gazebo-based simulator for aerial robotics. The proposed system has been validated in a wide range of simulation flights, using Gazebo and PX4 Software-In-The-Loop, and real flights in cluttered indoor environments, demonstrating the versatility of the proposed system in complex SAR missions.
Carlos Sampedro; Alejandro Rodriguez-Ramos; Hriday Bavle; Adrian Carrio; Paloma De La Puente; Pascual Campoy. A Fully-Autonomous Aerial Robot for Search and Rescue Applications in Indoor Environments using Learning-Based Techniques. Journal of Intelligent & Robotic Systems 2018, 95(2), 601-627.
A reliable estimation of the flight altitude in dynamic and unstructured indoor environments is an unsolved problem. Standalone sensors, such as distance sensors, barometers, and accelerometers, have multiple limitations in the presence of non-flat ground surfaces or in cluttered areas. To overcome these sensor limitations while maximizing their individual performance, this paper presents a modular EKF-based multi-sensor fusion approach for the accurate vertical localization of multirotor UAVs in dynamic and unstructured indoor environments. The state estimator can combine the information provided by a variable number and type of sensors, including IMU, barometer, and distance sensors, with capabilities for sensor auto-calibration and bias estimation, as well as a flexible configuration of the prediction and update stages. Several autonomous real indoor flights in unstructured environments have been conducted to validate the proposed state estimator, enabling the UAV to maintain the desired flight altitude when navigating over a wide range of obstacles. Furthermore, it has been successfully used in the IMAV 2016 competition. The presented work has been made publicly available to the scientific community as open-source software within the Aerostack framework.
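A minimal sketch of the fusion idea: a three-state EKF (altitude, vertical velocity, barometer bias) predicts with IMU vertical acceleration and updates with a biased barometer and a direct range sensor. The state layout, noise values, and measurement models are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

class AltitudeEKF:
    """1D EKF with state [altitude z, vertical velocity vz, barometer bias b].

    Prediction uses the IMU vertical acceleration; the barometer observes
    z + b (so its bias is estimated online), while a downward range sensor
    observes z directly. Noise values are illustrative.
    """
    def __init__(self):
        self.x = np.zeros(3)
        self.P = np.eye(3)

    def predict(self, accel_z, dt, q=0.1):
        F = np.array([[1.0, dt, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
        self.x = F @ self.x + np.array([0.5 * accel_z * dt**2, accel_z * dt, 0.0])
        self.P = F @ self.P @ F.T + q * np.eye(3)

    def _update(self, z, H, r):
        y = z - H @ self.x                   # innovation
        S = H @ self.P @ H.T + r             # innovation variance (scalar)
        K = self.P @ H.T / S                 # Kalman gain
        self.x = self.x + K * y
        self.P = (np.eye(3) - np.outer(K, H)) @ self.P

    def update_baro(self, z, r=0.5):
        self._update(z, np.array([1.0, 0.0, 1.0]), r)   # baro measures z + bias

    def update_range(self, z, r=0.05):
        self._update(z, np.array([1.0, 0.0, 0.0]), r)   # range measures z directly
```

Feeding constant measurements (range 2.0 m, barometer 2.3 m) drives the altitude toward 2.0 m and the barometer bias toward 0.3 m, which is the auto-calibration behavior the abstract refers to.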
Hriday Bavle; Jose Luis Sanchez-Lopez; Alejandro Rodriguez-Ramos; Carlos Sampedro; Pascual Campoy. A flight altitude estimator for multirotor UAVs in dynamic and unstructured indoor environments. 2017 International Conference on Unmanned Aircraft Systems (ICUAS), 2017, 1044-1051.
In this paper, a fully-autonomous quadrotor aerial robot for solving the different missions proposed in the 2016 International Micro Air Vehicle (IMAV) Indoor Competition is presented. The missions proposed in the IMAV 2016 competition involve the execution of high-level missions such as entering and exiting a building, exploring an unknown indoor environment, recognizing and interacting with objects, landing autonomously on a moving platform, etc. For solving the aforementioned missions, a fully-autonomous quadrotor aerial robot has been designed, based on a complete hardware configuration and a versatile software architecture, which allows the aerial robot to complete all the missions in a fully autonomous and consecutive manner. A thorough evaluation of the proposed system has been carried out in both simulated flights, using the Gazebo simulator in combination with PX4 Software-In-The-Loop, and real flights, demonstrating the appropriate capabilities of the proposed system for performing high-level missions and its flexibility for being adapted to a wide variety of applications.
Carlos Sampedro; Hriday Bavle; Alejandro Rodriguez-Ramos; Adrian Carrio; Ramón A. Suárez Fernández; Jose Luis Sanchez-Lopez; Pascual Campoy. A fully-autonomous aerial robotic solution for the 2016 International Micro Air Vehicle competition. 2017 International Conference on Unmanned Aircraft Systems (ICUAS), 2017, 989-998.
Fully autonomous landing on moving platforms poses an important problem for Unmanned Aerial Vehicles (UAVs). Current approaches are usually based on tracking and following the moving platform by means of several techniques, which frequently lack performance in real applications. The aim of this paper is to show that a simple landing strategy can provide practical results. The presented approach is based on three stages: estimation, prediction, and fast landing. As a preliminary phase, the problem is solved for a particular case of the IMAV 2016 competition. Subsequently, it is extended to a more generic and versatile approach. A thorough evaluation has been conducted with simulated and real flight experiments. Simulations have been performed using Gazebo 6 and PX4 Software-In-The-Loop (SITL), and real flight experiments have been conducted with a custom quadrotor and a moving platform in an indoor environment.
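The prediction stage can be sketched with a constant-velocity model: the platform's velocity is estimated from its two most recent detections and extrapolated over the landing horizon to pick an intercept point. The constant-velocity assumption and the function shape are illustrative; the paper's estimator may differ.

```python
def predict_platform(history, horizon):
    """Constant-velocity prediction of a moving platform's position.

    history: list of (t, x, y) platform detections, ordered in time.
    horizon: look-ahead time in seconds (e.g. the expected descent time).
    Returns the predicted (x, y) intercept point.
    """
    (t0, x0, y0), (t1, x1, y1) = history[-2], history[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt      # finite-difference velocity
    return x1 + vx * horizon, y1 + vy * horizon  # extrapolate to the horizon
```

The fast-landing stage then descends toward the predicted point rather than chasing the current platform position, which is what makes the simple strategy practical.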
Alejandro Rodriguez-Ramos; Carlos Sampedro; Hriday Bavle; Zorana Milosevic; Alejandro Garcia-Vaquero; Pascual Campoy. Towards fully autonomous landing on moving platforms for rotary Unmanned Aerial Vehicles. 2017 International Conference on Unmanned Aircraft Systems (ICUAS), 2017, 170-178.
To achieve fully autonomous operation of Unmanned Aerial Systems (UAS), it is necessary to integrate multiple heterogeneous technical solutions (e.g., control-based methods, computer vision methods, automated planning, coordination algorithms, etc.). Combining such methods in an operational system is a technical challenge that requires efficient architectural solutions. In a robotic engineering context, where productivity is important, it is also important to minimize the effort required to develop new systems. As a response to these needs, this paper presents Aerostack, an open-source software framework for the development of aerial robotic systems. This framework facilitates the creation of UAS by providing a set of reusable components specialized in functional tasks of aerial robotics (trajectory planning, self-localization, etc.), together with an integration method in a multi-layered cognitive architecture based on five layers: reactive, executive, deliberative, reflective, and social. Compared to other software frameworks for UAS, Aerostack can provide higher degrees of autonomy, and it is more versatile in that it can be applied to different types of hardware (aerial platforms and sensors) and different types of missions (e.g., multi-robot swarm systems). Aerostack has been validated over four years (since February 2013) through its successful use in many research projects, international competitions, and public exhibitions. As a representative example of system development, this paper also presents how Aerostack was used to develop a system for a (fictional) fully autonomous indoor search and rescue mission.
Jose Luis Sanchez-Lopez; Martin Molina; Hriday Bavle; Carlos Sampedro; Ramón A. Suárez Fernández; Pascual Campoy. A Multi-Layered Component-Based Approach for the Development of Aerial Robotic Systems: The Aerostack Framework. Journal of Intelligent & Robotic Systems 2017, 88(2-4), 683-709.
In this paper, a scalable and flexible architecture for real-time mission planning and dynamic agent-to-task assignment for a swarm of Unmanned Aerial Vehicles (UAVs) is presented. The proposed mission planning architecture consists of a Global Mission Planner (GMP), which is responsible for assigning and monitoring different high-level missions through an Agent Mission Planner (AMP), which in turn is in charge of providing and monitoring each task of the mission for each UAV in the swarm. The objective of the proposed architecture is to carry out high-level missions such as autonomous multi-agent exploration, automatic target detection and recognition, and search and rescue, with the ability to dynamically re-adapt the mission in real time. The proposed architecture has been evaluated in simulation and in real indoor flights, demonstrating its robustness in different scenarios and its flexibility for real-time mission re-planning and dynamic agent-to-task assignment.
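The agent-to-task assignment step can be sketched with a greedy distance-based policy: repeatedly match the closest remaining (agent, task) pair until agents or tasks run out, and re-run the routine whenever tasks change. This is an illustrative policy; the paper's actual assignment and re-planning logic may differ.

```python
import math

def assign_tasks(agents, tasks):
    """Greedy distance-based agent-to-task assignment.

    agents: dict of agent name -> (x, y) position.
    tasks:  dict of task name -> (x, y) location.
    Returns a dict mapping each assigned agent to its task. Calling this
    again with updated task locations models dynamic re-assignment.
    """
    free_agents = dict(agents)
    free_tasks = dict(tasks)
    assignment = {}
    while free_agents and free_tasks:
        # Pick the globally closest remaining (agent, task) pair.
        a, t = min(((a, t) for a in free_agents for t in free_tasks),
                   key=lambda p: math.dist(free_agents[p[0]], free_tasks[p[1]]))
        assignment[a] = t
        del free_agents[a], free_tasks[t]
    return assignment
```

Greedy matching is not globally optimal (the Hungarian algorithm would be), but it is cheap enough to re-run at every re-planning cycle, which fits the real-time requirement described above.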
Carlos Sampedro; Hriday Bavle; Jose Luis Sanchez-Lopez; Ramón A. Suárez Fernández; Alejandro Rodriguez-Ramos; Martin Molina; Pascual Campoy. A flexible and dynamic mission planning architecture for UAV swarm coordination. 2016 International Conference on Unmanned Aircraft Systems (ICUAS), 2016, 355-363.
To simplify the usage of Unmanned Aerial Systems (UAS), extending their use to a great number of applications, fully autonomous operation is needed. There are many open-source architecture frameworks for UAS that claim autonomous operation, but they still have two main open issues: (1) level of autonomy, which is limited in most cases, and (2) versatility, since most of them are designed specifically for certain applications or aerial platforms. As a response to these needs and issues, this paper presents Aerostack, a system architecture and open-source multi-purpose software framework for autonomous multi-UAS operation. To provide higher degrees of autonomy, Aerostack's system architecture integrates state-of-the-art concepts of intelligent, cognitive, and social robotics, based on five layers: reactive, executive, deliberative, reflective, and social. To be a highly versatile practical solution, Aerostack's open-source software framework includes the main components to execute the architecture for fully autonomous missions of swarms of UAS; a collection of ready-to-use, flight-proven modular components that can be reused by users and developers; and compatibility with five well-known aerial platforms, as well as a large number of sensors. Aerostack has been validated over three years through its successful use in many research projects, international competitions, and exhibitions. To corroborate this, this paper also presents Aerostack carrying out a fictional fully autonomous indoor search and rescue mission.
Jose Luis Sanchez-Lopez; Ramón A. Suárez Fernández; Hriday Bavle; Carlos Sampedro; Martin Molina; Jesus Pestana; Pascual Campoy. AEROSTACK: An architecture and open-source software framework for aerial robotics. 2016 International Conference on Unmanned Aircraft Systems (ICUAS), 2016, 332-341.