The satellite-based Global Positioning System (GPS) is regarded as the leading global navigation system for the guidance, navigation, and control of UAVs whenever its signal can be received reliably. Although GPS measurements perform well over long time horizons, they are often error-prone over short periods. Moreover, GPS-based navigation frequently fails in urban environments because nearby structures degrade, obstruct, or reflect the signal. This limitation is one of the principal obstacles to deploying UAVs in GPS-denied areas, and it is the main reason GPS-based navigation systems are integrated with inertial navigation systems. Recently, vision- and laser-scanner-based navigation approaches have been used to scan objects in GPS-denied areas. However, vision-based navigation is highly sensitive to lighting conditions and reflections, while light detection and ranging (LiDAR) sensors tend to drift and accumulate errors in local position updates. Integrating GPS-based navigation with LiDAR sensors and simultaneous localization and mapping (SLAM) techniques has therefore been proposed to address these issues. Nevertheless, this approach demands a large UAV carrying a large battery to operate multiple sensors, which is impractical for aerial maneuvers.
The intellectual merit of this project lies in equipping unmanned aerial vehicles (UAVs) with LiDAR, GPS, an altimeter, and inertial measurement units (IMUs), together with a new consensus data fusion technique. The interacting UAVs exchange their sensor measurements over a graph-theoretic communication topology to reach agreement on a common SLAM solution, yielding an accurate, filtered map. The ultimate goal is to perform formation flying with multiple UAVs in low-altitude urban environments, or in environments where GPS signals are degraded, using this consensus data fusion technique.
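To make the consensus step concrete, the following is a minimal, illustrative sketch (not our final fusion algorithm) of a discrete-time average-consensus update over an undirected communication graph, where each UAV repeatedly mixes its local state estimate with its neighbors' estimates. The edge list, state values, and step size below are assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical undirected communication graph: UAV i <-> UAV j links.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n_uavs = 4

# Build the graph Laplacian L = D - A from the edge list.
A = np.zeros((n_uavs, n_uavs))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Each row is one UAV's local position estimate (x, y, z) -- illustrative values.
x = np.array([[10.0, 5.0, 2.0],
              [10.4, 4.8, 2.1],
              [ 9.7, 5.3, 1.9],
              [10.1, 5.1, 2.2]])

# Step size must satisfy 0 < eps < 1 / max_degree for convergence;
# here the maximum node degree is 2, so eps = 0.25 is safe.
eps = 0.25

# Discrete-time consensus: x_i <- x_i + eps * sum_j a_ij * (x_j - x_i),
# written compactly as x <- (I - eps * L) x.
for _ in range(100):
    x = x - eps * (L @ x)

print(x)  # every row converges to the average of the initial estimates
```

Under these assumptions, all four estimates converge to their common average, which is the agreement property the fusion scheme relies on.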
In this phase, we aim to develop our own flight controller in the Robot Operating System (ROS), a popular robotics software framework that can be deployed on complex robots. ROS provides message-passing middleware for communication among processes, called nodes. For instance, a system may include one node that reads from and writes to an Arduino alongside another node that reads cameras or LiDAR sensors; all of these nodes can communicate with each other and exchange data, as sketched below. For this purpose, we have done
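The sketch below illustrates the publish/subscribe pattern described above, not our actual flight controller: a minimal rospy node that subscribes to a LiDAR scan topic and republishes the closest valid range. The topic names (`/scan`, `/uav/min_range`) and the min-range filtering step are placeholder assumptions for illustration.

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import LaserScan
from std_msgs.msg import Float32

class MinRangeNode(object):
    """Subscribes to LiDAR scans and publishes the closest valid range."""

    def __init__(self):
        # Topic names here are illustrative placeholders.
        self.pub = rospy.Publisher('/uav/min_range', Float32, queue_size=10)
        rospy.Subscriber('/scan', LaserScan, self.scan_callback)

    def scan_callback(self, msg):
        # Keep only readings within the sensor's valid measurement interval.
        valid = [r for r in msg.ranges
                 if msg.range_min <= r <= msg.range_max]
        if valid:
            self.pub.publish(Float32(min(valid)))

if __name__ == '__main__':
    rospy.init_node('min_range_node')
    MinRangeNode()
    rospy.spin()  # hand control to ROS until shutdown
```

Because every node communicates only through named topics, a second node (e.g., one driving an Arduino) can consume `/uav/min_range` without any direct coupling to the LiDAR code, which is the decoupling property that motivates our choice of ROS.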