Protostar Labs


Vision-based navigation in GPS-denied environments

This localization and navigation pipeline allows UAVs to rely on their onboard sensors to stay on track when the GPS signal is unreliable or unavailable. It is based on the Robot Operating System (ROS), which makes it deployable on many platforms and allows it to connect to standard flight stacks.

Overview

UAVs have become an important asset in civil and military applications, and with Industry 4.0 their use has expanded to industrial environments. The wide range of environments UAVs operate in calls for a robust navigation stack. Many environments limit or completely block the GPS signal, which disrupts the usual navigation algorithms. To ensure that the aircraft never loses its track and can always successfully perform the RTH ("Return To Home") function, we researched and tested state-of-the-art solutions for visual-inertial navigation systems.

Goals

The goal is to find and improve the best GPS-independent navigation solution for UAVs. The system should take input from multiple sensors, including a camera, an IMU, a magnetometer, and a barometer. The output should be the position and orientation of the aircraft relative to the starting point. The estimation error should stay within a few meters regardless of the complexity and length of the given trajectory.
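Expressing the estimate relative to the starting point is a frame change: subtract the start position and rotate the displacement into the start frame. As a minimal sketch, the planar (x, y, yaw) case looks like this; the real system estimates full 6-DoF pose, and the function name is illustrative only:

```python
import math

def relative_pose(start, current):
    """Express `current` (x, y, yaw) in the frame of `start` (x, y, yaw).

    Hypothetical 2D helper: the deployed pipeline works with 6-DoF poses,
    but the planar case shows the frame change.
    """
    sx, sy, syaw = start
    cx, cy, cyaw = current
    dx, dy = cx - sx, cy - sy
    # Rotate the world-frame displacement into the start frame.
    cos_y, sin_y = math.cos(-syaw), math.sin(-syaw)
    rel_x = cos_y * dx - sin_y * dy
    rel_y = sin_y * dx + cos_y * dy
    # Wrap the relative heading into [-pi, pi).
    rel_yaw = (cyaw - syaw + math.pi) % (2 * math.pi) - math.pi
    return rel_x, rel_y, rel_yaw
```

For example, a drone that starts facing the world +y axis and moves one meter along it has moved one meter straight ahead in its own start frame: `relative_pose((1, 2, math.pi / 2), (1, 3, math.pi / 2))` returns approximately `(1.0, 0.0, 0.0)`.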

Solution

Implement the ORB-SLAM3 and SVO-PRO algorithms and set up the ROS environment. Use the PX4 flight stack and communicate with the SLAM algorithms through ROS. Use the Gazebo simulator to test different parameters and trajectories. Calibrate the camera and the other sensors to ensure precise estimates.
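The core idea behind visual-inertial navigation is that IMU dead reckoning drifts over time, while intermittent visual fixes pull the estimate back. The sketch below is not the ORB-SLAM3 or SVO-PRO back end; it is a deliberately simplified 1D complementary-filter illustration of that fusion, with hypothetical names throughout:

```python
def fuse(imu_positions, visual_fixes, alpha=0.98):
    """Blend dead-reckoned IMU positions with intermittent visual fixes.

    Hypothetical 1D sketch: `imu_positions` are integrated IMU estimates
    per time step; `visual_fixes` holds a position for steps where the
    visual front end produced one, else None. `alpha` weights the IMU
    prediction over the visual measurement.
    """
    fused = []
    offset = 0.0  # accumulated correction of the IMU drift
    for imu_p, vis_p in zip(imu_positions, visual_fixes):
        estimate = imu_p + offset
        if vis_p is not None:
            # Pull the estimate toward the visual fix and remember
            # the shift so it also corrects future IMU predictions.
            corrected = alpha * estimate + (1 - alpha) * vis_p
            offset += corrected - estimate
            estimate = corrected
        fused.append(estimate)
    return fused
```

Because the correction `offset` persists between visual fixes, the drift accumulated by the IMU between camera updates does not grow unbounded, which is the property the full visual-inertial systems provide in 6-DoF.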

Results

  • Mounted a sensor suite on a drone and recorded the camera and sensor readings. The drone successfully navigated for 200 meters with a mean absolute position error of 0.84 meters.
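The error metric quoted above can be computed as the mean Euclidean distance between matched trajectory points. A minimal sketch, assuming the estimated and ground-truth trajectories are already time-aligned point by point:

```python
import math

def mean_absolute_position_error(estimated, ground_truth):
    """Mean Euclidean distance between matched (x, y, z) trajectory points.

    Assumes both trajectories have the same length and point i of one
    corresponds to point i of the other.
    """
    assert len(estimated) == len(ground_truth)
    total = sum(math.dist(e, g) for e, g in zip(estimated, ground_truth))
    return total / len(estimated)
```

For example, with one point off by a meter and one exact, `mean_absolute_position_error([(0, 0, 0), (1, 0, 0)], [(0, 0, 1), (1, 0, 0)])` returns `0.5`.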
