GPS-Denied UAV with Visual SLAM

August 2024 - June 2025

GitHub Repository

Story

Autonomous navigation in GPS-denied environments is one of the most critical challenges facing unmanned aerial vehicle (UAV) systems today. From urban canyons to subterranean tunnels and dense forests, many real-world environments render GPS unreliable or entirely unavailable. For drones to operate safely and effectively in these conditions, whether for search and rescue missions, military reconnaissance, or industrial inspection, they must rely on onboard perception and mapping capabilities.

This project presents a UAV platform powered by Visual Simultaneous Localization and Mapping (VSLAM) using Isaac ROS VSLAM, integrated with MAVROS and the PX4 flight stack. By fusing visual-inertial data and leveraging the Jetson Orin Nano's edge computing capabilities, the system performs real-time localization and map building, enabling the drone to navigate autonomously without GPS.
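One detail worth noting in this pipeline: ROS conventions express world-frame poses in ENU (east-north-up), while PX4 works in NED (north-east-down). MAVROS performs this conversion internally when vision poses are forwarded to the flight controller; the sketch below (an illustration, not MAVROS code) shows the position mapping involved.

```python
def enu_to_ned(x_e: float, y_e: float, z_e: float) -> tuple:
    """Map a position vector from ROS ENU (east-north-up) to PX4 NED
    (north-east-down): swap the horizontal axes and negate the vertical."""
    return (y_e, x_e, -z_e)

# East 1 m, north 2 m, up 3 m becomes north 2 m, east 1 m, down -3 m.
print(enu_to_ned(1.0, 2.0, 3.0))  # → (2.0, 1.0, -3.0)
```

Because MAVROS already handles this (along with the corresponding orientation transform), the VSLAM output can be published to the standard MAVROS vision-pose interface without manual re-projection.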

Through this implementation, the UAV can maintain situational awareness, dynamically map unknown environments, and execute autonomous flight with high precision. This work not only demonstrates the feasibility of deploying real-time VSLAM on resource-constrained platforms, but also underscores its broader implications in expanding the reach and reliability of autonomous systems across critical, GPS-denied scenarios.

Figure: VSLAM UAV (Andrew Bernas)

Demo Video

Note: The motion capture cameras visible in the video are used solely for ground-truth pose validation. During flight, the UAV relies entirely on visual-inertial odometry (VIO) and the flight controller's IMU, fused through an Extended Kalman Filter (EKF), for local pose estimation.
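The actual fusion runs inside PX4's EKF estimator, but the predict/update cycle it performs can be illustrated with a minimal, hypothetical 1-D Kalman filter: IMU acceleration drives the prediction step, and each VIO position fix corrects the accumulated drift.

```python
class Kf1D:
    """Minimal 1-D Kalman filter along one axis (illustrative sketch,
    not PX4 code): IMU acceleration predicts, VIO position corrects."""

    def __init__(self, q=0.1, r=0.05):
        self.x = [0.0, 0.0]                 # state: [position, velocity]
        self.p = [[1.0, 0.0], [0.0, 1.0]]   # state covariance
        self.q = q                          # process noise (IMU drift)
        self.r = r                          # measurement noise (VIO jitter)

    def predict(self, accel, dt):
        # Integrate the IMU acceleration forward by dt.
        self.x[0] += self.x[1] * dt + 0.5 * accel * dt * dt
        self.x[1] += accel * dt
        # Propagate covariance P = F P F^T + Q with F = [[1, dt], [0, 1]].
        p00, p01 = self.p[0]
        p10, p11 = self.p[1]
        self.p = [
            [p00 + dt * (p01 + p10) + dt * dt * p11 + self.q, p01 + dt * p11],
            [p10 + dt * p11, p11 + self.q],
        ]

    def update(self, z):
        # Fuse a VIO position measurement z (observation H = [1, 0]).
        s = self.p[0][0] + self.r    # innovation covariance
        k0 = self.p[0][0] / s        # Kalman gains
        k1 = self.p[1][0] / s
        y = z - self.x[0]            # innovation (VIO minus prediction)
        self.x[0] += k0 * y
        self.x[1] += k1 * y
        p00, p01 = self.p[0]
        p10, p11 = self.p[1]
        self.p = [
            [(1 - k0) * p00, (1 - k0) * p01],
            [p10 - k1 * p00, p11 - k1 * p01],
        ]
```

A VIO update pulls the estimate most of the way toward the measurement when the filter is uncertain, and less as confidence grows, which is exactly why the mocap system is only needed as an external reference, not as a flight input.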

Components List

| Qty | Component |
| --- | --- |
| x1 | F450 Quadcopter |
| x1 | Jetson Orin Nano |
| x1 | D435i Camera |
| x1 | PX4 Flight Controller |
| x1 | 9V, 5A Voltage Regulator |
| x1 | USB to UART Converter |