LiMPNet: Lightweight Multi-sensor Perception and DRL Navigation for Tiny Drones in Mapless Environments

Omer Kurkutlu¹, Arman Roohi¹
¹ University of Illinois Chicago
Abstract

Autonomous tiny drones face significant navigation challenges due to strict constraints on size, weight, power, and onboard computational capacity. This paper presents a lightweight navigation framework that integrates basic multi-sensor perception with deep reinforcement learning (DRL) to enable safe, mapless flight in cluttered environments. We employ the Crazyflie 2.1 nano-drone, equipped with a grayscale camera and a multi-ranger deck of laser-based distance sensors, for real-time obstacle detection and avoidance. A Proximal Policy Optimization (PPO) agent is trained in a ROS/Gazebo simulation environment to generate collision-free trajectories from fused visual and range data. The system is evaluated in two environments: a simple obstacle field, where the drone achieves a 100% success rate (112/112 episodes), and a densely cluttered map, where it reaches the target in 35% of trials (7/20). These results demonstrate that effective autonomous navigation is achievable with minimal sensing and low-computation models, making the approach well-suited for resource-constrained aerial platforms.
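
To make the fused-observation setup concrete, the sketch below shows one way a PPO agent could consume combined camera and range inputs. This is not the authors' implementation: it assumes gymnasium and stable-baselines3, and the TinyDroneEnv class, observation shapes, reward, and two-dimensional action space are illustrative placeholders standing in for the ROS/Gazebo Crazyflie simulation.

```python
# Minimal sketch (assumptions noted above): a Dict observation space fusing a
# grayscale image with range readings, trained with stable-baselines3 PPO.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import PPO


class TinyDroneEnv(gym.Env):
    """Toy stand-in for the ROS/Gazebo Crazyflie environment (placeholder dynamics)."""

    def __init__(self):
        super().__init__()
        # Fused observation: low-resolution grayscale image plus four horizontal
        # range readings in metres (shapes/counts are illustrative assumptions).
        self.observation_space = spaces.Dict({
            "image": spaces.Box(0, 255, shape=(1, 64, 64), dtype=np.uint8),
            "ranges": spaces.Box(0.0, 4.0, shape=(4,), dtype=np.float32),
        })
        # Normalised velocity command, e.g. forward speed and yaw rate (assumption).
        self.action_space = spaces.Box(-1.0, 1.0, shape=(2,), dtype=np.float32)

    def _obs(self):
        # Placeholder sensor values; a real environment would read the simulator.
        return {
            "image": np.zeros((1, 64, 64), dtype=np.uint8),
            "ranges": np.full(4, 4.0, dtype=np.float32),
        }

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        return self._obs(), {}

    def step(self, action):
        # Placeholder reward; the real setup would reward progress toward the
        # goal and penalise proximity to obstacles or collisions.
        reward, terminated, truncated = 0.0, False, False
        return self._obs(), reward, terminated, truncated, {}


if __name__ == "__main__":
    env = TinyDroneEnv()
    # MultiInputPolicy handles Dict observations: a CNN for the image branch
    # and an MLP for the range vector, fused before the policy/value heads.
    model = PPO("MultiInputPolicy", env, verbose=0)
    model.learn(total_timesteps=1_000)
```

In practice the toy environment above would be replaced by a wrapper around the Gazebo simulation, with observations published over ROS topics and actions sent as velocity setpoints to the Crazyflie.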