🦼🤖 Sneak peek: Beyond our own ATV autonomy results, we have also been pushing towards:
1. More challenging scenarios (Nighttime)
2. More robot platforms (quadruped & urban wheelchair) (5/5)
What powers all this: multi-modal data, which we provide to everyone through TartanDrive 2.0, a large-scale off-road driving dataset for self-supervised learning tasks. Check out the data here: theairlab.org/TartanDrive2/ (4/5)
Since there is no prior map or GPS, our system relies on SLAM to track position and build a local understanding of the environment in real time, enabling consistent perception across diverse terrains.
See the voxel grid from our SLAM point cloud colored by VFM (vision foundation model) features (3/5)
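One way to picture the colored voxel grid described above: bucket the SLAM point cloud into voxels and average the per-point visual features that fall in each cell. A minimal NumPy sketch; the function name, feature layout, and 0.2 m voxel size are illustrative assumptions, not the project's actual implementation:

```python
import numpy as np

def voxelize_with_features(points, feats, voxel=0.2):
    """Bucket 3-D points into voxels and average their feature vectors.

    points: (N, 3) XYZ positions from a SLAM point cloud.
    feats:  (N, D) per-point descriptors (e.g. image features
            projected onto the cloud).
    Returns a dict mapping integer voxel index -> mean feature.
    """
    idx = np.floor(points / voxel).astype(np.int64)
    grid = {}
    for key, f in zip(map(tuple, idx), feats):
        acc = grid.setdefault(key, [np.zeros_like(f), 0])
        acc[0] += f   # running feature sum for this voxel
        acc[1] += 1   # point count for this voxel
    return {k: s / n for k, (s, n) in grid.items()}
```

The mean feature per voxel can then be mapped to a color for visualization, which is one simple way to produce a feature-colored grid like the one shown.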
🧠 3 key principles: Self-supervision, Multi-modality, and Uncertainty Awareness.
Our system fuses multi-modal (LiDAR + camera) data and learns on its own where to drive (traversability), balancing risk (uncertainty) against performance via inverse reinforcement learning. (2/5)
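The risk-vs-performance trade-off mentioned above is often realized as an uncertainty-penalized cost map handed to the planner. A hedged sketch; the names and the simple additive form are assumptions, not the authors' exact formulation:

```python
def risk_aware_cost(mean_cost, std_cost, risk_weight=1.0):
    """Combine a learned traversability cost with its uncertainty.

    mean_cost: predicted traversal cost for a map cell (e.g. from a
               model trained with inverse RL on the robot's own driving).
    std_cost:  predictive standard deviation for that cell.
    risk_weight trades performance (prefer low mean cost) against
    risk (avoid uncertain terrain); larger values give more cautious
    plans. Illustrative only.
    """
    return mean_cost + risk_weight * std_cost
```

A planner minimizing this combined cost will detour around terrain the model is unsure about, which is one common way to make a learned cost map uncertainty-aware.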
🛻🌳 Autonomous Off-Road Driving — No Labels, No Prior Map, & No GPS! Our self-supervised stack enables an ATV to navigate forests, snow, & nighttime. (1/5)
More details: theairlab.org/offroad/
Full video: youtu.be/7t4EQj8BIdY
#cmurobotics #robotics #autonomy
@cmurobotics.bsky.social
Congratulations to AirLab's Tartan Air Rescue for being named both a Stage 1 Winner and a US University Innovation Award Winner of the GoAero Prize! @airlabcmu.bsky.social #TartanProud
Learn more about Tartan Air Rescue and the competition: loom.ly/TtGh914
GoAero video: loom.ly/1vHBWYk
Excited to announce MapEx is accepted to ICRA 2025!
🤖🕵️‍♀️ How can a robot explore to build accurate maps without seeing everything?
mapex-explorer.github.io
We explore by jointly reasoning on prediction uncertainty and potential sensor coverage using multiple map predictions of unseen areas. 1/n
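The idea of jointly reasoning on prediction uncertainty and sensor coverage can be sketched as scoring candidate viewpoints by how much predicted-map disagreement their sensor footprint would cover. This is an illustrative sketch, not MapEx's actual scoring function:

```python
import numpy as np

def viewpoint_utility(pred_maps, visible_mask):
    """Score a candidate viewpoint using multiple map predictions.

    pred_maps:    (K, H, W) stack of K sampled occupancy predictions
                  for currently unseen space.
    visible_mask: (H, W) boolean mask of cells the sensor would
                  cover from this viewpoint.
    Uses per-cell disagreement across the K predictions as an
    uncertainty proxy and sums it over covered cells, so viewpoints
    that would observe many uncertain cells score highest.
    """
    uncertainty = pred_maps.std(axis=0)  # (H, W) disagreement map
    return float((uncertainty * visible_mask).sum())
```

Picking the highest-scoring viewpoint at each step yields a simple explore-where-the-predictor-is-unsure loop in the spirit described above.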