A Boston Dynamics robot dog stands on a walkway through grass at the base of large cherry trees full of blooms
Close-up of the chest and head of a Boston Dynamics robot dog framed by trees full of cherry blossoms
Three smiling researchers dressed in casual clothing, one pulling a wagon, walk behind a Boston Dynamics robot dog with campus buildings and trees in the background
A researcher holding a handheld controller follows a Boston Dynamics robot dog down a building ramp while another researcher standing a few meters away holding a laptop looks on
If you visited @uwcherryblossom.bsky.social, did you “spot” an unusual visitor among the blooms? Researchers in the @uofwa.bsky.social #UWAllen #robotics group recently took advantage of some nice weather to take our Boston Dynamics robot dog for a stroll around campus. #AI 1/5
02.05.2025 23:28
👍 11
🔁 4
💬 2
📌 1
Thank you!
Yeah, we also went through a lot of the papers that attempted long-range perception for the LAGR project.
Really cool to take inspiration from works that are almost 20 years old but still very relevant :)
18.04.2025 18:33
👍 1
🔁 0
💬 0
📌 0
This project was a fun effort with Matt Schmittle, Nathan Hatch, Rosario Scalise, @mateoguaman.bsky.social, Sidharth Talia, @khimya.bsky.social, @siddhss5.bsky.social and Byron Boots.
🧵6/6
18.04.2025 17:56
👍 1
🔁 0
💬 0
📌 0
This work is a collaboration between the Personal Robotics Lab (@siddhss5.bsky.social) and Robot Learning Lab at the University of Washington @uwrobotics.bsky.social @uwcse.bsky.social
🧵5/6
18.04.2025 17:56
👍 1
🔁 0
💬 1
📌 0
🤖 Real-world tested: LRN cuts down interventions on Spot and a large tracked vehicle.
✅ Plug & play: Works with nearly any local stack that accepts goal waypoints.
🔄 Auto-labeled: Trained from raw FPV videos using CoTracker to trace camera paths.
🧵4/6
18.04.2025 17:56
👍 1
🔁 0
💬 1
📌 0
🔥 Key insight: Robots can reason farther by learning to identify distant affordable frontiers as intermediate goals.
🧠 How it works: LRN uses a pre-trained SAM2 backbone + a small head to find frontiers in images. Given a goal, it selects the highest-scoring frontier to navigate to.
🧵3/6
18.04.2025 17:56
👍 1
🔁 0
💬 1
📌 0
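The frontier-selection step from the post above can be sketched as a minimal example. All names and the scoring rule here are illustrative assumptions, not the authors' code: we assume a perception model (e.g. the SAM2 backbone + head) has already proposed candidate frontiers, each with a bearing relative to the robot's heading and a learned score in [0, 1].

```python
import math

def select_frontier(frontiers, goal_bearing):
    """Pick the frontier that best trades off learned score vs. goal alignment.

    frontiers: list of (bearing_radians, score) pairs proposed by the
               perception model (hypothetical interface).
    goal_bearing: direction to the goal waypoint, radians.
    """
    def utility(frontier):
        bearing, score = frontier
        # Down-weight frontiers that point away from the goal direction;
        # frontiers behind the goal bearing by >90° get zero utility.
        alignment = max(math.cos(bearing - goal_bearing), 0.0)
        return score * alignment

    return max(frontiers, key=utility)

# Three candidate frontiers: (bearing, score).
candidates = [(-0.8, 0.9), (0.1, 0.6), (1.2, 0.95)]
best = select_frontier(candidates, goal_bearing=0.0)
```

The chosen frontier would then be handed to the local planner as an intermediate waypoint, which is consistent with the "plug & play" claim that LRN works with any local stack accepting goal waypoints.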
❗️Problem: Robots navigating without prior maps, relying only on local sensors, have a limited mapping range (due to sparse/noisy depth), causing myopic decisions.
🧵2/6
18.04.2025 17:56
👍 1
🔁 0
💬 1
📌 0
Long Range Navigator (LRN) 🧭 — an approach to extend planning horizons for off-road navigation given no prior maps. Using vision, LRN makes longer-range decisions by spotting navigation frontiers far beyond the range of metric maps.
personalrobotics.github.io/lrn/
🧵1/6
18.04.2025 17:56
👍 3
🔁 4
💬 1
📌 3
Excited to attend the talk!
11.01.2025 23:01
👍 2
🔁 0
💬 0
📌 0
Sweet!! 🙌🤩
26.12.2024 13:47
👍 3
🔁 0
💬 0
📌 0
Happy holidays from UW Robotics!
24.12.2024 18:15
👍 11
🔁 2
💬 0
📌 0