Peiqi Liu

@peiqiliu

NYU '24 | Research Assistant @ NYU | Robotics Engineer @ Hello Robot Inc | https://peiqi-liu.github.io

280
Followers
46
Following
1
Posts
22.11.2024
Joined

Latest posts by Peiqi Liu @peiqiliu

We just released RUKA, a $1300 humanoid hand that is 3D-printable, strong, precise, and fully open-sourced!

The key technical breakthrough here is that we can control joints and fingertips of the robot **without joint encoders**. All we need here is self-supervised data collection and learning.
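As a minimal sketch of the encoder-free idea (not RUKA's actual pipeline, which trains neural networks): collect pairs of motor commands and externally observed fingertip positions, then fit a forward model from those pairs alone. The linear "hand" below is invented for illustration.

```python
import numpy as np

# Self-supervised data collection: command the motors, observe the resulting
# fingertip positions with an external tracker. A made-up linear kinematics
# (position = 3 * command + 0.5) generates the "observations" here.
rng = np.random.default_rng(0)
commands = rng.uniform(0.0, 1.0, size=(200, 1))
fingertips = 3.0 * commands + 0.5

# Fit a forward model, fingertip ~ w * command + b, from the pairs alone;
# no joint encoders are involved.
X = np.hstack([commands, np.ones_like(commands)])
coef, *_ = np.linalg.lstsq(X, fingertips, rcond=None)
w, b = coef[0, 0], coef[1, 0]

# The fitted model predicts the fingertip position for a new command.
predicted = w * 0.25 + b
```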

18.04.2025 18:53 πŸ‘ 29 πŸ” 7 πŸ’¬ 1 πŸ“Œ 0
When life gives you lemons, you pick them up.

(trained with robotutilitymodels.com)

28.03.2025 04:02 πŸ‘ 15 πŸ” 4 πŸ’¬ 1 πŸ“Œ 0
The robot behaviors shown below are trained without any teleop, sim2real, genai, or motion planning. Simply show the robot a few examples of doing the task yourself, and our new method, called Point Policy, spits out a robot-compatible policy!

28.02.2025 19:09 πŸ‘ 21 πŸ” 5 πŸ’¬ 1 πŸ“Œ 1
We just released AnySense, an iPhone app for effortless data acquisition and streaming for robotics. We leverage Apple’s development frameworks to record and stream:

1. RGBD + Pose data
2. Audio from the mic or custom contact microphones
3. Seamless Bluetooth integration for external sensors

26.02.2025 15:14 πŸ‘ 34 πŸ” 10 πŸ’¬ 2 πŸ“Œ 0
Stretch Community News - February 2025 | Hello Robot: Gazebo Harmonic, Stretch AI, Dynamic memory, and more!

What's new in the Stretch community this month?

❄️ Gazebo Harmonic
❄️ Dynamic semantic maps for open-vocabulary tasks
❄️ Natural-language narration of robot experiences
❄️ Implicit human-robot communication

And more! Follow the link below for more details:

hello-robot.com/community-up...

06.02.2025 00:15 πŸ‘ 4 πŸ” 1 πŸ’¬ 0 πŸ“Œ 0
DynaMem: Online Dynamic Spatio-Semantic Memory for Open World Mobile Manipulation
DynaMem is an open-vocabulary mobile manipulation (OVMM) system that adapts a spatio-semantic memory to dynamically changing environments.

DynaMem is now fully refactored & integrated into the Stretch AI repo!
Try it out: github.com/hello-robot/...
Project page: dynamem.github.io

31.12.2024 22:12 πŸ‘ 3 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
We all want a home robot that can actually help us out. Why can't I ask my robot "where did I leave my water bottle" and get a good answer?

In Graph-EQA, we build a 3D memory as the robot explores, using that memory to make decisions.

saumyasaxena.github.io/grapheqa/

30.12.2024 16:20 πŸ‘ 45 πŸ” 8 πŸ’¬ 3 πŸ“Œ 1
A look at the future: chatting with my robot via Discord to ask it to find something in my house.

This uses an LLM to understand what the human wants and generate a task plan, then builds an open-vocab 3D scene representation to find and pick up objects.
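A hedged toy version of that two-stage pipeline: an LLM-style planner turns a chat request into steps, and an open-vocabulary 3D memory resolves each object query to a location. The names here (TaskStep, plan_from_request, VoxelMap) are invented for illustration, and substring matching stands in for the real embedding-based open-vocab lookup.

```python
from dataclasses import dataclass

@dataclass
class TaskStep:
    action: str   # e.g. "navigate", "pick"
    target: str   # open-vocabulary object description

def plan_from_request(request: str) -> list[TaskStep]:
    """Stand-in for the LLM planner: map a chat message to task steps."""
    obj = request.removeprefix("find my ").strip()
    return [TaskStep("navigate", obj), TaskStep("pick", obj)]

class VoxelMap:
    """Toy open-vocab 3D memory: object description -> 3D location."""
    def __init__(self):
        self._entries: dict[str, tuple[float, float, float]] = {}

    def add(self, description: str, xyz: tuple[float, float, float]):
        self._entries[description] = xyz

    def query(self, description: str):
        # A real system would match vision-language embeddings; plain
        # substring matching is used here purely for illustration.
        for desc, xyz in self._entries.items():
            if description in desc:
                return xyz
        return None

memory = VoxelMap()
memory.add("blue water bottle", (1.2, 0.4, 0.8))
steps = plan_from_request("find my water bottle")
locations = [memory.query(step.target) for step in steps]
```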

31.12.2024 16:38 πŸ‘ 19 πŸ” 4 πŸ’¬ 2 πŸ“Œ 1
I'd like to introduce what I've been working on at @hellorobot.bsky.social: Stretch AI, a set of open-source tools for language-guided autonomy, exploration, navigation, and learning from demonstration.

Check it out: github.com/hello-robot/...

Thread ->

03.12.2024 16:51 πŸ‘ 131 πŸ” 23 πŸ’¬ 5 πŸ“Œ 4
New paper! We show that by using a keypoint-based image representation, robot policies become robust to different object types and background changes.

We call this method Prescriptive Point Priors for robot Policies, or P3-PO for short. The full project is here: point-priors.github.io
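A minimal sketch of the keypoint-conditioned idea (not P3-PO's actual code): instead of feeding raw pixels to the policy, track a fixed set of semantic points and feed their coordinates. The tracker below is a stub; the real pipeline re-localizes the prior points with learned point tracking.

```python
import numpy as np

def track_keypoints(frame: np.ndarray, prior_points: np.ndarray) -> np.ndarray:
    """Stub tracker: returns the prior points unchanged. A real tracker
    would re-localize each point in the new frame."""
    return prior_points

def policy(keypoints: np.ndarray) -> np.ndarray:
    """Toy policy: move the end-effector toward the mean keypoint.
    Because the input is points, not pixels, the same policy is
    unaffected by background or object-appearance changes."""
    return keypoints.mean(axis=0)

frame = np.zeros((480, 640, 3))                      # dummy camera frame
prior = np.array([[100.0, 200.0], [140.0, 220.0]])   # annotated point prior
keypoints = track_keypoints(frame, prior)
action = policy(keypoints)                            # 2D target in image coords
```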

10.12.2024 20:32 πŸ‘ 37 πŸ” 7 πŸ’¬ 1 πŸ“Œ 2
Modern policy architectures are unnecessarily complex. In our #NeurIPS2024 project called BAKU, we focus on what really matters for good policy learning.

BAKU is modular, language-conditioned, compatible with multiple sensor streams and action multi-modality, and, importantly, fully open-source!

09.12.2024 23:33 πŸ‘ 30 πŸ” 9 πŸ’¬ 1 πŸ“Œ 2
Since we are nearing the end of the year, I'll revisit some of our work I'm most excited about from the last year and maybe a sneak peek of what we are up to next.

To start off: Robot Utility Models, which enable zero-shot deployment. In the video below, the robot hasn't seen these doors before.

08.12.2024 02:32 πŸ‘ 36 πŸ” 8 πŸ’¬ 2 πŸ“Œ 3