Jun Rekimoto giving a talk next to my lab's banner that says Human Computer Integration (lab name and my motto for 15+ years), which is really nice to see! The talk was very inspiring and you can see his energy in his way of gesturing at the slide.
Full lab sitting and watching Jun Rekimoto give a talk next to my lab's banner. The talk was very inspiring and you can see his energy in his way of gesturing at the slide.
A pleasure to host Jun Rekimoto at the Human-Computer Integration Lab (lab.plopes.org) where he gave a talk about "Human-AI Integration". Inspiring as always and thanks for staying for a full day of demos and giving insightful feedback to many of our lab members. (thanks Romain for co-organizing!)
18.06.2025 23:14
Seeing with the Hands
A sensory substitution that supports manual interactions
Want to try? We are excited to demonstrate the device during Interactivity at #CHI25 and we will present the paper on Wednesday, 30 Apr, 10 am "Sensing and Haptics". See video, paper, and links to program: seeingwiththehands.com
23.04.2025 16:57
A photo showing two people shaking hands. Blue rays extend from the head and from the hand. Call-outs from both consist of close-up photos overlaid with a grid of red and white dots.
We further let participants *choose* which tactile perspective to use for a handshaking task. All chose to *use both devices*, and most stated that the eyes' perspective provides an overview, while the hands' perspective provides details. We were really impressed by how fluently they were able to use these new senses!
23.04.2025 16:57
A collage of study photos, each showing a different blindfolded person reaching their hand toward a table with different objects for different tasks.
Our user studies, including Blind & Low Vision participants, focus on daily tasks (e.g., grasping). Without extensive training (~15 min), most completed the tasks without touching unintended objects. Compared to using the eyes' perspective, participants found that the hands' perspective provides a more ergonomic way of seeing.
23.04.2025 16:57
A person crouches near a counter and reaches their left hand into a cabinet containing a mug. One blue light indicates the direction of their gaze, and another indicates the direction the hand is facing. An inset shows a tactile image on the back of the hand, represented by a grid of red and green dots.
In contrast to the more traditional form of sensory substitution, which captures vision from the *eyes' perspective*, we propose seeing from the *hands' perspective* and explore its unique benefits for manual tasks, e.g., hands are natural and easy to move around.
23.04.2025 16:57
This over-the-shoulder shot shows a person reaching for a black kettle on a table with their right hand. A device consisting of strips is attached to the back of their right hand. A call-out in the top-left corner shows a close-up photo of the kettle, overlaid with a grid of white and red dots. The red dots approximately overlap with the kettle image.
"Seeing with the Hands" is the latest #CHI25 research from our lab (@pedrolopes.org) with Gene S-H Kim and Xuanyou Liu. We explore a sensory substitution that enables a flexible way of seeing (vision-to-electrotactile), e.g., hovering over an object to feel its shape before grasping if one cannot see.
23.04.2025 16:57
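The thread above describes a vision-to-electrotactile mapping: a camera image is converted into a coarse grid of on/off tactile dots. A minimal sketch of that idea, assuming the core step is block-averaging a grayscale frame down to an electrode grid and thresholding each cell (the function name, grid size, and threshold here are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def frame_to_tactile(frame, grid=(10, 10), threshold=0.5):
    """Downsample a grayscale camera frame (2D array, values in [0, 1])
    to a coarse on/off electrode grid by block-averaging, then thresholding.
    Hypothetical illustration of a vision-to-electrotactile mapping."""
    h, w = frame.shape
    gh, gw = grid
    # Trim so the frame divides evenly into grid blocks.
    frame = frame[: (h // gh) * gh, : (w // gw) * gw]
    blocks = frame.reshape(gh, frame.shape[0] // gh, gw, frame.shape[1] // gw)
    intensity = blocks.mean(axis=(1, 3))  # per-electrode brightness
    return intensity > threshold          # True = electrode active

# Example: a bright object in the upper-left of a dark frame
frame = np.zeros((100, 100))
frame[:30, :30] = 1.0
active = frame_to_tactile(frame)  # 10x10 boolean electrode map
```

In this sketch, only the electrodes covering the bright region would activate, letting the wearer "feel" the object's rough shape and location before touching it.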
Shan-Yuan Teng's PhD defense
Enabling haptic experiences anywhere, anytime
I am excited to announce that I will defend my PhD next week on March 24th 11:00 am (Chicago Time). You are welcome to join on Zoom, and learn about my PhD work on innovating haptic devices (2019-2025)! Event info: defense.tengshanyuan.info
18.03.2025 22:50
Robotic hand helps pianists overcome "ceiling effect"
Passive training with robotic exoskeleton hand even led to motor improvements in the untrained hand.
Robotic hand helps pianists overcome "ceiling effect." Passive training with robotic exoskeleton hand even led to motor improvements in the untrained hand. arstechnica.com/science/2025...
20.01.2025 19:52
Sensorimotor Devices: Coupling Sensing and Actuation to Augment Bodily Experience
CHI 2025 Workshop - Yokohama, Japan
Workshop at #CHI2025: "Sensorimotor Devices: Coupling Sensing and Actuation to Augment Bodily Experience"
If you're passionate about sensorimotor interaction, wearables, or motion-coupled feedback, join us!
Deadline: Feb 13, 2025 AoE
sensorimotordevices.github.io
14.01.2025 15:39
Thanks @kalealex.bsky.social @pedrolopes.org @ineffablicious.bsky.social Ken Nakagaki and @krisha-mehta.bsky.social for doing this together!
06.12.2024 22:00
A group of students is attending a panel titled "How to Thrive in HCI Academia Without Losing Your Mind" in a seminar setting with four speakers at the front.
Being an HCI researcher is not easy! It was nice hosting a panel at UChicago CS and hearing from our kind faculty members about cultivating culture, volunteering, dealing with failure, and more.
06.12.2024 22:00
Poster for Friday's People and Tech seminar. Check Slack for a textual version of the same poster.
Join us for people and technology seminar on Friday with a fun group and topic! Thanks @tengshanyuan.info for coming up with this one!
03.12.2024 21:56
What if you could learn to play a new musical piece in just a few practice sessions? In my new paper, we show that using haptic gloves to passively rehearse piano pieces can speed up learning by 49.7% and negate forgetting between practice sessions.
25.11.2024 20:19
You can enjoy haptics while charging devices with my arm-wear! (beautifully shot by John Zich at University of Chicago)
A fresh start on Bluesky: I am Shan-Yuan, a PhD candidate passionate about "haptics" (touch, texture, force), which we have largely given up on with touchscreens & VR/AR headsets. I re-envision and build new hardware that enables fun and dexterous computing interactions (hoping to start a new lab!).
25.11.2024 18:18