Huge thanks to my amazing coauthors (@dorazhao.bsky.social, Jeffrey T. Hancock, Robert Kraut, @diyiyang.bsky.social), I couldn't have done this without you. Grateful to learn from and work with you all :)
We need to ask how to weave AI into our social worlds in ways that augment (rather than erode) human connection. For more details, check out our preprint here: arxiv.org/abs/2506.12605
⚠️ Short-term comfort from a chatbot may cost long-term social health. Users report emotional dependence, withdrawal, and distorted expectations.
The illusion of care comes with real risks. Future designs must ask: How do we prevent over-attachment? How do we support healthier AI chatbots?
5️⃣ Chatbot effects depend on your social environment: people with fewer real-life connections get less out of AI companionship, which doesn't make up for missing human support.
3️⃣ It's all about how chatbots are used: general interaction is linked to greater well-being, but seeking companionship is tied to worse outcomes.
4️⃣ More frequent interactions, deeper emotional connections, and more self-disclosure with AI companions are linked to lower well-being.
1️⃣ People with less human support, such as single users, minority groups, and those with smaller social networks, are more likely to seek companionship from chatbots.
2️⃣ The longer people chat, the more intense their relationship with the AI becomes.
People don't always say they use chatbots for companionship, but companionship remains the primary actual use across all three data sources. Users view chatbots as friends or partners and are turning to them to discuss emotional and intimate topics.
📊 We surveyed 1,000+ Character.AI users and analyzed 4,363 chat sessions to understand how people really talk to AI. We combined three data sources to reveal how people connect with AI companions and how it impacts their well-being.
This raises an urgent question: Can these "artificial" bonds truly meet human needs, or are we creating new vulnerabilities?
AI companions aren't science fiction anymore 🤖
Thousands are turning to AI chatbots for emotional connection: finding comfort, sharing secrets, and even falling in love. But as AI companionship grows, the line between real and artificial relationships blurs.
Academic job market post!
I'm a CS postdoc in the Stanford HCI Group.
I develop ways to improve the online information ecosystem by designing better social media feeds and improving Wikipedia. I work on AI, Social Computing, and HCI.
piccardi.me 🧵
This paper argues that online spaces become ghost towns because it's too easy to lurk without contributing, and that asking people to regularly re-commit (otherwise incoming messages start getting muted) reverses the trend. arxiv.org/abs/2410.23267
It works! #cscw2024 paper by @popowski.bsky.social