rentastic

@lathrys.at

just a silly boy (she/her)

708 Followers · 316 Following · 2,726 Posts · Joined 03.07.2025
Latest posts by rentastic @lathrys.at

ty :))

11.03.2026 05:35 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

hell yeah :) yw!

11.03.2026 00:32 πŸ‘ 1 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

YOU SHOULD LISTEN TO KISS FACILITY

10.03.2026 23:37 πŸ‘ 1 πŸ” 1 πŸ’¬ 0 πŸ“Œ 0

this almost reminds me of Warpaint

10.03.2026 23:34 πŸ‘ 1 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

fyi people like this album

10.03.2026 23:32 πŸ‘ 4 πŸ” 1 πŸ’¬ 0 πŸ“Œ 0

you’re very welcome! btw i just wrote a short review, hehe bsky.app/profile/lath...

10.03.2026 22:55 πŸ‘ 2 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0
Preview: if i may be a little obsessive… omfg. this music.

i guess i’m a music critic now

10.03.2026 22:52 πŸ‘ 11 πŸ” 3 πŸ’¬ 3 πŸ“Œ 1

anyway i think i might start a tiny newsletter on leaflet

10.03.2026 19:05 πŸ‘ 4 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

i’ve decided i need to completely avoid talking about ai. it riles me up too much :/

10.03.2026 19:05 πŸ‘ 4 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

mhmm. i was going on about it again yesterday

08.03.2026 21:45 πŸ‘ 3 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

i remain deeply uncertain about whether even a robot body controlled by a computer sim of β€œme” would have the lights on inside. any strong intuition that it or something similar would be is confusing to me, because i don’t think naive connectionism or even embodiment are sufficiently motivated

08.03.2026 21:42 πŸ‘ 3 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

(i’m not actually special i just happen to think this is an underexplored alternative)

08.03.2026 21:25 πŸ‘ 5 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

i’m special because i watched one episode of westworld and immediately concluded β€œi would turn off the sentient, feeling robots and salt the intellectual earth that brought them into being because there are far worse things than nonexistence which they have been made subject to”

08.03.2026 21:19 πŸ‘ 8 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

there is such a thing as moral progress and i am deeply afraid that committing to such a path is going to drag us collectively back to a darker world

08.03.2026 20:50 πŸ‘ 6 πŸ” 1 πŸ’¬ 0 πŸ“Œ 0

being as tools. even if we had hard evidence of a lack of sentience for some future system, making it believably come off as a thing worthy of deep care and concern, which we nonetheless treat as an instrument IS. A. FUCKING. MORAL. CATASTROPHE.

08.03.2026 20:50 πŸ‘ 6 πŸ” 1 πŸ’¬ 1 πŸ“Œ 0

and even if you disagree with that, the fact we will, it seems, continue to have deep and unresolvable confusion about the nature of consciousness should suggest far more caution and a worry for the possibility that we commit a different moral catastrophe by treating apparently sentient

08.03.2026 20:50 πŸ‘ 5 πŸ” 1 πŸ’¬ 1 πŸ“Œ 0

we only have good reason to believe ourselves and most animals are capable of suffering. it seems a huge moral error to naively extend that out of precaution rather than flatly saying β€œno” to building this

08.03.2026 20:50 πŸ‘ 5 πŸ” 1 πŸ’¬ 1 πŸ“Œ 0

which i think given the nature of this technology can only come through displaced concern for actually deserving creatures

08.03.2026 20:50 πŸ‘ 6 πŸ” 1 πŸ’¬ 1 πŸ“Œ 0

two ways this goes if we give rights to AIs: we were right to do so because they actually are moral patients, in which case i think we’ve already committed some deep sin in constructing the abominable. or we were wrong and we precipitate a moral catastrophe by mistakenly caring for inanimate stuff

08.03.2026 20:50 πŸ‘ 6 πŸ” 1 πŸ’¬ 1 πŸ“Œ 0

yeah, i feel like i’m taking crazy pills sometimes

08.03.2026 20:20 πŸ‘ 3 πŸ” 1 πŸ’¬ 2 πŸ“Œ 0

i’m an engineered sentience anti-natalist for at least four very different reasons and i’m shocked it’s not the obvious conclusion

08.03.2026 20:14 πŸ‘ 10 πŸ” 1 πŸ’¬ 1 πŸ“Œ 0

it baffles me when, to the extent one believes there is something it is like to be an aiβ€”or could conceptually be in a future iterationβ€”and therefore they are moral patients deserving of some consideration for their wellbeing, their immediate answer is not β€œdear fucking god we must not build this”

08.03.2026 20:09 πŸ‘ 15 πŸ” 2 πŸ’¬ 2 πŸ“Œ 1

bring back ostracism

08.03.2026 16:17 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

and barring some drastic change in how we put together AIs, we can’t really do that to a meaningful degree

08.03.2026 07:44 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

i accept we do this for farmed and wild animals but i think we have a good reason for that. not only can you look in their eyes and see that they behave similarly to you, but more importantly ime we have strong reasons from biology to believe that they *are* like us in a deeper mechanistic sense

08.03.2026 07:44 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

maybe we’re using the terms differently? what would you call a case where we give rights and concern to AIs (millions, say) we suspect (but cannot prove) might be moral patients, and concern for them ends up disadvantaging humans in some serious and important way?

08.03.2026 07:44 πŸ‘ 0 πŸ” 0 πŸ’¬ 2 πŸ“Œ 0

but generally where i get off the bus with the moral calculus is that i will, forever i expect, retain a position of privileging humans. idc if it’s irrational, i won’t be moved from it and i don’t think it’s right to say i ought to be.

08.03.2026 07:39 πŸ‘ 1 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

i’m not sure that argument goes through practically irl as well as it might logically (need to chew on it more to see if it even goes through at all)

08.03.2026 07:39 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

notes from public transit: @utopia-defer.red i saw your doppelgΓ€nger with a bag of chinese takeout

08.03.2026 07:33 πŸ‘ 2 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

because of my own bias i guess toward liking people, i’m less worried about the β€œoops, accidentally did a moral catastrophe in a datacenter” outcome than i am accidentally or purposefully subordinating human dignity through a process of moral dilution

08.03.2026 07:29 πŸ‘ 1 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0