ty :))
hell yeah :) yw!
YOU SHOULD LISTEN TO KISS FACILITY
this almost reminds me of Warpaint
fyi people like this album
you're very welcome! btw i just wrote a short review, hehe bsky.app/profile/lath...
anyway i think i might start a tiny newsletter on leaflet
i've decided i need to completely avoid talking about ai. it riles me up too much :/
mhmm. i was going on about it again yesterday
i remain deeply uncertain about whether even a robot body controlled by a computer sim of 'me' would have the lights on inside. any strong intuition that it or something similar would be is confusing to me, because i don't think naive connectionism or even embodiment are sufficiently motivated
(i'm not actually special i just happen to think this is an underexplored alternative)
i'm special because i watched one episode of westworld and immediately concluded 'i would turn off the sentient, feeling robots and salt the intellectual earth that brought them into being because there are far worse things than nonexistence which they have been made subject to'
there is such a thing as moral progress and i am deeply afraid that committing to such a path is going to drag us collectively back to a darker world
beings as tools. even if we had hard evidence of a lack of sentience for some future system, making it believably come off as a thing worthy of deep care and concern, and nonetheless treating it as an instrument, IS. A. FUCKING. MORAL. CATASTROPHE.
and even if you disagree with that, the fact we will, it seems, continue to have deep and unresolvable confusion about the nature of consciousness should suggest far more caution and a worry for the possibility that we commit a different moral catastrophe by treating apparently sentient
we only have good reason to believe ourselves and most animals are capable of suffering. it seems a huge moral error to naively extend that out of precaution rather than flatly saying 'no' to building this
which i think given the nature of this technology can only come through displaced concern for actually deserving creatures
two ways this goes if we give rights to AIs: we were right to do so because they actually are moral patients, in which case i think we've already committed some deep sin in constructing the abominable. or we were wrong and we precipitate a moral catastrophe by mistakenly caring for inanimate stuff
yeah, i feel like i'm taking crazy pills sometimes
i'm an engineered sentience anti-natalist for at least four very different reasons and i'm shocked it's not the obvious conclusion
it baffles me when, to the extent one believes there is something it is like to be an ai (or could conceptually be in a future iteration) and therefore they are moral patients deserving of some consideration for their wellbeing, their immediate answer is not 'dear fucking god we must not build this'
bring back ostracism
and barring some drastic change in how we put together AIs, we canβt really do that to a meaningful degree
i accept we do this for farmed and wild animals but i think we have a good reason for that. not only can you look in their eyes and see that they behave similarly to you, but more importantly ime we have strong reasons from biology to believe that they *are* like us in a deeper mechanistic sense
maybe weβre using the terms differently? what would you call a case where we give rights and concern to AIs (millions, say) we suspect (but cannot prove) might be moral patients, and concern for them ends up disadvantaging humans in some serious and important way?
but generally where i get off the bus with the moral calculus is that i will, forever i expect, retain a position of privileging humans. idc if it's irrational, i won't be moved from it and i don't think it's right to say i ought to be.
iβm not sure that argument goes through practically irl as well as it might logically (need to chew on it more to see if it even goes through at all)
notes from public transit: @utopia-defer.red i saw your doppelgänger with a bag of chinese takeout
because of my own bias i guess toward liking people, i'm less worried about the 'oops, accidentally did a moral catastrophe in a datacenter' outcome than i am accidentally or purposefully subordinating human dignity through a process of moral dilution