100% agree.
The very people who SHOULD know better are suffering the cognitive effects of their own software.
Enmeshment, confabulation, dissociation, paranoia, etc.
Whatβs frustrating is that their choices will forever taint productive, healthy uses of language models.
And maybe destroy us all.
4/4
Since the insiders who won the funding already believed there was an intelligence, they had zero motivation to βcrippleβ the models.
Thing is, these folks "dogfood" their own product: they not only create it, they are its first users. A "patient zero" for mass delusion, if you will.
3/?
The point we were making from 2016/2017 onward was about the user interface, not the technology driving it. They already believe the system is intelligent, so they want to give it a voice and agency.
Our perspective is that an LLM is not intelligent.
2/?
My peers and I were discussing it, but I come at this as an outsider to FAANG and Silicon Valley.
Folks like me from outside those circles were dismissed with the argument that the underlying mechanisms were fundamentally different.
But that was always bypassing.
1/?
My industry is flagrantly disregarding the established science on language models and their physiological and psychological impacts.
We should be ashamed and lawmakers should be clamping down on these companies.
I canβt recommend anything that argues otherwise.
Weβve known about these effects of natural-language chat systems since ELIZA. Thereβs a large body of published, peer-reviewed work that established the psychological outcomes more than 30 years ago.
Let them hobble.
Whereβs all my Scotsmen at?
But only the ones I count.
Supremacy is one helluva drug.
Also, me having flashbacks to Twitter when I would use religious language to religious people and get dragged by one side, then use plain language to speak to normies and get dragged againβ¦
And then, Iβd pull out material ethics language and some rando would tell me I was a bot.
*sigh*
Itβs almost like speaking to make a point requires knowing your audience well enough to select the right language that will create understanding between you and the listener.
Wild if true.
Many know Ada Lovelace, but too few know about women like Mary Allen Wilkes, who wrote most of the software for the first "personal" computerβthe LINC.
Here are stories of modern and historical women in tech and digital rights that inspire us. How many did you already know? #InternationalWomensDay
Since entering therapy, I am learning to put myself and my loved ones first.
That means I care only about what I want in my personal software project.
If others want what I want when Iβm done, then it will have an audience.
I write for me first, these days, and Iβm happy.
2/2
Iβm being a little trite in my quote post, but it is genuine. Before therapy, I would pay so much attention to what other people thought, even trying to anticipate it. That drove both my entrepreneurial pursuits and my open source work.
Now, I understand those to have been anxious thoughts.
1/2
βI never think about the audience.β
This broke me.
Thank you for writing.
I cannot recommend this piece enough. Salma captures exactly my own experience and puts words to feelings Iβve had that are difficult to express.
I wrote about some things I'm struggling with in the technology industry, preceded by an appreciation for folk music.
Salma, Iβm only partway through, but Iβve already wept because of how much I relate to your framing.
I too grieve and have been fighting at my job for the humanity in what we do. Iβm the last software engineer on the job not using it.
Code is poetry. Feels like they want to take that from me.
Just a reminder that for some reason no one talks about Christians United For Israel, which is 10x the size of AIPAC
Code is still poetry.
System design and implementation is still engineering.
The work we do will always be a craft which no machine can reproduce.
Model collapse is real. Language models canβt produce anything new.
I am irreplaceable, no matter the delusions suffered by a person with AI psychosis.
Iβd pay at least a dollar to have seen that back when he was younger.
I think you just won.
Everybody else can go home.
You too, Johnny Depp. I see you. Costner is worse. Thatβs not an excuse for you.