This is part of a pattern I've started to notice: There are people out there who are loudly committed to off-loading their work to chatbots and come seeking my validation for their position.
>>
Everyone is like ‘who said what now?’
When being a vegetarian wasn’t very common, people used to say to me, “oh you think eating animals is bad. What if you were on a desert island and the only thing to eat was a hamburger?!?!”
This is that question, but for AI.
It’s pretty telling to see all the Black folks on my skyline be like:
I wondered when they would get around to AZ.
I hope people exploring legal action still follow through.
100% this. "The number of ways in which firms like yours disrespect writers and other creative people are endless. And you require us — the people harmed — to put brakes on the harmful things YOU are doing, not just to us but to entire creative professions."
>>
neither is appealing or practical, imho. i’m more interested in adversarial work that surfaces the harms developers/vendors are causing, informing as many people as possible about the reality beneath the multibillion $$ marketing hype
3/
the unstated assumption in this type of thinking is that: 1) improving existing systems (not resisting or refusing) is the goal, and 2) appealing to developers/vendors is the way to change these broken or harmful systems
2/
people often ask me about my approach to impact and change. there’s also a common — though rarely stated — assumption that ai auditors or critics should work with developers/vendors to drive concrete changes... product retraction, guardrails, mitigations, etc...
1/
More and more I understand why Andre 3000 started taking up the flute
Blockity block block block
I’m not a math professor, but I’m pretty sure 1953 was more than 47 years ago.
At least they’re taking a break from reinventing phrenology, I guess.
It should surprise no one to know that this doesn’t really work. Synthetic participants don’t mimic the actual diversity of human thought and culture very well.
But I’m sure we’re going to use them anyway.
Thanks. I hate it.
Just leave your camera off. That’s completely within social norms.
Episode 82 is up. It features an interview with Dr Rianna Walcott of University of Maryland and BCaT Lab about her PhD thesis on Black Twitter, plus discussion of identity online and off, Black feminist theory and its influence and lots more... spotifycreators-web.app.link/e/8jHX15k0k1b
I used to waste years preparing gourmet meals. It was creative, an expression of my humanity, and fostered zen.
Thanks to McDonalds, I don't have to do that!
#Satire
But it’s clearly organized with subheadings. And it’s absolutely searchable.
The culture around syllabi has changed. They are now largely viewed in contractual terms. You are required to put in a lot of boilerplate university policy language now. Because if it’s “not in the syllabus” it’s viewed as unenforceable. Mine is now 20 pgs. (Including the full semester schedule)
LA Taco went from covering local culture and good tacos to doing some of the best on the ground community-embedded reporting about ICE. Well worth supporting.
I loved that show.
Correct. But then students would have to look at the syllabus or talk to me. Which I guess are things we don’t want them to do anymore?
This. My university keeps wanting me to make a “syllabot,” an LLM-based chatbot my students can query about the syllabus. And all I can think about is all the resources being consumed so no one has to use Ctrl+F.
How magnanimous.
OpenAI’s push to become crucial infrastructure in education should not and cannot be separated from its broader entanglements with the US military and mass surveillance that includes students and teachers.
It just means it’s a fully organic artisanal human-generated post!
This really lays the playbook bare: as a consumer, you get to buy just enough storage for a terminal, and all the actual storage is rented from big tech.
We have spent decades using metaphors that teach us to equate computers with our minds. And now people are like “well, if it’s not a mind why does it seem like one??” And this is why we need humanities and social sciences.