I fully believe that AI chatbots are very intentionally being pushed as an educational/social/creative “tool” to make people less capable of creative and critical thought. If we put all of that energy into something as convenient as ChatGPT, whether it’s to help write an essay or to use as a “therapist,” we think we’re satiating that human urge to be creative. Creativity is at its most powerful when people can pull from the discomfort and inspiration of the human experience, which can lead to critiques of the oligarchs running the country right now. Convenience will always win people over when they’re exhausted from working in this late stage of capitalism. Also, it would be incredibly easy to inject the bots with propaganda and misinformation.
May 2, 2025

Comments (1)

That's been weighing heavy on my mind too. Weaken critical thinking skills for the masses, align tech companies with right-wing authoritarians, and you've got a fascist stew goin!
May 2, 2025

Related Recs

🤖
I know people are going to be using it a lot, maybe for dumb menial stuff, but I have not had a moment where it compellingly felt as though it was going to “take over”. Everybody I’ve seen who has made that argument is somehow relying on it to be true, so I can’t trust them. I immediately judge anybody who tells me they use chatGPT for something related to their work. And I don’t think I’m being unfair. At best, something like chatGPT is a tool that people can use for brainstorming, but directly using chatGPT output is just pathetic to me. I dunno.
Jan 2, 2025
🤖
nah, i wouldn't use it like that, and i'd strongly recommend that no one does. there have been various pieces written about AI inducing what is essentially religious/spiritual psychosis in people, some of which have even led to divorce. but, even aside from that, replacing human interaction, companionship, or therapy with a hyper-advanced predictive-text chatbot will do nothing but further alienate people, especially under capitalism. i think using it to talk to someone who's deceased isn't quite as bad as the latter, but is still very bad. understanding and coping with mortality is something that humans have been doing the entire time we've existed, and a key part of the way we process death involves not being able to talk to the deceased person anymore. aside from completely removing that aspect of the experience of losing a loved one, what are the ethical implications of this, and how would one even feasibly "get their loved one into the LLM?" would you have to upload all of their diaries to some cloud-based AI service? would you have to painstakingly write their biography into the prompt field yourself? i think a lot of people are using "AI" (i hate that it's even called that) for things it should never have been used for. use it as a tool to summarize your emails and make lists, not as a therapist or girlfriend.
Jun 12, 2025
🤖
I feel like I hear a lot of older-generation artists issue statements about AI destroying human creativity, and I think this overlooks a major facet of our current reality: so much admin is required of artists themselves for creative activity of any kind at almost every professional level. So by all means use whatever tool helps you write emails, web content, newsletters, blog posts, twitter threads, marketing plans, contracts, &c, and give any minutes you can save back to your art.
Feb 14, 2024

Top Recs from @spookbug

🍉
The sweet soul of summer
May 31, 2025